
Adding JDBC Driver to PySpark on Startup not Using Config File


According to the issue information, the problem was caused by the SQL Database driver and connection string: the JDBC driver and its connection string are meant for Java, not for Python.

So for Python you need to use an ODBC driver and connection string instead. Please try installing the Python package pymssql to access the SQL Database; you can refer to the documentation at http://pymssql.org/en/stable/.
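As a minimal sketch of what a pymssql-based query could look like (the function name and parameters are illustrative, not from the original answer; Azure SQL conventionally expects the user in the `username@servername` form):

```python
def fetch_server_version(server, user, password, database):
    """Run a trivial query to confirm pymssql can reach the SQL Database.

    server:   e.g. "yourserver.database.windows.net"
    user:     e.g. "yourusername@yourserver" (Azure SQL login form)
    """
    import pymssql  # pip install pymssql

    # pymssql takes keyword arguments rather than an ODBC connection string.
    with pymssql.connect(server=server, user=user,
                         password=password, database=database) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT @@VERSION")
        return cursor.fetchone()[0]
```

You would call it with your own Azure SQL credentials, for example `fetch_server_version("myserver.database.windows.net", "me@myserver", "secret", "mydb")`.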

The ODBC connection string for Python looks like this:

Driver={SQL Server Native Client 10.0};Server=tcp:<your-server>.database.windows.net,1433;Database=<my_db>;Uid=<your-username>@<your-server>;Pwd={your_password_here};Encrypt=yes;Connection Timeout=30;
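Since pymssql does not consume ODBC connection strings directly, a small helper can translate the string above into `pymssql.connect()` keyword arguments. This helper and its placeholder values (`myserver`, `mydb`, `secret`) are hypothetical, shown only to illustrate the mapping:

```python
# A sample ODBC-style connection string in the format shown above,
# with made-up placeholder values.
ODBC_CONN_STR = (
    "Driver={SQL Server Native Client 10.0};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;"
    "Uid=myuser@myserver;"
    "Pwd={secret};"
    "Encrypt=yes;Connection Timeout=30;"
)

def odbc_to_pymssql_kwargs(conn_str):
    """Map ODBC 'Key=Value;' pairs onto pymssql.connect() keyword arguments."""
    pairs = dict(
        item.split("=", 1) for item in conn_str.rstrip(";").split(";")
    )
    # The ODBC Server value looks like 'tcp:host,port'.
    host = pairs["Server"]
    if host.startswith("tcp:"):
        host = host[len("tcp:"):]
    host, _, port = host.partition(",")
    return {
        "server": host,
        "port": int(port or 1433),
        "database": pairs["Database"],
        "user": pairs["Uid"],
        "password": pairs["Pwd"].strip("{}"),
    }

kwargs = odbc_to_pymssql_kwargs(ODBC_CONN_STR)
```

The resulting dictionary could then be passed straight to `pymssql.connect(**kwargs)`.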

If you have to use the JDBC driver and connection string, you can refer to the document "Databases and Jython: Object Relational Mapping and Using JDBC" and use Jython instead of Python.

If you have any concerns, please feel free to let me know.