Databricks SQL

The foglamp-north-databricks-sql plugin sends data from FogLAMP to Databricks using the Databricks ODBC driver. It transfers data from FogLAMP's buffer to Databricks, providing seamless integration with the Databricks cloud-based data warehouse.


Configuration Details

The plugin requires the following configuration parameters:

  • Databricks Server URL: The Databricks server URL (e.g., xyz.cloud.databricks.com).

  • Databricks Server Port: The Databricks server port (default: 443).

  • Database Name: The Databricks database name where the data will be stored.

  • Username: The username for connecting to the Databricks account.

  • Password: The password for authenticating the Databricks account.

  • FogLAMP Instance Name: The name of this FogLAMP instance.

  • Data Source: The source of FogLAMP data to be sent to Databricks (e.g., readings).

  • Additional Connection Parameters: Any additional connection parameters required for the Databricks connection (e.g., HTTPPath, AuthMech, etc.). For example, HTTPPath=your_http_path;AuthMech=3;SSL=1;ThriftTransport=2; for token-based authentication. A connection example is shown after this list.
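
The sketch below illustrates how these parameters typically combine into an ODBC connection string. It is a minimal Python example using pyodbc; the driver name, schema, HTTPPath, and credential values are assumptions for illustration and are not taken from the plugin itself.

    import pyodbc

    # Illustrative connection string built from the configuration parameters above.
    # The driver name ("Simba Spark ODBC Driver") and all values are placeholders;
    # consult the Databricks ODBC driver documentation for the exact keys.
    conn_str = (
        "Driver={Simba Spark ODBC Driver};"
        "Host=xyz.cloud.databricks.com;"  # Databricks Server URL
        "Port=443;"                       # Databricks Server Port
        "Schema=foglamp;"                 # Database Name (hypothetical)
        "UID=token;"                      # Username ("token" for token-based authentication)
        "PWD=<personal-access-token>;"    # Password / personal access token
        # Additional Connection Parameters, as in the example above
        "HTTPPath=your_http_path;AuthMech=3;SSL=1;ThriftTransport=2;"
    )

    connection = pyodbc.connect(conn_str, autocommit=True)
    cursor = connection.cursor()
    cursor.execute("SELECT current_timestamp()")
    print(cursor.fetchone())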

The plugin will create a table in Databricks if it does not already exist. The table structure is dynamically determined based on the data provided by FogLAMP.
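As a rough illustration of what such dynamic table creation can look like (this is not the plugin's actual code), the sketch below derives one column per datapoint from a FogLAMP reading and issues a CREATE TABLE IF NOT EXISTS statement; the type mapping and the fixed user_ts/asset_code columns are assumptions.

    # Sketch of dynamic table creation: one column per datapoint in a reading.
    # The type mapping and fixed columns below are assumptions for illustration.
    def create_table_from_reading(cursor, table, reading):
        type_map = {int: "BIGINT", float: "DOUBLE", str: "STRING", bool: "BOOLEAN"}
        columns = ["`user_ts` TIMESTAMP", "`asset_code` STRING"]
        for name, value in reading["readings"].items():
            columns.append(f"`{name}` {type_map.get(type(value), 'STRING')}")
        cursor.execute(f"CREATE TABLE IF NOT EXISTS {table} ({', '.join(columns)})")

    # Example reading as buffered by FogLAMP (illustrative values)
    reading = {
        "asset_code": "sinusoid",
        "user_ts": "2024-01-01 00:00:00.000",
        "readings": {"sinusoid": 0.5},
    }
    # create_table_from_reading(cursor, "readings", reading)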


Hints

A filter plugin, foglamp-filter-databricks-hints, allows hints to be added to the readings that control how the data is stored in, and mapped to, Databricks SQL. Currently only one hint is supported: the table name. The filter adds a new datapoint, Databricks-Hint, to each existing reading.

If the foglamp-filter-databricks-hints filter is used with foglamp-north-databricks-sql, the table will be created with the name provided in the hint.
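
For illustration, a reading carrying such a hint might look like the following; the exact encoding of the hint value (shown here as a JSON string with a table key) is an assumption rather than a documented format.

    # Illustrative shape of a reading after foglamp-filter-databricks-hints has run.
    # The Databricks-Hint datapoint name comes from the description above; the
    # "table" key inside the hint value is an assumed format, not confirmed here.
    reading_with_hint = {
        "asset_code": "sinusoid",
        "user_ts": "2024-01-01 00:00:00.000",
        "readings": {
            "sinusoid": 0.5,
            "Databricks-Hint": '{"table": "sinusoid_data"}',
        },
    }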