- By default, this Databricks integration uses Unity Catalog data governance features, so you will need Unity Catalog enabled on your Databricks workspace.
Step 1: Create a new SQL endpoint for writing data.
- Log in to the Databricks account.
- In the navigation pane, open the workspace dropdown and select SQL.
- In the SQL console, in the SQL navigation pane, click Create and then SQL endpoint.
- In the New SQL Endpoint menu, enter a name, configure the options for the new SQL endpoint, and click Create.
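If you prefer to script this step, the same SQL endpoint can typically be created through the Databricks REST API. The sketch below only builds the request body; the field names follow the Databricks SQL Warehouses API (`POST /api/2.0/sql/warehouses`, the current name for SQL endpoints), and the endpoint name and sizing values are placeholders — verify both against your workspace's API documentation before sending the request.

```python
import json

def build_warehouse_payload(name: str, cluster_size: str = "Small",
                            auto_stop_mins: int = 30) -> dict:
    """Build a request body for creating a SQL endpoint (warehouse).

    Field names are based on the Databricks SQL Warehouses REST API
    (POST /api/2.0/sql/warehouses); confirm them against your
    workspace's API version before use.
    """
    return {
        "name": name,
        "cluster_size": cluster_size,      # e.g. "2X-Small" through "4X-Large"
        "auto_stop_mins": auto_stop_mins,  # pause when idle to limit cost
    }

# Placeholder endpoint name for illustration only.
payload = build_warehouse_payload("logrocket-export")
print(json.dumps(payload))
```

POST this payload to your workspace with a bearer token in the `Authorization` header; the UI flow above achieves the same result without scripting.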
Step 2: Collect connection information and create an access token for the data transfer service.
- In the SQL Endpoints console, select the SQL endpoint you created in Step 1.
- Click the Connection Details tab, and make a note of the Server hostname, Port, and HTTP path.
- Click the link to Create a personal access token.
- Click Generate New Token.
- Give the token a descriptive comment and assign a lifetime. A longer lifetime means you will not need to rotate the token as often. Click Generate.
- In the pop-up that follows, copy the token and store it securely until you can share it with your support contact.
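Before handing the details off, a quick local sanity check can catch copy-paste mistakes. The expected shapes in the sketch below (a bare hostname, an HTTP path beginning with `/sql/`, a `dapi` token prefix) are assumptions based on typical Databricks workspaces, and all three argument values are placeholders.

```python
def check_connection_details(server_hostname: str, http_path: str,
                             access_token: str) -> list[str]:
    """Return a list of problems found in the collected connection details.

    The expected shapes (bare hostname, '/sql/...' HTTP path, 'dapi'
    token prefix) are assumptions based on common Databricks
    deployments; adjust for yours.
    """
    problems = []
    if not server_hostname or "://" in server_hostname:
        problems.append("server hostname should be a bare host, not a URL")
    if not http_path.startswith("/sql/"):
        problems.append("HTTP path usually starts with /sql/ for SQL endpoints")
    if not access_token.startswith("dapi"):
        problems.append("personal access tokens typically start with 'dapi'")
    return problems

issues = check_connection_details(
    "adb-1234567890123456.7.azuredatabricks.net",  # placeholder hostname
    "/sql/1.0/warehouses/abc123",                  # placeholder HTTP path
    "dapiXXXXXXXXXXXXXXXX",                        # placeholder token
)
print(issues)  # → []
```

An empty list means the values at least look plausible; it does not prove the endpoint is reachable.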
Step 3: Create a staging bucket in one of the following cloud environments. Refer to our object storage documentation for staging bucket configuration instructions.
- AWS S3
- Google Cloud Storage
- Azure Blob Storage
For the data export setup, you will need:
- server hostname
- HTTP path
- personal access token
- staging bucket details
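The checklist above can be verified programmatically before you open the settings page. This is a minimal sketch; the field names mirror the list above, and the sample values (hostname, HTTP path, bucket name) are placeholders, not real identifiers.

```python
def missing_export_settings(settings: dict) -> list[str]:
    """Return which required setup fields are absent or empty."""
    required = ("server hostname", "HTTP path",
                "personal access token", "staging bucket details")
    return [field for field in required if not settings.get(field)]

# All values below are placeholders for illustration.
settings = {
    "server hostname": "adb-1234567890123456.7.azuredatabricks.net",
    "HTTP path": "/sql/1.0/warehouses/abc123",
    "personal access token": "dapi...",
    "staging bucket details": {"provider": "s3", "bucket": "my-staging-bucket"},
}
print(missing_export_settings(settings))  # → []
```

Only proceed to the settings page once the returned list is empty.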
Visit the LogRocket Streaming Data Export settings page to complete the setup.