1. Prerequisites on Azure portal
...
Your account must have Owner or Contributor privileges on the Databricks workspace to be able to access it.
By default, the Azure Databricks cluster is not running. Either create a new cluster or start an existing one (be sure to change the cluster filter to show all clusters).
...
In Advanced Options, set up the Spark config for a connection to ADLS storage. The configuration is a set of key-value pairs, with the key and value separated by a single space.
Spark metastore version
spark.sql.hive.metastore.version 1.2.1

Spark drivers
spark.sql.hive.metastore.jars builtin

OAuth access type
spark.hadoop.dfs.adls.oauth2.access.token.provider.type ClientCredential

OAuth login endpoint
spark.hadoop.dfs.adls.oauth2.refresh.url https://login.microsoftonline.com/6fdc3117-ec29-4d73-8b33-028c8c300872/oauth2/token

OAuth secret
spark.hadoop.dfs.adls.oauth2.credential +gv2ThjPc++++++++++++++++++++++++++zV8NrM74=

OAuth client ID
spark.hadoop.dfs.adls.oauth2.client.id 74731c8c-7290-4998-8005-1d0670cbe909
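If you manage several workspaces, the same key-value lines can be assembled programmatically before pasting them into the cluster's Spark config field. A minimal sketch; the tenant ID, client ID, and secret passed in are placeholders, not real credentials:

```python
# Sketch: assemble the ADLS OAuth Spark config as "key value" lines,
# one pair per line, key and value separated by a single space.
# All credential arguments are placeholders for illustration.
def build_spark_config(tenant_id: str, client_id: str, secret: str) -> str:
    pairs = {
        "spark.sql.hive.metastore.version": "1.2.1",
        "spark.sql.hive.metastore.jars": "builtin",
        "spark.hadoop.dfs.adls.oauth2.access.token.provider.type": "ClientCredential",
        "spark.hadoop.dfs.adls.oauth2.refresh.url":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
        "spark.hadoop.dfs.adls.oauth2.credential": secret,
        "spark.hadoop.dfs.adls.oauth2.client.id": client_id,
    }
    return "\n".join(f"{key} {value}" for key, value in pairs.items())
```

The output can be pasted directly into the Spark config box in Advanced Options.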
Info: Note down the server hostname, port, and JDBC URL, which will be used in the SM storage definition.
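The JDBC URL shown on the cluster's JDBC/ODBC tab typically follows the Simba Spark driver format. As a sketch of how its parts fit together (the hostname and HTTP path below are hypothetical placeholders; copy the real values from the cluster page instead):

```python
# Sketch: assemble a Databricks JDBC URL from its parts.
# hostname and http_path are hypothetical placeholders; the real values
# come from the cluster's JDBC/ODBC tab.
def build_jdbc_url(hostname: str, http_path: str, port: int = 443) -> str:
    return (
        f"jdbc:spark://{hostname}:{port}/default;"
        "transportMode=http;ssl=1;"
        f"httpPath={http_path};"
        "AuthMech=3;UID=token;PWD=<personal-access-token>"
    )
```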
...
To create a database, first create a notebook for submitting SQL queries.
...
...
Now you can run the CREATE DATABASE SQL query.
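For example, the statement run from the notebook might look like the following sketch; the database name and ADLS location are illustrative assumptions, and in a Databricks notebook the resulting string would be passed to `spark.sql(...)`:

```python
# Sketch: build a CREATE DATABASE statement pointing at an ADLS location.
# Database name and path are illustrative placeholders.
def create_database_sql(name: str, location: str) -> str:
    return f"CREATE DATABASE IF NOT EXISTS {name} LOCATION '{location}'"
```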
...
Create a token for remote access in User Settings (top-right corner icon).
...
...
Save the token for later use in the SAP configuration.
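The token is typically presented as the JDBC password, with the literal user name `token`. A sketch of the corresponding connection properties (the token string is a placeholder, not a real secret):

```python
# Sketch: JDBC connection properties for token-based access.
# The token argument is a placeholder for the personal access token.
def jdbc_properties(token: str) -> dict:
    return {
        "UID": "token",   # literal user name "token"
        "PWD": token,     # the personal access token itself
        "AuthMech": "3",  # username/password authentication
    }
```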
...
The Java connector is a critical middleware component. Please follow the steps in the (SM-2002) Java Connector Setup guide to set it up before you continue.
...
ADLS storage is required for temporary data. Please follow the (SM-2002) Azure Data Lake Gen1 guide to create the storage.
4. Download Spark JDBC drivers
...