
1. Prerequisites on Azure portal

The entry point to Azure Databricks is https://portal.azure.com. From the home page, navigate to the Azure Databricks section and open your workspace.


Your account must have Owner or Contributor privileges on the Databricks workspace to access it.
By default, the Azure Databricks cluster is not running. Either create a new cluster or start an existing one (be sure to change the cluster filter to show all clusters).

In Advanced Options, set up the Spark config for the connection to ADLS storage. The configuration is a set of key-value pairs, one pair per line, with the key separated from its value by a single space.

Spark metastore version:  spark.sql.hive.metastore.version 1.2.1
Spark metastore jars:     spark.sql.hive.metastore.jars builtin
OAuth access type:        spark.hadoop.dfs.adls.oauth2.access.token.provider.type ClientCredential
OAuth login endpoint:     spark.hadoop.dfs.adls.oauth2.refresh.url https://login.microsoftonline.com/6fdc3117-ec29-4d73-8b33-028c8c300872/oauth2/token
OAuth secret:             spark.hadoop.dfs.adls.oauth2.credential +gv2ThjPc++++++++++++++++++++++++++zV8NrM74=
OAuth client ID:          spark.hadoop.dfs.adls.oauth2.client.id 74731c8c-7290-4998-8005-1d0670cbe909
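Pasted into the Spark Config field of the cluster's Advanced Options, the settings above would look like the fragment below (the tenant ID, client ID, and secret here are placeholders; substitute the values for your own Azure AD application):

```
spark.sql.hive.metastore.version 1.2.1
spark.sql.hive.metastore.jars builtin
spark.hadoop.dfs.adls.oauth2.access.token.provider.type ClientCredential
spark.hadoop.dfs.adls.oauth2.refresh.url https://login.microsoftonline.com/<tenant-id>/oauth2/token
spark.hadoop.dfs.adls.oauth2.credential <client-secret>
spark.hadoop.dfs.adls.oauth2.client.id <client-id>
```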

Note down the server hostname, port, and JDBC URL; these will be used in the SM storage definition.
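The JDBC URL shown on the cluster's JDBC/ODBC tab follows the pattern of the Simba Spark driver with token authentication. The sketch below only illustrates how the pieces fit together; the hostname, HTTP path, and token are placeholders, and you should copy the real URL from the UI rather than assemble it by hand:

```python
# Sketch: assemble a Databricks (Simba Spark) JDBC URL with token auth.
# All values below are placeholders, not real credentials.

def build_jdbc_url(hostname: str, http_path: str, token: str, port: int = 443) -> str:
    """Build a JDBC URL using token authentication (AuthMech=3, UID=token)."""
    return (
        f"jdbc:spark://{hostname}:{port}/default;"
        f"transportMode=http;ssl=1;"
        f"httpPath={http_path};"
        f"AuthMech=3;UID=token;PWD={token}"
    )

url = build_jdbc_url(
    "adb-1234567890123456.7.azuredatabricks.net",            # placeholder hostname
    "sql/protocolv1/o/1234567890123456/0123-456789-abc123",  # placeholder HTTP path
    "dapi0123456789abcdef",                                  # placeholder token
)
print(url)
```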

To create a database, first create a Notebook for submitting SQL queries.

Now you can run the CREATE DATABASE SQL query.
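A minimal notebook cell for this step could look as follows (the database name `sm_demo` is only an example; use the name expected by your SM configuration):

```sql
-- Example only: create a database for SM-managed tables if it does not exist
CREATE DATABASE IF NOT EXISTS sm_demo;
```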

Create a TOKEN for remote access in User Settings (icon in the top right corner).

Save the token for later use in the SAP configuration.
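Remote clients authenticate with this token: JDBC connections pass it as the password (see the URL above), and the Databricks REST API expects it as an HTTP Bearer header. A minimal sketch of the latter, with a placeholder token:

```python
# Sketch: the personal access token is sent as an Authorization: Bearer
# header when calling the Databricks REST API. Placeholder token below.

def auth_header(token: str) -> dict:
    """Return the Authorization header the Databricks REST API expects."""
    return {"Authorization": f"Bearer {token}"}

headers = auth_header("dapi0123456789abcdef")
print(headers)
```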

2. Set up the Java connector

The Java connector is a critical middleware component. Please follow the steps in the guide (SM-2002) Java Connector Setup to set it up before you continue.

3. Create ADLS Gen 1 storage

ADLS storage is required as a store for temporary data. Please follow the guide (SM-2002) Azure Data Lake Gen1 to create the storage.

4. Download Spark JDBC drivers

Download the Spark JDBC driver from https://databricks.com/spark/odbc-driver-download

Then save the .jar file to /sapmnt/<SID>/global/security/dvd_conn/spark/

The file must be owned by <sid>adm:sapsys.
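At the OS level, the copy and ownership change could be done roughly as follows. This is illustrative only: the SID `ABC` and the jar file name `SparkJDBC42.jar` are placeholders, so substitute your actual SID and the file name of the driver you downloaded, and run as a user permitted to chown:

```
# Illustrative fragment: place the driver jar and set ownership for <SID>=ABC
mkdir -p /sapmnt/ABC/global/security/dvd_conn/spark/
cp SparkJDBC42.jar /sapmnt/ABC/global/security/dvd_conn/spark/
chown abcadm:sapsys /sapmnt/ABC/global/security/dvd_conn/spark/SparkJDBC42.jar
```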

5. Create Databricks storage

Go to transaction /DVD/SM_SETUP.

Create a new storage of type SM_TRS_MS.
