(SM-2002) SOW Prerequisites
The purpose of this page is to provide a single source of truth for SOW prerequisites for each storage type that Storage Management (SM) supports. The content should be in a form that can be copy-pasted into Word documents with minimal need for adjustment.
General prerequisites
These prerequisites are necessary for every storage type in SM and are complementary to the prerequisites specific to a single storage type.
Prerequisite 1: “Remote connection - OSS or alternative connection”
Details: In order to execute the services described herein, the Customer must open an OSS connection (Type R3 Support) by creating an OSS ticket for the XX-PART-DVD component and store the user information in the Customer Remote Logon Depot. The Customer can also provide access through an alternative service, for example by onboarding a DATAVARD employee and granting them access to the corporate VPN.
Impact: If the connection is not granted, the project timeline might be prolonged.
Action: The Customer shall check with DATAVARD and its basis team for possible remote connection alternatives.
Prerequisite 2: “User authorization”
Details: In order to execute the services described herein, the Customer must create an SAP user with proper authorization and approval to operate within the SAP environment using a combination of dialog and batch processing in the SAP system. Users are expected to be authorized to execute at least the following transactions: /DVD/*, RSA1, ST04, DB02, DB20, SM37, SM50, SM51, ST11, SE16, SE37, SE38, SM59, FILE, STRUST, SMGW.
Impact: If the authorization is not granted, DATAVARD consultants will not be able to perform any activity related to the SAP system.
Action: The Customer shall check with DATAVARD for details on needed authorizations.
Prerequisite 3: “DATAVARD transports”
Details: The Customer must import transports containing the DATAVARD software required to establish the connection; the transports are provided by DATAVARD consultants.
Impact: If the DATAVARD software is not imported into the SAP system, the connection can’t be established.
Action: The Customer shall check with DATAVARD for required software transports.
Hive (Impala)
These prerequisites are meant to be used when the Customer is looking for a connection to a traditional Hadoop platform (Cloudera, Hortonworks, MapR) or a cloud Hadoop distribution (Amazon EMR, Azure HDInsight). A connection to Hive also requires a connection to HDFS, so prerequisites for both services are documented here.
Prerequisite 4: “SAP and Hadoop platform connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and the Hadoop platform. Specifically, outbound traffic from the SAP system must be allowed to the individual Hadoop components (Hive server, Impala daemon, HttpFS, WebHDFS, Datanodes, load balancers, etc.), and inbound traffic coming from the SAP system must be allowed on the Hadoop services. DNS hostname resolution (including reverse resolution) has to work properly between SAP application servers and Hadoop hosts.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “Hadoop components and configuration”
Details: In order to execute the services described herein, the Customer must have at least the following Hadoop components available and configuration applied:
Hadoop Components
Hive 1.1.0 and higher
Impala 2.3.0 and higher (optional)
Default installation of httpFS or webHDFS (httpFS recommended)
Hadoop configuration
Create a Hadoop technical user per connected SAP system and provide its authentication details
Create an HDFS location to be used as a temporary landing zone
Create a Hive database per connected SAP system
Define Sentry/Ranger rules (for secure deployments) for the Hive database and the HDFS location
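The Hadoop configuration steps above can be sketched with standard Hadoop tooling. This is a minimal sketch: the user name, HDFS path, database name, and HiveServer2 host are illustrative assumptions, and in practice the technical user is usually created in the cluster's identity backend (LDAP/AD, Kerberos) rather than locally.

```shell
# Sketch of the Hadoop-side setup for one connected SAP system.
# All names (dvd_dev, paths, hosts) are examples -- replace with site values.

# 1. Technical user: shown as a local user for illustration only;
#    normally created in LDAP/AD or the Kerberos realm.
useradd dvd_dev

# 2. HDFS temporary landing zone owned by the technical user
hdfs dfs -mkdir -p /data/sap/DEV/landing
hdfs dfs -chown dvd_dev:dvd_dev /data/sap/DEV/landing

# 3. Hive database for the connected SAP system
beeline -u "jdbc:hive2://hive.example.com:10000/default" \
        -e "CREATE DATABASE IF NOT EXISTS sap_dev;"

# 4. Sentry/Ranger rules are maintained in the respective admin UI or API;
#    the goal is full access for dvd_dev on sap_dev and the landing zone.
```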
Impact: If the components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its Hadoop platform team for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
HTTP/HTTPS service is running and available in SAP (available by default)
Required SSL certificates are imported into STRUST
The SAP Java Connector 3.0 library is available in the LD_LIBRARY_PATH of the <sid>adm user on every application server
A technical user of type Communication Data is created in SAP
The technical user has the following authorization: object S_RFC with function groups SYST, RFC1, SDIFRUNTIME and activity 16
Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default, a wildcard (*) is set)
Appropriate JDBC drivers are installed on the application servers
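One item above that is easy to verify from the command line is the SAP Java Connector library. A minimal check sketch, assuming the Linux shared-library name libsapjco3.so; run it as the <sid>adm user on each application server:

```shell
# find_jco PATHLIST: print each directory in a colon-separated list
# that contains the SAP Java Connector 3.0 library (libsapjco3.so).
find_jco() {
  echo "$1" | tr ':' '\n' | while read -r dir; do
    if [ -n "$dir" ] && [ -e "$dir/libsapjco3.so" ]; then
      echo "$dir"
    fi
  done
}

# On an application server, run as <sid>adm; no output means the library
# is not on the path and this prerequisite is not met.
find_jco "$LD_LIBRARY_PATH"
```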
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
HDFS
The following prerequisites are suitable for a Customer who seeks a connection to a traditional Hadoop platform but only wants file-based extraction to HDFS (no Hive or Impala). If the Customer also wants a connection to Hive (Impala), use the prerequisites above.
Prerequisite 4: “SAP and Hadoop platform connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and the Hadoop platform. Specifically, outbound traffic from the SAP system must be allowed to the individual Hadoop components (HttpFS, WebHDFS, Datanodes, load balancers, etc.), and inbound traffic coming from the SAP system must be allowed on the Hadoop environment. DNS hostname resolution (including reverse resolution) has to work properly between SAP application servers and Hadoop hosts.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “Hadoop components and configuration”
Details: In order to execute the services described herein, the Customer must have at least the following Hadoop components available and configuration applied:
Hadoop Components
Default installation of httpFS or webHDFS (httpFS recommended)
Hadoop configuration
Create a Hadoop technical user per connected SAP system and provide its authentication details
Create an HDFS location to be used as a landing zone
Define Sentry/Ranger rules (for secure deployments) for the HDFS location
Impact: If the components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its Hadoop platform team for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
HTTP/HTTPS service is running and available in SAP (available by default)
Required SSL certificates are imported into STRUST
If Kerberos authentication is required:
The SAP Java Connector 3.0 library is available in the LD_LIBRARY_PATH of the <sid>adm user on every application server
A technical user of type Communication Data is created in SAP
The technical user has the following authorization: object S_RFC with function groups SYST, RFC1, SDIFRUNTIME and activity 16
Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default, a wildcard (*) is set)
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
AWS S3
The following prerequisites are suitable for a Customer who seeks a connection to AWS S3. If the Customer also wants a connection to AWS Redshift, use the prerequisites for AWS Redshift.
Prerequisite 4: “SAP and AWS connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and AWS. Specifically, outbound traffic from the SAP system must be allowed to the individual AWS services, and inbound traffic coming from the SAP system must be allowed on the AWS services. DNS hostname resolution of AWS services has to work properly on SAP application servers.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “S3 configuration and authentication”
Details: In order to execute the services described herein, the Customer needs to apply the following configuration in their AWS landscape:
Create an S3 bucket or identify an existing bucket to be used
Create a technical user in AWS IAM with privileges to read and create objects in the S3 bucket
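The two steps above can be sketched with the AWS CLI. The bucket name, user name, policy name, and region below are illustrative assumptions, not fixed values.

```shell
# Sketch of the AWS-side setup; all names are examples.

# 1. S3 bucket (skip if an existing bucket is reused)
aws s3api create-bucket --bucket sap-dev-landing --region eu-central-1 \
    --create-bucket-configuration LocationConstraint=eu-central-1

# 2. IAM technical user limited to read/create objects in that bucket
aws iam create-user --user-name sap-dev-technical
aws iam put-user-policy --user-name sap-dev-technical \
    --policy-name sap-dev-landing-rw \
    --policy-document '{
      "Version": "2012-10-17",
      "Statement": [
        {"Effect": "Allow",
         "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
         "Resource": ["arn:aws:s3:::sap-dev-landing",
                      "arn:aws:s3:::sap-dev-landing/*"]}
      ]}'

# Access key pair to hand over as the user's authentication details
aws iam create-access-key --user-name sap-dev-technical
```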
Impact: If the bucket is not created and a user is not provided, then the connection can’t be established.
Action: The Customer shall check with DATAVARD for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
The SAP Java Connector 3.0 library is available in the LD_LIBRARY_PATH of the <sid>adm user on every application server
A technical user of type Communication Data is created in SAP
The technical user has the following authorization: object S_RFC with function groups SYST, RFC1, SDIFRUNTIME and activity 16
Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default, a wildcard (*) is set)
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
AWS Redshift
The following prerequisites are suitable for a Customer who seeks a connection to AWS Redshift. A connection to Redshift also requires a connection to S3, so prerequisites for both services are documented here.
Prerequisite 4: “SAP and AWS connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and AWS. Specifically, outbound traffic from the SAP system must be allowed to the individual AWS services, and inbound traffic coming from the SAP system must be allowed on the AWS services. DNS hostname resolution of AWS services has to work properly on SAP application servers.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “Redshift and S3 configuration and authentication”
Details: In order to execute the services described herein, the Customer needs to apply the following configuration in their AWS landscape:
Create an S3 bucket or identify an existing bucket to be used
Create a technical user in AWS IAM with privileges to read and create objects in the S3 bucket
Create a Redshift cluster
Create a database in Redshift
Create a technical user in Redshift that is the owner of the database created in step 4
Grant authorizations to the technical user so that it can select from pg_catalog.PG_TABLE_DEF and pg_catalog.SVV_TABLE_INFO
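Steps 3 to 6 above can be sketched with the AWS CLI and a SQL client. The cluster identifier, endpoint, database, user names, and passwords are illustrative assumptions, and the catalog grant shown is one way to satisfy the last step (verify the views are actually selectable by the technical user afterwards).

```shell
# Sketch of the Redshift part of the setup; all names are examples.

# 3. Redshift cluster
aws redshift create-cluster --cluster-identifier sap-dev \
    --node-type dc2.large --number-of-nodes 2 \
    --master-username rsadmin --master-user-password 'ChangeMe1'

# 4.-6. Database, technical user, ownership, and catalog access,
# executed with a SQL client (psql shown) against the cluster endpoint.
psql "host=sap-dev.example.redshift.amazonaws.com port=5439 dbname=dev user=rsadmin" <<'SQL'
CREATE DATABASE sap_dev;
CREATE USER dvd_tech PASSWORD 'ChangeMe2';
ALTER DATABASE sap_dev OWNER TO dvd_tech;
GRANT SELECT ON pg_catalog.svv_table_info TO dvd_tech;
-- PG_TABLE_DEF visibility also depends on the user's search_path;
-- confirm both views are selectable after logging on as dvd_tech.
SQL
```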
Impact: If the mentioned steps aren’t completed, the connection can’t be established.
Action: The Customer shall check with DATAVARD for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
The SAP Java Connector 3.0 library is available in the LD_LIBRARY_PATH of the <sid>adm user on every application server
A technical user of type Communication Data is created in SAP
The technical user has the following authorization: object S_RFC with function groups SYST, RFC1, SDIFRUNTIME and activity 16
Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default, a wildcard (*) is set)
Redshift JDBC drivers are installed on the SAP application servers
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
ADLS (gen1 or gen2)
The following prerequisites are suitable for a Customer who seeks a connection to ADLS (gen1 or gen2).
Prerequisite 4: “SAP and Azure connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and Azure. Specifically, outbound traffic from the SAP system must be allowed to the individual Azure services, and inbound traffic coming from the SAP system must be allowed on the Azure services. DNS hostname resolution of Azure services has to work properly on SAP application servers.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “ADLS configuration and authentication”
Details: In order to execute the services described herein, the Customer needs to apply the following configuration in their Azure landscape:
Create an ADLS resource or identify an existing one to be used
Create a directory on ADLS to be used
Create a new App Registration in Azure Active Directory
Generate a secret for the App Registration
Assign correct filesystem privileges (RWX on the target directory and X on all parent directories) on ADLS, so that the App Registration can read from and write to the assigned directory
For ADLS gen2, it is alternatively possible to generate a SAS token that gives complete access to the ADLS resource; this can be done as a substitute for steps 3-5.
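Steps 1 to 5 above can be sketched with the Azure CLI (ADLS gen2 commands shown; gen1 uses the older az dls command group). The storage account, filesystem, and directory names are illustrative assumptions.

```shell
# Sketch of the Azure-side setup; all names are examples.

# 3.-4. App Registration, its service principal, and a client secret
APP_ID=$(az ad app create --display-name sap-dev-adls --query appId -o tsv)
az ad sp create --id "$APP_ID"
az ad app credential reset --id "$APP_ID"   # prints the generated secret

# 1.-2. Filesystem and target directory on the gen2 account
az storage fs create --account-name sapdevadls --name sapdata
az storage fs directory create --account-name sapdevadls \
    --file-system sapdata --name landing

# 5. POSIX ACLs: RWX on the directory, X on every parent (here the root).
#    Note: --acl replaces the path's full ACL; merge with existing entries.
OBJ_ID=$(az ad sp show --id "$APP_ID" --query id -o tsv)
az storage fs access set --account-name sapdevadls --file-system sapdata \
    --path landing --acl "user:$OBJ_ID:rwx"
az storage fs access set --account-name sapdevadls --file-system sapdata \
    --path / --acl "user:$OBJ_ID:--x"
```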
Impact: If the mentioned steps aren’t completed, the connection can’t be established.
Action: The Customer shall check with DATAVARD for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
HTTP/HTTPS service is running and available in SAP (available by default)
Microsoft SSL certificates are imported into STRUST
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
Azure BLOB
The following prerequisites are suitable for a Customer who seeks a connection to Azure BLOB.
Prerequisite 4: “SAP and Azure connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and Azure. Specifically, outbound traffic from the SAP system must be allowed to the individual Azure services, and inbound traffic coming from the SAP system must be allowed on the Azure services. DNS hostname resolution of Azure services has to work properly on SAP application servers.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “BLOB configuration and authentication”
Details: In order to execute the services described herein, the Customer needs to apply the following configuration in their Azure landscape:
Create a BLOB resource or identify an existing one to be used
Create a container on BLOB to be used
Generate a SAS token that gives access to the BLOB resource.
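The steps above can be sketched with the Azure CLI. The storage account name, container name, permission set, and expiry date are illustrative assumptions.

```shell
# Sketch of the BLOB setup; all names are examples.

# Container on the storage account (skip if an existing one is reused)
az storage container create --account-name sapdevblob --name sap-landing

# SAS token scoped to the container; permissions: add, create, delete,
# list, read, write. Hand the printed token over for the connection setup.
az storage container generate-sas --account-name sapdevblob \
    --name sap-landing --permissions acdlrw --expiry 2026-12-31T23:59Z
```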
Impact: If the mentioned steps aren’t completed, the connection can’t be established.
Action: The Customer shall check with DATAVARD for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
HTTP/HTTPS service is running and available in SAP (available by default)
Microsoft SSL certificates are imported into STRUST
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
Azure Databricks
The following prerequisites are suitable for a Customer who seeks a connection to Azure Databricks. Azure Databricks also requires a connection to ADLS gen1, so those prerequisites are also included here.
Prerequisite 4: “SAP and Azure connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and Azure. Specifically, outbound traffic from the SAP system must be allowed to the individual Azure services, and inbound traffic coming from the SAP system must be allowed on the Azure services. DNS hostname resolution of Azure services has to work properly on SAP application servers.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “Databricks and ADLS configuration and authentication”
Details: In order to execute the services described herein, the Customer needs to apply the following configuration in their Azure landscape:
Create an ADLS resource or identify an existing one to be used
Create a directory on ADLS to be used
Create a new App Registration in Azure Active Directory
Generate a secret for the App Registration
Assign correct filesystem privileges (RWX on the target directory and X on all parent directories) on ADLS, so that the App Registration can read from and write to the assigned directory
Create a Databricks resource and create a cluster
Create a database in the Databricks cluster
Add the following configuration to the cluster’s advanced parameters:
spark.sql.hive.metastore.version 1.2.1
spark.sql.hive.metastore.jars builtin
spark.hadoop.dfs.adls.oauth2.access.token.provider.type ClientCredential
spark.hadoop.dfs.adls.oauth2.refresh.url https://login.microsoftonline.com/<your_tenant>/oauth2/token
spark.hadoop.dfs.adls.oauth2.credential <your oauth secret>
spark.hadoop.dfs.adls.oauth2.client.id <client ID of your app registration>
In User settings, generate a new Access token
Impact: If the mentioned steps aren’t completed, the connection can’t be established.
Action: The Customer shall check with DATAVARD for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
The SAP Java Connector 3.0 library is available in the LD_LIBRARY_PATH of the <sid>adm user on every application server
A technical user of type Communication Data is created in SAP
The technical user has the following authorization: object S_RFC with function groups SYST, RFC1, SDIFRUNTIME and activity 16
HTTP/HTTPS service is running and available in SAP (available by default)
Microsoft SSL certificates are imported into STRUST
Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default, a wildcard (*) is set)
Hive JDBC drivers are installed on the SAP application servers
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
Microsoft SQL Server
The following prerequisites are suitable for a Customer who seeks a connection to Microsoft SQL Server.
Prerequisite 4: “SAP and MSSQL server connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and the MSSQL server. Specifically, outbound traffic from the SAP system must be allowed to the MSSQL server, and inbound traffic coming from the SAP system must be allowed on the MSSQL server. DNS hostname resolution of the SQL server has to work properly on SAP application servers.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “MSSQL server configuration and authentication”
Details: In order to execute the services described herein, the Customer needs to apply the following configuration on their MSSQL server:
Create a database in the MSSQL server to be used or identify an existing one
Create a technical user that is allowed to interact with the database
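The two steps above can be sketched with sqlcmd. The server name, database name, login, and passwords are illustrative assumptions, and db_owner is just one possible role that satisfies "allowed to interact with the database"; a narrower role set can be agreed instead.

```shell
# Sketch of the MSSQL-side setup; all names are examples.

# 1. Database to be used by the connection
sqlcmd -S mssql.example.com -U sa -P 'AdminPw1!' \
       -Q "CREATE DATABASE SAP_OFFLOAD"

# 2. Technical login and database user; db_owner shown for simplicity
sqlcmd -S mssql.example.com -U sa -P 'AdminPw1!' \
       -Q "CREATE LOGIN dvd_tech WITH PASSWORD = 'ChangeMe1!'"
sqlcmd -S mssql.example.com -U sa -P 'AdminPw1!' -d SAP_OFFLOAD \
       -Q "CREATE USER dvd_tech FOR LOGIN dvd_tech; ALTER ROLE db_owner ADD MEMBER dvd_tech"
```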
Impact: If the mentioned steps aren’t completed, the connection can’t be established.
Action: The Customer shall check with DATAVARD for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
One of the following SAP releases, with at least the indicated patch level:
7.21: patch 619
7.22: patch 21
7.42: patch 315
7.45: patch 29
Application servers are running on either a Windows or Linux operating system
Microsoft SQL Server ODBC drivers are installed on every application server
A secondary DB connection is added through transaction DBCO and the connection is successfully tested with report ADBC_TEST_CONNECTION
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
Google Cloud Storage
The following prerequisites are suitable for a Customer who seeks a connection to Google Cloud Storage (GCS).
Prerequisite 4: “SAP and GCP connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and GCP. Specifically, outbound traffic from the SAP system must be allowed to the individual GCP services, and inbound traffic coming from the SAP system must be allowed on the GCP services. DNS hostname resolution of GCP services has to work properly on SAP application servers.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “GCS configuration and authentication”
Details: In order to execute the services described herein, the Customer needs to apply the following configuration in their GCP landscape:
Create a GCS bucket or identify an existing one to be used
Create a service account to be used and generate a key in JSON format
Assign correct privileges to the service account for the bucket
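The steps above can be sketched with the Google Cloud CLI. The project, service account, bucket name, and region are illustrative assumptions.

```shell
# Sketch of the GCP-side setup; all names are examples.

# 2. Service account plus a JSON key to hand over
gcloud iam service-accounts create sap-dev-sa \
    --display-name "SAP DEV extraction" --project my-project
gcloud iam service-accounts keys create sap-dev-sa-key.json \
    --iam-account sap-dev-sa@my-project.iam.gserviceaccount.com

# 1. + 3. Bucket and object privileges for the service account
gsutil mb -l europe-west3 -p my-project gs://sap-dev-landing
gsutil iam ch \
    serviceAccount:sap-dev-sa@my-project.iam.gserviceaccount.com:objectAdmin \
    gs://sap-dev-landing
```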
Impact: If the mentioned steps aren’t completed, the connection can’t be established.
Action: The Customer shall check with DATAVARD for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
The SAP Java Connector 3.0 library is available in the LD_LIBRARY_PATH of the <sid>adm user on every application server
A technical user of type Communication Data is created in SAP
The technical user has the following authorization: object S_RFC with function groups SYST, RFC1, SDIFRUNTIME and activity 16
Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default, a wildcard (*) is set)
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.
Google BigQuery
The following prerequisites are suitable for a Customer who seeks a connection to Google BigQuery. Google BigQuery also requires a connection to Google Cloud Storage, so those prerequisites are also included here.
Prerequisite 4: “SAP and GCP connectivity”
Details: In order to execute the services described herein, the Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and GCP. Specifically, outbound traffic from the SAP system must be allowed to the individual GCP services, and inbound traffic coming from the SAP system must be allowed on the GCP services. DNS hostname resolution of GCP services has to work properly on SAP application servers.
Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.
Action: The Customer shall check with DATAVARD and its security/network team for details on further steps.
Prerequisite 5: “Google BigQuery & GCS configuration and authentication”
Details: In order to execute the services described herein, the Customer needs to apply the following configuration in their GCP landscape:
Create a GCS bucket or identify an existing one to be used
Create a service account to be used and generate a key in JSON format
Assign correct privileges to the service account for the bucket
Create a BigQuery resource or identify an existing one to be used
Create a dataset or identify an existing one to be used
Assign correct privileges to the service account for the BigQuery resource and dataset
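Steps 4 to 6 above can be sketched with the Google Cloud CLI. The project, dataset, location, service account, and roles shown are illustrative assumptions; dataEditor plus jobUser is one role combination that allows the service account to write data and run load/query jobs.

```shell
# Sketch of the BigQuery part of the setup; all names are examples.

# 4.-5. Enable the BigQuery API on the project and create the dataset
gcloud services enable bigquery.googleapis.com --project my-project
bq mk --dataset --location=EU my-project:sap_dev

# 6. Privileges for the service account on BigQuery
gcloud projects add-iam-policy-binding my-project \
    --member "serviceAccount:sap-dev-sa@my-project.iam.gserviceaccount.com" \
    --role roles/bigquery.dataEditor
gcloud projects add-iam-policy-binding my-project \
    --member "serviceAccount:sap-dev-sa@my-project.iam.gserviceaccount.com" \
    --role roles/bigquery.jobUser
```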
Impact: If the mentioned steps aren’t completed, the connection can’t be established.
Action: The Customer shall check with DATAVARD for details on further steps.
Prerequisite 6: “SAP system configuration”
Details: In order to execute the services described herein, the Customer must have the following SAP components available and configuration in place on the SAP application servers and SAP system:
SAP release 7.01 SP15 or higher
The SAP Java Connector 3.0 library is available in the LD_LIBRARY_PATH of the <sid>adm user on every application server
A technical user of type Communication Data is created in SAP
The technical user has the following authorization: object S_RFC with function groups SYST, RFC1, SDIFRUNTIME and activity 16
Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default, a wildcard (*) is set)
Impact: If the SAP components are not available and the configuration is not in place, no data operation can be performed.
Action: The Customer shall check with DATAVARD and its SAP team for details on further steps.