(SM-1911) SOW Prerequisites

The purpose of this page is to provide a single source of truth for SOW prerequisites for each storage type that Storage Management (SM) supports. The content should be in a form that can be copy-pasted into Word documents with minimal need for adjustments.

 

General prerequisites

These prerequisites are necessary for every storage type in SM and are complementary to the prerequisites specific to each individual storage.

Prerequisite 1: “Remote connection - OSS or alternative connection”

Details: In order to execute the services described herein remotely Customer must open an OSS connection (Type R3 Support) by creating an OSS ticket on the XX-PART-DVD component and storing the user information in the secure area. Customer can also provide access through an alternative service, for example by onboarding a DATAVARD employee and granting them access to the corporate VPN.

Impact: If the connection is not granted, the project timeline might be prolonged.

Action: The Customer shall check with DATAVARD and its basis team for possible remote connection alternatives.

Prerequisite 2: “User authorization”

Details: In order to execute the services described herein Customer must create an SAP user with proper authorization and approval to operate within the SAP environment using a combination of dialog and batch processing in the SAP system. The user is expected to be authorized to execute at least the following transactions: /DVD/*, RSA1, ST04, DB02, DB20, SM37, SM50, SM51, ST11, SE16, SE37, SE38, SM59, FILE, STRUST, SMGW.

Impact: If the authorization is not granted, DATAVARD consultants will not be able to perform any activity related to the SAP system.

Action: The Customer shall check with DATAVARD for details on needed authorizations.

 

Prerequisite 3: “DATAVARD transports”

Details: Customer must import the transports, provided by DATAVARD consultants, that contain the DATAVARD software required to establish the connection.

Impact: If the DATAVARD software is not imported into the SAP system, the connection can’t be established.

Action: The Customer shall check with DATAVARD for required software transports.

Hive (Impala)

These prerequisites are meant to be used when the customer is looking for a connection to a traditional Hadoop platform (Cloudera, Hortonworks, MapR) or a cloud-hosted Hadoop platform (Amazon EMR, Azure HDInsight). A connection to Hive also requires a connection to HDFS, so prerequisites for both services are documented here.

Prerequisite 4: “SAP and Hadoop platform connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and Hadoop platform. Specifically, outbound traffic from the SAP system must be allowed to individual Hadoop components (Hive server, Impala daemon, HttpFS, WebHDFS, Datanodes, load balancers, etc.) and inbound traffic coming from the SAP system must be allowed on Hadoop services. DNS hostname resolution (also reverse) has to be working properly between SAP application servers and Hadoop hosts.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.
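The forward and reverse DNS requirement above can be smoke-tested from each SAP application server with a short script. This is only a sketch; the hostnames in the commented example are placeholders for the customer's actual Hadoop nodes.

```python
import socket

def check_dns(host):
    """Forward-resolve host, then reverse-resolve the returned address.
    Both lookups must succeed on every SAP application server for every
    Hadoop host. Raises socket.gaierror/socket.herror on failure."""
    ip = socket.gethostbyname(host)                    # forward lookup
    name, _aliases, _addrs = socket.gethostbyaddr(ip)  # reverse lookup
    return ip, name

# Example (placeholder hostnames):
# for host in ["hive-server.example.com", "httpfs.example.com"]:
#     print(host, "->", check_dns(host))
```

Running this for every Hadoop host (and the equivalent check from the Hadoop side toward the SAP application servers) verifies the resolution part of this prerequisite before DATAVARD activities start.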

Prerequisite 5: “Hadoop components and configuration”

Details: In order to execute the services described herein Customer must have at least the following Hadoop components available and configuration applied.

Hadoop Components

  1. Hive 1.1.0 and higher 

  2. Impala 2.3.0 and higher (optional)

  3. The default installation of httpFS or webHDFS (httpFS recommended)

Hadoop configuration

  1. Create Hadoop technical user to be used per connected SAP system and provide its authentication details

  2. Create HDFS location to be used as a temporary landing zone

  3. Create Hive database per connected SAP system

  4. Define Sentry/Ranger rules (for secure deployment) for Hive database and HDFS location

Impact: If components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its Hadoop platform team for details on steps needed.

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. Http/https service running and available in SAP (by default available)

  3. Required SSL certificates are imported to STRUST

  4. SAP Java Connector 3.0 library is available in LD_LIBRARY_PATH of <sid>adm user on every application server

  5. Technical user of type communication data is created in SAP

  6. The technical user has following authorization: object S_RFC with function group SYST, RFC1, SDIFRUNTIME and activity 16

  7. Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default wildcard (*) is set)

  8. Appropriate JDBC drivers are installed on the application server

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.
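Item 7 above is controlled by the SAP gateway registration ACL (reginfo file). If the default wildcard is not acceptable to the customer's security team, the registration can be restricted to the relevant program and hosts. A sketch of such an entry, with placeholder hostnames:

```text
# Gateway reginfo sketch (hostnames are placeholders):
# allow only DATAVARD_JAVA_CONN to register, and only from the SAP
# application servers themselves.
P TP=DATAVARD_JAVA_CONN HOST=sapapp1.example.com,sapapp2.example.com ACCESS=internal,local
```

The exact file location and syntax are governed by the gateway parameters (gw/reg_info); the customer's SAP Basis team should adapt this to their existing ACLs.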

 

HDFS

The following prerequisites are suitable for customers who seek a connection to a traditional Hadoop platform but only want file-based extraction to HDFS (no Hive or Impala). If the customer also wants a connection to Hive (Impala), use the prerequisites above.

Prerequisite 4: “SAP and Hadoop platform connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and Hadoop platform. Specifically, outbound traffic from the SAP system must be allowed to individual Hadoop components (HttpFS, WebHDFS, Datanodes, load balancers, etc.) and inbound traffic coming from the SAP system must be allowed on the Hadoop environment. DNS hostname resolution (also reverse) has to be working properly between SAP application servers and Hadoop hosts.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “Hadoop components and configuration”

Details: In order to execute the services described herein Customer must have at least the following Hadoop components available and configuration applied.

Hadoop Components

  1. Default installation of httpFS or webHDFS (httpFS recommended)

Hadoop configuration

  1. Create Hadoop technical user to be used per connected SAP system and provide its authentication details

  2. Create HDFS location to be used as a landing zone

  3. Define Sentry/Ranger rules (for secure deployment) for the HDFS location

Impact: If components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its Hadoop platform team for details on steps needed.
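A quick functional check of the httpFS/webHDFS component is a direct call to its REST endpoint. The URL shape is sketched below; the host, port, path, and user are placeholders (HttpFS listens on port 14000 by default, and simple user.name authentication is assumed rather than Kerberos).

```python
def webhdfs_url(host, port, path, op, user):
    """Build a WebHDFS/HttpFS REST URL, e.g. op=LISTSTATUS on the
    landing-zone directory. Assumes simple (user.name) authentication."""
    return f"http://{host}:{port}/webhdfs/v1{path}?op={op}&user.name={user}"

# An HTTP GET on this URL from an SAP application server should return a
# JSON directory listing once connectivity and the landing zone exist:
url = webhdfs_url("httpfs.example.com", 14000, "/sap/landing", "LISTSTATUS", "sap_dvd")
```

If the GET fails, the problem is usually either the firewall rules from Prerequisite 4 or the landing-zone permissions from the Hadoop configuration steps above.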

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. Http/https service running and available in SAP (by default available)

  3. Required SSL certificates are imported to STRUST

If Kerberos authentication is required:

  1. SAP Java Connector 3.0 library is available in LD_LIBRARY_PATH of <sid>adm user on every application server

  2. Technical user of type communication data is created in SAP

  3. The technical user has following authorization: object S_RFC with function group SYST, RFC1, SDIFRUNTIME, and activity 16

  4. Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default wildcard (*) is set)

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.

AWS S3

The following prerequisites are suitable for customers who seek a connection to AWS S3. If the customer also wants a connection to AWS Redshift, use the prerequisites for AWS Redshift.

Prerequisite 4: “SAP and AWS connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and AWS. Specifically, outbound traffic from the SAP system to individual AWS services must be allowed, and inbound traffic coming from the SAP system must be allowed on AWS services. DNS hostname resolution of AWS services has to be working properly on SAP application servers.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “S3 configuration and authentication”

Details: In order to execute the services described herein Customer applies the following configuration on its AWS landscape:

  1. Create an S3 bucket, or identify an existing bucket to be used

  2. Create a technical user in AWS IAM with privileges to read and create objects in the S3 bucket

Impact: If the bucket is not created and a user is not provided, then the connection can’t be established.

Action: The Customer shall check with DATAVARD for details on steps needed.
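The privileges in step 2 can be expressed as an IAM policy attached to the technical user. A minimal sketch covering read, create, and listing on the bucket (the bucket name is a placeholder; the customer's security team may want to scope it further):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::example-sap-landing"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::example-sap-landing/*"
    }
  ]
}
```

Note that bucket-level actions (s3:ListBucket) apply to the bucket ARN, while object-level actions apply to the `/*` resource.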

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. SAP Java Connector 3.0 library is available in LD_LIBRARY_PATH of <sid>adm user on every application server

  3. Technical user of type communication data is created in SAP

  4. The technical user has following authorization: object S_RFC with function group SYST, RFC1, SDIFRUNTIME, and activity 16

  5. Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default wildcard (*) is set)

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.

AWS Redshift

The following prerequisites are suitable for customers who seek a connection to AWS Redshift. A connection to Redshift also requires a connection to S3, so prerequisites for both services are documented here.

Prerequisite 4: “SAP and AWS connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and AWS. Specifically, outbound traffic from the SAP system must be open to individual AWS services, and inbound traffic coming from the SAP system must be allowed on AWS services. DNS hostname resolution of AWS services has to be working properly on SAP application servers.

Impact: If the communication is not configured prior to DATAVARD activities, connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “Redshift and S3 configuration and authentication”

Details: In order to execute the services described herein Customer applies the following configuration on its AWS landscape:

  1. Create an S3 bucket, or identify an existing bucket to be used

  2. Create a technical user in AWS IAM with privileges to read and create objects in S3 bucket

  3. Create a Redshift cluster

  4. Create a database in Redshift

  5. Create a technical user in Redshift that is the owner of the database created in step 4

  6. Grant authorizations to the technical user so it will be able to select from pg_catalog.PG_TABLE_DEF and pg_catalog.SVV_TABLE_INFO

Impact: If the mentioned steps aren’t completed, the connection can’t be established.

Action: The Customer shall check with DATAVARD for details on steps needed.
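Steps 4-6 above can be sketched in SQL. All names and the password are placeholders, and the exact grant statements may vary with the customer's security policy:

```sql
-- Step 5: technical user (placeholder name and password)
CREATE USER dvd_user PASSWORD 'ChangeMe1';

-- Step 4: database owned by the technical user
CREATE DATABASE dvd_offload OWNER dvd_user;

-- Step 6: allow the user to read the catalog views SM relies on
GRANT SELECT ON pg_catalog.pg_table_def TO dvd_user;
GRANT SELECT ON pg_catalog.svv_table_info TO dvd_user;
```

Redshift passwords must contain upper-case, lower-case, and numeric characters, so the placeholder above follows that rule.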

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. SAP Java Connector 3.0 library is available in LD_LIBRARY_PATH of <sid>adm user on every application server

  3. Technical user of type communication data is created in SAP

  4. The technical user has following authorization: object S_RFC with function group SYST, RFC1, SDIFRUNTIME and activity 16

  5. Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default wildcard (*) is set)

  6. Redshift JDBC drivers are installed on the SAP application servers

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.

ADLS (gen1 or gen2)

The following prerequisites are suitable for customers who seek a connection to ADLS (gen1 or gen2).

Prerequisite 4: “SAP and Azure connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and Azure. Specifically, outbound connection must be open to individual Azure services, and inbound traffic coming from the SAP system must be allowed on Azure services. DNS hostname resolution of Azure services has to be working properly on SAP application servers.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “ADLS configuration and authentication”

Details: In order to execute the services described herein Customer applies the following configuration on its Azure landscape:

  1. Create an ADLS resource or identify an existing one to be used

  2. Create a directory on ADLS to be used

  3. Create a new App Registration in Azure Active Directory

  4. Generate a secret for the App registration

  5. Assign correct filesystem privileges (RWX on target directory and X on all parent directories) on ADLS, so the App Registration can read/write to the assigned directory

  6. For ADLS gen2, it is possible to generate only a SAS token that gives complete access to the ADLS resource; this can be done as a substitute for steps 3-5.

Impact: If the mentioned steps aren’t completed, the connection can’t be established.

Action: The Customer shall check with DATAVARD for details on steps needed.

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. Http/https service running and available in SAP (by default available)

  3. Microsoft SSL certificates are imported to STRUST

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.

Azure BLOB

The following prerequisites are suitable for customers who seek a connection to Azure BLOB.

Prerequisite 4: “SAP and Azure connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and Azure. Specifically, outbound connection must be open to individual Azure services, and inbound traffic coming from the SAP system must be allowed on Azure services. DNS hostname resolution of Azure services has to be working properly on SAP application servers.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “BLOB configuration and authentication”

Details: In order to execute the services described herein Customer applies the following configuration on its Azure landscape:

  1. Create a BLOB resource or identify an existing one to be used

  2. Create a container on BLOB to be used

  3. Generate a SAS token that gives access to the BLOB resource.

 

Impact: If the mentioned steps aren’t completed, the connection can’t be established.

Action: The Customer shall check with DATAVARD for details on steps needed.

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. Http/https service running and available in SAP (by default available)

  3. Microsoft SSL certificates are imported to STRUST

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.

Azure Databricks

The following prerequisites are suitable for customers who seek a connection to Azure Databricks. Azure Databricks also requires a connection to ADLS gen1, so those prerequisites are also included here.

Prerequisite 4: “SAP and Azure connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and Azure. Specifically, outbound traffic from the SAP system must be allowed to individual Azure services, and inbound traffic coming from the SAP system must be allowed on Azure services. DNS hostname resolution of Azure services has to be working properly on SAP application servers.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “Databricks and ADLS configuration and authentication”

Details: In order to execute the services described herein Customer applies the following configuration on its Azure landscape:

  1. Create an ADLS resource or identify an existing one to be used

  2. Create a directory on ADLS to be used

  3. Create a new App Registration on Azure Active Directory

  4. Generate a secret for the App registration

  5. Assign correct filesystem privileges (RWX on target directory and X on all parent directories) on ADLS, so the App Registration can read/write to the assigned directory

  6. Create a Databricks resource and create a cluster

  7. Create a database in the Databricks cluster

  8. Add the following configuration to the cluster’s advanced parameters:
    spark.sql.hive.metastore.version 1.2.1
    spark.sql.hive.metastore.jars builtin
    spark.hadoop.dfs.adls.oauth2.access.token.provider.type ClientCredential
    spark.hadoop.dfs.adls.oauth2.refresh.url https://login.microsoftonline.com/<your_tenant>/oauth2/token
    spark.hadoop.dfs.adls.oauth2.credential <your oauth secret>
    spark.hadoop.dfs.adls.oauth2.client.id <client ID of your app registration>

  9. In User settings, generate a new Access token

Impact: If the mentioned steps aren’t completed, the connection can’t be established.

Action: The Customer shall check with DATAVARD for details on steps needed.

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. SAP Java Connector 3.0 library is available in LD_LIBRARY_PATH of <sid>adm user on every application server

  3. Technical user of type communication data is created in SAP

  4. The technical user has following authorization: object S_RFC with function group SYST, RFC1, SDIFRUNTIME and activity 16

  5. Http/https service running and available in SAP (by default available)

  6. Microsoft SSL certificates are imported to STRUST

  7. Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default wildcard (*) is set)

  8. Hive JDBC drivers are installed on SAP application servers

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.

Microsoft SQL Server

The following prerequisites are suitable for customers who seek a connection to Microsoft SQL Server.

Prerequisite 4: “SAP and MSSQL connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and the MSSQL server. Specifically, an outbound connection must be open to the MSSQL server, and inbound traffic coming from the SAP system must be allowed on the MSSQL server. DNS hostname resolution of the SQL server has to be working properly on SAP application servers.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “MSSQL server configuration and authentication”

Details: In order to execute the services described herein Customer applies the following configuration on its MSSQL server:

  1. Create a database in the MSSQL server to be used or identify an existing one

  2. Create a technical user that is allowed to interact with the database

Impact: If the mentioned steps aren’t completed, the connection can’t be established.

Action: The Customer shall check with DATAVARD for details on steps needed.

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. One of the following SAP kernel releases, at the listed patch level or higher:

    1. 7.21: 619

    2. 7.22: 21

    3. 7.42: 315

    4. 7.45: 29

  2. Application servers are running either on Windows or Linux operating system

  3. Microsoft SQL server ODBC drivers are installed on every application server

  4. Secondary DB connection is added through t-code DBCO and the connection is successfully tested through report ADBC_TEST_CONNECTION 

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.
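Item 3 above (ODBC drivers on Linux application servers) typically means registering the Microsoft driver in odbcinst.ini and defining a data source in odbc.ini on each application server. A sketch with placeholder values; the driver name, library filename, and paths depend on the installed driver version and the customer's distribution:

```text
# /etc/odbcinst.ini (sketch; library path depends on the driver package)
[ODBC Driver 17 for SQL Server]
Driver = /opt/microsoft/msodbcsql17/lib64/libmsodbcsql-17.so

# /etc/odbc.ini (sketch; server, port, and database are placeholders)
[MSSQL_DVD]
Driver = ODBC Driver 17 for SQL Server
Server = tcp:mssql.example.com,1433
Database = dvd_offload
```

The DBCO entry from item 4 then references this connection, and report ADBC_TEST_CONNECTION confirms it end to end.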

Google Cloud Storage

The following prerequisites are suitable for customers who seek a connection to Google Cloud Storage (GCS).

Prerequisite 4: “SAP and GCP connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and GCP. Specifically, outbound traffic from the SAP system must be allowed to individual GCP services, and inbound traffic coming from the SAP system must be allowed on GCP services. DNS hostname resolution of GCP services has to be working properly on SAP application servers.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “GCS configuration and authentication”

Details: In order to execute the services described herein Customer applies the following configuration on its GCP landscape:

  1. Create a GCS bucket or identify an existing one to be used

  2. Create a service account to be used and generate a key in JSON format

  3. Assign correct privileges to the service account for the bucket

Impact: If the mentioned steps aren’t completed, the connection can’t be established.

Action: The Customer shall check with DATAVARD for details on steps needed.

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. SAP Java Connector 3.0 library is available in LD_LIBRARY_PATH of <sid>adm user on every application server

  3. Technical user of type communication data is created in SAP

  4. The technical user has following authorization: object S_RFC with function group SYST, RFC1, SDIFRUNTIME, and activity 16

  5. Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default wildcard (*) is set)

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.

 

Google BigQuery

The following prerequisites are suitable for customers who seek a connection to Google BigQuery. Google BigQuery also requires a connection to Google Cloud Storage, so those prerequisites are also included here.

Prerequisite 4: “SAP and GCP connectivity”

Details: In order to execute the services described herein Customer must have a working network connection with sufficient bandwidth and stability between the SAP system and GCP. Specifically, outbound traffic from the SAP system must be allowed to individual GCP services, and inbound traffic coming from the SAP system must be allowed on GCP services. DNS hostname resolution of GCP services has to be working properly on SAP application servers.

Impact: If the communication is not configured prior to DATAVARD activities, the connection can’t be established.

Action: The Customer shall check with DATAVARD and its security/network team for details on steps needed.

Prerequisite 5: “Google BigQuery & GCS configuration and authentication”

Details: In order to execute the services described herein Customer applies the following configuration on its GCP landscape:

  1. Create a GCS resource or identify an existing one to be used

  2. Create a service account to be used and generate a key in JSON format

  3. Assign correct privileges to the service account for the bucket

  4. Create a BigQuery resource or identify an existing one to be used

  5. Create a dataset or identify an existing one to be used

  6. Assign correct privileges to the service account for the BigQuery resource and dataset

Impact: If the mentioned steps aren’t completed, the connection can’t be established.

Action: The Customer shall check with DATAVARD for details on steps needed.

Prerequisite 6: “SAP system configuration”

Details: In order to execute the services described herein Customer must have the following SAP components available and configuration in place for SAP application servers and SAP system.

  1. SAP release 7.01 SP15 or higher

  2. SAP Java Connector 3.0 library is available in LD_LIBRARY_PATH of <sid>adm user on every application server

  3. Technical user of type communication data is created in SAP

  4. The technical user has following authorization: object S_RFC with function group SYST, RFC1, SDIFRUNTIME, and activity 16

  5. Program DATAVARD_JAVA_CONN is allowed to register on the SAP gateway (by default wildcard (*) is set)

Impact: If the SAP components are not available and configuration not set, no data operation can be performed.

Action: The Customer shall check with DATAVARD and its SAP team for details on steps needed.