...

Possible issues and what to do

Possible issue: HttpFS node hostname cannot be resolved
What to do: Hadoop host resolution can fail for a multitude of reasons. To be on the safe side, Hadoop IP↔host pairs can be added to the /etc/hosts file of each SAP application server (on Windows: C:\Windows\System32\drivers\etc\hosts).
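
As an illustration, the entries could look like the following (host names and addresses are placeholders, not values from an actual installation):

    # Illustrative /etc/hosts entries mapping Hadoop hosts:
    10.0.0.11   hadoop-httpfs01.example.com   hadoop-httpfs01
    10.0.0.21   hadoop-dn01.example.com       hadoop-dn01
    10.0.0.22   hadoop-dn02.example.com       hadoop-dn02
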
Possible issue: HttpFS/WebHDFS service port is no longer reachable
What to do: Check the availability of the Hadoop service from the SAP application server OS using 'telnet <host> <service_port>'. It is possible that a network team, unaware that these ports need to stay open, closes them for security hardening purposes, or that the Hadoop cluster is simply down.

If WebHDFS is used, the datanode service (port 1022) also needs to be reachable on all HDFS datanodes. There is an unresolved issue with specific SAP kernel versions and host architectures that causes redirects to fail (SAP ignores the 307 redirect from WebHDFS to the datanode). In this case, the HttpFS service has to be used to avoid redirects.
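
For example, the checks could look like this (host names are placeholders; 14000 is a common HttpFS default port and may differ in a given cluster):

    # Run from the SAP application server OS:
    telnet hadoop-httpfs01.example.com 14000   # HttpFS service port
    telnet hadoop-dn01.example.com 1022        # WebHDFS only: datanode service port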

Possible issue: HttpFS service has failed over to an alternate host
What to do: Check in the Hadoop cluster manager (Cloudera Manager / Ambari) that the service (HttpFS/WebHDFS) is still running on the host defined in the RFC destination.
It is possible to define an HA RFC destination in the storage configuration to be used as a fallback if the primary RFC destination returns an error during the connection attempt.

Possible issue: SSL on HttpFS is active, but the RFC destination is not set as SSL-active
What to do: Change the settings on the Logon & Security tab in SM59 to SSL-active.

Possible issue: The HttpFS service is SSL-secured and the RFC destination is set to use SSL, but certificates are missing in STRUST
What to do: Add the required certificates to STRUST. Details can be found in the Hadoop Storage setup documentation.
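
If the certificates are not at hand, one way to obtain the server's certificate chain for import into STRUST is shown below (host and port are placeholders; the HttpFS port may differ):

    # Print the certificate chain presented by the SSL-secured HttpFS service:
    openssl s_client -connect hadoop-httpfs01.example.com:14000 -showcerts </dev/null
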
Possible issue: The RFC destination is set as SSL-active, but the HttpFS service is not SSL-secured
What to do: Disable SSL for the RFC destination.

Possible issue: The HTTP/HTTPS service is not active
What to do: In transaction SMICM, choose Goto → Services. Make sure that HTTP (HTTPS if SSL is used) has a port number filled in and is active.
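
Beyond the individual checks above, a quick end-to-end test of the HttpFS/WebHDFS REST endpoint can be run from the SAP application server OS, for example with curl (host, port, and user below are placeholders; on a kerberized cluster, SPNEGO options such as 'curl --negotiate -u :' would be needed instead of the user.name parameter):

    # List the HDFS root directory via the WebHDFS REST API:
    curl "http://hadoop-httpfs01.example.com:14000/webhdfs/v1/?op=LISTSTATUS&user.name=hdfs"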

...

Storage type HADOOP uses the Java connector only for Kerberos authentication. If the Hadoop cluster is not kerberized, this section is not relevant.
The following page contains detailed information on possible issues related to the Java connector setup.
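
Before investigating the Java connector itself, it can help to confirm at OS level that the Kerberos credentials work at all (the principal and keytab path below are placeholders):

    # Obtain a ticket with the keytab used for the Hadoop connection, then list it:
    kinit -kt /etc/security/keytabs/sap_hadoop.keytab sapuser@EXAMPLE.COM
    klist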

...