Consuming Data from Datasphere to Azure Data Factory via ODBC


Prerequisites:

- Access: access to both Azure Data Factory (ADF) and Datasphere.
- Credentials: Datasphere and ADF credential details.

Connect Datasphere to Azure Data Factory

DATASPHERE PART:

Log in to Datasphere -> Space Management -> Choose the space and select Edit

 

Click Create and make sure that Expose for consumption by default is enabled.

Copy the Database Username, Hostname, Port, and Password.

 

Go to System -> Configuration -> IP Allowlist -> Trusted IPs.
Add your external IPv4 address here, not the internal IPv4 address.

 

To find your external IPv4 address, use this URL: What Is My IP Address – See Your Public Address – IPv4 & IPv6. Then add and save the external IPv4 address in Datasphere’s IP Allowlist. If you prefer to fetch the address from a script, see the sketch below.
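As an alternative to the website, here is a minimal Python sketch that prints the public IPv4 directly; it assumes the free ipify endpoint (https://api.ipify.org), but any equivalent service works the same way:

import urllib.request

# Query a public "what is my IP" endpoint; it returns the address as plain text.
with urllib.request.urlopen("https://api.ipify.org") as resp:
    print("Public IPv4 to allowlist:", resp.read().decode("utf-8"))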

ODBC PART:

- Install the SAP HANA ODBC driver (HDBODBC) from SAP Development Tools (ondemand.com) on your system.
- Open the ODBC Data Source Administrator on that system.
- Click Add.
- Select HDBODBC.

 

 

- Give the data source a meaningful name and description.
- Database type: SAP HANA Cloud or SAP HANA Single tenant (either works).
- Paste the Host URL you already copied from the Datasphere space.
- Click Test connection.
- Enter the copied Database Username and Password.

A quick way to verify the new DSN outside the GUI is sketched below.
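For example, a minimal pyodbc sketch like this can confirm the DSN works end to end; the DSN name MyDatasphereDSN and the credentials are placeholders for your own values:

import pyodbc

# Connect through the DSN created in the ODBC Data Source Administrator.
conn = pyodbc.connect("DSN=MyDatasphereDSN;UID=SAP_CONTENT#XXXX;PWD=your-password")
cursor = conn.cursor()
# DUMMY is SAP HANA's built-in one-row table, handy for connectivity checks.
cursor.execute("SELECT CURRENT_TIMESTAMP FROM DUMMY")
print("Connected, server time:", cursor.fetchone()[0])
conn.close()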

 

 

 

AZURE DATA FACTORY PART:

Open Azure Data Factory:

Go to your Azure Data Factory instance via the Azure Portal.

Create a Linked Service:

- On the left pane, go to Manage > Linked services.
- Click New to create a new Linked Service.
- In the search box, search for ODBC or SAP HANA.
- Select ODBC.

Note that ADF’s ODBC connector runs on a self-hosted integration runtime, so the HDBODBC driver must be installed on the machine hosting that runtime.

 

Enter Connection Information:

In the Connection String field, enter the connection string:

Driver={HDBODBC};ServerNODE=XXXXXXXXXX:443;UID=SAP_CONTENT#XXXX;PWD=XXXXXXXXXXXX;SCHEMA=SAP_CONTENT;

Breakdown of Components:

- Driver: the SAP HANA ODBC driver needed to connect (e.g., {HDBODBC}).
- ServerNODE: the SAP HANA server address and port to connect to (e.g., rw2922…443).
- UID: the username used to authenticate to SAP HANA (e.g., SAP_CONTENT#XXXX).
- PWD: the password associated with the provided username.
- SCHEMA: the specific database schema where the data is located (e.g., SAP_CONTENT).
- Encrypt: ensures the connection is encrypted for secure communication (e.g., Encrypt=1); append it to the string above if it is not already present.

Authentication Type:

- Choose Basic authentication, since you are using a username and password.
- Enter the Datasphere space username and password copied earlier.

If you would rather script the linked service than click through the portal, see the sketch below.
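The following is a hedged sketch of the same linked service created with the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, integration runtime, and credential values are placeholders, and model names can differ slightly between SDK versions:

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeReference, LinkedServiceResource, OdbcLinkedService, SecureString,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Mirror the portal settings: ODBC connection string, Basic authentication,
# and the self-hosted integration runtime that has HDBODBC installed.
odbc_ls = OdbcLinkedService(
    connection_string="Driver={HDBODBC};ServerNode=XXXXXXXXXX:443;SCHEMA=SAP_CONTENT;",
    authentication_type="Basic",
    user_name="SAP_CONTENT#XXXX",
    password=SecureString(value="<password>"),
    connect_via=IntegrationRuntimeReference(reference_name="MySelfHostedIR"),
)

client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "DatasphereOdbc",
    LinkedServiceResource(properties=odbc_ls),
)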

Test the Connection:

Click the Test Connection button to make sure the connection is successful.

 

To check the connection end to end, create a Copy pipeline, choose the ODBC connector configured above as the source (pulling data from Datasphere), and pick the corresponding destination environment as the sink.

 

In this case, I have chosen Azure Data Lake Storage Gen2 as the destination environment.

Log in to Azure Data Lake Storage via the Azure Portal and check that the data was copied. A programmatic spot-check is sketched below.
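As an alternative to the portal, a short sketch with the azure-storage-file-datalake SDK can list what the pipeline wrote; the storage account, container, and folder names are placeholders:

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Point at the ADLS Gen2 account the Copy pipeline used as its sink.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("<container>")

# List the files the pipeline produced, with their sizes in bytes.
for path in fs.get_paths(path="<output-folder>"):
    print(path.name, path.content_length)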

 

With that, the connection from Datasphere to Azure Data Factory is established successfully.

HINTS: 😀🤫

If your connection fails, check these two things:

- Make sure your current public IPv4 address is added to Datasphere’s IP Allowlist.
- Ensure you entered the correct Datasphere space credentials in your system’s ODBC configuration, and test the connection there.

The sketch below runs both checks in one go.
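Here is a small combined sketch, again with placeholder DSN name and credentials:

import urllib.request

import pyodbc

# Check 1: the public IPv4 that must appear in Datasphere's IP Allowlist.
with urllib.request.urlopen("https://api.ipify.org") as resp:
    print("Allowlist this IPv4 in Datasphere:", resp.read().decode("utf-8"))

# Check 2: the ODBC credentials configured on this machine.
try:
    pyodbc.connect("DSN=MyDatasphereDSN;UID=SAP_CONTENT#XXXX;PWD=your-password").close()
    print("ODBC connection OK")
except pyodbc.Error as exc:
    print("ODBC connection failed:", exc)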

Thank you!

 


#SAP

#SAPTechnologyblog
