Read and Write Data between HANA Datalake and HANA Cloud DB

A simple guide to reading and writing table data between SAP HANA Datalake and SAP HANA Cloud DB.

Key topics include:

- Export from HANA Cloud and import into HANA Datalake Filesystem using CSV and PARQUET formats.
- Export from HANA Datalake Relational Engine and import to the same or a different HANA Datalake Filesystem in CSV and PARQUET formats.

 

Export from HANA Cloud and Import into HANA Datalake Filesystem using CSV and PARQUET formats

Please note that all the SQL queries in this part are executed in the SQL Console of HANA Cloud DB.
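
The examples in this guide use a table named TESTMYTABLE. If you want to follow along, here is a minimal sketch of such a table; the name, columns, and sample row are placeholders and not part of the original setup:

-- Hypothetical sample table used by the export/import examples below
CREATE COLUMN TABLE TESTMYTABLE (
    NAME    NVARCHAR(100),
    ADDRESS NVARCHAR(200)
);
INSERT INTO TESTMYTABLE VALUES ('Alice', '1 Example Street');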

1. Creating and Verifying the PSE (Personal Security Environment)

Step 1: Create PSE
Use the following SQL query to create a PSE named 'TESTING':

 

CREATE PSE TESTING;

 

Step 2: Cross-verify the newly created PSE
To verify that the PSE has been successfully created, run:

 

SELECT * FROM PSES;

 

2. Download and Add DigiCertGlobalRootCA Certificate

Step 3: Download the DigiCertGlobalRootCA certificate
Download the certificate from this link: https://dl.cacerts.digicert.com/DigiCertGlobalRootCA.crt.pem 

Step 4: Copy Certificate Content
Open the downloaded certificate in a text editor and copy the content between "-----BEGIN CERTIFICATE-----" and "-----END CERTIFICATE-----".

3. Create Certificate in HANA Cloud DB

Step 5: Create Certificate using SQL
Create the certificate in HANA Cloud DB using the following SQL query:

 

CREATE CERTIFICATE TESTING_DIGICERT FROM '
-----BEGIN CERTIFICATE-----
MIIDrzCCApegAwIBAgIQCDvgVpBCRrGhdWrJWZHHSjANBgkqhkiG9w0BAQUFADBh
...
-----END CERTIFICATE-----';
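
Optionally, you can confirm the certificate was stored before mapping it; a hedged check, assuming the CERTIFICATES system view exposes a CERTIFICATE_NAME column:

SELECT * FROM CERTIFICATES WHERE CERTIFICATE_NAME = 'TESTING_DIGICERT';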

 

4. Map the DigiCertGlobalRootCA Certificate to the PSE

Step 6: Map the certificate to the PSE
To map the certificate TESTING_DIGICERT to the PSE TESTING, use:

 

ALTER PSE TESTING ADD CERTIFICATE TESTING_DIGICERT;

 

5. Map the HANA Datalake Filesystem Certificate Chain

Step 7: Map the HANA Datalake Filesystem certificate chain
Map the certificate chain using the following query:

 

ALTER PSE TESTING SET OWN CERTIFICATE
'-----BEGIN PRIVATE KEY-----
Clientkey
-----END PRIVATE KEY-----
-----BEGIN CERTIFICATE-----
MainCertificate
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
Intermediatecertificate
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
RootCertificate
-----END CERTIFICATE-----';

 

Provide the entire content of each certificate between -----BEGIN CERTIFICATE----- and -----END CERTIFICATE-----, and of the private key between -----BEGIN PRIVATE KEY----- and -----END PRIVATE KEY-----.

6. Verify the Certificates

Step 8: Verify all certificates
Check all the certificates using:

 

SELECT * FROM CERTIFICATES;

 

Step 9: Verify if certificates are mapped to the PSE
Run the following query to verify certificate mappings:

 

SELECT * FROM PSE_CERTIFICATES;

 

7. Create Credential for Import/Export Operations

Step 10: Create a Credential using the SAPHANAIMPORTEXPORT Component
Create a credential using the following query:

 

CREATE CREDENTIAL FOR COMPONENT 'SAPHANAIMPORTEXPORT' PURPOSE 'TESTPURP' TYPE 'X509' PSE TESTING;

 

Note that PURPOSE is the name of the credential, and the TYPE is X509 because we are working with certificates. The credential is mapped to the newly created PSE.

Step 11: Cross-verify Credential
To verify the credential, run:

 

SELECT * FROM CREDENTIALS;
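
To narrow the result to the credential just created, a filtered variant can be used (assuming the CREDENTIALS view exposes a PURPOSE column):

SELECT * FROM CREDENTIALS WHERE PURPOSE = 'TESTPURP';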

 

8. Export / Import Operations from / to HANA Datalake Filesystem and HANA Cloud DB

Export Table Data to CSV File

Use the following query to export data from the TESTMYTABLE table in HANA Cloud DB to a CSV file on the HANA Datalake Filesystem:

 

EXPORT INTO CSV FILE
'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com/path_to_file/data.csv'
FROM TESTMYTABLE
WITH
CREDENTIAL 'TESTPURP'
COLUMN LIST IN FIRST ROW;

 

Import Data from CSV File to Table

Use the following query to import a CSV file from the HANA Datalake Filesystem into HANA Cloud DB:

 

IMPORT FROM CSV FILE 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com/path_to_file/data.csv'
INTO TESTMYTABLE WITH
CREDENTIAL 'TESTPURP'
COLUMN LIST IN FIRST ROW
FAIL ON INVALID DATA;
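
For larger files, the IMPORT FROM statement also accepts tuning options such as THREADS and BATCH. A hedged sketch combining them with the credential (the values 4 and 10000 are arbitrary examples):

IMPORT FROM CSV FILE 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com/path_to_file/data.csv'
INTO TESTMYTABLE WITH
CREDENTIAL 'TESTPURP'
COLUMN LIST IN FIRST ROW
THREADS 4
BATCH 10000;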

 

Export Table Data to PARQUET File

Use the following query to export data from the TESTMYTABLE table in HANA Cloud DB to a PARQUET file on the HANA Datalake Filesystem:

 

EXPORT INTO PARQUET FILE 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com/path_to_file/data.parquet'
FROM (SELECT * FROM TESTMYTABLE)
WITH CREDENTIAL 'TESTPURP';
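
Because the source is a subquery, a subset of the table can be exported in the same way; for example, with a hypothetical filter on the ADDRESS column:

EXPORT INTO PARQUET FILE 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com/path_to_file/data_subset.parquet'
FROM (SELECT * FROM TESTMYTABLE WHERE ADDRESS IS NOT NULL)
WITH CREDENTIAL 'TESTPURP';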

 

Import Data from PARQUET File to Table

Use the following query to import a PARQUET file from the HANA Datalake Filesystem into HANA Cloud DB:

 

IMPORT FROM PARQUET FILE 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com/path_to_file/data.parquet'
INTO TESTMYTABLE
WITH CREDENTIAL 'TESTPURP';

 

Please note that the table TESTMYTABLE is an example; replace it with a valid table. Also, change the HANA Datalake REST API endpoint URL as required.

 

Export from HANA Datalake Relational Engine and import to the same or a different HANA Datalake Filesystem in CSV and PARQUET formats

Please note that all the SQL queries are executed in the SQL Console of HANA Datalake Relational Engine.
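
The Relational Engine examples below assume a simple two-column table. If you want to follow along, here is a minimal sketch (TESTMYTABLE, NAME, and ADDRESS are placeholders, as noted later):

-- Hypothetical sample table for the Relational Engine examples
CREATE TABLE TESTMYTABLE (
    NAME    VARCHAR(100),
    ADDRESS VARCHAR(200)
);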

Export / Import Operations from / to the same HANA Datalake Relational Engine and HANA Datalake Filesystem

Export Table Data to PARQUET File

Use the following query to export data from the TESTMYTABLE table in the HANA Datalake Relational Engine to a PARQUET file on the same HANA Datalake Filesystem:

 

UNLOAD SELECT * FROM TESTMYTABLE
INTO FILE 'hdlfs://path_to_file/DATA.parquet'
FORMAT PARQUET;

 

Import Data from PARQUET File to Table

Use the following SQL query to load data from a PARQUET file on the HANA Datalake Filesystem into the same HANA Datalake Relational Engine:

 

LOAD TABLE TESTMYTABLE (NAME, ADDRESS)
FROM 'hdlfs://path_to_file/DATA.parquet'
FORMAT PARQUET
ESCAPES OFF;

 

Export Table Data to CSV File

Use the following query to export data from the TESTMYTABLE table in the HANA Datalake Relational Engine to a CSV file on the same HANA Datalake Filesystem:

 

UNLOAD SELECT * FROM TESTMYTABLE
INTO FILE 'hdlfs://path_to_file/DATA.csv';

 

Import Data from CSV File to Table

Use the following SQL query to load data from a CSV file on the HANA Datalake Filesystem into the same HANA Datalake Relational Engine:

 

LOAD TABLE TESTMYTABLE (NAME, ADDRESS)
FROM 'hdlfs://path_to_file/DATA.csv'
ESCAPES OFF;
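
If the CSV file carries a header row or uses a non-default delimiter, LOAD TABLE options can account for that. A hedged sketch, assuming the standard DELIMITED BY, QUOTES, and SKIP load options apply to your file:

LOAD TABLE TESTMYTABLE (NAME, ADDRESS)
FROM 'hdlfs://path_to_file/DATA.csv'
DELIMITED BY ','
QUOTES OFF
SKIP 1
ESCAPES OFF;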

 

Please note that the table TESTMYTABLE and its columns NAME and ADDRESS are examples; replace them with a valid table and its column names.

 

Export / Import Operations from / to a different HANA Datalake Relational Engine and HANA Datalake Filesystem

1. Creating and Verifying the PSE (Personal Security Environment)

Step 1: Create New PSE
Create a new PSE (Personal Security Environment) for the HANA Datalake Relational Engine using the following SQL query:

 

CREATE PSE MYNEWPSE;

 

Step 2: Cross-verify the newly created PSE
To cross-verify that the new PSE has been created, run the following query:

 

SELECT * FROM SYSPSE;

 

2. Adding Certificates to the PSE

Step 3: Find the DigiCertGlobalRootCA Certificate
Find the DigiCertGlobalRootCA certificate and take note of the object_id for mapping it to the newly created PSE:

 

SELECT * FROM SYSCERTIFICATE WHERE cert_name = 'DigiCertRootCA';

 

Step 4: Map the Certificate to the PSE
Use the retrieved object_id to map the DigiCertGlobalRootCA certificate to the newly created PSE:

 

ALTER PSE MYNEWPSE ADD CERTIFICATE 6515;

 

Note: 6515 is an example; replace it with the actual object_id.
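
If the certificate is stored under a different name in your instance, a broader lookup on the same view can help find the right object_id (the LIKE pattern is a hypothetical example):

SELECT object_id, cert_name FROM SYSCERTIFICATE WHERE cert_name LIKE '%DigiCert%';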

Step 5: Map the HANA Datalake Filesystem Certificate Chain
Use the following SQL query to map the certificate chain for the HANA Datalake Filesystem to the PSE:

 

ALTER PSE MYNEWPSE SET OWN CERTIFICATE
'-----BEGIN PRIVATE KEY-----
Clientkey
-----END PRIVATE KEY-----
-----BEGIN CERTIFICATE-----
MainCertificate
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
Intermediatecertificate
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
RootCertificate
-----END CERTIFICATE-----';

 

Provide the entire content of each certificate between -----BEGIN CERTIFICATE----- and -----END CERTIFICATE-----, and of the private key between -----BEGIN PRIVATE KEY----- and -----END PRIVATE KEY-----.

3. Verify the Certificates

Step 6: Cross-verify the Certificates
To ensure that the certificates are correctly loaded, run:

 

SELECT * FROM SYSCERTIFICATE;

 

Step 7: Verify if Certificates are Mapped to the Correct PSE
Check if the certificates are correctly mapped to the newly created PSE:

 

SELECT * FROM SYSPSECERTIFICATE;

 

4. Create Credential for Import/Export Operations

Step 8: Create a Credential
Create a credential that will be used to authenticate export and import operations using the SAPHDLRELOADUNLOAD component:

 

CREATE CREDENTIAL FOR COMPONENT 'SAPHDLRELOADUNLOAD' PURPOSE 'MYNEWTEST' TYPE 'X509' PSE MYNEWPSE;

 

Note that PURPOSE is the name of the credential, and the TYPE is X509 because we are working with certificates. The credential is mapped to the newly created PSE.

Step 9: Verify the Credential
Verify that the credential has been successfully created:

 

SELECT * FROM SYSCREDENTIAL;

 

5. Export / Import Operations from / to a different HANA Datalake Relational Engine and HANA Datalake Filesystem

Export Table Data to PARQUET File

Use the following query to export data from the TESTMYTABLE table in the HANA Datalake Relational Engine to a PARQUET file on a different HANA Datalake Filesystem:

 

UNLOAD SELECT * FROM TESTMYTABLE INTO FILE 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/path_to_file/data.parquet'
FORMAT PARQUET
CONNECTION_STRING 'ENDPOINT=https://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com'
WITH CREDENTIAL 'MYNEWTEST';

 

Import Data from PARQUET File to Table

Use the following SQL query to load data from a PARQUET file on a different HANA Datalake Filesystem into this HANA Datalake Relational Engine:

 

LOAD TABLE TESTMYTABLE (NAME, ADDRESS)
FROM 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/path_to_file/data.parquet'
FORMAT PARQUET
CONNECTION_STRING 'ENDPOINT=https://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com'
WITH CREDENTIAL 'MYNEWTEST'
ESCAPES OFF;

 

Export Table Data to CSV File

Use the following query to export data from the TESTMYTABLE table in the HANA Datalake Relational Engine to a CSV file on a different HANA Datalake Filesystem:

 

UNLOAD SELECT * FROM TESTMYTABLE INTO FILE 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/path_to_file/data.CSV'
CONNECTION_STRING 'ENDPOINT=https://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com'
WITH CREDENTIAL 'MYNEWTEST';

 

Import Data from CSV File to Table

Use the following SQL query to load data from a CSV file on a different HANA Datalake Filesystem into this HANA Datalake Relational Engine:

 

LOAD TABLE TESTMYTABLE (NAME, ADDRESS) FROM 'hdlfs://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/data.CSV'
CONNECTION_STRING 'ENDPOINT=https://XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX.files.hdl.prod-eu10.hanacloud.ondemand.com'
WITH CREDENTIAL 'MYNEWTEST'
ESCAPES OFF;
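
After a load completes, a quick row count is a simple sanity check that the data arrived:

SELECT COUNT(*) FROM TESTMYTABLE;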

 

Please note that the table TESTMYTABLE and its columns NAME and ADDRESS are examples; replace them with a valid table and its column names. Also, change the HANA Datalake REST API endpoint URL as required.

 