In today’s data-driven business world, cloud data platforms have become the backbone of scalable and real-time analytics. One such powerhouse is Snowflake — a cloud-native data warehouse designed for performance, concurrency, and simplicity.
What is Snowflake in a Nutshell?
Snowflake is a cloud-based data platform that allows organizations to store and analyze vast amounts of structured and semi-structured data.
Snowflake Adapter in SAP CI
As part of SAP’s Integration Suite, SAP Cloud Integration (CI) offers connectivity to various third-party systems, and with the Snowflake adapter, it becomes possible to directly interact with your Snowflake data warehouse — automating business processes that rely on real-time or batch data movement.
Snowflake Adapter is your one-stop data management tool to interact and execute operations on Snowflake.
Bulk Export and Import – Facilitates importing from a range of input formats like CSV, JSON, Avro, etc. Similarly, you can export data to your desired format using the unload capability.
Snowflake Operations Support – Offers support for all Snowflake operations, along with SQL support for advanced users.
Ease of Access and Connectivity – Enables saving connection-related details on the platform to simplify operations and querying.
Multiple Options for Staging Storage – Various options to load data into staging storage, like Amazon S3, Google Cloud Storage, or Microsoft Azure.
Dynamic Configuration with Headers and Properties – Assigning dynamic values to different properties adds flexibility to your integration flows. You can also refer to dynamic parameters using SAP Cloud Integration exchange headers and properties.
In this blog post, we will take a deep dive into the following operations:
Insert – Load new data into Snowflake tables.
Select – Query data for reporting or downstream integration.
Delete – Remove specific records as part of data cleansing or archiving.
Execute – Run SQL statements or stored procedures.
Table List – Retrieve metadata and structure information of Snowflake tables.
Update – Modify existing records based on dynamic business rules.
Snowflake:
Let's get started with the configuration needed to create the database and table in Snowflake:
1. Log on to your Snowflake account and, in the left panel, select Data → Databases.
Please note: You need the SYSADMIN role to create tables in Snowflake.
2. In the Databases section, click the + Database button at the top right and create a database with a desired name. I created a database named “TESTING”.
3. Once the database is created, you will see two schemas, “INFORMATION_SCHEMA” and “PUBLIC”. Select the schema under which you intend to create the table; I selected the “PUBLIC” schema.
I am using the Standard option to create the table.
4. At this step, define the schema of your table and click “Create Table”.
The table I created is named “ACCOUNT”, and it looks as shown below:
We are done on the Snowflake side. Let's dive into our SAP CI tenant and look at the Snowflake adapter.
SAP Cloud Integration Tenant:
In this blog, I will be performing the Insert, Delete, Update, Table List, Execute, and Select operations. I have created a simple iFlow in which I send data from Postman and perform the operations on Snowflake using the Snowflake receiver adapter.
Snowflake Receiver Adapter:
The Connection tab contains the following fields:
Authentication – Select the Database Account method as the authentication mechanism.
Credential Name Alias – Specify the name of the Security Material artifact that stores the username/password credentials.
Address – Specify the JDBC endpoint URL of the Snowflake application to be used for the connection. This address typically contains an account identifier, which is a combination of the organization name and account name separated by a hyphen (orgname-account_name).
Database – Specify the name of the database (“TESTING” in our case).
Schema – Specify the name of the schema to be accessed (“PUBLIC” in our case).
Warehouse – Specify the name of the Snowflake warehouse. If the warehouse provided is not found or is incorrect, the DEFAULT warehouse is used.
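To illustrate how the Address field is composed (the organization and account names below are placeholders, not real accounts), a Snowflake JDBC endpoint URL generally follows this pattern:

```python
# Sketch: composing the Snowflake JDBC endpoint URL used in the adapter's
# Address field. "myorg" and "myaccount" are placeholder values -- substitute
# your own account identifier from your Snowflake account.
def snowflake_jdbc_url(org_name: str, account_name: str) -> str:
    # Account identifier = organization name and account name joined by a hyphen
    account_identifier = f"{org_name}-{account_name}"
    return f"jdbc:snowflake://{account_identifier}.snowflakecomputing.com"

print(snowflake_jdbc_url("myorg", "myaccount"))
# jdbc:snowflake://myorg-myaccount.snowflakecomputing.com
```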
1. Operation – Insert
The Insert operation is used to insert a record into the table. Let's populate the “ACCOUNT” table with some records.
Request Payload: It is mandatory to include the metadata of the structure, with a datatype attribute on each fieldname, in the INSERT payload. The metadata and row tags are case-sensitive and must be lowercase.
<?xml version="1.0" encoding="UTF-8"?>
<ROOT>
<metadata>
<fieldname datatype="VARCHAR">ID</fieldname>
<fieldname datatype="VARCHAR">LASTNAME</fieldname>
<fieldname datatype="VARCHAR">FIRSTNAME</fieldname>
<fieldname datatype="VARCHAR">SALUTATION</fieldname>
<fieldname datatype="VARCHAR">FULLNAME</fieldname>
</metadata>
<row>
<ID>00QJ1000004yZSiMAM</ID>
<LASTNAME>Oswal</LASTNAME>
<FIRSTNAME>Punith</FIRSTNAME>
<SALUTATION>Mr.</SALUTATION>
<FULLNAME>Punith Oswal</FULLNAME>
</row>
</ROOT>
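If you need to produce this payload programmatically (for example, in a script that feeds the iFlow), it can be generated from a simple field map. The following Python sketch is illustrative only; the tag names mirror the payload above, and metadata and row are kept lowercase as required:

```python
import xml.etree.ElementTree as ET

def build_insert_payload(fields: dict, datatypes: dict) -> str:
    """Build the adapter's INSERT payload: a <metadata> block declaring each
    field's datatype, followed by a <row> block with the values.
    Note: <metadata> and <row> must be lowercase; field tags match the columns."""
    root = ET.Element("ROOT")
    metadata = ET.SubElement(root, "metadata")
    for name in fields:
        fn = ET.SubElement(metadata, "fieldname",
                           datatype=datatypes.get(name, "VARCHAR"))
        fn.text = name
    row = ET.SubElement(root, "row")
    for name, value in fields.items():
        ET.SubElement(row, name).text = value
    return ET.tostring(root, encoding="unicode")

payload = build_insert_payload(
    {"ID": "00QJ1000004yZSiMAM", "LASTNAME": "Oswal", "FIRSTNAME": "Punith",
     "SALUTATION": "Mr.", "FULLNAME": "Punith Oswal"},
    {},  # every column defaults to VARCHAR here
)
```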
Request sent from Postman:
Message Successful in SAP CI:
Record inserted in Snowflake:
2. Operation – Update
Used to update an existing record in the table.
Let's update the record we created with the Insert operation, changing FIRSTNAME from Punith to Rohit and FULLNAME to Rohit Oswal.
Payload for Update: The “where” field is mandatory in the payload; it acts as the WHERE clause that identifies the record to be updated.
<?xml version="1.0" encoding="UTF-8"?>
<ROOT>
<metadata>
<fieldname datatype="VARCHAR">ID</fieldname>
<fieldname datatype="VARCHAR">LASTNAME</fieldname>
<fieldname datatype="VARCHAR">FIRSTNAME</fieldname>
<fieldname datatype="VARCHAR">SALUTATION</fieldname>
<fieldname datatype="VARCHAR">FULLNAME</fieldname>
</metadata>
<row>
<ID>00QJ1000004yZSiMAM</ID>
<LASTNAME>Oswal</LASTNAME>
<FIRSTNAME>Rohit</FIRSTNAME>
<SALUTATION>Mr.</SALUTATION>
<FULLNAME>Rohit Oswal</FULLNAME>
<where>ID='00QJ1000004yZSiMAM'</where>
</row>
</ROOT>
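Conceptually, the adapter combines the row fields and the where element into an SQL UPDATE. The helper below is only a rough sketch of that assumed mapping (not the adapter's actual internals, and without SQL escaping):

```python
def payload_row_to_update_sql(table: str, row: dict, where: str) -> str:
    """Illustrative only: approximate the UPDATE statement derived from
    a row of field/value pairs plus the mandatory <where> clause."""
    assignments = ", ".join(f"{col} = '{val}'" for col, val in row.items())
    return f"UPDATE {table} SET {assignments} WHERE {where}"

sql = payload_row_to_update_sql(
    "ACCOUNT",
    {"FIRSTNAME": "Rohit", "FULLNAME": "Rohit Oswal"},
    "ID='00QJ1000004yZSiMAM'",
)
print(sql)
# UPDATE ACCOUNT SET FIRSTNAME = 'Rohit', FULLNAME = 'Rohit Oswal' WHERE ID='00QJ1000004yZSiMAM'
```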
Request sent from Postman:
Message successful in SAP CI:
Record Updated in Snowflake:
3. Operation – Delete
Used to delete an existing record from the table.
Currently there are two records in the “ACCOUNT” table; let's delete the record with FIRSTNAME “Rohit”.
No request payload is needed for the DELETE operation; just trigger a blank request from Postman.
Message successful in SAP CI:
Record with FIRSTNAME = Rohit deleted in Snowflake Table:
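Since the delete condition is configured on the adapter rather than sent in the payload, the operation boils down to a statement of the following shape. This helper is only a sketch of that assumed mapping:

```python
def delete_sql(table: str, where: str) -> str:
    # Sketch: the DELETE the adapter would issue for the configured condition.
    return f"DELETE FROM {table} WHERE {where}"

print(delete_sql("ACCOUNT", "FIRSTNAME = 'Rohit'"))
# DELETE FROM ACCOUNT WHERE FIRSTNAME = 'Rohit'
```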
4. Operation – Table List
This operation lists all the tables, with their details, under the specified schema (PUBLIC in our case).
No request payload is needed for the Table List operation; just trigger a blank request from Postman.
Message successful in SAP CI:
Response from Snowflake:
5. Operation – Select
Select is used to fetch data from the table. You can choose the required columns via Response Fields, filter for a specific condition using the Where Clause, use Limit to fetch a limited number of records starting from a predefined Offset value, and use the Orderby Statement to return records in ascending or descending order.
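Putting those adapter parameters together, the resulting query is equivalent to an SQL SELECT of the following shape. This is a sketch of the assumed mapping, not the adapter's internals:

```python
def build_select(table, response_fields, where=None, order_by=None,
                 limit=None, offset=None):
    """Sketch: assemble the SELECT described by the adapter's parameters.
    response_fields -> column list, where -> WHERE clause,
    order_by -> ORDER BY, limit/offset -> LIMIT ... OFFSET ..."""
    sql = f"SELECT {', '.join(response_fields)} FROM {table}"
    if where:
        sql += f" WHERE {where}"
    if order_by:
        sql += f" ORDER BY {order_by}"
    if limit is not None:
        sql += f" LIMIT {limit}"
        if offset is not None:
            sql += f" OFFSET {offset}"
    return sql

print(build_select("ACCOUNT", ["ID", "FULLNAME"],
                   where="ID='12QJ1000004yZSiPPP'"))
# SELECT ID, FULLNAME FROM ACCOUNT WHERE ID='12QJ1000004yZSiPPP'
```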
Currently there are two records in the “ACCOUNT” table; let's fetch the record with ID = 12QJ1000004yZSiPPP, with response fields ID and FULLNAME.
No request payload is needed for the SELECT operation; just trigger a blank request from Postman.
Message Successful in SAP CI:
Response from Snowflake:
6. Operation – Execute
Execute allows you to run SQL queries or stored procedures on the target table.
Let's delete our entire table using the EXECUTE operation with the SQL statement DROP TABLE ACCOUNT.
No request payload is needed for the EXECUTE operation; just trigger a blank request from Postman.
Message successful in SAP CI:
Response from Snowflake:
Table “ACCOUNT” deleted in Snowflake account.
Conclusion:
In this blog, we explored key Snowflake adapter operations in SAP CI — from Insert and Update to Select and Execute — laying the groundwork for seamless data integration. These core actions empower you to connect SAP with Snowflake efficiently for real-time and batch scenarios.
In the next blog, I’ll cover advanced operations like Bulk Upsert and UNLOAD — stay tuned!
Hope this helps!
Cheers,
Punith Oswal