In this blog I will develop a BTP Integration Suite (BTP-IS) iFlow that retrieves order information from the HANA Cloud database service. With the help of event-driven architecture (AEM), this data is then published and subscribed by a second iFlow and delivered to an on-premise local file folder.
The aim of this blog is to build a common scenario that shows AEM, the HANA Cloud database, and the Cloud Connector working together to retrieve data from the database.
Prerequisites:
SAP BTP IS tenant access with the required roles.
BTP HANA Cloud subscription. Please refer to the below blog to create an SAP HANA database in Cloud Foundry: https://sapzero2hero.com/2022/03/25/sap-btp-how-to-create-sap-hana-database-in-cloud-foundry/
Solace Broker Account (AEM). Please refer to the below blog for basic Solace configuration: https://community.sap.com/t5/technology-blog-posts-by-members/a-beginner-s-demo-for-pushing-data-from-sap-cpi-to-solace/ba-p/13868012#M169245
Cloud Connector. Please refer to the below blog to configure the Cloud Connector with the BTP-IS account: https://sapzero2hero.com/2022/03/17/sap-cpi-how-to-install-and-configuration-sap-cloud-connector-scc/
Let's see how we can send data from CPI to AEM (Solace), back to CPI, and onward.
Make sure the HANA Cloud instance in BTP is running and that we have created a table in the database:
SCENARIO 1 – Retrieve orders data from HANA Cloud and send it to the Solace queue.
CPI Setup:
Configure the JDBC adapter channel in the iFlow and maintain the connection data from the HANA endpoints.
Configure the AMQP adapter in the iFlow with the Solace endpoint.
Package – HANA_AEM_BTP_Cloud_Connector
Create two iFlows in Integration Suite:
IFLOW1: HANA_DB_BTP_AEM_Iflow_Pub
IFlow 1 (Publisher) retrieves the data from the HANA Cloud database using a Request-Reply step, based on the query provided, and is triggered at a particular interval. The retrieved data is then transformed and sent to the Solace AEM event mesh queue.
Query used in this example: SELECT * FROM DBADMIN.Orders;
The HANA connection is configured in the Manage Security > JDBC Material tab, where the JDBC Data Source Alias (jdbc_hana_connect) is created and then referenced in the iFlow's JDBC channel.
Further, we have used a Groovy script to log the orders data received from the HANA database as an attachment.
Log Script:
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;

def Message processData(Message message) {
    // Read the current payload (the JDBC query result) as a string
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null) {
        messageLog.setStringProperty("Logging#1", "Printing Payload As Attachment");
        // Attach the payload to the message processing log as DB_Payload
        messageLog.addAttachmentAsString("DB_Payload:", body, "text/plain");
    }
    return message;
}
Next, we included a Message Mapping in the iFlow to transform the incoming HANA data into the required format. (This is an additional step where we also configured a Value Mapping; if it is not required, you can skip the Message Mapping step.)
The source COUNTRYCODE values are transformed using a Value Mapping and then concatenated with the ORDERSPEC values to produce the target Order_Code field, as sketched below.
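In the iFlow this lookup is done with the Value Mapping artifact inside the Message Mapping. Purely as a minimal Groovy sketch of the same logic (the lookup entries and sample values below are hypothetical placeholders):
def countryLookup = ['US': '001', 'IN': '091']   // hypothetical value-mapping table: COUNTRYCODE -> mapped code

def countryCode = 'US'    // sample COUNTRYCODE from the HANA result (placeholder)
def orderSpec   = 'A100'  // sample ORDERSPEC from the HANA result (placeholder)

// Value-map the country code, then concatenate with the order spec to build Order_Code
def orderCode = (countryLookup[countryCode] ?: countryCode) + orderSpec
// orderCode == '001A100'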
After the transformation (Message Mapping step), the messages are published as events to the topic configured in AEM and routed in event-driven fashion to the respective queues. (Note: the event is published to the topic, and the queue subscribes to the topic to receive it.)
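To make the note concrete, a hypothetical wiring could look like this (the names are placeholders; the actual values are shown in the screenshots below):
Topic used by the AMQP receiver channel in iFlow 1: orders/created
Queue in AEM with a subscription to that topic: ORDERS.Q (subscription: orders/>)
Every event published to the topic is then delivered into the queue, which iFlow 2 later consumes.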
Configuration of the receiver channel: the connection details are taken from AEM (Solace), along with the queue name I have maintained in the AEM portal.
Note: I have maintained these details in an AEM Solace PubSub+ trial account. Please refer to the link shared above for the trial account creation.
Connection Details:
Queue and Topic Details:
I have maintained basic authentication (the Solace username and password for the JMS API) in the security material:
After maintaining the configuration, I deployed and triggered the iFlow with the SELECT query maintained in the Content Modifier. The message was processed successfully, with the log attachment (DB_Payload) showing the data retrieved from HANA:
Message Monitoring:
SCENARIO 2 – Deliver the orders data from the AEM queue to an on-prem SFTP folder configured via Cloud Connector.
CPI Setup:
Configure the AEM adapter in the iFlow and maintain the connection data from the Solace PubSub+ endpoints.
Configure the SFTP adapter in the iFlow using the on-prem endpoint exposed through the Cloud Connector.
IFLOW2: AEM_BTP_SFTP_IFLOW_SUB
In iFlow 2 (Subscriber), the transformed data from iFlow 1 is fetched from the queue subscribed and configured in the AEM adapter (that is, from Solace PubSub+), converted to JSON format, and then sent on to an on-premise SFTP server via SAP Cloud Connector (SCC).
AEM sender adapter configured:
As shown in the previous iFlow, these details (including the queue, Message VPN, and topic endpoints) can be taken from Solace > Cluster Manager > Service Details.
The data is then converted from XML to JSON using the converter step, and I am using a Groovy script to log the data after the conversion, before it is processed to the on-prem folder:
Script (PostPayload):
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;

def Message processData(Message message) {
    // Read the converted JSON payload as a string
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null) {
        messageLog.setStringProperty("Logging#1", "Printing Payload As Attachment");
        // Attach the payload to the message processing log as PostPayload
        messageLog.addAttachmentAsString("PostPayload:", body, "text/plain");
    }
    return message;
}
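In the iFlow the XML-to-JSON converter step performs the actual conversion. Purely as an illustration, a minimal Groovy sketch of an equivalent conversion could look like the script below; it assumes a simple row-based XML payload, and the generic handling of element names is an assumption, not the exact adapter output:
import com.sap.gateway.ip.core.customdev.util.Message;
import groovy.json.JsonOutput;

def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String;
    // Parse the row-based XML coming from the previous steps
    def xml = new XmlSlurper().parseText(body);
    def rows = [];
    xml.children().each { row ->
        def record = [:];
        // Copy each column element into a key/value pair
        row.children().each { col -> record[col.name()] = col.text() };
        rows << record;
    };
    // Replace the message body with the JSON representation
    message.setBody(JsonOutput.prettyPrint(JsonOutput.toJson(rows)));
    return message;
}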
The data is then written to the SFTP folder configured below.
(Note: here I have set up the connection between Cloud Integration and the SFTP server. Now that the configuration of the components is ready, I connect SAP Integration Suite – Cloud Integration to our local SFTP server. Please follow the blog linked above to connect the SFTP server in the on-premise landscape using the Cloud Connector.)
Receiver SFTP adapter:
The credential details are maintained in the security material.
Note: here I have used the Rebex Tiny SFTP Server to provide the server IP and port, which I then configured in the Cloud Connector (mapped to the BTP-IS subaccount), and I added the known hosts entry in the BTP-IS security material.
Cloud Connector: please find below the screenshot where I have maintained the SFTP internal host details and the BTP-IS subaccount to configure the on-prem scenario in CPI.
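For reference, the access control entry for an SFTP server in the Cloud Connector is a plain TCP mapping; the values below are hypothetical placeholders (your internal host, port, and virtual names will differ):
Back-end Type: Non-SAP System
Protocol: TCP
Internal Host: 192.168.0.10 (host where the Rebex Tiny SFTP Server runs)
Internal Port: 22 (or the port configured in the Rebex server)
Virtual Host: sftp-onprem
Virtual Port: 22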
Now we can trigger the message again from IFLOW1 (HANA_DB_BTP_AEM_Iflow_Pub), so that it is processed in both iFlows, including IFLOW2 (AEM_BTP_SFTP_IFLOW_SUB), completing the pub-sub scenario.
The JSON file is received in the local folder with an "orders" file name and the timestamp appended:
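The timestamped file name can be produced either directly in the SFTP receiver's File Name field (for example, with an expression such as orders_${date:now:yyyyMMddHHmmss}.json) or by setting the CamelFileName header in a short Groovy step before the receiver call. A minimal sketch of the header approach follows; the file-name pattern itself is an assumption, so adjust it to your own naming convention:
import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    // Build a timestamped file name, e.g. orders_20250101093000.json (pattern is illustrative)
    def timestamp = new Date().format("yyyyMMddHHmmss");
    // The SFTP receiver adapter uses the CamelFileName header as the target file name
    message.setHeader("CamelFileName", "orders_" + timestamp + ".json");
    return message;
}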
#SAP
#SAPTechnologyblog