Accessing On-Premises HTTP APIs with SAP Joule Studio and SAP Cloud Connector

SAP Joule Studio enables you to design and build intelligent AI agents and skills that automate and optimize your business processes. To ensure these agents can operate effectively, it is essential to allow them to read from and write data to third-party systems using APIs. In a typical SAP landscape, some of these systems may reside on-premises. To securely access such on-premises systems, the SAP Cloud Connector is the recommended approach. This blog post will guide you through the process of setting up and connecting a Joule skill or agent project in SAP Joule Studio to integrate securely through the Cloud Connector.

The scenario is as follows: We have SAP Joule Studio running on the SAP Business Technology Platform (BTP), while the on-premises system exposes its endpoints within the corporate network. To establish a secure connection, we will configure the Cloud Connector, create a destination, set up an SAP Build Actions project, and use that Action in a Joule skill.

Let’s dive into the steps required to set up this integration and leverage the power of SAP Joule Studio in conjunction with the SAP Cloud Connector.

Prerequisites

Before proceeding with the steps outlined in this guide, it is essential to have an instance of the SAP Cloud Connector installed. While it is possible to install the Cloud Connector on a server, for the purposes of this demonstration, we will be using a Windows machine. We recommend following the instructions provided in this blog (https://blogs.sap.com/2021/09/05/installation-and-configuration-of-sap-cloud-connector/) to install and configure the Cloud Connector until the Subaccount is connected.

The second requirement is a BTP subaccount with an SAP Build Process Automation (Joule Studio) instance. We will connect the Cloud Connector to this subaccount.

1. Cloud Connector Configuration

The first step is to create a configuration in the Cloud Connector that connects to our subaccount and exposes the HTTP resource. For the purpose of this demonstration, I ran a small Node.js server on my Windows machine that outputs “Hello World”.
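For reference, a minimal sketch of such a server could look like the code below. This is purely illustrative: the port 3333 and the /hello path match the configuration used later in this post, and the response shape matches the OpenAPI schema in step 3; the actual server behind the screenshots may differ.

// hello-server.js: minimal Node.js HTTP server for this demo (illustrative sketch)
const http = require("http");

const server = http.createServer((req, res) => {
  if (req.method === "GET" && req.url === "/hello") {
    // Response shape matches the OpenAPI schema from step 3: { "message": "<string>" }
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ message: "Hello World" }));
  } else {
    res.writeHead(404, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ message: "Not found" }));
  }
});

// Listen on port 3333, the same port exposed through the Cloud Connector below
server.listen(3333, () => console.log("Listening on http://localhost:3333"));

With the server running, a quick check such as curl http://localhost:3333/hello on the same machine confirms that the endpoint responds before the Cloud Connector is involved at all.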

To create the configuration, navigate to the admin interface of the Cloud Connector. There, select the registered Subaccount where your SAP Build Process Automation instance resides, then create a “Cloud to On-Premise” configuration.

 

In the provided screenshot, you will notice that I have exposed the internal host “localhost” with port 3333 using a virtual host named “virtualhost”. This virtual host will be used for making requests from the BTP side. Currently, I have configured an unrestricted access policy, allowing access to all paths. However, in production scenarios, it is recommended to define access policies with more granular control.

Please note that it is crucial to ensure that you have exposed the necessary resources in your configuration. This ensures that the required endpoints are accessible and allows for successful communication between the cloud and on-premises environments.

In this example, the system type is a Non-SAP System using the HTTP protocol. To confirm that the connection works, you can use the “Check availability of internal host” button. This step is essential, as it verifies the accuracy of your configuration and ensures successful connectivity between the cloud and your on-premises environment.

Note: It’s important to understand the difference between the internal and virtual host. The internal host refers to the hostname accessible within your on-premises network — in this case, it’s the localhost on my laptop. Later, within SAP BTP, we’ll reference the virtual host instead.
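To summarize the setup used in this demo, the mapping looks roughly as follows (field labels can differ slightly between Cloud Connector versions, and the permissive access policy is for demonstration purposes only):

Back-end Type:   Non-SAP System
Protocol:        HTTP
Internal Host:   localhost (internal port 3333)
Virtual Host:    virtualhost (virtual port 3333)
Resources:       URL path “/” with access policy “Path And All Sub-Paths”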

On the BTP Cockpit end, we can check the connected Cloud Connectors in the respective menu tab. If you cannot see this tab, you may be missing some roles. In the image, you see my registered Cloud Connector and the backend system with its virtualhost available.

2. Create Destination


After successfully registering the Cloud Connector, the next step is to create a destination. A destination serves as a means for services to access an API by handling the authentication and networking aspects.

By configuring a destination, you can simplify the process of accessing APIs by abstracting the underlying technical details. The destination takes care of handling authentication, network communication, and other necessary configurations, allowing services to focus on consuming the API and performing business logic without worrying about the underlying implementation.

Creating a destination provides a convenient way to encapsulate the necessary information, such as the API endpoint URL, authentication credentials, and other relevant settings. This abstraction helps streamline the integration process and facilitates secure and reliable communication between your services and the targeted API.

 

When creating the destination, there are several important details to specify. These include the name of the destination, the URL you want to access, and the type of destination. In this case, we will specify the URL as the virtual host “http://virtualhost:3333”. To utilize the Cloud Connector, we set the Proxy Type to On-Premise.

Optionally, you can also maintain the authentication details – my service does not require any.
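Put together, the destination used in this post looks roughly like this (the name is the one referenced later when testing the Action; NoAuthentication is assumed because the demo service requires no credentials):

Name:            sap-sample-api-via-cloud-connector
Type:            HTTP
URL:             http://virtualhost:3333
Proxy Type:      OnPremise
Authentication:  NoAuthentication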

Once the destination configuration is complete, it is crucial to use the “Check Connection” button to verify that everything is configured correctly and the connection can be established successfully. This step ensures that the destination is functioning as expected and is ready to be utilized in your automation processes. 

Note: Did you know? The Destination Service’s “Check Connection” behaves slightly differently when the destination is configured with the proxy type “OnPremise.”

For public APIs on the internet, the connection check typically validates both the authentication credentials and the specific path — in other words, it performs a real GET request on that endpoint.

In contrast, for on-premise HTTP destinations, the check is much simpler. It only performs an HTTP “ping” on the host to verify that some response is received. Even if the target system returns a 5xx error, the check will still show as successful as long as the Cloud Connector tunnel is reachable and the HTTP server responds.

Be aware that this behavior can have downstream effects when troubleshooting connectivity issues.

 

 

In the ideal case you get the green checkmark! But there are some common errors we need to discuss:

 

 

The “Backend not available” error occurs when the connectivity service is unable to locate the host you are trying to access. This could be due to the host not being exposed in the Cloud Connector configuration or a potential misspelling in the URL. To resolve this issue, it is important to revisit the URL and ensure that you are using the correct virtual host specified in the Cloud Connector configuration. Verify that the URL is accurate and matches the virtual host configuration to establish the necessary connectivity.

 

The second common issue that may arise is the “Resource not accessible” error. This occurs when the connectivity service successfully locates the backend you intend to connect to, but the specific resource (such as the subpath “/hello”) either does not exist or is not allowed to be accessed based on the rules defined in the configuration. To resolve this, ensure that you have included the correct path in the Cloud Connector’s resource configuration.

For usage in SAP Build, there is one additional step to make the destination available inside the Build tool: we need to register the destination in the Control Tower of SAP Build.

We can do so by navigating to SAP Build Control Tower > Destinations and then choose Add to add the Destination to our environments.

3. Create Actions Project

Now, let’s dive into SAP Build Process Automation and create an Actions project. In this case, since my API is not a standard one, I will create a custom API Specification. However, if you are working with a target system like S/4HANA, you can leverage the pre-defined API Specifications available in the SAP Business Accelerator Hub.

Let’s give the Project a name of our choice and choose the option to “Upload API Specification”.

SAP Build Actions allow us to upload OpenAPI Specifications in JSON format.

 

{
  "openapi": "3.0.0",
  "info": {
    "description": "Demonstration Hello World",
    "title": "helloworld",
    "version": "1.0.0"
  },
  "servers": [
    {
      "url": "empty"
    }
  ],
  "paths": {
    "/hello": {
      "get": {
        "summary": "get hello",
        "description": "get a hello world message",
        "operationId": "get.hello",
        "responses": {
          "200": {
            "description": "Successful response",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "title": "Message Object",
                  "properties": {
                    "message": {
                      "type": "string"
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}

The code snippet above represents the API Specification, where I define the necessary details. The servers URL is only a placeholder (“empty”), as the actual URL will be retrieved from the destination configuration at a later stage. Additionally, I specify the available path as “/hello”.

It’s important to note that the path mentioned in the API specification will be preceded by the destination’s URL. Therefore, in our scenario, the complete URL for the request will be “http://virtualhost:3333/hello”.

Another crucial aspect for the functionality within the low-code environment is defining the expected response structure. In this case, the response payload consists of a plain object with the key “message”. To build and test specifications, you can use the website https://editor.swagger.io/. Be aware that the response structure must match the specification; if, for example, fields are missing, you will be prompted with a schema error.
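For the /hello endpoint used here, a response that satisfies the schema above is simply:

{
  "message": "Hello World"
}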

Once you have finalized the specification, save the file in .json format and upload it to the Actions project.

Within the Actions Project, you will be prompted to add the desired actions, which you can proceed to do. Now, let’s move on to testing:

Within the selected Action, navigate to the “Test” tab. Here, you have the option to select the destination that serves as the basis for the request. For this demonstration, I will choose the “sap-sample-api-via-cloud-connector” destination that I created earlier. Once the test is executed, you will find the response payload displayed at the bottom. In this particular case, the test is successful, and I receive the expected message from my On-Premises server.

If you encounter an error during the process, make sure to check the “View API” Section for any response body that might provide insights into the cause of the issue.

In the final step, we need to release and publish the Action to ensure its availability for consumption in a project. To accomplish this, use the buttons at the top right-hand side of the Actions editor and ensure that the status indicates “released” and then “published”.

4. Add Action to Joule Skill

Let’s create a project and include the Action in a Joule Skill. Do so by adding a step to the Skill with the + symbol.

First, we need to add the Actions project to our dependencies:

We add an Action to the Skill and configure its Destination field by creating a new Destination Variable, which we give an arbitrary name.

We can see that the Action has the message variable in its output.

To prove that we can now also use the data from the API, I added an additional step to Send a Message with the API’s result “message” variable in its content.

With the message editor – we can send a custom message to the user. In this case I keep it simple and just output the response from the message field of the API response.

 

After going through the standard deployment process in SAP Build Joule Studio – we can test our newly deployed skill directly in Joule.

And in this case we are successful and can retrieve the actual answer from the On-Premises HTTP System.

In the following posts, we aim to provide further guidance on how to connect to an SAP on-premises system, as this is the most common scenario for extending Joule. Take this blog post as the basis for understanding the concept and flow.

I hope you find this blog insightful. If you have any questions, feel free to leave a comment.

 
