Integrating AI with SAPUI5 Fiori Apps: Part 1 – Concept

Blog Series

Integrating AI with SAPUI5 Fiori Apps: Part 1 – Concept
Integrating AI with SAPUI5 Fiori Apps: Part 2 – Building a Text Summarizer
Integrating AI with SAPUI5 Fiori Apps: Part 3 – Building an AI Chatbot Assistant

With the announcement of SAP Generative AI Hub, there has been an explosion of interest in exploring how artificial intelligence can change how enterprise applications are developed and consumed. The official BTP Reference Architecture for GenAI and RAG provides guidance on how to build full-stack apps integrating with LLMs via SAP AI Core. Often, though, all you need is a quick way to integrate your freestyle or Fiori Elements SAPUI5 apps with LLMs, for use cases where GenAI/LLM integration is required only on the web client side.

Some of the use cases where a client-side LLM integration within SAPUI5 apps is needed include, but are not limited to:

Product Description Generator
Job Description Generator
Customer Query Summarizer
Campaign Idea/Title Generator
AI Assistant Chatbot (without RAG; answers based on a limited set of instructions)

I’m beginning this series by outlining an approach that uses SAP BTP Destinations to achieve LLM integration easily and securely, both during development and for deployment. The techniques outlined can be used right away in a productive setting. In the subsequent blogs, I’ll walk you through how to build a simple Text Summarizer and a slightly more advanced AI Chatbot Assistant.

Here is how the solution works at a high level: the SAPUI5 app calls the LLM through an approuter route, which proxies the request via a BTP Destination to the deployment running in SAP AI Core. Please note that a Standalone Approuter is optional when the app is consumed via SAP Build Work Zone or directly from the SAP HTML5 Application Repository, since both have the SAP Managed Approuter running behind the scenes and respect the xs-app.json configuration.

With this context set, let’s get started and make your SAPUI5 apps AI-enabled ✨!

1. Create an SAPUI5 app

If you don’t already have a SAPUI5 app (Freestyle or Fiori Elements), you can create a new app via SAP Build Code (Create > Build an Application > SAP Build Code > SAP Fiori Application) or in VS Code using SAP Fiori Tools Extensions.
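If you prefer the command line, the same wizard can also be run with the Yeoman-based SAP Fiori generator. Below is a minimal sketch; it assumes Node.js/npm are installed and that the generator is published as @sap/generator-fiori, so verify the package name against the current SAP Fiori tools documentation.

# one-time setup: install Yeoman and the SAP Fiori generator globally
npm install -g yo @sap/generator-fiori

# run the generator and follow the prompts (floorplan, data source, project name, ...)
yo @sap/fiori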

2. Create a Destination for LLM deployment in SAP AI Core

Create a Destination in SAP BTP Cockpit with a name such as GENERATIVE_AI_HUB and the following info:

 

# Destination
Name: GENERATIVE_AI_HUB
URL: <AI_API_URL>/v2 # e.g. https://api.ai….ml.hana.ondemand.com/v2 <- note the suffix /v2
Authentication: OAuth2ClientCredentials
Client ID: <CLIENT_ID>
Client Secret: <CLIENT_SECRET>
Token Service Url: <TOKEN_URL>/oauth2/token

## Additional Properties
HTML5.DynamicDestination: true
URL.headers.AI-Resource-Group: default
URL.headers.Content-Type: application/json

 

Please note that the URL is suffixed with /v2 in order to align with cap-llm-plugin usage.
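In case you are wondering where the placeholder values come from: they typically map to the fields of an SAP AI Core service key (created for the AI Core instance in your subaccount). An abbreviated service key looks roughly like the sketch below; treat the exact field names as an assumption and verify them against your own key.

{
  "clientid": "<CLIENT_ID>",
  "clientsecret": "<CLIENT_SECRET>",
  "url": "<TOKEN_URL>",
  "serviceurls": {
    "AI_API_URL": "<AI_API_URL>"
  }
}

In other words: the Destination URL is serviceurls.AI_API_URL plus the /v2 suffix, Client ID is clientid, Client Secret is clientsecret, and the Token Service URL is url plus /oauth2/token.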

3. Prepare for Deployment

To generate the files needed for deployment, such as xs-app.json and mta.yaml, execute the following command.

 

npm run deploy-config

 

4. Add a route to Destination for LLM API

Now, in the xs-app.json file, add an API route that proxies LLM API calls via the Destination created earlier.

 

{
  "authenticationMethod": "route",
  "routes": [
    {
      "source": "^/api/(.*)$",
      "target": "/inference/deployments/<DEPLOYMENT_ID>/$1",
      "authenticationType": "xsuaa",
      "destination": "GENERATIVE_AI_HUB",
      "csrfProtection": true
    }
  ]
}
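Keep in mind that the generated xs-app.json already contains routes for serving the UI5 app itself, and the approuter evaluates routes in order, so the /api route must come before any catch-all route. For orientation, a complete file might look roughly like the sketch below; the html5-apps-repo-rt route is the one typically generated by the Fiori tools, so verify it against your own project.

{
  "welcomeFile": "/index.html",
  "authenticationMethod": "route",
  "routes": [
    {
      "source": "^/api/(.*)$",
      "target": "/inference/deployments/<DEPLOYMENT_ID>/$1",
      "authenticationType": "xsuaa",
      "destination": "GENERATIVE_AI_HUB",
      "csrfProtection": true
    },
    {
      "source": "^(.*)$",
      "target": "$1",
      "service": "html5-apps-repo-rt",
      "authenticationType": "xsuaa"
    }
  ]
}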

 

5. Call the LLM API

Now, you can call the LLM API from your code. To demonstrate, I’m using the browser’s native fetch API to make a REST call.

 

// first get the relative module path
const sModulePrefix = this.getOwnerComponent().getManifestEntry("/sap.app/id");

// make an API call to an LLM
const res = await fetch(
  `${sModulePrefix}/api/chat/completions?api-version=2024-02-01`,
  {
    method: "POST",
    headers: {
      "X-CSRF-Token": "<X-CSRF-Token>", // recommended for security. You can disable this for testing by setting `csrfProtection: false` in `xs-app.json` for the `/api` route
      "Content-Type": "application/json",
      "AI-Resource-Group": "default", // mandatory
    },
    credentials: "same-origin",
    body: JSON.stringify({
      messages: [
        {
          role: "system",
          content: "Write a TL;DR/summary of the user content in a paragraph.",
        },
        {
          role: "user",
          content: sTxtInput,
        },
      ],
    }),
  }
);
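Two details of the snippet above deserve a closer look: obtaining the CSRF token and handling the response. The sketch below illustrates both; it is a minimal example rather than production-ready code. It assumes the standard approuter CSRF handling (token fetched via an X-CSRF-Token: Fetch header) and an OpenAI-style chat completions response shape (choices[0].message.content), so adapt it to your actual deployment.

// 1) Fetch a CSRF token from the approuter before the POST call above.
//    Any route protected by the approuter works; the app root is used here as an assumption.
const oTokenResponse = await fetch(`${sModulePrefix}/`, {
  method: "HEAD",
  headers: { "X-CSRF-Token": "Fetch" },
  credentials: "same-origin",
});
const sCsrfToken = oTokenResponse.headers.get("X-CSRF-Token");
// pass sCsrfToken as the "X-CSRF-Token" header in the POST request shown above

// 2) Read the completion from the response of the POST call.
if (res.ok) {
  const oData = await res.json();
  const sSummary = oData.choices?.[0]?.message?.content ?? "";
  // e.g. put sSummary into a JSON model and bind it to a TextArea
} else {
  console.error("LLM call failed:", res.status, await res.text());
}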

 

6. Deploy the SAPUI5 app

Now, you can build the MTA app and deploy it by executing the following commands:

 

# Build an MTAR archive
mbt build

# Deploy the MTAR archive to the logged-in space
npm run deploy

 

7. Bonus: How to configure during development

The steps above showed how to deploy an SAPUI5 app integrated with LLM capabilities. However, having to rebuild and redeploy the app after every change is not a good developer experience. The following steps show how to connect your local development setup with the BTP Destination and XSUAA services so that everything works seamlessly during development, whether in SAP Business Application Studio or in VS Code.

7a. Deploy the SAPUI5 app once

If you haven’t done so already, deploy your SAPUI5 app at least once so that the Destination and XSUAA service instances exist in your space; the local setup needs their credentials for Destinations with OAuth2ClientCredentials to work locally.

 

# Build an MTAR archive
mbt build

# Deploy the MTAR archive to the logged-in space
npm run deploy

 

7b. Create a default-env.json

Now, create a default-env.json file with the following content and replace the placeholders with actual credentials.

 

{
  "VCAP_SERVICES": {
    "destination": [
      {
        "credentials": {
          "clientid": "<CLIENT_ID>",
          "clientsecret": "<CLIENT_SECRET>",
          "url": "<AUTH_URL>",
          "uri": "<DESTINATION_URI>"
        },
        "label": "destination",
        "name": "<INSTANCE_NAME>",
        "tags": ["destination", "conn", "connsvc"]
      }
    ],
    "xsuaa": [
      {
        "credentials": {
          "clientid": "<CLIENT_ID>",
          "clientsecret": "<CLIENT_SECRET>",
          "url": "<AUTH_URL>"
        },
        "label": "xsuaa",
        "tags": ["xsuaa"]
      }
    ]
  }
}
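If you are unsure where to find these values: since the deployed app is already bound to the Destination and XSUAA service instances, one option is to read them from its environment using the Cloud Foundry CLI, as sketched below (the app name is a placeholder; check the output of cf apps for the actual module name).

# list the deployed apps to find the app/approuter module name
cf apps

# print the app environment and copy the credentials from the VCAP_SERVICES section
cf env <APP_NAME>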

 

7c. Install UI5 Middleware Plugin for Approuter

Now, install the ui5-middleware-approuter plugin, which wraps the actual @sap/approuter package so that the destination and routing configuration also work locally.

 

npm install -D ui5-middleware-approuter

 

7d. Update UI5 configuration to use the middleware

Now, update ui5.yaml and ui5-local.yaml to add ui5-middleware-approuter as a custom middleware:

 


server:
  customMiddleware:
    - name: ui5-middleware-approuter
      afterMiddleware: compression
      configuration:
        authenticationMethod: "route"
        xsappJson: "./xs-app.json"
        appendAuthRoute: true # trigger auth in order to connect to the destination service

 

Now, start the UI5 app server locally; calls to the /api endpoint will be proxied over to the LLM running in SAP AI Core.

 

npm start

 

This concludes the step-by-step approach to integrating LLMs with your SAPUI5 apps. In the subsequent blogs, I’ll walk you through practical use-case implementations: a Text Summarizer and an AI Chatbot Assistant in a freestyle SAPUI5 UI. Please note that the techniques described above work for SAP Fiori Elements apps too.

If you have specific questions on the approach, or other interesting Generative AI topics you’d like to see covered, please feel free to comment below. I’m all ears!

 

