Generative AI has broken out of research labs and is now transforming the way business is done. SAP is moving at full speed to embrace this trend and has launched an agent called Joule. In this blog series, I'll provide a "super-fast hands-on" guide to help you quickly call the default models of SAP AI Core and expand them into practical AI agents for real-world business use, so you can understand how these agents work behind the scenes.
Notice
The Japanese version is available here.
What You'll Learn in This Series
How to spin up a custom AI agent on SAP AI Core in minutes
Hands-on with LangChain, Google Search Tool, RAG, and Streamlit
Exposing the agent as a REST API and rebuilding the UI in SAPUI5/Fiori
Time Commitment
Each part is designed to be completed in 10-15 minutes.
Series Roadmap
Part 0 Prologue
Part 1 Env Setup: SAP AI Core & AI Launchpad [current blog]
Part 2 Building a Chat Model with LangChain
Part 3 Agent Tools: Integrating Google Search
Part 4 RAG Basics ①: HANA Cloud Vector Engine & Embedding
Part 5 RAG Basics ②: Building Retriever Tool
Part 6 Streamlit Basics
Part 7 Streamlit UI Prototype
Part 8 Expose as a REST API
Part 9 Rebuild the UI with SAPUI5 (1st Half)
Part 10 Rebuild the UI with SAPUI5 (2nd Half)
If you enjoyed this post, please give it kudos! Your support really motivates me. Also, if there's anything you'd like to know more about, feel free to leave a comment!
Env Setup: SAP AI Core & AI Launchpad
1 | Overview
In this chapter, we'll connect SAP AI Launchpad to SAP AI Core using the service key created for the AI Core instance. Then we'll deploy the LLM that our chat model will run on.
2 | Prerequisites
BTP subaccount
SAP AI Core instance
SAP AI Launchpad subscription
Python 3.13 and pip
VS Code, BAS, or any IDE
3 | Create a Service Key
Creating a service key gives your local scripts a secure, OAuth-based passport into SAP AI Core. Without it, Python SDK calls have no way to authenticate or discover endpoints.
Open your AI Core instance in BTP Cockpit → Instances & Subscriptions.
Go to the Service Keys tab and click Create. Give it any name.
Download the generated JSON; you'll need it later.
NOTE: Resource Group
When calling SAP AI Core from Python, you must set an extra environment variable named RESOURCE_GROUP in addition to the values contained in the service key. In this tutorial we'll simply point it to the built-in default group (see the sketch below).
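To make this concrete, here's a minimal sketch of how you might load the downloaded service key and expose its values, plus RESOURCE_GROUP, as environment variables for the Python parts of this series. The file name aicore_service_key.json and the AICORE_* variable names are my assumptions (they follow the naming convention commonly used by SAP's Python SDKs), so check the documentation of whichever SDK you end up using.

```python
import json
import os

# Placeholder path: the service key JSON downloaded from BTP Cockpit.
SERVICE_KEY_PATH = "aicore_service_key.json"

with open(SERVICE_KEY_PATH, encoding="utf-8") as f:
    key = json.load(f)

# Map the service key fields to environment variables.
# The AICORE_* names are an assumption (common SDK convention); adjust to your SDK.
os.environ["AICORE_AUTH_URL"] = key["url"] + "/oauth/token"       # XSUAA token endpoint
os.environ["AICORE_CLIENT_ID"] = key["clientid"]
os.environ["AICORE_CLIENT_SECRET"] = key["clientsecret"]
os.environ["AICORE_BASE_URL"] = key["serviceurls"]["AI_API_URL"]  # AI API endpoint

# The extra variable mentioned above: point it at the built-in "default" group.
# (Some SDK versions read AICORE_RESOURCE_GROUP instead.)
os.environ["RESOURCE_GROUP"] = "default"
```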
4 | Connect AI Launchpad
You can deploy an LLM (Large Language Model) directly to SAP AI Core using the service key you created earlier. However, in this guide, we'll use SAP AI Launchpad to make the deployment process easier.
To begin, you'll need to establish a secure connection route from SAP AI Launchpad to SAP AI Core.
From Subscriptions, open AI Launchpad → Go to Application.
Click Add (top-right) → API Connection and upload the Service Key JSON.
Select the default Resource Group in the side panel and save.
5 | Configure & Deploy an LLM
On the Configuration screen of SAP AI Launchpad, you can select the type of foundation model to use. Think of this as creating a model profile that can be shared across multiple deployments.
Let's go ahead and create the configuration. In this case, we'll use the default foundation_models scenario and configure it to use gpt-4o-mini.
Navigate to ML Operations → Configurations and click Create.
Fill in: Name = <any-name-you-like>, Scenario = foundation_models, Version = 0.0.1, Executable = azure-openai.
On the parameter screen, leave modelName = gpt-4o-mini and modelVersion = latest, then click Next → Create.
You'll land on the new configuration's detail page. Check that all fields look correct; no extra edits are needed.
Click Create Deployment (top-right), breeze through the wizard by clicking Next until Create.
Wait until Status = Running; hit the Refresh icon every few seconds. Copy the Deployment ID shown in the title bar for later use. If you'd like to double-check the deployment from Python, see the sketch below.
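Here's a rough sketch that fetches an OAuth token with the service key credentials and lists the deployments in your resource group via the AI Core REST API. It reuses the environment variables from the service key section; the /v2/lm/deployments path and the AI-Resource-Group header reflect the AI Core API as I know it, so treat this as a sanity check rather than a reference implementation.

```python
import os
import requests

# Values exported from the service key in section 3.
auth_url = os.environ["AICORE_AUTH_URL"]        # .../oauth/token
client_id = os.environ["AICORE_CLIENT_ID"]
client_secret = os.environ["AICORE_CLIENT_SECRET"]
ai_api_url = os.environ["AICORE_BASE_URL"]      # serviceurls.AI_API_URL
resource_group = os.environ.get("RESOURCE_GROUP", "default")

# 1) Fetch an OAuth token via the client-credentials flow.
token_resp = requests.post(
    auth_url,
    data={"grant_type": "client_credentials"},
    auth=(client_id, client_secret),
    timeout=30,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# 2) List deployments in the resource group and print their status.
dep_resp = requests.get(
    f"{ai_api_url}/v2/lm/deployments",
    headers={
        "Authorization": f"Bearer {access_token}",
        "AI-Resource-Group": resource_group,
    },
    timeout=30,
)
dep_resp.raise_for_status()
for dep in dep_resp.json().get("resources", []):
    # Look for your Deployment ID with status RUNNING.
    print(dep["id"], dep["status"])
```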
By the way, there are various LLMs available to run on SAP AI Core. You can browse them in the Generative AI Hub > Model Library section.
The Leaderboard also allows you to compare models based on accuracy, token cost, and other metrics.
6 | Next Up
Part 2 Building a Chat Model with LangChain
We'll finally implement the chat model! Make sure your Python development environment is ready; VS Code or a similar IDE will work perfectly!
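If you want to prepare right away, here's a tiny, optional sanity-check script. The package and module names (generative-ai-hub-sdk importing as gen_ai_hub, langchain, python-dotenv importing as dotenv) are my assumptions about what Part 2 will need, so install only what actually turns out to be missing.

```python
# Quick sanity check that the Python environment is ready for Part 2.
# Assumed packages (install with pip if missing), e.g.:
#   pip install generative-ai-hub-sdk langchain python-dotenv
import sys

print("Python:", sys.version)  # the prerequisites assume Python 3.13

for module in ("gen_ai_hub", "langchain", "dotenv"):
    try:
        __import__(module)
        print(f"{module}: OK")
    except ImportError:
        print(f"{module}: not installed yet")
```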
Disclaimer
All the views and opinions in this blog are my own, made in my personal capacity; SAP shall not be responsible or liable for any of the content published in this blog.
#SAP
#SAPTechnologyblog