This article is a translated version of the original Japanese blog:
SAP BTP上でLangChainアプリケーションを開発・実行・監視する (Developing, Running, and Monitoring LangChain Applications on SAP BTP)
Introduction
In today’s rapidly advancing era of generative AI, the demand for applications integrating AI technologies is increasing, aiming to enhance business process efficiency and create added value. SAP BTP (Business Technology Platform) has gained significant attention as a flexible and scalable platform for extending SAP S/4HANA Cloud and other business systems. Particularly, building applications leveraging generative AI on SAP BTP is the key to unlocking new possibilities.
This blog provides a step-by-step guide to developing, operating, and monitoring large language model (LLM) applications on SAP BTP. Using the “gpt-4o” model provided by Azure OpenAI as an example, we will cover everything from configuring LLM access using SAP AI Launchpad to deploying LangChain applications with SAP Build Code, and monitoring them using Langfuse. By the end, you will have a clear understanding of how to maximize the potential of SAP BTP to build scalable AI solutions.
We hope this blog helps you learn the technical fundamentals of application development with generative AI and take your first steps toward applying them in real business scenarios.
Steps
Prerequisites
SAP AI Launchpad, SAP AI Core (Extended Plan), and SAP Build Code are activated.
A Full-Stack Application-type DevSpace is created using SAP Build Code.
1. Accessing Powerful LLMs via Generative AI Hub
Access powerful external LLMs through the Generative AI Hub in SAP AI Launchpad. First, check the list of available models in the SAP Note below as needed:
https://me.sap.com/notes/3437766
For this guide, we’ll use the “gpt-4o” model provided by Azure OpenAI.
In SAP AI Launchpad, navigate to the working resource group, and go to “ML Operations” => “Settings” from the left pane. Click “Create” in the top right corner to create an LLM access configuration.
Input the information as shown in the image. The name can be arbitrary; in this case, we’ll use “gpt-4o.”
On the next page, configure the type and version of the LLM to use. Refer to the aforementioned SAP Note for the modelName and modelVersion.
After completing the settings, ensure that the modelName is displayed as “gpt-4o.” Then click the “Create Deployment” button in the top right corner to create a deployment from this configuration.
You can adjust the deployment duration and other settings, but proceed with the defaults for this example. Once the deployment is created, you should see the following screen. Wait until the LLM access point is activated and its Current Status displays “Running.”
The setup of Generative AI Hub is now complete.
2. Preparing a LangChain Application with SAP Build Code and Connecting it to Langfuse Pre-deployed on SAP BTP
Next, deploy a LangChain application to SAP BTP, Cloud Foundry Runtime using SAP Build Code. In our previous blog, we deployed Langfuse, a monitoring platform for LLM applications, on SAP BTP, Kyma Runtime. If you haven’t yet referred to it, please check it out here:
Deploying Langfuse on SAP BTP, Kyma Runtime
For this guide, clone a sample application from GitHub:
git clone https://github.com/watwatwhat/Langfuse_LangChain_onSAPBTP.git
The application is structured as a Multi-Target Application (MTA), and you can deploy it easily using the default command-line tools available in SAP Build Code.
Key Points in the Source Code:
Using the Generative AI Hub SDK to access models through SAP AI Core from LangChain.
from gen_ai_hub.proxy import get_proxy_client
from gen_ai_hub.proxy.langchain.openai import OpenAIEmbeddings as g_OpenAIEmbeddings
from gen_ai_hub.proxy.langchain.openai import ChatOpenAI as g_ChatOpenAI

proxy_client = get_proxy_client('gen-ai-hub')  # <- Generative AI Hub proxy
embeddings = g_OpenAIEmbeddings(proxy_model_name=embedding_model)  # <- OpenAI embedding model via SAP AI Core
llm = g_ChatOpenAI(proxy_model_name=model_option, proxy_client=proxy_client, temperature=0.0)  # <- OpenAI chat model via SAP AI Core
Sending logs to Langfuse.
agent = create_react_agent(
    llm=llm,
    tools=tools,
    prompt=prompt
)
agent_executor = AgentExecutor(
    agent=agent,
    tools=tools,
    handle_parsing_errors=True,
    verbose=True,
    max_iterations=5
)
from langfuse.callback import CallbackHandler
langfuse_handler = CallbackHandler(
    public_key="XXXXXXX",
    secret_key="XXXXXXX",
    host="http://XXXXXXXX:3000"
)
response = agent_executor.invoke({"input": input}, config={"callbacks": [langfuse_handler]})
Retrieve the connection information from Langfuse’s UI by clicking the “Configure Tracing” button on the dashboard page. For use with LangChain, open the LangChain tab and copy the code snippet.
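Rather than hardcoding the keys as in the snippet above, a common pattern is to read them from environment variables before constructing the callback handler. The sketch below assumes this convention (the variable names `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `LANGFUSE_HOST` are our own choice here, not required by the sample app):

```python
import os

def load_langfuse_config() -> dict:
    """Read Langfuse connection details from environment variables.

    Keeps secrets out of source control; LANGFUSE_HOST falls back to a
    local default when unset.
    """
    return {
        "public_key": os.environ["LANGFUSE_PUBLIC_KEY"],
        "secret_key": os.environ["LANGFUSE_SECRET_KEY"],
        "host": os.environ.get("LANGFUSE_HOST", "http://localhost:3000"),
    }

# The handler from the snippet above can then be built as:
# langfuse_handler = CallbackHandler(**load_langfuse_config())
```

On Cloud Foundry, these values would typically be set with `cf set-env` or in the MTA descriptor rather than in the shell.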
Deployment Steps:
Navigate to the cloned directory and build the project:
mbt build
Deploy the generated mtar file to SAP BTP, Cloud Foundry Runtime. After logging in with cf login, specify the target organization and space:
cf login
cf deploy mta_archives/langfuse-test-app_1.0.0.mtar
Once deployment is complete, access the application via the internet. Retrieve the application URL using the following command:
cf apps
For this example, the application is published as “langfuse-langchain-srv,” and the URL will be displayed alongside it. Let’s run the LangChain application.
3. Running Inference Chains with LangChain and Checking Traces on Langfuse
Using Postman installed on your local PC, send requests to this Python application. When executed, the AI agent will perform inference on SAP BTP, Cloud Foundry Runtime, and return results.
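If you prefer the command line to Postman, the request can be sketched in Python. The payload shape is an assumption based on the agent's `invoke({"input": ...})` call shown earlier, and the endpoint path is hypothetical; adjust both to match the actual application:

```python
import json

def build_inference_request(question: str) -> dict:
    # The sample agent is invoked with {"input": ...}, so we assume
    # the HTTP endpoint accepts the same JSON shape.
    return {"input": question}

payload = build_inference_request("What is the cube root of 3?")
print(json.dumps(payload))

# To actually send it, use the URL shown by `cf apps`
# (requires the `requests` package; the /ask route is hypothetical):
# import requests
# r = requests.post("https://langfuse-langchain-srv.<cf-domain>/ask", json=payload)
# print(r.json())
```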
Translation:

Japanese: 3の3乗根は?
English: What is the cube root of 3?

Japanese: 3の3乗根は約1.442です。計算機を使用して求めました。
English: The cube root of 3 is approximately 1.442. It was calculated using a calculator.
Since this example includes a trace handler using Langfuse, check the traces via Langfuse’s UI by navigating to “Tracing” => “Traces.” You will see a list of traces as shown below:
Inspect the details. The right-hand pane displays the sequence of actions performed by the AI agent implemented in LangChain. Selecting the topmost element reveals the input and output in the left-hand pane.
Following the right-hand pane, you can confirm that the “calculator” tool was used. The AI agent judged this tool, which is passed to it in the LangChain source code, to be appropriate for the question “What is the cube root of 3?” and invoked it.
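The agent's answer is easy to sanity-check locally, since the cube root of 3 is simply 3 raised to the power 1/3:

```python
# Verify the agent's calculator result: the cube root of 3.
cube_root = 3 ** (1 / 3)
print(round(cube_root, 3))  # → 1.442
```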
The statistics are also reflected on the dashboard:
While this scenario only used the calculator tool, real-world use cases can become more complex. The products introduced in this blog will prove even more effective in such scenarios.
Conclusion
This blog introduced the steps to develop, run, and monitor LLM applications using SAP BTP. From configuring Generative AI Hub via SAP AI Launchpad to deploying LangChain applications with SAP Build Code and monitoring them with Langfuse, we hope this guide demonstrated the potential of SAP BTP.
Applications integrated with generative AI bring new value to traditional business processes, enabling more efficient and flexible system operations. Additionally, utilizing monitoring tools like Langfuse allows for detailed analysis of AI behavior and inference results, improving reliability and optimizing operations.
As generative AI and LLM technologies continue to evolve, their applications across various use cases will grow. Through application development based on SAP BTP, we hope you accelerate your organization’s digital transformation. May this blog serve as the first step toward achieving that goal.
We look forward to seeing your continued efforts in leveraging SAP BTP and generative AI-related technologies for new challenges.