Integrating AI with SAPUI5 Fiori Apps: Part 2 – Building a Text Summarizer

Estimated read time: 5 min
Blog Series

Integrating AI with SAPUI5 Fiori Apps: Part 1 – Concept
Integrating AI with SAPUI5 Fiori Apps: Part 2 – Building a Text Summarizer
Integrating AI with SAPUI5 Fiori Apps: Part 3 – Building an AI Chatbot Assistant

In the previous blog in the series – Integrating AI with SAPUI5 Fiori Apps: Part 1 – Concept, I outlined the steps and the approach to integrate SAPUI5 Freestyle & Fiori Elements apps with LLMs via SAP AI Core.

I’m structuring this blog as a short walkthrough to demonstrate building a simple AI-based Text Summarizer using the approach defined in the previous blog.

You can find the reference implementation here – GitHub – anselm94/blog-sap-fiori-genaihub-llm-integration 

AI Text Summarizer – Screengrab

You can reuse the implementation to build similar client-side use cases within SAPUI5 apps, such as a Product Description Generator, Job Description Generator, Customer Query Summarizer, Campaign Idea/Title Generator, etc.

The SAPUI5 application was generated in SAP Build Code using the freestyle ‘Basic’ template. All the changes outlined in the previous blog were applied to integrate the LLM with the app, exposing an /api route which proxies requests to the LLM. The LLM chosen here is GPT-3.5-Turbo for demonstration purposes. Please note that SAP’s recommendation is to first try out the open-source LLMs available in SAP Generative AI Hub, such as Llama 3, Mixtral 8x7B, etc., before going for partner-provided LLMs such as Azure OpenAI GPT-4, Google Gemini, etc.
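For context, the approuter route from Part 1 looks roughly like the snippet below. This is a minimal sketch only – the destination name (aicore-destination) and the authentication type are assumptions you would adapt to your own landscape and the actual repository:

```json
{
  "authenticationMethod": "route",
  "routes": [
    {
      "source": "^/api/(.*)$",
      "target": "/$1",
      "destination": "aicore-destination",
      "authenticationType": "xsuaa",
      "csrfProtection": true
    }
  ]
}
```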

Building an AI-based Text Summarizer is straightforward: it takes a long text as input and produces a succinct summary as output. Once the user clicks the ‘Summarize’ button, a POST call is fired against the LLM endpoint via the /api route, and the summarized content comes back in the response.
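A minimal sketch of the button handler is shown below. The controller name, model properties, the path under /api and the helper module are assumptions for illustration, not the repository’s actual code – check the linked GitHub project for the real implementation:

```typescript
// Summarizer.controller.ts – hypothetical sketch; names and paths are assumed
import Controller from "sap/ui/core/mvc/Controller";
import JSONModel from "sap/ui/model/json/JSONModel";
import MessageToast from "sap/m/MessageToast";
import { getCsrfToken } from "./csrf"; // small helper sketched further below

export default class Summarizer extends Controller {
  public async onSummarize(): Promise<void> {
    const model = this.getView()?.getModel("view") as JSONModel;
    const inputText: string = model.getProperty("/inputText");

    // POST a chat-completions style payload to the /api route; the approuter
    // proxies it to the LLM deployment behind SAP AI Core / Generative AI Hub
    const response = await fetch("/api/chat/completions", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "X-CSRF-Token": await getCsrfToken() // required since csrfProtection is enabled
      },
      body: JSON.stringify({
        messages: [
          { role: "system", content: "Summarize the following text in 2-3 sentences." },
          { role: "user", content: inputText }
        ],
        max_tokens: 256
      })
    });

    if (!response.ok) {
      MessageToast.show("Summarization failed");
      return;
    }

    const result = await response.json();
    // Chat-completions responses carry the generated text in the first choice
    model.setProperty("/summary", result.choices[0].message.content);
  }
}
```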

At the time of writing, I could not get token streaming to work by setting stream: true; instead, I received the entire response as a single chunk. I haven’t spent enough effort testing whether @sap/approuter supports the Server-Sent Events (SSE) protocol, which is what you would need to implement ChatGPT-style token streaming.

Also note that I’ve enabled csrfProtection: true for the /api route to secure it against CSRF attacks. This implies that every POST call needs an X-CSRF-Token header; the token can be fetched once at the beginning of the session and cached within the web app.
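Fetching and caching the token can be as simple as the sketch below (again an assumption for illustration, reusing the hypothetical /api path). The approuter answers a request carrying the X-CSRF-Token: Fetch header with a valid token in the same response header:

```typescript
// csrf.ts – hypothetical helper: fetch the CSRF token once per session and cache it
let cachedToken: string | null = null;

export async function getCsrfToken(): Promise<string> {
  if (cachedToken) {
    return cachedToken; // reuse the token cached earlier in the session
  }
  // Ask the approuter for a token by sending "X-CSRF-Token: Fetch"
  const response = await fetch("/api/", {
    method: "HEAD",
    headers: { "X-CSRF-Token": "Fetch" }
  });
  cachedToken = response.headers.get("X-CSRF-Token") ?? "";
  return cachedToken;
}
```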

Building the Text Summarizer, though a simple use case, gave me a solid understanding of how to build a quick SAPUI5 app without writing any backend code.

The code is available under the open-source Apache license here – GitHub – anselm94/blog-sap-fiori-genaihub-llm-integration. Please feel free to modify it according to your needs.

If you have other interesting use cases, queries or feedback, please feel free to add a comment below. I’m all ears!

 


#SAP

#SAPTechnologyblog
