Part 1 – SQL analytics with SAP Data Products
Part 2 – Build and deploy Mosaic AI and Agent Tools
Part 3 – Connect SAP Data Products with non-SAP data from AWS S3
Part 4 – End-to-end integration: SAP Databricks, SAP Datasphere, and SAP Analytics Cloud
Part 5 – Create inferences and endpoints for application integration with SAP Build
SAP Databricks in SAP Business Data Cloud
In part 2 of this series, we move beyond SQL analytics and explore how to harness the full power of Mosaic AI and Agent Tools within SAP Databricks. These capabilities enable developers and data scientists to rapidly prototype and deploy custom AI agent functions that interact with SAP Data Products—unlocking new levels of automation, personalization, and decision intelligence across the enterprise.
The Value of AI Powered by SAP Data Products
Many companies envision AI as a simple linear path:
Data → AI → Value
In reality, achieving business value from AI is far more complex. It starts with data selection, sourcing, and synthesis, followed by rigorous data engineering (cleaning, normalization) and model development tasks such as training, evaluation, and hyperparameter tuning. The real challenge—and where many initiatives stall—is in operationalizing these models: deploying them in production, monitoring performance, and continuously retraining to maintain accuracy.
This is where SAP Databricks and SAP Data Products play a critical role. By combining SAP’s semantically rich, governed data with Databricks’ powerful analytics and machine learning platform, organizations can bridge the gap between experimentation and enterprise-scale value—making AI not just possible, but sustainable and impactful.
Let’s explore several key features of SAP Databricks and dive into two practical use cases:
Analyzing a Customer Data Product with an LLM, and integrating SAP BTP Document AI with the SAP Databricks Playground.
AI/ML Features in SAP Databricks
- AI Playground for testing generative AI models from your Databricks workspace. You can prompt models, compare outputs, and adjust settings such as the system prompt and inference parameters.
- AI Functions for applying AI, such as text translation or sentiment analysis, to data stored on Databricks.
- Mosaic AI Gateway for governing and monitoring access to supported generative AI models and their associated model serving endpoints.
- Mosaic AI Model Serving for deploying LLMs.
- Mosaic AI Vector Search, a queryable vector database that stores embedding vectors and can be configured to sync automatically with your knowledge base.
- Lakehouse Monitoring for monitoring data and tracking model prediction quality and drift, using automatic payload logging with inference tables.
- Managed MLflow for managing the AI agent and ML model lifecycle.
- Mosaic AI Agent Framework for building and deploying production-quality agents such as Retrieval-Augmented Generation (RAG) applications.
- Mosaic AI Agent Evaluation for evaluating the quality, cost, and latency of generative AI applications, including RAG applications and chains.
- AutoML for simplifying the process of applying machine learning to your datasets.
- Foundation Model Fine-tuning for customizing a foundation model with your own data to optimize its performance for your specific application.
- Unity Catalog for managing AI assets, including models and experiments.
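As a quick illustration of AI Functions, the minimal sketch below applies sentiment analysis and translation directly in SQL from a Databricks notebook (where `spark` is predefined). The `default.reviews` table and its `review_text` column are hypothetical:

```python
# Hypothetical input: a table default.reviews with a review_text column.
df = spark.sql("""
    SELECT
        review_text,
        ai_analyze_sentiment(review_text) AS sentiment,
        ai_translate(review_text, 'de')   AS review_de
    FROM default.reviews
    LIMIT 10
""")
df.show(truncate=False)
```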
SAP Databricks Model Serving
Model Serving provides a unified interface to deploy, govern, and query AI models for real-time and batch inference. Each model you serve is available as a REST API that you can integrate into your web or client application. You can deploy custom models (including scikit-learn, XGBoost, PyTorch, and Hugging Face transformer models) as well as foundation models hosted outside of Databricks.
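For illustration, here is a minimal sketch of querying a served model over its REST API from Python. The endpoint name `my-custom-model` and the input schema are assumptions:

```python
import os
import requests

# Workspace URL and token are read from the environment (assumption).
host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace-host>
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

# 'my-custom-model' is a placeholder endpoint name.
resp = requests.post(
    f"{host}/serving-endpoints/my-custom-model/invocations",
    headers={"Authorization": f"Bearer {token}"},
    json={"dataframe_records": [{"feature_a": 1.0, "feature_b": 2.0}]},
)
resp.raise_for_status()
print(resp.json())
```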
SAP Databricks Vector Search
Specifically designed for RAG applications, Vector Search delivers similarity search results that enrich LLM queries with context and domain knowledge, improving the accuracy and quality of results.
Create a Vector Search Index using the Python SDK
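A minimal sketch using the `databricks-vectorsearch` Python SDK follows. All names below (endpoint, index, source table, embedding model endpoint) are placeholders, and the source Delta table must have Change Data Feed enabled:

```python
from databricks.vector_search.client import VectorSearchClient

client = VectorSearchClient()

# A vector search endpoint hosts one or more indexes.
client.create_endpoint(name="vs_endpoint", endpoint_type="STANDARD")

# Delta Sync index: kept in sync with the source Delta table automatically.
index = client.create_delta_sync_index(
    endpoint_name="vs_endpoint",
    index_name="main.default.docs_index",
    source_table_name="main.default.docs",
    pipeline_type="TRIGGERED",
    primary_key="id",
    embedding_source_column="text",
    embedding_model_endpoint_name="databricks-gte-large-en",
)

# Query the index for passages similar to a question.
results = index.similarity_search(
    query_text="How do I publish a data product?",
    columns=["id", "text"],
    num_results=3,
)
print(results)
```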
SAP Databricks Playground
The Databricks Playground provides a built-in integration with your functions: it analyzes which functions are available and calls them as tools to answer your question.
1. Open the Playground.
2. Select a model (such as Llama).
3. Add the tools/functions you want your model to leverage.
4. Ask a question, and the Playground will do the magic for you!
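As a hedged sketch of step 3, a tool can be a SQL function registered in Unity Catalog, which you then add to the Playground as a tool. The catalog, schema, table, and function below are hypothetical:

```python
# Register a Unity Catalog SQL function the Playground can call as a tool.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.default.customer_count_by_country(
        country_name STRING COMMENT 'Country to filter customers by'
    )
    RETURNS BIGINT
    COMMENT 'Returns the number of customers in the given country'
    RETURN SELECT COUNT(*) FROM main.default.customers WHERE country = country_name
""")
```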
SAP Databricks AI Gateway
Mosaic AI Gateway offers unified access to AI/ML models through a single standard query interface. This eliminates the need to maintain separate systems to manage AI traffic. Enterprises can effortlessly switch between foundation and custom models.
AI Gateway also includes usage tracking and guardrails, supporting secure storage, sharing, and management of model traffic.
Use Case #1: Analyze Customer Data Product with LLM
Using generative AI, you can seamlessly analyze customer records from the Customer Data Product. The LLM output, consisting of business insights, is added as a new column. The enriched data is saved as a Delta table and published to the BDC Catalog.
- The AI prompt is built from key fields such as Customer Name, Country, Industry, Tax Number, City, Postal Code, and business flags.
- The databricks-meta-llama-3-3-70b-instruct LLM generates a concise business analysis per record, identifying potential issues or opportunities.
- Responses are evaluated with MLflow's agent evaluation on Databricks, ensuring clarity (e.g., checking for the presence of customer names).
- The enriched data is saved as a Delta table: default.customerllm.
- The Delta table is published as a Data Product to the BDC Catalog using the sap-bdc-connect-sdk.
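A condensed sketch of the enrichment step follows, assuming the Customer Data Product is readable as a table (the table and column names are hypothetical). The final publication via sap-bdc-connect-sdk is omitted here; see that SDK's documentation:

```python
from pyspark.sql import functions as F

# Hypothetical table exposing the Customer Data Product.
customers = spark.table("bdc.customer.customers")

# ai_query sends one prompt per record to the named serving endpoint.
enriched = customers.withColumn(
    "llm_analysis",
    F.expr("""
        ai_query(
            'databricks-meta-llama-3-3-70b-instruct',
            concat(
                'Give a concise business analysis of this customer, ',
                'identifying potential issues or opportunities. ',
                'Name: ', CustomerName,
                ', Country: ', Country,
                ', Industry: ', Industry
            )
        )
    """),
)

# Persist the enriched records as the Delta table named above.
enriched.write.mode("overwrite").saveAsTable("default.customerllm")
```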
Use Case #2: SAP Document AI & AI-Powered Validation in SAP Databricks
You can build an end-to-end document-processing pipeline covering ingestion, validation, and audit review using SAP Document AI, SAP Databricks, MLflow, and Claude AI.
- Integration with SAP BTP Document AI via API.
- Combine the result.json files within SAP Databricks Experiments, convert them into a Parquet file, and upload the result as a Delta table in Unity Catalog (sketched below).
- Use of MLflow and the Claude-3-7-sonnet model.
- Connectivity to the SAP Databricks Playground with tools/functions.
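A minimal sketch of the ingestion step, with hypothetical volume paths and table names:

```python
# Combine the Document AI result.json files into one DataFrame.
results = spark.read.option("multiLine", "true").json(
    "/Volumes/main/default/docai_results/*/result.json"
)

# Optionally keep a Parquet copy, as described above.
results.write.mode("overwrite").parquet("/Volumes/main/default/docai_results_parquet")

# Register the combined results as a Delta table in Unity Catalog.
results.write.mode("overwrite").saveAsTable("main.default.docai_results")
```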
This allows business users to reduce manual review time through automation, improve data quality and compliance with AI-driven checks, and create summaries for finance teams.
In part 2 of this series, we’ve gone beyond SQL analytics to explore how Mosaic AI tools within SAP Databricks empower teams to build production-grade, enterprise-ready AI solutions. By combining the semantic richness of SAP Data Products with Databricks’ unified data and AI platform, organizations can move from isolated experiments to operationalized AI that drives real business value.
As you continue building with SAP Business Data Cloud and SAP Databricks, the focus shifts from experimentation to governance, reusability, and scale. This is the foundation of the intelligent enterprise—where trusted SAP data fuels impactful AI, delivered seamlessly and responsibly.
#SAP
#SAPTechnologyblog