Part 1 – SQL analytics with SAP Data Products
Part 2 – Build and deploy Mosaic AI and Agent Tools
Part 3 – Connect SAP Data Products with non-SAP data from AWS S3
Part 4 – End-to-end integration: SAP Databricks, SAP Datasphere, and SAP Analytics Cloud
Part 5 – Create inferences and endpoints for application integration with SAP Build
SAP Databricks in SAP Business Data Cloud
SAP Business Data Cloud (BDC) enables seamless, governed, and semantically rich access to business data, unlocking its full potential through AI-driven analytics and intelligent applications.
An earlier SAP blog by Mona covered the foundational capabilities and value proposition of SAP Databricks in SAP Business Data Cloud. Building on those insights, this multi-part blog series takes a deep dive into pro-code AI/ML capabilities, real-world analytics with SAP Data Products, and ecosystem integration.
In this post, we focus on SQL analytics in SAP Databricks, demonstrating how seamlessly it scales on SAP Data Products exposed through SAP Business Data Cloud (BDC). With SAP BDC, these data products come pre-enriched with business context, making them immediately usable for high-impact analytics and AI.
Setting up SAP Databricks
To prepare for working with data products in SAP Databricks, a user with an administrator role must:
1. Add the SAP Databricks tenant to the formation (see Creating SAP Business Data Cloud Formations).
2. Activate one or more data packages (see Activate Data Packages).
3. Provide access to the SAP Databricks tenant (see Manage Users and Roles).
To make data products available for consumption in SAP Databricks, follow the step-by-step guide from the SAP Help Portal. Once connected, SAP Data Products will appear in SAP Databricks Unity Catalog, shared from SAP BDC Catalog—removing the need for duplication or external ETL tools.
For an additional guide, the Databricks documentation outlines admin responsibilities, data sharing configuration, and API references.
SQL Analytics with SAP Data Products
SAP Databricks SQL offers a unified development environment for querying SAP Data Products. Whether you’re analyzing financial transactions or supplier records, you can run powerful, AI-augmented SQL directly on trusted SAP data without replication.
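Once shared, Data Products can be browsed and queried like any other Unity Catalog asset. A minimal sketch, using the Cash Flow Data Product referenced later in this post (catalog, schema, and table names depend on the data packages activated in your formation):

```sql
-- Shared SAP Data Products surface as catalogs in Unity Catalog.
SHOW CATALOGS;

-- Drill into a Data Product's schemas and preview its data.
SHOW SCHEMAS IN cash_flow_dp;

SELECT *
FROM cash_flow_dp.cashflow.cashflow   -- three-level namespace: catalog.schema.table
LIMIT 10;
```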
Forecasting with SAP Databricks AI Functions – ai_forecast()
SAP Databricks provides built-in task-specific functions to automate common analytics workflows. One such function is ai_forecast(), which enables automated time series forecasting on SAP Data Products—without requiring custom ML code.
Behind the scenes, Databricks selects the optimal time series model from candidates such as ARIMA, ETS, Prophet, and XGBoost. This allows business users to perform forecasting at scale with just SQL. For full model control and customization, SAP Databricks Python notebooks and MLflow can be used to define, train, and deploy specific time series models.
The following workflow uses the Customer Data Product to forecast daily new customer counts:
1. Aggregate new customers per day
2. Set a forecasting horizon (e.g., 30 days after 2025-07-28)
3. Run ai_forecast() to generate daily projections along with upper/lower confidence intervals, as sketched below
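A minimal sketch of this workflow, assuming a hypothetical Customer Data Product table customer_dp.customer.customer with a CreationDate column (adjust the names to your activated data package):

```sql
-- Step 1: aggregate new customers per day.
WITH daily_new_customers AS (
  SELECT
    CAST(CreationDate AS DATE) AS signup_date,
    COUNT(*)                   AS new_customers
  FROM customer_dp.customer.customer      -- hypothetical table name
  WHERE CreationDate IS NOT NULL
  GROUP BY CAST(CreationDate AS DATE)
)
-- Steps 2 and 3: forecast through 2025-08-27 (30 days after 2025-07-28).
SELECT *
FROM AI_FORECAST(
  TABLE(daily_new_customers),
  horizon   => '2025-08-27',
  time_col  => 'signup_date',
  value_col => 'new_customers'
);
```

The result contains signup_date plus new_customers_forecast and the new_customers_upper/new_customers_lower confidence bounds for each forecasted day.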
Summarizing Daily Net Cash Movement with SAP Databricks Assistant
With SAP Databricks Assistant, you can craft SQL queries effortlessly from natural language prompts. For instance, using the Cash Flow Data Product, you can track daily net cash movement by company code with the following prompt:
“Summarize daily net cash flow by company code from the table cash_flow_dp.cashflow.cashflow:
- Convert PostingDate to date format as posting_date
- Filter out rows with null PostingDate, AmountInCompanyCodeCurrency, or CompanyCode
- Group by posting_date and CompanyCode
- Aggregate the sum of AmountInCompanyCodeCurrency as net_cash_flow
- Order results by posting_date”
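For this prompt, the Assistant produces a query along the following lines (a sketch of representative output; the generated SQL may differ slightly):

```sql
SELECT
  CAST(PostingDate AS DATE)        AS posting_date,
  CompanyCode,
  SUM(AmountInCompanyCodeCurrency) AS net_cash_flow
FROM cash_flow_dp.cashflow.cashflow
WHERE PostingDate IS NOT NULL
  AND AmountInCompanyCodeCurrency IS NOT NULL
  AND CompanyCode IS NOT NULL
GROUP BY CAST(PostingDate AS DATE), CompanyCode
ORDER BY posting_date;
```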
Finance teams can identify cash positions, monitor liquidity in near real-time, and optimize working capital with predictive analytics in SAP Databricks.
SAP Databricks SQL Notebook to supercharge SQL Analytics
Quality Checks on Supplier Data Product
Using a Databricks SQL notebook and generative AI, you can perform intelligent, real-time data quality assessments on the Supplier Data Product. This approach allows you to proactively identify missing or inconsistent data—such as VAT numbers, country codes, or postal codes—without building complex pipelines. The notebook walks through four steps:
1. Identify missing VAT, country, or postal code
2. Summarize the issue
3. Ask the LLM for diagnosis and recommendations
4. Store the result in a new summary table
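A condensed sketch of these steps in a single statement; the supplier table, its column names, and the ai_query() endpoint name are assumptions to adapt to your environment:

```sql
-- Steps 1 and 2: count missing fields (table and columns are hypothetical).
CREATE OR REPLACE TABLE supplier_quality_summary AS
WITH summary AS (
  SELECT
    COUNT_IF(VATRegistration IS NULL) AS missing_vat,
    COUNT_IF(Country IS NULL)         AS missing_country,
    COUNT_IF(PostalCode IS NULL)      AS missing_postal
  FROM supplier_dp.supplier.supplier
)
-- Steps 3 and 4: ask an LLM for a diagnosis and persist it in a summary table.
SELECT
  missing_vat,
  missing_country,
  missing_postal,
  ai_query(
    'databricks-meta-llama-3-3-70b-instruct',  -- substitute any serving endpoint in your workspace
    CONCAT(
      'Supplier master data has ', CAST(missing_vat AS STRING),
      ' records missing VAT numbers, ', CAST(missing_country AS STRING),
      ' missing country codes, and ', CAST(missing_postal AS STRING),
      ' missing postal codes. Diagnose likely root causes and recommend remediation steps.'
    )
  ) AS llm_recommendation
FROM summary;
```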
This supports governed, AI-powered data collaboration across teams – all without moving or duplicating data outside the SAP landscape.
In Part 1, we explored how to:
- Seamlessly access and explore SAP Data Products within SAP Databricks
- Perform enterprise-grade SQL analytics on financial and supplier data
- Enhance insights using AI-powered functions like ai_forecast() and ai_query() for forecasting and intelligent querying
In Part 2, we’ll explore how to build and deploy Mosaic AI and Agent Tools within SAP Databricks—enabling intelligent automation, real-time decisioning, and embedded AI experiences.
#SAP
#SAPTechnologyblog