The POWER of SAP Business Data Cloud with Existing Third-Party Investments


February 13th marked the announcement of SAP Business Data Cloud – SAP’s latest SaaS innovation that brings Datasphere, SAP Analytics Cloud (SAC), and BW together with an SAP Databricks partnership – providing a unified data foundation. The latest innovations with SAP Business Data Cloud also include Data Products and Insight Applications which are prepackaged, out-of-the-box objects and artifacts that can be quickly deployed, consumed, and even extended with ML/AI capabilities from Databricks – reducing the development efforts by customers. Generative AI capabilities will also be leveraged in this innovation to assist both business users and data experts in day-to-day Enterprise reporting needs. This new SaaS solution is planned for general availability in Q2 2025.

 

In today’s world, we are seeing many customers come to us with diverse data and analytics landscapes. Some customers have investments in other Hyperscalers, resulting in the same set of questions: Why SAP Business Data Cloud? Does Business Data Cloud work with my 3rd Party Data Lake/Solution? How does SAP complement my 3rd party investments?

In this blog I aim to address these questions by focusing on the power SAP Business Data Cloud brings and the top benefits we see for customers with 3rd Party Data Lake/Solution investments. I will cover the following 9 areas:

1. Connect SAP Data with Business Context
2. Leverage Zero-Copy Delta Share with Data Products
3. Consume Insight Applications with SAP Business Data Cloud
4. Extend Your Data with ML/AI Capabilities with Databricks
5. Move Your BW to the Cloud!
6. On-Demand Live Access with Data Federation through Datasphere
7. Flexible Consumption from SAP Business Data Cloud for Third Party BI Tools
8. Foster AI
9. Replicate Data Outbound to Your Third-Party Data Lakes (if Required)

Let’s dive in!

1. Connect SAP Data with Business Context

SAP Business Data Cloud’s key differentiator in the market is its Business Data Fabric architecture, which provides a semantically rich integration layer for data. This architecture maintains the Business Context of SAP data. What does this mean? The better question to ask is: What does my data mean? Business Context is the meaning of your data. This includes hierarchies, metadata, currency conversions, multiple languages, and security configuration.

For example, let’s say you are a finance business user who relies on analytic reports to capture the company’s performance and drive business decision-making. However, you are not receiving timely reports. This challenge can stem from the way finance data is extracted, modeled, and released for reporting. We commonly see customers replicate SAP data to their analytic tool. With this approach, SAP data is typically replicated at the database level, as seen in the image below, losing the Business Context – the meaning of the data – due to the lack of semantics.

Therefore, additional effort must be invested in manually rebuilding the Business Context. For instance, in the Finance Reporting use case above, the following must be rebuilt by hand: the hierarchies of department cost center locations; associations between cost centers and departments; currency conversions; languages for global departments; and security configuration for data access.
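To make that rebuild effort concrete, here is a toy, purely illustrative Python sketch of reconstructing one cost-center hierarchy path from flat parent-child rows – the kind of semantics that arrives intact with Business Context but must otherwise be rebuilt by hand (all names and rows here are hypothetical, not a real SAP table):

```python
# Illustrative sketch only: rebuilding a cost-center hierarchy from flat
# parent-child rows after the semantics were lost in replication.
flat_rows = [
    ("CC100", None),      # root: company-level cost center
    ("CC110", "CC100"),   # department under CC100
    ("CC111", "CC110"),   # team under CC110
]

def path_to_root(node, parent_of):
    """Rebuild one hierarchy path (root -> node) from flat parent links."""
    path = [node]
    while parent_of.get(node):
        node = parent_of[node]
        path.append(node)
    return list(reversed(path))

parent_of = dict(flat_rows)
path = path_to_root("CC111", parent_of)  # ["CC100", "CC110", "CC111"]
```

Every such path – plus currency conversions, language texts, and access controls – is exactly the manual work the Business Data Fabric avoids.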

As seen in the image below, with SAP Business Data Cloud’s Business Data Fabric architecture, customers can maintain the Business Context of SAP data leveraged in Data Products. Although Data Product consumption is the primary purpose of SAP Business Data Cloud, customers can also access SAP and non-SAP data on-demand through the Business Data Fabric (Datasphere). For example, as seen in the image below, Data Products containing bundles of CDS views from S/4HANA leverage the Business Data Fabric for contextualized data. The same contextualization also applies when accessing CDS views on-demand from S/4HANA. However, when accessing data through a Data Product, customers benefit from prebuilt data models for fast time-to-value reporting. This means our customers can bring the semantics of SAP data over intact, saving time and improving productivity!

The image above shows an example of SAP Business Data Cloud’s capability to access semantically rich SAP data through (1) Data Products and/or (2) on-demand CDS views through the Business Data Fabric (Datasphere).

Source: Majo’s Blog

2. Leverage Zero-Copy Delta Share with Data Products

SAP Business Data Cloud leverages Databricks’ Delta Share, an open protocol for data access that allows zero-copy sharing of Data Products. This means customers are not moving or copying the data – removing complex ETL while ensuring the data stays aligned across the Business Data Cloud landscape. Data Products are, simply put, data models that describe the data, whether it is master data or transactional data, and are ready for customers to consume. Data Products come in 2 forms: (1) SAP Managed, where SAP oversees the production and lifecycle of the product and customers use its out-of-the-box capabilities; (2) Customer Managed, where customers blend SAP and non-SAP data and/or extend Data Products to curate their own. When a data record consumed through a Data Product changes, Delta Share tracks the change, providing (1) up-to-date data and (2) knowledge of which data record changed. This ensures the data is consistent across the enterprise and that, if required, the delta change can be pinpointed.

When a Data Package is activated in the Business Data Cloud Catalog and the Data Products within the package are installed in SAP Business Data Cloud, the Delta Sharing capability can be used to share the Data Product from the Business Data Cloud Catalog to Databricks. This use case is beneficial for extending Data Products with ML/AI capabilities (step 1 & 2 in below image). The Delta Share can then be used to share the newly extended data product back to Business Data Cloud Catalog (step 3 in image below) for further modeling or consumption in an analytics tool. When Delta Sharing occurs from the Business Data Cloud Catalog to SAP Databricks, the Data Product automatically populates the Unity Catalog of Databricks to sync metadata. In this bidirectional relationship, the newly curated Data Product automatically populates the Business Data Cloud Catalog when Delta Shared from Databricks – requiring no intervention and aligned governance. Once again, ensuring data consistency and cataloging across SAP Business Data Cloud.
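For a rough sense of how the Delta Sharing protocol is addressed from the consumer side, the open-source delta-sharing Python client locates a shared table through a `<profile>#<share>.<schema>.<table>` coordinate. The profile file name and share/schema/table names below are assumptions for illustration, not real SAP-issued values:

```python
# Hedged sketch: addressing a Delta-Shared data product with the
# open-source `delta-sharing` Python client. All names are assumptions.
def table_coordinate(profile: str, share: str, schema: str, table: str) -> str:
    """Build the '<profile>#<share>.<schema>.<table>' coordinate the client expects."""
    return f"{profile}#{share}.{schema}.{table}"

url = table_coordinate("config.share", "bdc_share", "sap_finance", "cost_centers")

# With the `delta-sharing` package installed and a valid profile file
# issued by the data provider, the table could then be read zero-copy:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(url)
```

The profile file holds the provider-issued endpoint and credentials, so the consumer reads the current table state directly rather than keeping a replicated copy in sync.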

Roadmap for Q2 2025 – SAP Databricks as a native component of SAP BDC: SAP Road Map Explorer

Source: Business Data Cloud FAQ & Databricks and SAP Partnership Announcement Blog 

3. Consume Insight Apps with SAP Business Data Cloud

Insight Applications are SAP’s latest SaaS innovation with SAP Business Data Cloud. Think of them as dashboards that come with prepackaged, ready-to-run data models and visualizations. SAP Insight Apps are composed of Data Packages made up of Data Products. This differs from SAP’s existing prebuilt content, which gives customers starter content that is customer managed. SAP Insight Apps follow SAP Best Practices and leverage integrations across SAP Lines of Business (e.g., SuccessFactors) with a common data model, all while being completely managed by SAP. Prior to Insight Apps, customers had to invest time into defining measures, locating and accessing data, building models, and building the dashboard. Insight Apps remove this heavy lift: upon installation, all of this work is generated in the backend, leaving customers to consume the ready data in a dashboard – providing faster time to value!

When an Insight Application is installed, its corresponding Data Package is activated and the appropriate Data Products are installed – all with a single click that triggers the process in the backend. The outcome is ready-to-use data models and data access in a Datasphere Space, which can then be consumed in a prebuilt dashboard such as an SAP Analytics Cloud (SAC) Story. SAC is one consumption option for Insight Apps, where a live connection from SAP Datasphere feeds the data models to the dashboard. SAP plans to release additional advanced Insight Apps using other Business Technology Platform (BTP) services, expanding consumption scenarios. Customers can also enhance Insight Applications and Data Products outside the SAP Managed space to meet unique business requirements by curating their own Customer Managed Insight Apps and Data Products.

The image above summarizes the installation of an Insight Application. Source data access (SAP and non-SAP) is established and feeds the installed Data Products in their respective deployed Datasphere Space. Artifacts ingest, prepare, and expose data from analytic models within SAP BDC’s Datasphere Space to an Insight App containing relevant visualizations.

Source: Business Data Cloud FAQ

4. Extend Your Data with ML/AI Capabilities using Databricks

SAP Business Data Cloud brings a powerful partnership with Databricks to the table. In this partnership, SAP pairs contextualized business data with a leading data processing platform – Databricks. Data Scientists and Data Engineers benefit from access to Databricks notebooks for pro-code development and Mosaic AI for ML/AI lifecycle and production capabilities – including generative AI and LLMs. As mentioned earlier, customers will be able to leverage Delta Sharing (zero-copy) of Data Products to Databricks to further extend them with ML/AI capabilities and/or create new Data Products. Here, pro-code data engineering can blend contextualized SAP data with 3rd-party data and build governed, secured custom AI/ML solutions. SAP Databricks is optimized to share data (Delta Share), govern and publish curated data products, build Spark pipelines, analyze data at scale with SQL notebooks, and provide further pro-code capabilities for custom AI scenarios.
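As a purely conceptual illustration (plain Python, no Spark or Mosaic AI involved), "extending" a Data Product can be pictured as deriving a new ML signal from the shared records and shipping it back as an additional artifact. The records and field names below are made up:

```python
# Illustrative sketch only: deriving a naive forecast from a data-product-like
# set of records. In SAP Databricks this kind of logic would run in a
# notebook over the Delta-Shared table; everything here is hypothetical.
records = [
    {"month": "2024-01", "revenue": 100.0},
    {"month": "2024-02", "revenue": 120.0},
    {"month": "2024-03", "revenue": 110.0},
]

def forecast_next(rows, window=3):
    """Naive forecast: mean revenue over the last `window` months."""
    tail = [r["revenue"] for r in rows[-window:]]
    return sum(tail) / len(tail)

# The "extended" data product: source records plus a predicted row that
# could be Delta-Shared back to the Business Data Cloud Catalog.
extended = records + [{"month": "2024-04", "revenue_forecast": forecast_next(records)}]
```

In practice the model would be far richer (Mosaic AI, LLMs, Spark pipelines), but the round trip is the same: consume the shared product, derive new value, publish the result back to the catalog.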

Roadmap for Q2 2025 – SAP Databricks as a native component of SAP BDC: SAP Road Map Explorer

Source: Business Data Cloud FAQ & Databricks and SAP Partnership Announcement Blog 

5. Move Your BW to the Cloud!

SAP Business Data Cloud (BDC) allows BW customers to continue leveraging BW investments in a cloud landscape. With SAP Business Data Cloud, customers can run BW as a SaaS offering with BW 7.5 PCE or BW/4HANA PCE. Existing on-premises customers can take advantage of this through a lift-and-shift move. By doing so, customers can decommission on-premises architecture, reducing total cost of operations. Once in the cloud, the Data Provisioning tool acts as data transfer tooling that shares InfoProviders with Business Data Cloud’s Object Store, thereby allowing some BW data to be consumed in Data Products. Customers can also continue using existing BW use cases in BDC: first, by leveraging the semantic onboarding capability of Business Data Cloud’s Datasphere. Through semantic onboarding, BW objects are onboarded with their metadata attached along with associated lower-level objects. This allows customers to access and tailor a BW object and its lower-level objects when creating new data models – all from within Business Data Cloud. Second, customers can leverage Data Products to blend with BW data to curate new scenarios – as mentioned in the “On-Demand Live Access with Data Federation through Datasphere” section.

Source: Business Data Cloud FAQ

 

6. On-Demand Live Access with Data Federation via Datasphere

SAP Business Data Cloud has data federation capabilities through its Datasphere component, meaning you DO NOT have to replicate your data. In other words, you can access remote data on-demand, virtually, from Datasphere without physically moving it.

However, customers may now wonder ‘When is it best to use this approach?’

1. When no Data Product meets the desired scenario outcome.

If there is no Data Product that meets the use case requirements, then customers can leverage Datasphere’s connectivity capabilities with SAP and non-SAP data sources to federate data inbound for data modeling and consumption purposes.

2. When a federated data connection is available and there is no need for replication.

Data federation removes some of the challenges of data replication: it reduces storage costs and removes the need to maintain multiple replicated copies of your data, overcoming the challenge of tracking the latest version. However, performance is best when querying smaller volumes of data in analytics reports – something to keep in mind when accessing federated data.

3. Blending live data with data products.

This approach is also beneficial when blending data with Data Products. For example, customers can federate 3rd Party data (e.g., Microsoft SQL Server), BW Bridge data, and/or BW 7.5 PCE or BW/4HANA PCE data into Datasphere for further blending and modeling with Data Products. In addition, customers can switch to data replication with the click of a button if required, still allowing them to blend data through a different data access method.

In addition, we also hear questions around ‘Can Datasphere federate data from 3rd Party Data Lakes?’

When it comes to federating from a 3rd Party Data Lake, customers can use a runtime engine on top of their Data Lake to enable faster query execution on large volumes of data, including the retrieval and reading of parquet files. Examples of such runtime engines include Apache Spark and Apache Hive. The benefit here is that SAP Business Data Cloud brings contextually rich SAP data together with virtualized data from 3rd Party Data Lakes.
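As a hedged sketch of what "a runtime engine on top of the Data Lake" can look like in practice, Spark SQL can register a view directly over parquet files so that federated queries hit a fast SQL surface instead of raw files. The bucket path and view name below are assumptions:

```python
# Hedged sketch: defining a Spark SQL view over parquet files in a
# 3rd-party data lake. The S3 path and view name are made up for
# illustration; Spark SQL's `parquet.`<path>`` syntax reads files directly.
lake_path = "s3://corp-lake/sales/orders/"

create_view_sql = (
    "CREATE OR REPLACE VIEW orders_v AS "
    f"SELECT * FROM parquet.`{lake_path}`"
)

# With a live SparkSession (e.g. on Databricks), this statement would be
# executed as: spark.sql(create_view_sql)
# Datasphere could then federate against `orders_v` rather than raw files.
```

The point of the runtime engine is exactly this: large parquet volumes become queryable through SQL, which is what a federated connection needs.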

Check out the SAP Discovery Center to complete a mission leveraging Datasphere’s data federation capabilities to consume Google BigQuery data or Azure Synapse data – 2 examples of solutions containing runtime engines for quick retrieval and reading of Data Lake files. You can also try this for yourself by starting the mission!

To see an example of the steps, look at this blog to see how easily you can virtualize data from Azure Synapse in Datasphere for reporting in SAP Analytics Cloud.

 

The above image zooms into Datasphere’s architecture (a component of SAP Business Data Cloud) in relation to its data acquisition capabilities through data federation and replication. Disclaimer – this image does not showcase how Data Products and Insight Apps interact with BDC’s Datasphere.

Source: Data Federation in Datasphere – SAP Community

7. Flexible Consumption from SAP Business Data Cloud for Third Party BI Tools

Customers often ask, ‘What if I have other BI tools I want to keep using, can I do so with SAP Business Data Cloud?’

The answer is yes! Besides Insight Apps, customers can consume their data models in third-party BI tools.

Customers can leverage third-party analytic tools with SAP Business Data Cloud by connecting to its Datasphere component in 2 ways: an ODBC connection or an OData connection. However, each connection is better suited to different use cases. As shown in the image below, both options let you consume views (models with calculations, filters, aggregations, projections, etc.). OData connections additionally allow customers to consume Datasphere’s analytic models, a unique modeling capability that combines multiple views, allowing business users to find answers to their business questions faster.

When evaluating which connection better suits your business requirement, it is important to understand the key differences between OData and ODBC. ODBC (SQL) is a standard protocol for connecting applications to databases, covering the exchange of data, authentication, and metadata. Through the OLAP engine, ODBC connections can query views (dimensions, measures, calculations, descriptions) from Datasphere. ODBC handles large datasets with high performance, where queries return flattened, metadata-rich output tables. This means that although large data loads do not significantly impact performance, hierarchies and some aggregations must be rebuilt in the consumption tool, such as Power BI. OData, on the other hand, is a function-rich, REST-based protocol that can retrieve more advanced queries and metadata than ODBC, requiring no rebuild of aggregations and hierarchies. Unlike ODBC, this connection performs best for smaller datasets and allows business users to drill deeper into their data from a view or analytic model.

When deciding between the two options, customers should consider (1) the desired outcome of the question, (2) the frequency of data changes/updates, and (3) data access management. For example, if the desired outcome of a query is broad and requires loading a large dataset, ODBC’s performance may be more suitable. However, if the data changes frequently and drill-down is needed in the consumption layer, customers must weigh the effort of rebuilding hierarchies and some aggregations. Specific desired outcomes that result in smaller datasets and require deep drill-downs may better suit OData. Additionally, access management may be a determinant, as data access controls are more strictly respected with OData connections. These considerations are key when determining how to connect third-party analytic applications to Datasphere.
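To make the two access styles tangible, here is a hedged sketch of the request shape each one implies. The tenant host, space, and asset names are assumptions, so consult SAP Help for the actual endpoint layout before relying on either:

```python
# Hedged sketch of the two Datasphere access styles. Host, space, and
# asset names are hypothetical placeholders.

# OData style: a REST URL with query options – suits smaller, drill-down
# friendly result sets with hierarchies and aggregations preserved.
base = "https://my-tenant.hcs.cloud.sap/api/v1/dwc/consumption"
odata_url = f"{base}/relational/SALES_SPACE/SalesOrders?$select=CostCenter,Amount&$top=100"

# ODBC style: a SQL statement sent over an ODBC driver connection – suits
# large flattened extracts, with hierarchies rebuilt later in the BI tool.
odbc_sql = (
    'SELECT "CostCenter", SUM("Amount") '
    'FROM "SALES_SPACE"."SalesOrders" '
    'GROUP BY "CostCenter"'
)
```

The contrast mirrors the trade-off above: OData carries richer semantics per request; ODBC moves bigger result sets faster but leaves semantic rebuild to the consumer.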

Further details can also be found in SAP Help for OData APIs and ODBC connections.

Datasphere and SAP Analytics Cloud (SAC) are highly integrated solutions that allow for streamlined data modeling and reporting. This means approved changes made to models in Datasphere can be reflected in live reports in SAC. With Datasphere and SAC, there is no burden of choosing a connection option, as SAP provides a prebuilt, integrated, native protocol sharing a common data foundation, HANA, resulting in fast and full consumption of metadata. In addition, the Business Data Cloud Catalog captures both the metadata and lineage of data objects, from Datasphere through to their consumption reports in SAC. This is a highly beneficial setup for use cases that report on high-value SAP data, where the data gravity sits with SAP, blended with some 3rd-party data. Customers can acquire 3rd-party data into Datasphere, model it with Business Context-rich SAP data, and consume it live in SAP Analytics Cloud (SAC) with cataloged metadata and lineage.

This image above shows the lineage of data objects in Datasphere being consumed in the Sales & Deliveries report within SAP Analytics Cloud (SAC) – highlighting the native connectivity.

Source: Consume Data in SAP Analytics Cloud via a Live Connection | SAP Help Portal

8. Foster AI

SAP Business Data Cloud provides contextualized data that not only delivers the data modeling and reporting benefits discussed above but also fosters AI by providing business meaning. SAP’s product vision for AI with SAP Business Data Cloud includes the following:

- A Generative AI assistant is planned to (1) support administrative tasks and (2) augment additional workflows through SAP Analytics Cloud (SAC). SAP Road Map Explorer
- Just Ask in SAC will fully support Datasphere, allowing natural language querying on Datasphere models while leveraging the HANA Vector engine and retrieval-augmented generation (RAG) for grounding – resulting in reliable results for business and analytics teams. SAP Road Map Explorer
- The SAP metadata Knowledge Graph will allow mapping of ontologies between data models to better understand relationships between entities, grounded in customer-specific business context. This capability is also paired with the HANA Vector engine to provide reliable results, and will enrich Joule with insights from the models and customer-specific information. Check out the announcement of Knowledge Graph and how it will be used by SAP Joule
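As a purely conceptual toy (this is not the HANA Vector engine), the RAG grounding step can be pictured as retrieving the semantically closest model description before generating an answer. The embeddings and model names below are made up:

```python
# Conceptual toy sketch of RAG grounding: pick the model whose (toy)
# embedding is closest to the question's embedding, so the generated
# answer is grounded in the right model's metadata. All vectors and
# names are fabricated for illustration.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "embeddings" of two Datasphere model descriptions.
model_vectors = {"sales_model": [1.0, 0.0], "hr_model": [0.0, 1.0]}
question_vector = [0.9, 0.1]  # pretend embedding of a natural-language question

# Retrieve the closest model to ground the generated answer on.
grounding = max(model_vectors, key=lambda m: cosine(question_vector, model_vectors[m]))
```

Real systems use high-dimensional embeddings and an indexed vector store, but the retrieval-then-generate pattern is the same.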

More to come on the details of this product vision from SAP.

 

9. Replicate Data Outbound to your 3rd Party Data Lakes (if Required)

Another benefit of SAP Business Data Cloud (BDC) is the ability to replicate data outbound to your 3rd Party Data Lake. With SAP BDC’s Datasphere component, Replication Flows address several use cases for replicating mass entities of data inbound to Datasphere or outbound to your 3rd Party Data Lake. These use cases can be better understood from Majo’s Blog in the section ‘Need to Push SAP S/4HANA Data Out to a Hyperscaler Data Lake / 3rd Party Data System?’ However, it is important to note that some rebuild of the Business Context is required when moving data outside of SAP. For further information, check out Majo’s blog under the section ‘Considerations of moving SAP Data Out’.

To understand how easily Replication Flows can be set up in Datasphere, check out Cameron’s blog, which walks step by step through replicating mass entities from an S/4HANA source to a target Azure Data Lake.

 

 

To get a better visual of SAP Business Data Cloud and how you can interact with SAP Data Products, Insight Apps, Databricks, & more – take the product tour!

 

 

​ February 13th marked the announcement of SAP Business Data Cloud – SAP’s latest SaaS innovation that brings Datasphere, SAP Analytics Cloud (SAC), and BW together with an SAP Databricks partnership – providing a unified data foundation. The latest innovations with SAP Business Data Cloud also include Data Products and Insight Applications which are prepackaged, out-of-the-box objects and artifacts that can be quickly deployed, consumed, and even extended with ML/AI capabilities from Databricks – reducing the development efforts by customers. Generative AI capabilities will also be leveraged in this innovation to assist both business users and data experts in day-to-day Enterprise reporting needs. This new SaaS solution is planned for general availability in Q2 2025. In today’s world, we are seeing many customers come to us with diverse data and analytics landscapes. Some customers have investments in other Hyperscalers, resulting in the same set of questions: Why SAP Business Data Cloud? Does Business Data Cloud work with my 3rd Party Data Lake/Solution? How does SAP complement my 3rd party investments?In this blog I aim to address these questions by focusing on the Power SAP Business Data Cloud brings and the top benefits we see from our customers with 3rd Party Data Lake/Solution investments. In this blog I will cover the following 9 areas:Connect SAP Data with Business ContextLeverage Zero-Copy Delta Share with Data ProductsConsume Insight Applications with SAP Business Data CloudExtend your Data with AI/ML Capabilities with DatabricksMove your BW to the Cloud!On-Demand Live Access with Data Federation through DatasphereFlexible Consumption from SAP Business Data Cloud for Third Party BI ToolsFoster AIReplicate Data Outbound to Your Third-Party Data Lakes (if Required)Let’s dive in!1. 
Connect SAP Data with Business ContextSAP Business Data Cloud’s key distinguisher on the market is its Business Data Fabric architecture, providing a semantically rich integration layer for data. This architecture maintains SAP data Business Context. What does this mean? The better question you need to be asking is What does my data mean? Business context is the meaning of your data. This includes the hierarchies, metadata, currency conversions, multi-languages, and security configuration.For example, let’s say you are a finance business user that relies on analytic reports to capture the company’s performance and drive business decision making. However, you are not receiving timely reports. This challenge can be due to the way finance data is being extracted, modeled, and released for reporting. We commonly see many customers replicate SAP data to their analytic tool. With this approach, SAP data is typically being replicated at the database level, as seen in the image below, losing the Business Context or meaning of their data due to the lack of semantics.Therefore, additional effort needs to be invested into manual rebuild of the Business Context. For instance, in the case of the Finance Reporting Use Case mentioned above, manual rebuild is required of the hierarchies of the department cost center locations; associations between cost centers and departments; currency conversions; languages for global departments; and security configuration of data access.As seen in the image below, with SAP Business Data Cloud’s Business Data Fabric architecture customers can maintain the Business Context of SAP data leveraged in Data Products. Although, Data Product consumption is the purpose of SAP Business Data Cloud, customers can also access SAP and non-SAP data on-demand through the Business Data Fabric (Datasphere). For example, as seen in the image below, Data Products containing bundles of CDS views from S/4HANA leverage the Business Data Fabric for contextualized data. 
The same contextualized data can also be applied when just accessing CDS views on-demand from S/4HANA. However, when accessing it through a Data Product, customers benefit from leveraging prebuilt data models for fast time to value reporting. This means our customers can bring the semantics from SAP data, intact, saving time and improving productivity!The image above shows an example of SAP Business Data Cloud’s capability to access semantically rich SAP data through (1) Data Products and/or (2) on-demand CDS views through the Business Data Fabric (Datasphere).Source: Majo’s Blog2. Leverage Zero-Copy Delta Share with Data ProductsSAP Business Data Cloud leverages Databricks’ Delta Share, an open protocol for data access, allowing for zero copy sharing of Data Products. Meaning customers are not moving or copying the data – removing complex ETL while ensuring the data is aligned across the Business Data Cloud landscape. Data Products are, simply put, data models that describe the data whether its master data or transactional data and are ready for customers to consume. Data Products can come in 2 forms: (1) SAP Managed, meaning SAP oversees the production and lifecycle of the product and customers make use of their out-of-box capabilities; (2) Customer Managed where customers can blend SAP and non-SAP data and/or extend data products to curate their own. When a data record being consumed through a Data Product experiences a change, Delta Share tracks this change, providing (1) up to date data and (2) knowledge of what data record changed. This ensures the data is consistent across the enterprise and that, if required, the delta change can be pinpointed.When a Data Package is activated in the Business Data Cloud Catalog and the Data Products within the package are installed in SAP Business Data Cloud, the Delta Sharing capability can be used to share the Data Product from the Business Data Cloud Catalog to Databricks. 
This use case is beneficial for extending Data Products with ML/AI capabilities (step 1 & 2 in below image). The Delta Share can then be used to share the newly extended data product back to Business Data Cloud Catalog (step 3 in image below) for further modeling or consumption in an analytics tool. When Delta Sharing occurs from the Business Data Cloud Catalog to SAP Databricks, the Data Product automatically populates the Unity Catalog of Databricks to sync metadata. In this bidirectional relationship, the newly curated Data Product automatically populates the Business Data Cloud Catalog when Delta Shared from Databricks – requiring no intervention and aligned governance. Once again, ensuring data consistency and cataloging across SAP Business Data Cloud.Roadmap for Q2 2025 – SAP Databricks as a native component of SAP BDC: SAP Road Map ExplorerSource: Business Data Cloud FAQ & Databricks and SAP Partnership Announcement Blog 3. Consume Insight Apps with SAP Business Data CloudInsight Applications are SAP’s latest SaaS innovation with the SAP Business Data Cloud. Think of them as a dashboard that comes with prepackaged ready to run data models and visualizations. SAP Insight Apps are comprised of Data Packages made up of Data Products. This differs from prebuilt content because SAP prebuilt content allows customers to have starter content which are customer managed. SAP Insight Apps follow SAP Best Practices and leverage integrations across SAP Lines of Businesses (eg. SuccessFactors, etc.) with a common data model all while being completely managed by SAP. Prior to Insight Apps, customers had to invest time into defining measures, locating and accessing data, building models, and building the dashboard. 
Insight Apps remove this heavy lift as upon installation customers will have all the work generated in the backend, leaving them to consume the ready data in a dashboard – providing faster time to value!When an Insight Application is installed, it’s corresponding Data Package is activated, and appropriate Data Products are installed – all done with a click that triggers this process in the backend.  The outcome results in ready to use data models and data access in a Datasphere Space, which can then be consumed in the prebuilt dashboard such as a SAP Analytics Cloud (SAC) Story. SAC is one of the consumption solutions for Insight Apps where a live connection from SAP Datasphere feeds the data models to the dashboard. SAP is planning to release additional advanced Insight Apps using other Business Technology Platform (BTP) services, expanding consumption scenarios. Customers can also enhance Insight Applications and Data Products outside of the SAP Managed space to meet unique business requirements by curating their own Customer Managed Insight Apps and Data Products.The image above summarizes the installation of an Insight Application. Source data access (SAP and non-SAP) is established and feeds the installed Data Products in their respective deployed Datasphere Space. Artifacts ingest, prepare, and expose data from analytic models within SAP BDC’s Datasphere Space to an Insight App containing relevant visualizations.Source: Business Data Cloud FAQ4. Extend Your Data with ML/AI Capabilities using DatabricksSAP Business Data Cloud brings a powerful partnership with Databricks to the table. In this partnership, SAP provides contextualized business data with a leading data processing platform – Databricks. Data Scientists and Data Engineers benefit by having access to Databricks notebooks for pro-coding and Mosaic AI for ML/AI lifecycle and production capabilities – including generative AI and LLMs. 
As mentioned earlier, customers, will be able to leverage Delta Sharing (zero-copy) of Data Products to Databricks to further extend them with ML/AI capabilities and/or create new Data Products. Here, pro-code data engineering can blend contextualized SAP data with 3rd party data as well as build governed and secured custom AI/ML solutions. SAP Databricks is optimized to share data (Delta Share), govern and publish curated data products, build spark pipelines, analyze data at a scale with SQL notebooks, and provide further pro-coding capabilities for custom AI scenarios.Roadmap for Q2 2025 – SAP Databricks as a native component of SAP BDC: SAP Road Map ExplorerSource: Business Data Cloud FAQ & Databricks and SAP Partnership Announcement Blog 5. Move Your BW to the Cloud!SAP Business Data Cloud (BDC) allows BW customers to continue leveraging BW investments in a cloud landscape. With SAP Business Data Cloud, customers can leverage BW as a SaaS offering with BW 7.5 PCE or BW/4HANA PCE. Existing on-prem customers can take advantage of this through a lift and shift movement. By doing so, customers can decommission on-premise architecture, reducing total cost of operations. Once in the cloud, the Data Provisioning tool acts as a data transfer tooling to allow for sharing of InfoProviders with the Business Data Cloud’s Object Store, thereby allowing some BW data to be consumed in Data Products. Customers can also continue using existing BW use cases in BDC, firstly, by leveraging the semantic onboarding capability of Business Data Cloud’s Datasphere. Through semantic onboarding, BW objects can be onboarded with their metadata attached along with associated lower-level objects. This allows customers to access and tailor the BW object along with lower-level objects when creating new data models – all from within Business Data Cloud. 
Secondly, customers can leverage Data Products to blend with BW data to curate new scenarios – as mentioned in the "On-Demand Live Access with Data Federation through Datasphere" section.

Source: Business Data Cloud FAQ

6. On-Demand Live Access with Data Federation via Datasphere

SAP Business Data Cloud has data federation capabilities through its Datasphere component, meaning you DO NOT have to replicate your data. In other words, you can virtually access on-demand remote data from Datasphere without physically having to move it. However, customers may now wonder, 'When is it best to use this approach?'

1. When no Data Product meets the desired scenario outcome. If there is no Data Product that meets the use case requirements, then customers can leverage Datasphere's connectivity capabilities with SAP and non-SAP data sources to federate data inbound for data modeling and consumption purposes.

2. When a federated data connection is available and there is no need for replication. Data federation removes some of the challenges experienced with data replication: it improves storage costs and removes the need to maintain multiple replicated versions of your data, overcoming the challenge of tracking the latest version. Keep in mind, however, that performance is best when federated queries touch smaller volumes of data in analytics reports.

3. Blending live data with Data Products. Federation can also be beneficial when blending data with Data Products. For example, customers can federate 3rd party data (e.g., Microsoft SQL Server), BW Bridge data, and/or BW 7.5 PCE or BW/4HANA PCE data into Datasphere for further blending and modeling with Data Products.
In addition, customers can easily switch to data replication with a click of a button if required, thereby still allowing them to blend data through a different data access method.

We also hear questions around 'Can Datasphere federate data from 3rd Party Data Lakes?' When federating from a 3rd party Data Lake, customers can use a runtime engine on top of the Data Lake to allow for faster query execution on large volumes of data, including the retrieval and reading of Parquet files. Examples of such runtime engines include Apache Spark, Apache Hive, etc. The benefit here is that SAP Business Data Cloud adds contextually rich SAP data to virtualized data from 3rd party Data Lakes.

Check out the SAP Discovery Center to complete a mission leveraging Datasphere's data federation capabilities to consume Google BigQuery data or Azure Synapse data – two examples of solutions containing runtime engines for quick retrieval and reading of Data Lake files. You can also try this for yourself by starting the mission! To see an example of the steps, look at this blog to see how easily you can virtualize data from Azure Synapse in Datasphere for reporting in SAP Analytics Cloud.

The image above zooms into the architecture of Datasphere (a component of SAP Business Data Cloud) in relation to its data acquisition capabilities through data federation and replication. Disclaimer – this image does not showcase how Data Products and Insight Apps interact with BDC's Datasphere.

Source: Data Federation in Datasphere – SAP Community

7. Flexible Consumption from SAP Business Data Cloud for Third-Party BI Tools

Customers often ask, 'What if I have other BI tools I want to keep using – can I do so with SAP Business Data Cloud?' The answer is yes!
Besides Insight Apps, customers can consume their data models in their third-party BI tools. Customers can leverage third-party analytic tools with SAP Business Data Cloud by connecting to its Datasphere component in two ways: an ODBC connection or an OData connection. Each connection is better suited to different use cases. As shown in the image below, both options give you the ability to consume views (models with calculations, filters, aggregations, projections, etc.). OData connections additionally allow customers to consume Datasphere's analytic models, a unique modeling capability that combines multiple views, allowing business users to find the answers to their business questions faster.

When evaluating which connection better suits your business requirement, it is important to understand the key differences between OData and ODBC. ODBC (SQL) is a standard protocol connection between applications and databases covering the exchange of data, authentication, and metadata. With the OLAP engine, ODBC connections can query views (dimensions, measures, calculations, descriptions) from Datasphere. It can handle large datasets with high performance, where queries return flattened, metadata-rich data within output tables. This means that although large data loads do not significantly impact performance, hierarchies and some aggregations would require a rebuild in the consumption tool, such as Power BI. OData, on the other hand, is a function-rich REST-based protocol that can retrieve more advanced queries and metadata than ODBC, requiring no rebuild of aggregations and hierarchies. Unlike ODBC, this connection performs best for smaller datasets and allows business users to drill down deeper into their data from a view or analytic model.

When deciding between the two options, customers must consider (1) the desired outcome of their question, (2) the frequency of data changes/updates, and (3) data access management.
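As a small illustration of the OData side of this comparison, the sketch below builds a consumption query URL for a Datasphere view. The path shape follows SAP Help's documented relational OData consumption API, but the tenant host, space, asset, and column names here are made up – verify the exact endpoint for your tenant in SAP Help.

```python
from urllib.parse import urlencode, quote

def datasphere_odata_url(tenant: str, space: str, asset: str, entity: str,
                         select=None, filter_=None, top=None) -> str:
    """Build an OData consumption URL for a Datasphere view (a sketch;
    check the exact path for your tenant against SAP Help)."""
    base = (f"https://{tenant}/api/v1/dwc/consumption/relational/"
            f"{quote(space)}/{quote(asset)}/{quote(entity)}")
    params = {}
    if select:
        params["$select"] = ",".join(select)  # trim columns server-side
    if filter_:
        params["$filter"] = filter_           # push filtering to Datasphere
    if top:
        params["$top"] = top                  # OData favors smaller result sets
    return f"{base}?{urlencode(params, safe='$,')}" if params else base

# Hypothetical tenant, space, and asset names for illustration only.
url = datasphere_odata_url("mytenant.eu10.hcs.cloud.sap",
                           "SALES", "SalesOrders", "SalesOrders",
                           select=["OrderID", "NetAmount"], top=100)
print(url)
```

Note how the `$select` and `$top` options let the consuming tool request only the columns and row count it needs, which is why OData suits the smaller, drill-down-oriented result sets described above.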
For example, if the desired outcome of a query is unspecific and requires loading a large dataset, then ODBC's performance may be more suitable. However, if the data changes frequently and requires drill-down in the consumption layer, then customers must weigh the rebuild effort for hierarchies and some aggregations. Specific desired outcomes resulting in smaller datasets and requiring deep drill-downs may better suit OData. Additionally, access management may also be a determinant, as data access controls are more strictly respected with OData connections. These considerations are key when determining how to connect third-party analytic applications to Datasphere. Further details can also be found in SAP Help for OData APIs and ODBC connections.

Datasphere and SAP Analytics Cloud (SAC) are highly integrated solutions that allow for streamlined data modeling and reporting, meaning approved changes made to models in Datasphere can be reflected in live reports in SAC. With Datasphere and SAC, there is no burden of connection options, as SAP has a prebuilt, integrated, native protocol that shares a common data foundation, HANA, resulting in fast and full consumption of metadata. In addition, the Business Data Cloud Catalog captures both the metadata and the lineage of data objects within Datasphere through to their consumption report in SAC. This is a highly beneficial setup for use cases that report on high-value, high-gravity SAP data alongside some 3rd party data: customers can acquire 3rd party data into Datasphere, model it with SAP Business Context-rich data, and consume it live with SAP Analytics Cloud (SAC) with cataloged metadata and lineage.

The image above shows the lineage of data objects in Datasphere being consumed in the Sales & Deliveries report within SAP Analytics Cloud (SAC) – highlighting the native connectivity.

Source: Consume Data in SAP Analytics Cloud via a Live Connection | SAP Help Portal

8. Foster AI

SAP Business Data Cloud provides contextualized data that not only delivers the data modeling and reporting benefits discussed above, but also fosters AI by providing business meaning. SAP's product vision for AI with SAP Business Data Cloud includes the following:

A generative AI assistant is planned to (1) support administrative tasks and (2) augment additional workflows through SAP Analytics Cloud (SAC). SAP Road Map Explorer

Just Ask in SAC will fully support Datasphere, allowing natural language querying on Datasphere models while leveraging the HANA Vector Engine and retrieval-augmented generation (RAG) for grounding, resulting in reliable results for business and analytics teams. SAP Road Map Explorer

The SAP metadata Knowledge Graph will allow for the mapping of ontologies between data models to better understand the relationships between entities, grounding on customer-specific business context. This capability is also paired with the HANA Vector Engine to provide reliable results, and will enrich Joule with insights from the models and customer-specific information. Check out the announcement of Knowledge Graph and how it will be used by SAP Joule.

More to come on the details of this product vision from SAP.

9. Replicate Data Outbound to Your 3rd Party Data Lakes (if Required)

Another benefit of SAP Business Data Cloud (BDC) is the ability to replicate data outbound to your 3rd party Data Lake. With SAP BDC's Datasphere component, Replication Flows can address several use cases when it comes to replicating mass entities of data inbound to Datasphere or outbound to your 3rd party Data Lake. The use cases can be better understood from Majo's blog in the section 'Need to Push SAP S/4HANA Data Out to a Hyperscaler Data Lake / 3rd Party Data System?' However, it is important to note that some rebuild of the Business Context is required when moving data outside of SAP.
For further information, check out Majo's blog under the section 'Considerations of Moving SAP Data Out'. To understand how easily Replication Flows can be set up in Datasphere, check out Cameron's blog, which walks step by step through replicating mass entities from an S/4HANA source to a target Azure Data Lake.

To get a better visual of SAP Business Data Cloud and how you can interact with SAP Data Products, Insight Apps, Databricks, & more – take the product tour!
