Archive SAC Change Docs into Datasphere

Using SAC data change documents, you can track (audit) the data changes made to any planning model. SAP Analytics Cloud logs all changes to the transaction data of that model when users publish data from private versions to public versions.

Note that a data change log can hold up to 2^31 (i.e., 2,147,483,648, or ~2 billion) records. When the data change log reaches this size limit, publishing data will fail, and you will see the following message:

Publish cannot be performed because the data audit table has reached the maximum number of rows.

To avoid this issue, consider periodically downloading log entries as a backup before deleting them.

 

This document shows a design to back up SAC audit logs into Datasphere. A Datasphere data flow reads the change documents from the SAC Data Export Service through a Cloud Data Integration adapter-based connection, so the SAC audit log history can be backed up and preserved despite the regular housekeeping that SAC's storage capacity limit requires.

 

DES: Data Export Service

CDI: Cloud Data Integration

 

Note: With SAC seamless planning or in BDC, the SAC model definition and calculation logic remain within SAC, while the planning data and change documents are persisted in SAP Datasphere, so both systems are closely linked without manual data transfers. Therefore, this solution may not be necessary if the SAC change documents are already stored in Datasphere.

 

The following are the steps to implement the solution.

In SAC, under System – Administration – App Integration, create an OAuth client with access to the Data Export Service.

 

 

In SAC, keep the OAuth Client URLs for later configuration of the connection in Datasphere.

 

 

In SAC, copy the OAuth Client ID and Client Secret for later configuration of the connection in Datasphere.
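Before configuring anything in Datasphere, you can sanity-check the OAuth client by requesting a token directly. Below is a minimal sketch assuming a client-credentials OAuth client; the token URL, client ID, and client secret are placeholders for the values copied from the SAC App Integration page:

# Minimal sketch: request an OAuth token with the client-credentials grant.
# TOKEN_URL, CLIENT_ID and CLIENT_SECRET are placeholders for the values
# shown on the SAC App Integration page.
import requests

TOKEN_URL = "https://<your domain>/oauth/token"  # placeholder token URL from SAC
CLIENT_ID = "<client id>"
CLIENT_SECRET = "<client secret>"

resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),  # HTTP Basic auth with the client credentials
    timeout=30,
)
resp.raise_for_status()
token = resp.json()["access_token"]
print("Token acquired; expires in", resp.json().get("expires_in"), "seconds")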

 

 

In Datasphere, in a space, create a Cloud Data Integration connection pointing to the SAC Data Export Service. The URL has the following format:

https://<your domain>/api/v1/dataexport/administration
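You can also check that the Data Export Service endpoint answers before creating the connection. The following is a minimal sketch, assuming the DES administration service exposes a Namespaces entity set (as in the public Data Export API documentation) and using the token obtained from the OAuth client above (placeholder here):

# Minimal sketch: confirm the DES administration endpoint is reachable.
# ACCESS_TOKEN is a placeholder for the token obtained from the OAuth client.
import requests

DES_ADMIN_URL = "https://<your domain>/api/v1/dataexport/administration"
ACCESS_TOKEN = "<token from the OAuth client>"

resp = requests.get(
    DES_ADMIN_URL + "/Namespaces",
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # expect a namespace list that includes "sac"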

 

 

In the connection management screen, validate the connection and make sure the Data Flows feature is now enabled. Validation should return the following message:

Connection “SAC DES Change Docs” is valid. – Data flows are enabled. – Replication flows are not supported. – Remote tables are enabled.

 

In SAC, open the model that has the Audit Data flag switched on, and note down the model ID at the end of the URL. For instance:

https://<your domain name>/sap/fpa/ui/app.html?sap-ui-theme=sap_belize#/modeler&/m/model/C4003n1icbi0loo9lvgv7t4pu41
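Alternatively, the model ID can be looked up programmatically. The sketch below assumes the Data Export Service lists planning models as providers under the sac namespace, as in the public API reference; property names such as ProviderID and ProviderName are assumptions and may differ on your tenant:

# Minimal sketch: list the models (providers) exposed by the DES.
import requests

BASE_URL = "https://<your domain>/api/v1/dataexport"
ACCESS_TOKEN = "<token from the OAuth client>"

resp = requests.get(
    BASE_URL + "/administration/Namespaces('sac')/Providers",
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    timeout=30,
)
resp.raise_for_status()
for provider in resp.json().get("value", []):
    # ProviderID is the model ID used by the data flow source
    print(provider.get("ProviderID"), "-", provider.get("ProviderName"))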

 

In Datasphere, in the Data Builder, create a data flow to load the SAC audit logs into a target table.

Use the model ID to find the SAC source in the connection for the source node of the data flow. To make the daily backup more efficient, an example expression in the projection node is: “Audit_Time” >= ADD_DAYS(NOW(), -1). To preserve the audit log history that might be regularly cleaned up in SAC, use Append mode with UPSERT on the target table.
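For illustration, here is the same daily-delta idea expressed outside the data flow: a minimal sketch that pulls only the last day of audit records over OData. It assumes the model's change documents are exposed as an AuditData entity set with an Audit_Time property (mirroring the column used in the projection node) and that the service accepts a standard $filter; both are assumptions to verify on your tenant:

# Minimal sketch: fetch only the last 24 hours of audit records via OData.
import requests
from datetime import datetime, timedelta, timezone

BASE_URL = "https://<your domain>/api/v1/dataexport"
MODEL_ID = "C4003n1icbi0loo9lvgv7t4pu41"  # model ID noted from the URL
ACCESS_TOKEN = "<token from the OAuth client>"

# Same cutoff as the projection expression "Audit_Time" >= ADD_DAYS(NOW(), -1)
cutoff = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
resp = requests.get(
    BASE_URL + "/providers/sac/" + MODEL_ID + "/AuditData",
    params={"$filter": "Audit_Time ge " + cutoff},
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    timeout=30,
)
resp.raise_for_status()
print(len(resp.json().get("value", [])), "audit records in the last day")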

 

 

 

 

In Datasphere, execute the data flow to load the SAC audit logs and verify that the data is now present in Datasphere. Then delete the audit logs in SAC and execute the data flow again to confirm that the audit log history is preserved after the cleanup in SAC.

 

 

Last but not least, you can also build an analytic model on top of the audit log history table in Datasphere and create a report or visualization in SAC.

Note:

The CDI connection type supports remote tables as well as the data flow feature.

A remote table can use the following replication types, but neither of them preserves the change document history well:

Replication (snapshot)
Replication (real-time)

 

 