Handling CSV Data Deployment in CAP HDI Without Data Loss
Introduction
In SAP CAP, when CSV data is deployed through .hdbtabledata files, the deployed .hdbtabledata artifact assumes full ownership of the target table's data. As a result, any change to the CSV file overwrites the table content during the next deployment.
Background
Have you ever encountered a scenario where updating CSV data led to unexpected data loss? Here’s our experience:
In our hdi-content module, we maintained a UOM.csv (Unit of Measure) file, which is preloaded into the UOM table. This ensures that customers subscribing to our SaaS solution, SAP Omni Channel Promotional Pricing (OPPS), have immediate access to standard UOMs for defining product promotions. Since the UOM table is also exposed via CAP CDS views in the Fiori UI, customers can add their own entries as well.
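For illustration, a CAP-generated .hdbtabledata file for such a table might look roughly like the following sketch. The table, file, and column names here are illustrative (loosely based on the artifact names that appear in the deployment logs later in this post), not the exact generated content. Without any filter settings, HDI treats every row of the target table as owned by this artifact, which is why redeployments can overwrite the whole table:

{
  "format_version": 1,
  "imports": [
    {
      "target_table": "OPPC_UOM",
      "source_data": {
        "data_type": "CSV",
        "file_name": "OPPC-UOM.csv",
        "has_header": true
      },
      "import_settings": {
        "import_columns": ["code", "name", "origin"]
      }
    }
  ]
}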
The Issue
While working on a new feature that introduced another table with pre-delivered data, we discovered that any modification to the CSV file—such as adding or updating UOMs—could result in the loss of customer-maintained data in the UOM table. With many customers using our SaaS solution, this posed a significant risk of accidental data loss whenever the UOM.csv content was changed.
We needed a reliable solution to prevent customer data loss during deployments.
Exploring CAP Recommendations
According to the CAP documentation, you can use the include_filter option in .hdbtabledata to restrict which records the deployment owns (see the sketch after the list below). This requires an additional column to distinguish CSV-delivered data from user-generated data. In our case, we already had an origin column:
- origin = '02': CSV data
- origin = '10': Customer-created entries
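In a file like the one sketched above, this amounts to adding an include_filter to the import_settings section, so that the deployment only claims ownership of rows carrying the CSV-delivered origin value. The following excerpt is a sketch under the same illustrative naming assumptions:

      "import_settings": {
        "import_columns": ["code", "name", "origin"],
        "include_filter": [
          { "origin": "02" }
        ]
      }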
However, even with include_filter, we found that customer data was still being lost, and only the CSV content remained after deployment.
The CAP troubleshooting guide references SAP Note 2922271, but its solution is tailored to @sap/hdi-deploy and not directly applicable to @sap/hdi-dynamic-deploy.
The Solution
With support from the HANA-DB-DI team, we identified a practical approach to address this challenge. We hope sharing this helps others building multitenant applications on CAP and HANA Cloud.
Deployment 1: Undeploy .hdbtabledata Without Deleting Data
First, set the HDI deployer’s environment variable HDI_DEPLOY_OPTIONS in your MTA module properties to ensure the .hdbtabledata file is undeployed without deleting any data from the target table. Use the following configuration:
{
  "undeploy": [
    "src/gen/OPPC-UOM.hdbtabledata"
  ],
  "path-parameter": {
    "src/gen/OPPC-UOM.hdbtabledata:skip_data_deletion": "true"
  }
}
Then, proceed with the deployment and database upgrade. The logs below show this process:
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: Undeploying “src/gen/OPPC-UOM.hdbtabledata”…
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: Undeploying “src/gen/OPPC-UOM.hdbtabledata”… ok (0s 0ms)
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: Deploying “src/gen/OPPC.UOM.hdbtable$OPPC_UOM.validate”…
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: Expanded from “src/gen/OPPC.UOM.hdbtable”
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: Undeploying “src/gen/OPPC-UOM.hdbtabledata$0.expand”…
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: Expanded from “src/gen/OPPC-UOM.hdbtabledata”
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: The “skip_data_deletion” make parameter is set; no data will be deleted from the target table “OPPC_UOM”
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: Deploying “src/gen/OPPC.UOM.hdbtable$OPPC_UOM.validate”… ok (0s 11ms)
2025-06-03T20:20:48.69+0530 [APP/PROC/WEB/0] OUT [9d2219cebfa570a2]: Undeploying “src/gen/OPPC-UOM.hdbtabledata$0.expand”… ok (0s 12ms)
Deployment 2: Deploy .hdbtabledata with include_filter
1. Remove the HDI deployer's HDI_DEPLOY_OPTIONS environment variable used in the previous deployment.
2. Update the .hdbtabledata file to include the include_filter option.
3. Deploy and upgrade the database.
However, this deployment failed because the include_filter matched existing records in the table, as shown in the logs:
2025-06-03T20:31:52.57+0530 [APP/PROC/WEB/0] ERR Error: com.sap.hana.di.tabledata: The “include_filter” definitions match with 4364 records that already exist in the “OPPC_UOM” table [8251521]
2025-06-03T20:31:52.57+0530 [APP/PROC/WEB/0] ERR Error: com.sap.hana.di.tabledata: Deploying “src/gen/OPPC-UOM.hdbtabledata$0.expand”… failed [8212145]
2025-06-03T20:31:52.57+0530 [APP/PROC/WEB/0] ERR Error: Worker 1 has encountered an error; all remaining jobs will be canceled [8214600]
2025-06-03T20:31:52.57+0530 [APP/PROC/WEB/0] ERR Error: Processing work list… failed [8212102]
2025-06-03T20:31:52.57+0530 [APP/PROC/WEB/0] ERR Error: Making… failed [8211605]
2025-06-03T20:31:52.57+0530 [APP/PROC/WEB/0] ERR Error: Starting make in the container “679623F6772E4C3BA111AB4C1775754E” with 1 files to deploy, 0 files to undeploy… failed [8214168]
This deployment would only succeed if the database entries matching the CSV content were removed beforehand. In a multitenant application with many productive subscriptions, manually removing data for all tenants is not practical, and we lacked an automated solution for this process.
Possible Approaches
- Automated Cleanup: If the origin values are consistently determined by application logic, you could develop an automated task to clean up these records. This cleanup would need to run as a separate deployment step before deploying the .hdbtabledata file, which requires additional effort.
- Alternative Approach: Instead, you can maintain the new CSV content with a different origin value (e.g., origin = '03'). This way, only the new CSV data is redeployed, and all existing data in the database table remains unchanged. Adjust the include_filter to target origin = '03', as sketched below.
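For the alternative approach, the import_settings excerpt from the earlier sketch would be narrowed to the new origin value, and the CSV file would then contain only the newly delivered rows (column names remain illustrative):

      "import_settings": {
        "import_columns": ["code", "name", "origin"],
        "include_filter": [
          { "origin": "03" }
        ]
      }

Because the filter no longer matches the existing '02' and '10' records, the deployer does not touch them during deployment.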
The following logs show a successful deployment using this approach:
2025-06-03T20:36:18.52+0530 [APP/PROC/WEB/0] OUT [be82fcfafc2dc1ee]: “src/gen/OPPC-UOM.hdbtabledata”
2025-06-03T20:36:18.52+0530 [APP/PROC/WEB/0] OUT [be82fcfafc2dc1ee]: ]
.
.
.
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Undeploying “src/gen/OPPC-UOM.hdbtabledata$0.expand”…
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Expanded from “src/gen/OPPC-UOM.hdbtabledata”
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Deploying “src/gen/OPPC.UOM.hdbtable$OPPC_UOM.validate”…
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Expanded from “src/gen/OPPC.UOM.hdbtable”
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Deploying “src/gen/OPPC.UOM.hdbtable$OPPC_UOM.validate”… ok (0s 7ms)
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Undeploying “src/gen/OPPC-UOM.hdbtabledata$0.expand”… ok (0s 15ms)
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Undeploying “src/csv/OPPC-UOM.csv”…
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Undeploying “src/csv/OPPC-UOM.csv”… ok (0s 5ms)
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Deploying “src/csv/OPPC-UOM.csv”…
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Deploying “src/csv/OPPC-UOM.csv”… ok (0s 3ms)
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Deploying “src/gen/OPPC-UOM.hdbtabledata$0.expand”…
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Expanded from “src/gen/OPPC-UOM.hdbtabledata”
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Inserted 1 records of the data source file “OPPC-UOM.csv” into the target table “OPPC_UOM”
2025-06-03T20:42:49.12+0530 [APP/PROC/WEB/0] OUT [d9a3e16053ae1f81]: Deploying “src/gen/OPPC-UOM.hdbtabledata$0.expand”… ok (0s 43ms)
References
- Deployment Options in HDI
- @sap/hdi-dynamic-deploy
- CAP Troubleshooting: How do I keep existing data?
- https://sap.stackenterprise.co/questions/73269
Special Thanks
Special thanks to the HANA-DB-DI team members, particularly Pavel Sluzkin and Alexander Bunte, for their invaluable support and guidance!