How to perform mass upload from an Excel file in a FIORI App in SAP BTP ABAP Environment – Part 1


Introduction

In this blog series, I will showcase a business use case of performing a mass upload using an Excel file to create/update entries in a RAP BO in the SAP BTP ABAP Environment. A FIORI app provides the user interface to perform the mass upload and to display error messages from validating the file records. A couple of open-source libraries are already available that can be integrated into your FIORI application to parse the base64 Excel data into a human-readable JSON format. You may refer to the wonderful blog series on a similar approach mentioned below.

Excel Upload using RAP: Part -1 

However, we will use the XCO library to parse the Excel file. From release 2208 onwards in the SAP BTP ABAP Environment, the XCO library offers a new XLSX module that includes abstractions and APIs to programmatically work with XLSX workbooks and their worksheets (e.g. coming from an uploaded Microsoft Excel .XLSX file).

Reference:

SAP BTP ABAP Environment – Release 2208 

SAP Help Documentation on XCO XLSX Module  

Hence, going forward in the BTP ABAP Environment, an Excel file can be uploaded and parsed into a readable format in a RAP-based FIORI app without using an open-source library.

Please note that the XCO library is available in SAP S/4HANA Cloud and in the on-premise edition from release 2020 onwards.

However, this blog series also covers an easy way of performing a controller extension in a FIORI Elements for OData V4 app using the flexible programming model. So, it is divided into three parts as follows.

Part 1: Upload Part

Talks about the XCO XLSX module integrated into a RAP BO to read from an Excel file and create/update records in the RAP BO, as well as file validation while reading the records.

Part 2: FIORI App extension

We will learn an easy way of performing a controller extension in a FIORI Elements for OData V4 based app and adding a custom action to the list report table without creating a custom JS file. We will also learn how to trigger OData V4 operations from the UI.

Part 3: Download Part

Talks about using the XCO XLSX module to write data into an Excel file and download a template file, generating the template dynamically based on the defined file structure, and a use case of the XCO I18N API to read the translated text from a data element and create a language-specific template.

Background

Generally, it is a very common requirement to allow users to perform a mass upload to create multiple entries in one go. In a classical SAP Gateway based project, we used to create a media type entity and read or write the file using the GET_STREAM or CREATE_STREAM methods. RAP does not support media type entities. From release 2208 onwards, it does support stream operations using the @Semantics.largeObject annotation, but with some limitations: this works only on the object page, where it allows files to be attached, and all such attachments are stored in a BTP table. Hence, to read the file records there was no other option than to use open-source libraries to parse the Excel file into a JSON format and trigger the create operation in the backend.

How do we solve this problem?

Well, the trick is to define a static action in the RAP BO and pass the file content, MIME type, and file name as the input parameters. The action utilizes the XCO XLSX module to parse the base64 Excel file content into a human-readable format in an internal table, based on the selection pattern defined. Now, you can use the internal table to perform validations and trigger error messages. Upon successful validation, the records can be used to create entries in the RAP BO entity using an EML statement.

Let’s get started

I am not going to show all the steps to create a RAP BO entity and perform the managed implementation, as these basic steps are explained in many blog posts and SAP tutorials; you may refer to them. Let me jump directly to the problem statement.

As mentioned, we will define an action in the RAP BO and define its input parameters using an abstract entity, as follows.
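As a minimal sketch: the action name ‘fileUpload’ comes from this post, but the abstract entity name ZA_FILE_UPLOAD and its field names are illustrative assumptions, as is the use of abap.string(0) for the base64 payload.

// Abstract entity used as the action parameter (entity and field names are assumptions)
@EndUserText.label: 'File upload parameters'
define abstract entity ZA_FILE_UPLOAD
{
  attachment : abap.string(0);  // base64-encoded file content sent from the UI
  mimetype   : abap.char(128);  // MIME type of the uploaded file
  filename   : abap.char(255);  // original file name
}

" In the behavior definition of the RAP BO root entity:
static action fileUpload parameter ZA_FILE_UPLOAD;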

The ‘fileUpload’ action needs to be explicitly triggered from the UI; I will talk about that later.

Within the ‘fileUpload’ action, the following code sample can be used to parse the base64 file content.

1. First, get the read access of the document using the file content.
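A hedged sketch of this step, assuming the action parameter holds the base64-encoded content in a field named attachment (as in the illustrative abstract entity above) and that ls_parameter holds the %param structure of the ‘fileUpload’ action:

" ls_parameter is assumed to hold the %param structure of the fileUpload action.
" Decode the base64 payload into an xstring (conversion via XCO; verify the exact
" API path in your release) and obtain read access to the XLSX document.
DATA(lv_file_xstring) = xco_cp=>string( ls_parameter-attachment
  )->as_xstring( xco_cp_binary=>text_encoding->base64
  )->value.

DATA(lo_read_access) = xco_cp_xlsx=>document->for_file_content( lv_file_xstring
  )->read_access( ).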

2. Get the worksheet reference by position number or by worksheet name, using the at_position or for_name methods.

" Read access for the worksheet with name INVOICES.
DATA(lo_invoices_worksheet) = lo_read_access->get_workbook(
  )->worksheet->for_name( 'INVOICES' ).
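Alternatively, if the sheet name is not fixed, the worksheet can be accessed by its position:

" Read access for the first worksheet, regardless of its name.
DATA(lo_first_worksheet) = lo_read_access->get_workbook(
  )->worksheet->at_position( 1 ).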

Hence, if your Excel file contains multiple sheets, they can also be read and uploaded in one go. For example, both the header and the item records, maintained in two different sheets, can be uploaded using this option.

3. Use the pattern builder to define a pattern based on the file template defined. Within the XCO XLSX module, a worksheet is represented as a space with two dimensions, with the first dimension being the column and the second dimension being the row.  So, a cell in a worksheet is identified by its column and row values. A coordinate (an object of type CL_XCO_XLSX_COORDINATE) fixes a value for a given dimension, either for a row or column. In Microsoft Excel, alphabetic values are used to identify columns and numeric values are used to identify rows. The XCO XLSX module provides support for both variants so that alphabetic as well as numeric values can be used freely to determine a cell in a worksheet.

Alphabetic coordinates: A, B, C, … Z, AA, AB, …ZZ, AAA, AAB, …

Numeric coordinates: 1, 2, 3, …

Hence, if your template contains 7 columns, the first column becomes A and the last column becomes G. This is represented using the below code section.

from_column( xco_cp_xlsx=>coordinate->for_alphabetic_value( 'A' ) )

to_column( xco_cp_xlsx=>coordinate->for_alphabetic_value( 'G' ) )

Also, as you would like to read the records from the second row onwards, skipping the header row, you need to define it using the following code section.

from_row( xco_cp_xlsx=>coordinate->for_numeric_value( 2 ) )

Note that the to_row( ) method is not used, as the number of rows is not known in advance; hence, all the records present in the Excel sheet are read.
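Putting these pieces together, the selection pattern for columns A to G, starting at row 2, can be built with the pattern builder roughly as follows (the variable name is illustrative):

" Selection pattern: columns A..G, rows from 2 onwards (no to_row, so all
" remaining rows of the worksheet are read).
DATA(lo_selection_pattern) = xco_cp_xlsx_selection=>pattern_builder->simple_from_to(
  )->from_column( xco_cp_xlsx=>coordinate->for_alphabetic_value( 'A' )
  )->to_column( xco_cp_xlsx=>coordinate->for_alphabetic_value( 'G' )
  )->from_row( xco_cp_xlsx=>coordinate->for_numeric_value( 2 )
  )->get_pattern( ).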

4. There are two ways of accessing data: via a stream or via a cursor. We read the data using the stream approach, as the structure of the data is statically known; the row stream approach is therefore best suited for this scenario. You may refer to the official SAP documentation to explore more; the reference link is already given above.

So, finally, using the row stream approach with the defined pattern, you can read the data and populate it into an internal table.

lo_worksheet->select( lo_selection_pattern )->row_stream( )->operation->write_to( REF #( lt_vendor_email ) )->if_xco_xlsx_ra_operation~execute( ).

Note: the structure of the internal table needs to be defined explicitly when using the row stream approach. For dynamic reading scenarios, use the cell stream approach to process each cell individually.

Once the data is read, we can use ABAP logic to validate each record and display error messages. Additionally, value transformation is possible while writing the records into an internal table in the row stream approach, or while reading individual cells in the cell stream approach.

The default value transformation is ‘best effort’, which can be changed to other types like ‘string value’ and ‘identity’ using the SET_VALUE_TRANSFORMATION method of IF_XCO_XLSX_RA_RS_OP_WRITE_TO when row values are read and written to an internal table as part of the write-to row stream operation.
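As a sketch, the transformation can be set on the write-to operation before it is executed; the exact enumeration access path used below (xco_cp_xlsx_read_access=>value_transformation) is an assumption, so please verify it against the SAP Help documentation.

" Keep a handle on the write-to operation so the value transformation can be set first.
" The enumeration access path below is an assumption; verify it in SAP Help.
DATA(lo_write_to) = lo_worksheet->select( lo_selection_pattern
  )->row_stream( )->operation->write_to( REF #( lt_vendor_email ) ).

lo_write_to->set_value_transformation(
  xco_cp_xlsx_read_access=>value_transformation->string_value ).

lo_write_to->if_xco_xlsx_ra_operation~execute( ).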

However, neither approach performs validation or displays an error message when a value is wrong, for example an invalid supplier number.

Finally, once the validation is done, you can use the EML statement MODIFY ENTITIES to create or update records. So, using this approach, you can perform both mass create and mass update.

To create:
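A hedged sketch of the create case, assuming a managed RAP BO ZR_VENDOR_EMAIL with alias VendorEmail and the fields Supplier and EmailAddress (all names are illustrative), called from within the BO itself, hence IN LOCAL MODE:

" Mass create from the validated file records (entity and field names are assumptions).
MODIFY ENTITIES OF zr_vendor_email IN LOCAL MODE
  ENTITY VendorEmail
    CREATE FIELDS ( Supplier EmailAddress )
    WITH VALUE #( FOR ls_rec IN lt_vendor_email INDEX INTO lv_index
                  ( %cid         = |CREATE{ lv_index }|
                    Supplier     = ls_rec-supplier
                    EmailAddress = ls_rec-email_address ) )
  MAPPED DATA(ls_mapped)
  FAILED DATA(ls_failed)
  REPORTED DATA(ls_reported).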

To update:
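And a corresponding sketch of the update case, identifying the existing records by key and updating only the changed field (again, all names are assumptions):

" Mass update of existing records, identified by their key (names are assumptions).
MODIFY ENTITIES OF zr_vendor_email IN LOCAL MODE
  ENTITY VendorEmail
    UPDATE FIELDS ( EmailAddress )
    WITH VALUE #( FOR ls_rec IN lt_vendor_email
                  ( Supplier     = ls_rec-supplier
                    EmailAddress = ls_rec-email_address ) )
  FAILED DATA(ls_failed_upd)
  REPORTED DATA(ls_reported_upd).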
