DVM Data Blogging


Paschke Consulting: How to easily export your SAP data to the Cloud (Full ABAP code included)

September 20, 2022

Jim Paschke
CEO / Founder

PCC: How to quickly export your SAP data to other sources

Problem: you need to copy or send all, or a subset, of your SAP data from your SAP instance to an external target, but you don’t know how to get the data out of your system in a standard format.

Solution: this article demonstrates how to quickly and easily export your SAP data from the database and archives to an external location in the standard, public JSON format, using SAP standard, public methods.

To sign up for this blog, visit our website at www.PaschkeConsulting.com/blog

At Paschke Consulting (PCC), we thrive on finding new solutions to longstanding problems, especially in, but not limited to, the SAP archiving solution space!  This problem has been around for a long time, and many people have already solved it.  Now that it’s been solved, these solutions have been made available by SAP for free use.

We at PCC believe a well-informed consumer is our best customer.  To that end, this post teaches you about this set of free solutions.  Empowered with this information, you’ll be free to select the firm of your choosing armed with the knowledge of which services are proprietary and which are free. Our firm excels in this type of training and information sharing that is so intrinsic to every project and process!

In this month’s blog post, we’re going to show you how to extract your SAP data from the database and/or archives and export it to an external location via the public JSON format.  We will even provide you with a fully functioning ABAP program to do this conversion for you.

Our steps for this example will be:

1) Determine the scope of the documents to extract, i.e., which Objects, Company codes, and Years

2) Use the PCC version of the SAP JSON Serializer program provided to convert the database and archive data from structured table and ADK format to a file in JSON format for use in Step 3

SAP standard JSON Serializer

       a. Spot-check the file and edit it to ensure the proper format based on the required format of the receiving solution

3) Use an Internal Transmit Function (FTP) to send the files created in Step 2 to the receiving solution

       a. Confirm the files were received in their entirety, are accessible, and have not been duplicated

4) Go get coffee, stop reinventing the wheel, and perform other, more important, value-add functions!

Determine the scope of documents to extract

In the Use Case that we will review today, we are extracting all FI documents (Journal entries) from the system to be exported to an external reporting solution.  

In any SAP instance, whether this is a full system decommissioning or a partial carve out, you may need to extract other business objects like Purchase Orders, Invoices, etc.  This process and the provided programs may be copied or abstracted to support any object and any set of tables in the SAP system.

In your system, the Separation Agreement or business users will determine the requirements and scope of the data to be extracted; gathering those requirements is a hard prerequisite to Step 1.

As shown below, there is no arcane input, just enter the Company codes and Years and extract your data!

Simple user-friendly selections to control data retrieval and data transformation

Run the SAP standard JSON Serializer


We will be extracting Journal entries from the BKPF and BSEG tables as well as the FI_DOCUMNT archives, restricting each group by Company code and Year.  In the set of steps below, we will extract Company code 0001 for Year 2018 and place the results into a JSON file.

Click execute to quickly and easily convert your data to JSON format.

Once you’ve entered your criteria, simply execute in foreground or background.  The job will extract all the BKPF and BSEG tables (FI document header and detail lines) for the Company codes and Years selected, convert them into standard JSON format, and, in this case, write the file to a SPOOL list.
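Conceptually, what the serializer produces is each selected table rendered as a JSON array of records. Here is a minimal, language-agnostic sketch of that conversion in Python (the field names and values are illustrative placeholders, not actual SAP output):

```python
import json

# Illustrative document-header rows as they might come back from the
# retrieval step -- field names follow BKPF (company code, fiscal year,
# document number); the values here are made up for the example
bkpf_rows = [
    {"BUKRS": "0001", "GJAHR": "2018", "BELNR": "0100000001"},
    {"BUKRS": "0001", "GJAHR": "2018", "BELNR": "0100000002"},
]

# One call turns the whole table into a standard JSON array
json_payload = json.dumps(bkpf_rows)
print(json_payload)
```

The detail-line table (BSEG) would be serialized the same way and appended, which mirrors what the ABAP program does with its two serializer calls.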

Technical data section alert!  Feel free to skip this section if you’re not an ABAP person.

The elegance of the ABAP solution in the program we’re running above (the code is provided at the end of this article) is that it uses encapsulated code in the form of one PCC Function Module CALL and one SAP standard METHOD, resulting in extremely powerful and robust, yet compact, code.

The first FM CALL is to the PCC proprietary function /PCC/ARCH_READ (see below).  This retrieves all target data matching the selection criteria from both the database and the archives, regardless of when it was archived or how many files it’s distributed across, and returns the selected data to the calling program via internal tables.  This is the format required by the subsequent METHOD.

Note: this is the common API interface that PCC uses to archive-enable every SAP or custom application in every customer system, regardless of platform, release level, archive repository, etc.

PCC's common API supporting fast and flexible archive retrieval.

The second invocation uses the SAP standard CLASS “cl_trex_json_serializer”: METHOD “serialize” converts those tables into JSON format, and METHOD “get_data” returns the resulting string.  So there is no need to write your own cumbersome JSON converter to accommodate every data field and value – it’s all been done for you by SAP.

Implementation of SAP standard JSON serializer CLASS available for use by SAP Customers

What you may need to do is make some post-conversion modifications to the file, some of which are shown in the sample program provided below.  Additional modifications may be necessary depending on the receiving solution to which you are sending this data and on other scenarios.
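One such post-conversion fix, which the sample program’s output routine performs character by character, is adding double quotes around attribute names, since the TREX serializer emits bare keys that strict JSON parsers reject. A rough Python equivalent of that idea (assuming keys are simple identifiers and no string value happens to contain a bare `key:` pattern):

```python
import json
import re

def quote_keys(trex_json: str) -> str:
    """Wrap bare attribute names in double quotes so the output parses
    as standard JSON. Assumes keys are plain identifiers following a
    '{' or ',' -- values containing 'key:'-like text would be mangled."""
    return re.sub(r'([{,]\s*)([A-Za-z_]\w*)\s*:', r'\1"\2":', trex_json)

# TREX-style output with unquoted keys, fixed into valid JSON
fixed = quote_keys('{BUKRS: "0001", GJAHR: "2018"}')
parsed = json.loads(fixed)   # parses cleanly after the fix
```

A regex pass like this is convenient in a scripting language; the ABAP program below achieves the same result with explicit string offsets, which avoids any regex engine dependency.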

The resulting program, by using these encapsulated procedures, is very small and very powerful.  The code provided is fewer than 150 lines but does 98% of what you’ll ever need in SAP data extraction projects.

Below is a screen shot of the JSON extract format:

Actual output JSON sample

Use an Internal Transmit Function to send the data to the receiving solution

In your system, you will need to modify the provided program to write to an output file somewhere on your Application Server.  This is a basic change, and some of the ABAP involved is OS-dependent, so I will leave that to you.

The transmission to the receiving system is also non-trivial and depends on the receiving solution, so we can’t provide that portion in a short article.  However, it is critical that you be able to validate:

1) that the file was received,

2) that it was received in its entirety, and

3) that it was received only once.

These, too, are non-trivial, but these requirements can likely be met by the receiving system’s solution provider.
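One common way to cover all three checks is a checksum handshake: the sender publishes a hash alongside each file, and the receiver recomputes it and tracks which hashes it has already accepted. A minimal sketch in Python (the function names are ours for illustration, not part of any vendor API):

```python
import hashlib

def file_checksum(path: str) -> str:
    """SHA-256 digest of a file, computed in chunks so large
    extract files do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

_seen: set = set()

def validate_receipt(path: str, expected: str) -> bool:
    """Accept a file only if it matches the sender's checksum
    (received in its entirety) and its hash has not been seen
    before (received only once)."""
    digest = file_checksum(path)
    if digest != expected or digest in _seen:
        return False
    _seen.add(digest)
    return True
```

The mere existence of the file on the receiving side covers check 1; the checksum match covers check 2; the seen-hash set covers check 3. In practice the dedupe store would be a database table rather than an in-memory set.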

Vendor selection note: these are the project tasks for which you’ll need to choose your solution provider carefully.  Many vendors provide cloud and other hosting solutions.  If you still need a vendor to help you with the remaining true-value-add tasks like migrating, tracking, reporting, and reconciling the target data population, we trust you’ll consider us; we execute projects with expert resources and provide training and knowledge transfer (KT) like this throughout.

Monitor the conversion and transmission jobs and stop reinventing the wheel!

The job of tracking the extraction of the data and the creation of the JSON files is extremely important, but the tracking work itself is straightforward.  We can quickly develop custom tracking tables in your system, based on your and your provider’s requirements, to perform the monitoring.
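As a rough illustration of what one row of such a tracking table might hold (the structure and field names here are hypothetical, not the actual PCC tables):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ExtractJob:
    """One hypothetical tracking record: which slice of data was
    extracted, where the JSON file went, and its current status."""
    archive_object: str          # e.g. "FI_DOCUMNT"
    company_code: str
    fiscal_year: str
    json_file: str
    status: str = "EXTRACTED"    # EXTRACTED -> TRANSMITTED -> CONFIRMED
    updated_at: datetime = field(default_factory=datetime.now)

# Record one extracted slice, then advance it through the lifecycle
job = ExtractJob("FI_DOCUMNT", "0001", "2018", "fi_0001_2018.json")
job.status = "TRANSMITTED"
```

A monitoring report then just lists any jobs that have not reached CONFIRMED.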

Now that the work of extracting, converting, and transmitting is underway, you can simply monitor the progress, stop reinventing the wheel, grab a cup of coffee, relax, and do more true-value-add work!

Actual code that powered the above solution

The ABAP program below is very short and very simple.  The prerequisites are solely that 1) the SAP Standard ABAP Class CL_TREX_JSON_SERIALIZER exists, and 2) the /PCC/ Archive Connector solution has been implemented.  If you are on the ECC6 Release Level with that ABAP Class and have the PCC Solution, with all that encapsulated logic, this small, powerful program ends up around 100 lines of code.

To obtain a PCC software license, visit www.PaschkeConsulting.com and click on the big, green CONNECT button; PCC software is also coming very soon to the SAP App Store (est. June 2021) with low-priced starter packages.

To get the PCC-independent version of the code below, with only SAP calls, visit our website, click on CONNECT, and ask for it; we will send it to you.


*&---------------------------------------------------------------------*
*& Report        /PCC/SAP_JSON_SERIALIZER
*& Author        Paschke Consulting, Inc.
*& Date written  May 15, 2021
*& Some parts copied from SAP std TREX_SEND_REGRESSION_TEST_INFO
*& All rights reserved.
*&---------------------------------------------------------------------*


REPORT /pcc/sap_json_serializer.

TABLES: bkpf.

DATA: global_string TYPE string.

* Selection criteria: Company codes and Years to extract
SELECT-OPTIONS:
  so_bukrs FOR bkpf-bukrs DEFAULT '0001',
  so_gjahr FOR bkpf-gjahr DEFAULT '2018'.

* Read from the database, the archives, or both; the remaining
* parameters feed the /PCC/ARCH_READ call (types assumed here -
* adjust to your /PCC/ARCH_READ signature)
PARAMETERS:
  p_readdb AS CHECKBOX,
  p_readar AS CHECKBOX,
  p_aobj   LIKE arch_obj-object DEFAULT 'FI_DOCUMNT',
  p_access TYPE char1,
  p_cutoff TYPE datum,
  p_hashdt TYPE fieldname,
  p_hashsr TYPE fieldname.

* Result tables filled by /PCC/ARCH_READ and work areas
DATA: i_bkpf  LIKE bkpf     OCCURS 0 WITH HEADER LINE,
      i_bseg  LIKE bseg     OCCURS 0 WITH HEADER LINE,
      xs_fld1 LIKE so_bukrs OCCURS 0 WITH HEADER LINE,
      xs_fld2 LIKE so_gjahr OCCURS 0 WITH HEADER LINE,
      w_lines TYPE i.

CONSTANTS: c_bukrs TYPE fieldname VALUE 'BUKRS',
           c_gjahr TYPE fieldname VALUE 'GJAHR'.

START-OF-SELECTION.
  PERFORM pcc_extract.
  PERFORM sap_serialize USING global_string.
  PERFORM results       USING global_string.


*& This section uses SAP Standard API to serialize the

*& tabular data into JSON format.



FORM sap_serialize USING l_local TYPE string.
  DATA: l_serializer TYPE REF TO cl_trex_json_serializer,
        json_c       TYPE string.

  CLEAR l_local.

* Serialize the BKPF (header) records and append the JSON string
  CREATE OBJECT l_serializer
    EXPORTING data = i_bkpf[].
  l_serializer->serialize( ).
  json_c = l_serializer->get_data( ).
  CONCATENATE l_local json_c INTO l_local.
  CLEAR json_c.

* Serialize the BSEG (line item) records and append the JSON string
  CREATE OBJECT l_serializer
    EXPORTING data = i_bseg[].
  l_serializer->serialize( ).
  json_c = l_serializer->get_data( ).
  CONCATENATE l_local json_c INTO l_local.
  CLEAR json_c.
ENDFORM.



*& This section uses PCC-proprietary, patent-pending Archive

*& Connector to return all data from archive and database in one pass

*& for all Tables requested that meet all the provided selection criteria.

*& All rights to PCC Software are retained.


FORM pcc_extract.
* Copy the selection-screen ranges into the flat value tables
* expected by the /PCC/ARCH_READ interface
  LOOP AT so_bukrs.
    MOVE-CORRESPONDING so_bukrs TO xs_fld1. APPEND xs_fld1.
  ENDLOOP.

  LOOP AT so_gjahr.
    MOVE-CORRESPONDING so_gjahr TO xs_fld2. APPEND xs_fld2.
  ENDLOOP.

* One call returns the selected data from both database and archives
  CALL FUNCTION '/PCC/ARCH_READ'
    EXPORTING  archive_object  = p_aobj
               access_method   = p_access
               access_cutoff   = p_cutoff
               hash_date_field = p_hashdt
               hash_source     = p_hashsr
               read_database   = p_readdb
               tabname1        = 'BKPF'
               tabname2        = 'BSEG'
               fieldname1      = c_bukrs
               fieldname2      = c_gjahr
    TABLES     tableval1       = xs_fld1[]
               tableval2       = xs_fld2[]
               tab1            = i_bkpf[]
               tab2            = i_bseg[]
    EXCEPTIONS OTHERS          = 1.

  DESCRIBE TABLE i_bkpf LINES w_lines.
  IF w_lines EQ 0.
    MESSAGE 'No records found' TYPE 'E'.
  ENDIF.
ENDFORM.


FORM results USING l1 TYPE string.
  DATA: x TYPE i, tt TYPE i, spc TYPE i, spc1 TYPE i, eol TYPE i, diff TYPE i.
  DATA: bl TYPE string VALUE ' '.

  x = 0. spc = 0. bl = l1+8(1).
  tt = strlen( l1 ).

* Pass 1: the TREX serializer emits attribute names without quotes;
* walk the string and wrap each bare key in double quotes so the
* output is valid, standard JSON
  WHILE x LT tt.
    IF l1+x(1) EQ bl OR l1+x(1) EQ '['.
      tt = strlen( l1 ).
      spc = x. spc1 = spc + 1.
      IF l1+spc1(1) = '{'.
        ADD 1 TO x. eol = tt - x - 1. ADD 1 TO spc1.
        CONCATENATE l1(spc1) ' ' l1+spc1(eol) INTO l1 RESPECTING BLANKS.
        ADD 2 TO spc.
      ENDIF.
    ENDIF.

    IF l1+x(1) EQ ':'.
      tt = strlen( l1 ).
      diff = x - spc - 1. spc1 = spc + 1. ADD 2 TO x. eol = tt - x.
      CONCATENATE l1(spc) '"' l1+spc1(diff) '":' l1+x(eol) INTO l1.
      tt = strlen( l1 ).
    ENDIF.

    ADD 1 TO x.
  ENDWHILE.

* Pass 2: write the JSON string to the spool list in 80-character lines
  x = 0.
  WHILE ( x + 80 LT tt ).
    WRITE: / l1+x(80).
    ADD 80 TO x.
  ENDWHILE.

* Write the remainder that is shorter than 80 characters
  IF x LT tt.
    eol = tt - x.
    WRITE: / l1+x(eol).
  ENDIF.
ENDFORM.

