Parallel Processing with Starting New Task - front end screen timeout


Solution 1

I think the best way to do parallel processing in SAP is the Bank Parallel Processing framework, as Jagger mentioned. Unfortunately it is rarely mentioned in any resource and it is not documented well. Actually, the best documentation I found was in this book:

https://www.sap-press.com/abap-performance-tuning_2092/

Yes, it's tricky. It cost me about 5 or 6 days to get it working, but the results were good.

All the relevant objects are located in package BANK_PP_JOBCTRL, and you can use that name for searching.

The main idea is to divide all your work into steps (simplified):

  1. Preparation

  2. Parallel processing

    2.1. Processing preparation

    2.2. Processing (Actually there are more steps there)

The first step is not parallelized. Here you prepare all your data for parallel processing and divide it into 'pieces' which will then be processed in parallel. The content of a piece can be IDs or preloaded data as well. After that, you can run step 2 in parallel. The great benefit of all this is that an error in one piece of parallel work won't crash your entire processing run. I recommend you check the demo in function group BANK_API_PP_DEMO.
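
To make the 'divide into pieces' idea concrete, here is a small, framework-independent sketch of the preparation step. The type and variable names are made up for this example, and the actual BANK_PP_JOBCTRL calls are deliberately not shown, so treat it only as an illustration of the slicing logic:

    " Framework-independent sketch: split a flat key table into packages
    " ('pieces') of fixed size. All names are illustrative; the actual
    " BANK_PP_JOBCTRL calls are not shown here.
    TYPES: tv_key      TYPE c LENGTH 10,
           tt_keys     TYPE STANDARD TABLE OF tv_key WITH EMPTY KEY,
           tt_packages TYPE STANDARD TABLE OF tt_keys WITH EMPTY KEY.

    CONSTANTS lc_package_size TYPE i VALUE 500.

    DATA: lt_all_keys TYPE tt_keys,     " filled during the preparation step
          lt_packages TYPE tt_packages,
          lt_package  TYPE tt_keys.

    LOOP AT lt_all_keys INTO DATA(lv_key).
      APPEND lv_key TO lt_package.
      IF lines( lt_package ) >= lc_package_size.
        APPEND lt_package TO lt_packages.   " one piece of parallel work
        CLEAR lt_package.
      ENDIF.
    ENDLOOP.
    IF lt_package IS NOT INITIAL.
      APPEND lt_package TO lt_packages.     " remainder package
    ENDIF.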

Solution 2

As already mentioned, that's not an easy topic that can be handled with a few lines of code. The general steps you have to carry out in a thoughtful way to gain the desired benefit are (a stripped-down skeleton follows the list):

1) Get free work processes available for parallel processing

2) Slice your data into packages to be processed

3) Call an RFC-enabled function module asynchronously for each package with the available work processes. Handle waiting for free work processes if there are more packages than available processes

4) Receive your results asynchronously

5) Wait till everything is processed, merge the data together again, and make sure that every package was handled properly
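
Here is a stripped-down sketch of the aRFC pattern behind these five steps. It assumes Z_MASS_PROCESSING is the remote-enabled function module with the IT_DATA table parameter from the question; the server group name, the counters, the task naming and the form name are my own illustrative choices, so double-check the details (especially the SPBT_INITIALIZE parameters) in your system:

    REPORT z_parallel_dispatch_sketch.

    TYPES: tv_key      TYPE c LENGTH 10,
           tt_keys     TYPE STANDARD TABLE OF tv_key WITH EMPTY KEY,
           tt_packages TYPE STANDARD TABLE OF tt_keys WITH EMPTY KEY.

    DATA: gt_packages TYPE tt_packages,   " filled by the slicing step (2)
          gv_sent     TYPE i,
          gv_received TYPE i,
          gv_max_wps  TYPE i,
          gv_free_wps TYPE i,
          gv_task     TYPE c LENGTH 32.

    START-OF-SELECTION.

      " Step 1: initialise the RFC server group and query free work processes
      CALL FUNCTION 'SPBT_INITIALIZE'
        EXPORTING
          group_name                     = 'parallel_generators'
        IMPORTING
          max_pbt_wps                    = gv_max_wps
          free_pbt_wps                   = gv_free_wps
        EXCEPTIONS
          invalid_group_name             = 1
          internal_error                 = 2
          pbt_env_already_initialized    = 3
          currently_no_resources_avail   = 4
          no_pbt_resources_found         = 5
          cant_init_different_pbt_groups = 6
          OTHERS                         = 7.
      IF sy-subrc <> 0 AND sy-subrc <> 3.   " 3 = already initialised, harmless
        MESSAGE 'No parallel resources available' TYPE 'E'.
      ENDIF.

      " Step 3: dispatch one asynchronous RFC per package
      LOOP AT gt_packages ASSIGNING FIELD-SYMBOL(<lt_package>).
        gv_task = |PKG_{ sy-tabix }|.        " task names must be unique
        DO.
          CALL FUNCTION 'Z_MASS_PROCESSING'
            STARTING NEW TASK gv_task
            DESTINATION IN GROUP 'parallel_generators'
            PERFORMING receive_result ON END OF TASK
            EXPORTING
              it_data               = <lt_package>
            EXCEPTIONS
              system_failure        = 1
              communication_failure = 2
              resource_failure      = 3.
          IF sy-subrc = 0.
            gv_sent = gv_sent + 1.
            EXIT.
          ELSEIF sy-subrc = 3.
            " all work processes busy: let pending callbacks arrive, then retry
            " (a real implementation would also add a retry limit here)
            WAIT UNTIL gv_received >= gv_sent UP TO 5 SECONDS.
          ELSE.
            EXIT.                            " log the failed package here
          ENDIF.
        ENDDO.
      ENDLOOP.

      " Step 5: wait until every dispatched package has reported back
      WAIT UNTIL gv_received >= gv_sent.

    " Step 4: callback that receives one task's result (merging omitted here)
    FORM receive_result USING p_task TYPE clike.
      RECEIVE RESULTS FROM FUNCTION 'Z_MASS_PROCESSING'
        EXCEPTIONS
          system_failure        = 1
          communication_failure = 2.
      gv_received = gv_received + 1.
    ENDFORM.

In real code you would also log packages whose dispatch or receive failed, so they can be reprocessed afterwards.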

Although it is bad practice to just post links, the full code is very long and would make this answer very messy; therefore take a look at the following links:

Example1-aRFC

Example2-aRFC

Example3-aRFC

Other RFC variants (e.g. qRFC, tRFC, etc.) can be found here with a short description, but sadly I cannot give you further insight on them.

EDIT:

Regarding process type of aRFC:

In parallel processing, a job step is started as usual in a background processing work process. (...) While the job itself runs in a background process, the parallel processing tasks that it starts run in dialog work processes. Such dialog work processes may be located on any SAP server.

The server group is specified with GROUP (default: parallel_generators, see transaction RZ12) and can have its own resources reserved just for parallel processing. If your process times out, you have to change how you slice your packages, i.e. adjust the package size.

Solution 3

To implement parallel processing, you need to do a bit more than just add that clause. The information is contained in this help topic. A lot of design effort needs to go into ensuring that the communication and result-merging overhead of the parallel processing does not negate the performance advantage gained in the first place, and that referential integrity of the data is maintained even when some of the parallel tasks fail. Do not underestimate the complexity of this task.
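
To make the result-merging and failure handling a bit more tangible, here is how the receive callback from the sketch in Solution 2 could collect partial results and remember failed packages for reprocessing. ET_RESULT is a hypothetical exporting parameter of Z_MASS_PROCESSING, not something taken from the question:

    " Elaborated receive callback: merge partial results and record failed
    " tasks so their packages can be reprocessed. ET_RESULT is a hypothetical
    " exporting parameter of Z_MASS_PROCESSING.
    DATA: gt_all_results  TYPE STANDARD TABLE OF string WITH EMPTY KEY,
          gt_failed_tasks TYPE STANDARD TABLE OF string WITH EMPTY KEY,
          gv_received     TYPE i.

    FORM receive_result USING p_task TYPE clike.
      DATA lt_result TYPE STANDARD TABLE OF string WITH EMPTY KEY.

      RECEIVE RESULTS FROM FUNCTION 'Z_MASS_PROCESSING'
        IMPORTING
          et_result             = lt_result   " hypothetical result parameter
        EXCEPTIONS
          system_failure        = 1
          communication_failure = 2.

      IF sy-subrc = 0.
        APPEND LINES OF lt_result TO gt_all_results.   " merge partial result
      ELSE.
        APPEND p_task TO gt_failed_tasks.              " remember for a retry
      ENDIF.

      gv_received = gv_received + 1.   " always count, so the WAIT terminates
    ENDFORM.

Whether you reprocess the failed packages automatically or stop and roll everything back is exactly the referential-integrity decision mentioned above.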

Author: Michael Meyer

Updated on June 04, 2022

Comments

  • Michael Meyer, almost 2 years ago:

    I am running an ABAP program that works with a huge amount of data. The SAP documentation says that I should use remote-enabled function modules with the addition STARTING NEW TASK to process the data.

    So my program first selects all the data, breaks the data into packages and calls a function module with a package of data for further processing.

    So that's my pseudo code:

    SELECT KEYFIELD FROM MYSAP_TABLE
           INTO TABLE KEY_TABLE
           PACKAGE SIZE 500.

      APPEND KEY_TABLE TO ALL_KEYS_TABLE.

    ENDSELECT.


    LOOP AT ALL_KEYS_TABLE ASSIGNING <fs_table>.

      CALL FUNCTION 'Z_MASS_PROCESSING'
        STARTING NEW TASK 'TEST'
        DESTINATION IN GROUP DEFAULT
        EXPORTING
          IT_DATA = <fs_table>.

    ENDLOOP.


    But I was surprised to see that dialog processes are used instead of background processes for the call of my function module.

    So now I have run into the problem that one of my dialog processes was killed after 60 minutes because of a timeout.

    For me, it seems that STARTING NEW TASK is not the right solution for parallel processing of mass data.

    What will be the alternative?