Error "GEN0334A Out of Memory" when running EngageOne Generate

Product Feature: DOC1GEN
 

Issue

When running DOC1GEN with extremely large input data for a single customer, EngageOne Generate fails with the following error message:

GEN0334A Out of Memory

Cause

Before a data set for a single customer is processed, all of its data is read into memory. If there is an extremely large amount of data for a single customer, this can exhaust the memory buffers, producing the error shown above.

Resolution

UPDATED: March 3, 2020
NOTE: these parameters should only be present in the OPS file when there is a problem. Do not leave them in for all processing runs, as they have a substantial negative impact on processing time.

NOTE: When using these parameters on a mainframe, if the data being processed contains a large number of customer data sets (all small), the way memory is allocated and released on a mainframe may still result in out-of-memory errors. This is not a product issue, but a limitation of mainframe memory management.


If the process is being run on 64-bit Windows with sufficient memory, the 64-bit version of DOC1GEN can be used; it has access to all available memory (up to 16 TB).
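For example, the 64-bit invocation is identical to the 32-bit one; only the executable differs (the path below is purely a placeholder; use the location of the 64-bit DOC1GEN in your installation):

C:\path\to\64-bit\DOC1GEN yourhip.hip OPS=yourops.ops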

If the above solution is not possible, follow the steps below:

1. Add the MMGX parameter at the end of the DOC1GEN command. This parameter turns off all internal memory management in DOC1GEN, leaving memory management to the operating system, so DOC1GEN takes only the memory it needs. MMGX is a generic parameter that works on all platforms. The structure of the command is as follows:


On Windows and Unix: DOC1GEN yourhip.hip OPS=yourops.ops MMGX

On Mainframe: amend the line that calls DOC1GEN in the JCL, for example:
DOC1GEN EXEC PGM=DOC1GEN,PARM='DD:HIP OPS=DD:OPS MMGX'

2. There are OPS file settings which write large amounts of text and data fields to an external file, rather than storing all of them in memory. The OPS file parameters are as follows:

On Windows and Unix:
<Custom>
DataFieldBufferFile=field.buf
DataTextBufferFile=text.buf
DataBufferThreshold=4M


On Mainframe:
The datasets GENFBUF and GENTBUF need to be defined in the JCL first.
<Custom>
DataFieldBufferFile=DD:GENFBUF
DataTextBufferFile=DD:GENTBUF
DataBufferThreshold=4M


For the DataBufferThreshold value, there is no single setting appropriate for everyone. The minimum suggested value is 4M and the maximum is 30M. Adjust this value to suit your data and environment.

Example of how to set up the datasets in the JCL:
GENFBUF  DD DSN=xxxxxx.FILES.GENFBUF,&BUF
            DISP=(NEW,CATLG,DELETE),
            SPACE=(TRK,(1200,200)),   -- This needs to be adjusted based on how much space is available on the mainframe.
            UNIT=SYSDA,
            DCB=(RECFM=U,LRECL=0,BLKSIZE=0)

GENTBUF  DD DSN=xxxxxx.FILES.GENTBUF,&BUF
            DISP=(NEW,CATLG,DELETE),
            SPACE=(TRK,(1200,200)),   -- This needs to be adjusted based on how much space is available on the mainframe.
            UNIT=SYSDA,
            DCB=(RECFM=U,LRECL=0,BLKSIZE=0)


3. Lastly, there are Memory Overflow parameters that may be used in the OPS file to limit the number of composed pages which are held in memory. Overflow pages will be written to an external file rather than being held in memory. The OPS file parameters are as follows:

On Windows and Unix:
<OverFlow>
OverFlowFile=OverflowMemory.txt
OverFlowSize=10M


On Mainframe:
The dataset OVRFSIZE needs to be defined in the JCL first.
<OverFlow>
OverFlowFile=DD:OVRFSIZE
OverFlowSize=10M


Example of how to setup the dataset in the JCL:
OVRFSIZE  DD DSN=xxxxxx.FILES.OVRFSIZE,
            DISP=(NEW,CATLG,DELETE),
            SPACE=(TRK,(1200,200)),   -- This needs to be adjusted based on how much space is available on the mainframe.
            UNIT=SYSDA,
            DCB=(RECFM=U,LRECL=0,BLKSIZE=0)


For the OverFlowSize value, there is no single setting appropriate for everyone. The minimum suggested value is 10M and the maximum is 50M. Adjust this value to suit your data and environment.

Some customers have reported that setting "OverFlowSize" very small (for example, 512K) helps in cases with extremely large input.

These settings should not be used by default. If you do get an "out of memory" error, we recommend trying option 1 as described above. If that does not resolve it, try options 1 and 2 together. If DOC1GEN still fails with an out-of-memory error, try all three options at the same time.
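As a sketch, combining all three options on Windows or Unix looks like the following (the file names and values are taken from the illustrative examples above; adjust them for your environment):

Command line (option 1):
DOC1GEN yourhip.hip OPS=yourops.ops MMGX

yourops.ops (options 2 and 3):
<Custom>
DataFieldBufferFile=field.buf
DataTextBufferFile=text.buf
DataBufferThreshold=4M

<OverFlow>
OverFlowFile=OverflowMemory.txt
OverFlowSize=10M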

All of these settings are also available in the Production Job dialog, under the Advanced Options section (Memory Handling and Custom). For more information, refer to the EngageOne Designer "Production Guide" under OPS File "Sections, keywords and parameters".


If the failure still occurs after this, report the issue to software.support@pb.com.