
Tuesday, December 03, 2013

Error: DataStage Job Aborts with "The record is too big to fit in a block"


This error means that a single record is larger than the transport block DataStage uses to move data between operators. To fix it, increase the block size so that it can accommodate the record:

1. Log into Designer and open the job.

2. Open Job Properties --> Parameters --> Add Environment Variable and select:
   APT_DEFAULT_TRANSPORT_BLOCK_SIZE

3. You can set this as high as 256 MB, but you really shouldn't need to go over 1 MB.
   NOTE: the value is specified in bytes.

   For example, to set the value to 1 MB:
    APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576

The default for this value is 131072 (128 KB).
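If the environment variable was added as a job parameter (step 2), you can also override it per run from the command line instead of editing the job each time. A minimal sketch using dsjob, where MyProject and MyJob are hypothetical names standing in for your own project and job:

   # Override the transport block size for a single run (value in bytes).
   # MyProject and MyJob are placeholders.
   dsjob -run -wait -param '$APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576' MyProject MyJob

This lets you experiment with different block sizes without redeploying the job.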


When setting APT_DEFAULT_TRANSPORT_BLOCK_SIZE, use the smallest value that works, since it applies to every link in the job.

For example, if your job fails with APT_DEFAULT_TRANSPORT_BLOCK_SIZE set to 1 MB and succeeds at 4 MB, do further testing to find the smallest value between 1 MB and 4 MB that allows the job to run, and use that value. Using 4 MB could cause the job to use more memory than needed, since all the links would use a 4 MB transport block size.
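One way to narrow down the smallest working value is to rerun the job with increasing block sizes until it succeeds. A rough shell sketch, again assuming the variable is exposed as a job parameter and using the hypothetical MyProject/MyJob names (the exact status text reported by dsjob -jobinfo can vary by version):

   # Try candidate block sizes (in bytes) from smallest to largest.
   for size in 1048576 2097152 4194304
   do
       echo "Testing block size: $size"
       dsjob -run -wait -param "\$APT_DEFAULT_TRANSPORT_BLOCK_SIZE=$size" MyProject MyJob
       # If the run finished cleanly, this is the smallest tested size that works.
       if dsjob -jobinfo MyProject MyJob | grep -q "RUN OK"
       then
           echo "Smallest working block size: $size"
           break
       fi
   done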

NOTE: If this error appears for a Data Set stage, use APT_PHYSICAL_DATASET_BLOCK_SIZE instead.
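APT_PHYSICAL_DATASET_BLOCK_SIZE is added as an environment variable in the same way; the line below is an assumed example raising it to 1 MB (like the transport block size, the value is in bytes):

   APT_PHYSICAL_DATASET_BLOCK_SIZE=1048576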




