VERIFIED SOLUTION

"Out of memory" error encountered while reading a large number of records from a PostgreSQL database in the 'Read from DB' stage of Spectrum

Issue

While reading a large number of records (10-20 million) from PostgreSQL via the "Read from DB" stage in Spectrum, an 'Out of memory' error is encountered. Even when a batch size is provided, Spectrum ignores it and tries to load all the records at once.

Cause

The driver is expected to stream results according to the 'fetch size' that the user specifies in the Spectrum user interface. This setting takes effect for other databases, but for PostgreSQL the fetch size is not applied, so the driver attempts to hold the entire result set in memory.
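For context, the intended behavior can be sketched as follows: when a fetch/batch size is honored, only one batch of rows is held in memory at a time instead of the full result set. This is an illustrative simulation only; `FakeCursor` and `read_in_batches` are hypothetical names, not Spectrum or PostgreSQL driver APIs.

```python
class FakeCursor:
    """Stand-in for a database cursor over a large result set."""

    def __init__(self, total_rows):
        self._rows = iter(range(total_rows))

    def fetchmany(self, size):
        """Return up to `size` rows, mirroring the DB-API fetchmany call."""
        batch = []
        for _ in range(size):
            try:
                batch.append(next(self._rows))
            except StopIteration:
                break
        return batch


def read_in_batches(cursor, batch_size=10_000):
    """Yield rows one batch at a time, so at most batch_size rows
    are resident in memory, rather than the whole result set."""
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            return
        yield from batch


# Stream 1,000,000 simulated rows while never holding more than
# 10,000 of them at once.
total = sum(1 for _ in read_in_batches(FakeCursor(1_000_000)))
print(total)  # 1000000
```

When the fetch size is ignored, the equivalent of `cursor.fetchall()` runs instead, which is what exhausts the heap on 10-20 million rows.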

Resolution

UPDATED: August 2, 2018


This issue has been identified and is planned to be fixed in the Spectrum 2018 H2 release, which is scheduled for the end of October 2018. The fix is expected to work on all platforms from the 2018 H2 release of Spectrum onwards.

Note: A patch file containing the fix for this issue is also available for Spectrum 12.1. Contact Pitney Bowes client support and reference case# 17038115 to obtain it.

Environment Details

Product Feature: Performance

Downloads

  • No Downloads