UNVERIFIED SOLUTION

Dataflow fails when the record count goes over 1000 even though the record limit is configured as 1000 in the Spectrum Sorter

Issue

In the Spectrum Sorter stage, if the input contains more than 1000 records, the flow execution fails without reporting any errors, even though the record limit is configured as 1000.
If the input contains fewer than 1000 records, the flow completes successfully.

Cause

The failure occurs because the Sorter uses two separate sorting algorithms: InMemory and External Sort.

The InMemory algorithm is used up to the configured limit of 1000 records; beyond that limit, the External Sort algorithm takes over.
The InMemory sort uses a merge sort, which handles values of any type, whereas the External Sort relies on the field's type definition to determine how to serialize values to disk, so the sort can run without using as much memory.
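The difference can be sketched as follows. This is a hypothetical illustration, not Spectrum's actual code: the `serialize_field` helper and the use of `ErrorFlag` values are assumptions made for the example. A comparison sort over in-memory objects tolerates a value whose runtime type differs from the field's declared type, while an external sort must serialize each value in the declared type before spilling to disk, so the mismatch only surfaces once the record count forces the external path.

```python
import struct

def in_memory_sort(records, key):
    # Comparison sort over live objects: no serialization needed,
    # so any mutually comparable values work.
    return sorted(records, key=lambda r: r[key])

def serialize_field(value, declared_type):
    # An external sort must spill values to disk in the field's declared type.
    if declared_type == "integer":
        return struct.pack(">q", value)  # fails unless value is really an int
    if declared_type == "string":
        return str(value).encode("utf-8")
    raise ValueError(f"unknown type: {declared_type}")

# ErrorFlag declared as integer, but a script assigned string values:
records = [{"ErrorFlag": "1"}, {"ErrorFlag": "0"}]

# Below the limit: the in-memory sort succeeds despite the type mismatch.
assert in_memory_sort(records, "ErrorFlag")[0]["ErrorFlag"] == "0"

# Above the limit: the external sort must serialize, and the mismatch surfaces.
failed = False
try:
    serialize_field(records[0]["ErrorFlag"], "integer")
except struct.error:
    failed = True
assert failed
```

Under this reading, a flow that runs cleanly below 1000 records and fails above it is consistent with a type mismatch that only the External Sort's serialization step can detect.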

 

Resolution

UPDATED: May 29, 2018


A bug with reference ID CDQE-66438 has been logged for the inconsistent behavior between the InMemory and External sorters.

As a workaround, either:
  1. Set the output type of the ErrorFlag field to integer in the Memory Sort options and reconfigure the sort stage, or
  2. Use a string when setting the ErrorFlag value in the script.
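Both workarounds follow the same principle: the value assigned to ErrorFlag in the script must match the type declared for the field, so the External Sort can serialize it once the record count exceeds the in-memory limit. The sketch below is a hypothetical Python illustration of that principle only; the `set_error_flag` helper is an assumption, and Spectrum's own script syntax differs.

```python
def set_error_flag(record, failed, declared_type="string"):
    # Keep the assigned value's type consistent with the field's declared type
    # so the external sorter can serialize it past the in-memory limit.
    if declared_type == "string":
        record["ErrorFlag"] = "1" if failed else "0"   # string value (workaround 2)
    elif declared_type == "integer":
        record["ErrorFlag"] = 1 if failed else 0       # integer value (workaround 1)
    else:
        raise ValueError(f"unknown type: {declared_type}")
    return record

rec = set_error_flag({}, failed=True)
assert rec["ErrorFlag"] == "1"
```

Either choice is fine on its own; the failure mode only appears when the declared type and the assigned type disagree.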

Environment Details

Product Affected: Spectrum Technology Platform
Product Feature: Dataflow Design / Implementation

Downloads

  • No Downloads