Hello All,
On the SQLQRY node there is a checkbox for "Generate CSV from resultset (This will skip SQL Looping)". The feature works for me until, I believe, the resultset gets too large.
For example, I have an IPA process that saves a CSV file to a local folder. It runs daily and we have no issues with it. But if I expand the range of the query within the node to pull more data, it fails: 2,136 rows ran fine, but 7,713 failed. When I look at the log, I can see the query finds the records and then loses them:
----------
Log snippet:
getItems_RECORD_COUNT = 7713
SQL Query getItems: Executing loop 1 of 7,713
getItems_RECORD_COUNT = 0
----------
If I copy the query from the work unit log into a SQL editor and export a CSV myself, there are no issues.
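In case it helps anyone reproduce that check outside IPA, here is a minimal sketch of a standalone export that streams the resultset to CSV in batches, assuming Python with pyodbc; the DSN, query, and output path are placeholders, not our actual values. Fetching in batches means the full resultset is never held in memory at once:
----------
import csv
import pyodbc

CONN_STR = "DSN=MyDatabase;UID=user;PWD=secret"  # hypothetical DSN
QUERY = "SELECT ..."       # paste the query from the work unit log here
OUT_PATH = "items.csv"     # placeholder output path

conn = pyodbc.connect(CONN_STR)
cursor = conn.cursor()
cursor.execute(QUERY)

with open(OUT_PATH, "w", newline="") as f:
    writer = csv.writer(f)
    # Header row taken from the cursor metadata
    writer.writerow([col[0] for col in cursor.description])
    while True:
        rows = cursor.fetchmany(1000)  # batch size keeps memory use flat
        if not rows:
            break
        writer.writerows(rows)

conn.close()
----------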
I can't figure out why the skip-SQL-looping option fails when there is more data. Has anyone experienced this? My assumption is that I'm hitting some sort of memory limit.