
Postgres - run a query in batches?


Those last 3 bytes were the straw that broke the camel's back: probably one allocation attempt in a long string of allocations leading up to the failure.

Unfortunately libpq will try to fully cache result sets in memory before relinquishing control to the application. This is in addition to whatever memory you are using up in $myArray.

It has been suggested to use LIMIT ... OFFSET ... to reduce the memory envelope; this will work, but it is inefficient, as it can needlessly duplicate server-side sorting effort every time the query is reissued with a different offset (e.g. to answer LIMIT 10 OFFSET 10000, Postgres still has to sort the entire result set, only to return rows 10001..10010).

Instead, use DECLARE ... CURSOR to create a server-side cursor, followed by FETCH FORWARD x to fetch the next x rows. Repeat as many times as needed, or until fewer than x rows are returned. Do not forget to CLOSE the cursor when you are done, even if an exception is raised (see the sketch at the end of this answer).

Also, do not SELECT *; if you only need id and name, declare your cursor FOR SELECT id, name (otherwise libpq will needlessly retrieve and cache columns you never use, increasing the memory footprint and overall query time).

Using cursors as illustrated above, libpq will hold at most x rows in memory at any one time. However, make sure you also clean up $myArray in between FETCHes if possible, or else you could still run out of memory on account of $myArray.
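By way of illustration, here is a minimal sketch of the whole pattern using PHP's pgsql extension (the connection $conn, the table items, and the cursor name mycur are hypothetical, not from the original question):

    // Cursors only live inside a transaction (unless declared WITH HOLD).
    pg_query($conn, 'BEGIN');
    pg_query($conn, 'DECLARE mycur CURSOR FOR SELECT id, name FROM items ORDER BY id');

    $batchSize = 1000;
    do {
        // Fetch the next batch; libpq holds at most $batchSize rows here.
        $res = pg_query($conn, "FETCH FORWARD $batchSize FROM mycur");
        $myArray = array();
        while ($row = pg_fetch_assoc($res)) {
            $myArray[] = $row;
        }
        $fetched = count($myArray);

        // ... process $myArray here ...

        // Let go of this batch before fetching the next one.
        unset($myArray);
        pg_free_result($res);
    } while ($fetched == $batchSize);

    // Always CLOSE the cursor, even on failure paths (e.g. in a finally block).
    pg_query($conn, 'CLOSE mycur');
    pg_query($conn, 'COMMIT');

With this loop the client holds at most one batch of rows at a time, and the server carries the sort state along in the cursor instead of redoing it for every page.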


You can use LIMIT x and OFFSET y
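A minimal sketch of that approach, reusing the same hypothetical $conn and items table as above (note the re-sorting caveat from the first answer still applies):

    $batchSize = 1000;
    for ($offset = 0; ; $offset += $batchSize) {
        // Each iteration re-runs the query; the server may re-sort the
        // whole result set just to skip the first $offset rows.
        $res = pg_query_params($conn,
            'SELECT id, name FROM items ORDER BY id LIMIT $1 OFFSET $2',
            array($batchSize, $offset));
        if (pg_num_rows($res) == 0) {
            break;
        }
        while ($row = pg_fetch_assoc($res)) {
            // ... process $row ...
        }
        pg_free_result($res);
    }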


The client library buffers the entire result set in memory until you retrieve it, so piling every row into the array on top of that will cause an exhaustion of memory no matter what. Either process the results one row at a time, or check the length of the array, process the rows pulled so far, and then purge the array.
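A sketch of that second option (accumulate, process, purge), again with the hypothetical $conn and items table:

    $res = pg_query($conn, 'SELECT id, name FROM items');
    $myArray = array();
    while ($row = pg_fetch_assoc($res)) {
        $myArray[] = $row;
        if (count($myArray) >= 1000) {
            // ... process the 1000 rows pulled so far ...
            $myArray = array();  // purge before accumulating more
        }
    }
    if (count($myArray) > 0) {
        // ... process the remaining rows ...
    }
    pg_free_result($res);

Note that this only bounds the memory used by $myArray; libpq itself still buffers the full result set, which is why the cursor approach above scales better.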