Find without a limit in a loop causes process to run out of memory
I've written a simple test script to compare performance between massive.js and Sequelize (which massive was doing incredibly well in until this!). It runs a find without a limit, 100 times, against a table with ~200,000 records. Every time, after the 14th iteration, the process aborts with the trace below.
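For reference, the loop is shaped roughly like this. It's a minimal sketch, not my actual script: the `records` table name and the connection string are placeholders.

```js
var massive = require("massive");

// Connection string is a placeholder for my local test database
massive.connect({ connectionString: "postgres://localhost/massive_test" }, function (err, db) {
  if (err) { throw err; }

  var iteration = 0;

  function next() {
    if (iteration >= 100) { return process.exit(0); }
    iteration++;
    // No criteria and no limit, so every call pulls all ~200,000 rows
    db.records.find({}, function (err, rows) {
      if (err) { throw err; }
      console.log("iteration %d: %d rows", iteration, rows.length);
      next(); // run the iterations sequentially, not in parallel
    });
  }

  next();
});
```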
```
<--- Last few GCs --->
27791 ms: Scavenge 1410.8 (1457.0) -> 1410.8 (1457.0) MB, 16.9 / 0 ms (+ 1.8 ms in 1 steps since last GC) [allocation failure] [incremental marking delaying mark-sweep].
28618 ms: Mark-sweep 1410.8 (1457.0) -> 1409.9 (1457.0) MB, 827.7 / 0 ms (+ 2.8 ms in 2 steps since start of marking, biggest step 1.8 ms) [last resort gc].
29426 ms: Mark-sweep 1409.9 (1457.0) -> 1409.8 (1457.0) MB, 807.1 / 0 ms [last resort gc].

<--- JS stacktrace --->
==== JS stack trace =========================================
Security context: 0x76c271b4629 <JS Object>
1: parseDate [/Users/jwhitmarsh/src/massive-test/node_modules/massive/node_modules/pg/node_modules/pg-types/node_modules/postgres-date/index.js:~8] [pc=0x32f6cf6679b] (this=0x3db47f92c7a1 <JS Array[9]>,isoDate=0x2078d2b89419 <String[26]: 2015-12-26 19:04:44.406+00>)
2: new constructor [0x76c271041b9 <undefined>:~1] [pc=0x32f6d00d503] (this=0x2078d2b89471 <JS Object>,parsers=0x3db47f92c7a...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Abort trap: 6
```
Obviously this isn't a real-world scenario, so I wouldn't make it a priority! But I thought it might be worth flagging in case it points to a memory leak somewhere.
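If it helps to confirm whether memory is actually being retained between iterations, something like this inside the loop (just a suggestion, not part of my test script) would show whether `heapUsed` keeps climbing:

```js
// Drop this in right after each find returns.
// Run node with --expose-gc so global.gc is defined; forcing a collection
// first means any growth in heapUsed is memory that genuinely can't be freed.
if (global.gc) { global.gc(); }
console.log("heapUsed: %d MB", Math.round(process.memoryUsage().heapUsed / 1048576));
```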