I have a CSV of 1.9M rows / 187MB, and when I try to LOAD CSV it I get: TransientError: There is not enough memory to perform the current task.
I did increase the heap as the error message suggested, setting dbms.memory.heap.initial_size to 4G and dbms.memory.heap.max_size to 32G.
So my question is: how much memory do I need to load this (as I understand it, not-so-big) dataset? Is it even possible on a home computer with 16G of RAM?
Many thanks for any help.
If you are not already specifying USING PERIODIC COMMIT, as the developer manual recommends for data of your size, you should. It allows LOAD CSV to commit your data in smaller batches instead of processing everything in a single transaction, which is likely why you are running out of memory.
Here is a simple example:
USING PERIODIC COMMIT
LOAD CSV FROM 'file:///foo.csv' AS line
CREATE (:Person { name: line[1], address: line[2] });
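For more control, you can also give USING PERIODIC COMMIT an explicit batch size and read the file with a header row, so columns are referenced by name instead of index. This is a sketch, assuming your CSV has name and address columns; the filename and column names are placeholders:

```cypher
// Commit every 10000 rows instead of the default batch size
USING PERIODIC COMMIT 10000
LOAD CSV WITH HEADERS FROM 'file:///foo.csv' AS row
CREATE (:Person { name: row.name, address: row.address });
```

A smaller batch size lowers peak memory per transaction at the cost of more commits, so you can tune it down further if you still hit memory errors.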