python · file · out-of-memory · text-files · large-data

MemoryError when trying to load 5GB text file


I want to read data stored in text format in a 5GB file. When I try to read the contents of the file with this code:

file = open('../data/entries_en.txt', 'r')
data = file.readlines()

this error occurs:

    data = file.readlines()
    MemoryError

My laptop has 8GB of memory, and at least 4GB is free when I run the program. But when I monitor system performance, the error happens once Python is using only about 1.5GB of memory.
I'm using Python 2.7, but if it matters, please give solutions for both 2.x and 3.x. What should I do to read this file?


Solution

  • The best way to handle a large file like this is to iterate over it line by line:

    with open('../file.txt', 'r') as f:
        for line in f:
            # do stuff
    

    readlines() errors because it tries to load the entire file into memory at once. Iterating over the file object instead reads one line at a time, so only a single line is held in memory. The with statement also closes the file automatically once you are done processing it. This pattern works the same way in Python 2.x and 3.x.
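A minimal sketch of the pattern above, wrapped in a function so it can be reused (the path and the per-line work are placeholders; here it just counts lines):

```python
def count_lines(path):
    """Process a large file one line at a time, keeping memory use flat."""
    total = 0
    with open(path, 'r') as f:   # file is closed automatically on exit
        for line in f:           # the file object is a lazy line iterator
            total += 1           # replace with your real per-line processing
    return total
```

Because the loop never materializes the whole file, this runs in constant memory regardless of file size, whereas readlines() needs memory proportional to the entire file.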