
Is there any solution for this memory issue?


Hello all,

I am analyzing a large dataset of tweets with Python on Windows. When I try to transform the data into TF-IDF vectors, I get this error message:

```
MemoryError: Unable to allocate 298. GiB for an array with shape (439563, 90889) and data type float64
```

How can I solve this problem?

The following is the code I used:

import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer

vectorizer = TfidfVectorizer()
# MyData should be the iterable of tweet texts (e.g. a list or Series),
# not the string literal "MyData"
X = vectorizer.fit_transform(MyData)
# X.toarray() converts the sparse matrix to a dense array --
# this is the step that raises the MemoryError
tf_idf = pd.DataFrame(data=X.toarray(), columns=vectorizer.get_feature_names_out())
final_df = tf_idf
print("{} rows".format(final_df.shape[0]))
final_df.T.nlargest(5, 0)

Solution

  • I suggest you use Google Colab:
    it gives you free access to a powerful GPU and more RAM.