I'm running a small test of a fasttext wrapper in a Docker runner. This is the test:
import fasttext
import tempfile


def test_fasttext_fit_save():
    x_clean = [
        "comment important one",
        "this is other comment",
        "this is other comment",
        "hi this is a comment",
    ]
    temp = tempfile.NamedTemporaryFile("w+", suffix=".txt")
    for com in x_clean:
        temp.write(com)
        temp.write("\n")
    temp.seek(0)
    model = fasttext.train_unsupervised(
        input=temp.name,
        dim=3,
        epoch=25,
        lr=0.1,
        minCount=1,
        word_ngrams=1,
        bucket=2000000,
    )
    # Test save (the Python bindings expose save_model, not save)
    model.save_model("model.bin")
but when I run this in the GitLab Docker runner I get:
test_fasttext_fit_save Fatal Python error: Floating point exception
and nothing more is shown. When I run this test on my own machine with Docker installed, it works fine.
The Dockerfile has this:
FROM python:3.8
# Upgrade pip
RUN pip install --upgrade pip
RUN pip install --upgrade setuptools wheel
# Install dependencies
RUN pip install fasttext
# tempfile is part of the Python standard library; it does not need (and cannot be) pip-installed
RUN python -c "import fasttext; print(fasttext)"
RUN pip install pytest==6.0.1
RUN pip install pytest-cov==2.10.1
RUN pip install pytest-testmon
...
My computer has 4 GB of RAM and the runner 1 GB, but the test doesn't use anywhere near 1 GB of memory.
It seems like a memory issue: not because the code you run actually requires that much, but because fasttext (Facebook's library) needs a minimum amount of RAM to run correctly. The same thing happens with Prophet, for instance.
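For a sense of scale (back-of-the-envelope numbers, not measurements from the runner): train_unsupervised pre-allocates an input matrix of roughly (vocab + bucket) × dim 32-bit floats, so on a tiny corpus the bucket parameter dominates the footprint. Shrinking bucket (e.g. to 10000) is worth trying on a 1 GB runner:

```python
def ngram_table_mb(bucket: int, dim: int, bytes_per_float: int = 4) -> float:
    """Approximate size of fasttext's pre-allocated subword hash table."""
    return bucket * dim * bytes_per_float / 1024 / 1024

# With the test's settings (bucket=2000000, dim=3) the table is small.
print(round(ngram_table_mb(2_000_000, 3)))    # ~23 MB
# With fasttext's default dim=100, the same bucket count is much larger.
print(round(ngram_table_mb(2_000_000, 100)))  # ~763 MB
```

This is only the one pre-allocated matrix; fasttext allocates more during training, which is why a 1 GB runner can still hit the floor even when the table itself looks modest.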
A good alternative is to use gensim's implementation of fastText.