I am importing a large application to Docker. Part of the application is a database. I have dumped the database into a .sql file and am now trying to import it into a Docker container running the official mysql image by mounting a directory from the host machine and issuing the command
mysql -u myUsername -p myDB < /mountdir/databasedump.sql
The database dump is very large, more than 10 GB. Everything goes well for an hour, but then the import fails with the error
loop: Write error at byte offset
I suspect the container is running out of disk space.
Is there a smarter way of dockerizing the database? If not, how can I import such an enormous database into the container?
The problem was that the container could not store the whole imported database, which was 26 GB; the error occurred because the container ran out of disk space.
I solved the problem by mounting a directory from the host as an external volume using the -v switch and editing the MySQL config to store its databases there.
This solution of course 'un-virtualizes' the database storage, which might be a security risk, although the database server itself still runs virtualized. In my situation the slightly weakened security wasn't an issue.
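For reference, something along these lines worked for me. This is a minimal sketch: the host paths, container name, and credentials are placeholders you would adjust to your own setup. Note that the official mysql image already stores its data under /var/lib/mysql, so mounting a host directory there keeps the data on the host without needing a config edit.

```shell
# Start the official mysql image with the data directory on the host,
# so the 26 GB of data never has to fit inside the container's filesystem.
# /srv/mysql-data and /srv/dump are example host paths.
docker run -d --name mydb \
  -e MYSQL_ROOT_PASSWORD=secret \
  -v /srv/mysql-data:/var/lib/mysql \
  -v /srv/dump:/mountdir \
  mysql

# Import the dump from inside the container; the file is visible
# through the second mount.
docker exec -i mydb sh -c \
  'mysql -u myUsername -p myDB < /mountdir/databasedump.sql'
```

If you do need a non-default data directory instead, you can point MySQL at it with the datadir setting in my.cnf and mount the host directory at that path.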