I have a local database and I created a dump file from it that is 1.1GB. I need to put that data on Heroku, but every time I run this command:
pg_restore --verbose --clean --no-acl --no-owner -h heroku-host -U db-heroku-username -d db-name dumpfile
my internet connection slows down and after a few minutes I get the following error:

pg_restore: error: error returned by PQputCopyData: server closed the connection unexpectedly
        This probably means the server terminated abnormally before or while processing the request.

I don't know the reason. I suspect my internet connection is too weak to transfer the data, but nothing else comes to mind. I also noticed on the Heroku dashboard that the number of rows stays at about 16.9k while the size of the data keeps changing. This is the last line of output before the error:
pg_restore: processing data for table "public.stock_app_stock"
I tried changing the command and also made a few dump files with different configurations, but the problem is always the same.
The problem was that the database was large and my internet connection was not good enough to handle the data transfer between my local DB and the Heroku Postgres DB. It also helped when I upgraded the Heroku Postgres plan to Standard-0.
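For anyone hitting the same issue on a weak connection: instead of streaming the restore through pg_restore from your machine (which keeps a single connection open for the whole 1.1GB), Heroku can also pull a dump from a publicly accessible HTTPS URL and restore it server-side, so your local connection only has to survive the upload. A minimal sketch of that workflow, assuming hypothetical placeholder names (my-user, my-db, my-bucket, my-app) and that you have the AWS and Heroku CLIs installed:

# create a custom-format dump (required by pg_restore / Heroku restore)
pg_dump -Fc --no-acl --no-owner -h localhost -U my-user my-db > mydb.dump

# upload it somewhere Heroku can fetch over HTTPS, e.g. an S3 bucket,
# then generate a temporary signed URL for it
aws s3 cp mydb.dump s3://my-bucket/mydb.dump
aws s3 presign s3://my-bucket/mydb.dump --expires-in 3600

# hand the signed URL to Heroku; the restore then runs on Heroku's side
heroku pg:backups:restore 'SIGNED_URL_FROM_PREVIOUS_STEP' DATABASE_URL --app my-app

If the upload itself keeps failing, aws s3 cp will retry multi-part chunks on its own, which tends to cope with a flaky connection much better than one long-lived Postgres session.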