Tags: docker, heroku, docker-compose, fastapi, streamlit

ConnectionError on multi-Docker app (Streamlit + FastAPI) when deployed on Heroku


I have a multi-Docker application with a Streamlit frontend and a FastAPI backend deployed on Heroku: https://morning-everglades-39854.herokuapp.com/

The code is on GitHub: https://github.com/BioGeek/streamlit-fastapi-langchain (add an OPENAPI_API_KEY to a .env file if you want to reproduce).

It has two services defined in docker-compose.yml: web (the Streamlit frontend) and api (the FastAPI backend).

Locally, I can run:

docker compose build
docker compose up

and go to http://172.19.0.3:8501, ask a math question and get an answer:

[screenshot: app working locally]
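For context, the call that later fails is made from streamlit_app.py roughly like this (a reconstruction from the traceback and line 21 of the file; the widget labels and payload shape are assumptions):

import json

import requests
import streamlit as st

# "api" is the docker-compose service name; 8080 is the port the FastAPI container listens on
host = "http://api:8080/ask"

def main():
    question = st.text_input("Ask a math question")
    if st.button("Ask"):
        header = {"Content-Type": "application/json"}
        payload = json.dumps({"question": question})
        response = requests.request("POST", host, headers=header, data=payload)
        st.write(response.json())

if __name__ == "__main__":
    main()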

But after deploying the app on Heroku with

heroku create
heroku stack:set container
git push heroku main
heroku open

and trying the same math question there, I get the error:

ConnectionError: HTTPConnectionPool(host='api', port=8080): Max retries exceeded with url: /ask (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f9f5e00d040>: Failed to establish a new connection: [Errno -2] Name or service not known'))
Traceback:
File "/usr/local/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 565, in _run_script
    exec(code, module.__dict__)
File "/app/streamlit_app.py", line 34, in <module>
    main()
File "/app/streamlit_app.py", line 24, in main
    response = requests.request("POST", host, headers=header, data=payload)
File "/usr/local/lib/python3.9/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 587, in request
    resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 701, in send
    r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 520, in send
    raise ConnectionError(e, request=request)

How do I fix this?

edit:

To answer Chris's question:

[Could you] explain what api represents in host = "http://api:8080/ask" at line 21 of streamlit_app.py

I based my code on this repository, which has a similar setup. In streamlit-fastapi-model-serving/streamlit/ui.py, lines 9-10 read:

# interact with FastAPI endpoint
backend = "http://fastapi:8000/segmentation"

It is my understanding (but I might be wrong in this) that fastapi here refers to the name of the service/container as defined in the docker-compose.yml.

Similarly, see this question with a similar error, where the solution was to use host = 'http://container_2:8000', where container_2 is the name of the service in the docker-compose.yml file.

Since I have an api service defined in my docker-compose.yml, I set my host to http://api:8080/ask, which works locally but doesn't work on Heroku.
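As a sanity check, the service-name resolution can be verified from inside the web container (just an illustration, not code from the repo):

import socket

# Inside the docker-compose network, the "api" service name resolves to that container's IP.
# Outside that network (e.g. on a Heroku dyno), this raises "Name or service not known".
print(socket.gethostbyname("api"))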

edit 2:

M.O. said:

You're not actually running your FastAPI application. To run multiple processes, you have to specify them in the run: section in heroku.yml

I have done that, but the problem persists.

The error is now slightly different. It says:

ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=8080): 

instead of

ConnectionError: HTTPConnectionPool(host='api', port=8080): 

Solution

  • When you use docker-compose locally, it sets up a network for you (deploy_network in your case). As such, all the services are reachable by their service name by default (web and api in this case).

    On Heroku Docker images run in dynos the same way that slugs do, and under the same constraints:

    Network linking of dynos is not supported.

    When you deploy two Docker images, two dynos will be spun up, one for each container based on their respective image. As such, they will not be networked together privately.

    What you could do is first deploy the api image, get the public URL assigned to it (along with any TLS termination configuration), then set that URL as a config variable on the web app and use it as the connection URL in the web container (see the sketch below).

    However, this would potentially route those requests over the Internet.

    Another option could be to run both the api and the web on the same HTTP server instance in the same container, for example serving the api code under the /api route and the web under the /app route. You would then run only one image that contains both the web and api code combined.
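For the first approach, the frontend would read the backend URL from the environment instead of hard-coding the compose service name. A minimal sketch, assuming a config var named API_URL on the web app (the variable name and the {"question": ...} payload are assumptions):

import os

import requests

# On Heroku, API_URL would hold the public URL of the separately deployed api app,
# e.g. https://<api-app-name>.herokuapp.com. Locally it falls back to the compose service name.
backend = os.environ.get("API_URL", "http://api:8080")

response = requests.post(f"{backend}/ask", json={"question": "What is 2 + 2?"})
print(response.json())

The config var itself can be set on the web app with heroku config:set API_URL=https://<api-app-name>.herokuapp.com -a <web-app-name>.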