
Python - How to work around a web server connection limit


I have a heating system with a built-in web server and I want to read data from it. The problem is a limitation of the web server: the maximum number of connections is 6. I wrote a simple script to fetch the XML (the heating system exposes its data as XML). The script works fine, but only for 6 calls per minute. I tried to force Python to close each connection after fetching the data, but the connections stay open.

Do you have any idea how to force the remote web server to close the connection?

My simplified code:

import requests
from pprint import pprint

s = requests                      # intended as a session-like handle (this is the requests module itself)
s.keep_alive = False              # attempt to disable keep-alive
link1 = "http://some_ip/TOP1.XML"
# ask the server to close the connection after the response
f = s.get(link1, headers={'Connection': 'close', "Timeout": "5000"})
pprint(vars(f))

Solution

  • From requests docs:

    ( ... ) So if you’re making several requests to the same host, the underlying TCP connection will be reused, which can result in a significant performance increase (see HTTP persistent connection).

    I would try using a Session object and reusing the TCP connection.

    s = requests.Session()
    s.get(url)
    

    You probably also want to read about keep-alive in the requests documentation.
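
    As a minimal sketch of that idea (the host some_ip and the path TOP1.XML come from the question; the number of polls and the polling interval are assumptions), one Session can be reused for all reads so that only a single TCP connection to the heating system stays open:

    import time
    import requests

    # reuse one Session (and therefore one pooled TCP connection) for every request
    with requests.Session() as s:
        for _ in range(10):                                      # assumed number of polls
            r = s.get("http://some_ip/TOP1.XML", timeout=5)      # timeout is a parameter, not a header
            r.raise_for_status()
            print(r.text[:100])                                  # do something with the XML
            time.sleep(10)                                       # assumed polling interval

    Because the Session pools the connection, the server sees one persistent connection instead of a new connection per call, which keeps you well under the 6-connection limit.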