
How to get the server's selected TLS version in python ssl


I am using python 3.6.5 and Windows 10. I need to get the TLS version that the server selected (i.e. the one that will be used for the rest of the handshake and NOT the client's offered version). I did this:

import socket, ssl

context = ssl.create_default_context()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
domain = 'google.com'
sslSocket = context.wrap_socket(s, server_hostname = domain)
sslSocket.connect((domain, 443))
print('the selected version by the server: ', sslSocket.cipher()[1])
print("success")
sslSocket.close()

But the output is:

the selected version by the server:  TLSv1/SSLv3
success

I need the accurate version, i.e. in a connection to google.com it is TLS 1.2, as the browser shows. How can I get the accurate version in my code?


Solution

  • You are confusing cipher and protocol version. The cipher describes which cryptographic algorithms are selected (e.g. AES encryption, SHA-1 HMAC, RSA authentication, ECDHE key exchange), while the protocol version says which SSL/TLS protocol version is used. The value you currently show is not the negotiated protocol version but only the earliest protocol version for which the selected cipher is defined — "TLSv1/SSLv3" means the cipher has been usable since SSLv3, not that the connection uses SSLv3.

    To get the protocol version actually used by the connection, you instead need to use SSLSocket.version():

    print('the selected version by the server: ', sslSocket.version())
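
    Putting it together, a minimal sketch of the corrected script (using google.com from the question as an example host; any TLS server will do, and the helper name get_tls_version is just for illustration):

    ```python
    import socket
    import ssl

    def get_tls_version(domain, port=443):
        """Connect to a TLS server and return the negotiated protocol version."""
        context = ssl.create_default_context()
        # create_connection opens the TCP socket; wrap_socket then performs
        # the TLS handshake using SNI (server_hostname) for certificate checks
        with socket.create_connection((domain, port)) as sock:
            with context.wrap_socket(sock, server_hostname=domain) as ssl_sock:
                # version() reports the protocol negotiated in the handshake,
                # e.g. 'TLSv1.2' or 'TLSv1.3', unlike cipher()[1] which only
                # names the protocol family the cipher belongs to
                return ssl_sock.version()

    print('the selected version by the server:', get_tls_version('google.com'))
    ```

    On a modern server this prints something like TLSv1.2 or TLSv1.3, matching what the browser reports.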