(Curl --head also Not OK)
That is a bug.
ctemplarpizuduxk3fkwrieizstx33kg5chlvrh37nz73pv5smsvl6ad.onion: b'Parsing HTTP header date failed.'
I am not sure. That might be happening because url_to_unixtime uses HTTP 1.0; ctemplar might require at least HTTP 1.1. The following code is related:
s.send('HEAD / HTTP/1.0\r\n\r\n'.encode())
I didn't manage to update that to HTTP 1.1. (No, it's not just about changing the string 1.0 to 1.1. That would be easy. By doing that, many webservers report an invalid request.)
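For illustration only: unlike HTTP 1.0, HTTP 1.1 requires a Host header, which is most likely why simply swapping the version string made many webservers reject the request. This is a minimal sketch, not the url_to_unixtime code; the host is a placeholder.

```python
import socket

# Sketch: a well-formed HTTP 1.1 HEAD request needs a Host header,
# otherwise many servers answer "400 Bad Request".
host = "example.com"  # placeholder host
s = socket.create_connection((host, 80), timeout=10)
s.send(f"HEAD / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n".encode())
print(s.recv(4096).decode(errors="replace"))
s.close()
```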
So instead of figuring out how to handle HTTP 1.0 vs HTTP 1.1 (some servers only support one or the other), I thought: why not use a standard/popular Python 3 library to fetch the HTTPS headers.
I now have a Python 3 requests-based implementation. Not pushed to git yet, but soon. (A sketch follows the list below.) It takes care of:
- HTTP header fetching
- HTTP header parsing (we need the Date: field)
- HTTP 1.0 and HTTP 1.1 compatibility
- TLS support
- SOCKS support
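A minimal sketch of that approach, assuming Tor's SOCKS port on 127.0.0.1:9050 and requests with SOCKS support (PySocks / requests[socks]) installed; the function name and details are illustrative, not the final code:

```python
import email.utils
import requests

def url_to_unixtime_sketch(url, socks_port=9050):
    # socks5h:// makes the Tor SOCKS proxy resolve the .onion hostname.
    proxies = {
        "http": "socks5h://127.0.0.1:%s" % socks_port,
        "https": "socks5h://127.0.0.1:%s" % socks_port,
    }
    # requests speaks HTTP 1.1, handles TLS, and parses the reply headers.
    reply = requests.head(url, proxies=proxies, timeout=30)
    # Convert the Date: header (RFC 7231 format) to a unix timestamp.
    parsed = email.utils.parsedate_to_datetime(reply.headers["Date"])
    return int(parsed.timestamp())

# Example (hypothetical invocation):
# print(url_to_unixtime_sketch("https://check.torproject.org/"))
```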
That fixes one v3 onion (ctemplar) but breaks two v2 onions that use invalid TLS certificates.
pool 2 url ltcpool5brio2gaj.onion: connect error: SOCKSHTTPSConnectionPool(host='ltcpool5brio2gaj.onion', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')])"))) (Curl --head also Not OK)
Testing the URL Chunk:
['cyphdbyhiddenbhs.onion', 'wooprzddebtxfhnq.onion']
pool 2 url cyphdbyhiddenbhs.onion: connect error: SOCKSHTTPSConnectionPool(host='cyphdbyhiddenbhs.onion', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')])"))) (Curl --head also Not OK)
In the past, some other onions might have been removed by mistake because url_to_unixtime lacked HTTP 1.1 support and we were unaware that this was the issue.
We could now check which onions have TLS support and add those.
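A hypothetical helper for that check, reusing the same requests/SOCKS setup as above; the classification labels are made up:

```python
import requests

PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def tls_status(onion):
    # Try an HTTPS HEAD request and report whether the onion serves a
    # verifiable certificate, an invalid one, or nothing on port 443.
    try:
        requests.head("https://%s/" % onion, proxies=PROXIES, timeout=30)
        return "tls ok"
    except requests.exceptions.SSLError:
        return "invalid certificate"
    except requests.exceptions.RequestException:
        return "https unreachable"
```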