You’re not even using them at all, not just lessening their weighting! Please add at least one subdomain site from each organization. For Debian, keep their full onion services listing site.
I removed TPO, Debian, Qubes, Whonix and DuckDuckGo because:

- We are already using Whonix, Qubes and Debian as sources of package upgrades, so there is no need for them to be in sdwdate as well.
- Tor / The Tor Project is the anonymizer itself.
- Not relying on too many things from these organizations (and not, on top of that, using them as time source providers too) is better for decentralization.
- DuckDuckGo uses TLS by default on their .onion, and sdwdate doesn’t support that yet. Patrick later ran a test against their .onion and found it working, but sadly that wasn’t always the case, because it didn’t work out with the ProtonMail .onion URL. That’s why I suggested it’s better not to use TLS + onion, to avoid issues in the future, and to use pure .onion instead.
@Patrick this reasoning doesn’t add up. The fact that we trust their code because it is open and hopefully reproducible one day, doesn’t negate that their infrastructure/servers are also a trustworthy source for time data.
I just had a look at the source pools. How is it compiled? It includes the recently FBI-seized site deepdotweb and others that look fishy.
Yes, time sync is an issue. We need to be careful not to expose users to other dangers. I don’t think it is in users’ best interest to unknowingly be connected to sites considered illegal, or that are simply prime targets of authorities.
Edit: I read the discussion regarding “pal” / “foe” and “neutral” pools. In this case we have a “foe” site in the “pal” pool. My point is, with sites considered targets, there is a high likelihood of them actually being “foes” before it becomes public knowledge. https://github.com/Whonix/Whonix/issues/310
This wasn’t added on my watch. We pick mainly journalistic sites running securedrop which are most likely hosted on their own local hardware for security reasons.
We do have a unit test, I think. For best results you would run the testing script once, then a second time a week later, to avoid false positives from servers being down for maintenance.
Then you would try to find an alternative onion for the one that was down.
I don’t see how most of the sites on the list above (from Apr 15) contribute anything here apart from not being part of a large organization.
Requirement was:
What web servers do you consider trustworthy, to take great care of their visitors’ privacy, that are stable and that get great amounts of traffic
I don’t see how they qualify.
Where are they hosted? Any reason to assume it’s not on a large, cheap host that gives you a VPS for $20 a month? Where will you host a small onion site when you want 24-hour availability?
Any reason to trust them? The operators are completely anonymous; they could be anyone.
http://tor66sezptuu2nta.onion/ - this site has banners for drugs and fraud products on the homepage. Looks like a mini-deepdotweb story. I don’t think we should include any sites that openly promote that as formal “pals”. I don’t trust it, I don’t think anyone there cares about my privacy, and I don’t know whether they receive much traffic.
And those come to replace the most trusted sources we have? Actually if I had to pick only 4 then I’d say Debian, Tor, Qubes, Whonix.
Yes.
Sorry, but I don’t see how any random anonymous developer or blogger onion site helps here either. Why should we trust them? Assume they care about our privacy just because they are not part of a big organization?
I’d only pick sites with non-anonymous organizations behind them. The others might be nice, but not for this purpose. It’s impossible to evaluate them when nothing is known about them. Some onion search engine. Good? No? Nobody knows.
Also, not sure it is needed, as per above. url_to_unixtime can get the time even if the server uses TLS. The reason is that url_to_unixtime is really simple: HSTS is a client feature, and url_to_unixtime does not use HSTS. And these web servers don’t reject non-TLS connections; they reply to them with a redirect to the TLS version.
url_to_unixtime is similar to curl, but it does not use curl internally. curl is just useful for showing what it does.
curl --head jlve2y45zacpbz6s.onion
(bold added by me)
HTTP/1.1 302 Found
Date: Sun, 03 Nov 2019 19:03:04 GMT
Server: Apache
Location: https://jlve2y45zacpbz6s.onion/
Content-Type: text/html; charset=iso-8859-1
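The Date header in that reply is all url_to_unixtime really needs. As a minimal sketch of the idea (an illustration, not sdwdate’s actual code; only the example header string above is assumed), converting such a header to unix time is a one-liner with the Python standard library:

```python
# Sketch: convert an HTTP "Date" response header to a unix timestamp.
# This illustrates the idea behind url_to_unixtime; it is not its actual code.
from email.utils import parsedate_to_datetime

def http_date_to_unixtime(date_header: str) -> int:
    # RFC 7231 dates ("Sun, 03 Nov 2019 19:03:04 GMT") parse with the
    # stdlib helper, which returns a timezone-aware datetime.
    return int(parsedate_to_datetime(date_header).timestamp())

print(http_date_to_unixtime("Sun, 03 Nov 2019 19:03:04 GMT"))  # 1572807784
```

Note that the 302 status does not matter: the redirect response already carries the Date header, so no TLS connection is ever needed.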
It could be a non-certificate-verifying TLS implementation: accept the TLS connection no matter the certificate’s validity; in other words, ignore the certificate. Onions (at least v3) should be secure and authenticated enough even without TLS. That is, if such an implementation is easier than a certificate-validating one.
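A hedged sketch of what such a non-verifying client could look like, assuming Python’s stdlib ssl module (this only shows how to build a context that accepts any certificate; whether sdwdate would actually take this route is an open question):

```python
import ssl

def insecure_tls_context() -> ssl.SSLContext:
    # Accept the TLS connection regardless of certificate validity.
    # Rationale from the discussion above: a v3 .onion address already
    # authenticates the endpoint, so skipping X.509 validation is a
    # deliberate trade-off, not an oversight.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False        # must be disabled before CERT_NONE
    ctx.verify_mode = ssl.CERT_NONE   # ignore the certificate entirely
    return ctx

ctx = insecure_tls_context()
print(ctx.verify_mode == ssl.CERT_NONE)  # True
```

This context could then be passed to whatever socket wrapper fetches the HTTP HEAD response, keeping the Date-header logic unchanged.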