Time Source Inclusion Criteria
These criteria are meant to fit the dynamic trust model of the internet and to come as close as possible to the highest attainable level of trust.
trustworthy. This criterion probably means many different things to many different people. To clarify, it needs to be compatible with the Whonix ™ Platform Goals. Trustworthy as far as infrastructure goes, for example as in unlikely to be using cloud and/or insecure hosting for receiving confidential documents.
If there is a forced redirection from (non-TLS) http onion to TLS https onion, the TLS certificate must be valid. 
highly likely to be hosted on different hardware than other sdwdate time source pool members.
It is required that each sdwdate time source pool member has both a clearnet domain name and an onion domain name. An example of a clearnet domain name is whonix.org. An example of an onion domain name is dds6qkxpwdeubwucdiaord2xgbbeyds25rbsgr73tbfpqpt4a6vjwsyd.onion. The clearnet domain must be reachable over TLS with a valid TLS certificate. This is because when a website is reachable over .onion and has a corresponding clearnet domain name with the same contents, hosted by the same author, it is easier to verify the identity of the website author, when the website was created, and where the website or its maintainers are located.
There needs to be evidence that the onion domain is hosted by the same author as the clearnet domain. This can be a mention of the onion domain on the clearnet domain or the Onion-Location HTTP header[archive]. The latter can be conveniently noticed by visiting the website using Tor Browser, which then shows ".onion available", by using services such as securityheaders.com, or by using the curl command line tool, i.e. curl --head https://clearnet.domain [archive].
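The Onion-Location check can also be scripted. A minimal sketch in Python (the advertises_onion helper and the header mapping are illustrative, not part of sdwdate):

```python
from urllib.parse import urlparse

def advertises_onion(headers):
    """Return the onion hostname advertised via the Onion-Location
    response header (case-insensitive lookup), or None if the site
    does not advertise an onion mirror."""
    for name, value in headers.items():
        if name.lower() == "onion-location":
            host = urlparse(value).hostname or ""
            if host.endswith(".onion"):
                return host
    return None

# Example: headers as reported by `curl --head https://clearnet.domain`
headers = {
    "Content-Type": "text/html",
    "Onion-Location": "http://dds6qkxpwdeubwucdiaord2xgbbeyds25rbsgr73tbfpqpt4a6vjwsyd.onion/",
}
print(advertises_onion(headers))
```

The same mapping could be filled from any HTTP client; only the presence and value of the Onion-Location header matter for this check.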
Onion services likely hosted on the same hardware or by the same author will be grouped together and treated as one, i.e. they will be considered mirrors of the same onion. sdwdate picks one mirror from the group at random. Onions from that author will therefore not be used more often than other pool members, and the load among these grouped pool members is balanced.
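Sketched in Python (hypothetical data layout; sdwdate's actual pool format differs), the grouping and random pick might look like:

```python
import random

# Each inner list groups onions believed to be run by the same author
# or hosted on the same hardware; the whole group counts as one pool member.
grouped_pool = [
    ["mirror-a1.onion", "mirror-a2.onion", "mirror-a3.onion"],  # same author
    ["standalone-b.onion"],
    ["standalone-c.onion"],
]

def pick_members(groups):
    """Pick exactly one mirror per group, so no single author is
    queried more often than any other pool member."""
    return [random.choice(mirrors) for mirrors in groups]

print(pick_members(grouped_pool))
```

Because exactly one mirror is drawn per group, an author contributing three mirrors still only ever answers one query per run.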
This provides higher certainty of having trustworthy time source members because these websites and services have a reputation to maintain. This includes for example e-mail services such as protonmail, ctemplar and so forth, or big news networks like The Guardian and so on. Note: Just because these are well-known organizations that are very hard to coerce into operating maliciously does not mean there are guarantees, whether against mistakes, hacks or outside pressure.
Unrealistic Time Source Criteria
The onion service being popular or receiving a great amount of traffic. This is very hard for an outsider to verify, compare and reason about. Also, (very) high-traffic onion services might be less reliable.
Rules for sdwdate time source related git pull requests
Contain or Are Related to Any Form of Governmental Website: like ministries or military websites or anything similar. (Especially those which end with .gov.)
Draw highly controversial attention to Whonix ™ or sdwdate due to their on-site or off-site activities.
Websites which Whonix ™ uses as default software sources (such as Debian, Whonix, Qubes, The Tor Project) or for other purposes (The Tor Project's check.torproject.org website for whonixcheck --leak-tests). This is so that, should there be any issues with these services (such as being down for maintenance or being under a denial of service attack), multiple things in Whonix ™, such as sdwdate and APT upgrading, do not break at the same time.
It is being proposed to drop the requirement of being hosted by non-anonymous organizations or persons, i.e. onions hosted by anonymous organizations or persons should also be permitted under the following conditions.
Here things are a little trickier, as we cannot know much beyond what the website claims to be; we cannot know who runs it, where, for how long, etc. So we need verification mechanisms to check:
Consensus or Aggregation of Testimonies: We try to collect users' opinions on the website, so clearnet sources, especially social media and blogs, will be heavily involved. This way we can verify the website is really doing what it claims to be doing. For example, an e-mail service claiming not to spam its users should not spam its users.
Seniority: The older a website becomes, the more trustworthy it will be considered, provided there have been no publicly verifiable breaches of its promises (deliberate or by mistake). Recently established websites cannot with reasonable certainty be considered well tested and established, or be judged to be a scam, fraud or deception, or not.
How many do we need anyhow? If servers aren't swamped by sdwdate and are stable, I guess ~15 per pool is sufficient to keep the maintenance load reasonable. Otherwise too much work is generated from sources going offline, becoming slow or changing domains.
I suggest lowering the request timeout threshold, so sdwdate does not hang for a long time on broken links.
This would require some data on average load times on different connection setups (slow ISP, perhaps with bridges). It often took around 110 seconds for me to get a reply from a time server, so a 120 second timeout seemed reasonable and led to fewer reports of sdwdate issues. But perhaps that has improved with onion v3?
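A timeout sketch in Python (illustrative only; sdwdate's real fetcher differs and routes over Tor): a plain socket HEAD request that gives up after a configurable number of seconds, so a dead link cannot stall the whole run:

```python
import socket

def fetch_head(host, port=80, timeout=120.0):
    """Send a minimal HTTP HEAD request and return the raw response
    bytes, raising socket.timeout if the server stays silent longer
    than `timeout` seconds."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.settimeout(timeout)
        request = ("HEAD / HTTP/1.1\r\n"
                   "Host: %s\r\n"
                   "Connection: close\r\n\r\n" % host)
        sock.sendall(request.encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # server closed the connection
                break
            chunks.append(data)
    return b"".join(chunks)
```

The timeout value (120.0 here, matching the figure discussed above) is the tunable in question; lowering it trades robustness on slow links for faster failure on broken ones.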
A lot of work for effectively just 1 source. Generally, too many mirrors grouped into the same source just increase maintenance effort for little gain.
I disagree. It is very relevant, since we don't allow services by anonymous operators for liability reasons. I think the opinion of the privacyIO dev is important to re-read here:
The reason being we do not like to provide information which cannot be verified by public sources. We don’t allow anonymous companies to provide services because it involves people trusting an unknown entity with their data that cannot be verified. If the company fails or does something disastrous there is no recourse.
To add CTemplar we would have to relax/remove our trust requirements. If we did this, we’d have all sorts of services recommended (we actually put that requirement in place to ward against people recommending random unknown .onion service email providers).
We won’t be signing any NDAs regarding this, as it would mean we cannot reveal what we learn, and thus puts it on the community to trust us instead of the company they’re doing business with.
No, you got it wrong. Anonymous here (in sdwdate) is not about persons or company activities or any of that; it was just talking about hidden services with no clearnet access.
But since there is a clearnet link and a hidden service link with no suspicious contents that violate the Whonix sdwdate rules, it is considered absolutely fine to add.
That's why we said in the rules that if a website changes its content or gets sold, we will delete it. Why do we say that? Because we don't know anything about the website itself, from its (hidden) services to the persons behind it, etc., and we don't need to know, because it's irrelevant here.
I meant any anonymous website that only exists on hidden services, where it is unlikely that the person behind it can be found, as opposed to non-anonymous ones.
The idea of anonymous versus non-anonymous is based on the website itself, not on the operators/staff, whom it is nearly impossible to know.
It's meaningless to know who is personally behind server x, whereas the whole idea of trusting an sdwdate source is based on the website itself, e.g. that it does not act maliciously, host very bad content, or violate any Whonix rules. And ctemplar is far away from any of that.
Whether the company is registered or not, or whether their services are slow and bad, etc., is meaningless to care about in this topic.
The problem is anonymously operated services, where the controllers behind the website are anonymous. This applies in theory to both anonymously operated clearnet services and anonymously operated onion services; it is just that sdwdate only uses onions, not clearnet.
If there is no known real identity behind a website, if it's a pseudonym, then many different websites could actually be hosted by the same entity. Take .onion e-mail providers, for example: if one entity is hosting one, it would be easy to script the process and host lots of others with minor design differences. For comparison, look up how high the percentage of malicious Tor relays is.
sdwdate needs to trust onions not to show fake information. There are safeguards in place. One of these is that trust is distributed: it takes the median (not the average) of 3 onions, one from each pool. But if we started adding all the anonymously hosted onions, we wouldn't know if sdwdate is still getting the time from different sources. It could just as well get the time from the very same source (operator).
Good point, but this is more of a technical issue, not really solved by asking whether the website is hosted by Mr. X or by an unknown; it is very easy to pseudonymize anything, even registered companies.
Also, do we really know who is behind each and every sdwdate onion source? I highly doubt that, and I can go through them 1 by 1 if you want. And even if all of them turned out to be non-anonymous, that still doesn't really mitigate the issue you described. (Non-anonymous doesn't mean non-malicious, non-fake, non-scam, etc.)
I think the solution to the issue you mentioned is using different websites that vary from one another, starting from location, content, etc., which theoretically makes it hard to believe they are all acting maliciously at the same time. The good part is that we are already doing that (or any other technical solution, if available).
To say ctemplar is an untrusted onion source would lead to removing multiple other onion v3 sources as well, and restrict the acceptance of new onions just because they are anonymous, even with nothing bad on record.
Also, according to my proposal for accepting anonymous sources, ctemplar and the others win this without question.
Yeah, if we have enough onion sources and can afford it, a review of that would be good.
Do you think you could add a pull request with the recent two suggested lists?
(Please check that no duplicates will be added, and include evidence that the onion is hosted by that clearnet domain. As usual.)
(But no need to add it if you can already see that the operator is actually anonymous.)
Indeed, but since trust is distributed (picking the median of 3 onions), it's unlikely all of them are malicious. It's based on chance. Like many systems, similar to Tor, we need to assume that most, or at least a high percentage, are honest and base the design on that.
pool 0: -1000000000 second(s) (malicious)
pool 1: -1 second(s)
pool 2: -2 second(s)
In that case pool 2 wins since it is the median. sdwdate sets the clock -2 seconds (+ clock randomization in Whonix). Good.
pool 0: -1000000000 second(s) (malicious)
pool 1: -2000000000 second(s) (malicious)
pool 2: -2 second(s)
In that case pool 0 would win and the clock would be set back -1000000000 second(s), but probably actually not, due to the sdwdate Time Replay Protection safeguard.
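The replay safeguard amounts to enforcing a lower bound on acceptable time. A hedged sketch (the real sdwdate implementation is more involved; the lower bound here could be, for example, the build timestamp of the installed release):

```python
def passes_replay_protection(candidate_unixtime, minimum_unixtime):
    """Reject any remote time earlier than a known-good lower bound,
    so a malicious pool member cannot turn the clock far back."""
    return candidate_unixtime >= minimum_unixtime

# A clock set back by 1000000000 seconds lands far before any
# plausible lower bound and is therefore rejected.
now = 1700000000
print(passes_replay_protection(now - 1000000000, minimum_unixtime=1600000000))  # False
print(passes_replay_protection(now - 2, minimum_unixtime=1600000000))           # True
```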
pool 0: +1000000000 second(s) (malicious)
pool 1: +2000000000 second(s) (malicious)
pool 2: +2 second(s)
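The scenarios above can be reproduced with a plain median calculation. Note that in the last scenario the malicious +1000000000 offset wins the median, which is why additional safeguards still matter:

```python
import statistics

def winning_offset(pool_offsets):
    """Median of the per-pool clock offsets (one offset per pool);
    a single malicious pool cannot control the result on its own."""
    return statistics.median(pool_offsets)

print(winning_offset([-1000000000, -1, -2]))           # -2
print(winning_offset([-1000000000, -2000000000, -2]))  # -1000000000
print(winning_offset([1000000000, 2000000000, 2]))     # 1000000000
```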
For anonymous onions we wouldn’t know location.
And setting up tons of different onions with different content (a forum, a wiki, a mail service) isn't actually hard for bigger malicious entities paying a few sysadmins full-time.
We don't, and I think we won't have enough onion sources either, because really knowing that a website has no anonymous operators behind it while it uses anonymity technology is sort of paradoxical in itself.
yeah no problem.
This line I never checked before, nor can I think of a way to really prove that the operator of website x is non-anonymous. Claiming "I'm x" is very easy, with no way to verify for sure that x is really x (if he even said "I'm x").
We have the clearnet side to learn from. (The discussion above is not about anonymous pure onion services; it's about anonymous clearnet operators mirrored over onion.)
We have our criteria for anonymous operators, which can handle this (seniority, consensus...).
Other ways can be invented/added, but currently only this is available.