sdwdate Time Sources Criteria

Quote https://www.whonix.org/wiki/Sdwdate#sdwdate_Time_Sources_Criteria


Current Implementation 1.0

Prerequisite knowledge: sdwdate time source pool design

These criteria are meant to fit the dynamic trust landscape of the internet and to come as close as possible to the highest achievable level of trust.

Time Source Inclusion Criteria

  • trustworthy. This criterion probably means many different things to many different people. To clarify, it needs to be compatible with the Whonix ™ Platform Goals. Trustworthy as far as infrastructure goes, for example as in unlikely to be using cloud and/or insecure hosting for receiving confidential documents.
  • hosted by non-anonymous organizations or persons.
  • reachable over an .onion domain. [11]
    • If there is a forced redirection from (non-TLS) http onion to TLS https onion, the TLS certificate must be valid. [12]
  • highly likely to be hosted on different hardware than other sdwdate time source pool members.


It is required that each sdwdate time source pool member has both a clearnet domain name and an onion domain name. An example of a clearnet domain name is whonix.org. An example of an onion domain name is dds6qkxpwdeubwucdiaord2xgbbeyds25rbsgr73tbfpqpt4a6vjwsyd.onion. The clearnet domain must be reachable over TLS with a valid TLS certificate. This is because when a website is reachable over an .onion which has a corresponding clearnet domain name with the same contents, hosted by the same author, it is easier to verify the identity of the website author, when the website was created, and where the website or its maintainers are located.

There needs to be evidence that the onion domain is hosted by the same author as the clearnet domain. This can be a mention of the onion domain on the clearnet domain or the Onion-Location HTTP header [archive]. The latter can be conveniently noticed by visiting the website using Tor Browser, which then shows ".onion available", by using services such as securityheaders.com, or by using the curl command line tool, i.e. curl --head https://clearnet.domain [archive].
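The Onion-Location check described above can be sketched in a few lines. This is a minimal illustration, not part of sdwdate itself; it only validates a header value of the form the Tor Project specifies (an http(s) URL whose host ends in .onion), given response headers such as those printed by curl --head:

```python
# Minimal sketch: check whether a clearnet site advertises an onion mirror
# via the Onion-Location HTTP response header. Not sdwdate code; just an
# illustration of the verification step described above.
from urllib.parse import urlparse


def onion_location(headers):
    """Return the advertised .onion URL from a response-header dict, or None.

    HTTP header names are case-insensitive, so normalize before lookup.
    Per the Tor Project's specification the value must be an http(s) URL
    whose host ends in .onion; anything else is ignored.
    """
    value = {k.lower(): v for k, v in headers.items()}.get("onion-location")
    if value is None:
        return None
    parsed = urlparse(value)
    if parsed.scheme in ("http", "https") and parsed.hostname \
            and parsed.hostname.endswith(".onion"):
        return value
    return None
```

For example, headers containing "Onion-Location: http://dds6qkxpwdeubwucdiaord2xgbbeyds25rbsgr73tbfpqpt4a6vjwsyd.onion" would yield that onion URL, while a header pointing at a non-.onion host is rejected.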

Onion services likely hosted on the same hardware or by the same author will be grouped together and act as one, i.e. they will be considered mirrors of the same onion. sdwdate picks one mirror from the group randomly. Onions from the same author will therefore not be used more often than other pool members, and the load among these grouped pool members will be balanced.


This provides higher certainty of having trustworthy time source members because these websites and services have a reputation to maintain. This includes, for example, e-mail services such as protonmail, ctemplar and so forth, or big news networks like The Guardian and so on. Note: just because these are known organizations that are very hard to coerce into operating maliciously, that does not mean there are any guarantees, whether against mistakes, hacks or outside pressure.

Unrealistic Time Source Criteria

  • The onion service being popular or receiving a great amount of traffic. This is very hard for an outsider to verify, compare and reason about. Also, (very) high-traffic onion services might be less reliable.

Rules for sdwdate time source related git pull requests

New sdwdate pool member additions must be proposed in public in Whonix ™ development forum thread Suggest Trustworthy Tor Hidden Services as Time Sources for sdwdate [archive] to allow anyone to comment on it.

  • the following types of changes must be proposed separately, using separate pull requests
    • removal of sdwdate pool members because they are offline, unreliable, their clock is too far off, or they otherwise no longer comply with the requirements
    • updates to already existing sdwdate pool members
      • such as updated onion domain names in case the onion domain name changes
      • or if the onion domain was upgraded from onion v2 to onion v3
    • additions of new sdwdate time sources (if there were no objections in the previous forum discussion)

Time Sources Exclusion Criteria

The rationale for the following exclusion criteria is to avoid likely insecure websites and also to avoid any mention whatsoever of controversial content within sdwdate source code.

The following categories must be avoided, and a time source must be deleted if it later turns out to fall into one of them:

  • Unstable Website: It is not useful to add a service which goes on and off periodically.
  • Sold Website: It is better to remove a website if it happens to be sold and its content will be changed.
  • Website Went Offline: If the website went offline, it should be removed.
  • Contains Any Form of Pornographic Content.
  • Contains or Encourages Content Damaging to Human Health: like drugs, alcohol, smoking, etc.
  • Contains Any Form of Gore, Gang, Terrorist or Assassination Content.
  • Contains Deanonymization or Cracking Services or Spying Agencies: like HackingTeam [archive], Cellebrite [archive], the NSA, GCHQ, etc.
  • Contains or Is Related to Any Form of Governmental Website: like ministries, military websites or anything similar. (Especially those which end with .gov.)
  • Draws highly controversial attention to Whonix ™ or sdwdate due to on-site or off-site activities.
  • Websites which Whonix ™ uses as default software sources (such as Debian, Whonix, Qubes, The Tor Project) or for other purposes (The Tor Project’s check.torproject.org website for whonixcheck --leak-tests). The rationale: should there be any issues with these services (such as being down for maintenance, or being under a denial of service attack), this should not break multiple things in Whonix ™, such as sdwdate and APT upgrading, at the same time.

Credits: Written by @TNT_BOM_BOM, revised by @Patrick.


@TNT_BOM_BOM suggested some changes:

Quote https://www.whonix.org/wiki/Sdwdate#Contributor_Proposed_Version_2.0

It is being proposed to drop the requirement hosted by non-anonymous organizations or persons. I.e. onions hosted by anonymous organizations or persons should also be permitted under the following conditions.

  • Here things are a little bit trickier, as we cannot know much except what the website claims to be, so we cannot know who runs it, where, for how long, etc. Therefore we need a verification mechanism to check:
    • Consensus or Aggregation of Testimonies: We try to collect user opinions on the website, so clearnet sources will be heavily involved, especially social media and blogs. This way we can verify that the website is really doing what it claims to be doing. For example, an e-mail service claiming not to spam its users should not spam its users.
    • Seniority: The older a website becomes, the more trustworthy it is considered, provided there have not been any (deliberate or accidental) publicly verifiable breaches of its promises. Recently established websites cannot with reasonable certainty be judged to be well tested and established, or to be a scam, fraud or deception, or not.