IETF discusses improving NTP security

Some of the smaller sites probably don’t qualify as “having a great amount of traffic”, then. For example, anonguide.cyberguerrilla.org or ev0ke.net.

How is local hardware a more reliable time source than a large data center? Where are distortions more likely to be noticed? Where do those servers get their time from — probably NTP? Do the smaller SecureDrop-centered sites care much about accurate time?

Is the “securedrop pool” (or any pool) regularly far off relative to the larger servers’ time? I would assume it is. Have any stats been collected on that? Have any stats been collected on which of the 3 groups ends up supplying the median time value? In a perfect world each group should have a 33% chance. If we check and find that one group supplies the chosen onion 50% or 70% of the time (due to frequently inaccurate time in the others, or a consistent bias toward running early or late in some types of servers), that’s an issue. Has anyone attempted to check those points?
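The check I have in mind could be quite simple. A minimal sketch, assuming one time sample per pool per round and median-of-three selection (the pool names and the exact selection rule are my assumptions, not the documented algorithm):

```python
import statistics
from collections import Counter

def median_supplier(samples):
    """samples: dict pool_name -> reported Unix time.
    Returns the pool whose value is the median of the three."""
    med = statistics.median(samples.values())
    for pool, t in samples.items():
        if t == med:
            return pool

def tally(rounds):
    """Count how often each pool supplies the chosen (median) value.
    With unbiased pools this should hover around 1/3 each."""
    counts = Counter(median_supplier(r) for r in rounds)
    total = sum(counts.values())
    return {pool: n / total for pool, n in counts.items()}

# Two hypothetical rounds: pool "b" wins the first, "a" the second.
rounds = [{"a": 100, "b": 102, "c": 105},
          {"a": 205, "b": 204, "c": 206}]
print(tally(rounds))  # {'b': 0.5, 'a': 0.5}
```

A long-running tally like this, fed by real fetches, would directly answer the 33%-vs-70% question.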

Not that I have a better way, but the whole concept looks pretty subjective. Which pools are “likely to keep their users’ privacy” becomes highly speculative when we need a large number of servers.

By riseup’s own admission, their servers are occasionally seized and examined by authorities. They had the canary saga a couple of years ago. Are they still considered a “pal”?

Is it better to have 10 well-trusted, high-volume servers instead of 20+ maybes?

Onion vs. clearnet: in the case of clearnet (the Tails pools use clearnet, AFAIK), at least we know for sure which servers they are.

Does anyone maintain those lists? Are there onions that have been unreachable for a long period and need to be removed? Is there any mechanism (not on the user’s side) to regularly cycle through them and check availability or response times? Is there any fixed procedure to periodically evaluate the list?
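The maintainer-side sweep I mean wouldn’t have to be elaborate. A minimal sketch of the bookkeeping, assuming a plain TCP connect as the liveness probe (a real checker for onions would have to route through Tor’s SOCKS proxy, which this sketch deliberately leaves out):

```python
import socket
import time

def check_host(host, port, timeout=5.0):
    """Attempt a TCP connect and record the response time.
    Note: for .onion addresses this would need to go through
    Tor's SOCKS proxy (e.g. via PySocks); this only shows the
    record-keeping, not the Tor transport."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return {"host": host, "up": True,
                    "latency": time.monotonic() - start}
    except OSError:
        return {"host": host, "up": False, "latency": None}

def sweep(hosts, port=80):
    """Cycle through the whole pool list once. A maintainer could
    run this on a schedule and drop hosts that fail N sweeps in a row."""
    return [check_host(h, port) for h in hosts]
```

Run periodically, the `up`/`latency` history is exactly the data needed to decide which stale onions to prune from the list.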