Yes, that onion address should be fine.
use onion sources list for apt-get updating by default
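For context, an onion apt source is just a normal sources.list entry routed over Tor via the apt-transport-tor package. A hedged sketch of what such an entry could look like (the onion address and suite below are placeholders, not Whonix's real values):

```
# /etc/apt/sources.list.d/whonix.list -- illustrative only.
# Requires the apt-transport-tor package so apt can reach .onion hosts.
deb tor+http://<whonix-repo-address>.onion/whonix stretch main
```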
Due to the current flakiness of v3 onion services, this should be postponed.
A fix has been implemented and is expected to arrive in Tor 0.3.5 in December.
While we are at it… should we separate the Whonix website onion from the Whonix repository onion? We would keep the Whonix website onion as is (still functional for apt-get, so it does not break for anyone), but the next Whonix upgrade would move everyone else to another, fresh onion address.
- separate website and apt-get downloads
- future-proofs against growing server load
- we could move apt-get downloads to a different server when needed
Should we even create 2-10, or even 10-100, different onion domains and randomly assign each Whonix user one? Why: load balancing. Why not: probably overkill. Debian manages without such gymnastics, but as far as I know they don't have v3 onions yet.
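The random-assignment idea could be sketched as follows. This is only an illustration, assuming a hypothetical pool of mirror addresses and a stable per-machine identifier; a hash-based pick spreads installs roughly evenly across the pool while keeping each install's choice stable across upgrades:

```python
import hashlib

# Hypothetical pool of repository onion addresses (placeholders, not real).
ONION_MIRRORS = [
    "mirror-a.onion",
    "mirror-b.onion",
    "mirror-c.onion",
]

def assign_mirror(machine_id: str) -> str:
    """Deterministically map a machine to one mirror: the hash spreads
    load roughly evenly, and the same machine always gets the same pick."""
    digest = hashlib.sha256(machine_id.encode()).digest()
    index = int.from_bytes(digest[:4], "big") % len(ONION_MIRRORS)
    return ONION_MIRRORS[index]
```

The downside mentioned below still applies: with many domains, no single address is widely recognisable, so users cannot easily cross-check that theirs is legitimate.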
Perhaps we’ll wait for onionbalance v3 support?
Research on this is low priority due to above status.
Given that APT is simple HTTP, I think you could solve most load issues by letting Varnish cache the repo data (we could purge the cache when pushing new data to the repo). Or yes, we could look at onionbalance.
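A rough VCL sketch of the Varnish idea, purely illustrative: backend, port, ACL, and TTLs below are assumptions, not a working config. Published .deb files are immutable, so they can be cached long; repo metadata changes on each push, so it gets a short TTL plus an explicit PURGE from the push script:

```
vcl 4.0;

backend repo {
    .host = "127.0.0.1";   # placeholder: the apt repo webserver
    .port = "8080";
}

acl purgers {
    "127.0.0.1";           # only the repo push script may purge
}

sub vcl_recv {
    if (req.method == "PURGE") {
        if (!client.ip ~ purgers) {
            return (synth(405, "Not allowed"));
        }
        return (purge);
    }
}

sub vcl_backend_response {
    if (bereq.url ~ "\.deb$") {
        set beresp.ttl = 7d;   # immutable package files: cache long
    } else {
        set beresp.ttl = 5m;   # Release/Packages metadata: short TTL
    }
}
```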
So I think a single APT server is probably sufficient (it has been sufficient to date, right?), but I also agree with the principle of not putting all your eggs in one basket. For example, long term I would advocate a separate server for the website/wiki/forum and one for the apt repo, also for security/attack-surface reasons (a MediaWiki compromise should not put the apt repo at risk). It's the same 'compartmentalisation by design' principle that we appreciate in Whonix and QubesOS. The only downsides are cost (more servers) and a little extra server maintenance (more of my time).
Random onion domains are probably overkill/administrative overhead indeed (I would rather use onionbalance once it is available for v3). It would also make it harder for users to confirm with one another that they are using a legitimate APT source: with 100 different onion domains it is hard to 'google it' and verify you are not using a malicious onion address (it is hard enough already to read just one long v3 onion address and recognise it as legitimate; Zooko's triangle, etc.).