Given that APT is plain HTTP, I think you could solve most load issues by letting Varnish cache the repo data (we could purge the cache whenever new data is pushed to the repo). Or yes, we could look at onionbalance.
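To make the idea concrete, here is a minimal sketch of what that could look like in VCL (Varnish 4.1 syntax). Everything here is an assumption for illustration: the backend host/port, the purge ACL, and the TTL values would all need tuning for the real setup.

```vcl
# Hypothetical sketch: cache an APT repo behind Varnish, with purge-on-push.
vcl 4.1;

acl purgers {
    "127.0.0.1";          # assumed: the repo push job runs on the same host
}

backend apt_repo {
    .host = "127.0.0.1";  # assumed: repo served by a local web server
    .port = "8080";
}

sub vcl_recv {
    # The push script can invalidate stale metadata with an HTTP PURGE request.
    if (req.method == "PURGE") {
        if (client.ip !~ purgers) {
            return (synth(405, "Not allowed"));
        }
        return (purge);
    }
}

sub vcl_backend_response {
    # .deb files are immutable once published, so they can be cached for a
    # long time; repo metadata (InRelease, Packages) changes on every push,
    # so keep its TTL short and rely on purging.
    if (bereq.url ~ "\.deb$") {
        set beresp.ttl = 7d;
    } else {
        set beresp.ttl = 10m;
    }
}
```

The push script would then do something like `curl -X PURGE http://repo/dists/...` for the changed metadata paths after publishing.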
So I think a single APT server is probably sufficient (it has been sufficient to date, right?), but I also agree with the rule of not putting all your eggs in one basket. For example, long term I would advocate a separate server for the website/wiki/forum and another for the apt repo, also for security/attack-surface reasons (no need for a MediaWiki hack to put the apt repo at risk). It's the same 'compartmentalisation by design' principle that we appreciate in Whonix and Qubes OS. The only downside is cost (more servers) and a little extra server maintenance (more of my time).
Random onion domains are probably overkill/administrative overhead indeed (I would rather use onionbalance once it's available for v3). It would also make it harder for users to confirm with one another that they're using a legitimate APT source: with 100 different onion domains it's hard to 'google it' and confirm you're not using a malicious onion address (it's hard enough already to read a single long v3 onion address and recognise it as legitimate; Zooko's triangle, etc.).