
Cryptolog for Whonix website


#1

Should the Whonix website default to using cryptolog? What about qubes-os.org?

Cryptolog is a simple log filter program that reads log file entries from standard input and writes to either a file or pipes them to the standard input of another program (like logrotate or cronolog). The filter takes the IP address in the entry (everything before the first space character) and encrypts it, throwing away the key.

Technically, cryptolog reads 16 bytes of random data from /dev/urandom and stores them in a file (called the salt). It then calculates a SHA-256 hash of the salt concatenated with the original IP address, base64-encodes that, and keeps the first six characters of the result. That's what gets stored instead of the IP address in the resulting log entry. Of course, this means that if someone who wishes to know the original IP addresses gets access to these logs, all they need to know is the salt (which is also stored on the hard drive) to uncover the original IPs. In order to limit this, the salt gets replaced once a day with a new random 16 bytes. At worst, an attacker can only recover the last day's worth of original IP addresses.
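The scheme described above can be sketched in a few lines of Python (a minimal illustration of the salt-plus-hash idea, not cryptolog's actual code; function names are my own):

```python
import base64
import hashlib
import os


def new_salt() -> bytes:
    """Draw 16 random bytes, as cryptolog does from /dev/urandom."""
    return os.urandom(16)


def anonymize_ip(salt: bytes, ip: str) -> str:
    """SHA-256 the salt concatenated with the IP, base64-encode,
    and keep only the first six characters."""
    digest = hashlib.sha256(salt + ip.encode()).digest()
    return base64.b64encode(digest)[:6].decode()


salt = new_salt()
token = anonymize_ip(salt, "67.169.69.72")

# Same IP + same salt -> same token, so uniques vs. pageviews
# are still distinguishable within one day.
assert token == anonymize_ip(salt, "67.169.69.72")

# After the daily salt rotation the new tokens are unlinkable
# to the old ones (and the old salt is thrown away).
assert token != anonymize_ip(new_salt(), "67.169.69.72")
```

Since only six base64 characters of the digest survive, even brute-forcing the IPv4 space against a leaked salt only recovers candidates for that salt's one-day window.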

Cryptolog makes logs that look like this:

67.169.69.72 - - [12/May/2011:17:58:07 -0700] "GET / HTTP/1.1" 200 430

look like this instead:

UkezVh - - [12/May/2011:17:58:07 -0700] "GET / HTTP/1.1" 200 430

The string that replaces the IP address will remain the same for the same day, so you can tell the difference between unique visitors and pageviews.

Here are some example CustomLog lines for your Apache config files:

CustomLog "| /usr/bin/cryptolog -w /root/cryptolog-access.log" combined
CustomLog "| /usr/bin/cryptolog -c /usr/bin/cronolog\\\ /root/cryptolog-access-%Y-%m-%d.log" combined
CustomLog "| /usr/bin/cryptolog -s /tmp/salt_file -w /root/cryptolog-access.log" combined

Notice that if you’re using the -c option, you need to escape spaces in the command you’re running with three backslashes.

On a related issue, is there a reason the Whonix forums can’t be used/run under the .onion domain (unlike the wiki)?

Assume in advance everyone and everything posted here is of great interest to the police state authoritarians.


#2

Good day,

I don't think Cryptolog has any advantage for us, as the webserver is configured not to store IP addresses of any kind (it logs 127.0.0.1 for everyone instead).

Using “forums.kkkkkkkkkk63ava6.onion” works for me.

Which?

Have a nice day,

Ego


#3

OK - good to know re: IP logs.

Thanks re: forums info. I wasn’t aware of it. The main Whonix webpage on http://kkkkkkkkkk63ava6.onion/ has the forums button link going to ->

https://forums.whonix.org/

Whereas the ‘Home’, ‘Download’ and ‘Wiki’ buttons are all pointing to the .onion references.

So, whoever is responsible for that webpage, can they please update the forum link to point to ->

forums.kkkkkkkkkk63ava6.onion

Forum users would be generally advised to stay within the Tor network whenever possible.

Re: the authoritarians jibe, I’m referring to almost every government in Europe, Asia, North America, the Pacific, Africa, the ME… they have a serious hardon for people using encryption and Tor, so, that makes Whonix, TAILS and users of other similar platforms of extreme interest. Witness the UK Snoopers Charter, Oz meta-data rules, FBI Rule 41 changes, stacks of disclosed documents, “Collect it all” slides etc etc. But I’ll leave the political ranting for another day :wink:

And you have a good day too.

Cheers


#4

Good day,

Well, that would be me then, as I was the one who created the page. The reason for this is that the wiki, etc. are all normal pages in the same directory (i.e. /wiki, /download, …) while the forum is in its own separate section (forums.whonix.org), making it hard to dynamically link there. Will look into solutions.

Regarding that: it is a slightly controversial subject. There have been (and still are) cases where accessing a "clearnet site" via Tor is considered safer than accessing a hidden service. Example: https://github.com/filosottile/hstools

Either way, will try to find a way to direct the users to our HS in any case.

Have a nice day,

Ego


#5

There are more such issues. Making popular webapps reachable through two different domains is unsupported by those apps. Therefore, the best solution we can suggest is the following:

I doubt that will be possible without involving PHP and whatnot. I guess the above solution is good enough, especially since we don't have the resources to solve similar issues in the other webapps we are using.


#6

Good day,

That’s slightly creepy, as using PHP (which already is being used for the translation of our documentation) would have been the way I’d go.

Have a nice day,

Ego


#7

I am not opposed to PHP. Being opposed would be hard anyway, since MediaWiki and WordPress are based on it. But I don't think we can solve this issue across the whole Whonix website for all or most of our webapps. So we will always need the https://www.whonix.org/wiki/Forcing_.onion_on_Whonix.org workaround. I just don't think it's worth introducing PHP on our homepage for this reason alone. If there is another reason, such as implementing the mailing list on our homepage, then PHP may be the way to go.


#8

Good day,

As mentioned, PHP has already been used for the new translation solution on our new homepage, which will be deployed once the Quick-Start Guide is finished. Adding to that, the newsletter.

Have a nice day,

Ego


#9

That's really interesting, Ego, re: HSDir profiling, and Patrick re: setting a rule for .onion sites. And you have a Phabricator .onion, nice! Thanks - learning a lot from you guys.

The (mis)understanding of normal users is that staying within the .onion network is much safer in general, as this is the boiler plate advice coming from the Tor Project.

For instance:

1. Certificate Authority compromises (CAs are the main weakness in the whole internet - take them down and nothing is secure)

https://blog.torproject.org/blog/detecting-certificate-authority-compromises-and-web-browser-collusion

Quite seriously, when a CA is compromised, it will impact a great deal more than the web; users of email systems (SMTP, IMAP, POP, etc.), Jabber servers, and any other SSL/TLS enabled systems are all at risk. Blocking specific serial numbers or relying on flawed, provably broken methods of revocation will simply not cut it anymore. When the actual protection mechanisms are not enforced, there is little hope of end users being protected.

This should serve as a wake up call to the internet. We need to research, build, and share new methods for ensuring trust, identity, authenticity, and confidentiality on the internet. Proposals such as DANE, CAA, HASTLS, and Monkeysphere are steps in the right direction but they face an uphill battle from entrenched economic interests.

Certification Authorities may continue to provide a piece of the puzzle but it’s high time we ensure that they’re not the alpha and the omega, anymore.

2. PRISM surveillance and their suggestion of .onions as a means of protection

https://blog.torproject.org/blog/prism-vs-tor

However, the real interesting use cases for Tor in the face of dragnet surveillance like this is not that Tor can protect your gmail/facebook accounts from analysis (in fact, Tor could never really protect account usage metadata), but that Tor and hidden services are actually a key building block to build systems where it is no longer possible to go to a single party and obtain the full metadata, communications frequency, or contents.

Tor hidden services are arbitrary communications endpoints that are resistant to both metadata analysis and surveillance.

A simple (to deploy) example of a hidden service based mechanism to significantly hinder exactly this type of surveillance is an XMPP client that also ships with an XMPP server and a Tor hidden service. Such a P2P communication system (where the clients are themselves the servers) is both end-to-end secure, and does not have a single central server where metadata is available. This communication is private, pseudonymous, and does not involve any single central party or intermediary.

My summary (if correct, this is not well known by general Tor/Tor Browser users):

.onions provide protections against fraudulent CAs or malicious certificates issued by state-level adversaries and hinders dragnet surveillance attempting to obtain full meta-data of internet use, for example, communications frequency or contents. However, HSDir profiling tools exist that can harvest IPs connecting to a hidden service, so it is not a foolproof system by any means.

I can add this to the wiki when I clean up that miscellaneous stuff from the hardening thread, if it’s not already noted somewhere.


#10

Can you please elaborate on this? I don't know what you are referring to, and that reference doesn't make me any wiser. Specifically, I don't see how it worsens user anonymity, or server anonymity?


#11

That would be good!

The information on certificate authorities / https could be added in either one. There is some overlap.


#12

Good day,

What these researchers showcased (and made possible via the linked tool) is using hash-based analysis to find the hidden service directories responsible for a specific hidden service. Via this method, they can pinpoint the six directory servers responsible for a given hidden service on a specific date, which a client trying to access said hidden service chooses from on a random basis. Via a brute-force attack, it is possible to use this knowledge to gain responsibility for the directory of a hidden service you'd like to attack. This method can easily be used to deanonymize those trying to access said hidden service, and it is much easier to carry out than more famous methods, which require control of a large chunk of the Tor network to work properly.
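Very roughly, the positioning idea behind such an attack can be sketched as follows (a heavily simplified illustration, not the exact computation from the Tor spec; the hash layout and function names here are approximations of my own):

```python
import hashlib
import os


def descriptor_id(service_id: bytes, time_period: int, replica: int) -> bytes:
    """Roughly: descriptor-id = H(service-id || H(time-period || replica)).
    The result determines where on the hash ring the descriptor lands,
    and it is predictable for future time periods."""
    secret_id_part = hashlib.sha1(
        time_period.to_bytes(4, "big") + bytes([replica])
    ).digest()
    return hashlib.sha1(service_id + secret_id_part).digest()


def brute_force_fingerprint(target: bytes, tries: int = 100_000) -> bytes:
    """Search random identity keys for a fingerprint that sorts just
    after the target descriptor ID on the ring -- the position a
    malicious relay would need in order to become a responsible HSDir."""
    best = None
    for _ in range(tries):
        # Stand-in for generating a fresh relay identity key and
        # taking its fingerprint.
        fp = hashlib.sha1(os.urandom(32)).digest()
        if fp > target and (best is None or fp < best):
            best = fp
    return best


# Hypothetical example values, for illustration only.
target = descriptor_id(b"kkkkkkkkkk63ava6", time_period=17300, replica=0)
fp = brute_force_fingerprint(target)
# fp (if found) is a fingerprint sorting just after the descriptor ID,
# i.e. a ring position an attacker's relay could try to occupy in advance.
```

Because the descriptor ID for a future date is computable in advance, an attacker has plenty of time to grind out suitable identity keys, which is what makes this so much cheaper than attacks requiring a large share of the network.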

The researchers who found this method were actually able to use it to take over four of the six directory servers employed by the Facebook hidden service on the day of their presentation.

That is the reason why, in certain situations, using the "normal TLS-based version" of a public website via Tor is better than using the hidden service.

A way to prevent this would be an overhaul of the current design the Tor network employs, which is being discussed here: https://gitweb.torproject.org/torspec.git/tree/proposals/224-rend-spec-ng.txt Another way would perhaps be to combine TLS and hidden services, though getting a certificate for a hidden service isn't that easy at this point in time.

The tool I linked to on Github actually is the one used for this kind of attack. More information can also be found here: https://conference.hitb.org/hitbsecconf2015ams/sessions/non-hidden-hidden-services-considered-harmful-attacks-and-detection/

Have a nice day,

Ego