sdwdate and sdwdate-gui development thread

Please cherry pick the two following ones. (Last one first.)

[code]commit a39e5780b39a1d3d317337b2e05994c0c1000455
Author: Patrick Schleizer adrelanos@riseup.net
Date: Wed Sep 9 17:26:17 2015 +0000

comment[/code]

[code]commit 6e11cfd12662cf576268b4bc047c7e4f46e0fd05
Author: Patrick Schleizer adrelanos@riseup.net
Date: Wed Sep 9 17:16:40 2015 +0000

timedatectl set-ntp false >/dev/null 2>&1 || true[/code]

GUI looks good.

Now that the log has been improved, it becomes visible that there seems to be a bug: the bad url dtsxnd3ykn32ywv6.onion wasn’t remembered. It was tried twice.

2015-09-10 23:59:15,813 - sdwdate_log - INFO - Signal SIGTERM received. Exiting.
2015-09-10 23:59:16,066 - sdwdate_log - INFO - Fetching remote times, start 1441929556.07
2015-09-10 23:59:16,068 - sdwdate_log - INFO - The clock is sane<br> Current time "Thu Sep 10 23:59:16 UTC 2015"
2015-09-10 23:59:16,068 - sdwdate_log - INFO - Running sdwdate loop, iteration 1
2015-09-10 23:59:16,068 - sdwdate_log - INFO - Requested urls ['dtsxnd3ykn32ywv6.onion', 'w6csjytbrl273che.onion', 'msydqstlz2kzerdg.onion']
2015-09-10 23:59:23,094 - sdwdate_log - INFO - Returned urls "['msydqstlz2kzerdg.onion', 'w6csjytbrl273che.onion', 'dtsxnd3ykn32ywv6.onion']"
2015-09-10 23:59:23,094 - sdwdate_log - INFO - Remote status "msydqstlz2kzerdg.onion", True
2015-09-10 23:59:23,094 - sdwdate_log - INFO - Remote status "w6csjytbrl273che.onion", True
2015-09-10 23:59:23,094 - sdwdate_log - INFO - Remote status "dtsxnd3ykn32ywv6.onion", False: 0x04: Host unreachable
2015-09-10 23:59:23,094 - sdwdate_log - INFO - Pool two: last_url w6csjytbrl273che.onion, web_time 1441929055
2015-09-10 23:59:23,094 - sdwdate_log - INFO - Pool three: last_url msydqstlz2kzerdg.onion, web_time 1441929556
2015-09-10 23:59:23,094 - sdwdate_log - INFO - Running sdwdate loop, iteration 2
2015-09-10 23:59:23,095 - sdwdate_log - INFO - Requested urls ['swdi5ymnwmrqhycl.onion']
2015-09-10 23:59:26,071 - sdwdate_log - INFO - Returned urls "['swdi5ymnwmrqhycl.onion']"
2015-09-10 23:59:26,071 - sdwdate_log - INFO - Remote status "swdi5ymnwmrqhycl.onion", False: Timeout
2015-09-10 23:59:26,071 - sdwdate_log - INFO - Running sdwdate loop, iteration 3
2015-09-10 23:59:26,071 - sdwdate_log - INFO - Requested urls ['dtsxnd3ykn32ywv6.onion']
2015-09-10 23:59:26,072 - sdwdate_log - INFO - Returned urls "['dtsxnd3ykn32ywv6.onion']"
2015-09-10 23:59:26,072 - sdwdate_log - INFO - Remote status "dtsxnd3ykn32ywv6.onion", False: Timeout
2015-09-10 23:59:26,072 - sdwdate_log - INFO - Running sdwdate loop, iteration 4
2015-09-10 23:59:26,072 - sdwdate_log - INFO - Requested urls ['poulsensqiv6ocq4.onion']
2015-09-10 23:59:26,939 - sdwdate_log - INFO - Returned urls "['poulsensqiv6ocq4.onion']"
2015-09-10 23:59:26,939 - sdwdate_log - INFO - Remote status "poulsensqiv6ocq4.onion", True
2015-09-10 23:59:26,939 - sdwdate_log - INFO - Pool one: last_url poulsensqiv6ocq4.onion, web_time 1441929565
2015-09-10 23:59:26,939 - sdwdate_log - INFO - Valid urls:
msydqstlz2kzerdg.onion: "https://ahmia.fi/address/msydqstlz2kzerdg"
w6csjytbrl273che.onion: "Ljost[24][25]  2012-September-30       Transparency Activism   w6csjytbrl273che.onion  https://w6csjytbrl273che.tor2web.org/   Iceland"
poulsensqiv6ocq4.onion: "Wired's Kevin Poulsen  https://pressfreedomfoundation.org/about/tech/kevin-poulsen     poulsensqiv6ocq4.onion"
2015-09-10 23:59:26,939 - sdwdate_log - INFO - Bad urls:
dtsxnd3ykn32ywv6.onion: "BalkanLeaks    https://www.balkanleaks.eu      dtsxnd3ykn32ywv6.onion"
swdi5ymnwmrqhycl.onion: "NRKbeta        https://nrkbeta.no/tips swdi5ymnwmrqhycl.onion"
dtsxnd3ykn32ywv6.onion: "BalkanLeaks    https://www.balkanleaks.eu      dtsxnd3ykn32ywv6.onion"
2015-09-10 23:59:26,939 - sdwdate_log - INFO - Fetching remote times, end 1441929566.94
2015-09-10 23:59:26,944 - sdwdate_log - INFO - Pool differences, sorted: [-508, -7, -1]
2015-09-10 23:59:26,945 - sdwdate_log - INFO - Median time difference: -7
2015-09-10 23:59:26,945 - sdwdate_log - INFO - Seconds to add: + 0.491922837
2015-09-10 23:59:26,945 - sdwdate_log - INFO - New time difference: -6.508077163
2015-09-10 23:59:26,949 - sdwdate_log - INFO - Running sclockaj, PID=8662
2015-09-10 23:59:26,950 - sdwdate_log - INFO - Sleeping for 10 minutes

[hr]

Some thoughts on making the GUI correspond with the logs.

–>

[hr]

–>

“date” must sound confusing to readers who don’t know that it’s a unix command line utility. What do you think about logging the whole cmd? (Referring to “cmd = ‘sudo /bin/date --set @’ + str(new_unixtime)”; move the logging down to after the variable cmd has been set.)
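
For illustration, a minimal sketch of that suggestion. The quoted cmd line is from above and the logger name matches the log output; the rest is made up:

[code]import logging
import subprocess

logger = logging.getLogger('sdwdate_log')

def set_time_using_date(new_unixtime):
    # Build the full command first, then log it, so the log shows exactly
    # what is run instead of only the word "date".
    cmd = 'sudo /bin/date --set @' + str(new_unixtime)
    logger.info('Setting time, running command: %s' % cmd)
    subprocess.check_call(cmd, shell=True)[/code]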


is ending up in logs and passive popups.

  1. have sdwdate + sdwdate-gui running
  2. sudo service sdwdate stop
  3. sudo rm /var/run/sdwdate/*
  4. sudo service sdwdate start

From then on, sdwdate-gui no longer updates. It stays in the “sdwdate stopped, signal SIGTERM received” state.

Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/sdwdate_gui/sdwdate_gui.py", line 147, in status_changed
    with open(self.status_path, 'rb') as f:
IOError: [Errno 2] No such file or directory: '/var/run/sdwdate/status'
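
A minimal sketch, as an assumption about one possible fix (not the actual code), of how status_changed could tolerate the missing status file; self.status_path is taken from the traceback, the follow-up call is made up:

[code]def status_changed(self):
    try:
        with open(self.status_path, 'rb') as f:
            status = f.read()
    except IOError:
        # File was removed (e.g. "sudo rm /var/run/sdwdate/*"); keep the last
        # known state and wait for sdwdate to recreate the file.
        return
    self.update_gui(status)  # hypothetical follow-up, for illustration only[/code]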

The timeout is too short. Getting a lot of timeouts. But during development, it’s actually very good for testing.

[hr]

The following log has some problems.

2015-09-11 00:20:27,220 - sdwdate_log - INFO - Signal SIGTERM received. Exiting.
2015-09-11 00:20:43,074 - sdwdate_log - INFO - Fetching remote times, start 1441930843.07
2015-09-11 00:20:43,076 - sdwdate_log - INFO - The clock is sane<br> Current time "Fri Sep 11 00:20:43 UTC 2015"
2015-09-11 00:20:43,076 - sdwdate_log - INFO - Running sdwdate loop, iteration 1
2015-09-11 00:20:43,076 - sdwdate_log - INFO - Requested urls ['33y6fjyhs3phzfjj.onion', 'abkjckdgoabr7bmm.onion', 'bitmailendavkbec.onion']
2015-09-11 00:20:47,646 - sdwdate_log - INFO - Returned urls "['abkjckdgoabr7bmm.onion', 'bitmailendavkbec.onion', '33y6fjyhs3phzfjj.onion']"
2015-09-11 00:20:47,646 - sdwdate_log - INFO - Remote status "abkjckdgoabr7bmm.onion", True
2015-09-11 00:20:47,646 - sdwdate_log - INFO - Remote status "bitmailendavkbec.onion", True
2015-09-11 00:20:47,646 - sdwdate_log - INFO - Remote status "33y6fjyhs3phzfjj.onion", False: 0x04: Host unreachable
2015-09-11 00:20:47,646 - sdwdate_log - INFO - Pool two: last_url abkjckdgoabr7bmm.onion, web_time 1441930850
2015-09-11 00:20:47,646 - sdwdate_log - INFO - Pool three: last_url bitmailendavkbec.onion, web_time 1441930849
2015-09-11 00:20:47,646 - sdwdate_log - INFO - Running sdwdate loop, iteration 2
2015-09-11 00:20:47,647 - sdwdate_log - INFO - Requested urls ['y6xjgkgwj47us5ca.onion']
2015-09-11 00:20:53,081 - sdwdate_log - INFO - Returned urls "['y6xjgkgwj47us5ca.onion']"
2015-09-11 00:20:53,081 - sdwdate_log - INFO - Remote status "y6xjgkgwj47us5ca.onion", False: Timeout
2015-09-11 00:20:53,081 - sdwdate_log - INFO - Running sdwdate loop, iteration 3
2015-09-11 00:20:53,081 - sdwdate_log - INFO - Requested urls ['v6gdwmm7ed4oifvd.onion']
2015-09-11 00:20:53,081 - sdwdate_log - INFO - Returned urls "['v6gdwmm7ed4oifvd.onion']"
2015-09-11 00:20:53,082 - sdwdate_log - INFO - Remote status "v6gdwmm7ed4oifvd.onion", False: Timeout
2015-09-11 00:20:53,082 - sdwdate_log - INFO - Running sdwdate loop, iteration 4
2015-09-11 00:20:53,082 - sdwdate_log - INFO - Requested urls ['33y6fjyhs3phzfjj.onion']
2015-09-11 00:20:57,825 - sdwdate_log - INFO - Returned urls "['33y6fjyhs3phzfjj.onion']"
2015-09-11 00:20:57,826 - sdwdate_log - INFO - Remote status "33y6fjyhs3phzfjj.onion", False: 0x04: Host unreachable
2015-09-11 00:20:57,826 - sdwdate_log - INFO - Running sdwdate loop, iteration 5
2015-09-11 00:20:57,826 - sdwdate_log - INFO - Requested urls ['hkjpnjbvhrxjvikd.onion']
2015-09-11 00:21:07,836 - sdwdate_log - WARNING - No values returned from url_to_unixtime.<br> Retrying...
2015-09-11 00:21:07,837 - sdwdate_log - INFO - Sleeping for 0.1 minutes
2015-09-11 00:21:13,846 - sdwdate_log - INFO - Fetching remote times, start 1441930873.85
2015-09-11 00:21:13,847 - sdwdate_log - INFO - The clock is sane<br> Current time "Fri Sep 11 00:21:13 UTC 2015"
2015-09-11 00:21:13,847 - sdwdate_log - INFO - Running sdwdate loop, iteration 1
2015-09-11 00:21:13,847 - sdwdate_log - INFO - Requested urls ['znig4bc5rlwyj4mz.onion', '5r4bjnjug3apqdii.onion', 'timaq4ygg2iegci7.onion']
2015-09-11 00:21:18,870 - sdwdate_log - INFO - Returned urls "['znig4bc5rlwyj4mz.onion', 'timaq4ygg2iegci7.onion', '5r4bjnjug3apqdii.onion']"
2015-09-11 00:21:18,871 - sdwdate_log - INFO - Remote status "znig4bc5rlwyj4mz.onion", True
2015-09-11 00:21:18,876 - sdwdate_log - INFO - Remote status "timaq4ygg2iegci7.onion", True
2015-09-11 00:21:18,876 - sdwdate_log - INFO - Remote status "5r4bjnjug3apqdii.onion", True
2015-09-11 00:21:18,877 - sdwdate_log - INFO - Pool one: last_url znig4bc5rlwyj4mz.onion, web_time 1441930879
2015-09-11 00:21:18,877 - sdwdate_log - INFO - Pool two: last_url 5r4bjnjug3apqdii.onion, web_time 1441930883
2015-09-11 00:21:18,877 - sdwdate_log - INFO - Pool three: last_url timaq4ygg2iegci7.onion, web_time 1441930879
2015-09-11 00:21:18,878 - sdwdate_log - INFO - Valid urls:
znig4bc5rlwyj4mz.onion: "ExposeFacts    https://exposefacts.org znig4bc5rlwyj4mz.onion"
timaq4ygg2iegci7.onion: "https://github.com/meejah/txtorcon http://txtorcon.readthedocs.org"
5r4bjnjug3apqdii.onion: "Irpileaks[29][30]      2013-October-7  Investigative Journalism        5r4bjnjug3apqdii.onion  https://5r4bjnjug3apqdii.tor2web.org/   Italy"
2015-09-11 00:21:18,878 - sdwdate_log - INFO - Bad urls:
2015-09-11 00:21:18,878 - sdwdate_log - INFO - Fetching remote times, end 1441930878.88
2015-09-11 00:21:18,896 - sdwdate_log - INFO - Pool differences, sorted: [1, 1, 5]
2015-09-11 00:21:18,897 - sdwdate_log - INFO - Median time difference: 1
2015-09-11 00:21:18,897 - sdwdate_log - INFO - Seconds to add: - 0.573118188
2015-09-11 00:21:18,897 - sdwdate_log - INFO - New time difference: 0.426881812
2015-09-11 00:21:18,897 - sdwdate_log - INFO - Setting time using date.
2015-09-11 00:21:18,897 - sdwdate_log - INFO - Old unixttime: 1441930878.9
2015-09-11 00:21:18,897 - sdwdate_log - INFO - New unixtime 1441930879.32
2015-09-11 00:21:19,321 - sdwdate_log - INFO - Sleeping for 10 minutes
  • “WARNING - No values returned from url_to_unixtime.” - How can this happen?
  • It already had two of three valid ones. But then gave up, started fresh and got three fresh ones.
  • Didn’t list any bad urls. But that’s probably just a follow-up error.

Other issues:

  • no way to see which time server returned which time diff
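
Regarding the last point, a rough sketch (variable and function names are made up) of logging which url produced which difference:

[code]import time

def log_per_url_diffs(urls, web_times, logger):
    # Pair every url with the difference between its reported time and the local clock.
    now = time.time()
    for url, web_time in zip(urls, web_times):
        logger.info('%s: time difference %+.2f seconds' % (url, web_time - now))[/code]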

Currently the code flow in url_to_unixtime.py is difficult to grasp and follow.

It was a lot nicer when there was a main function like in the old url_to_unixtime. See highlighted code:

Worked on this…

work on refactoring and passing back errors in url_to_unixtime.py back to sdwdate
https://github.com/adrelanos/sdwdate/commit/c1d756bd1e3f9cd902a45e840fde0dfd8f753857

What do you think about that approach generally?

I guess this would also solve the “WARNING - No values returned from url_to_unixtime.” case. And be a lot nicer, because issues with time servers would end up in the sdwdate log.

There are still issues to be sorted out.

  • “raise” - I am not happy with that yet. It’s just a shortcut for “exit this thread”.
  • “print >> sys.stderr” is not very useful. Should be converted to “error =”, passed to “raise” so it gets passed back to sdwdate and logged (see the sketch after this list).
  • Needs testing. It is yet to be tested whether the failure of each and every function is correctly passed back.
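
A minimal sketch of the second bullet, with hypothetical helper names, just to illustrate the direction:

[code]def parse_remote_time(http_header):
    # Instead of "print >> sys.stderr", build an error string and raise it,
    # so the thread ends and sdwdate can log the reason.
    date_field = extract_date_field(http_header)   # hypothetical helper
    if date_field is None:
        error = 'no Date field found in the HTTP header'
        raise ValueError(error)                    # passed back to sdwdate and logged
    return to_unixtime(date_field)                 # hypothetical helper[/code]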

[quote=“Patrick, post:126, topic:1137”]Currently the code flow in url_to_unixtime.py is difficult to grasp and follow.

It was a lot nicer, when there was a main function like in the old ur_to_unixtime. See highlighted code:

Worked on this…

What do you think about that approach generally?

I guess this would also solve the “WARNING - No values returned from url_to_unixtime.” case. And be a lot nicer, because issues with time servers would end up in the sdwdate log.[/quote]
Yes, the next big thing is to find a way to pass back url_to_unixtime errors to sdwdate. It cannot be left as is. Your approach is one step ahead.

Also, while we are at it, the plan is to detach url_to_unixtime, making it an executable, so that we can AppArmor it separately, and perhaps apply further hardening with seccomp, pypy.sandbox or whatever becomes available.

Sounds good!

In that case, can we reuse the old single-threaded url_to_unixtime.py CLI version? Restore this one?

It does everything right. It writes errors to stderr and exits non-zero; otherwise it writes to stdout and exits zero.

And transform usr/lib/python2.7/dist-packages/sdwdate/url_to_unixtime.py into a small Python gevent wrapper (perhaps rename it) that runs the single-threaded url_to_unixtime.py CLI version?

[quote]There are still issues to be sorted out.

  • “raise” - I am not happy with that yet. It’s just a shortcut for “exit this thread”.
  • “print >> sys.stderr” is not very useful. Should be converted to “error =”, passed to “raise” so it gets passed back to sdwdate and logged.
  • Needs testing. It is yet to be tested whether the failure of each and every function is correctly passed back.[/quote]
"raise" is not required. By keeping the structure without the utu_main function, url is written in ulr[] list, the error in unix_times[] list, followed by return. This is what has been done from the beginning in request_data_from_remote_server. No change in sdwdate, errors are logged like timeout or "Host unreachable". https://github.com/troubadoour/sdwdate/commit/d4e6b015cd4c59d87747f781b9ac867e40bbd86b

At the moment, only the general error is logged, but we can make it as verbose as necessary.

Added some commented-out lines for testing each function (tested OK).

“No values returned from url_to_unixtime.” is still popping up once in a while without any error reported. At first it looked like it was linked to a specific url, but that was a coincidence.

[quote]And transform usr/lib/python2.7/dist-packages/sdwdate/url_to_unixtime.py into a small Python gevent wrapper (perhaps rename it) that runs the single-threaded url_to_unixtime.py CLI version?[/quote]
That would be nice, leaving url_to_unixtime working by its name exclusively. However, I’m not sure gevent can handle threads with external calls.

Without utu_main [or a better name], the code is difficult to grasp. There is a poor overview of the order in which functions are called / call each other. The functions don’t just do what they say, but also some “extra”.

I think ‘raise’ is one way to exit the thread.

We need proper exception handling in the main thread anyhow, because for example “s.recv” could fail. Perhaps “s.send” could fail too? Or something else could fail? In these cases, it’s best if the main thread handles the exceptions and passes them back to sdwdate, so they get logged there.
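
A minimal sketch of that kind of handling, with illustrative names only (in the real code the connection goes through Tor’s SocksPort, which is omitted here); any failure becomes a string handed back for logging:

[code]import socket

def fetch_http_header(host, port, request):
    try:
        s = socket.create_connection((host, port), timeout=30)
        try:
            s.sendall(request)
            return s.recv(4096), None
        finally:
            s.close()
    except (socket.error, socket.timeout) as e:
        # connect, send or recv failed; return the reason instead of crashing the thread
        return None, 'connection to %s failed: %s' % (host, e)[/code]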

Logging the full error has value. I think something can be learned from this. Otherwise, analyzing exactly what went wrong, attack analysis, and reporting issues to server operators become more cumbersome.

We can kill two birds with one stone and fix all of these issues at once… if we can make gevent run the old external standalone url_to_unixtime.py. A quick web search didn’t indicate that gevent can’t do it. On the contrary.
http://www.gevent.org/gevent.subprocess.html
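
For what it is worth, a minimal sketch of what such a wrapper could look like; the helper path and its command line arguments are assumptions for illustration, not the actual interface:

[code]import gevent
from gevent import subprocess   # gevent's cooperative subprocess module

def fetch_unixtime(url):
    # Arguments are assumed: socks host, socks port, onion address, remote port.
    p = subprocess.Popen(['/usr/lib/sdwdate/url_to_unixtime',
                          '127.0.0.1', '9050', url, '80'],
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = p.communicate()
    if p.returncode == 0:
        return url, stdout.strip(), None    # unixtime as text
    return url, None, stderr.strip()        # error message

urls = ['msydqstlz2kzerdg.onion', 'w6csjytbrl273che.onion']
jobs = [gevent.spawn(fetch_unixtime, u) for u in urls]
gevent.joinall(jobs)
print([job.value for job in jobs])[/code]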

[quote]Without utu_main [or a better name], the code is difficult to grasp. There is a poor overview of the order in which functions are called / call each other. The functions don't just do what they say, but also some "extra".[/quote]
Without utu_main, each function is calling the one above itself. I’m not sure it’s harder to grasp than all the functions being called from a main(). But the issue is not there. I do not remember the exact reason for changing the structure of the script, but if we use utu_main as in https://github.com/adrelanos/sdwdate/commit/c1d756bd1e3f9cd902a45e840fde0dfd8f753857, some of the function tests work (surprisingly), some don't. Where they do not, the cycle ends with "No values returned from url_to_unixtime." or "Time is not set: no valid time returned from pool".

“raise” without an argument should raise the last exception on the Python exception stack and leave the thread. In the script we have some home-made errors and some Python exceptions. I guess we should raise a proper custom exception in each case and catch it in sdwdate. That seems overkill when we have a smooth pass-through system which can reuse the same mechanics for everything, including “s.send” or “s.recv”. An exception/error in any function stops there and returns to the original calling function, url_to_unixtime.

[quote]We can kill two birds with one stone and fix all of these issues at once... if we can make gevent run the old external standalone url_to_unixtime.py. A quick web search didn't indicate that gevent can't do it. On the contrary. http://www.gevent.org/gevent.subprocess.html[/quote]
Looking into it. But the old external standalone url_to_unixtime.py is only writing to stdout (for sdwdate bash) or stderr. I guess we would still have to modify it to return the data, as well as the errors.

The old external standalone url_to_unixtime.py writes only the unixtime to stdout. Everything else it writes to stderr, and those are only errors. [Or output when using verbosity.]

So if it exits 0, just use stdout as the data. Otherwise, if it exits non-zero, use stderr as the error message.

I was slightly behind your reasoning but I have caught up, eventually :slight_smile:

url_to_unixtime is detached as an executable in /usr/lib/sdwdate.

A new script “remote_times” runs the url_to_unixtime threads (gevent wrapper).

Reusing the old standalone url_to_unixtime, with two changes:

  • everything, including errors, is written to stdout.
  • on errors, the script still exits normally.

The reason is the use of Popen with gevent. It does not check the exit code and there is no need to discriminate data from errors since everything is passed back to sdwdate (verbosely for the errors).

Have implemented the function tests in url_to_unixtime. Tested OK.

Bonus: no more “No values returned from url_to_unixtime”. So this setup seems more robust, and faster too (but that could be my imagination).

Commit: reorganize logging and icon/message in main loop · troubadoour/sdwdate@becab9f · GitHub

Reorganize logging in main loop, with HTML stripping.
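
A small sketch of the HTML stripping idea (assumed approach, not necessarily the committed one): drop tags such as “<br>” from the status messages before writing them to the plain-text log.

[code]import re

def strip_html(message):
    return re.sub(r'<[^>]+>', '', message)

strip_html('The clock is sane<br> Current time "Thu Sep 10 23:59:16 UTC 2015"')
# -> 'The clock is sane Current time "Thu Sep 10 23:59:16 UTC 2015"'[/code]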

Still missing in your branch:

[quote=“Patrick, post:121, topic:1137”]Please cherry pick the two following ones. (Last one first.)

[code]commit a39e5780b39a1d3d317337b2e05994c0c1000455
Author: Patrick Schleizer adrelanos@riseup.net
Date: Wed Sep 9 17:26:17 2015 +0000

comment[/code]

[code]commit 6e11cfd12662cf576268b4bc047c7e4f46e0fd05
Author: Patrick Schleizer adrelanos@riseup.net
Date: Wed Sep 9 17:16:40 2015 +0000

timedatectl set-ntp false >/dev/null 2>&1 || true[/code][/quote]

Due to etc/xdg/autostart/sdwdate-gui.desktop using ‘sudo -u sdwdate’, sdwdate-gui needs to ‘Depends:’ on sdwdate.

Your commits were cherry picked and pushed a few days ago:

Some commits.

Logging:

Misc:

url_to_unixtime AppArmor profile (one abstraction, one line, the shortest so far). url_to_unixtime AppArmor profile · troubadoour/sdwdate@608f9c0 · GitHub

Now that the ‘No values returned from url_to_unixtime’ error has gone, the retry loop is no longer required. I think it can be replaced by a standard error in case nothing is returned (probably asking to report a bug).

With the new setup, when the internet is down, all the connections time out. Will implement the detection.
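
A hypothetical sketch of such a detection (names made up): if every returned status is a timeout, report that the connection appears to be down instead of blaming individual urls.

[code]def connection_seems_down(remote_statuses):
    # remote_statuses: list of status strings as logged, e.g. 'Timeout'
    return bool(remote_statuses) and all(status == 'Timeout' for status in remote_statuses)[/code]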

Added sdwdate to “Depends” in sdwdate-gui.

Works for me.

[hr]

->

[hr]

->

(So it can be more easily compared to the old unixtime.)

[hr]

(Removing value statements about urls that might be misunderstood.)

->

[hr]

->

I am not happy with “unreachable”. But “bad” implies “should not trust/visit that page”.