Because of `wget` security issues and `curl`'s lack of strict SSL by default, Whonix provides the `scurl` wrapper script, which invokes `curl` with a few prepared command-line parameters, allowing simple secure downloads.
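A minimal sketch of what such a wrapper might look like — the exact flags here are assumptions for illustration, not the literal Whonix script, and the function only prints the curl command line it would run:

```shell
#!/bin/sh
# Hypothetical sketch of an scurl-style wrapper (assumed flags):
# prepend hardened defaults, then forward the user's arguments unchanged.
#   --fail          exit non-zero on HTTP errors instead of saving an error page
#   --tlsv1.2       refuse TLS versions older than 1.2
#   --proto =https  allow only the https protocol (no http, ftp, ...)
scurl_sketch() {
    echo curl --fail --tlsv1.2 --proto =https "$@"
}

# Dry run: print the curl command line that would be executed.
scurl_sketch https://check.torproject.org
```

Everything after the prepared defaults is whatever the user typed, so the wrapper stays a transparent drop-in for `curl`.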
It was a nice approach, until the T673 Phabricator task was resolved by this commit:

> add --remote-name so scurl can be used as wget replacement

And here the problem comes: this parameter forces saving the output to a file and cannot be canceled by subsequent parameters if a user ever wants to prevent that.
This new behavior broke a lot of instructions and examples in the Whonix Wiki that use `scurl`, as well as any manual replacement of `curl` with `scurl` for better security.
E.g. it broke the test of per-process stream isolation:

```shell
scurl https://check.torproject.org | grep IP
```
Now it even produces an error: it can't save the output to a file because it can't derive a filename from the URL. So the breakage has spread beyond the `scurl` Wiki page (where all `scurl` commands are broken first and foremost, of course).
It broke the whole philosophy of the original approach, reflected in the wrapper's name: `scurl` == "secure curl", meaning a transparent replacement for `curl` with a stricter protocol policy only. Automatic saving to a file is completely new, additional functionality outside this concept, and should be solved another way.
And it was! Later the `scurl-download` wrapper was added (together with `curl-download`), which adds the same `--remote-name` parameter (along with `--location` to follow redirects). Corresponding documentation has also been added to the Whonix Wiki.
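The download variant can be sketched the same way — again the flag set is an assumption for illustration, not the literal Whonix script, shown as a dry run:

```shell
#!/bin/sh
# Hypothetical sketch of the scurl-download idea (assumed flags):
# the same strict defaults as the scurl sketch, plus the two
# download-oriented options.
#   --location     follow redirects
#   --remote-name  save output under the filename taken from the URL
scurl_download_sketch() {
    echo curl --fail --tlsv1.2 --proto =https --location --remote-name "$@"
}

# Dry run: print the curl command line that would be executed.
scurl_download_sketch https://example.com/file.tar.gz
```

This is the right home for file auto-saving: the main wrapper stays pipe-friendly, and the `-download` variant opts into writing files.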
But this parameter still remains in the main `scurl` wrapper as well.
Please fix this.
- Consider removing the `--remote-name` parameter from the main `scurl` wrapper.
- Maybe consider renaming `curl-download`, or adding symlinks with short and simple names like `curlget` — in case a short replacement for the `wget` command is needed, and the long dashed name is inconvenient for regular users.
- Consider improving the download wrappers: replace `--remote-name` with `--remote-name-all`, which activates the same behavior for all given URLs (useful for multiple downloads) and also allows canceling it for any single URL (with `--no-remote-name`). Also consider adding the `--remote-header-name` parameter, which tells curl to try the server-specified filename first (via the `Content-Disposition` header) and fall back to extracting it from the URL only on failure; this allows downloading dynamically generated files, "attachments", etc.
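A sketch of that last suggestion — the wrapper name and flag set are assumptions, not the actual Whonix implementation, and the function only prints the command line it would run:

```shell
#!/bin/sh
# Hypothetical revision of a curl-download-style wrapper.
#   --remote-name-all      save every given URL to a file by default
#   --no-remote-name       cancels that for an individual URL
#   --remote-header-name   prefer the filename from the server's
#                          Content-Disposition header over one parsed
#                          from the URL
curl_download_sketch() {
    echo curl --location --remote-name-all --remote-header-name "$@"
}

# Dry run: save the first URL under its server-supplied name, but stream
# the second one to stdout by canceling the remote-name behavior for it.
curl_download_sketch https://example.com/a.tar.gz --no-remote-name https://check.torproject.org
```

Per the curl manual, after `--remote-name-all` a specific URL can be excluded with `--no-remote-name` (or `-o -`), which is what makes pipe-style usage possible again even in the download wrapper.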