To work around wget's security issues and curl's lack of strict SSL checking by default, Whonix provides the scurl wrapper script, which invokes curl with a few preset command-line parameters, allowing simple, secure downloads.
It was a nice approach, until the T673 Phabricator task was resolved by this commit: "add --remote-name so scurl can be used as wget replacement".
And here the problem comes: this parameter forces output to be saved to a file and cannot be canceled by subsequent parameters if a user wants to prevent that.
This new behavior broke many instructions and examples in the Whonix Wiki that use scurl, as well as any manual replacement of curl with scurl for better security.
For example, it broke the test of per-process stream isolation:
scurl https://check.torproject.org | grep IP
Now it even produces an error: it can't save the output to a file because it can't derive a filename from the URL.
So the breakage has now spread beyond the scurl Wiki page (where all scurl commands are broken first and foremost, of course).
It broke the whole philosophy of the original approach, reflected in the wrapper's name:
"scurl" == "secure curl", which meant a transparent replacement for curl with a stricter policy for protocols only. Automatic file saving is completely new, additional functionality outside of this concept, and should be solved another way.
And it was! A scurl-download wrapper was later added (together with curl-download), which adds the same --remote-name parameter (along with --location to follow redirects). Corresponding documentation has also been added to the Whonix Wiki.
But this parameter still remains in the main scurl wrapper as well.
Please fix this.
Consider removing the --remote-name parameter from the main scurl wrapper.
Maybe consider renaming scurl-download / curl-download, or adding symlinks with short and simple names like scurld / curld or scurlget / curlget, in case a short replacement for the wget command is needed; the long name with a dash is inconvenient for regular users here.
Replace the --remote-name parameter with --remote-name-all, which activates the same behavior for all given URLs (for multiple downloads) and also allows canceling it for any URL (via "-o -" or --no-remote-name).
Add the --remote-header-name parameter, which tells curl to try the server-specified filename first (via the Content-Disposition header) and fall back to extracting one from the URL only on failure; this allows downloading dynamically generated files, "attachments", etc.
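Taken together, the suggestion could look like this. A minimal sketch only: the build_cmd helper is hypothetical and omits scurl's actual hardening flags; it just prints the command it would run, so the flag combination can be inspected without network access.

```shell
#!/bin/bash
# Hypothetical sketch of the proposed scurl-download invocation.
# --remote-name-all    : save every given URL to a file (wget-like behavior)
# --remote-header-name : prefer the server's Content-Disposition filename
# --location           : follow redirects
# Unlike the singular --remote-name, a caller can still cancel saving
# for any one URL with "-o -" or --no-remote-name.
build_cmd() {
  printf '%s\n' curl --remote-name-all --remote-header-name --location "$@"
}

build_cmd https://example.com/file   # prints the command, one word per line
```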
Not sure if I did everything right; this is my first contribution…
As for symlinks or names, I leave that to you, dear developers, since I don't know how to do it properly or which names to choose. I marked this on Phabricator as well.
Also, as far as I can see, some changes are still needed anyway, like bumping the changelog, etc.
Don't worry too much about formalities. This one was rather easy and quick to review and merge. The more difficult a change is, the more time it needs; and the more there is on the stack, the more useful a ticket becomes as a reminder.
Thanks! I'm glad to give at least a little help to this amazing project, and thank you for continuing to work on it!
I hope that the next time I notice something worth fixing, I can do this again instead of just working around it, as I usually do when busy with my own things. And next time it will be easier, since I have learned a lot.
Too heavy for the Whonix project. This should be contributed to the Debian project (or perhaps to curl, if they would accept it). Once available there, it could be enabled by default in scurl.
In usr/bin/curl-download line 17:
opt="$opt \"$res\""
^-^ SC2089: Quotes/backslashes will be treated literally. Use an array.
In usr/bin/curl-download line 39:
echo curl --location $opt
                     ^--^ SC2090: Quotes/backslashes in this variable will not be respected.
                     ^--^ SC2086: Double quote to prevent globbing and word splitting.
"--invalid x" isn't correctly passed.
It adds a "--remote-name x" where there was none before.
Therefore it produces a different error message compared to non-wrapped curl.
curl-download can't parse --invalid because that option is not known to curl.
Therefore, it also can't know that x belongs to --invalid. By default, curl would also treat x as a URL, but the purpose of scurl-download is to treat every positional argument that is not an option's argument as a URL.
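The ambiguity can be shown with a tiny sketch. The classify function below is hypothetical, not actual wrapper code: it mimics a wrapper that, lacking curl's own option table, must treat every non-option argument as a URL.

```shell
#!/bin/bash
# A naive classifier, as a wrapper without curl's option table must be:
# anything starting with "-" is an option, everything else "must be" a URL.
classify() {
  for a in "$@"; do
    case "$a" in
      -*) echo "option: $a" ;;
      *)  echo "url: $a" ;;   # may really be an option's argument
    esac
  done
}

classify --invalid x
# x is reported as a URL, so the wrapper wrongly applies its URL
# handling (adding --remote-name) to it.
```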
So I can't fix it. Feel free to close the PR if you want.