scurl wrapper problem

To work around wget's security issues and curl's lack of strict SSL by default, Whonix provides the scurl wrapper script, which invokes curl with a few preset command-line parameters, allowing simple secure downloads.
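As a rough illustration, such a wrapper can be a tiny shell function that hardens curl's defaults. This is only a sketch: the exact flags here are an assumption, not copied from the actual Whonix script.

```shell
# Hypothetical sketch of an scurl-style wrapper (assumed flags, not
# the real Whonix script): restrict curl to HTTPS only, require
# modern TLS, and fail on HTTP errors instead of saving error pages.
scurl() {
    curl --fail --proto =https --tlsv1.2 "$@"
}
```

The point of the original design is visible here: nothing in the wrapper decides where output goes, so it pipes exactly like plain curl.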

This was a nice approach, until the Phabricator task T673 was resolved by this commit:
"add --remote-name so scurl can be used as wget replacement"
And here the problem arises: this parameter forces saving output to a file, and it cannot be canceled by subsequent parameters if a user ever wants to prevent this.

This new behavior broke a lot of instructions and examples in the Whonix Wiki that use scurl, as well as any manual replacement of curl with scurl for better security.
For example, it broke the test of per-process stream isolation:

scurl https://check.torproject.org | grep IP

Now it even produces an error: it cannot save the output to a file because it cannot derive a filename from the URL.
So the breakage has spread beyond the scurl Wiki page itself (where all scurl commands are broken first and foremost, of course).

It broke the whole philosophy of the original approach, reflected in the wrapper's name:
"scurl" == "secure curl", which meant a transparent replacement for curl with a stricter protocol policy only. Automatic file saving is entirely new, additional functionality outside this concept, and should be solved another way.

And it was! A scurl-download wrapper was later added (together with curl-download), which adds the same --remote-name parameter (along with --location to follow redirects). Corresponding documentation has also been added to the Whonix Wiki.
But this parameter still remains in the main scurl wrapper as well :confused:
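Based on the description above, scurl-download is essentially the same hardened invocation plus the download-oriented flags. Again a sketch under assumptions: the real script's flag set may differ, and the function is named scurl_download here only because dashes are not valid in POSIX function names.

```shell
# Hypothetical sketch of scurl-download (assumed flags): the same
# HTTPS-only hardening, plus --location to follow redirects and
# --remote-name to save the file under its URL-derived name.
scurl_download() {
    curl --fail --proto =https --tlsv1.2 --location --remote-name "$@"
}
```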

Please fix this.

  • Consider removing the --remote-name parameter from the main scurl wrapper.
  • Maybe consider renaming scurl-download / curl-download, or adding symlinks with short, simple names like scurld / curld or scurlget / curlget, for cases where a short replacement for the wget command is needed and a long dashed name is inconvenient for regular users.
  • Consider improving scurl-download / curl-download wrappers:
    • Replace the --remote-name parameter with --remote-name-all, which applies the same behavior to all given URLs (for multiple downloads) and also allows canceling it for any single URL (via "-o -" or --no-remote-name).
    • Add the --remote-header-name parameter, which tells curl to first try the server-specified filename (via the Content-Disposition header) and to fall back to extracting it from the URL only on failure; this allows downloading dynamically generated files, "attachments", etc.

short and simple names

durl / sdurl ?
cURL stands for “client URL”,
dURL for “download URL”.


Could you please send pull requests?


(Symlinks can be sorted out soon by me [or anyone] through the file debian/scurl.links.)

I sent a pull request:

I also filed a tracking task on Phabricator:

Not sure if I did everything right; this is my first contribution…

As for symlinks or names, I leave that to you, dear developers, because I don't know how to do it properly or which names to choose. I marked this on Phabricator as well.
Also, as far as I can see, some changes are still needed anyway, such as bumping the changelog, etc.


Symlinks sorted.



Don’t worry too much about formalities. This one was rather easy and quick to review and merge. The more difficult a change is, the more time it needs and the more it piles up on the stack, the more useful a ticket becomes as a reminder.


Thanks! I’m glad to give at least a little help to this amazing project, and thank you for continuing to work on it! :+1:
I hope that next time I notice something to fix, I can do it again instead of just working around it, as I usually do when busy with my own things :sweat_smile: And that time it will be easier, since I’ve learned a lot.
