scurl - secure curl wrapper

To work around wget's security issues and curl's lack of strict SSL defaults, Whonix provides the scurl wrapper script, which invokes curl with a few prepared command-line parameters, allowing simple, secure downloads.
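As a rough offline sketch: the wrapper's effect can be approximated by pinning curl to HTTPS and a modern TLS version. The exact flag set below is an assumption about what a strict-SSL wrapper would pass, not the script's verbatim source:

```shell
#!/bin/bash
# Hypothetical "secure curl" sketch: restrict transfers to HTTPS and
# require TLS 1.2+. The flag choice is assumed, not copied from scurl.
scurl_sketch() {
  curl --tlsv1.2 --proto =https "$@"
}

# With --proto =https, curl rejects a non-HTTPS URL up front, before
# any network traffic happens:
scurl_sketch --silent "http://example.invalid/" || echo "plain http rejected"
```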

It was a nice approach, until the Phabricator task T673 was resolved by this commit:
add --remote-name so scurl can be used as wget replacement
And here the problem begins: this parameter forces saving to a file and cannot be canceled by subsequent parameters if a user ever wants to prevent that.

This new behavior broke a lot of instructions and examples in the Whonix Wiki that use scurl, as well as any manual replacement of curl with scurl for better security.
E.g. it broke the test of per-process stream isolation:

scurl | grep IP

Now it even produces an error: it can’t save the output to a file because it can’t derive a filename from the URL.
So the breakage has spread beyond the scurl Wiki page (where, of course, all scurl commands are broken first and foremost).
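The pipe breakage is easy to reproduce offline with plain curl and the same --remote-name flag, using a throwaway file:// URL in place of the check page:

```shell
#!/bin/bash
# --remote-name redirects the response body into a file named after
# the URL, so a downstream pipe receives nothing.
src=$(mktemp -d)
printf 'IP 10.0.0.1\n' > "$src/page"

work=$(mktemp -d)
cd "$work" || exit 1

# Without --remote-name: the body goes to stdout, grep sees it.
curl --silent "file://$src/page" | grep IP

# With --remote-name (what the commit added to scurl): the body is
# written to ./page instead, and grep gets an empty stream.
curl --silent --remote-name "file://$src/page" | grep IP || echo "pipe is empty"
```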

It broke the whole philosophy of the original approach, reflected in the wrapper’s name:
“scurl” == “secure curl”, meaning a transparent replacement for curl with a stricter policy for protocols only. Automatic file saving is completely new, additional functionality outside of this concept, and should be solved another way.

And it was! Later the scurl-download wrapper was added (together with curl-download), which adds the same --remote-name parameter (along with --location to follow redirects). Corresponding documentation has also been added to the Whonix Wiki.
But the parameter still remains in the main scurl wrapper as well :confused:

Please fix this.

  • Consider removing the --remote-name parameter from the main scurl wrapper.
  • Maybe consider renaming scurl-download / curl-download, or adding symlinks with short, simple names like scurld / curld or scurlget / curlget — in case a short replacement for the wget command is needed, and the long dashed name is inconvenient for regular users.
  • Consider improving scurl-download / curl-download wrappers:
    • Replace the --remote-name parameter with --remote-name-all, which enables the same behavior for all given URLs (useful for multiple downloads) and also allows canceling it for any individual URL (via "-o -" or --no-remote-name).
    • Add the --remote-header-name parameter, which tells curl to try the server-specified filename first (via the Content-Disposition header) and fall back to extracting it from the URL only on failure — this allows downloading dynamically generated files, “attachments”, etc.
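The suggested flags can be tried offline against file:// URLs; everything below is stock curl behavior (only the idea of putting it in the wrapper is the proposal). Note that --remote-header-name acts on HTTP's Content-Disposition header, so it has no observable effect on file:// and is left out of the demo:

```shell
#!/bin/bash
# Demonstrate --remote-name-all and its per-URL cancellation with
# local file:// URLs (no network needed).
src=$(mktemp -d)
printf 'one\n' > "$src/a.txt"
printf 'two\n' > "$src/b.txt"

work=$(mktemp -d)
cd "$work" || exit 1

# --remote-name-all applies --remote-name to every given URL:
curl --silent --remote-name-all "file://$src/a.txt" "file://$src/b.txt"
ls a.txt b.txt

# ...but an explicit "-o -" still wins for a URL, streaming it to
# stdout instead of saving it:
curl --silent --remote-name-all -o - "file://$src/a.txt"
```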

short and simple names

durl / sdurl ?
cURL stands for “client URL”,
dURL for “download URL”.


Could you please send pull requests?

(Symlinks can be sorted soon by me [or anyone] via the file debian/scurl.links.)

I sent a pull request:

Also I put a tracking task on Phabricator:

Not sure if I did everything right, this is my first contribution…

As for symlinks or names - I leave that to you, dear developers, because I don’t know how to do it properly or which names to choose. I noted this on Phabricator as well.
Also, as far as I can see, some changes are still needed anyway, like bumping the changelog, etc.


Symlinks sorted.



Don’t worry too much about formalities. This one was rather easy and quick to review and merge. The more difficult a change is, the more time it needs and the more it piles up on the stack, the more useful a ticket becomes as a reminder.


Thanks! I’m glad to give at least a little help to this amazing project, and thanks for keeping up the work on it! :+1:
I hope that the next time I notice something to fix, I can do it again instead of just working around it, as I usually do when busy with my own things :sweat_smile: And next time it will be easier, since I’ve learned a lot.


Maybe consider adding --cert-status and/or the --hsts option?

Maybe someone could maintain a preloaded HSTS file or would that be “overkill”?

Possible future additions
CURLOPT_HSTS_PRELOAD - provide a set of preloaded HSTS host names


Too heavy for the Whonix project. This should be contributed to the Debian project (or perhaps to curl, if they’d accept it). Once available, it could be enabled by default in scurl.

That does not seem well supported.

scurl --head --cert-status

curl: (91) No OCSP response received


    if [ "$res" = "--url" ]; then
      opt="$opt $def_opt $res"
      opt="$opt $res"

Breaks when there are spaces?

No, that part only passes the --url option itself, not its value.


Some shellcheck warnings.

In usr/bin/curl-download line 17:
opt="$opt \"$res\""
^-^ SC2089: Quotes/backslashes will be treated literally. Use an array.

In usr/bin/curl-download line 39:
echo curl --location $opt
^--^ SC2090: Quotes/backslashes in this variable will not be respected.
^--^ SC2086: Double quote to prevent globbing and word splitting.

Did you mean:
echo curl --location "$opt"

For more information:
ShellCheck: SC2089 – Quotes/backslashes will be treated literally. Use an array.
ShellCheck: SC2090 – Quotes/backslashes in this variable will not be respected.
ShellCheck: SC2086 – Double quote to prevent globbing and word splitting.
zsh: exit 1 shellcheck usr/bin/curl-download
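The array rewrite that ShellCheck points to would look roughly like this (a sketch borrowing the snippet's variable names, not the actual curl-download source):

```shell
#!/bin/bash
# SC2089/SC2090/SC2086 all stem from collecting options in a single
# string and expanding it unquoted. An array keeps every argument as
# one word, even with embedded spaces, and needs no escaped quotes.
opts=()
res="a file with spaces.txt"

opts+=("$res")      # instead of: opt="$opt \"$res\""
opts+=(--location)

# "${opts[@]}" expands to exactly one word per element:
printf '<%s>\n' "${opts[@]}"
```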

if echo "${list}" | tr " " "\n" | grep -q "^${res}$"; then

Or better

if echo "${list}" | tr " " "\n" | grep -q "^${res}\$"; then
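Both spellings behave the same: a $ directly before the closing double quote cannot start an expansion, so the backslash is purely cosmetic. A quick check with a made-up option list:

```shell
#!/bin/bash
# The wrapper's exact-match test, with and without the escaped dollar
# sign; both anchor the pattern at the end of the line.
list="--location --remote-name --silent"
res="--remote-name"

echo "$list" | tr " " "\n" | grep -q "^${res}$"  && echo "unescaped: match"
echo "$list" | tr " " "\n" | grep -q "^${res}\$" && echo "escaped: match"

# Substrings do not match, since the pattern is anchored on both ends:
res="--remote"
echo "$list" | tr " " "\n" | grep -q "^${res}$" || echo "substring: no match"
```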


curl --invalid x

curl: option --invalid: is unknown
curl: try 'curl --help' or 'curl --manual' for more information
zsh: exit 2 curl --invalid x

curl-download --invalid x

curl --location --invalid --remote-name x --remote-name --remote-name

  • “--invalid x” isn’t correctly passed.
  • It adds a “--remote-name x” where there was none before.
  • Therefore it produces a different error message compared to non-wrapped curl.

I can’t fix this.

“--invalid x” isn’t correctly passed.
It adds a “--remote-name x” where there was none before.
Therefore it produces a different error message compared to non-wrapped curl.

curl-download can’t parse --invalid because it is not known to curl.
Therefore, it also can’t know that x belongs to --invalid. By default, curl would also treat x as a URL, but the purpose of scurl-download is to treat every positional argument that is not the argument of an option as a URL.
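A minimal sketch of that ambiguity (an illustration of the parsing problem, not the wrapper's real code): every token that does not look like an option is treated as a URL and paired with --remote-name, which is exactly how the stray x gets one.

```shell
#!/bin/bash
# Without curl's option table, a wrapper cannot tell whether "x" is a
# URL or the value of the unknown option "--invalid".
passthrough=()
urls=()
for arg in --invalid x; do
  case "$arg" in
    -*) passthrough+=("$arg") ;;          # looks like an option
    *)  urls+=("$arg" --remote-name) ;;   # assumed to be a URL
  esac
done
echo curl --location "${passthrough[@]}" "${urls[@]}"
```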

So I can’t fix it. Feel free to close the PR if you want.




opt="$opt \"$res\""

better than

opt="$opt $res"


And if yes, should this be done inside the script consistently?

I don’t understand why one is preferable to the other.

Personally I like this style:
opt+=" $res"
