scurl - secure curl wrapper

To work around wget's security issues and curl's lack of strict SSL by default, Whonix provides the scurl wrapper script, which invokes curl with a few prepared command-line parameters, allowing simple secure downloads.
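
Roughly, the idea is a one-line wrapper along these lines (an illustrative sketch only; the exact hardening flags are an assumption, not necessarily what the real script uses):

#!/bin/bash
## illustration: force modern TLS and https-only, then hand all arguments to curl
exec curl --tlsv1.2 --proto =https "$@"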

It was a nice approach, until Phabricator task T673 was resolved by this commit:
add --remote-name so scurl can be used as wget replacement
And here the problem comes: this parameter forces the output to be saved to a file and cannot be canceled by subsequent parameters if a user wants to prevent that.

This new behavior broke many instructions and examples in the Whonix Wiki that use scurl, as well as any manual replacement of curl with scurl for better security.
For example, it broke the test of per-process stream isolation:

scurl https://check.torproject.org | grep IP

Now it even produces an error: it can't save the output to a file because it can't derive a filename from the URL.
So the breakage has now spread beyond the scurl wiki page (where, of course, all scurl commands are broken first and foremost).

It broke the whole philosophy of the original approach, reflected in the wrapper's name:
"scurl" == "secure curl", which meant a transparent replacement for curl with a stricter policy for protocols only. Automatic file saving is completely new, additional functionality outside of this concept, and should be solved another way.

And it was! A scurl-download wrapper was later added (together with curl-download), which adds the same --remote-name parameter (along with --location to follow redirects). Corresponding documentation has also been added to the Whonix Wiki.
But this parameter still remains in the main scurl wrapper as well :confused:

Please fix this.

  • Consider removing the --remote-name parameter from the main scurl wrapper.
  • Maybe consider renaming scurl-download / curl-download, or adding symlinks with short and simple names such as scurld / curld or scurlget / curlget, in case you need a short replacement for the wget command and a long name with a dash is inconvenient for regular users.
  • Consider improving the scurl-download / curl-download wrappers (see the sketch after this list):
    • Replace the --remote-name parameter with --remote-name-all, which activates the same behavior for all given URLs (for multiple downloads) and also allows canceling it for any URL (with "-o -" or --no-remote-name).
    • Add the --remote-header-name parameter, which tells curl to try the server-specified filename first (via the Content-Disposition header) and fall back to extracting it from the URL only on failure; this allows downloading dynamically generated files, "attachments", etc.
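
A minimal sketch of a download wrapper with the flags suggested above (illustration only, not the actual Whonix script; the TLS hardening flags are likewise assumptions):

#!/bin/bash
## save every given URL to a file (--remote-name-all), prefer the filename from the
## Content-Disposition header (--remote-header-name), and follow redirects (--location)
exec curl --tlsv1.2 --proto =https --location --remote-name-all --remote-header-name "$@"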
1 Like

short and simple names

durl / sdurl ?
cURL stands for "client URL",
dURL for "download URL".

1 Like

Could you please send pull requests?

https://github.com/Whonix/scurl/tree/master/usr/bin

(Symlinks can be sorted out soon by me [or anyone] through the file debian/scurl.links.)
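
For illustration, a debian/scurl.links entry pair might look like this (the link names are hypothetical; dh_link reads one "source destination" pair per line, with paths given without a leading slash):

usr/bin/scurl-download usr/bin/scurld
usr/bin/curl-download usr/bin/curld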

I sent a pull request:
https://github.com/Whonix/scurl/pull/1

Also I put a tracking task on Phabricator:
https://phabricator.whonix.org/T899

Not sure if I did everything right, this is my first contribution…

As for symlinks or names, I leave that to you, dear developers, because I don't know how to do it properly or which names to choose. I noted this on Phabricator as well.
Also, as far as I can see, some changes are still needed anyway, such as bumping the changelog.

1 Like

Symlinks sorted.

https://github.com/Whonix/scurl/commit/e200c6e38c395406840894820ba8ae9bd8bdc374

Perfect!

Merged.

Don't worry too much about formalities. This one was rather easy and quick to review and merge. The more difficult a change is, the more time it needs and the more is on the stack, the more useful a ticket becomes as a reminder.

1 Like

Thanks! I'm glad to give at least a little help to this amazing project, and thanks for continuing to work on it! :+1:
I hope that the next time I notice something to fix, I can do it again instead of just working around it as I usually do when busy with my own things :sweat_smile: And it will be easier then, since I have learned a lot.

1 Like

Maybe consider adding --cert-status and/or the --hsts option?

Maybe someone could maintain a preloaded HSTS file, or would that be "overkill"?

Possible future additions
CURLOPT_HSTS_PRELOAD - provide a set of preloaded HSTS host names

https://curl.se/docs/hsts.html
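
For reference, a rough example of how the --hsts option is used (the cache file path here is just an example): curl reads and updates an HSTS cache file, so hosts seen with a Strict-Transport-Security header get upgraded to https on later requests:

## keep an HSTS cache across invocations; http:// URLs to known hosts get upgraded
curl --hsts ~/.cache/curl-hsts.txt http://www.kicksecure.com/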

1 Like

Too heavy for the Whonix project. This should be contributed to the Debian project (or perhaps to curl, if they'd accept it). Once available, it could be enabled by default in scurl.

That does not seem well supported.

scurl --head --cert-status https://www.github.com
scurl --head --cert-status https://www.google.com

curl: (91) No OCSP response received

1 Like
    if [ "$res" = "--url" ]; then
      opt="$opt $def_opt $res"
    else
      opt="$opt $res"
    fi

Breaks when there are spaces?

No, that part only passes the --url option itself, not its value.

1 Like

Some shellcheck warnings.

In usr/bin/curl-download line 17:
opt="$opt \"$res\""
^-^ SC2089: Quotes/backslashes will be treated literally. Use an array.

In usr/bin/curl-download line 39:
echo curl --location $opt
^--^ SC2090: Quotes/backslashes in this variable will not be respected.
^--^ SC2086: Double quote to prevent globbing and word splitting.

Did you mean:
echo curl --location "$opt"

For more information:
https://www.shellcheck.net/wiki/SC2089 -- Quotes/backslashes will be treate...
https://www.shellcheck.net/wiki/SC2090 -- Quotes/backslashes in this variab...
https://www.shellcheck.net/wiki/SC2086 -- Double quote to prevent globbing ...
zsh: exit 1 shellcheck usr/bin/curl-download

if echo "${list}" | tr " " "\n" | grep -q "^${res}$"; then

Or better

if echo "${list}" | tr " " "\n" | grep -q "^${res}\$"; then

?

curl --invalid x

curl: option --invalid: is unknown
curl: try 'curl --help' or 'curl --manual' for more information
zsh: exit 2 curl --invalid x


curl-download --invalid x https://www.kicksecure.com/w/images/2/25/Gui-preview.jpg https://www.kicksecure.com

curl --location --invalid --remote-name x --remote-name https://www.kicksecure.com/w/images/2/25/Gui-preview.jpg --remote-name https://www.kicksecure.com

  • "--invalid x" isn't correctly passed.
  • It adds a "--remote-name x" where there was none before.
  • Therefore it produces a different error message compared to non-wrapped curl.

I can't fix this.

"--invalid x" isn't correctly passed.
It adds a "--remote-name x" where there was none before.
Therefore it produces a different error message compared to non-wrapped curl.

curl-download can't parse --invalid because it is not known to curl.
Therefore, it also can't know that x belongs to --invalid. By default, curl would also treat x as a URL, but the purpose of scurl-download is to treat every positional argument that is not an argument of an option as a URL.
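
Roughly, the ambiguity looks like this (an illustration of the general problem, not the wrapper's actual code):

## naive classification: anything not starting with "-" is assumed to be a URL
opts=(); urls=()
for arg in --invalid x https://www.kicksecure.com; do
  case "$arg" in
    -*) opts+=("$arg") ;;                  ## looks like an option, pass through
    *)  urls+=(--remote-name "$arg") ;;    ## assumed URL, gets --remote-name
  esac
done
## "x" ends up as "--remote-name x" because the wrapper cannot know it was meant
## as the value of the unknown option --invalid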

So I can't fix it. Feel free to close the PR if you want.

1 Like

Merged.

Is

opt="$opt \"$res\""

better than

opt="$opt $res"

?

And if yes, should this be done inside the script consistently?

I don't understand why one is preferable to the other.

Personally I like this style:
opt+=" $res"
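
For what it's worth, the array-based approach that ShellCheck's SC2089 hints at would look roughly like this (a sketch, not the current wrapper code):

## collect arguments in an array so spaces inside $res survive intact
opt=()
opt+=("$res")
## each array element expands as a separate word, with no globbing or word splitting
curl --location "${opt[@]}"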

1 Like