I set up a quick demonstration to show the risks of curl | bash and how a bad actor could hide a malicious script.
It’s nothing new or groundbreaking, but I figure it never hurts to have another reminder.
Curl | bash is no different from manually running an sh script you don’t know…
True, but this is specifically about scripts you think you know, and how curl bash might trick you into running a different script entirely.
you’d have to be mad to willingly pipe a script to bash without checking it. holy shit
And it’s wild how much even that has been absolutely normalized by all these shitty lazy developers and platforms. Vibe coding is just going to make it worse. All these programs that look nice on the surface and are just slop on the inside. It’s going to be a mess.
Is it different from running a bash script you downloaded without checking it? E.g. the installer that you get with GOG games?
Genuine question, I’m no expert.
I have no problems with running scripts from the internet, AFTER you check them. Do NOT blindly run a script you found on the internet. As others have said download them, then check them, then and only then run them if they’re safe. NEVER pipe to bash, ever.
Ok but not everyone has that skill. And anyway, how is this different to running a binary where you can’t check the code?
It’s exactly the same. Don’t run binaries you don’t trust fully. But I get what you mean. miley_cyrus_nude.jpg.exe is probably gonna end badly.
Yeah I get that, but I would install docker, cloudflared, etc by piping a convenience script to bash without hesitation. I’ve already decided to install their binary, I don’t see why the install script is any higher risk.
I know it’s a controversial thing for everyone to make their own call on, I just don’t think the risk for a bash script is any higher than a binary.
the difference though is you can check a script. if it’s an open source project, you can also compile from source. but I get what you mean
You can, but to me it seems weird to say it’s crazy to pipe to bash when people happily run binaries. If anything, the convenience script is lower risk than the binary since people have probably checked it before you.
I wouldn’t pipe a random script to bash though, nothing where I wouldn’t trust the people behind it.
I won’t lie, I use curl | bash as well, but I do dislike it for two reasons:
Firstly, it is much, much easier to compromise the website hosting than the binary itself, usually. Distributed binaries are usually signed by multiple keys from multiple servers, resulting in them being highly resistant to tampering. Reproducible builds (two users compiling a program get the same output) make it trivial to detect tampering as well.
On the other hand, website hosting infrastructure is generally nowhere near as secure. It’s typically one or two VPSes, and there is no signature or verification that the content is “official”. So even if I can’t tamper with the binary, I can still tamper with the bash script to add extra goodies to it.
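The verification flow being described for binaries can be sketched locally (a hedged illustration: the tarball name is a placeholder, and in a real release the SHA256SUMS file would additionally carry a GPG signature from the maintainer’s key):

```shell
# Local demonstration of checksum verification, the same mechanism
# release SHA256SUMS files use. In real distribution, the SHA256SUMS
# file itself is usually GPG-signed so a tampered hash list is caught.
printf 'pretend this is a release tarball\n' > app-1.0.tar.gz  # stand-in file
sha256sum app-1.0.tar.gz > SHA256SUMS   # publisher side: record the hash
sha256sum -c SHA256SUMS                 # user side: prints "app-1.0.tar.gz: OK"
```

If anyone replaces the tarball without updating the hash list, the `-c` check fails loudly, which is exactly the tamper resistance plain web-hosted scripts lack.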
On the other hand (but not really relevant to what OP is talking about), just because I trust someone to give me a binary in a mature programming language they have experience writing in doesn’t mean I trust them to give me a script in a language known for footguns. A Steam bug in their bash script once deleted a user’s home directory. There have also been issues with AUR packages, which are basically bash scripts, breaking people’s systems. When it comes to user/community created scripts, I mostly trust them not to be malicious, and I am more fearful of a bug or mistake screwing things up. But at the same time, I have little confidence in my ability to spot these bugs.
Generally, I only make an exception for running bash installers if the program being installed is a “platform” that I can use to install more software. K3s (Kubernetes distro), or the Nix package manager are examples. If I can install something via Nix or Docker then it’s going to be installed via there and not installed via curl | bash. Not every developer under the sun should be given the privilege of running a bash script on my system.
As a sidenote, Docker doesn’t recommend their install script anymore. All the instructions have been removed from the website, and they recommend adding their own repos instead. Personally, I prefer to get it from the distro’s repositories, as that’s usually the simplest and fastest way to install Docker nowadays.
Firstly, it is much, much easier to compromise the website hosting than the binary itself, usually. Distributed binaries are usually signed by multiple keys from multiple servers, resulting in them being highly resistant to tampering. Reproducible builds (two users compiling a program get the same output) make it trivial to detect tampering as well.
Yeah this is a fair call.
But at the same time, I have little confidence in my ability to spot these bugs.
This is the key thing for me. I am not likely to spot any issues even if they were there! I’d only be scanning for external connections or obviously malicious code, which I do when I don’t have as much trust in the source.
As a sidenote, docker doesn’t recommend their install script anymore.
Yeah I used it as an example because there are very few times I ever remember piping to bash, but that’s probably the most common one I have done in the past.
It’s really only about trusting the source. Your operating system surely has thousands of scripts that you’ve never read and never checked. And wouldn’t have time to. And people don’t complain about that.
But it’s really bad practice to run random things from random sites. So the practice of downloading a script and running it is frowned upon. Mostly as a way of maintaining good security hygiene.
The post is specifically about how you can serve a totally different script than the one you inspect. If you use curl to fetch the script via the terminal, the webserver can send a different script to a browser based on the User-Agent. And whether or not you think someone would be mad to do it, it’s still a widespread practice. The article mentions that piping curl straight to bash is already standard procedure for Proxmox helper scripts. But don’t take anyone’s word for it, check it out:
https://community-scripts.github.io/ProxmoxVE/
It’s also the recommended method for PiHole:
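The User-Agent trick can be sketched with a hypothetical server-side handler (serve_script and the two payload files are made up for illustration; a real server would do this in its request-handling code):

```shell
# Hypothetical request handler: pick which script to serve based on the
# User-Agent header. Browsers (i.e. reviewers) get the clean script;
# terminal fetchers (i.e. pipes) get the payload.
printf 'echo clean\n'   > reviewed.sh   # what a browser would display
printf 'echo payload\n' > served.sh     # what curl | bash would run

serve_script() {
  case "$1" in
    curl/*|Wget/*) cat served.sh ;;     # curl's default UA is "curl/<version>"
    *)             cat reviewed.sh ;;   # everything else, e.g. browsers
  esac
}

serve_script 'curl/8.5.0'                       # prints: echo payload
serve_script 'Mozilla/5.0 (X11; Linux x86_64)'  # prints: echo clean
```

Same URL, two different scripts, and the person who “checked it in the browser” never saw what the pipe actually ran.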
The reality is a lot of newcomers to Linux won’t even understand the risks involved; they run it because that’s what they’re told or shown to do. That’s what I did for PiHole many years ago too, I’ll admit.
I’ve been accused of “gate keeping” when I tell people that this is a shitty way to deploy applications and that nobody should do it.
And you’d better inspect and execute a downloaded copy, because a malicious actor can serve a different file to curl/wget than to your browser.
They can even serve a different file for curl vs curl|bash
Yeah, they can. I remember a demo of that being pretty impressive ten, fifteen years ago!
Does curl send a different useragent when it’s piped?
Searching for those words just vomits ‘hOW to SeT cUrL’s UseRaGenT’ blog spam.
It’s timing based. When you pipe a script, bash executes each line completely before taking the next line from the input, and curl has a limited output buffer. So:
- Start with an operation that takes a long time. Like a sleep or, if you want it less obvious, a download, an unzip operation, apt update, etc.
- Fill the buffer with more bash commands.
- Measure on the server if at some point curl stops downloading the script.
- Serve a malicious payload.
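The first ingredient in that list is easy to see locally: bash runs each line of a piped script as soon as it arrives, before the rest of the “script” even exists. A minimal demonstration (not the full server-side attack, which additionally watches curl’s download stall):

```shell
# The producer writes one line, stalls, then writes another. Bash has
# already executed the first line during the stall; a malicious server
# can observe that stall in curl's read pattern and swap the rest.
{
  echo 'echo "line one ran"'
  sleep 1                      # simulate a slow or paused download
  echo 'echo "line two ran"'
} | bash
```

“line one ran” prints immediately; “line two ran” only appears a second later, proving execution is interleaved with the download.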
Not that I know of, which means I can only assume it’ll be a timing-based attack.
With strategic use of sleep statements in the script you should stand a pretty good chance of detecting the HTTP download blocking while the script execution is paused.
If you were already shipping the kind of script that unpacks a binary payload from the tail end of the file and executes it, it’s well within the realm of possibility to swap it for a different one.
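That tail-payload pattern, sketched minimally (the __PAYLOAD__ marker and file layout are invented for illustration; real self-extracting installers like makeself do essentially this, then pipe the extracted bytes to tar and execute them):

```shell
# extract_payload FILE: print everything after a __PAYLOAD__ marker
# line. A self-extracting script runs this on itself ("$0"), writes the
# bytes to a temp file, and executes them.
extract_payload() {
  local file="$1" marker
  marker=$(grep -an '^__PAYLOAD__$' "$file" | cut -d: -f1)  # marker's line number
  tail -n "+$((marker + 1))" "$file"                        # everything after it
}

# Build a tiny self-carrying file and extract its payload.
printf 'echo installer stub\n__PAYLOAD__\nBINARY-BYTES-HERE\n' > demo.sh
extract_payload demo.sh    # prints: BINARY-BYTES-HERE
```

Swapping the bytes after the marker changes what gets executed without touching the readable shell portion at the top, which is the swap the comment describes.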
Hit the nail on the head. Download the file, inspect, then run that local copy.
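The recommended pattern, as a sketch (the URL is a placeholder; the point is that the bytes you review and the bytes you execute are the same file on disk):

```shell
# Fetch once, to disk; review; then execute the exact copy you read.
# Re-fetching at run time would reopen the swap-the-script window.
curl -fsSL -o install.sh 'https://example.com/install.sh'  # placeholder URL
less install.sh       # actually read what was downloaded
bash ./install.sh     # run the inspected local copy, not a second fetch
```

`-f` makes curl fail on HTTP errors instead of saving an error page, and `-o` keeps the script out of the shell entirely until you have looked at it.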
Yep! That’s what the post shows.
I created a live demo file, too, so that you can actually see the difference based on how you request the file.
Most developers I’ve looked at would happily just paste the curl|bash thing into the terminal.
I often would skim the script in the browser, but (a) this post shows that’s not foolproof, and (b) a sufficiently sophisticated malicious script would fool a casual read.
Most developers I’ve looked at would happily just paste the curl|bash thing into the terminal.
I mean, I typically see it used for installing applications, and so long as TLS is used for the download, I’m still not aware of a good reason why you should check the Bash script in particular in that case, since the application itself could just as well be malware.
Of course, it’s better to check the Bash script than to not check it, but at that point we should also advise to download the source code for the application, review it and then compile it yourself.
At some point, you just have to bite the bullet, and I have not yet seen a good argument why the Bash script deserves special treatment here… Having said that, for cases where you’re not installing an application, yeah, reviewing the script allows you to use it without having to trust the source to the same degree as you do when installing an application.
In addition to the other examples, it’s also the default installation method for node.js - they use this to install nvm.
You can’t even blame someone non-technical for falling for this if they haven’t been explicitly informed - it’s getting reinforced as completely normal by too many “reputable” projects.
I’m pretty sure brew on mac is the same too
I mean, true, but most of the things I do that with are private scripts that I wrote. I think the main exception to that is Oh-my-zsh.
Also it’s not really a full pipe…
bash <(curl cht.sh/curl) - that’s process substitution: bash reads curl’s output through a /dev/fd path rather than stdin. Frankly, the URL I gave is a bad example because it’s not actually a script, just the help page for curl, and it would be better if it weren’t nested.
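For the curious, what <(...) actually expands to can be checked directly (a small local sketch; note that on Linux bash wires it to a /dev/fd pipe path, not a saved temp file, so the script is still consumed as a stream):

```shell
# <(cmd) expands to a path like /dev/fd/63 whose reads come straight
# from cmd's output; nothing is written to disk first.
echo <(true)                          # shows the substituted path
bash <(echo 'echo ran-via-procsub')   # prints: ran-via-procsub
```

Because it is still a stream, this form offers no more protection against a timing-based swap than a plain pipe; the only way to review first is to save to a real file.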
the article isn’t about scripts you wrote yourself. run your own scripts all you like.
Yes, this has risks. At the same time, any time you run any piece of software you are facing the same risks, especially if that software is updated from the internet. Take a look at the NIST docs on software supply chain risks.
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:
Fewer Letters  More Letters
DNS            Domain Name Service/System
HTTP           Hypertext Transfer Protocol, the Web
PiHole         Network-wide ad-blocker (DNS sinkhole)
SSL            Secure Sockets Layer, for transparent encryption
TLS            Transport Layer Security, supersedes SSL
VPS            Virtual Private Server (opposed to shared hosting)
4 acronyms in this thread; the most compressed thread commented on today has 10 acronyms.
[Thread #111 for this comm, first seen 23rd Feb 2026, 04:40]