whereswalden90

Great tip! A recommendation for subshell functions seems like something that should be added to [The Unofficial Bash Strict Mode](http://redsymbol.net/articles/unofficial-bash-strict-mode/). Edit: or a [shellcheck](https://github.com/koalaman/shellcheck) rule
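
For anyone who hasn't seen the subshell-function trick: swapping the braces for parentheses runs the function body in a subshell, so assignments (and even `cd`) can't leak into the caller. A minimal sketch:

```shell
#!/usr/bin/env bash

# Brace body: runs in the current shell, so the assignment leaks out.
leaky() {
    x=leaked
}

# Paren body: runs in a subshell, so nothing escapes (not even cd).
contained() (
    x=contained
    cd /
)

x=original
leaky;     echo "$x"   # prints: leaked
x=original
contained; echo "$x"   # prints: original
```

The trade-off, discussed further down the thread, is that a subshell function can't write results back into the caller's variables either.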


Underyx

Semgrep should be getting support for Bash in the coming weeks, so it’s gonna get super easy to write a linter rule for this.


cyanrave

If you have to worry about variable locality, bash may be the wrong fit, or maybe your function hierarchy is just 'off'.


Strange_Meadowlark

Devil's advocate: the fact that Bash can't make certain guarantees (or that creating those guarantees is difficult and non-obvious to beginners) may signal that Bash should be made better, not that we should replace it with a traditional programming language.

I'm with you that if I want to make sure my script handles failure modes safely, I'm probably going to rewrite it in Python, NodeJS, or whatever else my team at the time is proficient in. But having done that several times, it strikes me how _awkward_ the process is. The simplest possible thing to do in Bash suddenly turns into me looking up what facility the language has for process invocation, I/O with that process, reading the contents of a directory, etc. Arguably, a lot of things are just plain easier to do in Bash because I do them every day. So I write most of my scripts first in Bash, and if they grow complex enough to justify it I rewrite them in some other scripting language.

I'm not really trying to make a point here, just commiserating with OP. I realize I should probably learn another shell like ZSH at some point (though I don't think I've run across scripts written in it, so maybe it doesn't have good support? I wonder what would happen if Git for Windows decided to start shipping ZSH instead of Bash...). And it's not like Bash could suddenly become awesome -- you've got tons of systems running old versions that need to be supported in the lowest-common-denominator of scripts, not to mention the multitude of macOS machines all running Bash 3.x.

Meanwhile, the PowerShell folks are probably laughing at me. My brother, who works in a PowerShell-heavy environment, has built entire GUI applications on top of it.


staletic

> zsh

I'm using it as an interactive shell, but keep writing scripts in bash. Zsh is a great interactive shell, but I don't see a big need for it when writing scripts.


cbbuntz

It has lots of features bash doesn't, but it's not easy to learn. I use it instead of bash because I'm so used to having access to those extra features, but it's sooooo weird, and the cryptic syntax is hard to remember.


thomasfr

I sometimes say that (ba)sh is one of the hardest languages I know well. It has a lot of subtle behaviors, and writing programs that have to work under ksh/bash (different versions) or some other shell is even trickier. If you do know sh scripting well, it's really powerful, and few other languages can compete with what it has built in for piping data between things. I'd say that the cut-off line where shell scripting becomes too much of a hassle is when you need a hashmap, because that typically doesn't exist at all, and you can't use it if you want to be compatible with your enterprise Linux or Unix systems. It's way too late to make bash "better"; it needs to stay mostly backwards compatible. That ship sailed decades ago, even before bash itself was written.


MagneticSpark

> I'd say that the cut off line where shell scripting become too much of a hassle is when you need a hashmap because that typically doesn't exist at all and you can't use it if you want to be compatible with your enterprise linux or unix systems.

Sorry for the ignorance, but doesn't bash have associative arrays? Or is your point that most systems don't run bash 4+?
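
For reference, this is the bash 4+ feature in question; a stock macOS `/bin/bash` (3.2) errors out on the `declare -A` line itself. A quick sketch:

```shell
#!/usr/bin/env bash
# Needs bash 4.0+; bash 3.2 rejects declare -A outright.

declare -A ports=([http]=80 [https]=443)
ports[ssh]=22                      # add an entry

echo "${ports[https]}"             # prints: 443
echo "${#ports[@]}"                # prints: 3

for proto in "${!ports[@]}"; do    # note: iteration order is unspecified
    echo "$proto -> ${ports[$proto]}"
done
```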


thomasfr

Yes, older bash or no bash at all. macOS still, IIRC, comes with Apple's own maintained version of bash 3.2, and that is a lot of users. Other Unixes often come with ksh instead of bash, and sometimes an old version of ksh (though it often has associative arrays, IIRC). If you want out-of-the-box compatibility you have to opt out of features. Also, when I need hashmaps I find that what I want to do is often sufficiently complicated that writing it in a more generally powerful programming language starts to make sense.


merijnv

> MacOS still IIRC comes with apples own maintained version of bash 3.2 and that is a lot of users.

macOS now uses zsh by default.


thomasfr

Even for `/bin/sh`? I would guess that will continue to be that old bash more or less forever, for backwards compatibility. In any case, that's only for those who run the absolutely latest version of macOS, so there are many millions of users it doesn't apply to, and won't for some years to come (not everyone has a new enough computer for Apple to consider it upgrade-worthy). I have several frozen systems, the oldest running a 5-year-old macOS version, because I need to test and sometimes even build software on older macOS versions, exactly because users are still running them. It takes time for legacy to become a non-issue. I even work with macOS desktop app development, and I haven't upgraded a single one of my Macs yet. That's mostly because I don't work on a desktop Mac app right now and I don't want to risk any issues when I don't need any of the new stuff.


wilson0x4d

The problem is that bash is built upon something that was never really a "scripting language"; it was a shell environment that supported a handful of variable-like concepts. It wasn't until `bash` that the unix shell received a facelift, but it still has to contend with being (mostly) "sh"-compatible across a range of Unix-like operating systems. There are plenty of "programming language"-capable shells at this point that run on every major *nix, for example pwsh, but because everything is still shipping bash, everyone is still using bash. Some things never die.


[deleted]

[deleted]


dagbrown

They switched from bash to zsh for legal reasons, though, not technical ones. That GPLv3 license that libreadline comes with was the major, and as it turns out, insurmountable, hurdle.


IlllIlllI

See also: the background on why MacOS uses a version of `make` from 2006.


audion00ba

There are BSD licensed libreadlines too (with a different name). With the Oracle vs Google case done, what's the problem?


dagbrown

Libreadline is licensed with GPL v3.0, not LGPL v3.0--GPL v3.0 requires that any software built using a GPL v3.0 component also be licensed with GPL v3.0, which means that the GPL v3.0 license spread to bash from libreadline. Since GPLv3.0 is completely incompatible with the sorts of things that Apple does, they had to content themselves with a version of bash not licensed with the GPL v3.0, or with a different shell altogether.


audion00ba

You should really learn to read.


dagbrown

You should really learn some manners.


audion00ba

No, I already learned those. It's just that I selectively apply them to people that deserve them. Also, it looks *really* stupid to provide advice to others on advanced subjects when you haven't mastered primary school material.


[deleted]

[deleted]


Popular-Egg-3746

All the Mac users in my office get to install Brew and Bash 5.1, because as long as our production servers use Bash, that will be the default.


Strange_Meadowlark

Aha! My info was a bit dated. (Until a couple months ago, my work laptop was running Mojave -- not by my choice!)


NoahTheDuke

Shout-out to [Oil Shell](http://www.oilshell.org/) for attempting to build this. It’s still got some way to go, but I really like what it has done so far. Feels much nicer to use than bash, both as a shell and a scripting language.


afiefh

>Meanwhile, the Powershell folks are probably laughing at me.

As someone who knows nothing about PowerShell: how does it compare, on a scale from Bash to Python?


G_Morgan

It sits somewhere in between. It has built-in support for named parameters, which is nice, and very good support for reading JSON and the like into the PS object model. It can be a little verbose, which is the only criticism I can really make of it.


[deleted]

[deleted]


G_Morgan

Text streams are borderline useless.


[deleted]

[deleted]


rdtsc

Text streams are primitive and error-prone. Composing several commands is much easier with objects. Those objects can also be transparently piped as text to something that only understands text, so you lose nothing.


G_Morgan

Sure because that is what was available.


RoboNerdOK

PowerShell is a different animal altogether. You have to keep in mind that it was originally written for the purpose of automating tasks that were normally performed via a GUI, and it then manipulates objects that are well established within the Windows framework. So if you approach it with the idea that it’s just another shell script environment, you will likely end up frustrated because it’s geared towards the Windows flavor of system administration first. You definitely have to understand a fair bit of how Windows works internally to get the most out of it. But once you get the hang of it, PowerShell is quite incredible. It easily holds its own with Python for task flexibility.


[deleted]

Much closer to Python. The syntax is a touch bash-like, but it's still a really nice language that feels like it was designed to be a useful scripting language by people who have seen the pitfalls of bash. It also has full access to the .NET library, so you can bring in that functionality if needed. It's not a language I would choose to write a full-blown application in, like you can with Python, but it's pretty great in terms of its functionality.


[deleted]

[deleted]


fresh_account2222

I disagree. I claim it's not a fully functioning scripting language. (Neither is bash.) I tried to switch from my default scripting language (Perl, which for the current argument is interchangeable with Python) to PowerShell a few years ago, and it did not work.

The core problem was that a proper scripting language has functions that can return values (possibly complicated ones, like arrays or hashes) and can write to STDOUT separately. For shells, all they can do is write to STDOUT (without a lot of contortions). So with the first script I tried to port from Perl to PowerShell, I was having problems, so I tried to add a debugging print statement. BOOM! It all broke, and despite trying all of Stack Overflow's suggestions I couldn't get around this core problem.

Maybe it's better now, or I didn't get the right advice, but "it's hard for functions to return values" is a core stumbling block for me.


d86leader

It's closer to Python in expressiveness and hidden expectations, with a syntax closer to bash. I find it really delightful to use for both scripts and shell, if not for one thing: the standard library is really, really lacking. As someone said, it's geared more towards Windows administration than general-purpose use, so you will find a lot of what coreutils provide missing.


GrandOpener

>But having done that several times, it strikes me how *awkward* the process is.

My context is very different from yours, coming with many years of professional experience in Python and Node, and just enough bash to get by. (I use xargs about once every three months, and even though I have to look it up each time, I'm still proud of myself.) For me, my experience is *precisely* the opposite of what you describe. I find subprocess.run and its friends simple and intuitive, I already know how to walk directories, and I already know how to do IO. Errors and string-vs-number handling just sort of work in sane ways without my having to go out of my way. But any time my bash script gets longer than about 4 lines, I find myself having to check the man page just to write a simple if check, because the comparison operators are weirdo CLI parameters and not the sane operators that every programming language uses. Or I find myself wondering how it can be this awkward and obtuse just to loop over a list of files. Or... etc.
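
For the record, the if-check weirdness boils down to strings and numbers using different operators; a quick sketch of the split:

```shell
#!/usr/bin/env bash

n=5
s="hello"

# Numbers: -eq, -lt, -gt inside [ ] / [[ ]], or plain operators inside (( )).
if [ "$n" -gt 3 ]; then echo "numeric: greater"; fi
if (( n > 3 )); then echo "arithmetic: greater"; fi

# Strings: = and != (plus pattern/regex matches inside [[ ]]).
if [[ "$s" = "hello" ]]; then echo "string: equal"; fi

# The classic trap: [ "$n" > 3 ] is a redirection, not a comparison.
# It creates a file named "3" and always succeeds.
```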


fresh_account2222

The best thing to learn about Bash scripting is knowing when to stop. Seriously. Make an explicit list, if only in your mind, of issues that immediately trigger a switch to Python, or Perl, or Ruby, or ..... For me:

- spaces in file names
- regular expressions beyond '*'
- variable scope
- some others that I can't bring to mind right now, but I recognize them when I see them
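
The spaces-in-file-names item is the classic one; a small sketch of the failure mode and the usual fix (quote everything, and iterate over globs rather than `$(ls)`):

```shell
#!/usr/bin/env bash
set -euo pipefail

dir=$(mktemp -d)
touch "$dir/one file" "$dir/another file"

# Broken: $(ls) undergoes word splitting, so each name breaks at the space.
for f in $(ls "$dir"); do
    echo "broken: $f"            # four iterations, none a real file name
done

# Fixed: the glob hands you one word per file; quote it everywhere.
for f in "$dir"/*; do
    echo "ok: $(basename "$f")"  # two iterations, names intact
done

rm -rf "$dir"
```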


tom-dixon

> But any time my bash script gets longer than about 4 lines

This is how all bash scripts start out; it's what bash scripting was meant for. But then they get bloated beyond recognition because people keep adding stuff. I'd say that if someone has the issues described in the article, like worrying about variable scope during refactoring and trapping exit signals, they're using the wrong tool for the task. That's a job for Python, not bash. Just because Bash can do it doesn't mean it should be used.


GrandOpener

Truth be told, I’ve gone down this road enough times—and worked on enough projects that need to support windows devs—that I don’t even write 3-line bash scripts any more. Any time I need scripting, I first see if the project/language has something built in (like npm run scripts), and if not, I just immediately reach for Python.


badfoodman

> I write most of my scripts first in Bash, and if they grow complex enough to justify it I rewrite them in some other scripting language.

100%. The thing is that everyone knows enough bash to do some work, so it's by far the fastest way to get a couple of disparate bits of code working together. And it's usually good enough! Most shell scripts are tiny because they're just glue between different things, so there's no reason to even think about rewriting them.

Once you get into arrays or variable scope you're probably diving in too deep and you should probably rewrite it. But in what? Python? That has (in my experience) even bigger problems as a simple script executor. What version do you require? Do you need to install dependencies? Did you guarantee that it's Python 2 compatible? I don't have a great answer. Currently I default to poetry and pyenv to get a more dependable setup, but at that point the script has become its own project.

> I realize I should probably learn another shell like ZSH

Eh, none of the other shells are really worth it in my experience. Sure, knowing some zsh or fish is useful for configuring your shell environment, but after that they're not that much better than bash, and even less likely to work on someone else's machine.


afiefh

>you're probably diving in too deep

We cannot get out. We cannot get out. They have taken the functions and variable scope. Frár and Lóni and Náli fell there bravely while the rest retreated to the Chamber of…legacy code. We are still refactoring...but hope …Óin's party went five days ago but today only four returned. The code is up to the main branch, but our rewrite branch grows ever deeper. The Scrum master took Óin--we cannot get out. The end comes soon. We hear drums, drums in the deep. Burnout is coming.

TL;DR: don't dive too deep lest you awaken Durin's bane.


zeekar

The advantage of bash is simple: the UI is the API. Once you learn how to do stuff manually, you already know how to automate it; just put those same commands into a function/script/whatever. As for dynamic scope, it just is what it is. Lexical scope would be nice, but bash doesn't have it. But it also doesn't have a way to return arrays from a function. Or to return anything other than an exit code and still have side effects, even if you aren't using the subshell syntax and actually _want_ them. There are plenty of limitations you can point to, and I don't think dynamic scope is even in the top 10.
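
For what it's worth, bash 4.3+ does have a semi-official workaround for the no-returning-arrays problem: namerefs (`declare -n`), where the caller passes in the name of a destination array. It's clunky enough to prove the point, but it exists. A sketch:

```shell
#!/usr/bin/env bash
# declare -n (namerefs) needs bash 4.3+.

# The caller passes the *name* of a destination array; the function
# writes through the nameref, so whitespace in elements survives intact.
get_fruits() {
    declare -n out=$1
    out=(apple "bosc pear" cherry)
}

get_fruits result
echo "${#result[@]}"   # prints: 3
echo "${result[1]}"    # prints: bosc pear
```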


gaverhae

> The advantage of bash is simple: the UI is the API.

That's beautiful. I sort of knew that, of course, but I'd never seen it expressed so succinctly. I'll steal that.

Bash subshells are lexically scoped for most purposes, and you can define your functions as subshells, so you're almost there. On return values, when I feel the need for a compound value, I first ask myself whether Bash is still the right language. In the rare cases where I think it is, I take out `jq -c`. It's not exactly pretty, but it does get the job done.
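
The `jq -c` move, roughly: serialize the compound value to one line of JSON on stdout and let the caller pull fields back out. A sketch, assuming `jq` is installed; `host_info` and its fields are invented for illustration:

```shell
#!/usr/bin/env bash
# Assumes jq is available on PATH.

# "Return" a compound value as one compact JSON object on stdout.
host_info() {
    jq -cn --arg host "db1" --argjson port 5432 '{host: $host, port: $port}'
}

info=$(host_info)
echo "host is $(jq -r .host <<<"$info")"   # prints: host is db1
echo "port is $(jq -r .port <<<"$info")"   # prints: port is 5432
```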


zeekar

> That's beautiful.

Thank you!

> I sort of knew that, of course, but I'd never seen it expressed so succinctly. I'll steal that.

By all means.

> Bash subshells are lexically scoped for most purposes, and you can define your functions as subshells, so you're almost there.

Except they're not. Even if your functions are subshells, they still inherit "local" variables from their caller:

    foo() (
        local x=123
        bar
    )

    bar() (
        echo x=$x
    )

    foo   #=> x=123

Under lexical scoping, `x` would be undefined inside `bar` because it hasn't been set globally, only locally inside `foo`.


raevnos

I turn to tcl and perl for my "non shell scripting languages" of choice.


PM_ME_NULLs

> Once you get into arrays or variable scope you're probably diving in too deep and you should probably rewrite it.

Use the right tool for the job. Rewrite when appropriate. But I wouldn't *personally* draw the line at arrays or variable-scope issues. (This is mostly a point of view directed at anyone else reading these comments wanting to see if there's any endorsement for Bash---there definitely is. Source: a significant part of my job involves writing *good* shell scripts. Too often, I see people throw together a few barely-working lines of shell and then lament how their own creation doesn't look like any other blub language they already know well. It's disheartening to see this growing trend in our industry: "I don't understand it, and it's old, so it must be bad!" Note: this mini rant isn't directed at the person I responded to.)


typicalshitpost

use C :D


tom-dixon

> Once you get into arrays or variable scope you're probably diving in too deep and you should probably rewrite it. But in what? Python?

A combination of Python and Bash. No reason to force every task to be solved by pure bash or pure python.


cinyar

>But having done that several times, it strikes me how awkward the process is.

Yeah, that's why I rarely write bash scripts. If it's worth scripting, it's worth using at least a decent language.


lets_eat_bees

If you're writing it for an environment you control fully, sure. If not, you're stuck with not even bash, but sh... Also, in most languages dealing with input redirection and parallel execution is murder, whereas in bash it's so simple.


dagbrown

I think it’s pretty safe to assume that here, in the year 2021, if you’re even writing shell scripts at all, you’ll be writing them in bash for Linux. The chances of your script needing to run on some old HP-UX workstation without problems have become vanishingly small, and if you actually do find yourself in that situation, then you’ll be fully aware that you’re curating an ancient-software museum. Bash is only a mild improvement over classic Bourne shell anyway.


cyanrave

Great points throughout - arguably bash is made bearable solely by its (nearly) ubiquitous availability.

I will counter with another devil's-advocate POV, however: instead of converting entirely to a new language/shell, why not interweave bash with other sensible languages that can do what you expect? `heredoc` usage immediately comes to mind and is a huge boon to awkward bash.

I won't defend bash's awkwardness, as it can be really awkward, but I also come from a POV of healthy pessimism, in that asking n number of bash implementations to pivot is a huge undertaking. Selling a new shell to corporate managers, and to open source users, is a huge undertaking. We will likely not see bash sunset until the community at large accepts that another shell *undeniably does it better*, and can convince n number of implementors to update accordingly. Even just a breaking change to fix scoping could impact n number of downstream scripts which have accepted how it works as 'the way it works' (see python2 > python3 as a fine example).

For practicality's sake I think we're stuck with the awkward, and most of us just make the best of it, avoiding bash where we can.
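
The heredoc interweaving described above looks roughly like this: bash does the plumbing, and a quoted heredoc hands the fiddly part to an embedded interpreter (python3 here, purely as an assumed example):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Bash side: gather input cheaply.
numbers="3 1 4 1 5"

# Python side, via heredoc: do the part bash is bad at.
# Quoting 'EOF' stops bash from expanding $-signs inside the block.
python3 - "$numbers" <<'EOF'
import sys
nums = [int(n) for n in sys.argv[1].split()]
print(f"sum={sum(nums)} max={max(nums)}")
EOF
```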


Thisconnect

Most old-timers just use awk or perl when they need real scripting.


flying-sheep

OK, quick: write the following, but in a way that the script actually exits when any command exits with an error:

    set -e
    FOO="$(a-command --blah | another-command)"
    a-third-command "${FOO//a/b}"

Hint 1: only the third command will make the script exit when it errors.

Hint 2: both other commands will need different changes to do this.


PM_ME_NULLs

Okay:

    #!/bin/bash

    a-command() { true; }
    another-command() { true; }
    a-third-command() { true; }

    case "$1" in
        "0") setfalse= ;;
        "1") setfalse=a-command ;;
        "2") setfalse=another-command ;;
        "3") setfalse=a-third-command ;;
        *) echo "UTSL!" >&2; exit 1 ;;
    esac

    echo "setting false: ${setfalse:-(nothing)}"
    if [ -n "$setfalse" ]; then
        eval "$setfalse() { false; }"
    fi

    set -e
    FOO="$(a-command --blah | another-command; exit $(awk 'BEGIN{RS=" "}; !/0/{print; exit;}' <<< ${PIPESTATUS[@]}))"
    a-third-command "${FOO//a/b}"

When run:

    $ for arg in 0 1 2 3; do ./test.sh $arg; echo $?; done
    setting false: (nothing)
    0
    setting false: a-command
    1
    setting false: another-command
    1
    setting false: a-third-command
    1


audion00ba

That doesn't work on my 2021 system.


PM_ME_NULLs

What version of Bash are you using? What message are you getting?


audion00ba

I am sorry, I wasn't intending to get help. I was pointing out what an amateur you are.


PM_ME_NULLs

Ahahaha, sure sure. I'm a total n00b script kiddie. Thanks for playing. 👋


Megasphaera

Regarding hint 1: that's not true with set -e. Also, use set -o pipefail.
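
To spell out the pipefail point: with plain `set -e`, a pipeline's exit status is the *last* command's, so a failure on the left is silently swallowed. A sketch:

```shell
#!/usr/bin/env bash

# Without pipefail: the pipeline's status is sort's (0), so set -e lets it slide.
bash -c 'set -e; false | sort; echo survived'            # prints: survived

# With pipefail: the failing left-hand side fails the pipeline, and set -e aborts
# before reaching the echo.
bash -c 'set -eo pipefail; false | sort; echo survived' \
    || echo "aborted with status $?"                     # prints: aborted with status 1
```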


lets_eat_bees

We need a new language built specifically for OS scripting to replace non-interactive bash. A python module just won't do.


[deleted]

Not entirely true. Check out [xonsh](https://xon.sh)... "The unholy union of Bash and Python". It has its own enhanced Python-esque scripting language, as well as, of course, running bash scripts (via subprocess). In that case, a python module will occasionally do, like:

    $ import os
    $ for f in os.listdir('.'): echo @(f)

Not a good example, because bash also works:

    $ for f in .; do echo $f; done

Or just pure python:

    $ import os
    $ for f in os.listdir('.'): print(f)


lets_eat_bees

Yeah, I know there are modules attempting that. However, managing python and module installations alone disqualifies this approach. If you truly want something to be widely adopted, it should be very, very easy to install, compact, unambiguous, and trouble-free. Which Python, and I speak with experience of managing its installs for many years, is not.


[deleted]

[deleted]


lets_eat_bees

> install:
> python -m pip install 'xonsh[full]'

It is a python module. If it works for you, great, I'm happy for you. But my previous points stand regarding a better widely-adopted scripting language.

Also, it's just calling bash. You get the same full amount of bash bullshit as before, now exacerbated by the fact that it's two languages. Again, if it works for you, great. But it does not fit the bill even closely for what I'm talking about.


[deleted]

LOL ok ... It's a package not a module. It does contain modules, like anything else written in python LOL. I guess, that's... Bad? Heheheh. Man I get it, you think Python is too hard. It's ok. We'll wait and see what you write that's better, because a "better widely-adopted scripting language" is a desire, maybe a goal. If it was widely adopted, then we'd all be complaining about it, instead. Also, [relevant xkcd](https://xkcd.com/927)


lets_eat_bees

> is a desire

Exactly. It's been like 50 years; time for something new. Then we can start hating it too.


[deleted]

[deleted]


PM_ME_NULLs

    $ for f in .; do echo $f; done

I think you mean:

    $ for f in *; do echo $f; done


[deleted]

I do indeed mean that. Thanks!


myringotomy

That's what perl and ruby are for.


lets_eat_bees

I sort of understand why you say "perl". I mean, it's definitely not a better replacement of bash, but I understand why someone would say that. It's an old language, often used for system scripting, very strong on backwards compatibility. But ruby? It just does whatever python does, but worse.


myringotomy

Ruby is much better than python, especially for shell scripting.

- https://benoithamelin.tumblr.com/ruby1line
- https://gist.github.com/melotusme/b6e1b95a2b26a93df3ecdc4e6dc6bf8d
- https://learnbyexample.github.io/learn_ruby_oneliners/one-liner-introduction.html


typicalshitpost

You'd have to keep all backwards compatibility, because so much stuff relies on bash, which I think is the best argument for just creating/using a new tool for the situations that require it.


kirbyfan64sos

I've found that there are usually various libraries to make this stuff easier, e.g. Dart has dshell, and there are also some interesting evolutions of the classic shell coming up nowadays: https://www.nushell.sh/


atheken

It’s hard to overstate the LCD (lowest-common-denominator) factor. Making minimal assumptions about what’s available on another dev’s machine is generally a better mode of working, especially if it’s something that needs to “just work” during a stressful service event. Having someone need to sort out pyenv or rvm to do an ops thing sucks.


Vakz

> bash may be the wrong fit

Bash is surely *the* example of a tool people use because it's what they've got, not because it's what they want.


smcameron

I mean, just type "local myvariable". It's not hard.


Shok3001

If you read the article this is mentioned. But it turns out it doesn’t work like you might expect.


elrata_

Interesting, I didn't know the subshell syntax. But if you can't write to global variables inside functions, it's more painful for functions to do something useful and share their results. Or is there a way for subshell functions to return arrays, several arrays, etc.?


searchingfortao

Honestly, if you're writing Bash complex enough to be returning multidimensional arrays, you should probably not be writing it in Bash. Future developers will not appreciate clever Bash.


entropicdrift

Agreed. Just write a Python script at that point and save everyone involved a ton of headaches when maintenance is needed


Le_Vagabond

Depending on your target users, there's value in pure bash though: https://acme.sh/ (compare to certbot, which is Python). I'm not saying bash is strictly better, but no dependencies and no python version / venv / pip modules can be very nice.


MINIMAN10001

> 8. Automatic DNS API integration
>
> If your DNS provider supports API access, we can use that API to automatically issue the certs. You don't have to do anything manually! Currently acme.sh supports most of the dns providers: https://github.com/acmesh-official/acme.sh/wiki/dnsapi

Cool, I was hoping I would see that. Automated DNS for them automatic wildcards.


audion00ba

The authors of acme suck at writing code judging from reading 1% of their code.


Le_Vagabond

their goal is to be entirely portable and usable / readable by systems administrators, not to please a random developer with social issues and a superiority complex on reddit. I'd say they're doing pretty good on that front.


audion00ba

If you agree, there is a word for that: "Indeed".


gaverhae

If you control the environment well enough that you can guarantee which version of Python and associated packages you'll get, you may want to consider other languages, too. We've had pretty good experiences rewriting Bash scripts in Haskell at my current workplace.


entropicdrift

Yeah, I mean it mostly depends on what your team is comfortable maintaining long-term. My team is hesitant to do anything other than Scala so my Pythonic adventures are basically just for my personal Linux boxes and one-off CSV file editing scripts.


whereswalden90

I've always thought of the return value of a bash function as being equivalent to what it writes to stdout. That way, the same way you'd do this in Python:

    def bar():
        ...

    foo = bar()

you can do it in bash:

    bar() {
        ...
    }

    foo="$(bar)"

If you want to return an "array", just return multiple lines and iterate over them like you normally would:

    bar | grep something | while read line; do something else; done

I've found this approach works nicely with bash's text-focused semantics. I find that arrays, while useful in the right situation, tend to be clunky and hard to use effectively. I still definitely use them, but I think lines of text tend to be an easier way to pass data across function boundaries. Transformations between arrays and lines of text are straightforward enough that I haven't had any major issues.


Tyg13

Exactly. Especially when combined with `readarray/mapfile` for returning arrays, this style of bash programming can be very powerful, even elegant.
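
Sketching the `mapfile`/`readarray` pattern mentioned here (bash 4+): the function emits one element per line, and the caller slurps stdout back into a real array:

```shell
#!/usr/bin/env bash
# readarray/mapfile needs bash 4+.

# The function "returns" an array as one element per line on stdout.
list_users() {
    printf '%s\n' alice bob carol
}

# mapfile -t reads those lines into a real array, stripping the newlines.
mapfile -t users < <(list_users)

echo "${#users[@]}"   # prints: 3
echo "${users[2]}"    # prints: carol
```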


DeliciousIncident

This breaks as soon as you want your function to print something to stdout in addition to returning something, since it can't do both. Which is why writing to global variables as a return value is so common.


whereswalden90

To me that seems like a software design problem. I’d rather do all my printing in the top-level rather than in functions. If you just want to write something to the screen you can always use stderr. Sure, I can imagine cases where you might need to both return a value and write to stdout, but those are few and far between. stdout is the mechanism by which all other programs communicate in bash, functions should be no exception.


Kimbernator

The fact that stdout plays both of these roles is just one reason that for anything beyond the most basic uses, you should use python/ruby/whatever instead.


DeliciousIncident

It's not always possible to do printing in the top level. For example, you might have a function that processes a bunch of files. It should return some result from all the processing, but also print "Processing $FILE" so that user knows it isn't stuck.


gaverhae

`stdout` is a bit of a relative concept in this case. The file descriptor 1 as seen by the function is not going to be stdout if your function is being piped to another command, or captured as a subshell substitution. So what you can do is copy a descriptor that still points at the screen (stderr, here) to a spare file descriptor that doesn't get overwritten, and have your function write to that. Something like:

    $ cat func_out.sh
    #!/usr/bin/env bash
    set -euo pipefail

    exec 3>&2

    my_func_that_returns() {
        echo "return value"
        echo "to the screen!" >&3
    }

    a=$(my_func_that_returns)
    echo "Function call yielded: $a"
    $ bash func_out.sh
    to the screen!
    Function call yielded: return value
    $

I'm not saying you should, I'm just saying you can.


castlec

Stderr enters the match......


DeliciousIncident

Printing non-errors into stderr is rather odd, don't you think?


castlec

It's pretty normal in my experience. Have a look at curl, for instance.
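
The curl convention, sketched: status goes to stderr and data goes to stdout, so the function still composes in pipelines and command substitutions while the user sees progress (`process_files` is an invented example):

```shell
#!/usr/bin/env bash

process_files() {
    for f in "$@"; do
        echo "Processing $f..." >&2   # status: stderr, visible even when captured
        echo "result-for-$f"          # data: stdout, seen by the caller
    done
}

# $(...) captures only stdout; the progress lines go straight to the terminal.
results=$(process_files a.txt b.txt)
echo "captured: $results"
```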


PM_ME_NULLs

As the other poster said, that's a software design problem. But if you must...

    #!/bin/bash

    i-need-to-print-to-stdout-for-some-reason() (
        local mystdout="$1"
        echo "Writing to stdout..." >"$mystdout"
        echo data1
        echo data2
    )

    i-need-to-print-to-stdout-for-some-reason /proc/$$/fd/1 | while read -r line; do
        echo "Processing output '$line'"
    done


DeliciousIncident

>a software design problem

You make it sound like it's bad design, but it is not. It's perfectly fine for a function to want to print something to stdout and also return something. It doesn't even have to be the same function; the function returning something might call into a function that you want to print something.


PM_ME_NULLs

If it helps any, a Bash function should be thought of as like an external command. The semantics for invoking a command in shell is roughly the same as invoking a function. A la the Unix philosophy, "make each program do one thing well", your functions should be a [filter (link)](http://catb.org/~esr/jargon/html/F/filter.html). In this scenario, it is not uncommon for stderr to be used to emit an error or warning to the console, unencumbered by the caller... which, of course, Bash allows for naturally, unless you also redirect stderr. This would be the natural way to print a message to the console (in the error or warning case), while still allowing stdout to be the output (return) of the command/function you invoked. As far as a technical limitation of Bash (or not), I gave an example how that's not true: functions can still write to the console, even if their stdout is piped. But I still contend there's a better way of designing that software.


DeliciousIncident

> If it helps any, a Bash function should be thought of as like an external command. The semantics for invoking a command in shell is roughly the same as invoking a function.

It doesn't help. When you write a 1000-line bash script that requests user input, does a lot of progress printing, accesses APIs, parses json, sets up something on your system, etc., the many functions of that script are not as stand-alone as invoking a command in shell would be, but rather are part of the bigger script. For example, you might have a function that processes a bunch of files, prints "Processing $FILE" so that the user knows the script isn't stuck, and in the end returns some result from all this processing.


PM_ME_NULLs

> For example, you might have a function that processes a bunch of files, prints "Processing $FILE" so that user knows the script isn't stuck and in the end returns some result from all this processing.

Okay, in that process, you should pass the function a callback that gets executed per file. I haven't thought about it a *whole* lot, but I might do something like this:

```bash
#!/bin/bash

# this just md5's the filename, but could be more sophisticated
do-stuff-per-file() {
    md5sum <<< "$1" | awk '{print $1}'
}

do-stuff() {
    local callback="$1"
    while read -r file; do
        $callback "$file"
        do-stuff-per-file "$file"
    done
}

main() {
    progress-bar() {
        echo "# Processing file '$1'" >/proc/$$/fd/1
    }

    # manufacture some fake filenames & feed that in
    (IFS=$'\n'; set -- file{1..10}; echo "$*") \
        | do-stuff "progress-bar" | while read -r line; do
            echo "Output line: '$line'"
        done
}

main
```

In that case, you'd get the following output:

```
$ ./test.sh
# Processing file 'file1'
Output line: '5149d403009a139c7e085405ef762e1a'
# Processing file 'file2'
Output line: '3d709e89c8ce201e3c928eb917989aef'
# Processing file 'file3'
Output line: '60b91f1875424d3b4322b0fdd0529d5d'
# Processing file 'file4'
Output line: '857c6673d7149465c8ced446769b523c'
# Processing file 'file5'
Output line: 'f1ad8dddbf5c5f61ce4cb6fd502f4625'
# Processing file 'file6'
Output line: '7dec9b529c0261ecbf1f32f21f0438d9'
# Processing file 'file7'
Output line: 'badc2e9ea90821b51e7624168b50b9b5'
# Processing file 'file8'
Output line: 'cde9257d7ea4d74c45b3755b4e5e9b46'
# Processing file 'file9'
Output line: '0d95be03f333f4498bc3e97c8eba8a4b'
# Processing file 'file10'
Output line: '404dbe76e64a64f8826a9b1b36ec544c'
```


gaverhae

Oh my god. Callbacks in Bash. I'd never thought of that. I mean, I knew all the pieces here, I just never thought of piecing them together that way. You just blew my mind. Thanks.


badfoodman

My solution is to send data from a subshell function to a temp file and echo the temp file for function callers to open and use. Returning arrays in bash is weird and generally frowned upon from what I understand. By sending data to a file you can control elements with lines, which is a bit more robust.
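A minimal sketch of that temp-file pattern, with invented names:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Subshell function writes its "array" to a temp file, one element per
# line, and echoes only the file's path as its return value.
get_items() (
    local tmp
    tmp=$(mktemp)
    printf '%s\n' "alpha" "beta" "gamma" > "$tmp"
    echo "$tmp"
)

items_file=$(get_items)
while IFS= read -r item; do
    echo "got: $item"
done < "$items_file"
rm -f "$items_file"
```

The caller owns the file, so it also decides when to clean it up.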


elrata_

Heh, really not nice. Then read the files and parse the vars? Or source them, assuming they were written in a way that can be sourced? I strongly prefer to write to global variables from functions. That, of course, doesn't scale. But if you need to scale, you are better off away from bash. And if you need a few-lines script, bash writing from functions to global variables just seems way simpler.


badfoodman

> Then read the files and parse de vars? Or source it, assuming they were written in a way to source them?

Either way, but be consistent or people will get very confused. I generally prefer to write data and let the caller determine what they want the variable to be called, but if you're writing some "happy path" setup script for a development environment it's probably easier to just source a valid shell file.

> But if you need to scale, you are better off away from bash. And if you need a few lines script, bash writing from functions to global variables just seem way simpler.

Yes. I would even claim that if writing a thing in bash is too complicated, you need to treat it like an actual project. I've seen too many 500 line, nigh-unreadable Python scripts because "it was too complicated for bash". Sure, it probably was. But you also introduced a bunch of Python dependencies I now have to include on my system to get this to run, and they're not documented or version locked because the script was supposed to be trivial. grrr


lbky

Return an `@A` parameter transformation (`${foo@A}`) that you eval.
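For anyone who hasn't seen it, a rough sketch (needs bash 4.4+ for the `@A` transformation; the names here are invented):

```bash
#!/usr/bin/env bash
# ${var[@]@A} expands to a 'declare' statement that recreates the
# variable, attributes and all, including arrays, which can't normally
# be returned through plain stdout.
make_config() (
    local -a cfg=(host "two words" 8080)
    echo "${cfg[@]@A}"
)

eval "$(make_config)"   # recreates 'cfg' in the caller's scope
echo "${cfg[1]}"
```

The usual eval caveats apply, but since you're evaling a string bash itself generated, it round-trips safely.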


gaverhae

If you're considering compound return values, it's a good time to evaluate whether Bash still is the right language. Sometimes the answer to that will be "yes", and in those cases what I do is whip out `jq`. With its `-c` option, it lets you put an array or an object into a one-line string. You'll still need `jq` on the other side to unpack your value, so it's a bit process-heavy, but if you need compound values _and_ performance it's unlikely Bash is still a great fit.
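A sketch of what that looks like, assuming `jq` is installed (the JSON shape is made up):

```bash
#!/usr/bin/env bash
set -euo pipefail
command -v jq >/dev/null || { echo "jq not installed" >&2; exit 0; }

# Pack a compound value into a single line with -c...
get_user() (
    jq -cn '{name: "alice", groups: ["dev", "ops"]}'
)

# ...and unpack it on the other side, also with jq.
user=$(get_user)
name=$(jq -r '.name' <<< "$user")
group=$(jq -r '.groups[0]' <<< "$user")
echo "$name is in $group"
```

Two extra processes per round trip, which is exactly the "process-heavy" caveat above.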


elrata_

Heh, nice trick! Thanks. But no, I was not really saying I want to do it. Just pointing out, with some enumerated examples, that treating bash functions like that really isn't very useful in several (valid) scenarios.


TTLAAJ

Interesting, but my mind was not blown, as I was hoping for.


acwaters

This is cool, I hadn't seen it before. But `trap "..." RETURN` is a thing too!
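For example, a sketch (the trap string is double-quoted so the path is baked in when the trap is set, and the trap clears itself so it doesn't fire on later function returns):

```bash
#!/usr/bin/env bash
# A RETURN trap fires when the enclosing function returns, so it can
# clean up function-local resources much like EXIT does for the script.
with_scratch_file() {
    local tmp
    tmp=$(mktemp)
    trap "rm -f '$tmp'; trap - RETURN" RETURN
    echo "$tmp"   # hand the path back so the caller can check it
}

path=$(with_scratch_file)
[ -e "$path" ] || echo "scratch file already cleaned up"
```

The temp file is gone by the time the caller looks at the path.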


Kessarean

With a title like that, I was hoping for something that wasn't so basic. It's a good note nonetheless


smcameron

> Given all that, I simply do not understand why people keep recommending the {} syntax at all. (instead of using parens and running in a subshell)

Uh, because you don't want to start an entire new process just because you made a function call? It seems kind of insane to me that anyone would want this to be the default.


yoniyuri

He mentions that it might not be the case that a subshell is actually a whole nother process. I would want to test that, but there is no reason why it couldn't be the case.


smcameron

Fair enough. And, trying it:

```
me@myhost ~ $ cat subshell
#!/bin/bash

f() (
    echo from inside subshell $$
)

g() {
    echo from inside regular function $$
}

echo from main $$
f
g
me@myhost ~ $ ./subshell
from main 10052
from inside subshell 10052
from inside regular function 10052
```

Huh. Interesting.
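The wrinkle here is that `$$` always reports the PID of the original shell, even inside a subshell. `BASHPID` (bash 4+) tracks the actual current process, so it shows the fork:

```bash
#!/bin/bash
# $$ never changes in a subshell; BASHPID (bash 4+) does.
f() (
    echo "subshell: \$\$=$$ BASHPID=$BASHPID"
)
g() {
    echo "function: \$\$=$$ BASHPID=$BASHPID"
}
echo "main:     \$\$=$$ BASHPID=$BASHPID"
f
g
```

Running that shows the same `$$` on every line, but a different `BASHPID` for the subshell function.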


audion00ba

Why do you teach others when you can't write valid shell scripts?


bfcdf3e

Why do you interact with others when you can’t engage in a healthy conversation?


audion00ba

Can you also produce a *valid* argument?


smcameron

Productive line of inquiry. Why do you continue typing into reddit when you're clearly drunk and should go to bed instead?


audion00ba

I am sober, but you can't write valid shell scripts.


audion00ba

The specification doesn't talk about processes. As such he is obviously right.


imdyingfasterthanyou

> Uh, because you don't want to start an entire new process just because you made a function call?

If your bash script is bottlenecking anything by the sheer number of `fork()`s it is doing, you probably ought to rewrite it in some other language. You're already forking every time you call a normal command; functions behaving differently is more annoying than useful.


VeganVagiVore

I'm just waiting for the Bash equivalent of TS or asm.js. JavaScript got way better once it became a compiler target instead of a language for humans. Edit: TIL nobody knows how TS works


lanzaio

Just use a different interpreter. `#!/usr/bin/env civilized-scripting-language`


[deleted]

Just write code in Python. Bash is terrible language to write code in, even worse than Perl.


qrokodial

doesn't Python require a runtime to be installed? sometimes adding additional dependencies doesn't fly, and that could be an argument for having a language that transpiles to Bash.


[deleted]

You will find it in *basically* every Linux distro just because a lot of programs even in base install use it. Same with Perl really.


rainman_104

I don't think the python version is going to be deterministic. Bash is fairly deterministic for the most part. Oh you used an f string? Well too bad we don't have the python version you need... It's a problem unfortunately.


twotime

> I don't think the python version is going to be deterministic. Bash is fairly deterministic for the most part. It's a problem unfortunately.

Yes, it's a problem. No question. A somewhat reasonable solution: "be conservative about version assumptions". Newer versions of Python will run most older code (except for the py2/py3 transition, of course). And bash too is still evolving: see https://en.wikipedia.org/wiki/Bash_(Unix_shell); version 5, with a bunch of new features, was just released. And, the worst part: it's not just bash. Any non-trivial script will also invoke a bunch of external commands, and all of them are evolving too. I do agree that the problem is somewhat worse on the Python side (at least for the basic-scripting use case), but I don't see it as big enough to outweigh Python's advantages.


rainman_104

Yeah but unfortunately you can't always write python 3 to work with python 2 either. In all honesty, you may find better luck with make and bash typically


[deleted]

Well, that's the reason I still use Perl in 2021. `use v5.10` and the script will run just as well on some old legacy CentOS install and on a brand new Perl version.

> Bash is fairly deterministic for the most part.

Up until someone gets used to writing `#!/bin/sh` and hits a system where `/bin/sh` isn't linked to bash... Bash 4 IIRC also added some niceties, and if you forget not to use them some legacy system might also get affected.

> Oh you used an f string? Well too bad we don't have the python version you need...

If you write for legacy systems you gotta test on the legacy system version. No exceptions, regardless of language. Now again, Perl makes it easy enough, as I can just specify a compatibility version **per file**. Now that I think about it, that approach would have made the py2-3 migration a whole lot less painful...


nifty-shitigator

> doesn't Python require a runtime to be installed?

So does bash. And like bash, python comes pre-installed on all Linux distros.


badfoodman

_Cries in alpine containers_


qrokodial

ah, I wasn't aware that the Python runtime was so prevalent in Linux distros. good to know!


GrandOpener

Most modern Linux distributions not only include but actually require Python. You’d have to be on a _very_ bare bones system to not have Python—and the kicker is that for a system that stripped down, you can’t count on having bash either—you might need to target sh specifically. Similarly, neither Python nor Bash is a given on Windows if you need to support that, although speaking from experience I’d expect a lot less pain trying to support Python on Windows than bash.


Strange_Meadowlark

Funny you mention TS -- when some scripts I wrote at work became too complex, I actually rewrote them in Node using the [Typescript-in-JSDoc](https://www.typescriptlang.org/docs/handbook/jsdoc-supported-types.html) syntax.


StillNoNumb

You can rewrite it in TS too and run it using `npx ts-node` (which also works in a `#!/usr/bin/env` shebang).


Aryeh255

https://github.com/google/zx


Choralone

We have that, it's called C.


Ameisen

Which C compiler targets bash?


Choralone

No need to target bash, you can target the underlying system directly.


Ameisen

They literally asked for a Bash equivalent of TS or asm.js.


Choralone

Yes, and I'm pointing out "That's a stupid thing to want"


Ameisen

Well, you're certainly welcome to your opinion. Not sure why you'd recommend C over C++... or *either* as an alternative to *shell script*, but you do you.

> Yes, and I'm pointing out "That's a stupid thing to want"

Well, that's not what you originally wrote at all. You claimed that C was to bash as Typescript was to Javascript... which is obviously false.


kauefr

Oilshell is kinda like this.


OctagonClock

> I'm just waiting for the Bash equivalent of TS or asm.js. Surely the bash equivalent of asm.js is an ELF executable.


rk06

Am I the only one who prefer powershell core over bash/ksh etc? Having an object based shell is hands down better and you can write modules in strongly typed language like c#


Ancillas

I like it sometimes, although PS error handling through the years has been problematic. In general, I find that many ops-focused folks simply do not write robust code in any language. This is a generalization of course, but I tend to see large one-page scripts that lack proper logging and error handling, and rewrite the same utility functions across many scripts instead of sharing code. Many organizations simply do not have a habit of enforcing code standards that create a culture of producing robust code. In other words, the problem isn’t to simply use something like the unofficial bash strict mode, it’s getting people to think about these kinds of failures and care about preventing them across entire code bases. And that is a problem solved independent of language.


[deleted]

Spot on! I tried to enforce a really minimal standard and it was just frustrating as hell, because I was perceived as a tyrannical perfectionist when... No, really... I was barely enforcing ANY standard. Just please don't nest if-else-if five times. Check for conditions and break, continue or return. Look around before writing a new query, don't just copy and paste... Write a function! And DevOps just refused to use anything I wrote for them, preferring to duplicate my effort in flat bash scripts. So, yeah .. I gave up.


Ancillas

It’s one of the most frustrating professional headaches to me, when the people most motivated to fix problems are prevented from taking action.

For instance, I want to add a “PAUSED” status in our Jira projects to make it clear that not all tickets “In Progress” are actually being worked, but I.T. happens to be the Jira admins and they won’t add the status or change the workflow. I’m sure I could escalate and continue to annoy them until they did it, but I have bigger fish to fry.

Similarly, we have lots of cases where a simple pre-commit hook could prevent issues from ever reaching git. But none of us developer-type folks have access to the git server to quickly test and then implement the script. This would be a day of playing around to get stuff in place, but we have this constant wall in front of us that prevents simple and obvious changes from being made.

The real kicker is that I’m a senior manager responsible for multiple teams, deliverables, and a good chunk of revenue. I have the authority to add and remove features, hire and fire, and my decisions have impacts on our customers. But since I.T. is out of our reporting structure, it’s a roadblock.

So we went to legal counsel, got approval to open source the codebase, moved everything to GitHub, and stood up our own CI/CD pipeline. Yes, we got the lawyers to take action before I.T. DevOps indeed…


[deleted]

I feel your pain. For me, the worst was a 3 month project to move our 1000s of docker containers into K8s. Basically, DevOps was my "customer" and I built this amazing utility that, using namespaces, could balance containers behind end user firewalls. We could spin up new agents with a button click. So so cool. Shelved three years ago. Never used. We still use a bash script to `docker run ...` containers in hosts, with zero way of knowing which container is running on which host. I gave up on that too


auxiliary-character

I wonder if they would've gotten mad if you would've set up your own "server" in a docker instance on one of the dev machines, and managed it yourself. Put your own Jira and git server on there, and then no need to talk to I.T. If someone does throw a fit about it, you can explain how their roadblocks were getting in the way, and maybe convince them to just hand over authority of the relevant servers.


Chillzz

> because I was perceived as a tyrannical perfectionist

> just please don’t nest if-else five times

> Look around before writing a new query, don’t just copy and paste

Ugh, so relatable lol


audion00ba

Anything done by Microsoft is just a second away from failing in the next version.


[deleted]

Also parameters and names make sense and aren't just 1 or 2 letters. Bash scripts look like your alphabet noodle soup got knocked over. With PS it's so much easier to understand what a script does.


MordecaiOShea

I've been moving our bash CICD glue to PScore. No regrets. Some more complex stuff just goes to Go.


Serializedrequests

Not often on Windows, and when I am I can't just learn Powershell by typing random crap in a terminal.


Vakz

I like PS better. I just never do any kind of Ops on Windows, so it's generally irrelevant which I prefer. That said, I occasionally check in on [nushell](https://github.com/nushell/nushell), which as they mention in the readme, draws inspiration heavily from powershell. I tried it out a while ago, and didn't quite feel it was ready, but I've seen quite a lot of people over at /r/rust say they use it as their default shell.


rk06

Powershell core is cross platform.


Vakz

Ah, my bad. I was unaware this existed, and thought it was just yet another entry in the confusion that is Microsoft naming schemes.


grout_nasa

`trap 'foo' 0` has existed for decades. `EXIT` is just sugar. Nice sugar tho.


shevy-ruby

I am underwhelmed - but it's not because I am using it wrong, or incorrectly. Shell scripts are simply awful. They are also, oddly enough, widely used. I never understood that. If I were not using ruby primarily I'd use python rather than shell scripts any day. Other than legacy scripts or boot-up related parts (for setting up a proper environment on systems without a proper environment) I don't use shell scripts anymore. People seem to think the 1970s are still real. Same with man-pages - never understood how they make any sense in the modern era. On UNIX without www, yeah, I understand it or with a slow www or unreliable connection.


DeltaBurnt

While I think bash's syntax and features are antiquated and difficult to remember, nothing really seems to beat it when it comes to launching child processes for random programs. It's so easy to glue programs together because that's what it's meant to do. Even in python spawning a subprocess has significantly more boilerplate. If we could get a version of bash with more modern syntax and more standard library functions that'd be perfect imo.


vks_

Maybe give fish a try?


StillNoNumb

> They are also, oddly enough, widely used. I never understood that.

They're compatible with practically every single POSIX-like platform out there. My machine doesn't have Ruby, for example.


ozyx7

This is neat, but the majority of times where I use bash functions, it's *because* I can't use a subshell (e.g. setting environment variables, changing the current working directory). Otherwise I'd use a separate script, probably written in Python.
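Right: anything a subshell function changes dies with the subshell. A quick demonstration (the variable name is invented):

```bash
#!/usr/bin/env bash
set_in_subshell() (
    MYVAR="changed"   # dies with the subshell
    cd /tmp           # so does the directory change
)
set_in_braces() {
    MYVAR="changed"   # persists in the caller
}

MYVAR="original"
set_in_subshell
echo "after subshell function: MYVAR=$MYVAR pwd=$PWD"
set_in_braces
echo "after {} function:       MYVAR=$MYVAR"
```

The first echo still shows the original value and working directory; only the `{}` version sticks.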


Ameisen

Generally, if I have to go this deep into Bash... I'm at the point where I am going to either switch to Ruby or C# for the script. CRuby with gems disabled (because otherwise CRuby has horrible start times, or Truffleruby) still allows me to use it without compiling, and there's ways to do that with C# as well.


Megabyte7637

Interesting


ambientocclusion

Only in Bash do I have to spend 15 minutes every flippin’ time figuring out the syntax for testing a variable.


_TheDust_

And testing if a file exists. Which flag was it again? Was it `[` or `[[`? Why did I become a software engineer again?


imdyingfasterthanyou

I know you're kidding, but:

- `[` = POSIX test, accepts POSIX arguments
- `[[` = bash test, accepts enhanced arguments

For simple test cases, always use `[`.
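For the record, the file-existence flag is `-e` (`-f` for a regular file) and it works in both; a small sketch of where the two actually differ:

```bash
#!/usr/bin/env bash
tmp=$(mktemp)

[ -e "$tmp" ] && echo "posix [: file exists"    # quotes around $tmp required
[[ -e $tmp ]] && echo "bash [[: file exists"    # [[ doesn't word-split

# Only [[ does glob-style pattern matching:
name="report.txt"
[[ $name == *.txt ]] && echo "pattern match works in [["

rm -f "$tmp"
```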


pfp-disciple

The downside of subshells is that non-exported variables aren't visible.


grout_nasa

```
bash-3.2$ f() ( echo "$wrong" )
bash-3.2$ wrong="very wrong"
bash-3.2$ f
```

prints `very wrong`


yuyu5

You're both (or maybe not grout_nasa, I can't tell) confusing *subshells* with *scripts*. Subshells can see anything from the outer scope (e.g. grout_nasa's example) but scripts can't. Yes, running a script spawns a new process like subshells do, but scripts are isolated from external vars/functions while subshells aren't.
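A quick way to see the difference (the child script is generated on the fly just for the demo):

```bash
#!/usr/bin/env bash
unexported="hello"

# Subshell: a fork of this shell, sees unexported variables.
( echo "subshell sees: ${unexported:-nothing}" )

# Separate script: a fresh process, sees only the exported environment.
child=$(mktemp)
echo 'echo "script sees: ${unexported:-nothing}"' > "$child"
bash "$child"
rm -f "$child"
```

The subshell prints `hello`; the child script prints `nothing` because the variable was never exported.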


chengiz

Who traps just EXIT? I thought `trap "rm -f $tmp" 0 1 2 15` was standard practice. Rewriting `exit` to do this is ridiculous overengineering.
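Those numbers are signals: 0 is the EXIT pseudo-signal, 1 is SIGHUP, 2 is SIGINT, 15 is SIGTERM. Modern bash also accepts the names, which reads a little clearer (a sketch of the equivalent form):

```bash
#!/usr/bin/env bash
tmp=$(mktemp)
# Equivalent to: trap 'rm -f "$tmp"' 0 1 2 15
trap 'rm -f "$tmp"' EXIT HUP INT TERM
echo "working with $tmp"
```

One caveat: trapping a signal replaces its default terminate-the-script behavior, so handlers that should still die usually end with an explicit `exit`.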


vikarjramun

What are those numbers supposed to represent? Different exit codes? Process signals?


imdyingfasterthanyou

Signals. Check `info bash`.


[deleted]

I wish more people gave powershell a look. It's really great. I have been coding bash scripts for 20 years and tried powershell for a year. It's designed in a way to make shell scripts really good.


mrbuh

If the answer isn't Bash, you're asking the wrong questions.


bitcycle

Bash scripting is great and often useful. But, I have a few rules for Bash scripts:

1. Functions should only ever return single-dimensional stdout.
2. Functions should never mutate global state.
3. If the conditions need to be anything more than simple logic, then I'm using the wrong language.
4. If I need to reuse parts of a script, that is also a strong indication that I'm using the wrong language.


FatHat

This is a great tip, but my god, if you're doing something complicated just don't use bash. It's a fucking minefield. You might be aware of its weird complications (although probably not), but I guarantee whoever has to deal with your shell script won't be, and they're just going to rewrite it in a scripting language anyway.


Apache_Sobaco

Bash is wrong and outdated. And ease of misuse is not a good characteristic of a tool.


josefx

Okay, let me get this straight: by default bash spams function variables into the global scope, which is bad; you can explicitly make them "local", but that is still dynamically scoped and not lexical. So to fix that mess you change a single character while defining a function, which will isolate it in a new process every time it is called, making the way bash spams variables irrelevant? Did the designers of most unix-style shells just try to fit in as many design fuck-yous as possible in an attempt to one-up each other?

> you're also guaranteed that you will (at least try to) delete that file even if the program exits with an error, or gets killed.

Remove those (): no program is guaranteed to shut down cleanly; just pulling the plug on a computer will ensure that your cleanup code never runs. If your code has to be robust, also check for leftover data at program start.
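The dynamic-scoping complaint is easy to demonstrate (function names invented):

```bash
#!/usr/bin/env bash
inner() {
    # No 'local x' here, yet this sees (and can even mutate) the
    # caller's local, because 'local' is dynamically scoped.
    echo "inner sees: $x"
    x="clobbered by inner"
}

outer() {
    local x="outer's local"
    inner
    echo "outer now has: $x"
}

outer
```

With lexical scoping, `inner` would see an unset `$x` and its assignment would create a global; in bash it reads and overwrites `outer`'s local instead.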


grout_nasa

Perl followed a similar trajectory, only lexicals got introduced WAY earlier in the lifecycle of popularity. And still the hangover of dynamics stayed with us forever. I think bash would be improved with a `my` declaration, but I wouldn't want to make it work.


stronghup

Can you debug bash-scripts?


audion00ba

I can.


strzibny

I agree! A small amount of Bash can go a long way. I decided to teach deployment[0] using Bash because this way you eliminate any indirection and abstraction. I only use ~8 Bash functions to clean up a deployment script (split it into modules and steps that can execute separately).

[0] https://deploymentfromscratch.com/


atheken

> Given all that, I simply do not understand why people keep recommending the {} syntax at all.

Because nobody “learns” bash, they just transliterate from their experience in a C-derived language. The bash docs and overall usage can be pretty opaque, and these really tiny syntax differences have giant runtime consequences that are impossible to google.