%(fmt)T - output the date-time string resulting from using FMT as a format string for strftime(3)
The man page provides a bit more detail: %(datefmt)T causes printf to output the date-time string resulting from using datefmt as a format string for strftime(3). The corresponding argument is an integer
representing the number of seconds since the epoch. Two special argument values may be used: -1 represents the current time, and -2 represents the time
the shell was invoked. If no argument is specified, conversion behaves as if -1 had been given. This is an exception to the usual printf behavior.
With that, timestamp=$(date +'%y.%m.%d %H:%M:%S')
becomes printf -v timestamp '%(%y.%m.%d %H:%M:%S)T' -1

Agreed WRT shellcheck.
I use -e sometimes but I really dislike scripts that rely on it for all error handling instead of handling errors and logging them.
^^ this tool has proven very useful for avoiding some of the most silly mistakes and making my scripts better. If you're maintaining scripts with other people then it is a great way of getting people to fix things without directly criticising them.
#!/bin/bash
die() {
  local frame=0
  while caller $frame; do
    ((++frame))
  done
  echo "$*"
  exit 1
}
f1() { die "*** an error occurred ***"; }
f2() { f1; }
f3() { f2; }
f3

Output:

12 f1 ./callertest.sh
13 f2 ./callertest.sh
14 f3 ./callertest.sh
16 main ./callertest.sh
*** an error occurred ***
Via: https://bash-hackers.gabe565.com/commands/builtin/caller/

That's throwing the baby out with the bathwater. Instead, default the optional global variables with something like:
"${GLOBAL_VAR:-}"
That will satisfy the optionality of the variable whilst keeping the check for the cases you actually want them.

I suppose what is really tripping people up is that bash can show up on all kinds of runtimes, some of which have the external tools one might need (jq, logger, etc.) and some of which don't. So then you go searching for a minimum standard that can be expected to be present. Maybe POSIX or GNU coreutils. Reminds me of the shell horrors of the late 1990s, where every script had to figure out if sh was really ksh and what variant of UNIX it was running on, and therefore what commands and options were available. I swear this was one of the great things about Perl when it came along: it just worked.
In 2025, I kind of see the attraction of single binaries like Go produces. Ship the binary and be done. It is very un-UNIX I suppose (not so much golfing as having the beer cart drive you to the hole), but then again it's not 1985 any more.
Every time I write a shell script that grows to more than about 20 lines I curse myself for not having written it in Python. The longer I have waited before throwing it away and redoing it, the more I curse.
This article says nothing to change my mind. I could build logging and stack traces in Bash. I admire the author's ingenuity. But again, why?
I can’t even tell how many times I’ve seen multi-line Python scripts which could instead have been a shell one-liner. Shorter and faster.
I have also written shell scripts with hundreds of lines, used by thousands of people, which work just fine and would be more complicated and slower in other languages.
I firmly disagree with the all too pervasive blanket statement of “there are better languages”. It depends. It always does.
If you have a standard-ish environment, you'll have an array of Unix tools to compose together, which is what a shell is best at. Even a minimal image like busybox will have enough to do serious work. Golfing in shell can be a pipeline of tools: lately "curl | jq | awk" does a lot of lifting for me in a one-liner.
As soon as you say "switch to (favorite decent scripting environment)", you're committing to (a) many megs of its base install, (b) its package management system, (c) whatever domain packages you need for $work, and (d) all the attendant dependency hells that brings along. Golfing in a scripting environment is composing a bunch of builtin operations.
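A minimal offline sketch of that composition style (the log lines are fabricated, and printf stands in for whatever produces the data, so this runs anywhere with coreutils):

```shell
# Count requests per status code by composing small tools
# instead of writing a program: extract, sort, count, rank.
printf '%s\n' \
  'GET /a 200' 'GET /b 404' 'GET /a 200' 'GET /c 500' \
| awk '{print $3}' \
| sort | uniq -c | sort -rn
# →       2 200
#         1 500
#         1 404
```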
Yes it's a tradeoff. Every line of code is a liability. Powershell or python are probably "slower" which in my use case is negligible and almost never relevant. On the other hand, I can't help but view the often esoteric and obscurely clever bash mechanisms as debt.
Thanks for fzf, by the way. Always one of the first things I install in a new environment.
That being said, as a guy who does not have big prominent OSS tools under his belt, I am slowly but surely migrating away from shell scripts and changing them to short Golang programs. Already saved my sanity a few times.
Nothing against the first cohort of people who had to make computers work; they are heroes. But at one point the old things only impede and slow everyone else and it's time to move on.
This. Bash gives you all the tools to dig a hole and none to climb out. It's quick and easy to copy commands from your terminal to a file, and it beats not saving them at all.
Support for digging: once you have a shell script, adding one more line conditioned on some env var is more pragmatic than rewriting the script in another language. Apply mathematical induction to grow the script to 1000 lines. Split into multiple files when one becomes too large and repeat.
Missing support for climbing out: janky functions, no modules, user types, or tests; no debugger and no standard library. I've successfully refactored messy python code in the past, but with bash I've had no idea where to even start.
There is hope that LLMs can be used to convert shell scripts to other languages, because they can make the jump that experienced devs have learned to avoid: rewriting from scratch. What else do you do when refactoring in small steps is not feasible?
(a) instead of writing a shell script to operate a shell-operated tool, write a python script with a bunch of os.system('shell out') commands.
(b) instead of just invoking ffmpeg to do the things you want done, install an ffmpeg development library, and call the functions that ffmpeg itself calls to do those things.
What would be the argument for either of those?
There were some languages shown on HN that compile to sh/bash (like oilshell[0]). I would think that's also a viable avenue of attack, but I'm not sure how well it works in practice, i.e. maintainers might have moved on for various reasons.
Ish. You can source whatever files you want, so if you split up your functions into logical directories / files, you can get modules (-ish).
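A tiny sketch of that modules-ish pattern (the directory layout and `log_info` function are made up; here the "module" file is generated on the fly so the sketch is self-contained):

```shell
#!/bin/bash
# In a real project this would be a checked-in file like lib/log.sh
libdir=$(mktemp -d)
cat > "$libdir/log.sh" <<'EOF'
log_info() { printf 'INFO: %s\n' "$*"; }
EOF

# "Import" the module; its functions land in the current shell
. "$libdir/log.sh"
log_info "modules-ish"   # prints: INFO: modules-ish
```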
> no tests
BATS [0].
[0]: https://github.com/bats-core/bats-core
> I've successfully refactored messy python code in the past, but with bash I've had no idea where to even start.
I say this with all kindness: you probably need to know more bash before you can safely refactor it. It is a very pointy and unforgiving language.
This is most common in Debian and Ubuntu, where dash is /bin/sh, and /bin/bash does not run in POSIX mode by default.
Some behavior of legacy bash of the '80s, prior to POSIX.2, can be surprising.
[1] https://www.reddit.com/r/commandline/comments/g1vsxk/the_fir...
What? If globals are set outside the scripts, -u still works. If the author means they may or may not be defined outside the script, the ${VAR:-} construct allows it to expand to nothing if unset (just throw VAR=${VAR:-} at the top if you don't want to edit the body)
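A small sketch of that interaction (the variable names are invented): with set -u, a bare expansion of an unset variable aborts the script, but the `:-` / `:=` forms are safe.

```shell
#!/bin/bash
set -u
# VAR may or may not be set by the caller's environment;
# :=  assigns a default only when it is unset
: "${VAR:=default}"
echo "VAR is: $VAR"

# :- expands to nothing when unset, without tripping -u
echo "UNSET_VAR is: ${UNSET_VAR:-}"
# echo "$UNSET_VAR"   # would abort: "UNSET_VAR: unbound variable"
```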
Also, I do not like the function return based on error code:

function ... {
  ...
  (( check_level >= current_level ))
}

Unless I'm reading this wrong, this is a bad idea if using set -e. This is a function and it should instead: return $(( check_level < current_level ))

For actually _testing_ the scripts or their functions, I recommend ShellSpec.