I did it for an hour or so a night for a week or so.
That being said, a few of my personal favorites to memorize:
* Parameter expansion: https://www.gnu.org/software/bash/manual/html_node/Shell-Par...
* All of test(1) as you can use them in any if statement (/usr/bin/[ is a real command!): https://linux.die.net/man/1/test
* Knowing most of the bash internal variables: http://tldp.org/LDP/abs/html/internalvariables.html
* Keyboard shortcuts and how they are useful. A few examples: CTRL-l (no need to ever use /usr/bin/clear), CTRL-k, CTRL-u, CTRL-e, CTRL-a, CTRL-w, CTRL-arrow left, CTRL-arrow right, CTRL-r (history reverse search with find-as-you-type autocomplete)
The best way you can learn the shell is by using Linux as your primary desktop for at least a few months. You'll get very proficient very quickly by doing that.
One thing I have found that fewer people seem to know is that the Unix metacharacters are expanded by the shell (bash etc.), not by individual commands. What this implies is that any command, whether built in or written by you (in C, bash, Python or any other language), gets metacharacter support automatically; that is, things like the file globbing/wildcard characters *, ?, and [ ranges ].
This was not originally true on DOS (as a counterexample) and not sure whether it is true on Windows today (haven't checked), though I did notice that more commands seem to support wildcards in Windows nowadays.
Also, for some years now, Windows too has had redirections like:
command >file.txt 2>&1
(redirect stderr (2) to the same destination that stdout (1) is pointing to, i.e. file.txt), which Unix had from the start.
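One wrinkle worth knowing: the order of redirections matters, because each one duplicates a file descriptor at the point where it appears. A small demonstration, using throwaway file names:

```shell
# both streams end up in the file: stdout goes to out.txt first,
# then stderr is pointed at wherever stdout now points (the file)
sh -c 'echo out; echo err >&2' > out.txt 2>&1

# reversed order: stderr is duplicated while stdout still points at
# the terminal, so the file only receives "out" and "err" hits the screen
sh -c 'echo out; echo err >&2' 2>&1 > out2.txt

rm -f out.txt out2.txt
```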
keith@illy:~$ man bash | col -b | wc
5738 46385 323374
A slim novella's worth on Debian Sid's man page. Never thought of just reading the whole thing, so thanks.

[1] http://www.gnu.org/software/coreutils/manual/html_node/index...
Yes, and not only in an if statement. You can also use the test or [ command in commands of the form:
test condition && command2
or
test condition || command2
which run command2 only if the condition is true or false, respectively.
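For example (using a temp file so both branches get exercised):

```shell
f=$(mktemp)
test -f "$f" && echo "exists"          # runs echo only if the test succeeds
test -f "$f.missing" || echo "absent"  # runs echo only if the test fails

# [ is the same program; the closing ] is just its final argument
[ -f "$f" ] && echo "still a regular file"
rm -f "$f"
```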
Also one should take a look at rlwrap after becoming comfortable with the keyboard shortcuts.
Mind your pipes and quotes. Guard your variables with braces. Do not export everything. Test for and (try to) handle return codes and conditions, and keep it simple (emphasis: simple). But most of all, just write it.
BASH (or Bourne) is ubiquitous when dealing with systems (vs programs). You don't need to be on the fashionable lang of the day by any measure. BASH, for most cases, will always be there, always ready, and, in most cases, is the default human interface for deployed systems. As scripting languages go you don't need "better", you need dependability: zero dependencies, with no requirement for modules or any other whizbangwoohoo plug-in. Language Fashionistas and personal preferences aside, at least some level of fluency with BASH should be mandatory for anyone interfacing with a system.
You're going to get a lot of snark from people saying things like "don't", or "learn python instead".
This epitomizes "a little knowledge is a dangerous thing".
Bash has many cringeworthy aspects, but we have to deal with the world as it is, not the world as we would like it to be, and the reality is that bash is the default shell on 99.9% of unix boxes you encounter — even if you use an alt shell on your machine.
Coworkers machine? Bash. Default AWS AMI? Bash. init script to bootstrap $DAEMON? Bash. ssh to a server at your workplace? Bash. Random O'Reilly Linux tutorial? Assumes bash.
My advice?
Take some time.
Sit down.
and read "man bash"
cover-to-cover.
at least once.
A lot of the illogical things in bash make a lot more sense once you understand its parsing/expansion rules. And once you understand what is in the language vs an external terminal program in your PATH.
Since that sounds unappealing (and I scoffed at that very advice for many years), I've also found the wooledge bash guide to be very helpful.
Basically, every time I learn something useful about a command, I add it to its mann page and then whenever I need it in the future, I simply run 'mann <command>' to find it.
Here's the current output of my 'mann sed', for example:
# Add char to beginning of each line
sed 's/^/#/'
# Replace with newline
sed 's/<oldvalue>/\'$'\n''/g'
# Replace newline
sed -e ':a' -e 'N' -e '$!ba' -e 's/\n/<newvalue>/g'
# Plus sign
sed -E 's/foo+/bar/'
# Digit
sed -E 's/[[:digit:]]/bar/'
# Inplace
# Inplace
sed -i'.bak' -e <pattern> <file>

Bash is essentially a DSL for these. A lot of the weirdness you see in the language is due to these abstractions leaking through. For example:
* Quoting is building execve's argv parameter. It's hard to quote correctly if you don't know what exactly you're working towards.
* Redirections are opening and copying file descriptors. It explains their scope, order and nesting behavior.
* Variables are modifying and passing the environment, and their weird scope is due to forks imposed by the process model.
Once you know how you can do whatever you want in C through the basic syscalls, Bash is an extremely efficient and far less surprising shortcut to do it.
can you expand on this? does this elucidate, e.g., variable substitution and the difference between single- and double-quotes? or does it just help demonstrate when you need quotes for an argument that may contain whitespace?
For example, all the various wrong ways of quoting var="My File.txt" or otherwise incorrectly using such a name will result in variations on a wrong argument list:
execlp("cat", "$var", NULL); // cat '$var'
execlp("cat", "My", "File.txt", NULL); // cat $var
execlp("cat My File.txt", NULL); // cmd="cat $var"; "$cmd"
execlp("cat", "'My", "File.txt'", NULL);// cmd="cat '$var'"; $cmd
execlp("cat", "My\\", "File.txt", NULL);// cmd="cat My\ File.txt"; $cmd
execlp("cat", "'My File.txt'", NULL); // var="'$var'"; cat "$var"
execlp("cat", "\"$var\"", NULL); // arg='"$var"'; cat $arg
Meanwhile, all the correct ones result in the same, correct argv:
execlp("cat", "My File.txt", NULL); // cat "$var"
execlp("cat", "My File.txt", NULL); // cat 'My File.txt'
execlp("cat", "My File.txt", NULL); // cat "My File.txt"
execlp("cat", "My File.txt", NULL); // cat My\ File.txt
execlp("cat", "My File.txt", NULL); // cmd=("cat" "$var"); "${cmd[@]}"
execlp("cat", "My File.txt", NULL); // arg="'My File.txt'"; eval "cat $arg"
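If it helps, you can inspect the argv that a given quoting style produces from the shell itself, since printf repeats its format once per argument (show_argv is just a name I made up):

```shell
show_argv() { printf '<%s>\n' "$@"; }   # one bracketed line per argument

var="My File.txt"
show_argv $var      # two arguments: <My> then <File.txt>
show_argv "$var"    # one argument:  <My File.txt>
show_argv '$var'    # one argument, unexpanded: <$var>
```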
If you don't know which argument list you're aiming for, you basically have to go by guesswork and superstitions.

Do you recommend a book or something like that? And preferably for a beginner?
Love - Linux System Programming
The Unix-Haters Handbook
(Unfortunately, the best resource ever for this kind of stuff, from which I learned, does not have an English translation.)
It will highlight common mistakes, and their wiki explains each one in detail and shows an alternate, better implementation you should use instead.
1. If you are not running a unix as your default OS switch to one (ie Linux or Mac).
2. Create a bin directory (~/bin) in your home directory for all your shell scripts and put it under source control. Any script you ever write goes in that directory, even if it's not bash (e.g. python, perl). I find that being able to look at how you did things previously is critical for learning, and it saves time.
3. Any complicated one-liner that you create or see on the internet: turn it into a script and put it in the bin directory mentioned above.
4. Optimize your personal bin directory and review frequently.
5. If you run Linux build your system from scratch (ie read Linux from scratch).
6. Bonus to the above: Automate the creation of your personal system through Packer and Bash!
7. Find where things are not automated.
8. Bash is more than just the shell. A knowledge of GNU coreutils as well as tmux/screen is worthwhile and highly recommended.
9. Learn the readline shortcuts. Particularly "ctrl-r".
I've not been able to find anything good on setting up a dev machine image with packer.
Anything that is not simple in bash gets hard to read and debug and probably is wrong on some subtle levels.
I have a rule of thumb that any shell script that grows beyond a screenful of lines gets redone in a proper scripting language.
It's not a full-fledged programming language by any stretch of the imagination (lacking structures more complex than associative arrays), but it's damn good for scripts of all sorts.
As an example, I've reimplemented a subset of Ansible (a command able to send "modules" on multiple machines via SSH and capturing+caching their output for subsequent queries) in ~150 lines of Bash. Considering that the size of Ansible, written in the more proper Python, is ~15000 LOC, I'd say Python is the much lesser scripting language.
Edit: to answer the OP's question, the documentation I've found most helpful to learn Bash is the one on the Linux Documentation Project, with the page for arrays deserving special mention: http://tldp.org/LDP/abs/html/arrays.html. I spent a lot of time reading the manual before stumbling upon that documentation, and none of it really clicked until I had a few examples before my eyes.
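For reference, the core array operations that page walks through fit in a few lines (a minimal sketch; the file names are made up):

```shell
files=("one.txt" "two words.txt" "three.txt")   # array literal

echo "${#files[@]}"          # element count: 3
echo "${files[1]}"           # second element (arrays are 0-indexed)

for f in "${files[@]}"; do   # quoted [@] expands one word per element
    echo "got: $f"
done
```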
Writing efficient bash code forces you to think about your problem differently. It's a very similar process to thinking functionally; e.g. you don't want to deal with lines of a file one at a time in a loop, you want to do filters and maps in languages like grep, sed and awk to deal with data in a streaming fashion with a minimum of forked processes.
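For example, instead of a while-read loop, a filter/map pipeline (the data here is made up) touches each line exactly once and forks a fixed number of processes regardless of input size:

```shell
printf '%s\n' 'alice 42' 'bob 17' 'carol 99' |
    grep -v '^bob'           |   # filter: drop unwanted lines
    awk '{print $2 * 2, $1}' |   # map: transform each line
    sort -rn                 |   # order by the computed value
    head -n 1                    # take the top result
```

which prints `198 carol`.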
I would love something like that.
Also, even if you manage to become better in Bash, you are bound to lose your skills at some point when you have been programming in other languages for a while.
I always have to look up how to do even basic things in Bash. I just don't use it often enough for these things to "stick".
Since when is simplicity an argument against writing programs? Whether scripts or frameworks? "Hard to read" is not necessarily an inherent trait[1] of the language, and more likely wrong on some PEBKAC level.
I have a customised environment at near 10k lines of bash in 5 projects, all of it in the correct tool for the job, aka a proper scripting language, so I can suggest another use for your thumb :-)
1: https://www.reddit.com/r/commandline/comments/2kq8oa/the_mos...
https://github.com/EtiennePerot/parcimonie.sh/blob/48044f913...
http://www.tldp.org/LDP/Bash-Beginners-Guide/html/
EDIT : Additional links -
Advanced - http://tldp.org/LDP/abs/html/
Bash programming - http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO.html
I use bash/shell scripts frequently and have many running 'in production' as cron jobs that run various jobs or manipulate data for other jobs to run on.
One thing I really like about pure shell is that it's extremely portable and transparent about what it's doing.
I still have to re-learn control structures almost every time I write a new script. I don't try to memorize [[ ]] vs [ ] and all the weird ways equality can work; I just google each time and the answers are always on top (once you know what you're looking for).
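For what it's worth, the main practical difference is small enough to memorize: [[ ]] is bash syntax (no word splitting inside, and == does glob matching), while [ is an ordinary command that needs its arguments quoted. A sketch:

```shell
var="two words"
[[ $var == "two words" ]] && echo "no quotes needed inside [[ ]]"
[ "$var" = "two words" ] && echo "[ needs the quotes"

# [[ also pattern-matches against an unquoted right-hand side
[[ "file.txt" == *.txt ]] && echo "glob match"
```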
set files [glob /etc/*.conf]
foreach f $files {file lstat $f file_info; puts "$f: $file_info(size)"}
/etc/asl.conf: 1051
/etc/autofs.conf: 1935
/etc/dnsextd.conf: 2378
... and so forth ...
Also there is an `exec` command that supports pipes, redirections, and so forth:
set result [exec cat /etc/passwd | grep Directory]
The pipe has no special meaning in Tcl, but because of its DSL capabilities you can do things like that. Exec is basically a DSL.

I invite you to read about it: http://doc.cat-v.org/plan_9/4th_edition/papers/rc.
I find the control structures simpler and more elegant, and overall its design feels more consistent. For example, consider an if statement in bash:
if [ condition ]; then
...
else
...
fi
And now in rc:
if (condition) {
...
} else {
...
}
Or a case statement in bash:
case $1 in
"bar")
... ;;
"baz")
... ;;
esac
And expressed in rc:
switch ($1) {
case "bar"
...
case "baz"
...
}
In the past, I've used it as my shell too, but now I use it only for scripting. I think you can install it on most platforms.

https://web.archive.org/web/20161227222637/http://blog.extra...
From there, move on to using the shell as your IDE. How? First, understand the Unix philosophy. I think Ted Dzubia describes this pretty well in his Taco Bell Programming blog posting:
http://widgetsandshit.com/teddziuba/2010/10/taco-bell-progra...
Great, so now you understand that there are a bunch of useful tools out there and you can string them together to do great things. Now you need to discover the tools themselves.
If you're a "read the dictionary" kind of person, go ahead and start off w/ the Gnu Coreutils documentation. https://www.gnu.org/doc/doc.html
However, if you're like me you'll learn fastest by watching other people work. In this case, I have to point back to Gary Bernhardt again. Specifically, his "Composing a Unix Command Line" screencast will open your eyes wide and very quickly introduce you to a range of incredibly useful coreutils programs in the context of solving a very specific problem. This content is $29/mo, but I'd argue it's money well spent. https://www.destroyallsoftware.com/screencasts/catalog/compo...
I learned to use shell, about 7 years ago, by reading O'Reilly "Classic Shell Scripting". It is well written, and teach you something that you can hardly learn from google. But don't try to remember everything, especially those advanced string manipulation syntax, because one would usually use a scripting language such as ruby for advanced job.
I can second this recommendation. :)
I'd say learn the following topics:
pipe grep sed awk find
Once you feel comfortable using and combining these tools you should be able to find out the rest by yourself.
The Bash syntax is not daunting, or if it is, that's never been the problem with Bash. The problem is that Bash, or shell programming in general, gives you a million ways to shoot yourself in the foot and maybe one or two obscure, funny-looking ways to do what you want. Like iterating over the files in a directory, for example. If you think that's easy in Bash, you either have a funny definition of easy or you're forgetting a few corner cases. Or getting the output from a program: did you know that $(my_prog) or `my_prog` modifies the program output (trailing newlines are stripped)?
For containers we do everything declaratively.
Maybe I'm overlooking something...but why wouldn't this work:
for file in $(ls); do {<looped command>}; done
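It breaks as soon as a name contains whitespace: the output of $(ls) is word-split before the loop runs, so "My File.txt" arrives as two items. A glob hands each name over as a single word. A sketch using a throwaway directory:

```shell
mkdir -p demo && touch demo/"My File.txt" demo/plain.txt

for file in $(ls demo); do echo "split: $file"; done   # 3 iterations
for file in demo/*;     do echo "glob:  $file"; done   # 2 iterations

rm -r demo
```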
Get a very brief reference book of every common UNIX command. Read all the commands, what they do, what options they take. Start using them.
Shells are most useful when they are used to tie together other programs. In order to do this, you have to know what all the command-line tools you have at your disposal are. Learn the tools, then start writing examples using them. Keep the examples and the docs somewhere to reference them later.
For quick reference, the command 'whatis' will give a blurb from the top of the command's man page. `whatis ls' "ls (1) - list directory contents". View many at once with "(cd /usr/bin; whatis * | grep -v noth)". Many often-used commands come in "util-linux" and "coreutils" packages. Read man pages completely when convenient.
It may also help to have a VM or desktop which has no GUI, where you will be forced to use the command line. When I was starting out I used a desktop with no X server for months. You can get a lot more done than you think (and 'links -g' provides a graphical browser if you need images).
To learn more about Bash itself, you can look for server installation software packages made with Bash, or in the "init" tools distributed with big distros like RedHat, SuSE, etc before they used systemd. But it's better to get used to more shell-agnostic scripting using UNIX commands than it is to use Shell-specific language/syntax.
Without X presumably needs framebuffer or something?
echo "345.44
544.50" | rg "€#{l}: €#{(l.to_f * 3).to_i}"
Produces the following output:
€345.44: €1036
€544.50: €1633
Based on this code:
https://gist.github.com/zachaysan/4a31386f944ed31a3f8a920c85...
I find it's much faster to be productive like this than it is to try to do the same with ruby -e because I really only want to manipulate single incoming lines. I don't want to have to write the looping code or the code that sets variables or what have you.
Also, sometimes it gets confusing which tools are just bash functions or aliases and which are scripts, so if you ever forget what a tool's definition is, just type:
type toolname
As for actually answering your question, look at your friend's dotfiles on their github account to learn which tools and tricks they use and when you don't know how something works ask them questions. People will usually point you in the right direction.
Aliases are something I use a lot - it's very basic but just having "big long command with options" aliases to something easy to remember makes it much more likely I will not make mistakes, can repeat it easily in loops.
Another thing that complements using bash effectively are using other applications config files. As soon as I have a new host to interact with I add it to my ssh.config file - then any scripting I need to do I don't need to deal with any special files. Other files like ~/.netrc or ~/.pgpass make my shell sessions that much more productive. For some reason many people don't bother ever updating these and rely on the shell history to do anything more than once.
CommandlineFu (http://www.commandlinefu.com/commands/browse) has some nice one liners and there's often some gems on ServerFault (https://serverfault.com/questions/tagged/bash) - just browsing those for topics matching your workflow can be very fruitful.
I've found the more I do anything at the shell of any complexity I end up writing a small python command line client to do any heavy lifting. Argparse (https://docs.python.org/3/library/argparse.html) makes this trivially easy and then I can use regular shell "glue" to combine a few commands together.
FOO="Hello World"
...
BAR="$(echo "$FOO" | sed "s/World/Hacker News/")"
until I remembered that bash can do string replacement by itself. A quick search for "pattern" and "substitute" in the manpage turned up the right syntax:
BAR="${FOO/World/Hacker News}"

It runs about 70-80 pages, so counts as a small book.
Hrm. Make that 107 pages:
man bash | pr | grep 'Page [0-9][0-9]*' | tail -1
46,000 words.

http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3...
The POSIX specification of the Shell Command Language.
Also, don't overlook that there is a GNU Info manual for Bash, not just the manual page:
https://www.gnu.org/software/bash/manual/html_node/index.htm...
> Add, "#! /usr/bin/python" to the top of your scripts, it will make your life easier.
However, after reading the rest of the thread, it seems Python and similar langs are not actually great for the kind of things people use Bash for, and Perl is the way to go!
Great, another language to learn...
edit, re: python:
fiatjaf suggested xon.sh:
"shell language and command prompt [..] based on Python, with additional syntax added that makes calling subprocess commands, manipulating the environment, and dealing with the file system easy."
If your use case is pragmatic in nature, I would recommend my post on the topic: http://alexpetralia.com/posts/2017/6/26/learning-linux-bash-...
"\e[A": history-search-backward
"\e[B": history-search-forward
in your ~/.inputrc. So, if you are typing a command which begins with "git", it will only search history for commands that start with git (instead of returning all commands that merely include the string 'git', like Ctrl+r does). Having trouble trying to remember that option you passed to `git log`? Just type `git log` and press the up arrow to find your last usages.

I think it is also helpful to learn Emacs or vim keybindings. I use Emacs keybindings in bash a lot (enabled by default). I have summarized the ones that I use more often in a previous comment [2].
[1]: https://www.ukuug.org/events/linux2003/papers/bash_tips/
Anyway, here's a few steps that I would recommend:
1. Go through http://tldp.org/LDP/abs/html/ and http://www.tldp.org/LDP/Bash-Beginners-Guide/html/ , or at least go through the table of contents so that you have a feeling of what bash is capable of. A few important things are: if, while, for, switch, functions, string manipulation, pipe, subshell, command substitution
2. Understand the execution model. Variables in subshell cannot be accessed from the parent shell, this is a common mistake
3. Learn to avoid common pitfalls. I always recommend my colleagues to always quote the variables in double quote, always use "$@" instead of "$*", always use double square bracket instead of single square bracket for testing, use echo to pass return value from functions instead of assigning to global variable
4. Learn awk, sed, grep. Bash can be quite limiting when it comes to data processing and these tools can be quite powerful. You can use bash to glue different filters together at a higher level.
Bash is a fantastic language and there are quite a lot of things that can be done much more quickly in bash than in other "proper" languages. A lot of people say that it's too difficult to maintain a shell script beyond a "critical mass", but I believe that if you follow good practices and write modular code, shell scripts can be very manageable.
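On point 3 above, the "$@" vs "$*" difference is easy to see with a tiny counting function (the names are mine):

```shell
count_args() { echo $#; }

pass_along() {
    count_args "$@"   # forwards arguments individually   -> 2
    count_args "$*"   # joins them into a single word     -> 1
    count_args $*     # unquoted: re-split on whitespace  -> 4
}
pass_along "first arg" "second arg"
```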
It's what worked for me, though. There are also some workflow ideas that have really helped a lot. Autocompletion and being able to look through your history for commands is super helpful too.
``` cat $HOME/.bash_history | grep -E 'command|argument' ```
https://github.com/zsh-users/zsh-autosuggestions
I just finished a guide on my site about my terminal setup. I hope to read yours once you've customized the pixels out of it.
Aside from things to get your interested in the internals of your shell via bash scripting, you should also consider writing more shell scripts specifically around your workflows. I keep mine in a .files repo on GitHub. Take a look at the install script. It took me over a year to get really fluent in bash scripting enough to make it possible to get better and better at it.
Good luck on your journey!
Also this one to learn some cool tricks:
That and shellcheck(+syntastic if you use vim) ramps up the skill level quite fast.
Here's my study suggestion:
0. Learn to use variable interpolation and backticks.
1. if blocks and the [ built-in function. Go read about the grammar and look at the flags that [ takes. Memorize the most common couple (file exists, is a directory), and know how to look up the others when needed. Find examples of variable interpolation tricks needed to make this function.
2. for and while blocks. Learn the grammar. for is mostly useful with `seq ...` or a file glob.
3. Learn some of the options to make bash fail early and loudly like pipefail.
4. Most of the power of bash is in the programs you call, and they aren't always the same ones you use interactively. Other folks have mentioned some of these. find, xargs, wait...
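For step 3, a quick sketch of what pipefail actually changes (run in bash -c subshells so the failing pipeline doesn't take down the enclosing shell):

```shell
# without pipefail, a pipeline's exit status is the LAST command's:
bash -c 'false | true'; echo "status: $?"                    # status: 0

# with pipefail, any failing stage fails the whole pipeline:
bash -c 'set -o pipefail; false | true'; echo "status: $?"   # status: 1

# the usual strict-mode preamble for scripts combines it with -e and -u:
#   set -euo pipefail
```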
Just because you don't feel comfortable writing long scripts doesn't mean you should discourage others.
There are many many justifications for sticking to shell, for example if you need to write a portable installer that works across every UNIX variation.
And deprive them of the hard learned lessons from decades of experience?
> There are many many justifications for sticking to shell, for example if you need to write a portable installer that works across every UNIX variation.
Do you mean every Linux variation? Because trying to write shell portable from old HP-UX to Darwin is an exercise in insanity.
I used to rely on fish, but after a couple of bugs (either in fish or my fingers, not sure) I switched back to bash at my job (on a Linux desktop).
After a few months I had built up a good set of aliases and functions (my most used function is rgrep, see below) and was confidently ^R reverse searching and so on. These things are great because as you jump systems (e.g. to macOS) they continue to work.
TLDR: Practise practise practise!
# the rgrep function
# recursive text file search from the current directory.
function rgrep {
if [ -z "$1" ];
then
echo "please supply a search string."
return 1
fi
grep -rn "$1" .
}

https://github.com/ggreer/the_silver_searcher
https://github.com/BurntSushi/ripgrep

In any programming language, you learn by practice. Given that your shell does so much, that's the easiest place to find tasks to practice on. I have been leaning on my shell scripts to do a lot of automation. The list is long and I just pick something from that list to work on most days.
If you don't have system automation that you want to work on, then you probably have a lot of personal data that you can work with. I have scripts setup to manipulate data exports from the various services I consume and then remix that data in my own database. My shell scripts can get the data, operate on it and then shove it into a DB. Then I'll use something else to display that data.
I like bash for the same reason I like emacs, in that no matter what the environment is like, I can usually count on my bash scripts to work. I keep them in emacs org-mode files, where I store them in src code blocks. I can tangle multiple code blocks into single executable scripts in different directories. Check out org-mode babel, tangling, and noweb. Keeping all my bash code in a single file solves my issue of having to dig for that one script I wrote that one time because I forgot how to do this one thing ...
If you aren't running Linux on your desktop yet, consider it. Full immersion is a fast way to learn.
What Kai Thinks Every Developer Should Know About the Shell
2. Conceive of use-cases you can't already solve, and see if you can find a way to do them using Bash.
3. Consider that perhaps Bash isn't the best tool for every job. (It most certainly isn't, though you can abuse it frightfully.)
4. Books. Jerry Peek's guides are getting rather dated, but they're still a good introduction.
5. Read the manpage. Frequently. Find some part of it that doesn't make sense, or that you haven't played with before, and play with it. Shell substitutions, readline editing, parameter substitution, shell functions, math, list expansions, loops, tests, are all high-payoff areas.
6. Take a hard look at zsh, which does a great deal Bash doesn't.
It will make you search for several special use cases and will give you some experience with the command line. Basically, you SSH into a box, solve a problem and the solution for that problem is the password for the next SSH connection for the next problem.
This one is for beginners: http://overthewire.org/wargames/bandit/
I recently released https://terminal.training (a paid, 4-hour course) which is just for this kind of question, but I've also started a free mini email course (same URL) that tries to share some of the CLI shortcuts I've come to rely on over the years.
It covers basic Bash commands (head, less, grep, cut, sort, uniq, curl, awk, join), but also pipes, for loops, variables, arrays, and command substitution.
If you're not already comfortable with input/output redirection (including pipes, as well as reading from / writing to files via <file / >file, respectively), then that's where I'd start.
Everything will take longer but imho it's the only way to get better.
Here is the path for learning bash : https://learn-anything.xyz/operating-systems/unix/shells/bas...
Korn shell is a much more complete and capable variant of Bourne. BASH partially implemented many Korn features, but not everything.
The standard reference is the Korn and Bolsky book (2nd edition). I'm not aware of any free/online resources that are profoundly good.
Korn is the very best for scripting.
export PROMPT_COMMAND='echo "$(history 1)" >> $HOME/.basheternalhistory'
Now you can search it later for arcane commands you've forgotten how to use.

That worked for me. You could also read the man pages, but step #1 is crucial regardless.
https://web.stanford.edu/class/cs124/kwc-unix-for-poets.pdf
(text analysis in bash)
Vivek (founder) has been writing these tutorials for 17+ years, he knows his stuff.
There is also this great resource: http://wiki.bash-hackers.org
If there is one thing I wish I had understood sooner, that would be it.
Top to bottom programmatic flow is one thing. Conditional execution and branching are on another level.
If you need to automate something use your favorite programming language.
larger point: how do you learn more about X? or get better at doing X? figure that general pattern out and you can re-apply it for anything, not just bash.
I used to force myself to do all of my ad-hoc scripting in bash, but I got sick of the clunky way of parsing arguments, dealing with arrays, looping over data objects, etc.
I got pretty good at it, but at some point I decided just to stick to a language I knew well (R) to string together various pipelines and construct commands. Any high-level language would work, though. I'm much more productive now.
learn from this wiki that has many tutorials and good examples http://wiki.bash-hackers.org/bash4
Using ${EDITOR} to build command lines is awesome. 'nuf said.
First, there is moving around in bash - the arrow keys, or backspace/delete to remove a character, or ^A to go to line start, or ^R to search command history, or tab to complete a command. ^L clears the screen, although from habit I still type clear.
I use the shell/bash builtins cd and pwd often enough. Sometimes export, umask, exit, ulimit -a, echo.
I use shell variables like PS1, HOME, and PATH. I set them in $HOME/.bashrc, which sometimes references files like $HOME/.bash_aliases. I often set a larger than default history file size. I use ~ tilde expansion as an abbreviation for $HOME. For long commands I type regularly, I put an alias in the run control (or run control delegated) file.
I use job control commands like bg, fg, jobs and kill. You should know how bash job control complements and diverges from the system process commands. & starts a process as a background process, and preceding it with nohup tells it to ignore hangup signals.
You should know how single quotes work, and escape characters for them if they are needed.
Then there are pipes (| - "pipelines"), and redirecting of stdin, stdout, and stderr. I use this a lot. Also redirecting or appending output to a file (>, >>). I don't use tee often but sometimes do.
Then there are commands used with the shell a lot. Such as parallel, or xargs.
Also nice which modifies process scheduling.
The script command keeps a log of your shell session (in a file named typescript by default).
Screen allows for multiple shell sessions. Useful on remote hosts especially (tmux is an alternative).
Then there are the standard file and directory commands I often use like pwd, cd, ls, mv, cp, df, chmod, du, file, find, locate, mkdir, touch, rm, which, and wc.
I manipulate these with commands like awk, sed, tr, grep, egrep, cat, head, tail, diff, and less.
I edit with vim or emacs -nw.
Command like htop, ps, w, uptime and kill let me deal with system processes.
Then there are just handy commands like bc or cal if I need to do some simple addition or see which day of the week the first of the month is.
Man shows you manual pages for various commands. "man command" will show the manual page. For a command like kill, the default will show the command kill - "man kill" which specifically is "man 1 kill". But "man 2 kill" would show the kill system call. You can see what these different manual sections are with "man man" - 1 is executable programs, 2 is system calls etc.
All of it is a process. I mentioned awk. It is one of the commands handy to use with the shell. I have seen entire programs written in awk. Some parts of awk I can use from memory, some I use occasionally and have to look up the flags to refresh my memory, some parts I have never used at all. As time goes on you pick up more and more as you need it.
Knowing how to reverse search (Ctrl-R) and run last command !vim or !curl to rerun last instance of vim or curl command with args so you don't have to search every time.
m, a Unix shell utility to save cleaned-up man pages as text:
https://jugad2.blogspot.in/2017/03/m-unix-shell-utility-to-s...
I've been using it from earlier Unix versions, where these formatted man pages (nroff/troff-formatted) were more of an issue. Also works if you want to open the text form of the man page in vi or vim, for reading, searching, etc.
Direct your focus to plain Bourne sh as much as possible, moving on only after you understand what enhancements over vanilla sh Korn or Bourne-Again actually offer.
Pick up Manis, Schaffer, and Jorgensen's "UNIX Relational Database Management" and work through the examples to get a feel for the philosophy behind large, complex applications written in shell.
Join a Unix community (sdf.org) and try to do useful things with shell (e.g. cron-schedule stock quote e-mail notifications via shell scripts, etc).