Curly brace expansion:
$ mkdir -p new_project/{img,js,css}
mkdir -p new_project/img new_project/js new_project/css
$ mv some_file.txt{,.old}
mv some_file.txt some_file.txt.old
Caret substitution:

# systemctl status mysql.service
-- snip output --
# ^status^restart
systemctl restart mysql.service
Global substitution (and history shortcut):

$ echo "We're all mad here. I'm mad. You're mad."
We're all mad here. I'm mad. You're mad.
$ !!:gs/mad/HN/
We're all HN here. I'm HN. You're HN.
I have a (WIP) ebook with more such tricks on GitHub if anyone is interested:
https://tenebrousedge.github.io/shell_guide/

g++ -o foo{,.cpp}
~ cat test.cpp
#include <iostream>
using namespace std;
int main()
{
cout << "hi\n";
return 0;
}
~ stat Makefile
stat: cannot stat 'Makefile': No such file or directory
~ make test
g++ test.cpp -o test
~ ./test
hi

Practical example: when you are doing "make oldconfig" on the kernel, and you don't care about all those questions:
yes "" | make oldconfig
Or, if you prefer answering no instead:
yes "n" | yourcommand
Also, the author refers to watch as a "supervisor" ("supervise command" - his words). That is bad terminology. Process supervision has well defined meaning in this context, and watch isn't even close to doing it.
Examples of actual supervisors are supervisord, runit, monit, s6, and of course systemd (which also does service management and, er, too much other stuff, honestly).
"Examples of actual supervisors are..."

daemontools (1999-2001): http://cr.yp.to/daemontools/supervise.html

runit and s6 are copies of daemontools.
Otherwise I liked your comment about use of yes.
ssh -oStrictHostKeyChecking=no 192.168.0.100
...or by adding rules in /etc/ssh/ssh_config or ~/.ssh/config:

Host 192.168.0.*
    StrictHostKeyChecking no

find . -exec echo {} \; # One file per line
You don't need to execute echo to do that as find will output by default anyway. There is also a `-print` flag if you wish to force `find` to output.

find . -exec echo {} \+ # All on the same line
This, I think, is a dangerous example, because any file with a space in its name will look like two files instead of one. Lastly, in both the above examples you're returning files and directories rather than just files. If you wanted to exclude directories then use the `-type f` flag to specify files:
find . -type f ...
(equally you could specify only directories with `-type d`)

Other `find` tips I've found useful that might be worth adding:
* You don't need to specify the source directory in GNU find (you do on FreeBSD et al) so if you're lazy then `find` will default to your working directory:
find -type f -name "*.txt"
* You can do case insensitive name matches with `-iname` (this is also GNU specific):

find -type f -iname "invoice*"

This was added to OpenBSD 17 years ago. Other BSDs soon followed. Solaris & IllumOS support it too.
http://cvsweb.openbsd.org/cgi-bin/cvsweb/src/usr.bin/find/op...
Apologies for this. It just goes to show how fallible the human memory is. :-/
ls **/*.c
Results in something like: array.c helpers/gendec.c msg.c
awkgram.c helpers/mb_cur_max.c node.c
awklib/eg/lib/grcat.c helpers/scanfmt.c old-extension/bindarr.c
Turn on with: shopt -s globstar

Want to mess with such a script?
$ touch "$(echo -e -n 'lol\nposix')"
If you're trying it out now and cannot figure out how to delete it: "ls -li" to find the file's inode number, then `find -inum $INODE_NUMBER -delete`.
However for normal day-to-day usage, file names with \n are rare while files with spaces in their name are common. So returning an array of space-delimited file names is a potentially dangerous practice for common scenarios, whereas find's default behaviour is only dangerous for weird and uncommon edge cases. (And if you think those are a likely issue then you probably shouldn't be doing your file handling inside a POSIX shell in the first place.)
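If you do need to iterate over completely arbitrary filenames in bash, the NUL-delimited pattern handles both spaces and newlines. A minimal sketch (the temp directory and filenames are invented for the demo):

```shell
# NUL-delimit between find and the read loop so that no filename,
# however weird, can be split or mangled.
dir=$(mktemp -d)
touch "$dir/plain.txt" "$dir/with space.txt"
count=0
while IFS= read -r -d '' f; do
    count=$((count + 1))
done < <(find "$dir" -type f -print0)
echo "$count files"
```

The `< <(...)` process substitution keeps the loop in the current shell, so `count` survives after the loop ends.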
grep -P "\t"
Not criticising the author here, as grep -P is good, but you might not also know that you can enter tabs and other codes in bash by pressing ctrl-v. So you could also type: grep "[ctrl-v]TAB"

You can also do $'\t' in at least Bash (and probably Zsh).
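A quick way to convince yourself that $'\t' really is a literal tab (it's bash's ANSI-C quoting) is to use it as a grep pattern:

```shell
# $'\t' expands to a literal tab character via ANSI-C quoting;
# only the first input line contains a tab, so grep -c reports 1.
n=$(printf 'col1\tcol2\nno tab here\n' | grep -c $'\t')
echo "$n"
```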
$ set -o vi; bind -q quoted-insert
quoted-insert can be invoked via "\C-v".

:%s/Ctrl-Q[Enter]//g[RealEnter]
or replace each tab with 4 spaces:
:%s/Ctrl-Q[Tab]/[4 spaces]/g[RealEnter]
where [RealEnter] means an unquoted Enter.
This is mainly useful for making such changes in a single file (being edited) at a time. For batch operations of this kind, there are many other ways to do it, such as dos2unix (and unix2dos), which come built-in on many Unixen, or tr, and maybe sed and awk (need to check), or a simple custom command-line utility of that kind, which can easily be written in C, Perl, Python, Ruby or any other language that supports the command-line interface (command-line arguments, reading from standard input or from filenames given as arguments, pipes, etc.).
$ echo $SECONDS
83783
$ echo $SECONDS
83784
$ echo $SECONDS
83787

> Doesn't matter how long you've been using a GNU/Linux shell

Or a Unix shell, even.

> tar -ztvf file.tgz
> tar -zxvf file.tgz filename
Tar no longer requires the compression modifier when extracting an archive, so whether it's a .tar.gz, .tar.Z, .tar.bz2, etc., you can just use "tar xvf".
tar caf foo.tar.xz foo/

The extension of the file after the `f` switch tells tar what compression to use.

c - create
a - auto-compress (chosen from the file extension)
And in fact whenever you want to do some command like:
cmd some_flags_and_args_and_metacharacters ...
you can just replace cmd with echo first to see what that full command will expand to, before you actually run it, so you know whether you will get the result you want. This works because $foo and its many variants and the other metacharacters are all expanded by the shell, not by the individual commands. However, watch out for > file and >> file, which will overwrite or append to the given file with the output of the echo command.
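A concrete sketch of the echo-first habit (the filenames here are invented for the demo):

```shell
# Prefix a risky command with echo: the shell still performs the glob
# expansion, but echo just prints the resulting words instead of acting.
dir=$(mktemp -d); cd "$dir"
touch foo1 foo2 foobar
preview=$(echo rm foo*)
echo "$preview"
```

Once the printed command line looks right, drop the echo and run it for real.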
So typing 'rm foo* ' and pressing Alt-? (on non-OSX computers) will give you a list of what your wildcard would expand to.
M-* does the same thing but actually expands the wildcards, instead of just showing what that expansion would be.
For example, I can do the following:
mmv "flight.*" "flight-new.#1"
and this will rename all of my files that start with flight. to flight-new., preserving the file extension. So useful when you've got a bunch of different files with the same name but different extensions such as html, txt and sms.

http://wiki.bash-hackers.org/syntax/pe
Bash is insane.
${####}: http://www.oilshell.org/blog/2016/10/28.html
Each # means a different thing.
${foo//z//}: http://www.oilshell.org/blog/2016/10/29.html
There are three different meanings of / there.
"${a[@]}" -- http://www.oilshell.org/blog/2016/11/06.html
You need 8 punctuation chars in addition to the array name to correctly interpolate it. Every other way gives you word splitting issues.
And yes, a lot of the things mentioned work in a lot of shells, but some don't, or act differently.
Description: Remove executable recursively for all files
Command: chmod -x $(find . -type f)
Description: List files with permissions number
Command: ls -l | awk '{k=0;for(i=0;i<=8;i++)k+=((substr($1,i+2,1)~/[rwx]/) *2^(8-i));if(k)printf("%0o ",k);print}'
Description: Output primary terminal colors
Command: for i in {0..16}; do echo -e "\e[38;05;${i}m\\\e[38;05;${i}m"; done | column -c 80 -s ' '; echo -e "\e[m"
Description: NPM list with top-level only
Command: npm list --depth=0 2>/dev/null
Description: Show active connections
Command: netstat -tn 2>/dev/null | grep :80 | awk '{print $5}' | sed -e 's/::ffff://' | cut -d: -f1 | sort | uniq -c | sort -rn | head

find . -type f -exec chmod -x {} \+

Another approach that works is
find . -type f -print0 | xargs -0 chmod -x
although find's built-in -exec can be easier to use than xargs for constructing some kinds of command lines.

Is this true? Not impossible, but I am surprised. If true, what is this fixing? In my naive view (never studied swap management), if at the current time a page is swapped out (and by now we have more memory -- we can kill swap completely and do fine), it should get swapped in when needed next time. As there is more memory now it should not, in general, be swapped out again.
If true we are exchanging a number of short delays later for a longer delay now, which to me hardly looks like a win.
By flushing the swap, you wait some time first but then it all runs smoothly. When using a hard drive and not SSD the difference is even bigger.
However, swapoff/swapon only solves part of the problem - you still have binaries, libraries and other file-backed memory that were thrown out under the memory pressure and they won't be reloaded with the swapoff/swapon. Does anyone know how to force these kinds of things to be re-loaded before they are needed again?
a better "20. Randomize lines in file":
shuf file.txt
instead of cat file.txt | sort -R
(sort -R sorts by hash, which is not really randomisation.)

I looked at the source code for GNU sort and what they're doing is reading 16 bytes from the system CSPRNG and then initializing an MD5 digest object with those 16 bytes of input. Then the input lines are sorted according to the hash of each line with the 16 bytes prepended.
Although they should no longer use MD5 for this, I don't think we know anything about the structure of MD5 that would even allow an adversary to have any advantage above chance in distinguishing between output created this way and an output created via a different randomization method. (Edit: or distinguishing between the distribution of output created this way and the distribution of output created via another method!)
The output of sort -R is different on each run and ordinarily covers the whole range of possible permutations.
$ for i in $(seq 10000); do seq 6 | sort -R | sha256sum; done | sort -u | wc -l
720

Eg `(seq 3; seq 3; seq 3) | sort -R`.
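The duplicate-grouping behaviour is easy to verify: because sort -R keys on a hash of each line's contents, identical lines always end up adjacent, so piping through uniq collapses them deterministically (shuf would not behave this way):

```shell
# Identical lines hash identically, so sort -R keeps them together;
# uniq then always collapses the doubled seq back to 3 lines.
n=$( (seq 3; seq 3) | sort -R | uniq | wc -l )
echo "$n"
```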
This is fantastic, a game changer for me. I often give people a command like ls -lad /path /path/to /path/to/file
Thanks!
whatever < file.txt
not cat file.txt | whatever
Also, there is no need for - or z on GNU tar in t or x modes:

tar tf whatever.tgz
tar tf whatever.tar.bz2
tar tf whatever.txz
...
tar xf whatever.tar.gz
tar xf whatever.tar.Z
...
all work just fine

pwd -P
all the time to get the real full path (no symlinks) of the current directory. Really easy to remember as well: Print Working Directory.

You could use this to get the current location of the script being run (not the location where it runs):
readlink -f "$0"

ag --ocaml to_string
Very fast and simple syntax.

When I last tried, ag couldn't handle this in an .ignore file:
!foo.1
foo.*
That would ignore all foo.* files except foo.1 in rg. Also, rg is a bit faster than ag for my use cases.
[0] http://man7.org/linux/man-pages/man3/readline.3.html#EDITING...
UNIX one-liner to kill a hanging Firefox process:
https://jugad2.blogspot.in/2008/09/unix-one-liner-to-kill-ha...
#!/bin/sh
scale=4 # results will print to the 4th decimal
echo "scale=$scale; $@" | bc -l
Now you can do math.

$ math '1+1'
2
$ math '2/3'
.6666
This is especially useful in shell scripts with interpolated variables:

x=10
x=`math $x - 1`

# .bashrc
alias bc='bc --mathlib'
and a .bcrc file:

# .bcrc
scale = 4
Actually, this is what my .bcrc looks like:

scale = 39
k_c = 299792458 # Speed of Light
k_g = 6.67384 * 10^-11 # Gravitation
k_atm = 101325 # Atmospheric pressure
k_h = 6.62606957 * 10^-34 # Planck's constant
k_hbar = 1.054571726 * 10^-34 # H Bar
k_mu = 1.256637061 * 10^-6 # Vacuum permeability
k_ep = 8.854187817 * 10^-12 # Vacuum permittivity
k_epsilon = 8.854187817 * 10^-12 # Vacuum permittivity
k_e = 1.602176565 * 10^-19 # Elementary charge
k_coulomb = 8.987551787 * 10^9 # Coulomb's constant
k_me = 9.10938294 * 10^-31 # Rest mass of an electron
k_mp = 1.672621777 * 10^-27 # Rest mass of a proton
k_n = 6.02214129 * 10^23 # Avogadro's number
k_b = 1.3806488 * 10^-23 # Boltzmann's constant
k_r = 8.3144621 # Ideal gas constant
k_si = 5.670373 * 10^-8 # Stefan-Boltzmann constant
k_sigma = 5.670373 * 10^-8 # Stefan-Boltzmann constant
k_mt = 5.97219 * 10^24 # Mass of Earth (Tierra)
k_rt = 6.371 * 10^6 # Mean radius of Earth (Tierra)
pi = 3.1415926535897932384626433832795028841968
# requires --mathlib
define t(x) { return s(x)/c(x); }
define as(x) { return 2*a(x/(1+sqrt(1-x^2))); }
define ac(x) { return 2*a(sqrt(1-x^2)/(1+x)); }
define at(x) { return a(x); }
define csc(x) { return 1/s(x); }
define sec(x) { return 1/c(x); }
define cot(x) { return c(x)/s(x); }

I checked and it has all of the ones that you mentioned, sometimes under slightly different names. I was surprised that e is defined as the elementary charge rather than Euler's constant!
$ x=10
$ echo $((x - 1))
9
Though if I need to do a floating point calculation at the shell, I start python or R, which both have their own interactive shells (with the same readline interface, which I like).

The rename example fails with "rename: not enough arguments".
Debian ships this: https://metacpan.org/pod/distribution/File-Rename/rename.PL
Most(?) other distros ship the one from util-linux: http://man7.org/linux/man-pages/man1/rename.1.html
Read it like this: rename <expression> <replacement> <file(s)...>
Argument 1 is a string (not a regex), argument 2 is the string to replace the old string with, and arguments 3+ are a file, a list of files, or a glob.
The man page includes a warning about the lack of safeguards. It is unfortunate that rename doesn't have an equivalent of mv or cp's -i flag because it's so easy to overwrite files if your expressions aren't exactly correct.
On some Ubuntu systems, I've seen another version of rename that's actually a Perl script that uses regular expressions. I think that version lets you set some safeguards, which are sorely lacking in the binary version of rename that ships with CentOS.
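When you're not sure which rename you have, a plain-bash loop with parameter expansion is a portable fallback. A sketch (filenames invented for the demo, echoing the earlier mmv example):

```shell
# Rename flight.* to flight-new.* with no rename utility at all,
# using ${f#flight.} to strip the leading "flight." prefix.
dir=$(mktemp -d); cd "$dir"
touch flight.html flight.txt flight.sms
for f in flight.*; do
    mv "$f" "flight-new.${f#flight.}"
done
ls
```

Like the rename utilities, plain mv will silently overwrite an existing target; add a `[ -e "$target" ]` check (or mv -n on GNU) if that matters.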