What are your favorite command line tips or tricks?

Posted September 16, 2016 14.8k views
Linux Basics, Linux Commands

In the same vein as my previous question about bash aliases, what are your favorite command line tricks? What command has saved you the most time? What bash features did you never realize existed and now use daily and can’t live without?

Of course, there are the classics:

  • !! - Re-runs the last command you entered
  • sudo !! - Re-runs the last command you entered as a super user

What are yours? (No cheating.)



23 answers

screen is one of my favorites. You can do so many things with it. It’s really handy when you want something to keep running after you disconnect from SSH. To detach from a running screen, press Ctrl+a, then Ctrl+d. To list your screens: screen -list. To reattach to one: screen -r [somescreenname]

  • I use screen on quite a few projects to quickly get a process running in the background while allowing me to switch back to it as needed whether it’s a minecraft server or a ruby script. Your reply is a great quick reference for commonly used screen commands. We also have this tutorial which walks through using screen step by step.


Ctrl+R, i.e. reverse history search, is my favorite.

I love pushd and popd for treating your directories as a stack. Good for backtracking and for scripting. Here’s an example:

$ mkdir -p my/test/folder
$ pushd my/test
~/my/test ~
$ pushd folder
~/my/test/folder ~/my/test ~
$ popd
~/my/test ~
$ pwd
/home/sammy/my/test
$ popd
~
$ pwd
/home/sammy

(outputs shown assuming your home directory is /home/sammy)
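A companion to pushd/popd is dirs -v, which shows the stack with indexes you can jump to. A minimal sketch, assuming bash and an arbitrary /tmp/dstack path:

```shell
#!/usr/bin/env bash
# View the directory stack with indexes, then rotate to an entry by index.
mkdir -p /tmp/dstack/a/b
pushd /tmp/dstack/a > /dev/null
pushd b > /dev/null
dirs -v                # 0 is the current directory, higher numbers are older
pushd +1 > /dev/null   # make stack entry 1 the current directory
pwd                    # now in /tmp/dstack/a
```

pushd +N rotates the stack so entry N is on top and changes into it, which is handy when you are hopping between more than two directories.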
  • rsync for moving stuff
  • wget is great (especially with the -c (continue) flag)
  • gnu parallel for running batch jobs in…parallel
  • flock for making sure your cron job is only running one instance at any given time
  • the $() syntax for getting the output of a command, mentioned earlier, is great
  • I’m not sure where exactly it comes from (I think it ships with perl on deb-based distros), but the rename command is great for bulk-renaming jobs
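The flock tip can be sketched like this; the /tmp/myjob.lock path and the work being done are placeholders:

```shell
#!/bin/sh
# Guard a block of work with flock so only one instance runs at a time.
(
  # -n: fail immediately instead of waiting if another instance holds the lock
  flock -n 9 || { echo "another instance is already running"; exit 1; }
  echo "doing work"
) 9>/tmp/myjob.lock
```

In a crontab you can use the command form instead, e.g. flock -n /tmp/myjob.lock /path/to/job.sh, so the job is silently skipped while a previous run is still going.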

Some GNU readline hotkeys:

Ctrl + W - delete the word before the cursor (it goes to the kill ring)
Ctrl + Y - yank (paste) the most recently killed text
Ctrl + U - delete everything to the left of the cursor
Ctrl + K - delete everything to the right of the cursor

The hash builtin, which keeps track of the number of times you’ve called a given external command within the current shell:

[cvetomir@localhost:~]$ hash
hits command
1 /usr/bin/pwgen
1 /usr/bin/vim
2 /bin/ls

Wrapping subcommands in $() is pretty brilliant for chaining things together.

For example, you can put grep and awk together to find a docker container by its docker-compose name and execute a script inside it. A silly example, but it works.

docker exec $(docker ps -aqf "name=$(docker-compose ps | awk '{print $1}'|grep "db")") bash "/run/a/script/"

Here is a simpler example that stops all docker containers:

docker stop $(docker ps -q)

I love rsync..
I rsync all over the place… backups… moving websites to another folder… you can even rsync to another server over ssh:

rsync -av --progress -e 'ssh -p 3222' ~/folder user@remote-host:~/folder

(-a already implies --recursive; user@remote-host is a placeholder for your own server)

Which :)

which name shows where a command lives. For instance, where is the dnf command on Fedora?
which dnf gives /usr/bin/dnf


It is pretty handy when you need to find out whether there was a failure, or to audit system logins, restarts, and shutdowns.

Maybe not that big a trick, but I love tail -f to see what’s happening live in some daemon’s logs.

cd .. - go up one directory

Not something I use all that much, but what comes to mind:
command & - run a command in the background
Keep in mind that it can still print output, but it doesn’t lock up your shell.
The workaround is something like command &>/dev/null &, which silences it.
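If you want the backgrounded command to survive the shell exiting too, disown detaches it from the job table. A minimal sketch, assuming bash:

```shell
#!/usr/bin/env bash
# Background a command, silence its output, and detach it from the
# shell's job table so it keeps running after you log out.
sleep 30 >/dev/null 2>&1 &
disown
jobs    # prints nothing: the job is no longer tracked by this shell
```

nohup command & is the older, portable way to get a similar effect.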

Don’t forget the very old classic: TAB for auto-completion. I can’t imagine the terminal without it.

screenfetch for a system details readout; it isn’t installed by default, but it’s in the repos.

Also thanks for !! and this thread, I didn’t know a lot of these tips & tricks :P
Edit: Forgot about CTRL+SHIFT+T on Ubuntu for a new terminal tab. You switch between tabs with ALT+1,2,3...

Printing CRC checksum and byte count of a file

cksum foo.txt

File status output

stat foo.txt

Get the last modified date of a file

echo $(stat -c %y foo.txt)

Generating tar.gz files with current date/time name pattern

tar -czf $(date +%Y%m%d%H%M%S).tar.gz foo
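Putting that pattern together end to end (the foo directory and its file are throwaway examples):

```shell
#!/bin/sh
# Create a timestamped .tar.gz of ./foo and verify its contents.
cd "$(mktemp -d)"
mkdir -p foo && touch foo/file.txt
name="$(date +%Y%m%d%H%M%S).tar.gz"
tar -czf "$name" foo
tar -tzf "$name"    # lists foo/ and foo/file.txt
```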

Get the process list ordered by memory usage (column 4; use -k 3 to sort by CPU instead):

clear; ps aux | sort -nr -k 4 | head -10

A good one is: find . -type f -iname "*.something" -delete, which recursively finds and deletes matching files. find has many neat tricks and is a good one to have handy.
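Since -delete is irreversible, it’s worth previewing matches with -print first. A sketch in a throwaway directory:

```shell
#!/bin/sh
# Preview what a find would delete, then actually delete it.
cd "$(mktemp -d)"
touch a.tmp b.tmp keep.txt
find . -type f -iname "*.tmp" -print    # dry run: just list matches
find . -type f -iname "*.tmp" -delete
ls    # keep.txt
```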

du for disk usage, along with the --max-depth and -h flags will give you a nice overview if you are trying to find what is taking up space. -c will give you a grand total at the end.

cd ~- to go to the previous directory (cd - does the same, and also prints where it took you)

grep is really one of my favorites, especially when you are a developer; it’s really handy in both front-end and back-end development. Back in the days when I started using the Sass preprocessor it was really difficult for me to figure out the source of a style, but thanks to source mapping in the browser it’s not a big deal nowadays… grep -rn all the way for me ;)

find / -name "filename"

You can also use wildcards, very handy. Pipe it into less and you get to go line for line :)

Two of my favorite CLI tricks. The first is throwing all log paths into a variable for processing. You end up with a $LOGS variable, so you can do things like tail -f $LOGS to watch all log files at once, or drill down with tail -f $(echo "$LOGS" | grep error). The command below finds all regular files that some process holds open with a write (w) file descriptor, i.e. the files currently being written to:

LOGS=$(lsof -ln | awk '$4 ~ /[0-9]w/ && $5 ~ /REG/ {FILE[$NF]++}END{for (i in FILE) print i}')

The next one you can Google “speed up grep” and read my article about it, but it’s to take advantage of your locale settings by setting LC_ALL=C prior to a command like awk, grep, etc…

This forces plain byte-wise ASCII comparison instead of locale-aware UTF-8 collation, which most logs don’t need, and can speed up searches through log files by up to around 1400%. That’s really helpful when you want to see the top hits across all logs very quickly.
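You can see the collation difference directly with sort: byte (ASCII) order puts all uppercase letters before lowercase, while locale-aware collation interleaves them.

```shell
#!/bin/sh
# LC_ALL=C makes tools compare raw bytes instead of doing locale-aware
# collation, which is faster and orders differently.
printf 'a\nB\n' | sort            # in a UTF-8 locale, typically: a, B
printf 'a\nB\n' | LC_ALL=C sort   # byte order: B, a
```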

Then of course I can’t recommend awk enough, and the power of its arrays.

LC_ALL=C awk '/'$(date "+%d\/%b")'/ {gsub("/srv/users/serverpilot/log/.*/","",FILENAME); REQ[FILENAME" "$6" "$7]++}END{for (i in REQ) print REQ[i],i}' $(echo "$LOGS" | grep apache.access) | sort -rn | head -10

27 example_apache.access.log "GET /
10 domain_apache.access.log "GET /
9 wordpress_apache.access.log "POST /wp-login.php
8 blog_apache.access.log "POST /wp-login.php
6 blog2_apache.access.log "POST /xmlrpc.php

I tend to have a single set of commands set up on a server (once it’s up and running).
They are already pre-typed into a “read-only” .bash_history file.

Examples are a few rsync and mysqldump commands to back up servers, as well as other housekeeping commands I tend to use a lot.

Sounds like a terrible idea, but it saves me a lot of typing. (Also I make tonnes of typos, and this stops me from doing anything too damaging!!) lol, and it’s easy to add to if I need to update or add new commands.

CTRL-r to search recently issued commands.

Just type in a snippet of a recently issued command and it will pop-up!

pgrep to display the PID of a program
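A quick sketch of pgrep (the sleep is just a stand-in for a real daemon):

```shell
#!/bin/sh
# Start a background process, find its PID by name, then clean up.
sleep 30 &
pid=$!
pgrep -f "sleep 30"   # prints at least the PID of the sleep we started
kill "$pid"
```

pgrep -f matches against the full command line rather than just the process name, which helps when several instances of the same binary are running.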
