Useful Linux Commands

alias cp curl cut dhclient
du file find free grep
groupadd groups gzip head hostname
iconv id kill killall last
ldd ln ls lsof man
mount nc netstat passwd rm
scp sort stat tail tar
test tr uname uniq unset
uptime useradd usermod wc who

Introduction

Sometimes you want to run two commands from a single line of input, especially if they take some time to execute. This can be done by simply putting && between them, for example:
sudo apt-get update && sudo apt-get upgrade

There are also occasions where you want to run a command in the background, for example when starting a program with a GUI from a terminal. This can be done by placing an ampersand at the end of the line, which makes the command run in the background so you can continue using the terminal session, although it will still output to that terminal window:
uex ./file.txt &

If your command generates a lot of output you might find it easier to redirect it to a file, like this: cmd > output_file.txt. However you might notice this does not actually redirect everything. In fact it only redirects stdout, not stderr; to redirect both to the file use cmd &> output_file.txt. See Linux Shell Scripts for further explanation.
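Putting the two ideas together, and assuming a hypothetical long-running script called ./backup.sh, something like the following runs it in the background with all of its output (stdout and stderr) captured to a file:
./backup.sh &> backup_log.txt &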

Putting some of the commands listed here together gives some powerful command lines, see Useful Linux Tips and Tricks for more details.

alias

Executing this command on its own lists all the defined aliases for your login. You can define a new alias thus: alias lsbk='ls -al /tmp/backups', which means you can execute "lsbk" and it will actually execute the ls command. This is usually defined in your ".profile" file and is very handy.
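As a rough sketch of making an alias permanent, assuming your shell reads ".profile" at login, you could append the definition like this and then log out and back in (or run source ~/.profile):
echo "alias lsbk='ls -al /tmp/backups'" >> ~/.profile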

cp

This is the Linux copy command and generally works exactly as you would expect. However I have found that copying the contents of a directory which has hidden or dot files in its root is not so simple when the target directory already exists. So I found doing two commands as follows works:
cp -Rv ./drupal-7.34/* /var/www/html
cp -v ./drupal-7.34/.* /var/www/html

It is worth noting that "-R" means recursive, or in other words go down all the sub-directories, "-v" means display everything you do, be verbose, and "-Rv" is just both together. The first command copies all the files and directories, including dot files in subdirectories, but misses dot files in "./drupal-7.34/" itself; the second command copies those, although it does mention that it could not copy the special directories "." and "..".
Another useful switch is "-a" which will do the copy preserving as much structure, attributes and SELinux information as possible, although it ignores failures to preserve such information. Also "-d" will copy symbolic links rather than the files they point to.
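As a sketch of "-a" in use (the paths here are just placeholders), note that copying source/. rather than source/* also picks up the dot files in the top level of the source directory:
cp -av /path/to/source/. /path/to/target/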
More information can be found with this command: info coreutils 'cp invocation'

curl

A very handy little command line utility for sending data to or getting it from a URL using a variety of protocols. For example curl http://www.example.com/ gets the default HTML for the example URL. If you re-direct this to a file you can capture the output as follows curl http://www.example.com/ > output.txt. However there are other neat tricks like curl -I http://www.example.com/ which displays the response headers. The curl command is also ideal for use in scripts as it has some handy exit codes.
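As a rough sketch of using those exit codes in a script (the URL and message are just placeholders), -f makes curl return a failure code on HTTP errors, -sS hides the progress output but keeps error messages and -o /dev/null throws away the body:
curl -fsS -o /dev/null http://www.example.com/ || echo "Site is not responding"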

Handy curl examples are as follows:
curl -u username:password -O "https://example.com/files/binaryfile.dat" - this downloads a file using the filename in the url with the credentials specified, you can use -o filename.dat to save the file to a different name.

You can also use curl to do ftp, some basic examples follow:
Directory Listing: curl --ssl --ftp-pasv --user username:password ftp://ftpserver.example.com
Download File: curl --ssl --ftp-pasv --user username:password ftp://ftpserver.example.com/remote_file.txt -o ./local_file.txt
Upload File: curl --ssl --ftp-pasv --user username:password ftp://ftpserver.example.com/ -T ./file_to_upload.txt
These examples all use FTPS Explicit in Passive mode. Specifying ftps in the url will switch to FTPS Implicit which you probably don't want. Leaving the password off means curl will prompt for it.

If you wish to use a proxy server, then something like this would work:
curl --proxy hostname:8080 http://www.example.com
This will get the HTML of the home page of www.example.com via a proxy server running on port 8080 on a server called hostname.

I recommend looking at curl - Tool Documentation for the documentation, FAQ and a Tutorial.

cut

The default delimiter is the tab character and you can optionally specify which field or fields, so cut -d: -f2 means give me the second field where : is the delimiter. You can also specify the output delimiter, which defaults to the input one, try the following:
cat /etc/passwd | cut -d: -f1,6-7 --output-delimiter=" - "
Which outputs the first, sixth and seventh fields of the passwd file but separated by " - ". However the following has the same effect:
cut -d: -f1,6-7 --output-delimiter=" - " /etc/passwd
You can just specify the file, rather than pipe it in.

dhclient

Simple little command for doing DHCP stuff. If you have a Linux install without a desktop then you might find that plugging a cable in does not automatically pick up an IP address from the DHCP server; dhclient -v will get an IP address, the -v is verbose mode, just so you can see what is going on. There is also dhclient -r -v which will release the IP address. By default this works on all network interfaces, but you can specify one, like this: dhclient -v -r eth0

du

The "disk usage" command is great for seeing how much space the current directory and all its children take up, it works off the current directory and specifying "-h" is always good to get the output in "human readable" format, rather than blocks. However sometimes you do not want a complete list of all subdirectories. So using du -h --max-depth=1 is a good option.

file

The command file is the best way to find out about executable files, for example it tells you whether they are 32-bit or 64-bit. If you execute file /usr/local/bin/uex, which is UltraEdit, then the result is this:
/usr/local/bin/uex: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.9, stripped
However the command goes further, it works on all kinds of files including, scripts, certificates and more.

find

I use this most to find files, obviously, and whilst this is a powerful command I tend to just use it to search by filename, so to find something under the current directory: find . -name 'settings.php'. Sometimes though you want to search the whole machine, in which case I use find / -name 'mysql*' 2> /dev/null. The points to note in this example are that I changed the starting point from . (the current directory) to / (the root), and I also appended "2> /dev/null", which means redirect stderr to null, or in other words the error messages are not displayed, as you will inevitably get loads of errors about permissions when scanning the whole disk.

The find command just lists the name, if you want more detail like you would get from ls -l /tmp/filename.txt then you need to do this:
find / -name 'vsftpd.conf*' -exec ls -l {} \; 2>/dev/null
The "\;" is an escaped semi-colon and is needed to terminate the -exec option, also note that the filename found is put where the braces are.

A useful option for the find command is "-type ?" where ? is "f" for files and "d" for directories, see the man page for more options.
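For example, to list just the regular files ending in ".log" under /var/log:
find /var/log -type f -name '*.log' 2>/dev/null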

free

Handy little utility to get the amount of free memory, just execute free -h for a nice summary, where -h gives "human readable" output; the default output is in kilobytes and -m and -g give output in megabytes and gigabytes respectively.

grep

The grep command is a very handy little search utility and whilst quite simple is actually very powerful. The classic use case would be to search files for a piece of text in the current directory, which would be something like grep drupal *, meaning search all the files in the current directory for the word "drupal". Clearly you can change * to *.txt or *.php, however there are a few options which I find very useful, which are as follows:

  • -r meaning search all sub-directories, a recursive search
  • -i which means ignore the case, do a case-insensitive search
  • -c just count the number of occurrences in a file
  • -v returns lines that do not match
  • -n shows line numbers
Clearly you can put these together so grep -ric 'hello world' *.txt.

groupadd

Creates new groups, for example groupadd -g 500 grpname. Clearly the number needs to be unused so that the group ID is unique. I believe groups are normally numbered 500 or above. If you want to know the highest currently used group number then the following will help:
cut -d: -f3 /etc/group | sort -g | tail -5

groups

The easy and proper way to see which groups a user is a member of is groups usrname, however if you leave off the user name then it runs for the current user.

gzip

If you are working on the command line rather than via a UI then this command is needed for extracting from a .gz file. For example gzip -d vsftpd-3.0.2.tar.gz will extract the tar file from the gz archive but note that it will also remove the original .gz file.
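If you want to keep the original archive then newer versions of gzip (1.6 and later, I believe) support a "-k" switch, so something like this should work:
gzip -dk vsftpd-3.0.2.tar.gz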

head

If you just want to look at the first few lines of a file then the head command is handy, when cat would list the whole file. So head thefile.txt will list the first part of the file, you can also specify how many lines you want with head -n25 thefile.txt to get the first 25 lines of the file.
Note that the head and tail commands work in a very similar way.

hostname

Simply run this command and it will display the host name of the local machine, which is often the fully qualified domain name. If you use hostname -I you will see a list of IP addresses used by your machine.

iconv

The iconv command is a GNU one that does character set conversion, the following example will convert a "UCS-2 LE BOM" file to UTF-8:
iconv -f UTF-16le -t UTF-8 UCS2-Test-File.txt > UTF-8-Output-File.txt
More complex conversions can be done with many other formats, execute iconv -l to see the options on the specific system.

id

Executing this command on its own returns your user id and the id of all the groups you are a member of, as well as SELinux information if that is on your distribution. If you execute id usrname then you get the same information but for the user account specified.

kill

This is used to terminate (or kill) a background process, for example kill 10824 will send a terminate signal to the process with a process ID of 10824. If the process is having issues and that fails then kill -9 10824 will forcibly terminate the process.
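If you are not sure which signal to send, the following lists all the available signal names and numbers:
kill -l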

killall

When you have lots of similar processes to terminate then you can do killall nc to terminate all the nc processes. Note that this command behaves differently on AIX.

last

The last command basically scans the file /var/log/wtmp, just type last on its own and see. The file contains logins and reboots. Executing last oracle shows all the logins for the oracle user listed in the file. A common use is last reboot which shows all the reboot times listed in the file, alternatively last -1 reboot shows just the latest reboot, however I think the output of last reboot | head -1 is cleaner.

ldd

This handy little command will show which libraries are dynamically linked to an executable. So ldd /usr/local/sbin/vsftpd will tell you which libraries are used by vsftp. I have noticed on AIX that this command lists .a files which are archives, so you need to use the ar command to look inside those.

ln

This is used for creating symbolic links. Executing ln -s /media/sf_Linux will create a link to /media/sf_Linux in the current directory called sf_Linux. To specify a different name for the symbolic link use ln -s /media/sf_Linux HostLinux, which still works in the current directory.

ls

I recently wanted to list the subdirectories off the current directory, which is not as easy as it should be. Here are some options:
ls -l | grep "^d" - this is the "obvious" solution, find lines beginning with a d
ls -l -d */ - seems clunky putting "*/" on the end but it does work, however it cannot show hidden directories
find . -maxdepth 1 -type d - nice option which works with hidden directories

lsof

This is a very handy command for seeing open network ports, however it looks at other things too. If you execute lsof -i you will see a list of open ports and what has them open; if you see nothing then try again using sudo, as in sudo lsof -i, as it needs permission to see detail not belonging to ordinary users. If you run lsof -i -P then you will see port numbers rather than port names (see also: netstat).
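For example, to see what (if anything) has a specific port such as 80 open:
sudo lsof -i :80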

man

This is a handy little command to get help on other commands, you simply type man with the name of the command you are interested in as an argument. So help on the passwd command is obtained via man passwd, you can then page down or press q to quit. On some machines you might get two commands with the same name and hence the wrong help! So man -a passwd will show all the manual pages for all passwd commands. Note that passwd is the standard command to change passwords but also exists in OpenSSL to compute password hashes, so can have two entries.
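If you know which section of the manual you want you can ask for it directly, for example man 5 passwd shows the page describing the /etc/passwd file format rather than the passwd command itself.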

mount

The mount command shows where all disks etc are mounted within the file system. It is also useful for seeing where a CD/DVD is mounted, especially when you switch between distributions and forget.
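For example, to check just the optical drive, assuming it shows up as /dev/sr0 (which is typical but not guaranteed):
mount | grep sr0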

nc

The nc or netcat command is very useful for testing network connections, open ports or anything TCP or UDP related. If you want to test access to a port on another machine then
nc -v -w 5 192.168.56.101 5666
will help, it makes a TCP connection to port 5666 on 192.168.56.101, holding the connection open for 5 seconds before timing out, and does so in verbose mode.
If you need to have something listening then nc -l -p 2112 will open port 2112 and listen for a connection. If you then telnet to that server/port you can type and see it on the server where the nc command was run, proving network connectivity.
There is a lot more but that is a good starting point.

netstat

This does a similar job to lsof and can be run as netstat -lptu or netstat -lptun if you prefer port numbers to port names.

passwd

Change your password, or that of another user if specified as an argument.
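For example:
passwd - change your own password
sudo passwd usrname - change the password of the specified user, which needs root privileges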

rm

The rm or remove command is capable of deleting files and directories, including directory trees. However you can delete your entire file system, so do be careful; this is a good reason to never be running as "root"! Some useful parameters are:
-f force, never prompt
-R recursive, so ideal for removing directories and all their contents
So a classic delete directory is this: rm -fR ./subdir, which is a common use case.

scp

This command is short for Secure Copy and it uses SSH to copy files from one server to another. An example syntax is scp root@192.168.56.102:/root/Downloads/nrpe-2.15.tar.gz root@192.168.56.101:/tmp/nrpe-2.15.tar.gz2. If you are copying from your current logged in session to a remote machine you can shorten this slightly to scp /root/Downloads/nrpe-2.15.tar.gz root@192.168.56.101:/tmp/nrpe-2.15.tar.gz2. The basic syntax here is scp source destination.

sort

If a file is specified then the sorted file is sent to standard out, there are plenty of handy options. If you are trying to sort numbers then using -g will help as this will treat numbers as numbers.
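For example, assuming a hypothetical file called numbers.txt with one value per line:
sort -g numbers.txt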

stat

The stat command gives more detailed information than ls -l and is used simply with stat filename.ext.

tail

The tail command will show the last part of a file, for example tail thefile.txt, you can specify how many lines like this: tail -n20 thefile.txt. A very common use of tail is to "follow" the end of a log file, which is done with tail -f thefile.log and you can combine this with a number of lines too if needed.
Note that the head and tail commands work in a very similar way.

tar

Simple command to extract tar files or gzip files or gzipped tar files, however it does have some more complex arguments. For me a common use of the tar command is when working with Drupal. To extract a Drupal archive, just run tar -xzvf ./drupal-7.34.tar.gz. However Drupal archives have all their contents in a sub-directory within the archive, so you end up with everything extracted to ./drupal-7.34/ in this example. However executing tar --strip-components=1 -xzvf ~/Downloads/drupal-7.34.tar.gz will strip this first level, which is very handy.

To add stuff to a new tar, use something like the following: tar -cvf new.tar /tmp/stuff
This will put /tmp/stuff and any sub-directories into the file "new.tar". Note the following switches:

  • -c create a new archive
  • -f means use archive name provided
  • -t list files in the tar
  • -v means list name of each file
  • -x means extract from the tar
  • -z to process the archive through gzip
By default tar will recurse through all the sub-directories.
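For example, the "-t" switch above lets you check the contents of the Drupal archive without extracting it:
tar -tzvf ~/Downloads/drupal-7.34.tar.gz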

test

This is actually a very handy command for using in scripts, for example test -n "$BASH_VERSION" tests whether the length of the string in $BASH_VERSION is non-zero, however it also does a lot of file and directory related tests.
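For example, a simple file existence check, using /etc/passwd just as a file that is bound to exist:
test -f /etc/passwd && echo "File exists"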

tr

This command will "translate" characters and can do a number of clever things, like convert case, remove special characters etc. One nice use is the display the PATH or Java ClassPath with each item on its own line. Try this:
echo $PATH | tr ':' '\n'
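It also handles the case conversion mentioned above, for example:
echo "hello world" | tr 'a-z' 'A-Z'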

uname

The uname command is useful for getting system information, for example uname -r gives the Linux Kernel version. You can get the same information with cat /proc/version, however uname does give other information.
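If you want everything in one go then uname -a prints all the available system information on a single line, including the hostname, kernel release and machine architecture.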

uniq

This can collapse adjacent duplicate lines down to a single copy, or remove the duplicated lines altogether, and there are options for counts etc.
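Because uniq only looks at adjacent lines the input usually needs to be sorted first, for example:
sort thefile.txt | uniq -c - prints each distinct line once with a count of how many times it appeared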

unset

This command removes an environment variable set with export, so to set and then unset an environment variable you would do the following:
export http_proxy=http://server:port
unset http_proxy

uptime

This command shows how long a box has been running, however I have found it gets confused when running in a VM which you pause or put to sleep for a few days. If you just want a simple "how long" then uptime -p works nicely and uptime -s displays when the box started up.

useradd

Create a new user, for example useradd -c "Geoff Lawrence" -d /home/geoff -m -u 50001 -U geoff, which creates a new login called "geoff", however the parameters need explaining:

  • -c "comment": this is useful for documenting the login
  • -d home_directory: specify the home directory for the new login
  • -m create the specified home directory
  • -u the user id
  • -U create a group with the same name as the user
  • geoff is the new username

usermod

Handy command to grant group membership to a user, for example usermod -a -G grpname usrname. Note the capital G; the lowercase g changes the default group, which is not usually what you want.

wc

The wc command will, quite simply, print the number of newlines, words or bytes for a given file or the pipeline, for example:
wc -l readme.txt - this will print the number of lines in the readme.txt file
cat readme.txt | wc -l - does the same as the previous command

who

The who command has multiple uses; when used on its own it lists all the currently logged on user sessions. If you do who am i then you will just see your own session, and whoami is a similar command that returns just your username. There is also the following:

  • who -b - show the system boot time
  • who -q - show the logged in user names and how many people are logged in
  • who -u - show full details of each logged in user
There is more but that's for another day!