

Tuesday, 7 March 2017

How to Debug a Shell Script-(Linux&UNIX)

In this article, I will explain how to debug a shell script.
I am taking a script used in one of my previous articles, which generates an alert whenever the /opt file system on your Red Hat/CentOS Linux server exceeds 65% usage.

Below is the reference:
Shell SCRIPT to Monitor File System Usage/size in Linux

The script used to generate alerts is as shown below:
=====
#!/bin/sh
df -kh | grep "/opt" | awk '{print $5" "$6}' | cut -d " " -f1 --output-delimiter='  ' | tail -n 1 | while read Value
do
  echo "$Value"
  fssize=$(echo "$Value" | awk '{ print $1 }' | cut -d'%' -f1)

  if [ "$fssize" -ge 65 ]; then
    # mail reads the message body from stdin; the recipient goes on the same line
    echo "/opt is at ${fssize}% used" | mail -s "Alert: /opt is almost out of disk space on Server `uname -n`, Needs Immediate Attention" serversupport@domain.com
  fi
done
======


Now, how can a system administrator debug this script to make sure it works correctly before deploying it on a production Linux server?

SCRIPT DEBUGGING:

1. To start debugging any script, take meaningful lines, or parts of a line, from the above script that work on the command line, and check the output of each meaningful Linux command in the script.

Output from my system for the above script is as below:

[root@server2 ~]# df -kh | grep "/opt"
/dev/sda7             7.8G  1.4G  6.1G  68% /opt


 [root@server2 ~]# df -kh | grep "/opt" |  awk '{print $5" "$6}'
68% /opt


[root@server2 ~]# df -kh | grep "/opt" |  awk '{print $5" "$6}'|cut -d " " -f1 --output-delimiter='  '
68%


[root@server2 ~]# df -kh | grep "/opt" |  awk '{print $5" "$6}'|cut -d " " -f1 --output-delimiter='  '|tail -n 1
68%

  • The tail command helps us pick the last line, especially when the output contains more than one line
[root@server2 ~]# echo 68% | awk '{ print $1}' | cut -d'%' -f1
68


Since 68 is above the 65% threshold, the script is bound to generate an alert for us.
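
As a complementary check (this step is my addition, not part of the original walkthrough), you can also run the whole script with shell tracing enabled, so that every command and variable expansion is printed as it executes:

[root@server2 ~]# sh -x /home/admin/FSalerts.sh

Each line prefixed with + in the trace shows exactly which command ran and with what values, which makes it easy to spot where the logic goes wrong.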

IMPORTANT NOTE:
  • It is always a good idea to redirect the output of this script to a log file, which you can view later if you want to track what the script did when the job actually ran.
So a modified crontab entry would look like below:

CRON ENTRY:

##This job runs every 5 minutes to check file system usage
*/5 * * * * /home/admin/FSalerts.sh >> /home/admin/FSalerts.log 2>&1

 You can open the log file /home/admin/FSalerts.log any time you wish and see what the script did each time the job ran; appending with >> keeps the history of every run, and 2>&1 captures error messages as well.

This file is useful for troubleshooting issues with the script or with cron when you did not receive an alert even though the file system usage exceeded your threshold.
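
One small optional tweak (my assumption, not part of the original script) is to print a timestamp at the start of each run, so the entries in the appended log file are easy to tell apart:

=====
# Add near the top of /home/admin/FSalerts.sh so every cron run is marked in the log
echo "==== Run started at `date` on `uname -n` ===="
=====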

HAPPY LINUX LEARNING :)

Wednesday, 1 March 2017

SCRIPT: To Monitor File System Size in Linux

I thought of introducing a mid-week post going forward, along with my Saturday weekly article.

Here is a shell script that helps you monitor the size of critical file systems against a threshold of your choice on Linux (CentOS/Fedora/RHEL) servers.

Let's say the name of the script I am going to discuss is /home/admin/FSalerts.sh.

========SCRIPT STARTS=======

#!/bin/sh
df -kh | grep "/opt" | awk '{print $5" "$6}' | cut -d " " -f1 --output-delimiter='  ' | tail -n 1 | while read Value
do
  echo "$Value"
  fssize=$(echo "$Value" | awk '{ print $1 }' | cut -d'%' -f1)

  if [ "$fssize" -ge 65 ]; then
    # mail reads the message body from stdin, so feed it one line
    echo "/opt is at ${fssize}% used" | mail -s "CRITICAL FS ALERT: /opt is Almost out of disk space on Server `uname -n`" serversupport@domain.com
  fi
done

=========SCRIPT ENDS========


  • In the script, I used the #uname -n command to print the server name; it is a handy alternative to the hostname command :)
  • This script also introduces the CUT command, used alongside the AWK command.
  • The script sends an email to the specified recipient when the /opt file system reaches 65% of its used space (a quick way to verify that outbound mail works is shown right after this list).
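
Before relying on the alert, it helps to confirm that the server can actually send mail from the command line (this quick test is my addition; the recipient address is just the example one used in the script):

=====
# mail reads the message body from stdin, so pipe a one-line test message to it
echo "Test message from `uname -n`" | mail -s "Mail test" serversupport@domain.com
=====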


However, how do you make this script run every 5 minutes on a Linux server? Just place the script in cron and schedule it to run every 5 minutes, or at whatever frequency you wish.

CRON ENTRY:

##This job runs every 5 minutes to check file system usage
*/5 * * * * /home/admin/FSalerts.sh

So, with the above entry, I have scheduled a cron job that runs the FSalerts.sh script every 5 minutes and sends us the details.
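
If you are adding this entry for the first time (a general reminder, not specific to this script), the standard crontab commands let you edit and verify the schedule:

=====
crontab -e   # opens your crontab in an editor so you can add the entry
crontab -l   # lists the installed entries so you can confirm the schedule
=====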

Follow-up points to get more hands-on experience with the above logic:
---------------------------------------------------------------------------------

1. Try using SED instead of the cut command and make the script work. I believe the script looks much simpler with SED.
2. Try to extend the script to cover all file systems, so that a single report is sent whenever any file system breaches its threshold (a rough sketch of this idea follows this list).
3. Try different thresholds; I used 65% as the threshold here.
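
For point 2, here is a rough, untested sketch of how the same logic might be extended to every mounted file system; the 65% threshold, the recipient address, and the df/awk field handling are assumptions carried over from the script above:

=====
#!/bin/sh
# Hypothetical sketch: alert on any file system that crosses the threshold
THRESHOLD=65
df -kPh | tail -n +2 | awk '{print $5" "$6}' | sed 's/%//' | while read usage mount
do
  if [ "$usage" -ge "$THRESHOLD" ]; then
    echo "$mount is at ${usage}% used" | mail -s "CRITICAL FS ALERT on `uname -n`: $mount at ${usage}%" serversupport@domain.com
  fi
done
=====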

HAPPY LINUX LEARNING :)

Related posts from my blog:
Learning AWK and SED Tools for LINUX/UNIX

Tuesday, 21 February 2017

Learning AWK and SED Tools for LINUX/UNIX

GOAL:

Learning AWK and SED Tools for LINUX/UNIX:


So far, I have posted L2-level Linux issues and the procedures to fix them.
  • In this post, I would like to take up something simple but still interesting.
  • It is sometimes important to have some scripting knowledge at hand for better Linux/Unix system administration.
  • Before I post something on scripting with examples, I would like to introduce two command-line tools, namely AWK and SED, which are really powerful in the LINUX/Unix world.
  • So, this post covers the most frequently used data-filtering commands in the Linux/Unix world, none other than AWK and SED.

What is AWK and what is it used for?

Awk is both a programming language and a text processor that can be used to manipulate text data in any desired manner.

Exercise using AWK:

To exercise a scenario with AWK, let's take a file called myfile.txt, which contains the information below.
=======
root x 0 0 root /root /bin/bash
bin x 1 1 bin /bin /sbin/nologin
Albert x 2 2 daemon /sbin /sbin/nologin
Chin x 3 4 adm /var/adm /sbin/nologin
Neon x 4 7 lp /var/spool/lpd /sbin/nologin
sync x 5 0 sync /sbin /bin/sync
shutdown x 6 0 shutdown /sbin /sbin/shutdown
halt x 7 0 halt /sbin /sbin/halt
mail x 8 12 mail /var/spool/mail /sbin/nologin
=======
  • In the above data, the first column shows the user name, and the last column (the 7th column) shows the login shell, i.e., whether a bash login is allowed or not.
  • From the above output, I just need the user names and the corresponding login details.
Here is how AWK is used to get the required output.

#cat myfile.txt | awk '{print $1" "$7}'
root /bin/bash
bin /sbin/nologin
Albert /sbin/nologin
Chin /sbin/nologin
Neon /sbin/nologin
sync /bin/sync
shutdown /sbin/shutdown
halt /sbin/halt
mail /sbin/nologin
If I want to separate the output using a colon, I can use the below command:
# cat myfile.txt | awk '{print $1" : "$7}'
root : /bin/bash
bin : /sbin/nologin
Albert : /sbin/nologin
Chin : /sbin/nologin
Neon : /sbin/nologin
sync : /bin/sync
shutdown : /sbin/shutdown
halt : /sbin/halt
mail : /sbin/nologin
We can separate the columns however we like, as shown below:
 #cat myfile.txt | awk '{print $1" ==> "$7}'
root ==> /bin/bash
bin ==> /sbin/nologin
Albert ==> /sbin/nologin
Chin ==> /sbin/nologin
Neon ==> /sbin/nologin
sync ==> /bin/sync
shutdown ==> /sbin/shutdown
halt ==> /sbin/halt
mail ==> /sbin/nologin
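
As one more quick illustration (my addition, not part of the original exercise), AWK can also filter rows while printing columns; for example, to list only the users in myfile.txt whose login shell is /sbin/nologin:

#cat myfile.txt | awk '$7 == "/sbin/nologin" {print $1}'
bin
Albert
Chin
Neon
mail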
--------------

In this same article, I would like to introduce SED as well.

What is SED?
  • SED is a stream editor.
  • SED is used to perform basic text transformations on an input stream, such as a file or input from a pipeline. SED works by making only one pass over the input(s), and is consequently more efficient. But it is sed's ability to filter text in a pipeline that particularly distinguishes it from other types of editors.
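
As a minimal illustration before the main exercise (this snippet is my addition), sed's most common use is the s/old/new/ substitution applied to a stream:

# echo "hello world" | sed 's/world/Linux/'
hello Linux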
Suppose the data in newfile.txt looks something like below:

root:x:0:0:root:/root:/bin/bash
bin:x:1:1:bin:/bin:/sbin/nologin
Albert:x:2:2:daemon:/sbin:/sbin/nologin
Chin:x:3:4:adm:/var/adm:/sbin/nologin
Neon:x:4:7:lp:/var/spool/lpd:/sbin/nologin
sync:x:5:0:sync:/sbin:/bin/sync
shutdown:x:6:0:shutdown:/sbin:/sbin/shutdown
halt:x:7:0:halt:/sbin:/sbin/halt
mail:x:8:12:mail:/var/spool/mail:/sbin/nologin
  
The above data cannot be filtered by AWK's default whitespace field splitting, because the columns are separated by colons (:). Here we use SED to convert the data into a form that AWK can read by default.
Just to see this for yourself, try the above-mentioned command on newfile.txt.

You have below output:

#cat newfile.txt | awk '{print $1" ==> "$7}'

root:x:0:0:root:/root:/bin/bash ==>
bin:x:1:1:bin:/bin:/sbin/nologin ==>
Albert:x:2:2:daemon:/sbin:/sbin/nologin ==>
Chin:x:3:4:adm:/var/adm:/sbin/nologin ==>
Neon:x:4:7:lp:/var/spool/lpd:/sbin/nologin ==>
sync:x:5:0:sync:/sbin:/bin/sync ==>
shutdown:x:6:0:shutdown:/sbin:/sbin/shutdown ==>
halt:x:7:0:halt:/sbin:/sbin/halt ==>
mail:x:8:12:mail:/var/spool/mail:/sbin/nologin ==>

But if you use SED in the middle, as shown below, you get a meaningful output:

# cat newfile.txt | sed 's/:/ /g' | awk '{print $1" ==> "$7}'

root ==> /bin/bash
bin ==> /sbin/nologin
Albert ==> /sbin/nologin
Chin ==> /sbin/nologin
Neon ==> /sbin/nologin
sync ==> /bin/sync
shutdown ==> /sbin/shutdown
halt ==> /sbin/halt
mail ==> /sbin/nologin

So SED converted the colon delimiters into spaces before handing the data to AWK, which made AWK's life easy, and thus ours :)
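
As a side note (not part of the original walkthrough), AWK can also handle the colon delimiter directly through its -F option, so the same result is possible without SED; the SED route shown above is still worth knowing because it generalizes to arbitrary text rewrites:

# awk -F':' '{print $1" ==> "$7}' newfile.txt

The output is identical to the SED example above.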

  • You can apply SED to any delimiter and reshape the data into the required output.
  • Using the tail and head commands along with AWK and SED makes your command or script even more powerful.
  • I hope this article gave you the boost you were looking for to take your first step towards scripting.
  • Feel free to comment on my blog, ask questions, and share this with people who need it.

Last but not least, referring to the man pages for AWK and SED will help you discover more options.

HAPPY LINUX LEARNING :)

You may be interested in my other posts below:
-----------------------------------------------------------
Please review or follow, and share your comments if this helps.
File System State is clean with errors in Linux:
http://linuxunixdatabase.blogspot.com/2017/02/file-system-state-is-clean-with-errors.html

How to use IPERF to test interface/network throughput in Linux:
http://linuxunixdatabase.blogspot.com/2017/02/how-to-use-iperf-to-test.html

Linux/Unix Network Troubleshooting:
http://linuxunixdatabase.blogspot.com/2017/02/linuxunix-network-troubleshooting.html

Removing existing LVM from your Linux System
http://linuxunixdatabase.blogspot.com/2017/02/removing-existing-lvm-from-your-linux.html