Wednesday, July 14, 2010

Simple Bash script to rotate log

If you have a program that doesn't handle its own log rotation, you can use the script below to rotate its log. Ideally, you would call this script from a daily cron job.


#!/bin/bash
# Rotate a single log file: copy it aside with a date suffix,
# truncate the original in place, then compress the copy.
logfile=$1
if [ ! -f "$logfile" ]; then
    echo "log file not found: $logfile"
    exit 1
fi
timestamp=$(date +%Y%m%d)
newlogfile=$logfile.$timestamp
cp "$logfile" "$newlogfile"
cat /dev/null > "$logfile"
gzip -f -9 "$newlogfile"
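As a quick sanity check, the same copy-truncate-compress steps can be exercised on a throwaway file (the temp-file names below are generated on the fly; nothing here comes from a real log):

```shell
#!/bin/bash
# Exercise the rotation steps from the script above on a temp file.
logfile=$(mktemp)
printf 'old entry\n' > "$logfile"

timestamp=$(date +%Y%m%d)
newlogfile=$logfile.$timestamp
cp "$logfile" "$newlogfile"
cat /dev/null > "$logfile"   # live log is emptied in place
gzip -f -9 "$newlogfile"     # snapshot is kept as $newlogfile.gz

wc -c < "$logfile"           # prints 0: the live log is now empty
gzip -dc "$newlogfile.gz"    # prints the original contents
```

From there, a daily crontab entry along the lines of `0 0 * * * /path/to/rotatelog.sh /var/log/app.log` (the paths and script name here are illustrative) would automate the rotation.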

6 comments:

Freefloris said...

Nice, but be careful if you accumulate a lot of rotated log files. You can find and delete the old ones like this:
find "path" -name "logfilename" -type f -mtime +90 | xargs rm -f
for example.

prabhakar said...

You can do a lot with the logrotate tool (see /etc/logrotate.d).
Whatever script you write still has to be scheduled daily/weekly/monthly/yearly with a cron job, which isn't consistent from a usability perspective. In enterprise environments people use the logrotate tool that ships with SLES. (I have seen some people use SLF4J too, but that's specific to application logging.)
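For comparison, a minimal logrotate configuration for the same job might look like this (the path, filename, and option choices here are illustrative; check the logrotate man page for your distribution):

```
# /etc/logrotate.d/myapp  (illustrative path)
/var/log/myapp.log {
    daily
    rotate 90
    compress
    copytruncate      # copy then truncate in place, like the script above
    missingok
}
```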

Vincent Janelle said...

You should probably look at the logrotate package present on most Linux distributions.

Also, unless the application itself releases the file by closing it, on many operating systems it will keep writing to the old inode until it stops, silently consuming disk space.

Hung Huynh said...

Thanks for the tip on logrotate. The script doesn't delete the current log file, it only truncates it, so I don't think it will leave a phantom inode.
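The delete-vs-truncate distinction can be sketched in a few lines of bash (the file is a throwaway temp file, and fd 3 stands in for an application holding the log open for append):

```shell
#!/bin/bash
# Why truncating beats deleting when a process still holds the log open.
log=$(mktemp)
exec 3>>"$log"              # simulate an app with the log open for append

echo "before rotation" >&3
cat /dev/null > "$log"      # truncate: same inode, the writer is unaffected
echo "after rotation" >&3   # this lands in the now-empty live file
contents=$(cat "$log")      # holds "after rotation"

rm "$log"                   # delete instead: the directory entry is gone...
echo "orphaned write" >&3   # ...but this still succeeds, into a deleted
                            # inode whose space isn't freed until fd 3 closes
exec 3>&-
echo "$contents"
```

Because the fd is opened with append semantics, writes after the truncation land at the new end of file, so no data is stranded; after the `rm`, the writer keeps the deleted inode alive until it closes the descriptor.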

Francisco Vanegas said...

Some of you may find this variation of the same script useful:
I have a lot of different log files in the same directory, so this script looks through a directory for .log files and rotates any that are bigger than 512 KB. Additionally, it deletes compressed log files older than 90 days.

#!/bin/bash
# Rotate every .log file over 512 KB in LOGDIR, then purge old archives.
timestamp=$(date +%Y%m%d)
LOGDIR=/backup/shells/log
find "$LOGDIR" -name '*.log' -type f -size +512k | while IFS= read -r logfile
do
    echo "$logfile"
    newlogfile=$logfile.$timestamp
    cp "$logfile" "$newlogfile"
    cat /dev/null > "$logfile"
    gzip -f -9 "$newlogfile"
done
# Delete compressed logs older than 90 days:
find "$LOGDIR" -name '*.gz' -type f -mtime +90 | xargs rm -f

locuas said...

Works great, regards.