What I'm trying to do
Rotate my logs every day at midnight, archive the rotated files, then upload them to an S3 bucket. All archived files on S3 will be referenced with a timestamp, to make searching easy.
Environment
- EC2 instance running Ubuntu 14.04
Tools
- awscli
- logrotate
Basic Steps
- configure logrotate
- create a bash script to upload files to S3
- set up an init.d script to upload files on restart or shutdown
Configuration
1. logrotate
- First, make sure that /etc/logrotate.conf has the line include /etc/logrotate.d. This line tells logrotate to pick up any configuration sitting in that directory.
- cd into /etc/logrotate.d/ and create a new config file. In this demo I will use Apache logs as an example; you can replicate the same configuration for other types of logs (syslog, mail logs, ...).
- This is what my configuration file /etc/logrotate.d/apache2 looks like:
/var/log/apache2/*.log {
daily
missingok
rotate 10
compress
delaycompress
ifempty
create 640 root adm
prerotate
/bin/bash /home/ubuntu/backup_current_logs.sh
endscript
postrotate
if /etc/init.d/apache2 status > /dev/null ; then \
/etc/init.d/apache2 reload > /dev/null; \
fi;
endscript
}
- Let me explain what the config does:
- First I tell logrotate to rotate any file ending with .log inside the directory /var/log/apache2/
- daily: rotate my files every day
- missingok: avoid error messages in case the file is missing
- rotate 10: keep 10 rotated files; for example I will have error.log, error.log.1, error.log.2 ... error.log.10
- delaycompress: delay compression by one rotation cycle, so error.log and error.log.1 won't be compressed
- ifempty: rotate the log file even if it is empty
- prerotate: the action I want to execute before rotation; in my case I want to upload files to S3
- postrotate: then I reload apache2
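Before waiting for the nightly run, the rule can be exercised by hand (a quick sanity check; same paths as above):

```shell
# Debug mode: parse the config and print what would happen, without rotating
sudo logrotate -d /etc/logrotate.d/apache2

# Force a rotation now to exercise the prerotate/postrotate hooks end to end
sudo logrotate -f /etc/logrotate.d/apache2
```

The -d run is safe to repeat; the -f run actually rotates the files and triggers the S3 upload.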
2. Uploading to S3
- This is my script that uploads the logs to S3. I'm appending the date and instance ID for better reference. Also, if you have multiple environments, you can replace "environment" in the bucket path with the adequate env name.
#!/bin/bash
# this script will create an archive file of the current apache logs
tar -C /var --warning=no-file-changed -zcf /tmp/log.tar.gz log/apache2/error.log log/apache2/access.log
# then upload the archived file to the S3 bucket
EC2_INSTANCE_ID="$(wget -q -O - http://instance-data/latest/meta-data/instance-id)"
su - ubuntu -c "aws s3 cp /tmp/log.tar.gz s3://my-bucket-name/environment/$(date +%Y-%m-%d)_${EC2_INSTANCE_ID}.tar.gz"
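For reference, the object key the cp line produces looks like this (a minimal sketch; i-0abc123 is a made-up instance ID standing in for the metadata lookup):

```shell
#!/bin/bash
# Hypothetical instance ID; the real script reads it from instance metadata
EC2_INSTANCE_ID="i-0abc123"
# Same key expression as in the upload command above
KEY="environment/$(date +%Y-%m-%d)_${EC2_INSTANCE_ID}.tar.gz"
echo "$KEY"   # e.g. environment/2015-07-01_i-0abc123.tar.gz
```

Because the date prefix sorts lexicographically, aws s3 ls s3://my-bucket-name/environment/2015-07- lists a whole month of archives at once.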
3. Managing shutdown/restart
- just copy /home/ubuntu/backup_current_logs.sh to /etc/init.d/
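Copying the file into /etc/init.d/ puts it in place, but on Ubuntu 14.04 (sysvinit) it still has to be linked into the halt/reboot runlevels before it runs at shutdown. A minimal sketch, assuming the script is installed as backup_current_logs (it ignores its arguments, so being called with "stop" simply runs the backup):

```shell
sudo cp /home/ubuntu/backup_current_logs.sh /etc/init.d/backup_current_logs
sudo chmod +x /etc/init.d/backup_current_logs
# Create K10 symlinks in runlevels 0 (halt) and 6 (reboot) so the script
# is invoked during shutdown and restart
sudo update-rc.d backup_current_logs stop 10 0 6 .
```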