• Daily, Weekly and Monthly backup from Linux to Windows

    Scenario :-

    Set up a backup script to take daily, weekly and monthly backups to a remote Windows server. I wrote this bash script to meet a client's requirement and it has worked perfectly. With some minor changes you can use the same script to set up daily, weekly and monthly backups to a local Linux server. I have set up separate scripts for the daily, weekly and monthly backups, so that anyone searching for the same scenario can follow the logic easily.

    Overview :-

    1) Created a folder "backup" on the backup drive of the remote Windows server.

    2) Created a user named backup on the Windows server (the same credentials are used in the /etc/fstab entry below).

    3) Granted this user the necessary privileges on the backup directory.
    4) Mounted the Windows backup drive through /etc/fstab as follows (with the Windows username and password):

    //winserver/backup  /WIN_BACKUP  cifs  username=backup,password=pass 0 0
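    Create the mount point and test the mount before relying on cron. The commands below are a minimal sketch; keeping the password in a root-only credentials file instead of directly in /etc/fstab is optional (the credentials= option of mount.cifs).

    # Create the mount point and mount everything listed in /etc/fstab
    mkdir -p /WIN_BACKUP
    mount -a
    # Verify that the share is mounted
    df -h /WIN_BACKUP

    # Optional: keep the password out of /etc/fstab
    # the fstab entry would then read:  //winserver/backup  /WIN_BACKUP  cifs  credentials=/root/.smbcredentials 0 0
    printf 'username=backup\npassword=pass\n' > /root/.smbcredentials
    chmod 600 /root/.smbcredentials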

    5) Decided to run separate daily, weekly and monthly backup scripts, scheduled from cron as follows.

     

    Crontab entries :-

    #### Daily backup at 03:01 am, Monday to Saturday
    01 03  * * 1-6  /bin/bash /usr/local/scripts/daily_backup.sh > /dev/null 2>&1

    #### Weekly backup - every Sunday at 05:01 am
    01 05  * * 0    /bin/bash /usr/local/scripts/weekly_backup.sh > /dev/null 2>&1

    #### Monthly backup - first day of every month at 06:01 am
    01 06 1 * *  /bin/bash /usr/local/scripts/monthly_backup.sh > /dev/null 2>&1

     

    Backup Script Files :-

     

    1)   /usr/local/scripts/daily_backup.sh

    #!/bin/bash
    PATH=/usr/bin:/bin:/usr/sbin:/sbin
    export PATH
    ## Day of the week, e.g. "Mon", "Tue", "Wed" ...
    path=`date +%a`
    # The folders Mon,Tue,Wed,...,Sat were created beforehand inside /WIN_BACKUP/daily
    # (see the one-liner after this script)
    # Backup scripts directory
    rsync -avzub --copy-links /usr/local/scripts/   /WIN_BACKUP/daily/$path/scripts
    # Backup website files
    rsync -avzub --copy-links --exclude 'log' --exclude 'logs' --exclude '*.tar' --exclude '*.gz' --exclude '*.zip' --exclude '*.sql' /usr/local/www/   /WIN_BACKUP/daily/$path/UsrLocalWww
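    The weekday folders referenced in the comment can be created once up front with a single command (assuming the share is already mounted):

    # One-time creation of the weekday folders used by the daily script
    mkdir -p /WIN_BACKUP/daily/{Mon,Tue,Wed,Thu,Fri,Sat}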

     

    2)  /usr/local/scripts/weekly_backup.sh

    #!/bin/bash
    PATH=/usr/bin:/bin:/usr/sbin:/sbin
    export PATH
    # Weekly destination folder, named sun-YYYYMMDD
    dest=/WIN_BACKUP/website_weekly/sun-`date +%Y%m%d`
    mkdir -p $dest/scripts
    mkdir -p $dest/UsrLocalWww
    # Backup scripts directory
    rsync -avzub --copy-links /usr/local/scripts/    $dest/scripts
    # Backup website files
    rsync -avzub --copy-links --exclude 'log' --exclude 'logs' --exclude '*.tar' --exclude '*.gz' --exclude '*.zip' --exclude '*.sql' /usr/local/www/  $dest/UsrLocalWww

     

    3) /usr/local/scripts/monthly_backup.sh

    #!/bin/bash
    PATH=/usr/bin:/bin:/usr/sbin:/sbin
    export PATH
    ## Current month, e.g. "Jan", "Feb", "Mar" ...
    path=`date +%b`
    # Create the corresponding directories for the current month
    mkdir -p /WIN_BACKUP/website_monthly/$path/scripts
    mkdir -p /WIN_BACKUP/website_monthly/$path/UsrLocalWww
    # Backup scripts directory
    rsync -Cavz /usr/local/scripts/   /WIN_BACKUP/website_monthly/$path/scripts
    # Backup all websites
    rsync -Cavz --exclude 'log' --exclude 'logs' --exclude '*.tar' --exclude '*.gz' --exclude '*.zip' --exclude '*.sql' /usr/local/www/   /WIN_BACKUP/website_monthly/$path/UsrLocalWww

    It took me almost a day to complete this setup and it is now running fine 🙂 I hope this documentation helps anyone looking for the same setup.

    For MySQL daily, weekly and monthly backup setup, check: MySql Backup Script

     

     

  • Linux Local and Remote backup using rsync and rdiff

    Scenario :-

    1) Take a daily backup of the Linux server to a local backup disk/secondary drive.

    2) Copy the backup to a remote server.

    (Using this method we keep our data in three different places.)

    Setup Local Backup

    1) Create a folder /backup

    2) rsync the necessary files to /backup:

    rsync -avzub /boot/   /backup/boot

    rsync -avzub /usr/local/scripts/   /backup/User_local_scripts

    rsync -avpuzb --copy-links --exclude 'logs' --exclude 'log' --exclude '*.tar' --exclude '*.gz' --exclude '*.zip' --exclude '*.sql'  /usr/local/apache/htdocs    /backup/apache_htdocs
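    If you want this to run unattended every day, the three commands above can go into a small script called from cron. The script name and time below are just assumptions (run it before the 01:30 remote copy set up further down):

    # crontab entry - run the local backup every night at 00:45 (script name and time are examples)
    45 00 * * * /bin/bash /usr/local/scripts/local_backup.sh > /dev/null 2>&1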

    Copy the local backup to the remote server

    Now all the necessary files are copied to /backup. We need to copy this to a remote backup server for data redundancy.

    I found rdiff-backup to be a nice tool for copying folders over a network.

    Rdiff-backup is a very nice backup tool. We can copy one directory to another, possibly over a network. The target directory ends up a copy of the source directory, but extra reverse diffs are stored in a special subdirectory of that target directory, so you can still recover files lost some time ago. The idea is to combine the best features of a mirror and an incremental backup. rdiff-backup also preserves subdirectories, hard links, dev files, permissions, uid/gid ownership, modification times, extended attributes, ACLs, and resource forks. Also, rdiff-backup can operate in a bandwidth-efficient manner over a pipe, like rsync. Thus you can use rdiff-backup and ssh to securely back a hard drive up to a remote location, and only the differences will be transmitted. Finally, rdiff-backup is easy to use.
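    For example, once the backups below have been running for a while, you can list the stored increments and pull back an older copy of a file. This is a quick sketch; the remote path matches the script further down and the file name is only a placeholder:

    # List the increments stored on the remote side
    rdiff-backup --list-increments remoteserverIP::/backup/adminlogs

    # Restore a file as it was 10 days ago into /tmp (somefile is a placeholder)
    rdiff-backup -r 10D remoteserverIP::/backup/adminlogs/somefile /tmp/somefile.restored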

    1) Download and install rdiff-backup

    yum install rdiff-backup   (easiest method)

    Or you can download the tarball and install it from source:

    wget http://savannah.nongnu.org/download/rdiff-backup/rdiff-backup-1.2.8.tar.gz

    2) tar -zxf rdiff-backup-1.2.8.tar.gz

    3) cd rdiff-backup-1.2.8

    4) python setup.py build

    5) python setup.py install

    6) rdiff-backup --version   (verify the installation)

    7) Create a script to copy the directory to the remote server.

    vi  rdiff-backupTo-Remote.sh

    #!/bin/bash

    # hostname adminlogs.info

    # this will copy the /backup folder to /backup/adminlogs on the remote server

    rdiff-backup /backup  remoteserverIP::/backup/adminlogs  > /dev/null 2>&1

    8) Add this script to cron (execute it at 1:30 am):

    30  01 * * * bash  rdiff-backupTo-Remote.sh

    NB :- You should allow passwordless SSH login from the local server to the remote server.

    You can use the following URL to set up the passwordless SSH login (a short sketch follows the link):

    http://adminlogs.info/2011/05/27/passwordless-login-ssh/
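    In short, the key-based login can be set up like this (a minimal sketch; the remote user root is an assumption):

    # Generate a key pair on the local server (accept the defaults, empty passphrase)
    ssh-keygen -t rsa

    # Copy the public key to the remote backup server
    ssh-copy-id root@remoteserverIP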

    That's it… You have successfully configured your backup script. You can relax; your data is safe and available on three disks 😉

  • Daily, Weekly and Monthly MySQL backup

     

    AutoMySQLBackup is a fantastic script for creating daily, weekly and monthly backups of MySQL databases. I am using it on most of my DB servers. Its main advantage is that it compresses the backup files very well.

    Download the script :-

    http://sourceforge.net/projects/automysqlbackup/

    Configuration :-

    mkdir -p /etc/automysqlbackup/

    cp  automysqlbackup-2.5.1-01.sh  /etc/automysqlbackup/automysqlbackup.sh

    vi   /etc/automysqlbackup/automysqlbackup.conf

    To create the config file, copy the code between "### START CFG ###" and "### END CFG ###" from automysqlbackup-2.5.1-01.sh into /etc/automysqlbackup/automysqlbackup.conf, for example as shown below.
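    A quick way to extract that block is with sed (a sketch; adjust the file name if your version differs):

    # Copy everything between the START/END CFG markers into the config file
    sed -n '/### START CFG ###/,/### END CFG ###/p' automysqlbackup-2.5.1-01.sh > /etc/automysqlbackup/automysqlbackup.conf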

    In the configuration you need to change only the following:

    # Username to access the MySQL server e.g. dbuser

    USERNAME=root

    # Password to access the MySQL server e.g. password

    PASSWORD=password

    # Host name (or IP address) of MySQL server e.g localhost

    DBHOST=localhost

    # List of DBNAMES for Daily/Weekly Backup e.g. "DB1 DB2 DB3"

    DBNAMES="wordpress_db forum_db"

    # Backup directory location e.g. /backups

    BACKUPDIR="/backup/MysqlAuto"

    # Email Address to send mail to? ([email protected])

    MAILADDR=[email protected]

    # Set up a cron job to execute the MySQL backup script:

    30 00 * * *  bash  /etc/automysqlbackup/automysqlbackup.sh
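    Before relying on cron, it is worth running the script once by hand and checking the backup directory defined above:

    # Run the backup manually and inspect the result
    bash /etc/automysqlbackup/automysqlbackup.sh
    ls -R /backup/MysqlAuto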

     

    Enjoy!! You are done. It's a fantastic tool for MySQL daily, weekly and monthly backups. You can sleep without worrying about your DBs.