I'm new to Parallels Panel; I use version 11.0.9. I want to back up a MySQL database daily. First of all, what is the best way to do daily database backups in Plesk? I'm trying to do this via Scheduled Tasks using the mysqldump command, although I'm not sure that's right.
I chose the time and day first and then switched on the task. I typed the following command into the command line field.
This created only a blank file, and when I run the command without gzip, nothing changes.
1. Is mysqldump the right command for database backups?
2. Should I define the full path for mysqldump, gzip, and the database? If so, how can I find out the full paths of mysqldump, gzip, and my database? I can't see their locations in the panel.
3. I can't see any error message, and there is no log file in the httpdocs folder. Where would the log file be?
4. It may be a weird question, but should the username be the database user, or should I write "root"?
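For reference, this is the kind of command I've been trying. The paths below are just what I'd expect on a typical Linux server (they're assumptions, not confirmed locations); `which` should show the real ones:

Code:
# find the full paths first
which mysqldump   # e.g. /usr/bin/mysqldump
which gzip        # e.g. /bin/gzip

# dump one database and compress it in one step
# DB_USER, DB_PASSWORD, DB_NAME and the output path are placeholders
/usr/bin/mysqldump --user=DB_USER --password=DB_PASSWORD DB_NAME | /bin/gzip > /var/www/vhosts/example.com/backups/DB_NAME.sql.gz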
Here's how to back up MySQL databases by cron and have the backups sent to you by email, or uploaded by FTP. It is based on a script I found on another website, though I can confirm it is fully working.
Change the commented variables in the following file and save it as backup.sh:
Code:
#!/bin/sh
#
# This script will backup one or more mySQL databases
# and then optionally email them and/or FTP them
#
# It creates a different backup file for each database by day of the week,
# i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1).
# This is a trick so that you never have more than 7 days' worth of backups
# on your FTP server: as the weeks rotate, the files from the same day of
# the previous week are overwritten.
#
# Run it from cron like this:
# /bin/sh /home/user/directory/scriptname.sh > /dev/null

############################################################
#===> site-specific variables - customize for your site

# List all of the MySQL databases that you want to backup in here,
# each separated by a space.
# If not run by root, only one db per script instance
databases="mydbname"

# Directory where you want the backup files to be placed
backupdir=/home/mydomain/backups

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# MySQL username and password
userpassword=" --user=myuser --password=mypasswd"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Full paths to the helper commands used below
# (run "which gzip" etc. to confirm the locations on your system)
gzip=/usr/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# Send backup? Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="mySQL Backup"
mailto="me@mydomain.com"

#===> site-specific variables for FTP
ftpbackup="y"
ftpserver="myftpserver.com"
ftpuser="myftpuser"
ftppasswd="myftppasswd"
# If you are keeping the backups in a subdir of your FTP root
ftpdir="forums"

#===> END site-specific variables - customize for your site
############################################################

# Get the day of the week (0-6).
# This lets us save one backup for each day of the week.
# Just alter the date command if you want to use a timestamp instead.
DOW=`date +%w`

# Create our backup directory if it is not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
  echo "Not a directory: ${backupdir}"
  exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
  $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
  rm -f ${backupdir}/${DOW}-${database}.sql.gz
  $gzip ${backupdir}/${DOW}-${database}.sql
done

# Send the backups via email
# (uuencode takes the input file and the name to give it when decoded)
if [ $sendbackup = "y" ]
then
  for database in $databases
  do
    $uuencode ${backupdir}/${DOW}-${database}.sql.gz ${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu
    $mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
  done
fi

# FTP the files to the off-site server
if [ $ftpbackup = "y" ]
then
  echo "FTP file to $ftpserver FTP server"
  for database in $databases
  do
    echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
    ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
  done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit

Upload backup.sh to your server, to any directory you want. A directory that is not web-accessible will stop your login information from being seen by just anyone.
You should chmod the file to 777:
Code:
chmod 777 backup.sh

If you uploaded this file from a Windows machine, you will need to convert it to Unix format. Run the following command via SSH in the appropriate directory:
Code:
dos2unix backup.sh

If you don't have dos2unix installed, you can install it using yum, if you have that:
Code:
yum install dos2unix

If you don't have yum, get it here.
You may want to test the script at this point to make sure it's doing what you want it to. Change to the appropriate directory and run this command:
Code:
./backup.sh

Once you're happy with it, enter it into the crontab to run daily (or whenever you want). Cron jobs vary a lot depending on the configuration of your system, so check how to do it on yours. The command you will need to run by cron is:
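Code:
/bin/sh /home/user/directory/backup.sh > /dev/null

(The path is a placeholder: use whatever directory you uploaded backup.sh to, as in the comment at the top of the script.) For example, a crontab entry to run it daily at 2 a.m. would look something like this:

Code:
0 2 * * * /bin/sh /home/user/directory/backup.sh > /dev/null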
I need information about the option 'Check /etc/cron.daily/logrotate for /tmp noexec workaround' that appears in the Server Check of the CSF test. Can someone explain this function to me? Should I apply it?
A couple of weeks ago, I encountered a big server crash on my VPS that caused me a lot of downtime. I'm currently trying to figure out a solution to keep a current "clone" of all of my server accounts on a second server. That way, if I ever encounter another crash, I'll be able to simply change DNS information to have all accounts "live" on the backup server.
I appreciate any input, advice, suggestions, criticism, etc. Here's what I have in mind...
1. I currently have all of my websites hosted on Server #1. (We'll call it that for the sake of avoiding confusion.)
2. I have an automatic nightly backup set up via cPanel / WHM that backs up all accounts from Server #1 to Server #2 via FTP. (Server #2 is in a totally different data center, with a different provider.)
3. The nightly backup packages all of the accounts as "cPanel Full Backups." So, they're compressed, and as such, they don't work as "live, functioning websites" on Server #2.
The only way to make them "live and functional" on Server #2 would be to use cPanel to "restore" the backups.
4. So, what I'd like to do is set up a cron job that automatically restores the backups on Server #2 each morning. That way, Server #2 would always have a functional version of all my accounts that is less than a day old. Then, if Server #1 ever crashed, I'd just have to change the DNS information to point to Server #2, and all of the websites would be live again, without my having to manually restore all of the backups through cPanel.
I don't know a ton about cron. As I understand it, cron couldn't literally make cPanel restore the backups. But I'm assuming that cPanel's "Restore" function just runs through a series of processes, so it seems logical that, if you knew what those processes were, you could write a cron job to automate them every morning.
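For illustration, I imagine something like this (purely a sketch; I'm assuming cPanel's /scripts/restorepkg script is available on Server #2, and the archive directory and filename pattern are guesses about where the nightly FTP transfer drops the files):

Code:
#!/bin/sh
# restore-all.sh - run nightly from cron on Server #2 (sketch only)
# /backup/incoming is an assumed location; adjust to wherever the FTP
# transfer actually places the cPanel backup archives
for archive in /backup/incoming/cpmove-*.tar.gz
do
  [ -e "$archive" ] || continue   # skip if nothing matched the glob
  # --force overwrites the existing account left from the previous day
  /scripts/restorepkg --force "$archive"
done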
Did that make sense?
If so, is it possible?
Do you guys have any input, criticism, etc?
If it's doable, can you make any suggestions that would help me make this happen?
Finally, if you think you have the expertise to make this happen, I'd be interested in chatting with you via Private Message. I'd be willing to pay to have this done. (Note to moderators: I'm not sure if my last comment is allowed; if not, please feel free to remove it. I'm far more interested in the discussion of this process than in soliciting help with making it happen.)
I seem to be having a problem where, periodically, the data in one file gets corrupted. I haven't been able to figure out a pattern to it, so I want to run a command via crontab that creates a copy of the file each day. To avoid overwriting previous backups, each day's copy would need a unique filename, like:
cp filename filename-2008-07-03
Is there a way to include the current year, month, and day as variables in a Linux command?
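For reference, command substitution with `date` would do it (the filename here is just an example):

Code:
# copies filename to e.g. filename-2008-07-03, using today's date
cp filename "filename-$(date +%Y-%m-%d)"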
I have my WHM/cPanel installation configured with daily and weekly backups. I checked at what time of the day the server load was at the minimum and configured the cPanel backup cron to run then.
The problem now is: Backing up a few hundred accounts results in a high server load. My server configuration:
Dual Processor Quad Core Xeon 5335 2.0GHz with 4GB RAM and 2 x 250GB SATA HDD hosted at SoftLayer.
The accounts are located on the first HDD and the backup archives are placed on the second HDD.
What can I do about this? I'd like to take daily backups of all accounts, but not if my server load climbs to 10. That kind of renders the cPanel backup feature useless if it doesn't even work on a powerful server like this one.
Would it help to use an application such as Auto Nice Daemon to give the backup process a lower priority? Then again, that wouldn't work on the MySQL dumps, would it? And I suspect it's not a CPU problem but an I/O wait problem: other processes have to wait for disk access because the disk-intensive backup process is running.
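One thing worth trying (just a sketch; it assumes the legacy /scripts/cpbackup entry point and a kernel whose I/O scheduler honors ionice) is to launch the backup with both a low CPU priority and idle I/O priority:

Code:
# run cpbackup at the lowest CPU priority and in the "idle" I/O class
# (ionice -c3 only grants disk time when no other process wants it)
nice -n 19 ionice -c3 /scripts/cpbackup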
I have seen that ResellerZoom provides daily full backups to their customers. How should I configure my WHM so that it creates a daily backup and deletes the old ones?
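Old backups can also be pruned from cron. Something like this in root's crontab is one approach; the path and filename pattern are assumptions, so point them at wherever your WHM backup configuration actually writes its archives:

Code:
# at 5 a.m., remove backup archives older than 7 days (path is an example)
0 5 * * * find /backup -type f -name "*.tar.gz" -mtime +7 -delete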
How can I decrease the server load when the daily backup starts? Before the backup starts, the load is between 0.80 and 1.20; once the daily backup begins, I see a very high load, from 16.00 up to 32 and even 40.
Is there any solution to decrease the load a lot when the backup runs, from 3 to 7?