I need to back up a database. It's not that huge, just a small database. I'm on a cPanel server, so at the moment I'm taking the backup manually via phpMyAdmin. Now I need to set up a cron job that takes a backup every 12 hours and saves it to a specified path.
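One simple way to do that from cPanel's Cron Jobs screen is to run mysqldump twice a day. The database name, credentials and target path below are placeholders for your own; note that % has to be escaped as \% inside a crontab, and in the cPanel interface the schedule goes in the timing fields while everything after it goes in the command box:

Code:
# Run at 00:00 and 12:00 every day; the timestamp keeps each run from overwriting the last
0 0,12 * * * /usr/bin/mysqldump --user=dbuser --password=dbpass dbname | gzip > /home/youruser/backups/dbname-$(date +\%Y\%m\%d-\%H\%M).sql.gz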
I just went through a week of nightmares because my host lost hard drives and, with them, my data.
I want to work out how I can cron cPanel backups. I did a search online for a cPanel backup manager, but I didn't have much luck.
Or, maybe there's some other way to get regular backups?
I don't really need to backup my whole sites. I keep my files locally, but of course I don't have the databases locally -- or the email records.
All I "really" need is the database SQL files, so that I can rebuild my forums if need be, and also the email setup files -- because I have over 50 domains and I have emails set up on all of them.
I want to have them emailed to myself.
There's an option in cPanel too which allows me to "generate/download a full backup". Well, how do I "restore" that backup? I.e., I know how to back up and restore the home directory, databases and email stuff (there are 3 separate options in cPanel), but there doesn't seem to be anywhere to restore the full backup.
Anyway, can anyone give me advice on how to keep backups of my cPanel accounts? I don't trust hosts anymore. They say they back stuff up, but they never do.
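On the restore question: as far as I know, the "full backup" archive that cPanel generates (cpmove-username.tar.gz) can't be restored from within cPanel itself; it has to be restored by someone with root access on the destination server, normally with cPanel's restorepkg script. A sketch of what your host would run (the path is just an example):

Code:
# Run as root on the destination cPanel server
/scripts/restorepkg /home/cpmove-username.tar.gz

That's also why the home directory, database and email backups are the ones you can restore yourself from the cPanel interface.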
I'm setting up a new Windows server running only an FTP server (FileZilla) to receive remote backups.
How can I get my cPanel account to make backups and upload them to my FTP server as a cron job? Of course I have the option to back up via cPanel, but that has to be done manually every time (entering the remote FTP IP, account, password, etc.).
Is there an option in cPanel/WHM to do a full backup of all the accounts in my reseller account, or do I need to do it per account? On some of my servers I only have a reseller account, without root access.
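As far as I know there is no WHM option for a reseller without root to schedule a full backup of every account, so one workaround is a small per-account script run from each account's cron that archives the home directory, dumps the database, and pushes both to your FTP server. This is only a sketch: the account name, database credentials, paths and FTP details are placeholders, and it assumes curl is available on the server (it usually is on cPanel boxes):

Code:
#!/bin/sh
# Per-account backup pushed to a remote FTP server (all names below are placeholders)
STAMP=`date +%Y%m%d`
BACKUPDIR=/home/youruser/backups
mkdir -p $BACKUPDIR

# Archive the home directory, leaving out the backups themselves
tar -czf $BACKUPDIR/home-$STAMP.tar.gz --exclude=backups -C /home youruser

# Dump the account's database
/usr/bin/mysqldump --user=dbuser --password=dbpass dbname | gzip > $BACKUPDIR/dbname-$STAMP.sql.gz

# Upload both files to the FileZilla FTP server
for f in $BACKUPDIR/home-$STAMP.tar.gz $BACKUPDIR/dbname-$STAMP.sql.gz
do
    curl -T $f ftp://ftpuser:ftppass@your.ftp.server/backups/
done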
Is it possible to set up a cron job that sends a backup copy for a single user on the server to another host via FTP (given its domain name, user and password), either weekly or 3 times a week?
Here is how to back up MySQL databases by cron and have the backups sent to you by email, or uploaded by FTP. It is based on a script I found at another website, though I can confirm it is fully working.
Change the commented variables in the following file and save it as backup.sh:
Code:
#!/bin/sh
#
# This script will back up one or more MySQL databases
# and then optionally email them and/or FTP them.
#
# It creates a different backup file for each database by day of the week,
# i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1).
# This is a trick so that you never have more than 7 days worth of backups
# on your FTP server: as the weeks rotate, the files from the same day of
# the previous week are overwritten.
#
# Run it from cron like this:
# /bin/sh /home/user/directory/scriptname.sh > /dev/null

############################################################
#===> site-specific variables - customize for your site

# List all of the MySQL databases that you want to back up here,
# each separated by a space.
# If not run by root, only one db per script instance.
databases="mydbname"

# Directory where you want the backup files to be placed
backupdir=/home/mydomain/backups

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# MySQL username and password
userpassword=" --user=myuser --password=mypasswd"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Unix commands used below; adjust the paths if they live elsewhere on your server
gzip=/usr/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# Send backup? Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="mySQL Backup"
mailto="me@mydomain.com"

#===> site-specific variables for FTP
ftpbackup="y"
ftpserver="myftpserver.com"
ftpuser="myftpuser"
ftppasswd="myftppasswd"
# If you are keeping the backups in a subdir of your FTP root
ftpdir="forums"

#===> END site-specific variables - customize for your site
############################################################

# Get the day of the week (0-6).
# This allows us to save one backup for each day of the week.
# Just alter the date command if you want to use a timestamp instead.
DOW=`date +%w`

# Create our backup directory if it's not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
   echo "Not a directory: ${backupdir}"
   exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
   $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
   rm -f ${backupdir}/${DOW}-${database}.sql.gz
   $gzip ${backupdir}/${DOW}-${database}.sql
done

# Send the backups via email
if [ $sendbackup = "y" ]
then
   for database in $databases
   do
      $uuencode ${backupdir}/${DOW}-${database}.sql.gz ${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu
      $mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
   done
fi

# FTP the files to the off-site server
echo "FTP file to $ftpserver FTP server"
if [ $ftpbackup = "y" ]
then
   for database in $databases
   do
      echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
   done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit

Upload backup.sh to your server, to any directory you want. A directory which is not web-accessible will stop your login information from being seen by just anyone.
You should chmod the file so it's executable; 700 is enough, and since the file contains your MySQL and FTP passwords it's safer than 777:

Code:
chmod 700 backup.sh

If you uploaded this file from a Windows machine you will need to convert it to Unix format. Run the following command over SSH in the appropriate directory:
Code:
dos2unix backup.sh

If you don't have dos2unix installed, you can install it using yum if you have it:
Code:
yum install dos2unix

If you don't have yum, get it here.
You may want to test the script at this point to make sure it's doing what you want it to. Change to the appropriate directory and run this command:
Code:
./backup.sh

Once you're happy with it, enter it into the crontab to run daily (or whenever you want). Cron jobs vary a lot depending on the configuration of your system, so check Google for how to do it on yours. The command you will need to run by cron is simply the script itself, as shown in the example below.
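For example, assuming you saved the script as /home/user/directory/backup.sh (adjust the path to match your own setup), a crontab entry that runs it every night at 2am would look something like this:

Code:
0 2 * * * /bin/sh /home/user/directory/backup.sh > /dev/null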
I want to create a cron job that deletes backup files that are older than 5 days. I have created a shell script (/usr/local/src/runjob.sh) that runs successfully from the SSH command line:
Code:
# pwd
/usr/local/src
# ./runjob.sh

Here is the script:
Code:
#!/bin/sh
#
find /usr/local/apache/sites/*/BACKUPS/ -maxdepth 1 -atime +5 -iname 'test*.txt' -exec rm {} \;

The script has 777 permissions and is owned by root.
As shown in the cron log, the job does run, but it does not delete the files. And there are files older than 5 days in the directory that meet the find criteria.
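One thing to check is the -atime test: it matches on last access time, and anything that reads those files (another backup or scanning job, for example) resets it, so old files can still look "fresh" to find. For backups, -mtime (last modified) is usually what you want. It also helps to keep the script's output when cron runs it instead of throwing it away, so you can see what actually matched. A sketch along those lines (the log path is just an example):

Code:
#!/bin/sh
# Delete backups older than 5 days by modification time, logging what gets removed
find /usr/local/apache/sites/*/BACKUPS/ -maxdepth 1 -mtime +5 -iname 'test*.txt' \
    -print -exec rm {} \; >> /var/log/backup-cleanup.log 2>&1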
I just want to move from HostMonster to a dedicated server.
The problem is the database (vBulletin), which is about 1 GB in size.
When I try to make a backup of it through SSH, it gets to 723 MB and then stops, because it puts a high load on HostMonster's system and they don't allow more than 20% CPU.
I talked to them and they told me they can't do anything. I asked them to make the backup for me, but they said they can't do anything for me, and I'm sure they could log in as root and do it over SSH in less than a minute.
How can I make the backup, or is there another way to move the database to the new server?
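One way to stay under a shared host's CPU limit is to avoid dumping the whole 1 GB in one go: dump the database table by table, compress as you go, and pause between tables so the load averages out. This is only a sketch, and every name, path and credential in it is a placeholder:

Code:
#!/bin/sh
# Dump a large database one table at a time to keep the load low on a shared host
DB=forumdb
USER=dbuser
PASS=dbpass
OUT=/home/youruser/dumps
mkdir -p $OUT

for table in `mysql --user=$USER --password=$PASS -N -e "SHOW TABLES" $DB`
do
    nice -n 19 mysqldump --user=$USER --password=$PASS --quick --skip-lock-tables $DB $table | gzip > $OUT/$table.sql.gz
    sleep 10   # give the server a breather between tables
done

On the new server you can gunzip each file and load it with the mysql client; having per-table files also means you can resume where you left off if the transfer gets cut off.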
I have a tiny MySQL database (10MB) and I'm looking for some remote backup service. Something that will back up the database and let me recover it if the host is in trouble.
I strongly prefer something simple and fully automated.
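If the host allows remote MySQL connections (in cPanel that's the Remote MySQL screen, where you whitelist the connecting IP), a database that small can simply be pulled by a nightly cron job on any machine you control that has the MySQL client tools installed. The host name, credentials and path below are placeholders:

Code:
# On the machine doing the pulling: grab a compressed dump every night at 3am
0 3 * * * mysqldump -h mysite.example.com --user=dbuser --password=dbpass dbname | gzip > /backups/dbname-$(date +\%F).sql.gz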
I have a 1GB MySQL database (compresses down to 300MB) and would like an automated method of backing it up to a remote server. Both accounts are shared hosting accounts (if it matters, both are running CPanel, no shell access on either).
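Even without SSH, cPanel's Cron Jobs feature will run shell commands for you, so one option is a cron entry on the source account that dumps and compresses the database and then pushes it to the other account over FTP with curl (which is installed on most cPanel servers). Everything below is a placeholder to adapt:

Code:
0 4 * * * /usr/bin/mysqldump --user=dbuser --password=dbpass dbname | gzip > /home/youruser/dbname.sql.gz && curl -sS -T /home/youruser/dbname.sql.gz ftp://ftpuser:ftppass@othersite.example.com/backups/

Keep in mind the passwords end up in your crontab, so nothing web-accessible should be able to read it; at roughly 300 MB compressed the transfer will take a while, but it runs unattended.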
I recently found out that I don't have a database backup for one of my biggest websites.
The database I'm talking about has 228 tables (198.6 MB)
In the backup file I only get this :
-- MySQL dump 10.11
--
-- Host: localhost    Database: lxxxxxxa_sXXXl
-- ------------------------------------------------------
-- Server version       5.0.45-community-log

/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;
/*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */;
/*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */;
/*!40101 SET NAMES utf8 */;
/*!40103 SET @OLD_TIME_ZONE=@@TIME_ZONE */;
/*!40103 SET TIME_ZONE='+00:00' */;
/*!40014 SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0 */;
/*!40014 SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0 */;
/*!40101 SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='NO_AUTO_VALUE_ON_ZERO' */;
/*!40111 SET @OLD_SQL_NOTES=@@SQL_NOTES, SQL_NOTES=0 */;
The other databases I have tried are all OK, but none is this big. Is there any size limit?
I have uploaded a screenshot with the settings of my WHM.
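For what it's worth, mysqldump has no size limit that would stop at 198 MB; a dump that contains only the header almost always means mysqldump hit an error right after connecting (a crashed table, a permissions problem, or the process being killed). Running it by hand and keeping the error output usually shows the reason; a quick check along these lines, with placeholder names and paths:

Code:
mysqldump --user=dbuser --password=dbpass dbname > /home/youruser/test-dump.sql 2> /home/youruser/dump-errors.txt
echo "exit code: $?"
cat /home/youruser/dump-errors.txt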
Due to the LayeredTech price hike, I'm going to cancel my server with LT within two weeks.
Does anyone know an easy way to back up all the domains and MySQL databases on FreeBSD/DirectAdmin? Is there a built-in feature in DirectAdmin that can do that?
Is the file you get through cPanel (going into Backup and downloading the database backup) the same type of file as the one you get by exporting through phpMyAdmin and saving it gzipped?
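They should be interchangeable: both are gzip-compressed SQL dumps of the database, and either can be restored the same way, although the exact export options (and the comment header) differ a little between mysqldump and phpMyAdmin. If you want to check, peek at the first few lines of each file:

Code:
zcat backup_dbname.sql.gz | head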
When I try to make a backup of MSSQL 2012 on Plesk 11.5, I get the following error:

Error: dbbackup failed: Unable to backup database 'DBANME'. Unable to open connection: A specified logon session does not exist. It may already have been terminated.

I have already configured the temp/network directory.