I have two VPSes which run a single MySQL-intensive site. The first VPS runs cPanel and has about 50 databases, 200 MB each. The second has another 30 databases of the same size.
What would be the best method to backup this website daily?
At the moment:
I use automysqlbackup to back up all the databases at midnight (this crashes the VPSes for about 10 minutes each night). This dumps each one into a zipped file.
Then rsync copies the changed files (forum attachments, cPanel changes, etc., i.e. the whole server) to an off-site location.
Are there any easier ways to do this? The databases are most important!
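One gentler option, sketched below under assumptions (the databases can be dumped one at a time, nice/ionice are available on the VPS, and the credentials file and paths are placeholders), is to dump per database at low priority instead of all at once:

Code:
#!/bin/sh
# Hypothetical low-impact nightly dump: one database at a time, at low CPU/IO priority.
# Credentials come from an option file so they never appear on the command line.
BACKUPDIR=/backup/mysql
CNF=/root/.my.cnf            # contains [client] user= / password=
mkdir -p "$BACKUPDIR"

for db in $(mysql --defaults-extra-file="$CNF" -N -e 'SHOW DATABASES' \
            | grep -Ev '^(information_schema|performance_schema)$'); do
    nice -n 19 ionice -c2 -n7 \
        mysqldump --defaults-extra-file="$CNF" --single-transaction --quick "$db" \
        | gzip > "$BACKUPDIR/$db.sql.gz"
    sleep 5    # short pause between databases so the VPS can catch its breath
done

The existing rsync job can then ship the .sql.gz files off-site exactly as it does now; dumping serially at low priority is usually far easier on a VPS than one big midnight run.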
I'm working on a new project that involves some heavy data processing in the preparation stage. As an example, one of the setup jobs is now running on a P4-3.0HT desktop at 100% CPU and I estimate it'll be finished in just under a week! Memory and I/O usage are minimal but it's extremely CPU-hungry.
There are other similar jobs to follow, and some correlation between the number of CPU cycles and the quality of results, so I foresee an ongoing need for computing power for at least a few months.
Viable options are to continue running my own hardware (probably supplementing the P4 with something a bit newer), Amazon EC2 or a cheap dedicated server - I can get a good handle on the cost and performance of each of these.
But I wonder if I'd get more bang per buck from a few VPSs? Some of the VPS benchmark results are spectacular so I suspect the answer could be yes, in the short term anyway. But I don't want to hog the host-node's CPU to the point where it degrades other users' performance or gets my account shut down... Then again, judging by the performance of the VPSs I've used for hosting the node seems to have a lot of spare CPU available...
So (finally) to the questions: Am I crazy to even consider this? If not, which technology would be most suitable? (I'm thinking Xen because of its reputation for better isolation.) Has anyone else done anything similar?
Calling on all hosting and server experts here. (If you're not an expert, still feel free to take an educated stab at this. But please leave out totally made-up answers or foolish answers like "Have LittleJoeShmoe Hosting services do it all for $9.99/month".)
Scenario: if you knew, or were planning on developing, a site that you knew would generate millions to tens of millions of page views a day, how would you go about supporting a site with that much traffic? The site would not serve up videos, but the average page would be up to 75-100 kB. It would incorporate databases (user logins, accounts, user-submitted content, server-side scripting, CMS, etc.)
Don't assume anything. Don't assume you have too little or too much money. Just, what would you plan out to accommodate such a scenario?
What hosting companies would you use? Would you do it in-house and build your own datacenter? Farm out the server management? How much would it cost to implement your plan? What platform would you recommend for a site to handle this much traffic?
I have some NAS storage for backups with my dedicated hosting provider. I set up automatic daily backups in WHM to back up the databases and accounts. The server hosts one site. When the backup runs it does a mysqldump, and it essentially takes the site down for the entire backup because nobody can connect to the database while the dump is happening. Is there a better way to do the backup so that this won't happen, whether that's a different method of backing up to the NAS, or not using the NAS at all and using some other method?
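If the tables are (or can be converted to) InnoDB, a dump taken with --single-transaction reads a consistent snapshot without locking anything, so the site keeps serving while the backup runs. A minimal sketch, with placeholder credentials and paths:

Code:
# Hypothetical lock-free dump (InnoDB tables only): --single-transaction takes a
# consistent snapshot, --quick streams rows instead of buffering them in memory,
# and the output is compressed before it lands on the NAS.
mysqldump --single-transaction --quick --user=backupuser --password=secret mysite_db \
    | gzip > /mnt/nas/mysite_db-$(date +%Y%m%d).sql.gz

MyISAM tables still get locked for the duration of a dump, so if the busy tables are MyISAM the usual workaround is to dump from a replication slave instead of the live server.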
I am trying to backup and restore a database and seem to be running in to the following errors.
ERROR 1005 (HY000) at line 29: Can't create table './tcadmin/tc_bandwidth_type.frm' (errno: 121)
Source machine: Windows Server 2003, MySQL 5.0.11
Destination machine: RHEL 4.x, MySQL 4.1.20
I think the way I am backing up the database is the real problem.
I have checked the compatibility mode option in the administration panel for backups, run a full backup, and locked the tables, and I still get the error.
I have done some research and I think there is a command-line option I need to use to make the dump compatible with MySQL 4.x systems, but nothing I am doing seems to be right. Can anyone here offer some insight into what the problem might be?
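For what it's worth, errno 121 on a CREATE TABLE usually means a key or InnoDB constraint with the same name already exists on the destination, often left over from an earlier half-finished import. It may also help to produce the dump from the command line with 4.0-compatible output instead of relying on the GUI's compatibility checkbox. A hedged sketch, with placeholder credentials (the database name is taken from the error message):

Code:
# On the Windows / MySQL 5.0 machine: dump in a form older servers can read.
mysqldump --user=root --password --compatible=mysql40 --skip-set-charset tcadmin > tcadmin.sql

# On the RHEL / MySQL 4.1 machine: drop any half-imported copy first, so stale
# tables or constraint names cannot trigger errno 121, then reload the dump.
mysql --user=root --password -e "DROP DATABASE IF EXISTS tcadmin; CREATE DATABASE tcadmin;"
mysql --user=root --password tcadmin < tcadmin.sql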
Today the ibdata1 file got corrupted and I found out that rsync was NOT backing it up. I had thought that rsyncing the live files was good enough and that if I had to restore I could just run mysqlrepair and get everything back.
What is the RIGHT way to back up MySQL databases, and do I want to somehow get rid of the InnoDB stuff?
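Rsyncing the live ibdata1 and log files while mysqld is running is not a usable backup: InnoDB keeps all its committed data in those shared files and they change underneath the copy. The usual safe route is a logical dump; a minimal sketch, assuming credentials in a root-only option file:

Code:
# Hypothetical nightly logical backup of everything, InnoDB included.
# --single-transaction keeps the InnoDB tables consistent without locking them;
# restore later with:  gunzip < all-databases-DATE.sql.gz | mysql
mysqldump --defaults-extra-file=/root/.my.cnf \
    --all-databases --single-transaction --quick \
    | gzip > /backup/all-databases-$(date +%F).sql.gz

If you prefer file-level copies, stop mysqld (or snapshot the filesystem) first so ibdata1 and the table files are copied in a consistent state. Either way there is no need to get rid of InnoDB, only to back it up with a method that understands it.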
Does anyone have experience in backing up mysql incrementally? From what I've read in the docs it's possible using the binary logs but I haven't been able to find a good resource on how to make this work.
I have a database that is over 5GB. There are a few MyISAM tables that are insert/select only and one InnoDB table that receives updates/deletes/selects/inserts.
Ideally I wouldn't have to backup the 5GB every night, I'd prefer to only get the items that have changed. If I could make this work, then I could also get backups more often rather than once a night.
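A common pattern, sketched here under the assumption that binary logging is (or can be) enabled in my.cnf and with placeholder paths and credentials, is a periodic full dump plus the binary logs in between:

Code:
# my.cnf ([mysqld] section) - turn on binary logging:
#   log-bin = /var/lib/mysql/binlog

# Weekly full backup: --flush-logs starts a fresh binlog and --master-data=2
# records (as a comment) which binlog position the dump corresponds to.
mysqldump --user=root --password=secret --single-transaction --flush-logs --master-data=2 \
    mydb | gzip > /backup/full-$(date +%F).sql.gz

# Incremental backup, run as often as you like: rotate to a new binlog,
# then copy the binlog files off the box (the newest one is still nearly empty).
mysqladmin --user=root --password=secret flush-logs
rsync -a /var/lib/mysql/binlog.[0-9]* backupuser@offsite:/backup/binlogs/

# Restore = load the last full dump, then replay the binlogs written after it:
#   gunzip < full-DATE.sql.gz | mysql mydb
#   mysqlbinlog binlog.000042 binlog.000043 | mysql mydb

With a mix of MyISAM and InnoDB tables the full dump is only approximately consistent unless the tables are locked while it runs, so it is worth scheduling the weekly run at a quiet hour.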
I get 3 different sizes. I would figure that adding DROP TABLE statements would make the dump larger, but between the first way and the third way I use, how come there is a size difference of 2 kB?
Which is the right way to back up a MySQL database safely?
Do any of you know any good MySQL backup software that handles the whole package for you? Meaning, it backs up the whole shebang (users, DBs, etc.) with only SSH access, and preferably restores it too?
I host with HostGator and I was wondering if there are any software programs or services, or even something in my cPanel, that can automatically grab my MySQL databases and everything on my server and make a backup on my personal PC.
I know I can do this manually, but I would like something that does it automatically once a week or so to ensure my clients' data is always backed up.
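If the account has SSH enabled (HostGator will usually switch it on if you ask), the simplest automatic setup is to pull the backup from the PC side on a schedule. A sketch, assuming a Linux/Mac PC or Cygwin on Windows, with every name below a placeholder:

Code:
#!/bin/sh
# Hypothetical weekly pull, run from cron on the personal PC, e.g.:
#   0 3 * * 0  /home/me/bin/pull-backup.sh
ssh user@myserver.example.com \
    "mysqldump --user=dbuser --password=dbpass mydatabase | gzip" \
    > ~/backups/mydatabase-$(date +%F).sql.gz

# Pull the web files as well; only changed files are transferred.
rsync -az user@myserver.example.com:public_html/ ~/backups/public_html/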
I only have FTP access to the webhost, with no cPanel access and no shell access. Is there a way to back up the MySQL database that is being used for ZenCart on this host?
I'm thinking that the answer to this question is "no". However, it would be wonderful if someone told me "yes".
However, my site has a large MySQL database (200 megabytes) and it's impossible to back up or restore via phpMyAdmin. Godaddy has a feature in their control panel where you can back up or restore your MySQL database to a special directory in your web HTTP file space, which you can upload/download via FTP. I use this a lot.
Is there any other web hosting company that offers such a feature? I haven't seen this anywhere except on Godaddy.
As to why I need to get away from Godaddy:
Basically what they did was, my site was getting too much traffic (even though it's still well under Godaddy's advertised limits for that plan)... so they sent me an email that says:
"Your site is using too much server resources. We have moved you to a new server to protect our other customers. Please identify steps to reduce your site traffic and contact us."
They moved my site to what is, apparently, a punishment area (Godaddy Hell) where all the high volume sites go. It is so extremely slow, my site might as well not exist... it is inaccessible to my visitors for all intents and purposes.
Is there a method of running a backup script, as root, using MySQL, without passing or storing the root password in the clear?
I have tried OpenSSH with a nologin option using certificates, but I still have problems. I need to run it as a cron job every so often (without getting into specifics), securely.
I understand I can use another user, other than root, with read-only access, but I need to back up the whole DB at once, not specific users' DBs.
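One common answer is to keep the password out of the script, the crontab and the process list by putting it in an option file that only root can read, and pointing mysqldump at it. A sketch, with placeholder paths:

Code:
#!/bin/sh
# db-backup.sh - hypothetical root backup with no password on any command line.
# The credentials live in /root/.my.cnf (chmod 600, owned by root):
#     [client]
#     user     = root
#     password = the-root-password
# and root's crontab calls this script, e.g. nightly at 02:30:
#     30 2 * * * /root/bin/db-backup.sh > /dev/null 2>&1
umask 077
mysqldump --defaults-extra-file=/root/.my.cnf --all-databases --single-transaction \
    | gzip > /root/backups/all-databases-$(date +%F).sql.gz

The password is still stored on disk, but only root can read the file and it never shows up in ps output or in the crontab, which is about as far as a stock mysqldump setup can take it.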
I have a corrupted VPS with some MySQL databases on it.
I want to back up the databases and restore them on the new server.
The cPanel service is down on the VPS and I cannot transfer accounts.
Is it possible to do the following?
1- Zip or tar the username folder in the /home dir.
2- Zip or tar the database directory in the /var/lib/mysql folder.
Either by SSH or the file manager through HyperVM.
Then transfer the files somewhere safe...
When the VPS is rebuilt, restore the archives and databases to the folders I backed up before the rebuild.
There are also some accounts on a terminated VPS where only an image of the VPS is available, so I can only tar the files using a jailed shell account, move them to the new VPS and untar them.
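That plan can work, with one caveat: the files under /var/lib/mysql are only safe to tar if mysqld is stopped first (a running server, especially with InnoDB, can leave the copy internally inconsistent). A sketch of the idea, with placeholder names:

Code:
# Hypothetical rescue of one account from the broken VPS, run over SSH.
/etc/init.d/mysql stop             # or: service mysql stop

tar -czpf /root/username-home.tar.gz -C /home          username
tar -czpf /root/username-db.tar.gz   -C /var/lib/mysql username_dbname

/etc/init.d/mysql start

# Move the archives somewhere safe, then on the rebuilt VPS:
#   tar -xzpf username-home.tar.gz -C /home
#   tar -xzpf username-db.tar.gz   -C /var/lib/mysql
#   chown -R mysql:mysql /var/lib/mysql/username_dbname

Restoring raw database directories like this is only reliable when both servers run the same MySQL version and the tables are MyISAM; for InnoDB, or across versions, a mysqldump taken on the old VPS (if mysqld will still start) is the safer thing to carry over.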
My site is database-driven and runs on around 15 MySQL databases. I want to download a local copy of these databases daily; however, if I try to back them up via cPanel (on a WHM VPS) I get blank files. Each database is around 55 MB and growing. I can back them up one by one via phpMyAdmin, but it takes around 5 minutes per database, meaning around 40 minutes per night.
Is there any solution for having a script dump them automatically so I can download them via FTP? I've tried [url].htm but it gives me blank files.
This is a guide on how to back up MySQL databases by cron and have the backups sent to you by email, or have them uploaded by FTP. It is based on a script I found at another website, though I can confirm it is fully working.
Change the commented variables in the following file and save it as backup.sh:
Code:
#!/bin/sh

# This script will backup one or more mySQL databases
# and then optionally email them and/or FTP them
#
# This script will create a different backup file for each database by day of the week
# i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1)
# This is a trick so that you never have more than 7 days worth of backups on your FTP server.
# As the weeks rotate, the files from the same day of the prev week are overwritten.
#/bin/sh /home/user/directory/scriptname.sh > /dev/null

############################################################
#===> site-specific variables - customize for your site

# List all of the MySQL databases that you want to backup in here,
# each separated by a space
# If not run by root, only one db per script instance
databases="mydbname"

# Directory where you want the backup files to be placed
backupdir=/home/mydomain/backups

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# gzip, uuencode and mail commands, use the full path names here
gzip=/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# MySQL Username and password
userpassword=" --user=myuser --password=mypasswd"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Send Backup? Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="mySQL Backup"
mailto="me@mydomain.com"

#===> site-specific variables for FTP
ftpbackup="y"
ftpserver="myftpserver.com"
ftpuser="myftpuser"
ftppasswd="myftppasswd"
# If you are keeping the backups in a subdir of your FTP root
ftpdir="forums"

#===> END site-specific variables - customize for your site
############################################################

# Get the Day of the Week (0-6)
# This allows us to save one backup for each day of the week
# Just alter the date command if you want to use a timestamp instead
DOW=`date +%w`

# Create our backup directory if it is not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
   echo "Not a directory: ${backupdir}"
   exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
   $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
   rm -f ${backupdir}/${DOW}-${database}.sql.gz
   $gzip ${backupdir}/${DOW}-${database}.sql
done

# Send the backups via email
if [ $sendbackup = "y" ]
then
   for database in $databases
   do
      $uuencode ${backupdir}/${DOW}-${database}.sql.gz > ${backupdir}/${database}.sql.gz.uu
      $mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
   done
fi

# FTP the backups to the off-site server
echo "FTP file to $ftpserver FTP server"
if [ $ftpbackup = "y" ]
then
   for database in $databases
   do
      echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
   done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit

Upload backup.sh to your server, to any directory you want. A directory which is not web-accessible will stop your login information being seen by just anyone.
You should make the file executable, and since it contains your MySQL and FTP passwords it is best kept readable only by you, so chmod it to 700:
Code: chmod 700 backup.sh
If you uploaded this file from a Windows machine you will need to convert it to Unix format. Run the following command by SSH in the appropriate directory:
Code: dos2unix backup.sh
If you don't have dos2unix installed, you can install it using yum if you have that:
Code: yum install dos2unix
If you don't have yum, get it here.
You may want to test the script at this point to make sure it's doing what you want it to. Change to the appropriate directory and run this command:
Code: ./backup.sh
Once you're happy with it, enter it into the crontab to run daily (or whenever you want). Cron jobs vary a lot depending on the configuration of your system, so check Google for how to do it on your system. The command you will need to run by cron is the one shown in the comment near the top of the script:
Code: /bin/sh /home/user/directory/backup.sh > /dev/null
I have the following problem: I have over 20 sites on the server and each site has its own database. Is there a way to speed up the backup and the transfer to the other server?
The method I use right now is the following: first I archive an entire directory using the command tar -pczf name.tar.gz public_html, and then I repeat that for each directory. I think I could simply archive all the required directories in one go, but that would take too much time, so if I drop the connection during archiving nothing gets archived at all. So I think the best solution would be to create some kind of batch command that runs in the background, meaning the command won't stop if the client loses its connection.
So let's say I have 2 sites and two directories located in different places.
One is at /home/site1 and the other at /home/site2, so I think I would need to put these commands into a batch file: tar -pczf site1.tar.gz /home/site1 and tar -pczf site2.tar.gz /home/site2
Will that work? Also, the second part: MySQL databases. I found that if I log into phpMyAdmin as root I can see all the databases. I managed to export all the databases, but the question is whether importing them back into phpMyAdmin will work. I think phpMyAdmin creates a statement for each database along the lines of "if there is no db sitename_mysqlbase, create it", but as far as I know phpMyAdmin has a limit on the size of the MySQL database it can import. Could that be done with the command-line mysql import/export instead?
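For the archiving part, running the tar commands under nohup (or inside screen) lets them keep going if the SSH session drops, and for the databases the command-line client has none of phpMyAdmin's upload limits. A sketch with placeholder names:

Code:
# Archive each site in the background; the jobs survive an SSH disconnect.
nohup tar -pczf /root/site1.tar.gz -C /home site1 > /root/tar-site1.log 2>&1 &
nohup tar -pczf /root/site2.tar.gz -C /home site2 > /root/tar-site2.log 2>&1 &

# Dump every database into one file on the old server...
mysqldump --user=root --password --all-databases | gzip > /root/alldbs.sql.gz

# ...and load it on the new one. The dump already contains the
# "CREATE DATABASE IF NOT EXISTS" statements, and no phpMyAdmin upload
# limit applies because phpMyAdmin is not involved at all.
gunzip < /root/alldbs.sql.gz | mysql --user=root --password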
I have a 1GB MySQL database (compresses down to 300MB) and would like an automated method of backing it up to a remote server. Both accounts are shared hosting accounts (if it matters, both are running cPanel, and there's no shell access on either).
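Without shell access the options narrow, but cPanel accounts can normally still run cron jobs, and a cron job can run a one-line command. One possibility, assuming the source host has curl installed and allows outbound FTP, with every name below a placeholder:

Code:
# Hypothetical cron job on the source account (schedule it in cPanel's
# "Cron Jobs" screen): dump the database, compress it, and upload it
# straight to the other shared account over FTP.
mysqldump --user=cpuser_dbu --password=dbpass cpuser_mydb | gzip \
    | curl -T - ftp://ftpuser:ftppass@backup.example.com/backups/mydb-weekly.sql.gz

If curl is not available, the same cron job could write the dump to a file inside the account and let the other side fetch it on its own schedule.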
Due to the LayeredTech price hike, I'm going to cancel my server with LT within two weeks.
Does anyone know an easy way to back up all the domains and MySQL databases in FreeBSD/DirectAdmin? Is there a built-in feature in DirectAdmin that can do that?
One day I will have to replicate 2 servers together, so I have to stop the webserver and use the MySQL Administrator GUI tool to back it up. My steps follow (I work directly on Server 2 and I want to use MySQL Administrator to monitor and configure the replication):
1. Server 2: SSH to Server 1 -> connect to MySQL -> add "any host" to root access
2. Server 2: use MySQL Administrator to connect to Server 1 as root
3. Server 2: back up from Server 1 to a Desktop location
4. Server 2: restore it using the MySQL Administrator tool.
It works well, but when I click the Restore button, the MySQL Administrator GUI tool disappears!
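If the GUI keeps dying on restore, the same steps can be done from the command line, which also records the binlog position needed for the replication setup. A hedged sketch, assuming binary logging is already enabled on Server 1 and with placeholder host names and credentials:

Code:
# On Server 2: take the dump from Server 1 over the network.
# --master-data=2 writes the matching binlog file/position into the dump as a comment.
mysqldump --host=server1.example.com --user=root --password \
    --all-databases --single-transaction --master-data=2 > server1.sql

# Load it locally on Server 2 instead of using the GUI's Restore button.
mysql --user=root --password < server1.sql

# Then point Server 2 at Server 1 using the coordinates recorded near the
# top of server1.sql (search for "CHANGE MASTER"):
#   CHANGE MASTER TO MASTER_HOST='server1.example.com', MASTER_USER='repl',
#       MASTER_PASSWORD='...', MASTER_LOG_FILE='...', MASTER_LOG_POS=...;
#   START SLAVE;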