Backup MySQL Intensive VPS
Jun 12, 2009
I have two VPSes running a single MySQL-intensive site. The first VPS runs cPanel and has about 50 databases, 200 MB each; the second has another 30 databases of the same size.
What would be the best method to backup this website daily?
Currently:
I use automysqlbackup to back up all the databases at midnight (this brings the VPSes down for about 10 minutes each night). It dumps each database into a zipped file.
Then rsync copies the changed files (forum attachments, cPanel changes, etc. - the whole server) to an off-site location.
Are there any easier ways to do this? The databases are the most important part!
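One approach that usually softens the midnight hit is to dump each database one at a time at low CPU and disk priority, rather than letting automysqlbackup hammer everything at once. A minimal sketch, assuming shell access, Linux's nice/ionice, and credentials stored in ~/.my.cnf (all paths are placeholders):
Code:
#!/bin/sh
# Dump each database individually at idle priority so the nightly
# backup doesn't starve the live site. Credentials are read from
# ~/.my.cnf rather than passed on the command line.
backupdir=/backup/mysql
mkdir -p "$backupdir"
for db in $(mysql -N -e 'SHOW DATABASES' | grep -v '^information_schema$')
do
  nice -n 19 ionice -c3 mysqldump --single-transaction --quick "$db" \
    | gzip > "$backupdir/$db.sql.gz"
done
--single-transaction keeps InnoDB tables readable during the dump; for MyISAM-only databases it does not guarantee a consistent snapshot, but it also takes no locks.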
View 1 Replies
May 8, 2009
My /var partition was full, so I moved /var/lib/mysql to /backup/mysql/.
All the files seem to have copied over, and I changed my.cnf and restarted MySQL, but now none of the sites that use a database work any more.
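For what it's worth, the usual culprits after moving a datadir are ownership and the socket path (web apps keep looking for the socket in the old location). A hedged checklist, assuming the new location is /backup/mysql and a stock CentOS-style init script (service names vary by distro):
Code:
# Stop MySQL before touching anything
service mysqld stop
# The new datadir must be owned by the mysql user, recursively
chown -R mysql:mysql /backup/mysql
# In /etc/my.cnf, point both the server and the clients at the new paths:
#   [mysqld]
#   datadir=/backup/mysql
#   socket=/backup/mysql/mysql.sock
#   [client]
#   socket=/backup/mysql/mysql.sock
# Simpler alternative: leave my.cnf alone and symlink the old path
#   ln -s /backup/mysql /var/lib/mysql
service mysqld start
The symlink route is often the least painful, because PHP apps frequently hardcode the default socket path.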
View 4 Replies
Jul 14, 2008
I'm working on a new project that involves some heavy data processing in the preparation stage. As an example, one of the setup jobs is now running on a P4-3.0HT desktop at 100% CPU, and I estimate it'll be finished in just under a week! Memory and I/O usage are minimal but it's extremely CPU-hungry.
There are other similar jobs to follow and some correlation between the number of cpu cycles and the quality of results so I foresee an ongoing need for computing power for at least a few months.
Viable options are to continue running my own hardware (probably supplementing the P4 with something a bit newer), Amazon EC2 or a cheap dedicated server - I can get a good handle on the cost and performance of each of these.
But I wonder if I'd get more bang per buck from a few VPSs? Some of the VPS benchmark results are spectacular so I suspect the answer could be yes, in the short term anyway. But I don't want to hog the host-node's CPU to the point where it degrades other users' performance or gets my account shut down... Then again, judging by the performance of the VPSs I've used for hosting the node seems to have a lot of spare CPU available...
So (finally) to the questions: Am I crazy to even consider this? If not, which technology would be most suitable? (I'm thinking Xen because of its reputation for better isolation.) Has anyone else done anything similar?
View 7 Replies
Aug 2, 2008
My client needs a dedicated server that can handle the following:
1) Intensive DB queries from many operators for a custom project.
2) When data processing completes, the DB generates a PDF of up to 10 MB in size; the server also runs MAPLAB applications.
3) It should be able to run backups - or please suggest an alternative that always keeps a copy of recent data.
4) Proactive virus scanning of incoming FTP content (is that possible?)
Considering SoftLayer with either:
Single Processor Quad Core Xeon 3220 - 2.40GHz (Kentsfield) - 2 x 4MB cache
8 GB RAM
+ 147GB SA-SCSI 10K RPM
or, preferably:
Dual Processor Quad Core Xeon 5335 - 2.00GHz (Clovertown) - 2 x 8MB cache
8 GB FB-DIMM Registered 533/667
+ 147GB SA-SCSI 10K RPM
View 9 Replies
Apr 14, 2008
I own a couple of Xeon server setups. All use cPanel/WHM.
A client asked yesterday if we could have ffmpeg-php installed on the server so that they could run phpfox.
I have heard ffmpeg is resource-intensive. Will it make a big difference on a dual-core server? Should I install ffmpeg-php, or risk losing a client?
If I should install ffmpeg-php: I have been having trouble doing so; can anyone help me with it?
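For the installation itself, the usual procedure at the time was to install ffmpeg from a third-party repo and then build the PHP extension from source. This is a sketch only - the version number and download URL below are examples, and package names vary by repository:
Code:
# ffmpeg and its headers, e.g. from the RPMForge repo
yum install ffmpeg ffmpeg-devel
# build the PHP extension (version number is an example)
wget http://downloads.sourceforge.net/ffmpeg-php/ffmpeg-php-0.6.0.tbz2
tar xjf ffmpeg-php-0.6.0.tbz2
cd ffmpeg-php-0.6.0
phpize
./configure && make && make install
# enable it (php.ini path shown is the usual cPanel location)
echo 'extension=ffmpeg.so' >> /usr/local/lib/php.ini
service httpd restart
As for resources: the extension itself is cheap; it's the actual video transcoding that eats CPU, so the load depends entirely on how much video the phpfox site processes.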
View 4 Replies
Oct 22, 2006
I would like to know how resource-intensive the ClamAV scanner is. Should I make it available to my VPS clients/resellers?
Can I set it up to run as root? How?
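ClamAV's cost is bursty: clamscan loads the whole signature database (tens of MB of RAM) and runs the CPU hard while a scan is in progress, but costs nothing between runs. Running it as root from cron is straightforward; a minimal sketch, with placeholder paths:
Code:
#!/bin/sh
# /etc/cron.daily/clamscan-home -- daily scan of user home dirs, run as root
# -r recurse, -i report infected files only
/usr/bin/clamscan -ri --log=/var/log/clamscan.log /home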
View 0 Replies
Oct 4, 2009
We were on shared hosting until we were asked to move off because our database was using a lot of CPU - how much, we don't know.
Anyway, we moved to a 1 GB Hyper-V VPS; it seems we have one core of an E5420, but we presume the server is shared between 12 people.
When our ASP application runs, the MSSQL server hits 100% CPU and the website performs slower than the shared server we were on.
Where can we go from here? Going from shared hosting to a dual/quad-core dedicated server is a large jump, and a massive jump in price.
What about a host with a high-performance dedicated MSSQL server shared between customers?
View 14 Replies
Mar 28, 2008
Calling on all hosting and server experts here. (If you're not an expert, still feel free to take an educated stab at this. But please leave out totally made-up or foolish answers like "Have LittleJoeShmoe Hosting do it all for $9.99/month".)
Scenario:
If you knew, or were planning on developing, a site that you knew would generate millions to tens of millions of page views a day, how would you go about supporting that much traffic? The site would not serve video, but the average page would weigh 75-100 kB. It would incorporate databases (user logins, accounts, user-submitted content, server-side scripting, a CMS, etc.).
Don't assume anything. Don't assume you have too little or too much money. Just: what would you plan out to accommodate such a scenario?
What hosting companies would you use? Would you do it in-house and build your own datacenter? Farm out the server management? How much would it cost to implement your plan? What platform would you recommend for a site handling this much traffic?
View 5 Replies
Aug 9, 2008
I have a cPanel server.
I found today that my server was getting overloaded - the load spiked to 60-80.
I sat and watched the top command for an hour. Suddenly I noticed something going on: a loop0 process started to eat over 100% CPU.
I tried to stop httpd in another console window using
service httpd stop
After 4 tries it stopped, and then loop0 calmed down and the load dropped back to normal (below 1).
View 3 Replies
May 25, 2009
I have some NAS storage for backups with my dedicated hosting provider, and I set up automatic daily backups in WHM to back up the databases and accounts. The server hosts one site. When the backup runs, it does a MySQL dump, which essentially takes the site down for the duration because nobody can connect to the database while the dump is happening. Is there a better way to do the backup so this won't happen - whether a different method of backing up to the NAS, or skipping the NAS entirely in favour of some other method?
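If the tables are InnoDB, the standard fix is mysqldump's --single-transaction option: it takes a consistent snapshot inside a transaction instead of locking the tables, so the site stays up for the duration. A sketch with placeholder names and paths:
Code:
# Non-blocking, consistent dump of an InnoDB database straight to the NAS.
# --quick streams rows instead of buffering whole tables in memory.
mysqldump --single-transaction --quick -u root -p mydb | gzip > /mnt/nas/mydb.sql.gz
For MyISAM tables there is no true equivalent; the closest options are dumping from a replication slave instead, or accepting a brief lock.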
View 3 Replies
Jan 19, 2007
I am trying to back up and restore a database and seem to be running into the following error.
ERROR 1005 (HY000) at line 29: Can't create table './tcadmin/tc_bandwidth_type.frm' (errno: 121)
Source machine: Windows Server 2003, MySQL 5.0.11
Destination machine: RHEL 4.x, MySQL 4.1.20
I think the real problem is the way I am backing up the database.
I have checked the compatibility-mode option in the administration panel for backup, run a full backup, and locked the tables, and I still get the error.
I have done some research and I believe there is a command-line option I need to use to make the dump compatible with MySQL 4.x, but nothing I've tried seems right. Can anyone offer some insight into what the problem might be?
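The command-line option you are thinking of is most likely mysqldump's --compatible flag, run on the 5.0 source machine; errno 121 on the destination usually points at the dump containing 5.0-only syntax or duplicate constraint names. A sketch:
Code:
# On the Windows / MySQL 5.0 source: produce a dump MySQL 4.x can read
mysqldump --compatible=mysql40 --add-drop-table -u root -p tcadmin > tcadmin.sql
# On the RHEL / MySQL 4.1 destination:
mysql -u root -p tcadmin < tcadmin.sql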
View 4 Replies
Jun 28, 2007
Today the ibdata1 file got corrupted, and I found out that rsync was NOT backing it up. I had thought that rsyncing the live files was good enough, and that if I ever had to restore I could just run mysqlrepair and get everything back.
What is the RIGHT way to back up MySQL databases, and should I somehow get rid of the InnoDB stuff?
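The short answer is that rsyncing a live datadir is never safe for InnoDB: ibdata1 changes while it is being copied, so the copy is internally inconsistent and usually unrepairable. The two standard safe approaches, sketched with placeholder paths:
Code:
# Option 1: logical dump while the server runs (consistent for InnoDB)
mysqldump --single-transaction --all-databases | gzip > /backup/all.sql.gz
# Option 2: cold file copy -- safe only because mysqld is stopped
service mysqld stop
rsync -a /var/lib/mysql/ /backup/mysql-cold/
service mysqld start
And no - don't get rid of InnoDB just to make rsync happy; back it up properly instead.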
View 14 Replies
Apr 14, 2009
Does anyone have experience backing up MySQL incrementally? From what I've read in the docs it's possible using the binary logs, but I haven't been able to find a good resource on how to make it work.
I have a database that is over 5 GB. There are a few MyISAM tables that are insert/select only, and one InnoDB table that receives updates/deletes/selects/inserts.
Ideally I wouldn't have to back up the full 5 GB every night; I'd prefer to grab only the items that have changed. If I could make that work, I could also take backups more often than once a night.
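The binary-log recipe is roughly: one full dump that records the log position, then periodic copies of the binlogs written since. A sketch, assuming log-bin is already enabled in my.cnf and credentials live in ~/.my.cnf:
Code:
# One-time full backup; --flush-logs starts a fresh binlog at dump time and
# --master-data=2 records the binlog coordinates as a comment in the dump
mysqldump --single-transaction --flush-logs --master-data=2 --all-databases \
  | gzip > /backup/full.sql.gz
# Each night (or hour): rotate to a new binlog, then copy the closed ones
mysqladmin flush-logs
cp /var/lib/mysql/mysql-bin.[0-9]* /backup/binlogs/   # skip the newest, still-open file
# Restore = load the full dump, then replay the binlogs in order:
#   zcat /backup/full.sql.gz | mysql
#   mysqlbinlog /backup/binlogs/mysql-bin.000002 | mysql
Note that --single-transaction only protects the InnoDB table; the insert-only MyISAM tables are usually tolerant of that, but test a restore before trusting it.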
View 6 Replies
Apr 14, 2009
The automatic backups you can set up in cPanel/WHM bundle the entire user account. Is it possible to have them save everything except the MySQL databases?
I have an account with databases totalling over 5 GB, and I would like to handle backing those up separately.
View 6 Replies
Jan 18, 2008
Using cPanel:
- I went to the backup option and downloaded the database backup: 951 kB.
- I went into phpMyAdmin and checked 'Add DROP TABLE': 1.5 MB.
Using SSH:
command mysqldump -u myusername_usernam -pmypassword myusername_name > /home/name/backupz/name.sql = 949k
Three different sizes. I'd figure adding DROP TABLE statements would produce a larger file, but between the first and third methods, why is there a 2 kB difference?
Which is the right way to back up a MySQL database safely?
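Most of the difference is presentation, not data. The 2 kB gap between the cPanel download and the SSH dump is almost certainly just header comments (dump timestamp, server versions). The jump to 1.5 MB in phpMyAdmin is more likely its INSERT style than the DROP TABLE lines: mysqldump's defaults already include --add-drop-table and extended (multi-row) INSERTs, while a one-INSERT-per-row export is much bigger. You can reproduce the larger style from the shell to compare; a sketch (the output filename is a placeholder):
Code:
# One INSERT per row, phpMyAdmin-style; expect a noticeably larger file
mysqldump --skip-extended-insert -u myusername_usernam -p myusername_name > name-rows.sql
All three are valid backups of the same data; the mysqldump route is simply the easiest to automate.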
View 6 Replies
Aug 20, 2007
Does anyone know of good MySQL backup software that handles all the packages for you - backing up the whole shebang (users, DBs, etc.) with only SSH access - and preferably restoring it too?
View 6 Replies
Jul 20, 2007
I host with HostGator, and I was wondering if there are any software programs or services, or even something in my cPanel, that can automatically grab my MySQL databases and everything on my server and make a backup on my personal PC?
I know I can do this manually, but I would like something that does it automatically once a week or so, to ensure my clients' data is always backed up.
View 2 Replies
Sep 26, 2007
I only have FTP access to the webhost - no control panel or shell access. Is there a way to back up the MySQL database that ZenCart is using on this host?
I'm thinking that the answer to this question is "no". However, it would be wonderful if someone told me "yes".
View 2 Replies
Nov 29, 2007
What is the recommended method of backing up a big MySQL database, around 2 GB in size?
Quote:
# mysqldump gallery > gallery.sql
mysqldump: Got error: 1017: Can't find file: './gallery/10tir_user_group.frm' (errno: 24) when using LOCK TABLES
#
I checked the table and it is fine, but I still can't complete the backup.
Quote:
mysql> check table 10tir_user_group;
+--------------------------+-------+----------+----------+
| Table                    | Op    | Msg_type | Msg_text |
+--------------------------+-------+----------+----------+
| gallery.10tir_user_group | check | status   | OK       |
+--------------------------+-------+----------+----------+
1 row in set (0.07 sec)
mysql>
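errno 24 is the OS saying "too many open files": with LOCK TABLES, mysqldump locks (and opens) every table in the database at once, and a large gallery can blow past the server's open-file limit even though every individual table is healthy. Two hedged workarounds:
Code:
# Workaround 1: don't lock all tables at once (fine for InnoDB; a MyISAM
# dump taken this way may be inconsistent if writes happen during it)
mysqldump --skip-lock-tables gallery > gallery.sql
# Workaround 2: raise the server's limit in /etc/my.cnf, then restart mysqld
#   [mysqld]
#   open_files_limit=8192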
View 13 Replies
Sep 24, 2009
I need to get the hell away from GoDaddy, fast!
However, my site has a large MySQL database (200 MB) and it's impossible to back up or restore via phpMyAdmin. GoDaddy has a feature in their control panel where you can back up or restore your MySQL database to a special directory in your web HTTP file space, which you can then download or upload via FTP. I use this a lot.
Is there any other web hosting company that offers such a feature? I haven't seen it anywhere except at GoDaddy.
As for why I need to get away from GoDaddy:
Basically, my site was getting too much traffic (even though it's still well under GoDaddy's advertised limits for the plan), so they sent me an email that says:
"Your site is using too much server resources. We have moved you to a new server to protect our other customers. Please identify steps to reduce your site traffic and contact us."
They moved my site to what is apparently a punishment area (GoDaddy Hell) where all the high-volume sites go. It is so extremely slow that my site might as well not exist; it is inaccessible to my visitors for all intents and purposes.
View 8 Replies
Jul 29, 2008
Is there a method of running a backup script as root against MySQL without passing or storing the root password in the clear?
I have tried OpenSSH with a nologin option using certificates, but I still have problems. I need to run it as a cron job every so often, securely.
I understand I could use another, read-only user instead of root, but I need to back up the whole DB server at once, not specific users' DBs.
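The usual trick is a credentials file in root's home directory, readable only by root: the MySQL client tools pick it up automatically, so no password appears on the command line, in ps output, or in the crontab. A sketch:
Code:
# /root/.my.cnf -- read automatically by mysql and mysqldump:
#   [client]
#   user=root
#   password=thesecretpassword
chmod 600 /root/.my.cnf
# The cron job can now dump everything with no password argument:
mysqldump --all-databases > /backup/all.sql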
View 2 Replies
May 10, 2008
I have a corrupted VPS with some MySQL databases on it.
I want to back up the databases and restore them on a new server.
The cPanel service on the VPS is down, so I cannot transfer the accounts.
Is it possible to do the following?
1. Zip or tar each username folder in /home.
2. Zip or tar each database folder in /var/lib/mysql.
(Either by SSH or via the file manager in HyperVM.)
Then transfer the files somewhere safe...
When the VPS is rebuilt, restore the archives and databases to the folders they were backed up from.
There are also some accounts on a terminated VPS of which only an image is available, so I can only tar the files using a jailed shell account, move them to the new VPS and untar them.
How do I do this while avoiding database problems?
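That plan can work, with one big caveat: raw files from /var/lib/mysql are only a trustworthy backup if mysqld is stopped (or was stopped when the image was taken), and per-database folders alone only fully cover MyISAM tables. A hedged sketch of the steps, with placeholder names:
Code:
# On the broken VPS (stop MySQL first if it is still running)
service mysqld stop
tar -czpf /root/home-username.tar.gz -C /home username
tar -czpf /root/db-dbname.tar.gz -C /var/lib/mysql dbname
# ...copy the archives somewhere safe (scp, FTP)...
# On the rebuilt VPS
tar -xzpf home-username.tar.gz -C /home
tar -xzpf db-dbname.tar.gz -C /var/lib/mysql
chown -R mysql:mysql /var/lib/mysql/dbname
InnoDB tables also keep data in the shared ibdata1 file, so if any table is InnoDB you need ibdata1 (plus the ib_logfile* files) as well, or a mysqldump instead of raw files.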
View 0 Replies
Feb 8, 2008
My site is database-driven and runs on around 15 MySQL databases. I want to download a local copy of these databases daily; however, if I try to back them up via cPanel (on a WHM VPS) I get blank files. Each database is around 55 MB and growing. I can back them up one by one via phpMyAdmin, but it takes around 5 minutes per database - around 40 minutes per night.
Is there any solution for having a script dump them automatically so I can download them via FTP? I've tried [url].htm but it gives me blank files.
View 10 Replies
Aug 26, 2007
How do I back up MySQL user privileges?
This will only back up the databases:
mysqldump --opt -u root -p --all-databases > data.sql
I would like to back up the MySQL user privileges too.
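Strictly speaking, --all-databases already includes the mysql system database, which is where users and privileges live, so that dump does carry them. If you want the privileges in a separate file anyway, dumping just the mysql database is the usual approach; a sketch:
Code:
# Dump only the system database that holds users and grants
mysqldump --opt -u root -p mysql > privileges.sql
# After restoring it on the new server, reload the grant tables:
#   mysql -u root -p -e 'FLUSH PRIVILEGES'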
View 1 Replies
Apr 1, 2007
This is a guide to backing up MySQL databases by cron and having the backups sent to you by email, or uploaded by FTP. It is based on a script I found on another website, though I can confirm it is fully working.
Change the commented variables in the following file and save it as backup.sh:
Code:
#!/bin/sh
# This script will backup one or more mySQL databases
# and then optionally email them and/or FTP them
# This script will create a different backup file for each database by day of the week
# i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1)
# This is a trick so that you never have more than 7 days worth of backups on your FTP server.
# as the weeks rotate, the files from the same day of the prev week are overwritten.
#/bin/sh /home/user/directory/scriptname.sh > /dev/null
############################################################
#===> site-specific variables - customize for your site
# List all of the MySQL databases that you want to backup in here,
# each separated by a space
# If not run by root, only one db per script instance
databases="mydbname"
# Directory where you want the backup files to be placed
backupdir=/home/mydomain/backups
# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump
# MySQL Username and password
userpassword=" --user=myuser --password=mypasswd"
# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"
# Unix Commands
gzip=/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail
# Send Backup? Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="mySQL Backup"
mailto="me@mydomain.com"
#===> site-specific variables for FTP
ftpbackup="y"
ftpserver="myftpserver.com"
ftpuser="myftpuser"
ftppasswd="myftppasswd"
# If you are keeping the backups in a subdir to your FTP root
ftpdir="forums"
#===> END site-specific variables - customize for your site
############################################################
# Get the Day of the Week (0-6)
# This allows to save one backup for each day of the week
# Just alter the date command if you want to use a timestamp
DOW=`date +%w`
# Create our backup directory if not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
echo "Not a directory: ${backupdir}"
exit 1
fi
# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
$mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done
# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
rm -f ${backupdir}/${DOW}-${database}.sql.gz
$gzip ${backupdir}/${DOW}-${database}.sql
done
# Send the backups via email
if [ "$sendbackup" = "y" ]
then
for database in $databases
do
# uuencode takes the input file and the name to use when decoding
$uuencode ${backupdir}/${DOW}-${database}.sql.gz ${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu
$mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
done
fi
# FTP it to the off-site server
echo "FTP file to $ftpserver FTP server"
if [ $ftpbackup = "y" ]
then
for database in $databases
do
echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
done
fi
# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit
Upload backup.sh to your server, to any directory you want. A directory which is not web-accessible will stop your login information being seen by just anyone.
Since the script contains your MySQL and FTP passwords, chmod it so that only its owner can read and execute it:
Code:
chmod 700 backup.sh
If you uploaded this file from a Windows machine you will need to convert the file to Unix format. You should run the following command by SSH in the appropriate directory:
Code:
dos2unix backup.sh
If you don't have dos2unix installed, you can install it using yum if you have that:
Code:
yum install dos2unix
If you don't have yum, get it here.
You may want to test the script at this point to make sure it's doing what you want it to. Change to the appropriate directory and run this command:
Code:
./backup.sh
Once you're happy with it, enter it into the crontab to run daily (or whenever you want). Cron jobs vary a lot depending on the configuration of your system, so check Google for how to do it on your system. The command you will need to run by cron is:
Code:
/path/to/file/backup.sh
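As a concrete example, on most Linux systems you would edit root's crontab with crontab -e and add a line like this for a daily 2 a.m. run:
Code:
0 2 * * * /path/to/file/backup.sh > /dev/null 2>&1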
View 2 Replies
Jun 10, 2009
Is it possible to export the MySQL DB from a HyperVM backup file?
If so, how?
I have failed several times to restore it in HyperVM.
View 9 Replies
Jun 29, 2008
I have the following problem: I have over 20 sites on a server, and each site has its own database. Is there a way to speed up backing them up and transferring them to another server?
The method I use right now is this: first I archive a site's directory using the command tar -pczf name.tar.gz public_html, then repeat for each directory. I could simply archive all the required directories in one go, but that would take too much time, and if I drop the connection during archiving nothing gets archived at all. So I think the best solution would be some kind of batch command that runs in the background, so it won't stop if the client loses its connection.
So let's say I have 2 sites in two directories in different places.
One is at /home/site1 and the other at /home/site2.
So I think I would need to put these commands into a batch file:
tar -pczf site1.tar.gz /home/site1 and tar -pczf site2.tar.gz /home/site2
Will that work? Also, the second part: the MySQL databases. I found that if I log into phpMyAdmin as root I can see all the databases, and I managed to export them all - but will importing them back through phpMyAdmin work? I believe phpMyAdmin writes a "create the database sitename_mysqlbase if it does not exist" statement for each one, but as far as I know phpMyAdmin has a limit on the size of imports. Could this be done with the mysql import/export commands instead?
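A sketch of the background-batch idea, assuming root SSH access and credentials in /root/.my.cnf (so nothing prompts for a password while detached):
Code:
#!/bin/sh
# /root/backup-all.sh -- archive both sites and dump every database
tar -pczf /root/site1.tar.gz /home/site1
tar -pczf /root/site2.tar.gz /home/site2
# --all-databases writes CREATE DATABASE statements, so restoring with
#   zcat /root/alldbs.sql.gz | mysql
# recreates each sitename_* database without phpMyAdmin's upload limit
mysqldump --all-databases | gzip > /root/alldbs.sql.gz
Launch it with nohup /root/backup-all.sh > /root/backup.log 2>&1 & (or inside screen) and it will keep running even if your SSH session drops.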
View 5 Replies
Jul 9, 2008
I have a 1 GB MySQL database (it compresses down to 300 MB) and would like an automated method of backing it up to a remote server. Both accounts are shared hosting accounts (if it matters, both run cPanel, with no shell access on either).
View 13 Replies
Jun 12, 2008
Due to LayeredTech's price hike, I'm going to cancel my server with LT within two weeks.
Does anyone know an easy way to back up all the domains and MySQL databases on FreeBSD/DirectAdmin? Is there a built-in feature in DirectAdmin that can do that?
View 5 Replies
May 15, 2009
Is there any way to create a cron job to back up MySQL data daily (or weekly)? I mean an "auto script" that runs this command every day:
mysqldump -u username -ppassword dataname > file.sql
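A minimal version, with placeholder names: put the command in a script, then register the script with cron. Note there must be no space between -p and the password, or mysql will prompt interactively and treat the next word as the database name:
Code:
#!/bin/sh
# /root/db-backup.sh -- one dump per weekday, overwritten each week
mysqldump -u username -ppassword dataname > /backup/dataname-$(date +%a).sql
# Register it with: crontab -e, then add a line such as
#   0 3 * * * /root/db-backup.sh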
View 11 Replies
Aug 5, 2008
I have some servers running CentOS 5.x.
One day I needed to replicate 2 servers together, so I had to stop the webserver and use the MySQL Administrator GUI tool to back up the data. My steps were as follows (I work directly on server 2 and want to use MySQL Administrator to monitor and configure the replication):
1. Server 2: SSH to server 1 -> connect to MySQL -> allow root access from any host.
2. Server 2: use MySQL Administrator to connect to server 1 as root.
3. Server 2: back up server 1 to the Desktop.
4. Server 2: restore the backup using the MySQL Administrator tool.
It works well, but when I click the Restore button the MySQL Administrator GUI disappears!
What is the problem with the tool, or with server 2?
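Whatever is crashing the GUI, the command-line route sidesteps it and is also the conventional way to seed a replication slave: dump server 1 and load it into server 2 in one SSH pipe. A sketch, assuming /root/.my.cnf holds the credentials on both machines and server1 is a placeholder hostname:
Code:
# Run on server 2: pull a consistent dump from server 1, load it locally.
# --master-data records the binlog position replication should start from.
ssh root@server1 "mysqldump --single-transaction --master-data --all-databases" \
  | mysql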
View 4 Replies