I have a corrupted VPS with some MySQL databases on it.
I want to back up the databases and restore them on the new server.
The cPanel service is down on the VPS, so I cannot transfer accounts.
Is it possible to do so? My plan is:
1- Zip or tar the username folder in the /home dir.
2- Zip or tar the database folders in the /var/lib/mysql folder.
Either by SSH or the file manager through HyperVM.
Then transfer the files somewhere safe...
When the VPS is rebuilt, restore the archives and databases to
the folders I backed them up from before the rebuild.
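For what it's worth, a minimal sketch of steps 1 and 2 over SSH might look like this (the paths, account name, and database names are placeholders, and MySQL should ideally be stopped first so the raw files in /var/lib/mysql are not changing while you copy them):

Code:
# stop MySQL so the table files are consistent while being archived
/etc/init.d/mysql stop
# archive the account's home directory and its database directories
tar -czvf /backup/username-home.tar.gz /home/username
tar -czvf /backup/username-databases.tar.gz /var/lib/mysql/dbname1 /var/lib/mysql/dbname2
/etc/init.d/mysql start

Raw /var/lib/mysql copies restore cleanly for MyISAM tables; if the databases use InnoDB, grab the ibdata* and ib_logfile* files as well, or take mysqldump dumps instead if the MySQL server is still able to run.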
There are also some accounts on a terminated VPS for which only an image of the VPS is available, so I can only tar the files using a jailed shell account, move them to the new VPS, and untar them.
I recently had a hard drive failure and luckily I can still access certain directories on this failed drive. I can still access the /var/lib/mysql/ directory, which holds all the users' databases, and I have backed these all up separately using tar.
Now what I need to know is: how do you restore these database files to another server? I tried simply untarring one of them into the new server's /var/lib/mysql/ directory and it stuffed MySQL up - it went offline. I had to get a cPanel tech to bring MySQL back online.
How can I get these database files to fully work on a new server?
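Raw table files like that do work for MyISAM, but only if MySQL is stopped while you copy them in and the ownership is set back to the mysql user afterwards; InnoDB tables also need the matching ibdata*/ib_logfile* files from the old server, which is usually what breaks a naive untar into a running server. A rough sketch, assuming the archive contains just one database's directory:

Code:
/etc/init.d/mysql stop
# unpack the database directory into the datadir (adjust paths to match how the archive was created)
tar -xzvf dbname-backup.tar.gz -C /var/lib/mysql/
# MySQL refuses to use files it does not own
chown -R mysql:mysql /var/lib/mysql/dbname
/etc/init.d/mysql start

If the important tables turn out to be InnoDB, the safer route is to stand the old datadir up on a scratch MySQL instance, take normal mysqldump dumps from it, and import those on the new server.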
I host with HostGator and I was wondering if there are any software programs or services, or even something in my cPanel, that can automatically grab my MySQL databases and everything on my server and make a backup on my personal PC?
I know I can do this manually, but I would like something that does it automatically once a week or so, to ensure my clients' data is always backed up.
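If HostGator enables SSH on your account, one low-maintenance option (a sketch; the hostname, usernames, passwords, and paths are placeholders) is a weekly cron job on your own PC that pulls a fresh database dump and syncs the site files down:

Code:
# crontab entries on the PC: every Sunday at 3am, dump the database over SSH
# and pull the site files down with rsync
0 3 * * 0 ssh cpaneluser@example.com "mysqldump -u dbuser -p'dbpass' cpaneluser_dbname" > ~/backups/dbname-$(date +\%F).sql
0 3 * * 0 rsync -az cpaneluser@example.com:public_html/ ~/backups/public_html/

On a Windows PC the same two commands can be run from a scheduled task instead of cron.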
My site is database driven and runs on around 15 MySQL databases. I want to download a local copy of these databases daily; however, if I try to back them up via cPanel (on a WHM VPS) I get blank files. Each database is around 55 MB and growing. I can back them up one by one via phpMyAdmin, but it takes around 5 minutes per database, meaning around 40 minutes per night.
Is there any solution for having a script dump them automatically so I can download them via FTP? I've tried [url].htm but it gives me blank files.
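One approach (a sketch; the credentials and output directory are placeholders) is a small script run by cron on the server that dumps every database to its own gzipped file, so all you have to fetch over FTP is one folder:

Code:
#!/bin/sh
# dump each database (except MySQL's own system schemas) to its own gzipped file
outdir=/home/youruser/db-backups
mkdir -p "$outdir"
for db in $(mysql -u youruser -p'yourpass' -N -e 'SHOW DATABASES' | grep -Ev '^(information_schema|mysql)$')
do
    mysqldump -u youruser -p'yourpass' "$db" | gzip > "$outdir/$db.sql.gz"
done

Putting the password on the command line is convenient but visible to other local users; a ~/.my.cnf file with restrictive permissions is the usual alternative.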
This is how to back up MySQL databases by cron and have the backups sent to you by email, or have them uploaded by FTP. It is based on a script I found at another website, though I can confirm it is fully working.
Change the commented variables in the following file and save it as backup.sh:
Code:
#!/bin/sh
# This script will backup one or more mySQL databases
# and then optionally email them and/or FTP them
# This script will create a different backup file for each database by day of the week
# i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1)
# This is a trick so that you never have more than 7 days worth of backups on your FTP server.
# As the weeks rotate, the files from the same day of the prev week are overwritten.
# Cron entry: /bin/sh /home/user/directory/scriptname.sh > /dev/null

############################################################
#===> site-specific variables - customize for your site

# List all of the MySQL databases that you want to backup in here,
# each separated by a space
# If not run by root, only one db per script instance
databases="mydbname"

# Directory where you want the backup files to be placed
backupdir=/home/mydomain/backups

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# MySQL Username and password
userpassword=" --user=myuser --password=mypasswd"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Full paths to the helper programs used below (adjust if they live elsewhere on your system)
gzip=/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# Send Backup? Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="mySQL Backup"
mailto="me@mydomain.com"

#===> site-specific variables for FTP
ftpbackup="y"
ftpserver="myftpserver.com"
ftpuser="myftpuser"
ftppasswd="myftppasswd"
# If you are keeping the backups in a subdir to your FTP root
ftpdir="forums"

#===> END site-specific variables - customize for your site
############################################################

# Get the Day of the Week (0-6)
# This allows to save one backup for each day of the week
# Just alter the date command if you want to use a timestamp
DOW=`date +%w`

# Create our backup directory if not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
   echo "Not a directory: ${backupdir}"
   exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
   $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
   rm -f ${backupdir}/${DOW}-${database}.sql.gz
   $gzip ${backupdir}/${DOW}-${database}.sql
done

# Send the backups via email
if [ $sendbackup = "y" ]
then
   for database in $databases
   do
      $uuencode ${backupdir}/${DOW}-${database}.sql.gz ${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu
      $mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
   done
fi

# FTP it to the off-site server
echo "FTP file to $ftpserver FTP server"
if [ $ftpbackup = "y" ]
then
   for database in $databases
   do
      echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
      ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
   done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit

Upload backup.sh to your server, to any directory you want. A directory which is not web-accessible will stop your login information being seen by just anyone.
You should chmod the file to 777:
Code:
chmod 777 backup.sh

If you uploaded this file from a Windows machine you will need to convert the file to Unix format. Run the following command by SSH in the appropriate directory:

Code:
dos2unix backup.sh

If you don't have dos2unix installed, you can install it using yum if you have that:

Code:
yum install dos2unix

If you don't have yum, get it here.
You may want to test the script at this point to make sure it's doing what you want it to. Change to the appropriate directory and run this command:
Code:
./backup.sh

Once you're happy with it, enter it into the crontab to run daily (or whenever you want). Cron jobs vary a lot depending on the configuration of your system, so check Google for how to do it on your system. The command you will need to run by cron is:

Code:
/bin/sh /home/user/directory/backup.sh > /dev/null
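For example, on most Linux systems a line like the following in the crontab (edited with crontab -e) runs the script every night at 2:30 am; the directory is just a placeholder for wherever you uploaded backup.sh:

Code:
30 2 * * * /bin/sh /home/user/directory/backup.sh > /dev/null 2>&1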
However, my site has a large MySQL database (200 megabytes) and it's impossible to back up or restore via phpMyAdmin. GoDaddy has a feature on their control panel where you can back up or restore your MySQL database to a special directory in your web (HTTP) file space, which you can then upload or download via FTP. I use this a lot.
Is there any other web hosting company that offers such a feature? I haven't seen this anywhere except on Godaddy.
As to why I need to get away from Godaddy:
Basically what they did was, my site was getting too much traffic (even though it's still well under Godaddy's advertised limits for that plan)... so they sent me an email that says:
"Your site is using too much server resources. We have moved you to a new server to protect our other customers. Please identify steps to reduce your site traffic and contact us."
They moved my site to what is, apparently, a punishment area (Godaddy Hell) where all the high volume sites go. It is so extremely slow, my site might as well not exist... it is inaccessible to my visitors for all intents and purposes.
One day I had to replicate two servers, so I had to stop the web server and use the MySQL Administrator GUI tool to back everything up. My steps are below (I work directly on Server 2 and want to use MySQL Administrator to monitor and configure replication):
1. Server 2: SSH to Server 1 -> connect to MySQL -> allow root access from any host
2. Server 2: use MySQL Administrator to connect to Server 1 as root
3. Server 2: back up from Server 1 to a location on the Desktop
4. Server 2: restore it using the MySQL Administrator tool.
It works well, but when I click the Restore button the MySQL Administrator GUI tool disappears!
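If MySQL Administrator keeps vanishing on restore, a command-line fallback (just a sketch; the hostnames and credentials are placeholders) is to dump Server 1 over SSH and pipe the dump straight into Server 2's mysql client, which also lets you record the binary log position for replication with --master-data:

Code:
# run on Server 2: stream a full dump from Server 1 and load it locally
ssh root@server1 "mysqldump --user=root --password=secret --all-databases --master-data=2" \
    | mysql --user=root --password=secret

(--master-data assumes binary logging is already enabled on Server 1; drop it if you only want the data.)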
So your host keeps at least a daily backup of your database, but if you don't get to that backup in time, you may find they only have a backup of an already-crashed database. That means it's important to have a procedure for getting regular backups by other means too. How are you doing this?
I've recently been asked to do a lot of data extraction from a state that has about 20 databases, each with between 10 and 100 tables. I often find myself diagramming things out on paper to try to visualize how everything works together.
I wondered if there's a tool that would "draw" the tables, columns, and relationships? I hesitate to say this, but almost like how MS Access does it, except something that runs on Linux and works with MySQL. Is there such a thing? I know about phpMyAdmin and MySQL's Query Browser, but they're not what I'm looking for.
I am running a dedicated server with Debian, and I installed a community software package that has a lot of MySQL entries, many of which need to be changed to fit my needs.
However, it is very hard to know exactly where each value I need to change is stored. Is there a way to search all database tables for a specific value?
For example, one thing stored in the database is the site's title displayed in the browser's title bar. The software does not give me the option to change it, so I have to find where it is located in the database and change it myself, but it would be extremely time consuming to check all tables one by one for any occurrences of the current title.
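A quick, low-tech way to find it (a sketch; the credentials, database name, and search string are placeholders) is to dump the data with one INSERT per row and grep for the current title, since the matching INSERT lines name the table that holds it:

Code:
mysqldump --user=myuser --password=mypasswd --no-create-info --skip-extended-insert mydbname \
    | grep -i "Current Site Title"

Once you know the table and column, the value can be changed with an ordinary UPDATE statement.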
My HD was failing. So ServerBeach quickly set me up with a fresh box with the dying drive installed as a secondary drive. (nice job guys)
I'm able to mount the secondary drive and browse/copy the files... but when I get to the most important files of all, the database binaries, my /var/lib dir looks like this:
Code:
[root@rosemary v2]# ls -l /mnt/dying/var/lib/
total 68
...
....
-rw-r--r--  1 root root 2171 Sep 14 03:02 logrotate.status
drwxrwsr-x  6 root   41 4096 Oct 12  2005 mailman
drwxr-xr-x  2 root root 4096 Oct 12  2005 misc
?---------  ? ?    ?       ?            ? mysql
drwxr-xr-x  4 root root 4096 Oct 12  2005 nfs
drwxr-xr-x  2 ntp  ntp  4096 Sep 14 10:31 ntp
...
...

And if I try to cd into the dir, I get this:

Code:
[root@rosemary v2]# cd /mnt/dying/var/lib/mysql
-bash: cd: /mnt/dying/var/lib/mysql: Input/output error

I really *really* need this data!
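When a directory throws Input/output errors like that, going through the filesystem usually gets you nowhere; one option (a sketch, assuming the dying disk's partition is /dev/sdb1 and there is space for an image on the new drive) is to clone the whole partition with GNU ddrescue and then work on the copy instead of the failing hardware:

Code:
# image the failing partition; the log file lets interrupted runs resume where they left off
ddrescue -r3 /dev/sdb1 /backup/dying-sdb1.img /backup/dying-sdb1.log
# mount the image read-only and try to pull var/lib/mysql out of the copy
mkdir -p /mnt/recovered
mount -o loop,ro /backup/dying-sdb1.img /mnt/recovered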
How can I check (using SSH) which databases or users are causing MySQL server load?
I've tried "mysqladmin proc stat" but it only shows what is happening right now. How can I get stats for the last 24 hours, for example?
I've also seen the slow queries stat. What is the command to get a more detailed report of the slow queries; which databases caused them and so on, again over the last 24 hours for example?
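MySQL doesn't keep a per-database load history out of the box, but one common approach (a sketch; the paths and the 2-second threshold are placeholders, and older MySQL versions spell the option log-slow-queries) is to enable the slow query log in my.cnf and then summarise it with mysqldumpslow:

Code:
# /etc/my.cnf, in the [mysqld] section:
#   slow_query_log      = 1
#   slow_query_log_file = /var/log/mysql-slow.log
#   long_query_time     = 2

# after restarting MySQL and letting it collect data for a day,
# show the 20 queries that used the most total time:
mysqldumpslow -s t -t 20 /var/log/mysql-slow.log

For a live snapshot, SHOW FULL PROCESSLIST (or watching mysqladmin processlist) shows which user and database each running query belongs to.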
Since I have never worked on the server end of things I had a quick question for all you web hosting gurus.
Is it possible to have PHP installed on ONE single server and still have the ability for the server to work with both MS Access AND MySQL at the same time?
I would think YES, but I am being told by our server branch at my current job that this is not the case. They claim there is no way for the server on one machine to be able to handle both types of databases. Are they right?
If they are wrong and it is possible to have one server run both types of databases, what steps would be necessary to set up the server to handle both? Do we need to tweak the php.ini file, or is there another method of allowing the server to work with both MySQL and MS Access?
Sorry if this question seems stupid or odd, as I said, I have minimal experience on the server end but I am confident that a server can handle both.
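For what it's worth, PHP talks to MySQL through the mysql/mysqli extensions and to MS Access through ODBC, and both extensions can be loaded on the same server at the same time (finding an Access ODBC driver is the harder part on Linux; on a Windows server it is built in). A quick check from a shell on the server, assuming PHP's CLI is installed:

Code:
# list the PHP modules that are loaded; both mysql(i) and odbc should appear
php -m | grep -Ei 'mysql|odbc'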
So I was trying to look something up in the Horde databases, only to find that the full server backup that can be triggered in "Tools & Settings" -> "Backup Manager" does NOT include the Horde and Roundcube databases, even with "configuration and content" selected.
Sure, there are 7 days' worth of daily backups in /var/lib/dumps, but it would be nice to have those files included in the "full backup". Or am I missing something here?
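As a stop-gap you can dump those two databases by hand with the Plesk admin credentials (a sketch; whether the plesk command-line utility is available and the exact database names, e.g. roundcubemail, can differ between Plesk versions):

Code:
# plesk db dump wraps mysqldump with the admin credentials
plesk db dump horde > /root/horde.sql
plesk db dump roundcubemail > /root/roundcubemail.sql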
I have a server that is running Linux with cPanel and I am running out of space on a partition.
I was getting this error: Drive Warning: /dev/sda3 (/var) is 83% full
I looked in the folder, went to /var/lib/mysql, and noticed that I have about 6 databases that are a little over 1 GB each. The sda3 partition only has a capacity of 9.9 GB, and it was suggested that I configure MySQL to hold my databases on another partition that has more space.
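One way to do that (a rough sketch, assuming the roomier partition is mounted at /home and /etc/my.cnf is your MySQL config) is to stop MySQL, move the data directory, and point MySQL at the new location; some people prefer to leave the config alone and symlink /var/lib/mysql instead:

Code:
/etc/init.d/mysql stop                    # the service may be called mysqld on your system
cp -a /var/lib/mysql /home/mysql          # -a preserves ownership, permissions and timestamps
mv /var/lib/mysql /var/lib/mysql.old      # keep the original until the move is verified
ln -s /home/mysql /var/lib/mysql          # symlink approach: no my.cnf change needed
/etc/init.d/mysql start

If you would rather set datadir=/home/mysql in /etc/my.cnf instead of symlinking, also check that the socket path your applications expect still matches, and that everything under the new location is owned by mysql:mysql before starting the service.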