Just had a quick question about backing up a large MySQL DB. I have a database that is 50 GB with about half a billion rows in it. One table alone is about 40 GB; the other 10 GB consists of smaller tables.
The problem is, I want to back the database up and keep it LIVE at the same time. If it's pulled offline for more than a few hours it will fall behind quickly, since it takes somewhere in the area of a million new entries an hour, plus deletions and other queries.
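If the big table is InnoDB, a hot dump may be workable; here's a minimal sketch, assuming InnoDB tables and hypothetical dbuser/mydb names:

    # consistent snapshot without locking InnoDB tables; --quick streams
    # rows instead of buffering them in memory (dbuser/mydb are placeholders)
    mysqldump --single-transaction --quick -u dbuser -p mydb | gzip > /backup/mydb-$(date +%F).sql.gz

Note that --single-transaction only avoids locks on transactional tables; for MyISAM, dumping from a replication slave is the usual way to keep the master live.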
After I create a backup of a VPS using Virtuozzo (I'm on a 1and1 VPS), where is the image stored? Is it necessary to manually save this image to a different server, or do 1and1 take care of that automatically?
I understand that servers can do automatic backups, yet I also see forum modifications that offer simple ways of doing a backup. Are there different types of backups? Why is it necessary to manually back up a forum database when it's done automatically by the server? In terms of safeguarding the data, what is required, and what does a typical procedure entail? Is it manual, and if so, how often is it usually run, or is it usually automatic?
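For context, the common manual procedure is just a scheduled mysqldump of the forum database; a sketch, with hypothetical names, of a nightly crontab entry:

    # nightly 3:30 a.m. dump; one rotating file per weekday
    # (forumuser/forumdb are placeholders; % must be escaped in crontab)
    30 3 * * * mysqldump -u forumuser -pPASS forumdb | gzip > /backup/forumdb-$(date +\%a).sql.gz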
I currently have 2 VPS accounts. The first one runs cPanel and hosts my website's files.
The second VPS hosts the MySQL databases the website uses. There are about 70 databases, each averaging about 120 MB in size.
At the moment I use sourceforge.net/projects/automysqlbackup/ to back up each database over two days (this takes the server down for about 10 minutes each day). I then use rsync on both servers to back everything up to an off-site location.
Obviously, I need to find a better solution. I was thinking about backing up the databases weekly instead; however, the 10-minute downtime from the script would still happen.
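The downtime is often the dump locking tables while running at full speed; lowering its priority and using InnoDB's lockless dump can help. A sketch, assuming credentials in ~/.my.cnf and a /backup directory (both assumptions):

    # dump every database at low CPU priority, one at a time
    for db in $(mysql -N -e 'SHOW DATABASES' | grep -v '^information_schema$'); do
      nice -n 19 mysqldump --single-transaction --quick "$db" | gzip > "/backup/$db.sql.gz"
    done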
We have a dedicated server with cPanel/WHM. It appears that the backup portion of the control panel is not functioning properly: we just had an incident where all MySQL databases were deleted from the system, and the last backup we had was from 03/31/08. This is not acceptable, as we need at least weekly backups of our system.
We are in need of a FREE third-party (preferably open-source) solution that we can install on our Linux CentOS system to handle backups. Ideally it would also be able to back up to a remote location, but that is not required.
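Even without a dedicated tool, a cron'd rsync over SSH covers the remote-copy part for free; a sketch with hypothetical host and paths:

    # push the local backup directory to a remote box every Sunday at 4 a.m.
    0 4 * * 0 rsync -az -e ssh /backup/ backupuser@remote.example.com:/srv/backups/server1/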
I am basically a Mac user, and my Windows desktop just went kaput. I am looking for a cheap and dirty way to make backups of my MS SQL 2000 DB.
Is it possible to run a script that creates a copy of it in MS Access which can later be converted back into MS SQL?
Is it possible to go from MS SQL to MySQL?
The MS SQL 2000 server is hosted on a shared server and I cannot leave DTS packages on it. I prefer not to go the DTS route as it goes way over my head anyway. I do, by the way, have a dedicated Win 2003 server.
I need to have a Windows PC on my internal network connect via SFTP and download the generated backups from my web server.
What files do I need?
I'd like to do it on a daily basis as I run local tape backups every night.
My backups are being put into /backup/cpbackup/daily/, but alongside the tarball there are dirs/ and files/ directories.
Do I just need to download the tarball? I was hoping to do this with psftp and the Windows Task Scheduler so that I wouldn't have to do it manually every day.
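psftp does take a batch file, so the Task Scheduler piece is straightforward; a sketch, assuming key-based login is already set up and the account tarballs are what you want (file names hypothetical):

download-backups.bat (the scheduled task):
    psftp backupuser@myserver.example.com -i mykey.ppk -b getbackup.txt

getbackup.txt (the psftp commands it runs):
    cd /backup/cpbackup/daily
    mget *.tar.gz
    quit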
I want to back up my entire server, probably 50 GB of data.
I have a 20 Mbit connection at home and get 1.5+ MB/s when pushing a file from the server to my home PC via FTP, so I know the pipe between the two can handle the bandwidth.
I've set up rsync at home, but for some reason the transfer rate doesn't go over 50-60 KB/s; I'm guessing it's because the files are too small.
Is this a problem with my rsync configuration? I haven't set any speed limits...
Finally, is there a better solution than rsync for backing up my server?
The command I'm using for rsync: rsync --links --backup --recursive /mydirectory myserver::backuplocation
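Lots of small files make rsync's per-file overhead dominate, so the raw pipe speed never shows. Two hedged things to try, reusing your paths:

    # -a covers --links/--recursive plus timestamps; -z compresses;
    # -W skips the delta algorithm, which often helps on a fast link
    rsync -aWz --backup /mydirectory myserver::backuplocation

    # or bundle the small files into one big archive first, then push that
    tar czf /tmp/mydirectory.tar.gz /mydirectory
    rsync -W /tmp/mydirectory.tar.gz myserver::backuplocation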
I have a dedicated server with an HDD capacity of 750 GB; currently, only about 100 GB is being used. What kind of backup would you recommend?
- Rsync to another server (like BQBackup.com)
- Rsync to another hard drive within the same server
- cPanel archived backups onto another server/HDD
- Something else?
We don't host our own website, but recently our server crashed, and our hosting provider was supposed to be backing things up. They told us that the entire server crashed and the backups were on the same server, so we lost everything. We actually did end up getting everything back, but it took nearly a week to get back online. They say they are now backing everything up to a remote location so this won't happen again, but we don't want to put this in their hands anymore. We didn't even get everything back: .htaccess files, hidden files, and subdomains were not backed up and were lost. So we want to do our own backups.
Is there something I can set up as a cron task to automatically back up every single file on the server, plus the database, every week?
We think this would be very beneficial to us, as we lost a lot of money from this ordeal and we do not want it to happen again.
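A weekly cron script along these lines is the usual approach; a sketch, where the paths and credentials are assumptions you'd adjust:

    #!/bin/sh
    # /etc/cron.weekly/sitebackup (hypothetical name and paths)
    DATE=$(date +%F)
    # tar includes dotfiles such as .htaccess by default
    tar czf /backup/site-$DATE.tar.gz /var/www/html
    mysqldump -u dbuser -pPASS sitedb | gzip > /backup/sitedb-$DATE.sql.gz
    # copy off the server so a crash can't take the backups with it
    rsync -az /backup/ backupuser@offsite.example.com:/srv/backups/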
I am in the process of upgrading PHP. I have successfully compiled it with no errors. However, before I run the make install command, I would like to back up all the files associated with my current version of PHP. I am going from 4.3.11 to 4.4.6. Does anyone know which directories and files to back up? I am using Fedora Core.
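A hedged sketch of what a source-built PHP usually touches; the exact paths depend on your configure options, so treat these as assumptions and verify with php -i first:

    php -v                                    # confirm the running version
    php -i | grep -E 'php.ini|extension_dir'  # find the config and extension paths
    cp -a /usr/local/bin/php /usr/local/bin/php-4.3.11
    cp -a /etc/php.ini /etc/php.ini-4.3.11
    cp -a /usr/local/lib/php /usr/local/lib/php-4.3.11              # PEAR and includes
    cp -a /etc/httpd/modules/libphp4.so /backup/libphp4.so-4.3.11   # Apache DSO, if used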
I'd like to back up one of our servers using S3, but I'm not sure of the best way to go. The server is running cPanel. Obviously I want to keep costs down.
I need the files to be secure, backups need to be nightly incrementals, and I need to be able to restore to another cPanel server fast in case of hardware failure. cPanel supports FTP for backups. I currently back up to another server, but I want to move to the cloud.
Specific questions I have:
1. Any recommendations on a cloud backup service (S3, Rackspace, etc.), and why?
2. Any recommendations on tools? I want this to be simple to set up. As easy as setting up an FTP account.
3. Is there another option I should consider? That server has about 62 GB of data on it to be backed up.
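On the tools question, s3cmd is one of the simpler ways to push a directory tree to S3; a sketch, assuming the bucket already exists and s3cmd is configured with your keys (bucket name hypothetical):

    # sync only changed files; drop remote copies of deleted ones
    s3cmd sync --delete-removed /backup/cpbackup/ s3://my-backup-bucket/server1/

For the "files must be secure" requirement, duplicity is another free option worth a look, since it does GPG-encrypted incremental archives to S3.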
So, I have my own dedicated server and have been rather neglectful.
I currently have about 24 thousand email messages from the last few years. Foolishly, I always clicked "leave a copy of messages on server" in my mail clients.
So, I've been doing some reading and have decided to start over using IMAP instead of POP3. Since I have all the messages in Outlook (go go 700 MB .PST!), I'm looking for the best way to:
1. Back up my mail on the server
2. Nuke it all
3. Convert to IMAP
I then plan on installing something like http://www.roundcube.net/
There is so much information on disaster recovery and backing up one's server that I'm getting glassy-eyed trying to take it in. Maybe if I became an actual case study and got some "group think" help, this thread could benefit many others in a similar situation.
Current Situation:
1. I'm a small hosting company, 5 years in existence, with about 350 clients. www.mlhi.net
2. Dedicated Linux server, Plesk CP with an unlimited-domains license, fully managed at HostNexus (great guys). It does not have a RAID array (I used to have that at Rackspace), but it does have a backup drive that everything is backed up to with a cron job every night.
3. In addition, I have a Linux sys admin on retainer, www.linuxbox.co.uk (he is better than excellent): two years of excellent server maintenance and security on top of the managed service I get at HostNexus.
4. I just bought a VPS plan at JaguarPC.com after much research (a lot of it here at WHT), and as they say, "so far so good" with the ease of dealing with them. I have not set anything up there yet; I just got the VPS provisioned a few days ago.
Fears and Concerns:
1. The data center is destroyed, my server burns up (including the backup drive), etc.
2. A DDoS attack (one did hit this data center a few months ago, and I was down for hours).
3. If I had to FTP everything back to another server from my local machine, at 18 GB, that's not too cool.
Want to do this:
1. I want my sys admin to run a backup copy (and incrementals every night) to an identically configured VPS at JaguarPC; a sketch of the incremental piece follows this list. Both servers are now running identical Plesk 8.4.
2. I want the fastest recovery possible without spending a ton of money. I know this means I don't get an "instant" recovery, but recovery within 24 hours is more than OK. None of my customers are ecommerce... just brochureware sites.
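For those nightly incrementals, rsync's --link-dest hardlink trick gives a chain of snapshots where each night looks like a full copy but only changed files consume new space. A sketch, run from the VPS, with the hostnames and the Plesk vhosts path as assumptions:

    # pull tonight's snapshot, hardlinking unchanged files to yesterday's
    rsync -az --delete \
      --link-dest=/backups/$(date -d yesterday +%F) \
      root@primary.mlhi.net:/var/www/vhosts/ \
      /backups/$(date +%F)/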
My "I'm not an expert" plan:
1. If the primary server goes bye-bye forever, I can log in to my BulkRegister/Enom account and change the child nameserver IPs to the IPs of the VPS. In 24 hours or less, every request for the nameservers would then be routed to the new server.
2. I can create an A record on every domain, like www2.johndoeinsurance.com, that points to the IP at the VPS, so I can ease my mind anytime I want by making sure everything is safe and sound on the second server and ready to go in an emergency.
How do I configure the DNS?
I control DNS at Enom for about two-thirds of my customers. I have ALL domains pointed to ns.mlhi.net and ns2.mlhi.net. Here are my options:
1. I create two more child nameservers, ns3 and ns4, have them pointed to the IPs at the new server, and then update all the domains I control. The rest of the customers I can email and ask to add the additional nameservers. I know... good luck getting them to do it.
2. I change the ns2 IP to go to the new server, and whenever I edit a website during the day, I make sure to FTP the changes to both servers.
3. I don't have any nameservers assigned to the new server. I just change the IP on the existing nameservers in the event of an emergency.
I'm having a lengthy issue where my databases are too large to import in phpMyAdmin through Plesk. Unfortunately, I don't have direct access to phpMyAdmin and can only access it as a DB user through Plesk.
I have tried editing the following settings in php.ini:
upload_max_filesize = 64M
post_max_size = 32M
max_execution_time = 300
max_input_time = 300
Why am I still not able to import my DBs, which are about 8 MB each?
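Two hedged observations: for hygiene, post_max_size should be at least as large as upload_max_filesize (so 64M for both), though an 8 MB dump is already under both of your limits; the likelier culprit is that Plesk's bundled phpMyAdmin runs under Plesk's own PHP, so the server-wide php.ini you edited may never be read. A sketch for checking, where the Plesk binary path is an assumption that varies by version:

    # confirm which php.ini the control-panel PHP actually reads
    /usr/local/psa/admin/bin/php -i | grep 'Configuration File'
    # restart the Plesk service after editing so new limits take effect
    service psa restart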