I am hosting with Servage. Most such hosts don't provide backups of your files in case you're moving to another host, so I was wondering: can I manually download my entire directory over FTP, including all files (HTML, JPG, PHP scripts), and manually re-upload them onto the other host I will be moving to soon? Will that actually work?
As for the MySQL databases, I have created proper backups via their auto-backup system; it's only the files themselves that lack backups. MySQL is fine.
Will restoring the MySQL databases via a different control panel (cPanel) be possible?
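For what it's worth, a standard MySQL dump is just plain SQL, so it restores the same way regardless of which panel made it. If the panel's restore tool fails, a manual fallback sketch, assuming shell access and the mysql client on the new host (the user, database, and file names are placeholders):

```shell
# Create the target database first (e.g. via cPanel's "MySQL Databases"
# page), then load the dump into it with the command-line client.
restore_db() {
    user=$1      # MySQL user on the new host (placeholder)
    db=$2        # target database name (placeholder)
    dumpfile=$3  # the .sql file from the old host's backup
    mysql -u "$user" -p "$db" < "$dumpfile"
}
```

The only things that don't travel inside the dump are the MySQL user accounts and their grants, which you'd recreate in the new panel.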
We need to get a copy of a client's entire site onto our servers.
The problem is that they have no backups, and their current host has denied them access to their account. (It's a long story.)
Rather than try to fight with their old host, I wonder if it's easier to just run some kind of "website copy" program and grab a copy of all pages, images, linked files, etc.
Their current website pages are served by an ASP database application, but the copy they need only has to work as a static snapshot. It doesn't need to be dynamic; as long as all the pages are there, it's good.
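If a static snapshot is enough, a crawler like GNU wget can grab the rendered pages straight off the live site. A minimal sketch, assuming wget is installed (the URL and output directory are placeholders):

```shell
# Crawl a site and save a browsable static copy under $dest.
# --convert-links rewrites links so they work locally;
# --adjust-extension saves dynamically generated pages (e.g. .asp)
# with an .html suffix so they open in a browser from disk.
mirror_site() {
    url=$1
    dest=${2:-mirror}
    wget --mirror --convert-links --adjust-extension \
         --page-requisites --no-parent -P "$dest" "$url"
}
```

One caveat: pages only reachable through forms or POST requests won't be crawled, so it's worth clicking through the copy afterwards to check nothing important is missing.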
A single website is returning 503 to every request. It's a WordPress site, and we have a lot of those; none of the others are returning the same errors, so this is quite odd to me:
The backup program (cpbackup) that comes with WHM/cPanel is not ideal for our current situation. We have lots of third-party applications installed on our server along with many customized configuration files. We need a solution that will back up everything and allow for fairly simple restoration.
How viable is rsync for full server backups? Can it handle 100 GB of data?
More importantly, how would you restore the backup to a new server? If the new server already has an OS, wouldn't restoring the files over it break the system?
Is there a script that you can schedule as a cron job to back up your public_html folder to a directory on the server, and then have it email you a download link?
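Nothing stock does exactly that, but it's only a few lines of shell. A sketch, where the paths, download domain, and email address are all placeholders, and the mail step assumes a working `mail` command on the box:

```shell
# Tar up public_html into a dated archive in a web-reachable backups
# directory, then print the download link (pipe it into mail to send it).
backup_and_link() {
    home=${1:-$HOME}
    stamp=$(date +%Y%m%d)
    dest="$home/public_html/backups"
    mkdir -p "$dest"
    tar -czf "$dest/backup-$stamp.tar.gz" \
        --exclude 'public_html/backups' -C "$home" public_html
    echo "http://www.example.com/backups/backup-$stamp.tar.gz"
    # e.g.: backup_and_link | mail -s "Site backup $stamp" you@example.com
}
# crontab entry, daily at 02:30 (placeholder path):
# 30 2 * * * /path/to/backup_and_link.sh
```

Since the archive sits inside the web root, protect that backups directory (e.g. with .htaccess auth), or anyone who guesses the URL can download your whole site.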
We run CentOS on our web servers (we're not providing hosting) and manually compile Apache, PHP, etc. What would be the benefits of using yum rather than compiling the software ourselves?
The way I see it, yum lets Apache etc. update automatically to the latest versions and saves time. When compiling myself, though, I can specify exactly where to install to and which modules to build into the software. Is there any reason we should be using yum instead?
I have several servers, and I want to provide continuous data protection and disaster-recovery backups to an off-site location using R1Soft software.
So, does anyone know a good and reliable company that provides off-site backups of external servers using R1Soft?
I have been looking around, and so far I have found:
The Planet - here I can buy any kind of server and have R1Soft installed for an extra cost of $50.
SteadFast - here I can rent a fully managed backup service using Steadfast Off-Site R1Soft CDP Backups, described at: [url]
I have a script that will create a compressed backup file of my website files as well as a compressed backup file of my MySQL databases.
What I want to do is create a script that will sit on my external hard drive (which will be connected to the internet) and have it automatically connect to my web hosting account (via SSH or FTP) and download the specified backup files every other day. My goal is to have a backup copy on my local hard drive here in case anything happens to the backups already saved on another web hosting server.
I have tried adding entries to httpd-manual.conf to enable access by other machines on my network, but it seems to be hard-coded to localhost only.
Is there something else I can add to httpd-manual.conf to enable it, or do I need to copy the manual folder into htdocs and then restrict that <Directory>?
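It usually isn't hard-coded: stock httpd-manual.conf restricts access with an allow/deny block inside the manual's <Directory>, and widening that block is all that's needed, so no copying into htdocs. A sketch of the stanza to look for, with the filesystem path and LAN range as assumptions for your layout:

```apache
# Typical httpd-manual.conf stanza (Apache 2.2 syntax; the manual's
# path varies by distribution)
Alias /manual "/usr/share/httpd/manual"
<Directory "/usr/share/httpd/manual">
    Options Indexes
    AllowOverride None
    Order deny,allow
    Deny from all
    # widen this from localhost to your LAN:
    Allow from 127.0.0.1 192.168.1.0/24
</Directory>
```

On Apache 2.4 the equivalent would be `Require ip 127.0.0.1 192.168.1.0/24` in place of the Order/Deny/Allow lines.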
The data centre I use is moving all colocated servers to a brand-new data centre next week, which will mean two hours of downtime for each of my servers and customer websites.
At the moment my servers use two nameservers on separate servers, and when the move happens all websites will be offline, showing an ugly error message to any visitors.
Does anyone know a good way to set up a page that would be displayed if the web server was down? I am using MS DNS.
My thoughts so far are:
1. Set up a 3rd nameserver which is off-site from the data centre.
2. Purchase web hosting / a VPS for a month with a separate hosting company.
3. Set it up so that it accepts * (any host header) to a specific IP address in IIS or Apache.
4. Create an index.php script which gets the host header value sent, i.e. [url], and then displays a nice maintenance message, i.e.
"We are sorry, joeblogs.com is currently down for maintenance; we will be back online shortly".
I think my main question is: do I need to set up a 2nd www record in DNS for each site, and how do I ensure the 2nd DNS (backup) record only gets used when the first website/server is down?
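On the last question: plain DNS has no notion of "use this record only when the first server is down", so a 2nd www record would just round-robin traffic to both boxes. The simpler play for a planned two-hour window is to lower the TTLs well in advance and swap the A records to the backup box for the duration; genuine automatic failover needs a monitoring/failover DNS service or a proxy in front. The catch-all part is easy; an Apache sketch, with the names and paths as placeholders:

```apache
# Default/catch-all vhost on the temporary box: any Host header that
# matches no other vhost lands here and gets the maintenance page.
<VirtualHost *:80>
    ServerName maintenance.example.com
    ServerAlias *
    DocumentRoot /var/www/maintenance
    # index.php there can read $_SERVER['HTTP_HOST'] to personalise
    # the "we'll be back shortly" message per domain.
</VirtualHost>
```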
I moved from one host to another and wanted to restore/use the email addresses that are already saved on my old host account.
I did a full backup using cPanel's wizard and saved it to my computer. I extracted everything and then, using FTP, put all the files on my new server, which uses cPanel as well.
Although all email addresses are now visible in my cPanel, the old passwords no longer work.
When you delete a site backup from its "Backup Manager" Panel, it is removed and no longer displayed in the Panel. However, I cannot tell if this action actually does anything with the real site backup files in "/var/lib/psa/dumps". Does this action merely remove it from PSA's database but not touch any actual files? If this is true, then how are site backup files supposed to be managed if this action doesn't actually delete them?
Warning: server "server" Unable to backup server Site Builder content [Error in execute '"C:\Program Files (x86)\Parallels\Plesk\sbutils\bru.php" --backup --log=stdout --target=server_settings --file="C:\Program Files (x86)\Parallels\Plesk\Private\Temp\tmp34720.tmp"': # Error: Cannot execute query to database. #
I figure, while I am twiddling my thumbs here waiting for my host to tell me (for the second time in two or three months) what the heck happened and why they have to do an entire VPS hard restart, which of course causes another hour of delays, that I'd ask some of the more skilled and experienced folks here: how?
Just before it happened, as I was watching, the load shot up: over 1, 2, 4, 20, 30, boom. (I opened a ticket at 4.)
Shouldn't Virtuozzo always guarantee a certain amount of CPU and bandwidth to the node root? Why do they have to hard-reboot rather than access it directly and stop the badly behaving VPS? Better yet, why isn't the badly behaving VPS stopped automatically by Virtuozzo?
(oh and am I an idiot for putting up with over two hours of downtime?)
I always use Apache, and it's easy to redirect an entire domain in the .htaccess.
Code:
RewriteCond %{HTTP_HOST} ^www.domain1.com$
RewriteRule ^(.*)$ [url]$1 [R=301,L]
For a site I'm working on I need to achieve the same thing with IIS, but I have no idea how, and the technician claims it can't be done. Surely that's not the case. Any ideas?
By the way, I want the exact same effect as the above, not just a generic redirect that sends everything to the home page of the new domain.
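It can be done. On IIS 7+ the URL Rewrite module (a free Microsoft add-on, not installed by default) takes almost the same pattern logic; on IIS 6 you'd need a third-party filter such as ISAPI_Rewrite. A web.config sketch for the URL Rewrite route, with the domain names as placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch for the IIS URL Rewrite module; domains are placeholders -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect domain1 to domain2" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^www\.domain1\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.domain2.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Like the Apache rule's `$1`, the `{R:1}` back-reference carries the original path through, so deep links land on the matching page rather than the new home page.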