Backup Site Every Night
Feb 28, 2009

Is there a script that you can schedule to run as a cron job to back up your public_html folder to a directory on the server and then have it email you a download link?
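A minimal sketch of such a cron job, assuming a shell account with tar and a working mail command; every path, domain, and address below is a placeholder:

    #!/bin/sh
    # Archive public_html into a web-reachable directory, excluding that
    # directory itself, then mail a download link.
    STAMP=$(date +%Y%m%d)
    FILE="backup-$STAMP.tar.gz"
    tar czf "/home/user/public_html/backups/$FILE" \
        --exclude='public_html/backups' -C /home/user public_html
    echo "Download: http://example.com/backups/$FILE" \
        | mail -s "Nightly backup $STAMP" you@example.com

Scheduled nightly with a crontab entry such as 0 3 * * * /home/user/bin/nightly-backup.sh. Note that anyone who guesses the URL can fetch the archive, so the backups directory should be password-protected or the link made unguessable.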
I need a complete backup of my site, including its CRM system.
Would it work if I made a tarball of my www directory and then made a database backup from the Webmin control panel?
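In principle that covers it; a minimal sketch of the same idea from the shell, with placeholder paths and credentials (Webmin's database backup typically produces a comparable SQL dump):

    tar czf www-$(date +%F).tar.gz /var/www
    mysqldump -u root -p --all-databases > db-$(date +%F).sql

The main caveat is consistency: if the site writes to the database while the tarball is being made, the two halves of the backup can drift apart.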
I have several servers and I want to provide continuous data protection and disaster-recovery backups to an off-site location using R1Soft software.
So does anyone know a good, reliable company that provides the service of making backups of external servers using R1Soft?
I have been looking around, and so far I have found:
The Planet - here I can buy any kind of server and have it installed with R1Soft for an extra cost of $50.
SteadFast - here I can rent a fully managed backup service using Steadfast Off-Site R1Soft CDP Backups, described at: [url]
The most common way is to tar all the files and download them. But there is also a MySQL DB, and the usual approach takes a long time to restore.
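One way to take the sting out of restores is to dump each database to its own file, so a restore only replays what is needed; a hedged sketch, with placeholder credentials and paths:

    #!/bin/sh
    # Dump every non-system database into its own compressed file.
    for db in $(mysql -N -u root -pSECRET -e 'SHOW DATABASES' \
                | grep -vE '^(information_schema|performance_schema|mysql)$'); do
        mysqldump -u root -pSECRET "$db" | gzip > "/backups/$db.sql.gz"
    done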
How can I prevent a site from taking a full backup from its cPanel?
I am sure there is a way to do that from the server.
I have a script that will create a compressed backup file of my website files as well as a compressed backup file of my MySQL databases.
What I want to do is create a script that will sit on my external hard drive (which will be connected to the internet) and have it automatically connect to my webhosting account (via SSH or FTP) and download the specified backup files every other day. My goal is to have a backup file on my local hard drive here in case anything happens to the backups that are already saved to another webhosting server.
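A rough sketch of the pull side, assuming key-based SSH login; the host, remote path, and local mount point are placeholders:

    #!/bin/sh
    # Pull the latest backup archives down to the external drive.
    scp -i ~/.ssh/backup_key \
        'user@example.com:backups/*.tar.gz' /mnt/external/site-backups/

Scheduled from the local machine's crontab, e.g. 30 2 */2 * * /mnt/external/bin/pull-backups.sh, which runs on odd days of the month, i.e. roughly every other day.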
I am hosting with Servage. Most such hosts don't provide a backup of your files in case you're moving to another host, so I was wondering: can I manually download my entire directory over FTP, including all files (HTML, JPG, PHP scripts), and manually re-upload them onto the other host I will move to soon? Will that actually work?
As for the MySQL databases, I have created proper backups via their auto-backup system.
They only lack backups of the files themselves; MySQL is fine.
Will restoring the MySQL databases via a different control panel (cPanel) be possible?
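It should be: a plain SQL dump is control-panel-independent, so a dump made on one host can be imported through cPanel (or its phpMyAdmin) on the other. A minimal sketch with placeholder names:

    mysqldump -u olduser -p old_db > site_db.sql    # against the old host
    mysql -u newuser -p new_db < site_db.sql        # against the new host

The FTP download/re-upload of the files themselves also works, though FTP can silently mangle files if the transfer mode is wrong (ASCII vs. binary); spot-checking a few images after upload is cheap insurance.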
The data centre which I use is moving all colocated servers to a brand-new data centre next week, which will mean two hours of downtime for each of my servers and customer websites.
At the moment my servers use two nameservers on separate servers, and when the move happens all websites will be offline with an ugly error message for any visitors.
Does anyone know a good way to set up a page which would be displayed if the web server was down? I am using MS DNS.
My thoughts so far are:
1. Set up a third nameserver which is off-site from the data centre.
2. Purchase web hosting / a VPS for a month with a separate hosting company.
3. Set it up so that it accepts * (any hostname) to a specific IP address in IIS or Apache.
4. Create an index.php script which reads the Host header value sent, i.e. [url], and displays a nice maintenance message (a sketch follows after this post), i.e.
"We are sorry, joeblogs.com is currently down for maintenance; we will be back online shortly".
I think my main question is: do I need to set up a second www record in DNS for each site, and how do I ensure the second (backup) DNS record only gets used when the first website/server is down?
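A sketch of steps 3 and 4 on an Apache-based temporary host; every path and name below is a placeholder:

    # Catch-all vhost: ServerAlias * matches any hostname.
    cat > /etc/apache2/sites-available/maintenance.conf <<'EOF'
    <VirtualHost *:80>
        ServerName maintenance.invalid
        ServerAlias *
        DocumentRoot /var/www/maintenance
    </VirtualHost>
    EOF
    cat > /var/www/maintenance/index.php <<'EOF'
    <?php
    // Read the hostname the visitor asked for from the Host header.
    $host = htmlspecialchars($_SERVER['HTTP_HOST']);
    echo "We are sorry, $host is currently down for maintenance; "
       . "we will be back online shortly.";
    EOF

As for the DNS question: DNS by itself cannot detect that a server is down, so the usual approach is to lower the TTL on the records well before the move and manually repoint them at the maintenance IP for the window, rather than relying on a second record kicking in automatically.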
I moved from one host to another and wanted to restore/use the email addresses that were already saved on my old host account.
I did a full backup using cPanel's wizard and saved it on my computer. I extracted everything and then, using FTP, put all the files on my new server, which is using cPanel as well.
Although all email addresses are now visible in my cPanel, the old passwords no longer work...
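cPanel keeps mailbox credentials in passwd/shadow files under the account's home directory, so if those were not restored the old passwords are simply absent on the new box. A hedged sketch of copying them back from the extracted backup; the backup layout and names below are assumptions:

    cp backup-extract/homedir/etc/example.com/shadow /home/newuser/etc/example.com/
    cp backup-extract/homedir/etc/example.com/passwd /home/newuser/etc/example.com/
    # cPanel normally has these owned user:mail.
    chown newuser:mail /home/newuser/etc/example.com/shadow \
                       /home/newuser/etc/example.com/passwd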
When you delete a site backup from its "Backup Manager" panel, it is removed and no longer displayed in the panel. However, I cannot tell whether this action actually does anything with the real site backup files in /var/lib/psa/dumps. Does it merely remove the entry from PSA's database without touching the actual files? If so, how are site backup files supposed to be managed, given that this action doesn't actually delete them?
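One way to answer that empirically, assuming shell access, is to compare what is on disk before and after deleting a backup in the panel:

    du -sh /var/lib/psa/dumps/           # total size of stored dumps
    ls -lR /var/lib/psa/dumps/clients/   # the individual dump files

If the files linger after a panel delete, removing them by hand (or via whatever repository cleanup the installed PSA version offers) is the fallback.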
When making a backup, I get this error:
Warning: server "server"
Unable to backup server Site Builder content [Error in execute '"C:\Program Files (x86)\Parallels\Plesk\sbutils\bru.php" --backup --log=stdout --target=server_settings --file="C:\Program Files (x86)\Parallels\Plesk\PrivateTemp\tmp34720.tmp"':
# Error: Cannot execute query to database. #
Attached is a (badly) drawn diagram of two sites connected by a VPN.
The site on the left is network 10.0.0.0/24, which uses a Linux server as the router for the network.
The site on the right is network 10.1.0.0/24, which uses a Windows 2003 server as the router for the network.
The clients behind the Windows 2003 server can ping any machine on the first network, because I set up a static route sending all traffic for 10.0.0.0/24 over the VPN interface.
My problem is the other direction: only the Linux server itself can ping machines on the Windows 2003 network; clients behind the Linux server can't seem to route over the interface.
I have the following route on the linux server: .....
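Hard to say without the full table, but the symptom (router can ping the far side, clients behind it cannot) usually points at forwarding or at the tunnel only accepting the router's own source address. A hedged checklist for the Linux side, with tun0 as an assumed interface name:

    # 1. Forwarding must be on for LAN clients' packets to cross the router.
    sysctl -w net.ipv4.ip_forward=1
    # 2. The remote subnet must be routed over the tunnel.
    ip route add 10.1.0.0/24 dev tun0
    # 3. If the far side only knows the router's address, masquerading the
    #    LAN behind it is a common workaround.
    iptables -t nat -A POSTROUTING -s 10.0.0.0/24 -o tun0 -j MASQUERADE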
Starting point: a working site using a shared IPv4, dedicated IPv6, and SSL. HTTP and HTTPS work, the latter only using SNI of course.
The good news: If I simply allocate an IP resource of 1 to a subscription it is pulled from the pool, assigned to the service node, assigned to the web site, DNS is updated, and the site is automatically changed to using a Dedicated IPv4 and Dedicated IPv6.
The bad news: visitors land on the default web site of the service node, with the default SSL certificate.
Other info: I can't ping the new IP, even though it shows in "ip a l" and /etc/sysconfig/network-scripts/ifcfg-eth0:0. [edited]
After the IP assignment, the certificate is still installed, and /etc/httpd/conf/plesk.conf.d/ip_default/domainname.conf shows the new certificate is being used.
However, a second set of VirtualHost entries is created in server.conf for this IP for ports 80 and 443, with NameVirtualHost enabled on the new IP. The port 443 entry uses the default certificate. Because of how Apache picks vhosts, this default VirtualHost entry overrides the website's configuration: Apache ends up answering port 443 on that IP with the wrong certificate.
If I go to "Change webspace settings" and toggle to Shared IPv4, Dedicated IPv6 the site works again via HTTPS, and Dedicated IPv4 and Dedicated IPv6 breaks it again. Setting the SSL cert to None and back again does not work.
Setting the SSL cert to None, changing to a dedicated IP, and enabling SSL results in the server being inexplicably inaccessible...browsers no longer connect to either the default site or the correct site, and I don't see any entries in the vhosts's logs.
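A way to see which VirtualHost Apache actually serves for port 443 on the new IP (paths per a typical CentOS/Plesk layout; adjust to taste):

    httpd -S 2>&1 | grep -B1 -A3 443      # dump the parsed vhost map
    grep -n 'NameVirtualHost\|VirtualHost' \
        /etc/httpd/conf/plesk.conf.d/server.conf

If the server.conf entry for 443 is listed first for that IP, it wins as the default, which matches the observed behaviour of landing on the default site and certificate.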
Does anyone know of a good hosting company located in the UK which allows adult sites and online casino/betting sites?
I'm looking for a VPS and a dedicated server.
Please help me; I need this as soon as possible. Thanks.
I basically run two main sites:
1. A forum (a big one).
2. A file and image sharing site.
(The image sharing site generates thumbnails, which produces lots of hits.)
Under these conditions, how much difference can lighttpd make compared to Apache in keeping my 600 MB RAM VPS stable?
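Rather than guessing, it may help to measure what the Apache workers actually hold resident, since on a 600 MB VPS the per-worker footprint is what kills you; a quick check (the process name may be httpd or apache2 depending on the distro):

    ps -C apache2 -o rss= | awk '{n++; s+=$1} END {if (n) printf "%d workers, %.1f MB avg, %.0f MB total\n", n, s/1024/n, s/1024}'

lighttpd's advantage is mainly on the static side: one event-driven process serving the thumbnails instead of a heavyweight worker per connection, so that half of the workload is where the difference would show.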
I'm on a short assignment to inventory and manage the fixed assets of a small company, and we've just bought a web-based database for this purpose. While I'm pretty good at administering/running local databases, the web part has me stymied. Our company is between IT people, and there's no one on site with any more idea than I have about what's going on!!
Here's what I have so far:
--The company has a website which I'll call "ourwebsite.org" -- which I think, from searching the IP address the website points to, is hosted by HostMySite.com.
--There's also a record in DNS Management with the same name (ourwebsite.org), but pointing to our little server's local IP address.
--I need to find a way to get my database -- which I can access on the network at (server's IP address)/database (i.e. 0.0.00.0/database) -- online. I tried creating records in DNS Management (for example, assets.ourwebsite.org) that point to our server's IP (the one that, if I type it in on the network, gets me to the site I'm looking for), but I get generic "can't find the page" or "can't connect to the server" errors, even after 72 hours, when trying to access it from off the network.
--If I browse to assets.ourwebsite.org/database on the server itself, I get to the website! But if I go to that page from any other computer, on or off the network, it doesn't work.
--The Server is running Windows Server 2003
So, what are my options? Do I have to talk to the HostMySite.com people to add this page? Shouldn't I just be able to use my server's name (ourcompanyadc.ourcompany.org) and have that route to the server? What's going on here? Is there a simple way to get a tiny local-server-hosted website online, outside of the network?
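The "works on the server, nowhere else" symptom usually means the name only resolves, or only routes, from the server itself: for instance the record lives in a local DNS that only the server uses as its resolver, or it points at a private address. A quick check from an outside machine, using the hostnames from the post:

    nslookup assets.ourwebsite.org
    # A private result (10.x.x.x, 172.16-31.x.x, 192.168.x.x) is unroutable
    # from outside; the record must point at a public IP whose router
    # forwards port 80 to the internal server.

So the options are roughly: host the database app with HostMySite.com alongside the main site, or publish the record with the office's public IP and set up a port-forward on the office router. A record pointing at the server's LAN address will only ever work on the LAN.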
I just transferred a domain from one cpanel box to another.
Now, that site is showing someone else's page. I've seen this happen before, but I cannot remember the fix.
The virtual host in httpd.conf is fine; it shows the proper IP, username, docroot, etc.
The DNS zone is fine as well.
The domain is using the server's main IP, so that's not the cause.
CentOS 5 / cPanel 11 / Apache 1.3 / PHP 4.x
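The standard first moves on cPanel for this symptom are to regenerate httpd.conf from cPanel's own records and then look at the parsed vhost order, since the first VirtualHost on an IP acts as the catch-all for unmatched names:

    /scripts/rebuildhttpdconf     # regenerate httpd.conf from cPanel's data
    service httpd restart
    httpd -S                      # show which vhost wins for each IP/name

If the config really is fine after that, a stale DNS cache on the resolver being tested from is the other usual culprit; testing with the hosts file pointed straight at the server rules that out.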
I have multiple backups stored under server repository (subscriptions --> <domainname> --> website and domains --> backup manager).
The physical files are located at: /var/lib/psa/dumps/clients/904279/domains/<domainname>/
When I click the green arrow to download these files to a local computer (see attached image), I get a new page with the title "Download the backup file". On this page I have the option to set a password on the downloaded file, but no matter what I do (password or no password), the file is not downloaded to my local PC. I don't get a pop-up box with the option to save the file; just nothing happens...
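As a workaround while the panel download is broken, the dumps can be copied straight off the server over SSH; the path is the one given above, with the wildcard standing in for the individual dump filenames:

    scp 'root@server:/var/lib/psa/dumps/clients/904279/domains/<domainname>/*' \
        ./plesk-backups/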
I have two problems:
Firstly, is there any possibility of limiting the number of cores the Plesk backup compression tool uses? This pigz takes up all my CPU. Is there any way I can reduce the number of cores it uses? All my websites go down for around three minutes every time a backup takes place.
Secondly, I get the following in my syslog:
1 baby plesk sendmail[20189]: Error during 'check-quota' handler
I don't know what is wrong; I think it started with the upgrade to Plesk 12. I now have 12.0.18 Update #13.
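On the first problem: I am not aware of a supported Plesk switch for this, but pigz itself takes a -p flag for its thread count, so one hedged workaround is a wrapper that pins it down (assumes pigz lives in /usr/bin; re-check after package updates):

    mv /usr/bin/pigz /usr/bin/pigz.real
    cat > /usr/bin/pigz <<'EOF'
    #!/bin/sh
    exec /usr/bin/pigz.real -p 2 "$@"
    EOF
    chmod +x /usr/bin/pigz

Running the backup under nice/ionice is a gentler alternative if the slowdown is I/O-bound rather than CPU-bound.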
I have a 6GB backup file created with another Plesk Backup Manager. Now I am trying to upload this backup file to my Plesk Backup Manager, but after uploading 3% I get a "413 Request Entity Too Large" error. I tried disabling nginx but still get this error.
How can I resolve this error, or is there any other way to upload my file to Backup Manager?
I see that Backup Manager has a file size restriction of 2GB; how can I increase this?
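If raising the limit proves fiddly, a hedged alternative is to skip the HTTP upload entirely and move the archive over SSH into the server repository directory that Backup Manager reads; whether the panel picks it up automatically may depend on the Plesk version:

    # The filename is a placeholder for the 6 GB archive from the other server.
    scp my_6gb_backup.tar root@server:/var/lib/psa/dumps/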