I've been asking myself what the best backup strategy for my dedicated server would be. I don't host any commercial websites on it (just private ones), but we all know that losing even non-commercial data hurts.
What I already do:
More or less recent backups of the most important websites (webspace only, no email or databases) on my local computer
Hardware RAID 1 on the server, across its two HDs
Using R1Soft Backup Space provided by my server host
I would like to add some FTP space at an external (!) data center to do scheduled cPanel backups.
I think that a further dedicated server or VPS will not be necessary for this task.
Can you recommend a provider of plain FTP backup space that is reliable, connected at 100 Mbps or better, and affordable?
I guess I need about 100-200 GB of space to keep daily, weekly and monthly backups.
I am writing a website in PHP and am just about to make the first version live.
I'm a bit concerned with backups at the moment. I don't want to lose all the data, as a lot of effort will be put into adding content to the site. It uses a MySQL database as well as the file system to store data.
I was just going to do nightly MySQL dumps, but I realized I know nothing about the topic and should therefore ask for advice before writing code in PHP (which only runs for 30 seconds max — is that long enough to do a backup?).
What are the common practices for backing up a CMS-type site? How often? How do you manage backups? Where do you store them? Does taking backups affect performance? Do you do full backups or just changes? If I write a backup script, should I do it in Perl/Python/Bash or stick with PHP?
Or any other information that will fill me with confidence.
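A nightly dump is a reasonable start, and running it from cron rather than from a PHP web request sidesteps the 30-second limit entirely (that limit applies to web requests, and a dump can easily outgrow it). A minimal sketch — the database name, user, and password are placeholders, and the backup directory here is /tmp only for demonstration:

```shell
#!/bin/sh
# Nightly MySQL dump sketch -- db name, user and password are placeholders.
DB_NAME="mysite"
BACKUP_DIR="/tmp/db-backups"          # use real off-box storage in practice
STAMP=$(date +%Y-%m-%d)
OUTFILE="$BACKUP_DIR/$DB_NAME-$STAMP.sql.gz"

mkdir -p "$BACKUP_DIR"
# --single-transaction gives a consistent snapshot of InnoDB tables
# without locking the site while the dump runs.
# (Credentials belong in ~/.my.cnf rather than on the command line.)
mysqldump --single-transaction -u backup_user -p'secret' "$DB_NAME" \
    2>/dev/null | gzip > "$OUTFILE"

# Keep the 14 most recent dumps, delete the rest.
ls -1t "$BACKUP_DIR"/*.sql.gz 2>/dev/null | tail -n +15 | xargs -r rm -f
```

Run it from cron (e.g. `0 3 * * * /usr/local/bin/db-backup.sh`) and copy the dumps off the server afterwards — a backup that lives only on the machine it protects isn't really a backup.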
I'm going to order my first dedicated box, so one question comes up: how should I partition the HD? It's a low-end box with a 120 GB HD running CentOS + Plesk, hosting 4-5 websites, nothing special.
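For a low-end CentOS + Plesk box this doesn't need to be fancy. One layout to consider as a starting point (the sizes are assumptions, not Plesk requirements; Plesk keeps vhosts, mail, and MySQL data under /var, so that partition gets the bulk of the disk):

```
/boot    500 MB   kernels
swap       2 GB   roughly the size of RAM on a small box
/         20 GB   OS, Plesk, and everything else
/var    ~97 GB    the rest: vhosts, mail, databases
```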
We are going to host an application for 20 customers. It is an online ordering system. We will create 20 virtual hosts on a Windows machine. The application is developed in .NET and the database is MS SQL 2005; each client has its own database. I just want to get an idea from you about CPU, RAM, hard disk, and bandwidth requirements.
I am working towards launching a site that, among other things, will be a repository for sensitive data on war crimes. As these crimes are ongoing, and occur in a location where assassinations are endemic, I need to develop a comprehensive security strategy that takes into account all levels of the interface between end user / witness and the site / database itself.
I have considered, but am open to insight and advice on, the following:
1. Data security laws in given countries, in order to ensure, as far as possible and away from political/state interference, the privacy and integrity of data communications. Concerns include the interception of data in transit and the security of stored data (the United States and the UK are almost certainly ruled out in this regard; Canada appears significantly better, though Greece, it appears, has the greatest level of legislative protection).
2. Encryption as a technique to ensure the security of transferred and stored data. I am particularly interested in best-practice advice on encryption.
3. JavaScript as a means to establish a more secure interface between the end user (i.e., the browser) and the secured database into which sensitive data will be entered. Has anyone used this, or other techniques, to overcome the inherent insecurity of the browser interface?
4. Various best practices concerning PHP, MySQL, and Apache security. Any and all advice, or guidelines, welcome.
5. Considerations relative to dedicated hosting, and also colocation hosting as an option.
In general, my problem is to ensure that the identities of witnesses, as far as is technically feasible, can be protected from extra-judicial interference or surveillance. Nothing about this site will be illegal in any way. The problem is that the witness testimony will concern the actions of a powerful state that has demonstrated its disrespect for law.
Ideally I'd find in these forums a few individuals with whom I could discuss these technical matters off-forum. At the same time, general responses would be valued.
The site that I'm building is non-profit (indeed zero budget) and does not represent any political party. It's a people's initiative, against aggressive violence and in support of international law.
I have multiple backups stored in the server repository (Subscriptions --> <domainname> --> Websites & Domains --> Backup Manager).
The physical files are located at: /var/lib/psa/dumps/clients/904279/domains/<domainname>/
When I click the green arrow to download one of these files to a local computer (see attached image), I get a new page titled "Download the backup file". On this page I have the option to set a password on the downloaded file, but no matter what I do (password or no password), the file is not downloaded to my local PC. I don't get a pop-up box with the option to save the file; just nothing happens ...
Firstly, is there any way to limit the number of cores the Plesk backup compression tool (pigz) uses? It takes up all my CPU, and every time a backup runs all my websites go down for around 3 minutes.
Secondly I get the following in my syslog:
1 baby plesk sendmail[20189]: Error during 'check-quota' handler
I don't know what is wrong; I think it started with the upgrade to Plesk 12. I am now on 12.0.18 Update #13.
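On the pigz question: I'm not aware of a Plesk 12 setting for this, but pigz itself accepts `-p <n>` to cap its thread count, so one workaround people use is replacing the pigz binary with a wrapper that forces a low thread count and priority. A sketch — it builds the wrapper in /tmp for demonstration; on a real server you would rename /usr/bin/pigz to /usr/bin/pigz.orig and install the wrapper in its place (and note a pigz package update may overwrite it):

```shell
#!/bin/sh
# Demo: build the wrapper in /tmp; real target paths are in the comments.
WRAPPER=/tmp/pigz-wrapper             # real target: /usr/bin/pigz
cat > "$WRAPPER" <<'EOF'
#!/bin/sh
# Cap pigz at 2 threads with the lowest CPU and I/O priority.
# /usr/bin/pigz.orig is the renamed original binary.
exec nice -n 19 ionice -c 2 -n 7 /usr/bin/pigz.orig -p 2 "$@"
EOF
chmod +x "$WRAPPER"
```

With this in place, backup compression takes longer but leaves cores free for the web server.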
I have a 6 GB backup file created with another Plesk Backup Manager. I am trying to upload this backup file to my Plesk Backup Manager, but after uploading 3% I get a "413 Request Entity Too Large" error. I tried disabling nginx but still get the error.
How can I resolve this error, or is there any other way to get my file into Backup Manager?
I also see that Backup Manager has a file size restriction of 2 GB; how can I increase that?
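The 413 is the upload size limit of whatever web server terminates the upload, not of Backup Manager's storage. The Plesk panel itself is served by sw-cp-server (which is nginx-based), so disabling the nginx that fronts your websites doesn't change it. The usual fix is raising `client_max_body_size`; the exact config file varies by setup, so treat the path below as an assumption:

```nginx
# Add inside the relevant server { } block; for the panel the config
# typically lives under /etc/sw-cp-server/ (path is an assumption).
client_max_body_size 8192m;
```

That said, for a 6 GB archive it may be simpler to avoid the HTTP upload entirely: put the file on an FTP server and attach it as a personal FTP repository in Backup Manager, which sidesteps the browser-upload restriction.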
I have an Ubuntu 14.04 LTS 64-bit virtual private server with Plesk 12, hired from a hosting provider. The server runs the Odoo ERP application (using a PostgreSQL database).
The Odoo application is running fine, and now I want to back it up using Plesk's Backup Manager.
I chose the "configuration and content" option in Backup Manager, but the created backup is only 200 KB.
I think the problem is that the location where Odoo is installed is not included in the backup. I made a tar backup of the server and extracted it on my PC; it seems the main parts of Odoo live in the /var, /opt, /etc, and /usr directories (not in a domain, but under root).
Installing the application inside a domain would probably solve the Plesk backup issue, but Odoo's installation script puts its files in /var, /opt, /etc, and /usr even if I run the script from the directory of a created domain. Since a manual Odoo installation is complicated, I am very happy to use the script.
My questions are:
1. Is it possible to include the directories /var, /opt, /etc, and /usr in the Plesk backup, and if so, how and where do I configure that?
2. Can I restore such a backup in Plesk without problems?
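As far as I know, Plesk's Backup Manager only covers Plesk-managed objects (domains, mail, Plesk databases), so arbitrary system directories can't be added to it, and Plesk's restore won't know about them either. The usual approach is a separate cron job. A sketch — it writes to /tmp for demonstration and covers only /etc and /opt to stay small; add /var and /usr as needed, and point the output at real backup storage:

```shell
#!/bin/sh
# Separate system-level backup for the Odoo install (paths from the post).
STAMP=$(date +%Y-%m-%d)
OUT="/tmp/odoo-system-$STAMP.tar.gz"   # demo target only
# "|| true": tar exits non-zero on the odd unreadable file; ignore that.
tar -czf "$OUT" /etc /opt 2>/dev/null || true
# A file-level copy of a running PostgreSQL data dir is NOT reliable;
# dump the Odoo database separately, e.g. (database name is a placeholder):
#   su - postgres -c "pg_dump odoo_db" | gzip > /tmp/odoo-db-$STAMP.sql.gz
echo "wrote $OUT"
```

Restoring is then a matter of untarring the archive and restoring the pg_dump by hand, outside Plesk.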
What I want to do is keep some incremental backups in subdirectories on the remote server, for example: /home/user/something.tuesday and /home/user/something.friday.
I thought the --backup and --backup-dir switches were used to store just the files that had changed in separate directories; am I wrong about that?
I've read everything I could find, including the big rsnapshot scripts, but I'm not able to do what I want. It seems so simple, but something's not right. Am I wrong that the subdirectories should contain just the files that are new or have changed? I tried various invocations but had no luck.
My cPanel doesn't create backups. When I force one, it gives me this error:
mount: can't find /backup in /etc/fstab or /etc/mtab
mount: can't find /backup in /etc/fstab or /etc/mtab
[cpbackup] Backup failed! /bekkaplars is not mounted! at /scripts/cpbackup line 415.
It's a VPS. Another interesting thing: my other three VPSes back up fine, even though their /etc/fstab has no /backup line either.
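That error usually means the legacy cpbackup configuration has "mount backup drive as needed" enabled (the BACKUPMOUNT key in /etc/cpbackup.conf, if I remember the key correctly) while fstab provides no /backup filesystem; your other VPSes probably just have it switched off. Either add a real /backup mount to /etc/fstab, or disable the flag in WHM's Backup Configuration. A sketch of the config fix — it operates on a stand-in file in /tmp so it's safe to run; point CONF at the real /etc/cpbackup.conf on the server:

```shell
#!/bin/sh
CONF=/tmp/cpbackup.conf               # real file: /etc/cpbackup.conf
printf 'BACKUPMOUNT yes\n' > "$CONF"  # stand-in contents for the demo

# If cpbackup expects a mount that fstab doesn't provide, disable it.
if grep -q '^BACKUPMOUNT yes' "$CONF"; then
    sed -i 's/^BACKUPMOUNT yes/BACKUPMOUNT no/' "$CONF"
fi
```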
How do I take backups in a reseller account? I have over 54 accounts in a reseller hosting plan and I don't have SSH, so how can I take a full backup of all the accounts? Any ideas? The server is Red Hat Enterprise 3 with WHM 11.23.0 / cPanel 11.23.1.
Is it possible to upload your backups to another server through FTP? For example, I made backups of my sites and they are all tar.gz files. Do I only need to upload them through FTP? Are the MySQL databases etc. already included in the files?
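Yes — a full cPanel account backup (the archive produced by "Download a Full Website Backup") includes the home directory, MySQL databases, email, and DNS zones, so uploading those archives over FTP is enough. A home-directory-only or partial backup does not include the databases, so check which type you have. A minimal upload sketch; the host, path, and credentials are placeholders:

```shell
#!/bin/sh
# Push one backup archive to an off-site FTP server; placeholders throughout.
ftp_upload() {
    # -T uploads the given local file; the trailing slash on the URL
    # keeps the original filename on the remote side.
    curl --silent --show-error -T "$1" \
         "ftp://backup.example.com/backups/" --user backupuser:secret
}
# usage: ftp_upload /home/me/backup-mysite.tar.gz
```

WHM's Backup Configuration can also transport backups to a remote FTP destination for you, which avoids scripting this at all.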