Legit Backup Script
Jan 10, 2007
I got this from 1and1 and have no confidence in them. I want to be sure my site is backed up (I will also be using bq and whoever the new host is to back it up). I have run these commands and it "backs up", and then I FTP the backup to my computer, but I want to be sure there is actually information in that backup. Does this sound legit?
Open Putty
login
at command, type
mysqldump --opt -Q -h localhost -u databaseusername -p databasename > sitename.backup.sql
hit enter
it goes to next line and is done
then I FTP the file to my computer and I have a backup.
So first, does it sound legit? Second, what do I DO with it should my site go down? Do I just FTP it back to the server?
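A few quick checks will tell you whether there is real data in the file. This is only a sketch: it creates a tiny stand-in dump so the commands are copy-pasteable; run the same three checks against your actual sitename.backup.sql.

```shell
# Stand-in for the real dump so this snippet runs anywhere -- point the
# checks below at sitename.backup.sql instead.
printf -- '-- MySQL dump\nCREATE TABLE posts (id INT);\nINSERT INTO posts VALUES (1);\n' > demo.backup.sql

ls -lh demo.backup.sql                     # size should not be 0 bytes
head -n 1 demo.backup.sql                  # real dumps start with "-- MySQL dump ..." comments
grep -c 'CREATE TABLE' demo.backup.sql     # expect roughly one per table in your database
```

To restore after a failure, you feed the file back into mysql rather than just FTPing it up: `mysql -h localhost -u databaseusername -p databasename < sitename.backup.sql` (the database itself must already exist).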
View 9 Replies
Jan 14, 2008
webhostmagazine.com have their own hosting reviews. Does anyone have any experience or opinions with regard to the reliability or integrity of these reviews?
I'm not one to take such things at face value.
View 4 Replies
View Related
Jul 26, 2008
[url]
Is this a trustworthy and accurate resource?
View 12 Replies
View Related
Aug 21, 2007
This is just a notice: one of the staff of a large site I run was no longer able to log into the site. As it turns out his IP was being blocked by APF.
The reason for his IP being blocked was that it ended in 255 (x.x.x.255). Any such addresses are blocked by the PKT_SANITY_STUFFED option, which is turned on by default in recent versions of APF. When restarting APF this option shows up as {pkt_sanity} deny all to/from 0.0.0.255/0.0.0.255 and can be seen under "OUT_SANITY" when doing "apf --list".
As you can see, the problem is that some ISPs are assigning supposedly "bad" IPs ending in 255 to users. And I'm not the only one hitting this problem either: [url]
If you are also using (a recent version of) APF, you might want to turn this option OFF.
In the meantime, if anyone is so enlightened... why was this option in APF in the first place? What's so bad about IPs ending in 255? The APF docs say they're bad broadcast addresses, so why are ISPs assigning them anyway? Who is at fault: APF or the ISPs?
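For anyone who decides to turn it off: the option name comes straight from the log output above, and the file path below is the standard location on a stock APF install (adjust if yours differs).

```
# /etc/apf/conf.apf  (path assumed from a standard APF install)
# "1" enables the stuffed-packet filter that drops x.x.x.255 addresses;
# set it to "0", then reload the firewall with:  apf -r
PKT_SANITY_STUFFED="0"
```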
View 3 Replies
View Related
May 18, 2009
Code:
Mon May 18 15:17:08 2009 lfd: *Suspicious File* /tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan [someuser:someuser] - Suspicious directory
The 'someuser' is a legitimate user on the server, an auto body website set up last October.
The content of the directory:
Quote:
root@server [/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan/CPAN]# ls -lh
total 3.0K
drwx------ 2 someuser someuser 1.0K May 16 17:54 ./
drwx------ 3 someuser someuser 1.0K May 16 17:54 ../
-rw-r--r-- 1 someuser someuser 361 May 16 17:54 MyConfig.pm
File content:
Code:
$CPAN::Config->{'cpan_home'} = "/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan";
$CPAN::Config->{'build_dir'} = "/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan/build";
$CPAN::Config->{'histfile'} = "/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan/histfile";
$CPAN::Config->{'keep_source_where'} = "/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan/sources";
1;
__END__
Code:
root@server [/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpcpan/STABLE]# ls -lh
total 3.0K
drwx------ 2 someuser someuser 1.0K May 16 17:54 ./
drwx------ 3 someuser someuser 1.0K May 16 17:54 ../
-rw-r--r-- 1 someuser someuser 735 May 16 17:54 modules.versions
View 0 Replies
View Related
Dec 15, 2008
I wanted to post about a site I'm very concerned and frustrated with, HostJury.
It's simple. The other day, one of our web hosting customers posted our HostJury URL in our customer forums. Since then, a few of our customers followed the link and decided to submit reviews about us, which was very pleasing.
Friday night, I saw we had our first 4 reviews. Suddenly yesterday (Saturday) afternoon I checked on the page, and saw all of them had been deleted.
But I looked up the reviewers. They were all legit customers of ours whom I was able to find in our customer database, so the reviews are definitely legit.
I am especially frustrated as those 4 had rated us very high, so these are important reviews that have been removed by HostJury.
I don't understand what's going on here...
o Legit reviews
o We didn't ask them to review us, they did it on their own
o We didn't bribe our reviewers
o We didn't reward our reviewers
So what's the problem, HostJury? These kinds of incidents are interfering with honesty in the hosting industry.
View 10 Replies
View Related
Dec 10, 2007
mod_evasive bans some legit users (galleries, Typo3, etc.) with the following settings:
<IfModule mod_dosevasive20.c>
DOSHashTableSize 3097
DOSPageCount 10
DOSSiteCount 150
DOSPageInterval 1
DOSSiteInterval 3
DOSBlockingPeriod 10
</IfModule>
Does somebody have an idea for some less restrictive but still useful rules?
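One possible starting point, keeping the same directives: raise the per-page and per-site thresholds and widen the page interval, since a gallery or Typo3 page legitimately triggers many requests from one client in a single second. The values below are only suggestions to tune from, not known-good numbers.

```
<IfModule mod_dosevasive20.c>
    DOSHashTableSize   3097
    # More hits per URI allowed before blocking -- image-heavy pages
    # request many objects from the same client at once.
    DOSPageCount       30
    DOSSiteCount       300
    # Wider page interval so one burst of thumbnail requests is not
    # counted as an attack.
    DOSPageInterval    2
    DOSSiteInterval    3
    DOSBlockingPeriod  10
</IfModule>
```

mod_evasive also has a DOSWhitelist directive, which is useful for exempting known proxies or monitoring hosts entirely.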
View 10 Replies
View Related
Oct 29, 2007
My server runs on CSF.
Very often the firewall automatically bans some of my customers' IPs. One customer has a fixed IP used to access their webmail and website, but they have over 100 staff, so maybe the IP gets banned automatically for having too many connections to the server.
Every time I unban the IP, it gets banned again. I have to stop/restart iptables to flush it.
How can I allow the IP permanently?
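CSF keeps permanent allow and ignore lists as plain files; an entry in both stops the firewall from blocking the IP and stops lfd from re-adding the block. The IP below is an example placeholder.

```
# /etc/csf/csf.allow -- let the office IP through the firewall permanently
203.0.113.10 # customer office, ~100 staff behind one fixed IP

# /etc/csf/csf.ignore -- stop lfd from re-blocking the same IP
203.0.113.10
```

Reload with `csf -r` afterwards; `csf -a 203.0.113.10` is a shortcut that appends to csf.allow for you. The csf.ignore entry matters here: csf.allow alone does not stop lfd's login-failure/connection-tracking blocks.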
View 1 Replies
View Related
Feb 8, 2015
So I've got a problem where a small percentage of incoming requests are resulting in "400 bad request" errors and I could really use some input. At first I thought they were just caused by malicious spiders, scrapers, etc. but they seem to be legitimate requests.
I'm running Apache 2.2.15 and mod_perl2.
The first thing I did was turn on mod_logio and interestingly enough, for every request where this happens the request headers are between 8000-9000 bytes, whereas with most requests it's under 1000. Hmm.
There are a lot of cookies being set, and it's happening across all browsers and operating systems, so I assumed it had to be related to bad or "corrupted" cookies somehow - but it's not.
I added "%{Cookie}i" to my LogFormat directive hoping that would provide some clues, but as it turns out half the time the 400 error is returned the client doesn't even have a cookie. Darn.
Next I fired up mod_log_forensic hoping to be able to see ALL the request headers, but as luck would have it nothing is logged when it happens. I guess Apache is returning the 400 error before the forensic module gets to do its logging?
By the way, when this happens I see this in the error log:
request failed: error reading the headers
To me this says Apache doesn't like something about the raw incoming request, rather than a problem with our rewriting, etc. Or am I misunderstanding the error?
I'm at a loss where to go from here. Is there some other way that I can easily see all the request headers? I feel like that's the only thing that will possibly provide a clue as to what's going on.
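One plausible explanation given the evidence above: Apache's limit on a single request header line, LimitRequestFieldSize, defaults to 8190 bytes, and your failing requests have header totals of 8000-9000 bytes. If most of that is one giant Cookie line, Apache rejects the request with 400 while still reading the headers, before modules like mod_log_forensic ever see it, which matches both the error message and the empty forensic log. Raising the limits is a quick way to test the theory (shrinking the cookies is the real fix):

```
# httpd.conf -- defaults shown in the comments; these are core directives.
LimitRequestFieldSize 16380   # default 8190 bytes per header line
LimitRequestLine      16380   # default 8190 bytes for the request line itself
```

If the 400s disappear after raising LimitRequestFieldSize, you have your answer: audit what is setting such large cookies.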
View 1 Replies
View Related
May 17, 2015
I have multiple backups stored under server repository (subscriptions --> <domainname> --> website and domains --> backup manager).
The physical files are located at: /var/lib/psa/dumps/clients/904279/domains/<domainname>/
When I click the green arrow to download these files to a local computer (see attached image) I get a new page with title "Download the backup file". On this page I have the option to set a password on the downloaded file, but no matter what I do (password or no password) the file is not downloaded to my local PC. I don't get a pop-up box with the option to save the file. Just nothing happens ...
View 1 Replies
View Related
Aug 15, 2014
I have 2 problems:
Firstly, is there any possibility to limit the number of cores the Plesk backup zipping tool (pigz) uses? It takes up all my CPU, and all my websites are down for around 3 minutes every time a backup takes place.
Secondly I get the following in my syslog:
1 baby plesk sendmail[20189]: Error during 'check-quota' handler
I don't know what is wrong. I think it started with the upgrade to Plesk 12. I am now on 12.0.18 Update #13.
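On the pigz question: pigz itself has a -p option that caps its thread count. One workaround, assuming Plesk simply shells out to the system pigz binary, is to move the real binary aside and install a small shim in its place. The demo below writes the shim to ./pigz so the snippet is safe to run anywhere; on the server you would move /usr/bin/pigz to /usr/bin/pigz.real and install the shim as /usr/bin/pigz.

```shell
# Shim that caps pigz at 2 compression threads and passes everything
# else through untouched. (Written to ./pigz here for safety.)
cat > ./pigz <<'EOF'
#!/bin/sh
exec /usr/bin/pigz.real -p 2 "$@"
EOF
chmod +x ./pigz
```

A less invasive alternative is wrapping the same call in `nice`/`ionice` so the backup still uses idle CPU but yields to your web processes.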
View 9 Replies
View Related
Sep 17, 2014
I have a 6GB backup file created with another Plesk Backup Manager. Now I'm trying to upload this backup file to my Plesk Backup Manager, but after uploading 3% I get a "413 Request Entity Too Large" error. I tried disabling NGINX but still get the error.
How can I resolve this error, or is there another way to upload my file to the Backup Manager?
I see that the Backup Manager has a file size restriction of 2GB. How can I increase this?
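A 413 during upload is typically nginx's request-body limit rather than Plesk itself; the directive is client_max_body_size and its default is only 1m, so a 6GB upload fails almost immediately. Where exactly to set it depends on how Plesk generates its nginx config, so treat the location as an assumption:

```
# nginx (in the applicable server/http block) -- default is 1m
client_max_body_size 8192m;
```

For a file this size it may be simpler to bypass the browser upload entirely: copy the dump to the server's backup repository over scp/FTP and let Backup Manager pick it up from disk, rather than pushing 6GB through an HTTP request.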
View 2 Replies
View Related
Feb 11, 2015
I have an Ubuntu 14.04 LTS 64 bit virtual private server with Plesk 12. The server is hired from a hosting provider. The server is used to run the Odoo ERP application (using postgres database).
The Odoo application is running fine and now I want to create a backup of the application using Plesks Backup manager.
I choose configurations and content option in the backup manager but the created backup is only 200kb.
I think the problem is the location where the Odoo application is installed is not included in the backup. I made a tar backup from the server and extracted it on my pc. It seems that the main parts of the Odoo application are in the var, opt, etc and usr directories (not in a domain but under root).
Installing the application in a domain would solve the Plesk backup issue, I think, but the installation script of Odoo puts Odoo in the var, opt, etc and usr directories even if I put the install script in the directory of a created domain. Since manual Odoo installation is complicated, I am very happy to use the script.
My questions are:
1. Is it possible to include the directories var, opt, etc and usr in the Plesk backup and how and where do I do that?
2. Can I restore such a backup without problems in Plesk?
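On question 1: Plesk's Backup Manager backs up the objects Plesk manages (domains, mailboxes, its own databases), which is why an application living in /opt, /etc, /var and /usr produces a 200kb archive; as far as I know you cannot point it at arbitrary system paths. A pragmatic sketch is to back up Odoo alongside Plesk with tar plus pg_dump, e.g. from cron. The demo archives a scratch tree so it runs anywhere; the real paths (/opt/odoo, /etc/odoo, /var/lib/odoo) and database name are assumptions to replace with whatever the install script actually used.

```shell
# Demo tree standing in for the real Odoo directories.
mkdir -p demo/opt/odoo demo/etc/odoo
echo 'options' > demo/etc/odoo/odoo.conf

# Archive the application directories...
tar czf odoo-files.tar.gz -C demo opt etc
# ...and dump the database (run as a postgres-capable user):
# pg_dump -Fc odoo > odoo-db.dump

tar tzf odoo-files.tar.gz     # verify the archive actually lists the files
```

On question 2: such a tar/pg_dump pair is restored with `tar xzf` and `pg_restore` on the server, not through Plesk's restore dialog, so it answers the backup need without making Plesk aware of Odoo.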
View 1 Replies
View Related
Jan 26, 2007
I currently do rsync backups every day with a command like this:
rsync -az -e ssh --stats --delete --exclude "stuff" / user@server:/home/user/
What I want to do is have some incremental backups in there in subdirectories. So, for example, something like this on the remote server
/home/user/something.tuesday
/home/user/something.friday
I thought the --backup and --backup-dir switches were used to store just the files that had changed in separate directories. Am I wrong on that?
I've read everything I could find, including the big rsnapshot scripts, but I'm not able to do what I want. It seems so simple, but something's not right. Am I wrong that the subdirs should contain just the files that are new or have changed? I tried various things like this, but had no luck:
rsync -az -e ssh --stats --delete --backup --backup-dir=/home.Thursday --exclude "stuff" / user@server:/home/user/
View 0 Replies
View Related
Dec 14, 2008
I have transferred a full cPanel backup to FTP backup.
But how do I transfer the full backup from the FTP backup back to cPanel?
Which software?
View 9 Replies
View Related
Dec 16, 2008
My cPanel doesn't create backups. When I force one, it gives me this error:
mount: can't find /backup in /etc/fstab or /etc/mtab
mount: can't find /backup in /etc/fstab or /etc/mtab
[cpbackup] Backup failed! /bekkaplars is not mounted! at /scripts/cpbackup line 415.
It's a VPS. Another interesting thing: my other 3 VPSes run fine even though their /etc/fstab has no /backup line either.
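That error suggests the legacy cpbackup is configured to treat the backup target as a separate drive and refuses to run until it is mounted. On a VPS with no dedicated backup partition, the usual fix is to turn that check off (the key name below is from the legacy /etc/cpbackup.conf; verify it against your file before editing):

```
# /etc/cpbackup.conf -- tell cpbackup the target is a plain directory,
# not a drive that must be mounted first; then re-run the backup.
BACKUPMOUNT no
```

That would also explain why your other 3 VPSes are fine: they presumably have this set to "no" already, so the missing fstab entry never matters.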
View 5 Replies
View Related
Oct 31, 2007
I would like to weigh the benefits and costs of using remote backup versus local backup to another hard disk. Let's assume the price is the same.
What are the benefits of using remote backup?
Is it secure to use local backup on another hard disk?
I'm running Linux CentOS.
What happens if a hacker gets hold of my server?
Currently, I have 80GB of disk space. Does that mean I will need at least 80GB on ANOTHER hard disk to back it up?
View 14 Replies
View Related
May 30, 2008
How do I take backups in a reseller account? I have over 54 accounts in a reseller hosting plan, and I don't have SSH, so how can I take a full backup of all the accounts? The server is Red Hat Enterprise 3 with WHM 11.23.0 and cPanel 11.23.1.
View 5 Replies
View Related
Feb 24, 2008
I am using cPanel, and cPanel takes a backup of my account to a remote backup space at 1 AM once a day.
Is it possible to have it take a backup twice or more a day?
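If you have root on the server, the legacy cPanel backup is kicked off by a root cron entry running /scripts/cpbackup, so adding a second hour to that entry runs it twice daily:

```
# root's crontab (crontab -e): run cpbackup at 01:00 and at 13:00
0 1,13 * * * /scripts/cpbackup
```

One caveat: each run writes to the same "daily" destination, so the second run overwrites the first unless you also stagger the destination or retention settings. On shared hosting without root, this is something your host would have to change for you.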
View 4 Replies
View Related
May 28, 2008
I have 2 dedicated servers and use FTP backup:
Serve 1 > server 2
server2 > server 1
But I worry that if someone knows my FTP user or any link to the backup directory, they can download my backups!
What should I do?
The first idea is to change the permissions of the backup directory.
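Tightening permissions is worth doing, but note what it protects against: it stops other local users and the web server from reading the files, while anyone who has the FTP credentials still gets in. So combine owner-only permissions with a dedicated backup-only FTP account (or better, switch the transfer to scp/rsync over SSH). A sketch of the permissions part, using a demo directory; point the same commands at your real backup path:

```shell
# Owner-only access: nobody else can list the directory or read the archive.
mkdir -p demo_backups
echo 'archive' > demo_backups/full-backup.tar.gz
chmod 700 demo_backups
chmod 600 demo_backups/full-backup.tar.gz
stat -c '%a %n' demo_backups demo_backups/full-backup.tar.gz
```

Also make sure the backup directory sits outside any web-accessible docroot, since "any link to the backup directory" is only a risk if a URL can reach it at all.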
View 2 Replies
View Related