Is Hyperspin Rankings Legit
Jul 26, 2008
[url]
Is this a trustworthy and accurate resource?
[url]
Is this a trustworthy and accurate resource?
Which of these two do you think is more reliable? We have been using Hyperspin for the last two years, but are now looking to switch to Pingdom because of their pricing: they offer 1-minute monitoring at only $0.50/month, compared to $12/month with Hyperspin. I have been testing Pingdom over the last two weeks, and they seem to be quite reliable.
I have been using siteuptime.com and they seem to be working well, but since I do a content check (making sure vBulletin and MySQL are working) and I want two-minute checks, I pay two extra $5 charges, so it comes out to $15 to check my one site.
I read about Hyperspin the other day, and they do 2-minute checks (with content check, looking for keywords on the page) for around $5 a month.
So, from a cost perspective, Hyperspin is a third of the cost of SiteUptime for my needs (2-minute checks, keyword matching).
I am curious about people who have used both. Are they equally reliable? Has anyone had any problems with Hyperspin?
I am considering a server from iWeb, which is based in Canada.
My question is about search engines, especially Google. My site will be targeting US visitors, and I am in the US. When Google and other search engines see that my IP address is in Canada, will it have any effect on search engine rankings on the regular US google.com? I don't care much about rankings on google.ca, since my visitors will mainly be in the USA. Does anybody have any insight on this? I should also mention my domain name will be a .com domain.
webhostmagazine.com has its own hosting reviews; does anyone have any experience or opinions with regard to the reliability or integrity of these reviews?
I'm not one to take such things at face value.
This is just a notice: one of the staff of a large site I run was no longer able to log into the site. As it turns out his IP was being blocked by APF.
The reason for his IP being blocked was that it ended in 255 (x.x.x.255). Any such addresses are blocked by the PKT_SANITY_STUFFED option, which is turned on by default in recent versions of APF. When restarting APF this option shows up as {pkt_sanity} deny all to/from 0.0.0.255/0.0.0.255 and can be seen under "OUT_SANITY" when doing "apf --list".
As you can see, the problem is that some ISPs are assigning supposedly "bad" IPs ending in 255 to users. And I'm not the only one hitting this problem either: [url]
If you are also using (a recent version of) APF, you might want to turn this option OFF.
In the meantime, if anyone is so enlightened... why was this option in APF in the first place? What's so bad about IPs ending in 255? The APF docs say they're bad broadcast addresses, so why are ISPs assigning them anyway? Who is at fault: APF or the ISPs?
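For what it's worth, the answer is that x.x.x.255 is only a broadcast address when the enclosing subnet is a /24 or narrower. In a CIDR allocation wider than /24, an address ending in .255 is an ordinary host address, so the ISPs aren't doing anything wrong. A quick demonstration, using a documentation range rather than anyone's real allocation:

```python
import ipaddress

# Classful thinking says x.x.x.255 is always a broadcast address; with CIDR
# that only holds for /24-or-longer prefixes. In this /23, .255 is an
# ordinary host address.
net = ipaddress.ip_network("198.51.100.0/23")
print(net.broadcast_address)                           # 198.51.101.255
print(ipaddress.ip_address("198.51.100.255") in net)   # True: a plain host
```

So APF's check is a leftover classful assumption, and turning it off seems reasonable; on a default install that means setting PKT_SANITY_STUFFED="0" in APF's conf.apf and restarting APF, though the exact path can vary by install.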
I got this from 1&1 and have no confidence in them. I want to be sure my site is backed up (I will be using bq and whoever the new host is to back it up also). I have run these commands and it "backs up", and then I FTP the backup to my computer, but I want to be sure there is actually information in that backup. Does this sound legit?
Open PuTTY
Log in
At the command prompt, type
mysqldump --opt -Q -h localhost -u databaseusername -p databasename > sitename.backup.sql
Hit Enter
It goes to the next line and is done
Then FTP to my computer and I have a backup.
So first, does it sound legit? Second, what do I DO with it should my site go down. Do I just FTP it back to the server?
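On the verification question: a dump that actually worked contains schema and data statements, not just a zero-byte file. A minimal sketch of a sanity check (the filename matches the steps above; the marker strings are standard mysqldump output, but treat this as a heuristic, not a guarantee):

```python
from pathlib import Path

def dump_looks_valid(path: str) -> bool:
    """Heuristic: a usable mysqldump file contains schema and data statements."""
    text = Path(path).read_text(errors="replace")
    return "CREATE TABLE" in text and "INSERT INTO" in text

# e.g. dump_looks_valid("sitename.backup.sql")
```

As for what to do if the site goes down: you don't just FTP the file back into place. You re-upload it to the server and feed it through the mysql client, e.g. `mysql -u databaseusername -p databasename < sitename.backup.sql`, which replays the dumped statements into the (already created) database.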
Code:
Mon May 18 15:17:08 2009 lfd: *Suspicious File* /tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan [someuser:someuser] - Suspicious directory
The 'someuser' is a legitimate user on the server, an auto body shop website set up last October.
The content of the directory:
Quote:
root@server [/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan/CPAN]# ls -lh
total 3.0K
drwx------ 2 someuser someuser 1.0K May 16 17:54 ./
drwx------ 3 someuser someuser 1.0K May 16 17:54 ../
-rw-r--r-- 1 someuser someuser 361 May 16 17:54 MyConfig.pm
File content:
Code:
$CPAN::Config->{'cpan_home'} = "/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan";
$CPAN::Config->{'build_dir'} = "/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan/build";
$CPAN::Config->{'histfile'} = "/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan/histfile";
$CPAN::Config->{'keep_source_where'} = "/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpan/sources";
1;
__END__
Code:
root@server [/tmp/perl_install.work.TLoX0YtaJBrzShwA/.cpcpan/STABLE]# ls -lh
total 3.0K
drwx------ 2 someuser someuser 1.0K May 16 17:54 ./
drwx------ 3 someuser someuser 1.0K May 16 17:54 ../
-rw-r--r-- 1 someuser someuser 735 May 16 17:54 modules.versions
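For anyone wanting to see what else a run like this left behind, a small sketch that lists hidden (dot-prefixed) entries under a directory; the function name is my own and /tmp is just the obvious place to point it:

```python
import os

def hidden_entries(root: str) -> list:
    """List dot-prefixed files and directories anywhere under root."""
    found = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            if name.startswith("."):
                found.append(os.path.join(dirpath, name))
    return sorted(found)

# e.g. hidden_entries("/tmp")
```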
I wanted to post about a site I'm very concerned and frustrated with, HostJury.
It's simple: the other day, one of our web hosting customers posted our HostJury URL in our customer forums. Since then, a few of our customers followed the link and decided to submit reviews about us, which was very pleasing.
Friday night, I saw we had our first 4 reviews. Then suddenly, yesterday (Saturday) afternoon, I checked the page and saw all of them had been deleted.
But I looked up the reviewers. The reviews were all posted by legit customers of ours, whom I was able to find in our customer database, so they are definitely genuine.
I am especially frustrated because those 4 had rated us very highly, so these are important reviews that HostJury has removed.
I don't understand what's going on here...
o Legit reviews
o We didn't ask them to review us, they did it on their own
o We didn't bribe our reviewers
o We didn't reward our reviewers
So what's the problem, HostJury? Incidents like this interfere with honesty in the hosting industry.
mod_evasive bans some legitimate users (galleries, TYPO3, etc.) with the following settings:
<IfModule mod_dosevasive20.c>
DOSHashTableSize 3097
DOSPageCount 10
DOSSiteCount 150
DOSPageInterval 1
DOSSiteInterval 3
DOSBlockingPeriod 10
</IfModule>
Does anybody have an idea for some less restrictive but still useful rules?
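For what it's worth, a gallery or TYPO3 page can pull dozens of objects in under a second, so DOSPageCount 10 inside a 1-second DOSPageInterval trips easily. A less aggressive starting point might look like the following; the numbers are guesses to tune against your own traffic, not recommendations, and DOSWhitelist is a standard mod_evasive directive for IPs that should never be blocked:

```apache
<IfModule mod_dosevasive20.c>
    DOSHashTableSize   3097
    DOSPageCount       20     # same-URI hits allowed per DOSPageInterval
    DOSSiteCount       300    # total hits per client per DOSSiteInterval
    DOSPageInterval    1
    DOSSiteInterval    2
    DOSBlockingPeriod  10
    DOSWhitelist       127.0.0.1   # never block localhost / known proxies
</IfModule>
```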
My server runs CSF.
Very often the firewall automatically bans the IP of a customer who uses a fixed IP to access their webmail and website. They have over 100 staff, so the IP is probably being banned automatically for making too many connections to the server.
Every time I unban the IP, it gets banned again. I have to stop/restart iptables to flush it.
How can I allow the IP permanently?
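Assuming a stock CSF/lfd install, adding the IP to csf.allow only exempts it from the firewall rules; lfd's connection-tracking can still block it, so the IP also needs to go into csf.ignore. The paths and file names below are the CSF defaults, and the IP is a placeholder:

```
# /etc/csf/csf.allow -- always permit this IP through the firewall
203.0.113.10 # customer office, ~100 staff behind one fixed IP

# /etc/csf/csf.ignore -- tell lfd never to block this IP
203.0.113.10
```

Reload with `csf -r` afterwards. If many customers NAT large offices behind single IPs, raising CT_LIMIT in /etc/csf/csf.conf may be a better global fix than whitelisting each one.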
So I've got a problem where a small percentage of incoming requests are resulting in "400 bad request" errors and I could really use some input. At first I thought they were just caused by malicious spiders, scrapers, etc. but they seem to be legitimate requests.
I'm running Apache 2.2.15 and mod_perl2.
The first thing I did was turn on mod_logio and interestingly enough, for every request where this happens the request headers are between 8000-9000 bytes, whereas with most requests it's under 1000. Hmm.
There are a lot of cookies being set, and it's happening across all browsers and operating systems, so I assumed it had to be related to bad or "corrupted" cookies somehow - but it's not.
I added "%{Cookie}i" to my LogFormat directive hoping that would provide some clues, but as it turns out half the time the 400 error is returned the client doesn't even have a cookie. Darn.
Next I fired up mod_log_forensic hoping to be able to see ALL the request headers, but as luck would have it nothing is logged when it happens. I guess Apache is returning the 400 error before the forensic module gets to do its logging?
By the way, when this happens I see this in the error log:
request failed: error reading the headers
To me this says Apache doesn't like something about the raw incoming request, rather than a problem with our rewriting, etc. Or am I misunderstanding the error?
I'm at a loss where to go from here. Is there some other way that I can easily see all the request headers? I feel like that's the only thing that will possibly provide a clue as to what's going on.
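One possibility consistent with the 8000-9000 byte header sizes: Apache's LimitRequestFieldSize defaults to 8190 bytes per individual header field, and a single field over that limit (typically a bloated Cookie header) produces exactly a 400 with "error reading the headers", rejected before most modules, including mod_log_forensic, ever see the request. The directive and its default are standard Apache; the raised value below is an arbitrary test value, and note that some builds cap it at the compile-time default (DEFAULT_LIMIT_REQUEST_FIELDSIZE), in which case shrinking the cookies is the only fix:

```apache
# httpd.conf -- server config scope (not .htaccess)
LimitRequestFieldSize 16380   # default 8190; cap on any single header field
LimitRequestLine      8190    # separate cap on the request line itself
```

If raising the limit makes the 400s disappear, you've confirmed the cause and can decide whether to keep the higher limit or trim what's being stored in cookies.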