I am just wondering whether my idea will work with the Google search engine.
Basically, I have my official VPS root (home/admin/public_html/), and this is where my main website will be hosted. However, since my VPS will be used for additional websites, I will point additional domains at it.
My second site hosted on this VPS may have a document root of (home/admin/public_html/advertising/), and its domain will then be set up to use that as its document root.
However, when Google crawls my second site (e.g. advertising.com), will it go 'below' that domain's root? For example, will it also crawl the files under /public_html/ for this domain, even though the domain's root is /public_html/advertising/?
Edit: Or do people host multiple sites differently? Is this an appropriate method?
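One common way to set this up (a sketch only; the paths and domain names are placeholders based on the layout described above) is one Apache virtual host per domain, each with its own DocumentRoot:

```apacheconf
# Hypothetical vhost layout -- one site per domain, nested document roots.
<VirtualHost *:80>
    ServerName maindomain.com
    DocumentRoot /home/admin/public_html
</VirtualHost>

<VirtualHost *:80>
    ServerName advertising.com
    DocumentRoot /home/admin/public_html/advertising
</VirtualHost>
```

Note that Googlebot crawls URLs, not the filesystem, so it only ever sees what each virtual host serves. The catch with nesting is that the second site is also reachable as maindomain.com/advertising/, which can look like duplicate content; using sibling directories (or disallowing /advertising/ in the main site's robots.txt) avoids that.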
I recently found out that the Google robot spidered my site and used up 5.74 GB of bandwidth! There were 4,498 hits, and I even have it scheduled to return every two weeks.
Now my site is down for the rest of the month unless I up my bandwidth. This is the second time this has happened to me in the past year. What is going on?
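One way to rein in crawler bandwidth is a robots.txt at the site root. A minimal sketch (the /downloads/ path is a made-up example of a heavy directory you might exclude); note that Googlebot ignores the Crawl-delay directive, so for Google specifically the crawl rate has to be lowered in Webmaster Tools instead:

```
User-agent: *
Crawl-delay: 10

User-agent: Googlebot
Disallow: /downloads/
```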
Googlebot has been absolutely ripping through my bandwidth. This has been ongoing for many months now, and each month the damage has gotten worse. I have posted about this problem here, but with no luck:
Yes, it’s quite true. DreamHost representatives are asking their clients to block GoogleBot through the .htaccess file, because their websites were “hammered by GoogleBot”.
Dreamhost representatives are also guiding their clients to make all their websites “unsearchable and uncrawlable by search engine robots”, because they cause “high memory usage and load on the server”.
PS: This is NOT a rumor. Multiple people are already complaining that DreamHost is asking clients to block search engine spiders. A post on QuickOnlineTips confirms this too.
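For reference, the kind of .htaccess block reportedly being suggested would look something like this (a sketch only, matching on the User-Agent string; not an endorsement, since it cuts the site out of Google's index entirely):

```apacheconf
# Deny any request whose User-Agent identifies as Googlebot.
SetEnvIfNoCase User-Agent "Googlebot" block_bot
<Limit GET POST>
    Order Allow,Deny
    Allow from all
    Deny from env=block_bot
</Limit>
```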
If your server is blocking Googlebot from finding your robots.txt file, how do you configure your firewall to unblock it?
I've searched through Google and seen many people say your firewall is blocking it, but none of them mention how to actually stop it from doing that. Does Google have an IP range it uses, and if so, what IPs should you whitelist on your server?
I keep getting the message "Network unreachable: robots.txt unreachable", and I'm sure it's due to a firewall issue; I just have no idea how to fix it.
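Google does not publish a stable whitelist of crawler IPs, so rather than hard-coding addresses into the firewall, the approach Google itself documents is to verify Googlebot by DNS: reverse-resolve the connecting IP, check that the name is under googlebot.com or google.com, then forward-resolve that name and confirm it maps back to the same IP. A minimal Python sketch of that check (the function names are mine):

```python
import socket

def looks_like_google_host(hostname: str) -> bool:
    """Pure string check: genuine Google crawlers reverse-resolve
    to names under these two domains."""
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

def is_real_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-confirm
    that the resulting name resolves back to the same IP."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
        if not looks_like_google_host(hostname):
            return False
        # gethostbyname_ex returns (name, aliases, ip_list).
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.error:
        return False
```

Any IP that passes this check can then be whitelisted; a simple User-Agent match is not enough, since anyone can claim to be Googlebot.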
One of my customers uses Google Webmaster Tools. Looking at which of his pages Googlebot has indexed, he found that 180 pages are giving a "DNS lookup timeout" error. I tried searching Google for help, and the only thing I found was "We received a timeout on DNS lookup."
The DNS is OK, as is the zone file; everything is responding fine. I don't know what the issue could be. Any ideas?
I recently leased a dedicated server, and it has something called modsecurity installed, which I "think" is causing me a slight problem. I installed Tikiwiki (using Fantastico as the installer) to put a wiki on my site. Problem: when I edit a page and hit "Save," I get "FORBIDDEN: you do not have permission to access /tiki/tiki-editpage.php on this server". After playing around with it all day, I finally asked my server management folks if they could figure out the issue, and they said it looked like a "modsecurity" issue. If I understand correctly, modsecurity will block URLs that have certain characteristics.
My questions are:
1) How can I determine exactly which modsecurity rule is being violated, and
2) How can I remove just that rule so that things will work with the wiki program?
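For the first question: ModSecurity records the ID of the matched rule in the Apache error log and in modsec_audit.log, in an [id "..."] field on the entry for the blocked request. Once you have that ID, with ModSecurity 2.x you can disable just that one rule, and only for the wiki path. A sketch (950004 is a made-up placeholder; substitute the ID from your own log):

```apacheconf
# Disable a single false-positive rule only for the Tikiwiki edit page.
<LocationMatch "^/tiki/tiki-editpage\.php">
    SecRuleRemoveById 950004
</LocationMatch>
```

This is safer than turning ModSecurity off entirely, since every other rule stays active for the rest of the site.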
I am running Apache 1.3 with ModSecurity 1. My problem is that I cannot use Ajax because of ModSecurity. Is there any way to make Ajax work with ModSecurity on Apache 1? I know it works on Apache 2.
We have a small hosting reseller account at eNom. We have a new customer who moved his website from another hosting company to ours. The website is on a shared IP. eNom also uses an internal IP, associated with the domain, for internal use.
The problem we have is that AOL users cannot see the website. As far as we can tell, no other ISPs are having this problem; everyone can see it except AOL users.
When AOL users go to the site, they get "Page can not be found". After several calls to eNom support, and after they triple-checked the DNS, we still have the problem.
I looked at the error log for the website this morning and found several errors. I looked up the IPs in the errors, and they all pointed back to AOL. See below for two examples of the errors.
Is this a server problem or DNS?
What do these errors mean and what do I do about it?
The domain is http://2hotlicks.com . They sell hot sauce. Would AOL block it because of the keywords in the domain name?
[Wed Oct 17 08:11:56 2007] [error] [client 207.200.116.7] ModSecurity: Access denied with code 400 (phase 2). Pattern match "(?:\bhttp.(?:0\.9|1\.[01])|<(?:html|meta)\b)" at REQUEST_HEADERS:Via. [id "950911"] [msg "HTTP Response Splitting Attack. Matched signature <http/1.1>"] [severity "ALERT"] [hostname "www.2hotlicks.com"] [uri "/"] [unique_id "uPWvAgoHAlYAAA25N5AAAAAI"]
[Tue Oct 16 13:11:20 2007] [error] [client 207.200.116.137] ModSecurity: Access denied with code 400 (phase 2). Pattern match "(?:\bhttp.(?:0\.9|1\.[01])|<(?:html|meta)\b)" at REQUEST_HEADERS:Via. [id "950911"] [msg "HTTP Response Splitting Attack. Matched signature <http/1.1>"] [severity "ALERT"] [hostname "www.2hotlicks.com"] [uri "/combos.htm"] [unique_id "yddhwAoHAlYAAEEfgyEAAAAi"]
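The log entries above point at the cause: rule 950911 is matching on the Via request header, and AOL users browse through AOL's caching proxies, which add a "Via: HTTP/1.1 ..." header to every request. That string matches the rule's response-splitting pattern, so it is a known false positive rather than an attack, and DNS is not involved. With ModSecurity 2.x syntax, one way to fix it is to disable just that rule in the site's configuration:

```apacheconf
# Rule 950911 (HTTP Response Splitting) false-positives on the
# "Via: HTTP/1.1" header inserted by AOL's proxies.
SecRuleRemoveById 950911
```

This keeps the rest of the rule set active while letting proxied AOL traffic through.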
I currently have the Web Application Firewall (ModSecurity) installed, but would like a visual interface to block IPs, subnets, etc. Can I install the Plesk firewall as well without any conflict with the Web Application Firewall?
I have a Real Time Web Application Security Rules subscription. I changed the ModSecurity rule setup and added the Atomic LoginData to Plesk. All looks fine, but the ModSecurity log is now empty.
- Debian 7 with all updates
- Plesk version 12.0.18 Update #49
Access denied with code 406. Error verifying files: Received no output from the approver script (execution failed?) "/usr/local/apache/htdocs/upload_scan.pl" ....
I have enabled the ModSecurity system, and in one day the modsec_audit.log file has grown to more than 700 MB. Is there any way to reduce the number of messages that this module logs?
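Assuming ModSecurity 2.x, the audit engine can be told to log only transactions that actually triggered a rule or returned an error status, instead of every request, and to record fewer parts of each transaction. A sketch of the relevant directives:

```apacheconf
# Audit only "relevant" transactions: rule matches plus 4xx/5xx
# responses (excluding 404s, which are usually just noise).
SecAuditEngine RelevantOnly
SecAuditLogRelevantStatus "^(?:5|4(?!04))"

# Record only selected parts of each transaction
# (A=header, B=request headers, F=response headers, H=trailer, Z=end).
SecAuditLogParts ABFHZ
```

Combined with normal logrotate handling of modsec_audit.log, this usually brings the file down to a manageable size.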
Error when trying to set atomic subscription rule:
Failed to install the ModSecurity rule set: SecReadStateLimit is depricated, use SecConnReadStateLimit instead. Syntax error on line 70 of /etc/httpd/conf/modsecurity.d/rules/atomic/modsec/00_asl_zz_strict.conf: Error creating rule: Could not add entry "127.0.0.0/8" from: 127.0.0.0/8.
In directory /etc/httpd/conf/modsecurity.d/rules I have only: atomic.new modsecurity_crs-plesk tortix tortix.backup
Once Atomic Basic is enabled, the following error appears:
Code:
Failed to install the ModSecurity rule set: modsecurity_ctl failed:
gpg: key 4520AFA9: "Atomicorp (Atomicorp Official Signing Key) <support@atomicorp.com>" not changed
gpg: Total number processed: 1
gpg: unchanged: 1
gpg: Signature made Tue Jun 17 16:53:49 2014 CEST using RSA key ID 4520AFA9
gpg: Good signature from "Atomicorp (Atomicorp Official Signing Key)
OS Debian 7.5 Plesk version 12.0.18 Update #4, last updated at June 18, 2014 02:51 AM