DNS Lookup Timeout From Googlebot Indexing Webpages
Jul 4, 2007
One of my customers uses Google's Webmaster Tools. Looking at which of his pages Googlebot has indexed, he found that 180 pages are returning a "DNS lookup timeout" error. I tried searching Google for help, and the only thing I found was "We received a timeout on DNS lookup."
DNS is fine, and so is the zone file; everything responds OK. I don't know what the issue could be. Any ideas?
I'm using Ensim Pro and RHES 4.
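One thing worth checking: Googlebot resolves through its own resolvers, so a lookup that works from inside your network can still time out from outside. A sketch, querying each authoritative server directly (example.com and ns1.example.com are placeholders for the real zone and its nameservers):
Code:
# Slow or missing answers here would explain external timeouts.
dig +norecurse example.com @ns1.example.com
# Walk the delegation chain from the roots down.
dig +trace example.com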
View 1 Replies
Nov 19, 2007
Does anyone know of a host which permits pdftotext.exe to run on their servers? My in-site search engine ("Orca") requires this executable to index PDF files.
View 3 Replies
View Related
Jul 13, 2008
Every time I enter my name in Google's search box and press Enter, the first result it returns is "Index of /", and that listing is my site.
The oddity of this is that I have all the appropriate META tags filled in. I have the following:
- The Google Verification META
- Content-Type
- author
- keywords
- abstract
- description
and
- robots set to content "all".
I have nothing in an .htaccess file that would cause this because it's completely blank to begin with.
My host uses cPanel, which bloats the space with all sorts of miscellaneous folders and other junk. I know it's useful, but it's sometimes visually and organizationally "cluttersome".
Anyway, what am I doing wrong here, or what is my host doing wrong? Why isn't my "www.whatever.com" coming up in the search results instead of this generic "Index of /" crap?
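An "Index of /" title in Google usually means the root was crawled while no index document was being served, so the auto-generated directory listing got indexed; META tags in the pages can't override that. A minimal .htaccess sketch, assuming Apache with AllowOverride permitting these directives:
Code:
# Serve a real index page and turn off automatic listings.
DirectoryIndex index.html index.php
Options -Indexes
Once an index page is in place and Google recrawls, the "Index of /" result should drop out.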
View 4 Replies
View Related
May 13, 2009
I've been testing out LiteSpeed on a few of my servers and am extremely pleased with it! The only issue I have left is directory indexes. A good chunk of my users store files and rely on Apache directory indexing to serve up the content. With LiteSpeed, the default behavior for a folder with no default document is to display nothing at all. I've looked through the documentation and found that LiteSpeed does have the functionality, but I'm having some trouble getting it to actually work.
Can someone tell me exactly where the index files for LiteSpeed are placed, what to configure, etc?
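A hedged pointer, going from memory on the admin console layout: LiteSpeed Enterprise honors Apache-style overrides, so a per-folder sketch like this may be all that's needed, assuming overrides are enabled for the virtual host:
Code:
# Re-enable automatic listings for folders with no default document.
Options +Indexes
Failing that, look for an "Auto Index" setting on the virtual host in the LiteSpeed admin console; if I recall correctly, the bundled index script lives under the server's share directory rather than in each docroot, which would explain why no index files are visible on disk.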
View 9 Replies
View Related
Jan 26, 2009
I have this problem with my hosting provider: the website loads slowly (sometimes up to 10 seconds) from my location. I've made several checks:
1. Script execution time is within 0.1 sec.
2. Tracert shows times within 0.3 sec.
3. Also, host-tracker.com shows average loading time ~2.0 sec (they check from 20-30 different locations around the world).
So, the questions:
1. What can cause such long slow-downs?
2. What average loading times can be expected from shared hosting? Is 2.0 sec (average) good or bad? For example, google.com has an average response time of about 0.2-0.3 sec, but they have many servers around the world.
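One way to narrow down where the time goes is curl's timing variables; running the same check from the slow location and a fast one makes the comparison obvious. A sketch (the URL is a placeholder):
Code:
# Prints how long each phase took: DNS, TCP connect, first byte, total.
curl -o /dev/null -s -w "dns: %{time_namelookup}s connect: %{time_connect}s first byte: %{time_starttransfer}s total: %{time_total}s\n" http://example.com/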
View 14 Replies
View Related
Oct 30, 2014
This is just an internal server set up to learn web programming; I'm looking to create an environment as close as possible to a real-world setup. I'm using xxxx.dlinkddns.com
View 1 Replies
View Related
Aug 6, 2007
In my logs, I can see this timeout error caused by mod_security on my RHES 4 server.
Code:
mod_security-message: Access denied with code 403. Error reading request body, error code 70007: The timeout specified has expired [severity "EMERGENCY"]
I didn't specify any mod_security rule about a "timeout", so what timeout is this about?
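APR error 70007 is APR_TIMEUP: Apache gave up waiting for the client to finish sending the request body, and mod_security logged the aborted read. The limit comes from Apache's core Timeout directive, not from any mod_security rule. A hedged httpd.conf sketch:
Code:
# Core Apache directive (default 300 seconds). Raising it can reduce
# these 403s from slow clients, at the cost of tying up workers longer.
Timeout 600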
View 3 Replies
View Related
Oct 3, 2006
Upon reviewing my mod_security log today, I found an interesting hit from Google.
-------------------
Requesting IP: 66.249.65.67 is http://ws.arin.net/cgi-bin/whois.pl?...t=66.249.65.67
Date: 2006-10-03
Time: 07:10:16
Handler: mod_gzip_handler
Get: /page/index/1&show=25,07,2005?php%20echo%20$bmc_vars%5B'site_url'%5D;%20?%3E/profile.php?id=1
Mod_Security-Message: Access denied with code 406. Pattern match "echo " at THE_REQUEST
Mod_Security-Action: 406
------------
The rule that set off this 406 response was:
SecFilterSelective THE_REQUEST "echo "
What I find interesting is that no such URL structure exists on this website, yet Google requested it.
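Googlebot crawls URLs it harvests from links on other sites, so a mangled path full of unparsed PHP can still come from the real crawler (someone likely published broken template code somewhere). Forward-confirmed reverse DNS on the logged IP settles it:
Code:
# The IP should reverse-resolve to a googlebot.com name, e.g.
# crawl-66-249-65-67.googlebot.com; that name should then resolve
# back to the same IP. If either step fails, it isn't Google.
host 66.249.65.67
host crawl-66-249-65-67.googlebot.com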
View 2 Replies
View Related
Mar 22, 2007
I'm running a web server with mod_evasive and want to know how I can prevent mod_evasive from blocking the Googlebot crawler's IP addresses.
Is there a script out there that can detect this crawler and make sure its IPs don't get blocked by iptables or mod_evasive?
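No external script should be needed: mod_evasive has a whitelist directive of its own. A sketch, assuming mod_evasive 1.10 on Apache 2 and assuming Google still crawls from 66.249.64.0/19 (verify the range before relying on it):
Code:
<IfModule mod_evasive20.c>
    # DOSWhitelist accepts trailing-octet wildcards; add one line
    # per /24 you want exempted from blocking.
    DOSWhitelist 66.249.64.*
    DOSWhitelist 66.249.65.*
    DOSWhitelist 66.249.66.*
</IfModule>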
View 9 Replies
View Related
Mar 29, 2007
I use Bluehost's PageWizard. I created members.htm and then moved it to a password-protected directory.
I don't know how to get the pics to show up; I only know that the images are located in the wizard images folder.
View 1 Replies
View Related
Jan 19, 2015
I have five domains redirecting to my main site. I originally set up these redirect domains because many people in my target demographic are not very computer literate, and I worried they would misspell or mistype the site name. I want to do what Google does with gogle.com and their other domains.
exampledomains.com <-- Main Site
exampledomains.net
exampledomains.org
exampledomain.com
exampledomain.net
exampledomain.org
Now I'm seeing that Google is indexing about 3000 results for each redirect domain, and I'm worried it's going to hurt my SEO.
My host set up the domains with the same IP as my main site, then put non-masking redirects in the .htaccess. The sites do not have their own accounts on the server, so I can't set up a separate .htaccess or robots.txt as they're presently configured. How should I set this up?
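Since all six names share one docroot, the main .htaccess can still distinguish them by hostname; separate accounts aren't needed. What consolidates the indexing is making the redirects permanent (301), which tells Google the typo domains are aliases of the main site. A sketch using the names above, assuming Apache with mod_rewrite:
Code:
RewriteEngine On
# Any hostname other than the canonical one gets a permanent redirect.
RewriteCond %{HTTP_HOST} !^(www\.)?exampledomains\.com$ [NC]
RewriteRule ^(.*)$ http://www.exampledomains.com/$1 [R=301,L]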
View 5 Replies
View Related
Aug 2, 2008
I am just wondering whether my idea will work with the Google search engine.
Basically, I have my official VPS root (home/admin/public_html/), and this is where my main website will be hosted. However, since my VPS will be used for additional websites, I will point additional domains to it.
My second site hosted on this VPS may have a document root of (home/admin/public_html/advertising/), and that domain will then be set up with that as its document root.
However, when Google crawls my second site (e.g. advertising.com), will it go 'below' the domain root? For example, will it also crawl the files under /public_html/ for this domain, even though the domain's root is /public_html/advertising/?
edit: Or do people host multiple sites differently? Is this an appropriate method?
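Crawlers only fetch URLs, so via advertising.com Google cannot step above that domain's docroot. The risk runs the other way: everything under /public_html/advertising/ is also reachable through the main domain as /advertising/..., which can look like duplicate content. The more common layout is sibling docroots; a hedged Apache vhost sketch (names and paths are placeholders):
Code:
<VirtualHost *:80>
    ServerName www.maindomain.com
    DocumentRoot /home/admin/domains/maindomain.com/public_html
</VirtualHost>
<VirtualHost *:80>
    ServerName www.advertising.com
    DocumentRoot /home/admin/domains/advertising.com/public_html
</VirtualHost>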
View 4 Replies
View Related
Sep 24, 2005
I recently found out that the Google robot spidered my site and used up 5.74 GB of bandwidth! That was 4,498 hits, and I even have it scheduled to return every 2 weeks.
Now my site is down for the rest of the month unless I increase my bandwidth. This is the second time this has happened to me in the past year. What is going on?
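5.74 GB over 4,498 hits averages more than a megabyte per request, so it's worth confirming what Googlebot actually pulled before paying for more bandwidth; large images, PDFs, or downloads are the usual culprits. A sketch against a common/combined-format access log (the path is a placeholder):
Code:
# Sum the bytes (field 10) of every Googlebot request.
grep -i googlebot /var/log/httpd/access_log | awk '{sum += $10} END {printf "%.2f GB in %d hits\n", sum/1073741824, NR}'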
View 14 Replies
View Related
Oct 2, 2006
Googlebot has been absolutely ripping through my bandwidth. This has been ongoing for many months now, and each month the damage has gotten worse. I have posted about this problem here, but with no luck:
http://www.webmastertrader.com/showthread.php?t=389
To demonstrate the problem, I have decided to show you all my server stats for one of my websites.
I have no choice but to ban Googlebot from my site and follow this action with a letter to Google!
1) Aug: http://www.webmastertrader.com/attac...0&d=1159816704
2) Sep: http://www.webmastertrader.com/attac...1&d=1159816714
3) Oct: http://www.webmastertrader.com/attac...2&d=1159816723
We're only a day and a half into this month, and Google's robots have already consumed over 1 GB of bandwidth!
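If a ban is really the goal, robots.txt is the supported mechanism (Googlebot honors it), and it avoids the error noise that IP or user-agent blocking produces. A minimal sketch; note that this will eventually drop the site from Google's index:
Code:
User-agent: Googlebot
Disallow: /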
View 14 Replies
View Related
Oct 2, 2008
I'm a college student who just got Visual Studio 2008. I want to build a few web pages with it (ASP.NET) and am curious what kind of hosting I need. I was told I needed a dedicated server, and when I checked the prices on that, it was expensive.
View 9 Replies
View Related
Apr 17, 2014
I'm using the Concrete5 CMS to create a website. This CMS creates and manages all its webpages in a MySQL database, so there is no physical folder associated with each webpage, and I can't simply create an .htaccess file and place it in the right sub-folder of the directory tree to restrict access to that sub-folder and all the folders it contains.
I have one .htaccess file, located at the root level (the top-level folder for the website).
QUESTION 1: What do I need to place in this top-level .htaccess file to (1) restrict access to only two specific IP addresses that I can specify (blocking access from all other IP addresses), and (2) specify the URL addresses that I wish to apply this rule to?
For example, let's say my website is [URL] ....
And I want to restrict access to the [URL] ....
and my .htaccess file is located at
/home/myname/public_html/conc/.htaccess
What code can do that?
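Because the pages are virtual, per-folder rules can't see them, but mod_rewrite matches the request URI before any file lookup, so a top-level .htaccess works. A sketch with placeholder IPs (203.0.113.10 and 203.0.113.20) and a placeholder path /members/ standing in for the real URL:
Code:
RewriteEngine On
# Apply only to URLs under /members/ ...
RewriteCond %{REQUEST_URI} ^/members/ [NC]
# ... and return 403 Forbidden to anyone but the two allowed IPs.
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.10$
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.20$
RewriteRule .* - [F,L]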
View 1 Replies
View Related
Jun 3, 2007
Yes, it’s quite true. DreamHost representatives are asking their clients to block GoogleBot through the .htaccess file, because their websites were “hammered by GoogleBot”.
Dreamhost representatives are also guiding their clients to make all their websites “unsearchable and uncrawlable by search engine robots”, because they cause “high memory usage and load on the server”.
PS: This is NOT a rumor. Multiple people are already complaining about Dreamhost asking clients to block search engine spiders. A post in QuickOnlineTips confirms this too.
Initial news bit via Zoso.
View 12 Replies
View Related
May 21, 2008
If your server is blocking Googlebot from finding your robots.txt file, how do you configure your firewall to unblock it?
I've searched through Google, and I've seen many people just say your firewall is blocking it, but none mention how to actually stop it from doing that. Does Google have an IP range it uses, and if so, what should you whitelist on your server?
I keep getting this message: Network unreachable: robots.txt unreachable
I'm sure it's due to a firewall issue; I just have no idea how to fix it.
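Google has never published a guaranteed IP list, but its crawlers have long come from 66.249.64.0/19; treat that range as an assumption to verify (e.g., via whois) rather than a fact. A sketch for a plain iptables setup; if you run a firewall frontend like CSF or APF, use its allow list instead so the rule survives restarts:
Code:
# Let Google's crawl range reach the web server (and robots.txt).
iptables -I INPUT -s 66.249.64.0/19 -p tcp --dport 80 -j ACCEPT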
View 5 Replies
View Related
Dec 15, 2008
I have recently upgraded to a virtual private server. What is a safe speed for Googlebot to crawl my website?
Can you please give me results in the format of:
xxx requests per second
xxx seconds between requests.
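There's no single safe number; it depends on page weight and server capacity. The Crawl-delay directive in robots.txt expresses exactly the second format, but be aware Googlebot ignores it: Google's rate is set in Webmaster Tools, while Yahoo and MSN honored the directive. A sketch:
Code:
User-agent: *
# At most one request every 2 seconds (0.5 requests per second).
Crawl-delay: 2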
View 2 Replies
View Related
May 5, 2008
If I do a reverse lookup on my server's IP, it returns a host assigned by the data center. Is this something I should be managing too, or is that always left up to the DC? I run my own DNS for the forward lookup zones.
Also, a technical question: when you do a reverse lookup, how does it know where to look to find the host? I'm not sure I fully understand how these work.
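On the mechanics: reverse lookups reuse ordinary DNS through the special in-addr.arpa tree. The octets are reversed, so the PTR record for a hypothetical 203.0.113.5 lives at 5.113.0.203.in-addr.arpa, and delegation follows the address block: the registry delegates to whoever holds the block, usually the data center, which is why their name appears unless they delegate the reverse zone (or edit the PTR) for you.
Code:
dig -x 203.0.113.5
# shorthand for: dig PTR 5.113.0.203.in-addr.arpa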
View 11 Replies
View Related
Jan 13, 2008
Somebody purchased a domain from a popular registrar. When you look up the domain, it points to the registrar's own name servers (i.e. ns1.registrar.com and ns2.registrar.com). Does that mean it is hosted by them as well?
This person says he bought his domain from the registrar but is hosted by company X. He says he is pointing the IP address of his website at company X through his registrar.
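That arrangement is perfectly normal: who runs the DNS and who hosts the site are separate questions. The NS records name the servers that answer for the zone; the A record inside that zone says where the web server lives, and it can point at company X even while the registrar's nameservers serve the zone. A dig sketch (hypothetical domain):
Code:
dig NS example.com +short     # who answers for the zone
dig A www.example.com +short  # where the website actually lives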
View 7 Replies
View Related
Mar 14, 2007
I want to know how serious the following nameserver problem can be in reality:
[url] ?
From the DNS timing, someone could conclude it's slow hosting.
Can anyone explain the above results accurately? I still don't know what a "good" result in DNS timing is: an A or an F?
The problem is that
[url] loads slowly (5-10 seconds on a T1 connection); here's a speed test. I'm aware that images and scripts contribute to slow site loading.
My question is: besides removing some images from the home page, is there any reason to change hosting (based on the DNS tests)?
View 2 Replies
View Related
Sep 26, 2007
A friend of mine asked me what "CURL named lookup issue resolved" means. He's got a CentOS 4 machine with the latest cPanel/WHM builds. I tried recompiling Apache, and even cPanel at some point, but no good.
View 1 Replies
View Related
May 28, 2007
I am having a problem. My server was able to ping google.com successfully before installing DirectAdmin.
However, after installing DirectAdmin, I am no longer able to ping google.com. I just get a "host cannot be found" error. It is the same with all other domains.
The /etc/resolv.conf is exactly the same, so I do not know the issue. I also have other servers using the same nameserver, and they are still able to ping and resolve successfully.
The main problem here is that exim can no longer send email, as it cannot look up hostnames.
The error from exim is: R=lookuphost defer (-1): host lookup did not complete.
That's the only error within the message.
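Some hedged checks, since DirectAdmin installs commonly start a local named and tighten the firewall, either of which can break resolution even with an untouched resolv.conf:
Code:
# Ask the configured resolver directly; a timeout here means the path
# to it is blocked, not that resolv.conf is wrong.
dig google.com @$(awk '/^nameserver/{print $2; exit}' /etc/resolv.conf)
# Is a freshly installed local named squatting on port 53?
netstat -lnup | grep ':53'
# Any new firewall rules touching DNS traffic?
iptables -L -n | grep -w 53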
View 6 Replies
View Related
Nov 17, 2006
My VPS Config:
CentOS with Webmin
Mail program: sendmail
It's a newly configured VPS, and I have a problem with SMTP relaying through Outlook and other SMTP programs. When I use SMTP in Outlook, it gives the following error:
The message could not be sent because one of the recipients was rejected by the server. The rejected e-mail address was 'xxxxx@yahoo.com'. Subject 'Outlook', Account: 'Testing', Server: 'xxxx.com', Protocol: SMTP, Server Response: '550 5.7.1 <xxxxx@yahoo.com>... Relaying denied. IP name possibly forged [xx.xxx.xx.xxx]', Port: 25, Secure(SSL): No, Server Error: 550, Error Number: 0x800CCC79
I want to activate outgoing (SMTP) authentication instead of allowing an IP range or domain to relay through /etc/mail/access.
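For sendmail, relay-by-authentication is switched on in sendmail.mc and rebuilt into sendmail.cf, and it requires cyrus-sasl with saslauthd running. A hedged sketch of the relevant lines (the LOGIN/PLAIN mechanism list is a common minimal choice, not the only one):
Code:
dnl Allow relaying for any client that authenticates:
TRUST_AUTH_MECH(`LOGIN PLAIN')dnl
define(`confAUTH_MECHANISMS', `LOGIN PLAIN')dnl
Then regenerate the config (m4 /etc/mail/sendmail.mc > /etc/mail/sendmail.cf), start saslauthd, and restart sendmail; Outlook's "My outgoing server requires authentication" box handles the client side.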
View 4 Replies
View Related
May 14, 2009
ffmpeg: symbol lookup error: /usr/lib/libavcodec.so.52: undefined symbol: speex_header_free
I am using a CentOS 5 VPS.
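That error usually means libavcodec was built against a newer libspeex than the one the dynamic linker now finds (speex_header_free is a libspeex symbol). Hedged checks, with the libavcodec path taken from the error message and the libspeex path assumed:
Code:
# Which speex library does libavcodec actually load?
ldd /usr/lib/libavcodec.so.52 | grep -i speex
# Does that library export the missing symbol?
nm -D /usr/lib/libspeex.so.1 | grep speex_header_free
If the symbol is missing, upgrading libspeex or rebuilding ffmpeg against the installed version should bring the two back in sync.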
View 4 Replies
View Related
May 30, 2009
I have six dedicated servers on a VPR. I use one of these as the database server, and the other five connect to it to read the database.
When I set up my scripts for the first time, I used the external IP to connect to the DB server. Everything worked perfectly for weeks:
mysql_connect('174.xxx.xxx.xx', 'user', 'pass');
Since three days ago I finally received the internal IPs of my servers and now I use the internal IP of the DB server:
mysql_connect('10.0.0.1:3306', 'user', 'pass');
When I made this last change, my database started to collapse. Queries get stuck in a bottleneck every 10 minutes. Here's an example:
40468 simon 10.0.0.7:47002 pkindigo_db Query 115 Locked update `captured` set hp_left = '79', status = 'normal', att1pp = '25', att2pp = '30', att3p
40523 simon 10.0.0.2:58080 pkindigo_db Query 115 Locked update `captured` set experience = '5120', level = '16', hp_left = '58', hp = '58', status =
... (tons of locked queries before) ...
As you can see, the table has been locked for 115 seconds over a simple update... after some hours this ends in "#1135 - Can't create a new thread (errno 13); if you are not out of available memory, you can consult the manual for a possible OS-dependent bug", and some of my tables eventually crash.
I think my database gets stuck because of a DNS lookup delay. When I had this problem before, on the external IPs, I added skip-name-resolve and the problem was fixed. But now, with the internal IPs, it seems that's not enough.
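A hedged second opinion: the "Locked" state in that process list is MyISAM's table-level locking, not DNS. A stalled reverse lookup shows up differently, as connections sitting in the "unauthenticated user" state. Every UPDATE on `captured` has to wait for all other queries touching the table, so they pile up under load. A sketch of the storage-engine fix (test on a copy first):
Code:
-- Confirm what the stuck threads are waiting on:
SHOW PROCESSLIST;
-- Row-level locking usually relieves this kind of update pile-up:
ALTER TABLE captured ENGINE=InnoDB;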
View 2 Replies
View Related
Aug 3, 2009
I'm having tons of problems trying to get reverse lookups working on my DNS server. I've scoured the net and tried lots of methods, but so far none work.
Forward DNS lookups are working perfectly, but reverse lookups are completely broken.
db.44.80.212.67:
Code:
$TTL 86400 ; Default TTL in secs(1 day)
@ IN SOA ns1.capturetheprize.com. info.capturetheprize.com. (
2009072902; serial number YYYYMMDDNN
28800 ; Refresh
7200 ; Retry
864000 ; Expire
86400 ; Min TTL
)
IN NS ns1.capturetheprize.com.
IN NS ns1.mytempmail.com.
; Reverse lookups
44 IN PTR ns2.mytempmail.com.
44 IN PTR ns1.mytempmail.com.
44 IN PTR ns2.capturetheprize.com.
44 IN PTR ns1.capturetheprize.com.
44 IN PTR mail.mytempmail.com.
44 IN PTR mail.capturetheprize.com.
44 IN PTR stats.capturetheprize.com.
44 IN PTR secure.capturetheprize.com.
named.conf
Code:
zone "80.212.67.in-addr.arpa" IN {
type master;
file "etcdb.44.80.212.67";
allow-transfer { localhost; };
};
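Two hedged observations. First, the zone defines eight PTR records for the single address 44; multi-PTR answers rotate, and most software (mail servers especially) treats that as a broken reverse lookup, so the convention is one PTR per address, matching the hostname the machine announces. Second, named.conf references "etcdb.44.80.212.67" while the file shown is named db.44.80.212.67; if that path doesn't resolve (it's relative to BIND's directory option), the zone never loads at all, and named-checkzone 80.212.67.in-addr.arpa <file> plus the BIND logs will confirm whether it parses. A minimal corrected fragment:
Code:
; One PTR per address; pick the canonical name for this host.
44    IN    PTR    mail.capturetheprize.com.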
View 3 Replies
View Related
Jul 19, 2008
I'd like to know if it's possible to change the reverse lookup, or at least I think that's what it's called.
I have 2 accounts on my CentOS WHM/cPanel VPS, and they are hosted on their own separate IPs. However, when you do a lookup on one domain, it also lists the other domain as being hosted on the server, and vice versa.
How do I change this so they appear to be on different servers? Each one has its own nameservers, which are on separate IPs as well.
View 3 Replies
View Related