DNS Lookup Timeout From Googlebot Indexing Webpages
Jul 4, 2007
One of my customers uses Google Webmaster Tools. Looking at which of his pages Googlebot has indexed, he found that 180 pages are returning a "DNS lookup timeout" error. I tried searching Google for help and the only thing I found is "We received a timeout on DNS lookup."
DNS is fine, as is the zone file; everything is responding OK. I don't know what the issue could be. Any ideas?
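For reference, these are the kinds of checks I ran against the zone, with example.com standing in for the actual domain and ns1.example.com for its authoritative nameserver (both are placeholders):

Code:
# query the authoritative nameserver directly, no recursion
dig @ns1.example.com example.com A +norecurse

# follow the delegation down from the root to spot broken glue
dig example.com NS +trace

# time a plain recursive lookup from the server itself
dig example.com A | grep "Query time"

All of these come back quickly for me, which is why the Googlebot timeouts have me stumped.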
Every time I enter my name in Google's search box and press enter, it returns "Index of /" as the first listing, and that listing is my site.
The oddity of this is that I have all the appropriate meta tags filled in. I have the following:
- the Google verification meta tag
- Content-Type
- author
- keywords
- abstract
- description
- robots set to content "all"
I have nothing in an .htaccess file that would cause this because it's completely blank to begin with.
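Since the file is empty, Apache's defaults apply. From what I've read, directives along these lines are what control whether a generated listing appears (just a sketch, assuming index.html is the intended default document, not what I currently have):

Code:
# serve index.html (or index.php) instead of a generated listing
DirectoryIndex index.html index.php

# disable automatic "Index of /" listings entirely
Options -Indexes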
My host uses cPanel, which bloats the space with all sorts of miscellaneous folders and other junk that I know is useful, but is sometimes visually and organizationally "cluttersome".
Anyway, what am I doing wrong here or what is my host doing wrong here? Why isn't my "www.whatever.com" coming up in the search results instead of this generic "Index of /" crap?
I've been testing out LiteSpeed on a few of my servers and am extremely pleased with it! The only issue I have left is directory indexes. A good chunk of my users store files and rely on Apache directory indexing to serve up the content. With LiteSpeed, the default behavior for a folder with no default document is to display nothing at all. I've looked through the documentation and found that LiteSpeed does have the functionality, but I'm having some trouble getting it to actually work.
Can someone tell me exactly where the index files for LiteSpeed are placed, what to configure, etc?
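To be clear about what I'm trying to replicate: under Apache my users simply relied on the auto-generated listing, roughly equivalent to this in a vhost or .htaccess (a sketch of the Apache-style directives, not my LiteSpeed config):

Code:
# Apache-style automatic listing for folders without a default document
Options +Indexes
IndexOptions FancyIndexing

My understanding is that LiteSpeed can honor Apache-style settings like these and also has its own per-vhost Auto Index option, but I may be missing where its index template files are supposed to live.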
I have this problem with my hosting provider. The website loads slowly (sometimes up to 10 sec) from my location. I've made several checks:
1. Script execution time is within 0.1 sec.
2. Tracert shows times within 0.3 sec.
3. Also, host-tracker.com shows average loading time ~2.0 sec (they check from 20-30 different locations around the world).
So, the questions:
1. What can cause such long slow-downs?
2. What average loading times can be expected from shared hosting? Is 2.0 sec (average) good or bad? For example, google.com has average response time of about 0.2-0.3 sec, but they have many servers around the world.
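Here's the kind of timing breakdown I've been collecting from my location, with example.com standing in for the real site (values are DNS, connect, first byte, and total time in seconds):

Code:
curl -o /dev/null -s -w "dns=%{time_namelookup} connect=%{time_connect} ttfb=%{time_starttransfer} total=%{time_total}\n" http://example.com/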
This is just an internal server set up to learn web programming, and I'm looking to create an environment as close as possible to a real-world setup. I'm using xxxx.dlinkddns.com
I have five domains redirecting to my main site. I originally set up these redirect domains because many people in my target demographic are not very computer literate, and I worried they would misspell or mistype the site name. I want to do what Google does with gogle.com and their other domains.
exampledomains.com <-- Main Site
exampledomains.net
exampledomains.org
exampledomain.com
exampledomain.net
exampledomain.org
Now I'm seeing that Google is indexing about 3000 results for each redirect domain, and I'm worried it's going to hurt my SEO.
My host set up the domains with the same IP as my main site, then put non-masking redirects in the .htaccess. The sites do not have their own accounts on the server, so I can't set up a separate .htaccess or robots.txt as they're presently configured. How should I set this up?
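As I understand it, the host's non-masking redirect amounts to something like the sketch below, and my thinking is that a permanent 301 back to the main domain would keep the duplicates out of the index. Since all the domains share one docroot, a single .htaccess can key off the hostname (domain names are placeholders):

Code:
RewriteEngine On
# any request that did not arrive on the main domain gets sent there permanently
RewriteCond %{HTTP_HOST} !^(www\.)?exampledomains\.com$ [NC]
RewriteRule ^(.*)$ http://exampledomains.com/$1 [R=301,L]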
I am just wondering whether my idea will work with the Google search engine.
Basically, I have my official VPS root (home/admin/public_html/), and this is where my main website will be hosted. However, since my VPS will be used for additional websites, I will point additional domains to it.
My second site hosted on this VPS might have a document root of (home/admin/public_html/advertising/), and that domain would then be set up to use that as its document root.
However, when Google crawls my second site (e.g. advertising.com), will it go 'below' the domain root? For example, will it also crawl the files under /public_html/ for this domain, even though the domain's root is /public_html/advertising/?
edit: Or do people host multiple sites differently? Is this an appropriate method?
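For context, this is roughly how I picture the two sites being kept apart (Apache-style vhosts using the paths above; www.mainsite.com is a placeholder for my real main domain, and this is a sketch rather than my live config). My understanding is that a crawler only follows URLs, so advertising.com would only ever expose what is reachable under its own DocumentRoot:

Code:
<VirtualHost *:80>
    ServerName www.mainsite.com
    DocumentRoot /home/admin/public_html
</VirtualHost>

<VirtualHost *:80>
    ServerName www.advertising.com
    DocumentRoot /home/admin/public_html/advertising
</VirtualHost>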
I recently found out that the Google robot spidered my site and used up 5.74 GB in bandwidth! The hits were 4498 and I even have it scheduled to return every 2 weeks.
Now my site is down for the rest of the month unless I up my bandwidth. This is the second time this has happened to me in the past year. What is going on?
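In case it helps anyone, what I've been pointed to so far is the crawl-rate setting in Webmaster Tools plus a robots.txt that keeps bots away from the heavy areas, something like this sketch (the paths are placeholders for my bandwidth-heavy directories):

Code:
User-agent: *
Disallow: /downloads/
Disallow: /gallery/

# honored by some crawlers, though Googlebot reportedly ignores it
Crawl-delay: 10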
Googlebot has been absolutely ripping through my bandwidth. This has been ongoing for many months now, and each month the damage has gotten worse and worse. I have posted about this problem here but with no luck:
I'm a college student who just got Visual Studio 2008. I want to build a few webpages with it (ASP.NET) and was curious about what kind of hosting I need. I was told I needed a dedicated server, and when I checked the prices on that, it was expensive.
I'm using the Concrete5 CMS to create a website. This CMS creates/manages all its webpages in a MySQL database. Thus, there is no physical folder associated with each webpage, so I can't simply create an .htaccess file and place it in the right sub-folder of the directory tree to restrict access to that sub-folder and all folders it contains.
I have one .htaccess file located at the root level (i.e. the top-level folder of the website).
QUESTION 1: What do I need to place in this top-level .htaccess file to (1) restrict access to only two specific IP addresses that I can specify (blocking all other IP addresses), and (2) specify the URLs that I wish to apply this rule to?
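What I had in mind is something along these lines in the top-level .htaccess (a sketch using mod_rewrite; /members/ and the two addresses are placeholders for my real URL path and IPs):

Code:
RewriteEngine On
# only apply the restriction to these URL paths
RewriteCond %{REQUEST_URI} ^/members(/|$) [NC]
# allow exactly two addresses, forbid everyone else
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.10$
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.11$
RewriteRule .* - [F,L]

Is that the right approach for a CMS with no physical folders, or is there a cleaner way?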
Yes, it's quite true. DreamHost representatives are asking their clients to block Googlebot through the .htaccess file, because their websites were "hammered by GoogleBot".
Dreamhost representatives are also guiding their clients to make all their websites “unsearchable and uncrawlable by search engine robots”, because they cause “high memory usage and load on the server”.
PS: This is NOT a rumor. Multiple people are already complaining about Dreamhost asking clients to block search engine spiders. A post in QuickOnlineTips confirms this too.
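For anyone wondering what such a block actually looks like, the .htaccess rules being suggested are reportedly along these lines (a sketch of a user-agent block, not DreamHost's exact wording):

Code:
# match Googlebot's user agent and deny it
SetEnvIfNoCase User-Agent "Googlebot" block_bot
Order Allow,Deny
Allow from all
Deny from env=block_bot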
If your server is blocking Googlebot from finding your robots.txt file, how do you configure your firewall to unblock it?
I've searched through Google and seen many people just say your firewall is blocking it, but none mention how to actually stop it from doing that. Does Google have an IP it uses, and if so, what is the IP you should whitelist on your server?
I keep getting the message "Network unreachable: robots.txt unreachable" and I'm sure it's due to a firewall issue; I just have no idea how to fix it.
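From what I've read, Google doesn't publish a fixed list of crawler IPs; the suggested check is a reverse-then-forward lookup on the addresses hitting you, so you can whitelist the ranges the bot actually comes from. Something like this, assuming 66.249.66.1 is an address showing up in your logs:

Code:
# reverse lookup should resolve to a googlebot.com or google.com host
host 66.249.66.1
# forward lookup of that name should return the same address
host crawl-66-249-66-1.googlebot.com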
If I do a reverse lookup on my server's IP, it returns a host assigned by the data center. Is this something I should be managing too, or is that always left up to the DC? I run my own DNS for the forward lookup zones.
Also just a technical question, when you do a reverse lookup, how does it know where to look to get the host? I'm not sure I fully understand how these work.
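As far as I can tell, a reverse lookup just reverses the octets and queries a PTR record under in-addr.arpa, and the delegation of that zone decides who answers; e.g. for a placeholder address 192.0.2.10, these two are equivalent:

Code:
dig -x 192.0.2.10
dig 10.2.0.192.in-addr.arpa PTR

If I understand right, the in-addr.arpa delegation follows the address block, which would be why the PTR zone normally lives with whoever owns the IP range (the data center) unless they delegate it to you.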
Somebody purchased a domain from a popular registrar. When you look up the domain, it points to the registrar's own nameservers (i.e. ns1.registrar.com and ns2.registrar.com), which means that it is hosted by them as well, correct?
This person is saying they bought their domain from the registrar, but they are hosted by company X. He says that he is pointing the IP address of his website at company X to his registrar.
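If I follow him correctly, the registrar's nameservers stay authoritative for the domain and simply hold A records pointing at company X's server, roughly like this sketch (the domain name and address are placeholders):

Code:
; zone for his-domain.com, served by ns1/ns2.registrar.com
his-domain.com.      IN  A   203.0.113.50
www.his-domain.com.  IN  A   203.0.113.50

So DNS stays with the registrar while the site itself is hosted at company X. Does that sound right?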
I want to know how serious the following nameserver problem can be in reality: [url]
From the DNS timing, someone could conclude that it's slow hosting.
Can anyone explain the above results accurately? I still don't know what a "good" result in DNS timing is: A or F?
The problem is that [url] loads slowly (5-10 seconds on a T1 connection); here's a speed test. I'm aware that images and scripts contribute to slow site loading.
My question is: besides removing some images from the home page, is there any reason to change hosting (regarding the DNS tests)?
A friend of mine asked me what "CURL named lookup issue resolved" means. He's got a CentOS 4 machine with the latest cPanel/WHM builds. I tried recompiling Apache and even cPanel at some point, but no good.
I am having a problem. My server was able to ping google.com successfully before installing DirectAdmin.
However, after installing DirectAdmin, I am no longer able to ping google.com. I just get a "host cannot be found" error. This is the same with all other domains.
The /etc/resolv.conf is exactly the same, so I do not know the issue. I also have other servers using the same nameservers, and they are still able to ping and resolve successfully.
The main problem here is that exim can no longer send email, as it cannot look up hostnames.
The error from exim is: R=lookuphost defer (-1): host lookup did not complete.
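The kind of check I've been doing to narrow it down, assuming 203.0.113.2 is the nameserver listed in /etc/resolv.conf (placeholder for my real resolver):

Code:
# ask the configured resolver directly
dig @203.0.113.2 google.com A +short

# compare with what the system resolver does, which is what exim relies on
dig google.com A +short
host google.com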
My VPS config: CentOS with Webmin; mail program: Sendmail.
It's a newly configured VPS, and I have a problem with SMTP relaying through Outlook and other SMTP programs. When I use SMTP in Outlook, it gives the following error:
The message could not be sent because one of the recipients was rejected by the server. The rejected e-mail address was 'xxxxx@yahoo.com'. Subject 'Outlook', Account: 'Testing', Server: 'xxxx.com', Protocol: SMTP, Server Response: '550 5.7.1 <xxxxx@yahoo.com>... Relaying denied. IP name possibly forged [xx.xxx.xx.xxx]', Port: 25, Secure(SSL): No, Server Error: 550, Error Number: 0x800CCC79
I want to enable outgoing (SMTP) authentication instead of allowing an IP range or domain to relay through /etc/mail/access.
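What I'm aiming for, as far as I can tell, is SMTP AUTH in sendmail.mc (with saslauthd running) rather than per-IP or per-domain entries in /etc/mail/access; a sketch of the relevant lines, after which sendmail.cf would need rebuilding with m4 and sendmail restarting:

Code:
dnl enable authenticated relaying instead of listing IPs/domains in access
define(`confAUTH_MECHANISMS', `LOGIN PLAIN')dnl
TRUST_AUTH_MECH(`LOGIN PLAIN')dnl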
I have six dedicated servers on a VPR. I use one of them as the database server, and the other five connect to it to read the database.
When I set up my scripts for the first time, I used the external IP to connect to the DB server. Everything worked perfectly for weeks:
mysql_connect('174.xxx.xxx.xx', 'user', 'pass');
Three days ago I finally received the internal IPs of my servers, and now I use the internal IP of the DB server:
mysql_connect('10.0.0.1:3306', 'user', 'pass');
When I made this last change, my database started to collapse. Queries get stuck in a bottleneck every 10 minutes. Here's an example:
40468 simon 10.0.0.7:47002 pkindigo_db Query 115 Locked update `captured` set hp_left = '79', status = 'normal', att1pp = '25', att2pp = '30', att3p
40523 simon 10.0.0.2:58080 pkindigo_db Query 115 Locked update `captured` set experience = '5120', level = '16', hp_left = '58', hp = '58', status = ...
(tons of locked queries before) ...
As you can see, the table has been locked for 115 seconds for a simple update. After some hours this ends in "#1135 - Can't create a new thread (errno 13); if you are not out of available memory, you can consult the manual for a possible OS-dependent bug", and some of my tables end up crashing.
I think my database gets stuck because of a DNS lookup delay. When I had this problem before, while using the external IPs, I added skip-name-resolve and the problem was fixed. But now, with the internal IPs, it seems that's not enough.
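For reference, this is what I have in my.cnf under [mysqld] from the first time around; my understanding is that with this in place the MySQL grants have to be defined by IP rather than hostname, which is what I want to double-check for the 10.0.0.x addresses:

Code:
[mysqld]
# never do DNS lookups on connecting clients; grants must use IP addresses
skip-name-resolve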
I'm having tons of problems trying to get reverse lookup working on my DNS server. I've scoured the net and tried lots of methods, but so far none work.
Forward DNS lookups are working perfectly, but reverse lookup is completely broken.
db.44.80.212.67:
Code:
$TTL 86400   ; Default TTL in secs (1 day)
@   IN  SOA ns1.capturetheprize.com. info.capturetheprize.com. (
        2009072902  ; serial number YYMMDDNN
        28800       ; Refresh
        7200        ; Retry
        864000      ; Expire
        86400 )     ; Min TTL

    IN  NS  ns1.capturetheprize.com.
    IN  NS  ns1.mytempmail.com.

; Reverse lookups
44  IN  PTR ns2.mytempmail.com.
44  IN  PTR ns1.mytempmail.com.
44  IN  PTR ns2.capturetheprize.com.
44  IN  PTR ns1.capturetheprize.com.
44  IN  PTR mail.mytempmail.com.
44  IN  PTR mail.capturetheprize.com.
44  IN  PTR stats.capturetheprize.com.
44  IN  PTR secure.capturetheprize.com.

named.conf:

Code:
zone "80.212.67.in-addr.arpa" IN {
    type master;
    file "etcdb.44.80.212.67";
    allow-transfer { localhost; };
};
I'd like to know if it's possible to change the reverse lookup, or at least I think that's what it is called.
I have 2 accounts on my CentOS WHM/cPanel VPS, and they are hosted on their own separate IPs. However, when you do a lookup on one domain, it also lists the other domain as being hosted on the server, and vice versa.
How do I change this so they appear to be on different servers? Each one has its own nameservers, which are also on separate IPs.