Apache :: URLs Ending In / Show 404 When Exploring With Google Bot In
Nov 7, 2012
I have a WP online shop using the WP E-commerce plugin 3.8.9 together with the Yoast SEO plugin. My problem is that when exploring the product URLs ending with / in Google Webmaster Tools, it displays 404, but the same URL without the / is found and OK. I must say that both URLs show up correctly in browsers, and the non-/ version is redirected to the one ending in /. Here is my .htaccess:
LCMlinux ~> uname -a
Linux LCMlinux 3.2.29-smp #2 SMP Mon Sep 17 13:16:43 CDT 2012 i686
LCMlinux ~> httpd -v
Server version: Apache/2.4.3 (Unix)
Server built: Aug 23 2012 11:07:26
LCMlinux ~>
We are using this both for the Trac issue-tracking application and for a small, simple internal mirror web site. Trac is working perfectly; the web site works if exact URLs are provided (as in <a href=...>
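The question is cut off above, but "works if exact URLs are provided" usually points at directory indexes not being served. A minimal sketch of the usual mod_dir fix, in Apache 2.4 syntax (the path is an assumption):

<Directory "/srv/www/mirror">
    DirectoryIndex index.html
    Require all granted
</Directory>

With this in place, a request for a bare directory URL such as /docs/ serves /docs/index.html instead of failing.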
1. User hits my new 2.4 reverse proxy at [URL] ....
2. I proxy the request through to my "real" app server at [URL] ....
3. I also use a re-write rule to add a querystring to the URL: ?Parameter=Foo
4. So, the client's request arrives at my app server as [URL] .....
5. When my app server responds, it is including the Parameter=Foo key/value combination. I don't want this.
6. I want my reverse proxy (somebox.com) to strip "Parameter=Foo" from the string which gets returned to the client.
I have steps 1 & 2 working nicely, but it looks like I can't handle the last bit with mod_rewrite. I found mod_filter and mod_substitute, but it appears that this stuff is used for re-writing strings IN the document. Can these libs be used to maybe modify (I'm guessing here) the headers so that the "?Parameter=Foo" string can't be seen on the client if they're running something like Fiddler?
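mod_substitute does operate on the response body, which is where the backend is echoing the parameter back, so it is the right tool for that part of step 6. A minimal sketch, assuming mod_filter and mod_substitute are loaded and the responses are HTML (the Location block is an assumption):

<Location "/">
    AddOutputFilterByType SUBSTITUTE text/html
    Substitute "s|?Parameter=Foo||n"
</Location>

The n flag makes the pattern a fixed string rather than a regex. If the backend also puts the parameter into Location headers on redirects, that would need mod_headers ("Header edit Location ..."), since body filters never touch headers.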
I've taken over a site that caters for client access. They all access their own folder, and in the folder the files have an include with a relative path as below.
/core - contains all the actual files
/client/file.php - <? include "../core/file.php"; ?>

but with the growing number of clients I want to go a level deeper and separate them better...

/uk/client/file.php - <? include "../../core/file.php"; ?>
This is fine, but when the files are included, they too have their own relative includes, and this is where it breaks. There are so many files that I can't easily go through them all to change the include paths, so I would like to maybe do a rewrite to fake the path. I've tried this...
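For reference, one way to "fake the path" with mod_rewrite, assuming the files physically stay one level up and only the URLs gain the country prefix (the uk prefix is taken from the question; everything else is an assumption, placed in an .htaccess at the web root):

RewriteEngine On
# Internally map /uk/client/... back onto the existing /client/... layout,
# so the old relative includes keep resolving
RewriteRule ^uk/(client/.*)$ /$1 [L]

Note that mod_rewrite only changes how URLs map to files; if the scripts are physically moved a level deeper, PHP's relative includes still break, and something like a per-country symlink to /core would be needed instead.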
I run a small vServer with Ubuntu (TurnKey) and use logwatch to be informed by email about any errors. I'm confused about the following errors from Apache:
--------------------- httpd Begin ------------------------

Requests with error response codes
   404 Not Found
      http://translate.google.com/gen204: 1 Time(s)
      http://www.teddybrinkofski.com/ip_json.php: 1 Time(s)
   503 Service Unavailable
      http://www.google.com/: 1 Time(s)

---------------------- httpd End -------------------------
These errors are definitely not from my own code. I have checked that mod_proxy is disabled, and I also disabled CONNECT as described here: [URL] ....
What do these errors mean, and how can I disable this?
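Entries like these usually mean outside clients are probing the server as an open proxy: they send requests for full URLs (GET http://www.google.com/ ...), and Apache logs the failure under that URL. With mod_proxy off, nothing is actually being relayed. A sketch for rejecting such requests outright, assuming mod_rewrite is available:

RewriteEngine On
# Refuse request lines that carry an absolute URL, i.e. proxy-style requests
RewriteCond %{THE_REQUEST} ^[A-Z]+\s+https?:// [NC]
RewriteRule .* - [F]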
I have a test server with VMware CentOS + Plesk 12. After a reboot, Plesk's "localhost.localdomain" shows me the "Apache 2 default page". If I then run service httpd restart, everything is OK again (it shows the Plesk default page, not the Apache one). I want the server to come up after a normal reboot showing the "Plesk default page", not the Apache one.
I have domains parked on a subdomain. Since I updated the Apache configuration with cPanel, the parked domains just show the default Apache page. Each domain still shows up under parked domains. I can delete a domain and re-add it, and then it works; however, that isn't a good option because there are 100+ domains.
My .htaccess file redirects (rewrites) all .html to .php... I need to add Google's file in order to verify my account (Webmaster Tools) but it can't find the file because of my .htaccess.
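A common way out is to exempt the verification file before the catch-all rule runs. A sketch, assuming the rules live in .htaccess and the existing rule looks roughly like the second one below (the verification filename is hypothetical):

RewriteEngine On
# Let Google's verification file through untouched
RewriteRule ^google[0-9a-f]+\.html$ - [L]
# Existing catch-all: serve .html requests from the matching .php script
RewriteRule ^(.*)\.html$ $1.php [L]

Alternatively, Webmaster Tools also offers meta-tag and DNS verification, which sidestep the file entirely.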
I have five domains redirecting to my main site. I originally set up these redirect domains because many people in my target demographic are not very computer literate, and I worried they would misspell or mistype the site name. I want to do what Google does with gogle.com and their other domains.
exampledomains.com <-- Main Site
exampledomains.net
exampledomains.org
exampledomain.com
exampledomain.net
exampledomain.org
Now I'm seeing that Google is indexing about 3000 results for each redirect domain, and I'm worried it's going to hurt my SEO.
My host set up the domains with the same IP as my main site, then put non-masking redirects in the .htaccess. The sites do not have their own accounts on the server, so I can't set up a separate .htaccess or robots.txt as they're presently configured. How should I set this up?
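What Google does with gogle.com is a plain 301 to the canonical host, and a single shared .htaccess can do the same for all of the parked names. A sketch, assuming all six domains point at the same document root:

RewriteEngine On
# Anything that is not the canonical host gets a permanent redirect to it
RewriteCond %{HTTP_HOST} !^(www\.)?exampledomains\.com$ [NC]
RewriteRule ^(.*)$ http://exampledomains.com/$1 [R=301,L]

Because the redirect is a 301, search engines should consolidate the indexed pages under the main domain over time instead of indexing each alias separately.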
I hope some of you are using Google Apps and can help me to find an answer to the following question:
I own two different and independent domain names (e.g. domain1.com and domain2.com). I'd like to use the Google Apps (Standard, free edition) with them to create two different and totally independent mailboxes (e.g. abc@domain1.com and xyz@domain2.com).
But how many Google accounts do I need to do this? Can I manage two (or more) independent and fully functional domains using one Google account?
P.S. The Help section describes aliases for multiple domains, but those are just pointers or shortcuts, not fully functional mailboxes, so that solution isn't what I'm looking for.
How can I provide temporary URLs for users on my server, like [url], until the actual domain resolves? I've seen this done with cPanel, but I don't know exactly how it's done. My current server does not have cPanel.
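cPanel's temporary URLs (http://server-ip/~username/) are just Apache's mod_userdir. A minimal sketch in Apache 2.4 syntax, assuming mod_userdir is loaded and each account's site lives in that user's public_html (an assumption):

# Map /~username/ to /home/username/public_html
UserDir public_html
<Directory "/home/*/public_html">
    Require all granted
</Directory>

On Apache 2.2 the Directory block would use "Order allow,deny" plus "Allow from all" instead of Require.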
Are there any scripts out there that can protect URLs? For example, I am trying to protect a megaupload.com URL behind a masking URL and make sure the masking URL can only be accessed via a referral site. Can this be done?
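A referrer check can be done in .htaccess, though HTTP_REFERER is client-supplied and easily spoofed, so treat it as a deterrent rather than real protection. A sketch (all names and the target URL are placeholders):

RewriteEngine On
# Forbid the masked link unless the visitor arrived from the referral site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?referralsite\.example/ [NC]
RewriteRule ^go/file$ - [F,L]
# Otherwise redirect the mask to the real target
RewriteRule ^go/file$ http://megaupload.com/FILEID [R=302,L]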
Let's say you want to protect against hacking by simply blocking which URLs can be loaded. Suppose someone hacked your index.html and changed the links to lead to his domain.com. Is it possible to restrict what gets loaded on the site (to prevent possible future hacking intrusions)?
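One layer that exists for exactly this is a Content-Security-Policy header: it tells browsers to load sub-resources only from origins you list, so injected script, frame, and image URLs pointing at an attacker's domain are refused. It cannot stop a plain <a href> link from being followed, though. A sketch, assuming mod_headers is enabled:

# Only allow scripts, frames, images etc. from our own origin
Header set Content-Security-Policy "default-src 'self'"

This limits the damage after a defacement; it does not prevent index.html from being modified in the first place, so file permissions and integrity monitoring still matter.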
I'm testing scripts on a new server now, and the server has 2 problems.
1. I cannot pass a domain name as a "get" parameter. For example, if I request a URL like domain.com/file.php?url=[url], it does not work. If I request a URL like domain.com/file.php?url=[url] (please note it has an INVALID extension for a TLD), it works!
2. fsockopen and file_get_contents do not work. I added these settings to php.ini:

allow_url_fopen = On
allow_url_include = On

...and nothing works. I just get blank pages when using these functions.
The server is running cPanel + Apache 2.2 + PHP 5 + APF firewall.
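On a cPanel box, symptom 1 is typically mod_security rejecting query strings that contain full URLs, and symptom 2 is typically the APF firewall blocking outbound connections rather than anything in php.ini. Two things to try, both assumptions about the cause. For ModSecurity 1.x, filtering can be switched off for one directory while testing:

<IfModule mod_security.c>
    SecFilterEngine Off
</IfModule>

And for the blank pages, turning error display on in php.ini should surface the real error instead of an empty page:

display_errors = On
error_reporting = E_ALL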
I just found hundreds of rubbish URLs in awstats for a particular domain. Is this referrer spam or something more serious, and can I do something about it?
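If the rubbish URLs appear as referrers, it is almost certainly referrer spam: bots request pages with fake Referer headers so their URLs show up in public stats. It pollutes awstats but does not indicate a compromise. Known offenders can be refused in .htaccess (the domain is a placeholder):

RewriteEngine On
RewriteCond %{HTTP_REFERER} spamdomain\.example [NC]
RewriteRule .* - [F]

Also make sure the awstats output itself is not publicly reachable, since public stats pages are what the spammers are trying to get links on.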
One of the sites I have is placed on a non-Apache server (the others are on Apache). Phpinfo() gives this: Server API: CGI
I'd like to make search-engine-friendly URLs for all my sites. All of them will do fine with mod_rewrite, but that doesn't seem to be possible on this server. Does anybody here know how I can do this on this particular server?
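Without mod_rewrite, the usual fallback is PATH_INFO-style URLs such as /index.php/products/42, which need no server configuration at all. A minimal PHP sketch (the route names are hypothetical, and whether PATH_INFO gets populated depends on the CGI setup):

<?php
// Parse everything after the script name: /index.php/products/42
$path  = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '/';
$parts = array_values(array_filter(explode('/', $path)));

if (isset($parts[0]) && $parts[0] === 'products' && isset($parts[1])) {
    $id = (int) $parts[1];
    echo "Showing product " . $id;
} else {
    echo "Home page";
}
?>

Search engines index such URLs fine; the /index.php/ segment is cosmetic rather than a crawling problem.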