I edited the /etc/apf/deny_hosts.rules file, removed all lines from it, and finally restarted apf so it would come back up with no deny hosts listed. But that is not working... the file either appears empty or again contains the rules I removed before.
iptables -L -n still shows the same banned hosts being dropped.
I have already tried removing the denied IPs from the file, then running "iptables -F", then "service iptables save", and finally restarting apf, and the deny rules are still there.
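For reference, apf keeps its live iptables state separate from the file, so the usual way to clear a ban is apf's own remove flag rather than editing the file by hand; a minimal sketch (the IP is a placeholder):
Code:
# Remove one host from the deny list; apf rewrites deny_hosts.rules itself
apf -u 192.0.2.1
# Reload the whole rule set after editing
apf -r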
I have a vhost set up to test a new website. I want to allow access from localhost and from one IP on the Internet (redacted). Apache serves the site just fine on the server itself, but I can't access the site from the "xxx...." IP.
I'm using a physical path to test from the public IP as follows:
Code:
<VirtualHost *:80>
    ServerName test
    ServerAlias test
    DocumentRoot /home/user/public_html/test
    <IfModule mod_fcgid.c>
[Code] .....
I don't have an FQDN yet, so I just made an entry in /etc/hosts as follows:
Code: 127.0.0.1 test
Here is an excerpt from the Apache error log:
Quote: [Mon Jun 17 12:02:16 2013] [error] [client xxx.xxx.xxx.xxx] client denied by server configuration: /home/user/public_html/test/index.html
I've checked the firewall and /etc/hosts.allow; that's not it. I've read the Apache docs: in the vhost block, Allow should be evaluated last, and it apparently matches localhost but does not match my IP.
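For what it's worth, the usual Apache 2.2 pattern for this is a Directory block like the one below; a sketch only, with the external IP left as the placeholder from the question:
Code:
<Directory /home/user/public_html/test>
    Order deny,allow
    Deny from all
    # localhost plus the single external IP (placeholder)
    Allow from 127.0.0.1
    Allow from xxx.xxx.xxx.xxx
</Directory>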
I have recently had a number of websites that link directly to images on my website. This is not classic hotlinking; it is a direct server request. As an example: the linking website has an image gallery script with thumbnails, and when a visitor clicks a thumbnail it loads the image from my website.
I block their IPs in .htaccess, but that is not the best way to stop them, since the IPs change. Is there any way, similar to anti-hotlinking, to deny such direct access to my images by domain name, i.e. to allow access only from my website and deny it from all others? Or something else that could work in my case with .htaccess?
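A Referer check in .htaccess is the standard anti-hotlinking approach and should cover this case too, since the browser sends the linking site's address as the Referer; a sketch, with example.com standing in for the real domain:
Code:
RewriteEngine On
# Permit empty referers (direct visits) and requests from our own site
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Forbid image requests arriving from anywhere else
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]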
When I check the Apache status page, I see one domain sending many requests to the server, for example:
domain.com 10.20.30.40
domain.com 10.20.30.40
domain.com 10.20.30.40
(and so on)
How can I prevent this problem? It plagues me and my server, because it makes Apache work non-stop. RAM usage is 65%!
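One common remedy is per-client rate limiting with mod_evasive, if that module is an option on this server; a sketch with illustrative thresholds:
Code:
<IfModule mod_evasive20.c>
    DOSHashTableSize  3097
    # Block a client requesting the same page more than 5 times per second
    DOSPageCount      5
    DOSPageInterval   1
    # ... or more than 100 requests site-wide per second
    DOSSiteCount      100
    DOSSiteInterval   1
    DOSBlockingPeriod 60
</IfModule>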
I am working with XAMPP 5.6.8 (Apache 2.4.4, MySQL 5.5.32 and PHP 5.6.8) on a 64-bit Windows 7 Ultimate (Service Pack 1) operating system.
I am working with an Arduino UNO and a WIFI Shield connected to the Apache server.
I am sure Arduino is connected to the WiFi network and to the server, and it also sends the GET request to the server.
Apparently, everything is OK, because I can see the 200 OK message from the server in the Arduino serial monitor, but I find no trace of that request in the server log, although all the requests made from the browser (by typing the server address into the browser address bar and pressing Enter) appear in the server log.
Every request is getting processed 3 times. In other words, if I point my browser to the URL of an image hosted on this server, it generates 3 lines in the access log each time I refresh the page.
If I point it to a script which logs something to a file, it logs it 3 times, showing it's run all 3 times.
I haven't touched the httpd.conf or any other configuration. Any idea what could cause this?
Is there any tool out there (I prefer command line) built especially for analyzing Apache error log files? I need something that can summarize the information in the log and give it back to me.
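Short of a dedicated analyzer, a shell pipeline can produce a quick summary; a sketch assuming a standard 2.2-style error_log format (the log path is a placeholder):
Code:
# Strip the [date] [level] [client] prefixes, then count distinct messages
cut -d']' -f4- /var/log/httpd/error_log | sort | uniq -c | sort -rn | head -20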
On my web site I have several index pages in different languages, in the following format
[URL] ....
Two days ago I noticed greatly increased Googlebot activity on my site, and when I checked my log file I found that all the pages crawled were wrong web addresses: appended to the above index pages were existing files from my site, like
/folder1/folder2/file.html
So, the strings looked like
[URL] ....
And surprisingly, they all returned code "200".
My question is: is there any way to rewrite such requests to the first ".html" found in the string?
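mod_rewrite can do this; a sketch, assuming the index pages themselves contain no dots before the ".html" (the pattern is illustrative, not taken from the post):
Code:
RewriteEngine On
# If anything follows the first ".html", redirect to just that page
RewriteRule ^([^.]+\.html)/.* /$1 [R=301,L]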
I have a question about Apache on CentOS. I installed Apache and I want to know which MPM is used by default. Two MPMs are defined in Apache, but which MPM does Apache actually use to serve requests?
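The binary reports this directly; on CentOS it is usually:
Code:
# Prints a line such as "Server MPM: prefork"
httpd -V | grep -i mpm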
I've just joined the group and am new to Apache/PHP. I have just assembled a website in Joomla/VirtueMart called petslovezone.com.au. I want to redirect all requests such as:
1. http://xyz.com to https://xyz.com
2. http://www.xyz.com to https://xyz.com
3. xyz.com to https://xyz.com
4. www.xyz.com to https://xyz.com
Now, I know I have to change the .htaccess "RewriteEngine On" section. What would be the best code to do all of the above?
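One compact rule set that covers all four cases might look like this (a sketch; xyz.com is the placeholder from the question):
Code:
RewriteEngine On
# Redirect anything that is not already https://xyz.com
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://xyz.com%{REQUEST_URI} [L,R=301]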
As we are planning to implement mobile support for our platform, we want to distinguish between requests coming from mobile and web clients in Apache. We will be using Apache as a reverse proxy, and we want it to differentiate the request source and forward it to the required destination. Is this possible?
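It is; a common technique is a User-Agent test in front of the proxy rules. A sketch, assuming mod_rewrite and mod_proxy are loaded, with hypothetical backend hostnames:
Code:
RewriteEngine On
# Crude mobile detection on the User-Agent header
RewriteCond %{HTTP_USER_AGENT} (android|iphone|ipad|mobile) [NC]
RewriteRule ^/(.*)$ http://mobile-backend.internal:8080/$1 [P,L]
# Everything else goes to the web backend
RewriteRule ^/(.*)$ http://web-backend.internal:8080/$1 [P,L]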
I've been having trouble the past few days with someone who's been "attacking" my site, so to speak, by continuously downloading very large files with as many connections as (he) can open. I operate a large downloads site for computer games, and this person has selected the largest files (like 400-500MB). I'm not sure of the real intent other than to clog up my bandwidth capacity. He also appears to be using proxies, since as soon as I ban one, another shows up, seemingly from China.
Anyway, I have mod_bw and I've limited the number of connections in the downloads area to 2. While that works ok, his tool uses threads like a download manager would and he's using up 30-40 child threads for his 2 file downloads.
So, two questions:
Is there any way to not only limit file downloads to 2, but also limit the number of connections per request? Many of my visitors do use download managers, and I'd like for them to continue using them, but with a reasonable number of threads like 6 or 8, not 30.
Also, is there a way to restrict access to someone using a proxy?
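On the connection-count side, the kernel firewall can cap concurrent connections per source IP regardless of what the client's download manager does; a sketch with an illustrative limit of 8:
Code:
# Reject new port-80 connections from any single IP that already has 8 open
iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 8 -j REJECT --reject-with tcp-reset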
I have an Xitami server and am migrating to Apache httpd. I have the regular server working fine. I tried configuring SSL, but no requests are coming through. I know 443 is open on the router because it works fine under Xitami. I checked the logs and it is starting fine. I am attaching my httpd.conf and the startup log. If I try to access the website using https, it just times out and nothing goes into the log file. I replaced my domain with domain.com. I have tried many different examples, but cannot get it to work and am not sure what to do.
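For comparison, the minimum that must be present for https to answer at all is roughly the following (domain and certificate paths are placeholders); a missing Listen 443 in particular produces exactly this silent timeout:
Code:
Listen 443
<VirtualHost *:443>
    ServerName domain.com
    SSLEngine on
    SSLCertificateFile    conf/server.crt
    SSLCertificateKeyFile conf/server.key
</VirtualHost>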
So I've set everything up manually a few times before now, but I got so bored of configuring everything for a manual install I just said screw it and used XAMPP this time - so my circumstances are not completely ideal.
Basically, what I am looking to find out is how to improve loading speeds for Apache, PHP and MySQL on my dedi server.
The server I have is of the following spec: Intel Xeon CPU E5-1650 v2 (3.50GHz, 12 cores total), 64 GB DDR3 ECC, 2 x 2TB SATA3 (RAID 0/1).
I use Windows Web Server 2008 R2, so only 32GB of the RAM is usable.
With all the above aside, here is the important part: while people are browsing the websites I have configured, they are randomly hit with a blank white page saying "Your request has timed out. Please retry the request." I have about 100 unique hits daily, and a lot of people report the same problem; I have even had it myself.
It feels as if the server has much more power than Apache and co. are trying to utilize - what can I do?
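On Windows, Apache runs a single child process under mpm_winnt, so the main dial is ThreadsPerChild; a sketch with illustrative values for httpd.conf:
Code:
<IfModule mpm_winnt_module>
    # Default is 64; raise it so concurrent requests stop queueing
    ThreadsPerChild        512
    # 0 = never recycle the child process
    MaxConnectionsPerChild 0
</IfModule>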
I am using 2.2.29 on Windows. I am trying to remove one cookie from a request header before passing the request to the application, but I'm having trouble. The cookie is in the middle of the request header.
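mod_headers can edit the Cookie header in place (the edit keyword exists since 2.2.4); a sketch, with TRACKING as a stand-in cookie name:
Code:
# Strip one cookie out of the middle of the Cookie header
RequestHeader edit Cookie "(^|;\s*)TRACKING=[^;]*;?\s*" "$1"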
I want to redirect the main domain's http:// and www requests to https://.
I added this code:
RewriteCond %{HTTPS} off
# First rewrite to HTTPS:
# Don't put www. here. If it is already there it will be included, if not
# the subsequent rule will catch it.
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# Now, rewrite any request to the wrong domain to use www.
RewriteCond %{HTTP_HOST} !^www.
RewriteRule ^(.*)$ https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
I am using the latest version of Apache on a Windows XP machine.
When my web service is down for maintenance, since Apache will still be up and running, I would like Apache to serve an XML file as the response to the appropriate request. I have three operations available: makePayment, calculateFee, and voidPayment.
Is it possible to have Apache determine what type of request is being made? For example, if I have an XML error page for each operation, how will Apache know which XML file to serve based on the operation requested by the client?
To make it clearer: what is the best practice for configuring Apache to know which request is being made, in order to serve the appropriate XML file?
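If each operation has its own endpoint path, plain mod_rewrite can map it to a static file during maintenance; a sketch with hypothetical paths:
Code:
RewriteEngine On
# While the backend is down, answer each operation with its static XML
RewriteRule ^/service/makePayment$  /maintenance/makePayment.xml  [L]
RewriteRule ^/service/calculateFee$ /maintenance/calculateFee.xml [L]
RewriteRule ^/service/voidPayment$  /maintenance/voidPayment.xml  [L]
If the three operations instead share one URL (e.g. SOAP), the operation name would have to be read from a request header, e.g. with RewriteCond %{HTTP:SOAPAction}.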
So I've got a problem where a small percentage of incoming requests are resulting in "400 bad request" errors and I could really use some input. At first I thought they were just caused by malicious spiders, scrapers, etc. but they seem to be legitimate requests.
I'm running Apache 2.2.15 and mod_perl2.
The first thing I did was turn on mod_logio and interestingly enough, for every request where this happens the request headers are between 8000-9000 bytes, whereas with most requests it's under 1000. Hmm.
There are a lot of cookies being set, and it's happening across all browsers and operating systems, so I assumed it had to be related to bad or "corrupted" cookies somehow - but it's not.
I added "%{Cookie}i" to my LogFormat directive hoping that would provide some clues, but as it turns out half the time the 400 error is returned the client doesn't even have a cookie. Darn.
Next I fired up mod_log_forensic hoping to be able to see ALL the request headers, but as luck would have it nothing is logged when it happens. I guess Apache is returning the 400 error before the forensic module gets to do its logging?
By the way, when this happens I see this in the error log:
request failed: error reading the headers
To me this says Apache doesn't like something about the raw incoming request, rather than a problem with our rewriting, etc. Or am I misunderstanding the error?
I'm at a loss where to go from here. Is there some other way that I can easily see all the request headers? I feel like that's the only thing that will possibly provide a clue as to what's going on.
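One detail worth noting, given those sizes: Apache's default limit on a single request header line, LimitRequestFieldSize, is 8190 bytes, squarely inside the observed 8000-9000 byte range, and a request exceeding it is refused with a 400 and "error reading the headers" early enough that it would also explain why mod_log_forensic sees nothing. A sketch of raising the limits to test that theory (note: some 2.2 builds cap this at the compiled-in 8190, in which case shrinking the cookies is the only fix):
Code:
# Defaults: 8190 bytes per header line, 100 header lines
LimitRequestFieldSize 16380
LimitRequestFields    128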
My customer has an external-facing Apache server that is acting as a reverse proxy to two internal applications. They have:
- external addresses for each app which resolve to different IP addresses, so app1.their_domain.com and app2.their_domain.com resolve to 77.3.170.10 and 77.3.170.11 respectively
- the Apache server has two network interfaces with IP addresses 192.168.10.10 and 192.168.10.11
- the external IP addresses resolve to the above internal addresses
- the firewall between the Apache server and the internal app servers is configured to allow traffic from 192.168.10.10 to reach app_server1, and traffic from 192.168.10.11 to reach app_server2, both using port 7777
I have configured a virtual host in httpd.conf for each IP, i.e.
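A minimal sketch of what such per-IP proxy vhosts typically look like (the backend names are hypothetical, not from the post):
Code:
<VirtualHost 192.168.10.10:80>
    ServerName app1.their_domain.com
    ProxyPass        / http://app_server1:7777/
    ProxyPassReverse / http://app_server1:7777/
</VirtualHost>
<VirtualHost 192.168.10.11:80>
    ServerName app2.their_domain.com
    ProxyPass        / http://app_server2:7777/
    ProxyPassReverse / http://app_server2:7777/
</VirtualHost>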
This works fine in that the external addresses are being routed to the correct application; however, the firewall is blocking requests to the second app, as it appears the requests are coming from the Apache server's 'primary' IP address 192.168.10.10 instead of 192.168.10.11.
Is it possible to send requests using the IP address from the relevant VirtualHost?