I want visitors to my site to be able to connect through my Squid proxy (installed on the same web server as the site). They will only be able to visit 3 or 4 sites through the proxy (these will be added to a whitelist in Squid).
Preferably I want to set it up so that users MUST visit my website to make the connection through Squid. Squid is already set up, but how do I link a site through Squid?
Ideally, users would be able to click a link on my website that opens an external site through the proxy.
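For the whitelist part, a minimal sketch of the relevant squid.conf lines might look like this (the domain names are placeholders):

```
# whitelist of allowed destination sites (placeholder domains)
acl allowed_sites dstdomain .example.com .example.org .example.net

# permit only whitelisted destinations, deny everything else
http_access allow allowed_sites
http_access deny all
```

Note these ACLs only restrict destinations; tying access to a prior visit to your website is not something Squid does on its own — that would likely need proxy authentication or an external ACL helper.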
After conversations with many WHT members and a few other system admins, I have not been able to resolve this issue yet.
I have a Basic VPS and squid runs fine on it.
Debian 4, Squid 3.
Now the issue is that I have 2 IPs allocated to my VPS. But no matter what configuration I have in the squid.conf file, and no matter what version of Squid I use, I am not able to have the additional IP on my VPS as the outgoing external IP address.
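In Squid, the source address of outbound connections is controlled by `tcp_outgoing_address`, not by `http_port`. A minimal sketch, assuming the second IP is 203.0.113.2 (a placeholder):

```
# use the additional IP as the source address for outbound connections
tcp_outgoing_address 203.0.113.2
```

The address must actually be configured on an interface inside the VPS (e.g. as an alias), or the bind will fail; on some VPS platforms the second IP is only routed, not bindable, which would explain the symptom regardless of squid.conf.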
I currently have a site running on 8 servers: 5 web servers (apache2/php5), 2 DB servers (mysql 5), and one front reverse proxy server.
Currently I use apache as the reverse proxy (with mod_proxy of course).
I have it do 3 types of things:
1) serve some static files (the website's static files) directly from the front server. The files are stored in local directories.
2) cache some other static files (user uploaded images and files) on the front server after downloading them once from the backend webservers. This is done via mod_cache.
3) route some requests to specific web servers depending on the subdomain (on the first few letters of the subdomain, more precisely). To do this I use rewrite rules like:

RewriteCond %{HTTP_HOST} ^sub1(.*).domain.com$
RewriteRule ^(.*) http://sub1%1.webserver1.com/$1 [P,L]
RewriteCond %{HTTP_HOST} ^sub2(.*).domain.com$
RewriteRule ^(.*) http://sub2%1.webserver2.com/$1 [P,L]

etc.
My web servers are not in a cluster from this point of view, so it is important that the reverse proxy is able to route requests based on subdomain like this.
Now I have a few weird performance problems on the front server. CPU, hard disk usage and memory usage stay at relatively constant (and always low) levels, yet the server load periodically spikes to anywhere between 4 and 12 during the day. This seems to be mod_cache related (the spikes disappear when I disable it), but I can't figure out what's happening, and I'm reading everywhere that Squid is a better alternative for reverse proxying.
Only, I don't know if I can do the same as mentioned above with Squid. From what I read, I know I can do 2). However, I'm not sure whether Squid is able to serve some files (based on URL patterns) directly from the local file system rather than querying / caching them locally. And can Squid route reverse-proxy requests to different web servers based on the subdomain in a URL?
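On the subdomain-routing question: Squid can select backends with `cache_peer` plus `cache_peer_access` ACLs. A rough sketch, assuming Squid 2.6+ accelerator syntax (hostnames are placeholders):

```
# run as an accelerator on port 80
http_port 80 accel defaultsite=domain.com vhost

# one parent per backend web server
cache_peer webserver1.com parent 80 0 no-query originserver name=ws1
cache_peer webserver2.com parent 80 0 no-query originserver name=ws2

# route by the first letters of the requested subdomain
acl to_ws1 dstdom_regex ^sub1.*\.domain\.com$
acl to_ws2 dstdom_regex ^sub2.*\.domain\.com$
cache_peer_access ws1 allow to_ws1
cache_peer_access ws2 allow to_ws2
```

This does not replicate the `%1` capture trick (rewriting subNxyz.domain.com to subNxyz.webserverN.com); for that you would likely need a url_rewrite_program helper. And as far as I know Squid cannot serve arbitrary local files straight from disk for case 1), so a web server would still be needed for that part.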
Generated Tue, 22 Jul 2008 16:09:13 GMT by igw-ipcop.netarcs.com (squid/2.5.STABLE14)
Could anyone with server-geek skills tell me what the problem here might be? (I should mention that their annoying support contact form uses the same script, hence I can't even get in touch.) What's that ipcop thing about? Do they have some program at the server level filtering IPs and mine is no good, or what?
I'm aware the REMOTE_ADDR revealed by Squid needs to be a legitimate IP address to communicate properly across the internet. But I'd like Squid to use and publicly reveal a different IP address than the default system IP address on our proxy servers. Does anyone know if it's possible to make the Squid REMOTE_ADDR use an IP address on the system other than the default?
I've defined a different IP address and port for http_port at the top of the squid.conf file. And I can connect to this IP address and Port successfully. But when I run the connection through an IP address checker, or session environment test, it reveals the actual system IP address and not the http_port IP address.
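That is expected: `http_port` only controls where Squid listens; the source address of its outbound connections (and hence the REMOTE_ADDR the origin server sees) is set by `tcp_outgoing_address`. A sketch with placeholder addresses, including an ACL form so only certain clients use the alternate IP:

```
# clients from this range go out via the alternate address
acl special_clients src 192.168.1.0/24

tcp_outgoing_address 203.0.113.10 special_clients
# everyone else uses the default outgoing address
tcp_outgoing_address 203.0.113.5
```

The chosen address must be configured on one of the server's interfaces, or Squid cannot bind to it.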
I'm trying to set up a caching Squid server to speed up website access. How can I selectively choose to cache certain PHP scripts while ignoring others? I can't seem to get it to work. I've commented out the following lines:
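In general, one way to do selective caching, sketched here under the assumption of Squid 2.6+ syntax (older versions spell the directive `no_cache deny`), is an ACL on the URL path; the patterns below are placeholders:

```
# never cache these dynamic endpoints
acl nocache_php urlpath_regex /search\.php /cart\.php
cache deny nocache_php

# everything else follows normal caching rules,
# subject to the scripts sending cache-friendly headers
```

Note that even cacheable PHP output is only stored if the script emits suitable Expires/Cache-Control headers, which is a common reason "it doesn't work".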
I want to software-load-balance one of my websites using Squid. It doesn't look like this is possible with Squid 2.5. Squid 2.6 is an upgrade for FC6; I am running FC4, and it cannot be installed there without a lot of dependency failures. Has anyone successfully installed Squid 2.6 on FC4?
I would like to use cPanel Apache as the backend web server, and Squid cache as the front end http accelerator.
My VPS has two IP addresses, however, I want the httpd acceleration to occur only on one IP.
So far, I have installed squid cache and edited its config file to this:
http_port 74.50.118.189:80
httpd_accel_host localhost
httpd_accel_port 80
httpd_accel_single_host on
httpd_accel_uses_host_header on
acl all src 0.0.0.0/0.0.0.0
http_access allow all
My site has a few subdomains and I would like them to work.
So, what do I do now in the Apache config (which I think is here: /etc/httpd/conf/httpd.conf)?
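On the Apache side, the usual step is to stop Apache listening on the IP/port that Squid now occupies, so the two don't conflict. A sketch for httpd.conf, assuming Squid holds 74.50.118.189:80 and Apache moves to the loopback (domain names are placeholders; subdomains keep working as long as httpd_accel_uses_host_header is on and each vhost matches by ServerName/ServerAlias):

```
# listen only on loopback; Squid owns 74.50.118.189:80
Listen 127.0.0.1:80

NameVirtualHost 127.0.0.1:80
<VirtualHost 127.0.0.1:80>
    ServerName example.com
    ServerAlias www.example.com sub.example.com
    DocumentRoot /var/www/html
</VirtualHost>
```

One caveat: cPanel regenerates httpd.conf, so hand edits like this may need to go through cPanel's include files rather than the main file.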
The type of DDoS is the one that comes from DC (Direct Connect) clients.
I have managed to mitigate the attack and to get everything working ok.
I do not like the solution I came up with, for many reasons, and I found that Squid can be good at stopping bad requests like the ones DC clients send when the attack occurs.
I am kind of new to Squid and I do not know all the settings.
I have configured it and everything works great when there is no DDoS.
But when the attack starts, nothing works. Squid does not log anything in access_log, and there is no load either, just a lot of connections to Squid.
Is there a limit on maximum concurrent connections in Squid?
Or is the idea of using Squid as a reverse proxy without caching, just to stop bad requests, a bad one? (I do not need snort-inline; I have some issues with it.)
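On the connection limit: yes, Squid is bounded by the number of file descriptors it can open (often 1024 by default), which fits the symptom of connections piling up with nothing in access_log. Two directives that may help, sketched for Squid 3.x (`max_filedescriptors` is not available in older 2.x builds, which fix the limit at compile time); the numbers are starting points to tune:

```
# raise the file-descriptor ceiling (also raise ulimit -n for the squid user)
max_filedescriptors 8192

# refuse clients holding too many simultaneous connections
acl toomany maxconn 20
http_access deny toomany
```

The `maxconn` ACL in particular can shed the flood of per-client connections a DC-style attack generates before they exhaust descriptors.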
I'm currently running CentOS 5. I recently installed Squid Version 2.6.STABLE6 for a client to use as a proxy. However, it seems that sites like whatismyip.com and ipchicken.com are resolving back to my client's IP address and not the server's.
There is only one IP on my server, and I think the problem may be related to X-Forwarded-For headers (correct me if I am wrong).
Is there any way to use the server's IP address when my customer is using the proxy server?
My squid.conf looks like the following:

Code:
http_port 8080
forwarded_for off
icp_port 0
cache_mem 64 MB
cache_dir ufs /var/spool/squid 100 16 128
maximum_object_size 4096 KB
cache_store_log none
cache_access_log /var/log/squid/access.log
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin ?
no_cache deny QUERY
visible_hostname proxyserver
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src xxx.xx.xxx.xxx
acl SSL_ports port 443 563 10000
acl Safe_ports port 80
acl Safe_ports port 21
acl Safe_ports port 443 563
acl Safe_ports port 70
acl Safe_ports port 210
acl Safe_ports port 1025-65535
acl Safe_ports port 280
acl Safe_ports port 488
acl Safe_ports port 591
acl Safe_ports port 777
acl Safe_ports port 901
acl purge method PURGE
acl CONNECT method CONNECT
acl LocalNet src xxx.xx.xxx.xx
http_access allow manager localhost
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow LocalNet
http_access deny all
icp_access allow all
log_fqdn on
##### This part is to make the proxy transparent
#httpd_accel_with_proxy on
#httpd_accel_uses_host_header on
#httpd_accel_host virtual
#httpd_accel_port 80
######------------------------------
error_directory /usr/share/squid/errors/English
#httpd_accel_uses_host_header off
#anonymize_headers deny From Referer Server
forwarded_for on
http_port ServerIP:8080 transparent
# no forwarded_for, quite useless for an anonymizer
forwarded_for off
# no client stats
client_db off
# Paranoid anonymize
header_access Allow allow all
header_access Authorization allow all
header_access Cache-Control allow all
header_access Content-Encoding allow all
header_access Content-Length allow all
header_access Content-Type allow all
header_access Date allow all
header_access Expires allow all
header_access Host allow all
header_access If-Modified-Since allow all
header_access Last-Modified allow all
header_access Location allow all
header_access Pragma allow all
header_access Accept allow all
header_access Charset allow all
header_access Accept-Encoding allow all
header_access Accept-Language allow all
header_access Content-Language allow all
header_access Mime-Version allow all
header_access Retry-After allow all
header_access Title allow all
header_access Connection allow all
header_access Proxy-Connection allow all
header_access All deny all
header_access Cookie allow all
header_access Set-Cookie allow all
header_replace User-Agent Anonymous Proxy at example.com
I've been hearing other admins talk about using Squid to speed things up on web servers. Yes, not as a network proxy, but as a simple cache engine for dynamic sites.
I successfully installed Squid proxy on my other dedicated server; however, that dedicated server had 50 IP addresses. I am wondering if I can use those IP addresses for my Squid proxy?
Here's my story: the landlord provides the internet for the house, and it goes through a server which has Squid installed. Squid in turn seems to block the port (or whatever the exact term for that is) for phpMyAdmin and webmail in cPanel, so I cannot access them.
At my reseller account at HostGator they said they redirected that port to port 80 via mod_rewrite, and that now works for me. I wonder how I would do the same thing on a dedicated server. I can't see what HostGator did, since it happens out of my sight somewhere; otherwise I would just replicate it on my box.
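One way to reproduce that on a dedicated server is to reverse-proxy the cPanel service ports over plain port 80 with mod_rewrite's [P] flag (which requires mod_proxy to be loaded). A sketch, assuming the standard cPanel ports (2082 for cPanel, 2095 for webmail); the /cpanel and /webmail paths are just conventions, not necessarily what HostGator did:

```
RewriteEngine On
# forward /cpanel and /webmail over port 80 to the local cPanel services
RewriteRule ^/cpanel(/.*)?$ http://127.0.0.1:2082$1 [P,L]
RewriteRule ^/webmail(/.*)?$ http://127.0.0.1:2095$1 [P,L]
```

Anything that only ever travels over port 80 will pass through the landlord's Squid like normal web traffic.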
I have an Apache server on Windows which I wanted to speed up using a caching system.
I tried installing Squid, and I got it working with a basic example config. It seemed to work well; however, under heavier load I experienced some weird behaviour where network output drops to 0 for several seconds at a time, and all clients just hang and wait for a response.
Remote access to the server works fine, so it's definitely a Squid issue. With only Apache running, the server constantly sends out data, no halts.
Is it possible to implement a reverse proxy for Server-A through Server-B?
The issue is, let's say Server-A is located in Network-A and Server-B is located in Network-B. The users in Network-B are allowed to access only Server-B, while Server-B itself can access Server-A. So, when the users from Network-B access Server-B, the traffic should be proxied through Server-B to Server-A transparently, without letting the users notice how the traffic flows. Server-B will be Linux, thus Squid is the primary proxy application to deal with. Is this possible with Squid, or does it need some other application?
How about a comparison of mod_proxy for Apache vs. Squid? All Server-B has to do is forward the requests coming from Network-B to Server-A, acting as a middle box between the two network scopes transparently. All the users in Network-B have to do is access Server-B, and the requests will transparently go to Server-A without any configuration in their browsers or any NAT/firewall rules on either network.
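This is exactly Squid's accelerator (reverse-proxy) mode, and it works with or without caching. A sketch for Server-B's squid.conf, assuming Squid 2.6+ syntax and that Server-A is reachable as server-a.internal (a placeholder):

```
# listen where Network-B users connect
http_port 80 accel defaultsite=server-a.internal

# forward everything to Server-A; originserver marks it as a web server
cache_peer server-a.internal parent 80 0 no-query originserver

# in practice, restrict this to Network-B's address range
http_access allow all
```

The mod_proxy equivalent would be roughly a single `ProxyPass / http://server-a.internal/` line; for one backend with no caching the two are largely interchangeable, and the choice mostly comes down to what is already installed and familiar on Server-B.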