Nginx To Serve Static File Contents On cPanel Servers
Jun 17, 2009
How do I set up the Nginx web server on a cPanel server to serve static content, say the /images folder from every domain hosted on the server, so that Apache's load decreases?
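The usual approach is to put Nginx in front on port 80 and move Apache to another port; a minimal sketch of a per-domain server block, assuming Apache is moved to 8080 (the domain and account path are placeholders):

server {
    listen 80;
    server_name example.com;            # placeholder domain

    # Serve static images directly from the cPanel account's docroot
    location /images/ {
        root /home/username/public_html;   # placeholder account path
        expires 30d;
    }

    # Hand everything else back to Apache, now listening on 8080
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}

Note that cPanel rebuilds Apache's configuration automatically, so any hand-made port changes have to survive that rebuild; people generally script this or use one of the Nginx-for-cPanel plugins rather than editing every vhost by hand.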
We recently began to mirror a large number of open source projects with a dedicated mirror server on our network, and I was surprised not only by how popular the mirror server has become, but also by the ability of the hardware we're using to keep up with the load.
At any given time, the mirror seems to be pushing at least 50 Megabits of traffic; the server is also an IRC server (irc.igsobe.com) for customers and internal staff communications.
The hardware is a low-end Dell Pentium 4 @ 2.66 GHz server with 512 MB of RAM and a 400 GB ATA hard drive. CentOS 5.3 is the operating system.
If you're interested, you can view the HTML log file analysis here, but that doesn't tell the full story, as FTP users make up a good portion of the traffic. We've received over a quarter million hits in the first few days of November alone.
18:14:15 up 65 days, 9:04, 5 users, load average: 0.31, 0.69, 0.56
The only change that I made to the default configuration was lowering the maximum number of Apache servers to 128.
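For context, that change in a stock httpd.conf (assuming the prefork MPM; the surrounding values are typical defaults rather than this server's exact settings) looks something like:

<IfModule prefork.c>
StartServers          8
MinSpareServers       5
MaxSpareServers      20
ServerLimit         128
MaxClients          128     # lowered from the stock cap
MaxRequestsPerChild 4000
</IfModule>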
Just thought I'd share this information, as I wouldn't have thought a server with such a small amount of RAM would be able to serve up so much data, even though we are talking strictly static HTML files.
I'll definitely keep this in mind for those "what type of dedicated server should I use for XXX" questions that clients ask all too often.
I got this warning message from a script I was working on to read XML feeds.
Warning: file_get_contents(url) [function.file-get-contents]: failed to open stream: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond. in C:\Apache2.2\htdocs\test5.php on line 8
But when I open the URL itself, it works fine.
I have already set allow_url_fopen = On.
When some jobs are run via PHP scripts that use file_get_contents(), the page gives an internal server error without waiting.
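A common way to handle this class of failure is to give file_get_contents() an explicit HTTP timeout via a stream context and check its return value, rather than letting the request hang until it errors out; a minimal sketch (the feed URL is a placeholder):

<?php
// Placeholder feed URL; replace with the real one
$url = 'http://example.com/feed.xml';

// Give the HTTP request an explicit timeout instead of waiting on the OS default
$context = stream_context_create(array(
    'http' => array('timeout' => 10)   // seconds
));

$xml = @file_get_contents($url, false, $context);
if ($xml === false) {
    // Handle the failure instead of letting the warning bubble up
    echo "Feed could not be fetched.";
} else {
    $feed = simplexml_load_string($xml);
}
?>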
I am using the latest version of Apache on a Windows XP machine.
When my web service is down for maintenance, since Apache will still be up and running, I would like Apache to serve an XML file as the response for the appropriate request. I have three operations available: makePayment, calculateFee, and voidPayment.
Is it possible to have Apache determine what type of request is made? For example, if I have an XML error page for each operation, how will Apache know which XML file to serve based on the operation requested by the client?
To make it clearer: what is the best practice for configuring Apache to know which request is being made, so that it serves the appropriate XML file?
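One common pattern, assuming the operation is visible in the request path (the /service/... paths here are hypothetical) and mod_rewrite is enabled, is to map each operation to its own static maintenance document while the backend is down:

RewriteEngine On
# Hypothetical paths; adjust to however the operations actually appear in the URL
RewriteRule ^/service/makePayment   /maintenance/makePayment.xml   [L]
RewriteRule ^/service/calculateFee  /maintenance/calculateFee.xml  [L]
RewriteRule ^/service/voidPayment   /maintenance/voidPayment.xml   [L]

If the operation is only distinguishable from a header rather than the path (a SOAPAction header, for instance), matching on that header with RewriteCond %{HTTP:SOAPAction} is the usual alternative.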
When I add the private key and certificate via the Plesk panel, I get the following error:
ERR [panel] Unable to set the certificate: Unable to put certificate file: Unable to arrange cert file: file_put_contents failed:
I currently have an existing web hosting package with a web host. However, I need to supplement it with a file hosting service for my users.
I'm estimating that I will need about 2 GB of disk space and approximately 30-40 GB of traffic monthly. This will just be plain static file hosting. I don't need any scripts, databases, etc.
I'm planning to set up a server ONLY for hosting static binary files, varying from a few KB to a few MB in size.
I've seen some of the litespeedtech performance benchmarks, which you can find here: [url]
From the "small static file" benchmark chart, i can see that IIS6 beats lighttpd in this test.
So i'm wondering does the IIS6 really have better performance at file hosting than lighttpd.
Actually it does not matter which operating system i will be using at this server, since i will use it only for file serving. With lots of concurrent connections. Possibly thousands of connections.
I need some feedbacks on this, so i can decide, IIS or lighttpd.
Few more bucks for win2k3 won't be an issue here, if it's performance is better than lighttpd for this kind of use.
I am using DDOS Deflate
[url]
I have a problem with NO_OF_CONNECTIONS.
The default is 150
For example, if a website has 200 thumbnails on one page, then the user will get banned.
But in my case, a user only has 1 connection at a time (they only access 1 FLV file at a time).
So, is it safe for me to decrease the number to 20?
I can see a lot of IPs with more than 80 connections, which I think are a DDoS attack.
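For reference, the value in question is a single line in DDoS Deflate's config file (commonly /usr/local/ddos/ddos.conf; verify the path on your install):

NO_OF_CONNECTIONS=20    # lowered from the default of 150

Whether 20 is safe depends on how many legitimate parallel connections one visitor can open; for a single FLV stream per visitor it should leave plenty of headroom, but watch for false positives from proxies or NAT, where many users share one IP.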
OK, I'm a bit lost about which dedicated server to choose.
First, I need 2 dedicated servers. These servers will only serve static content for download.
Downloadable files are small (50-400 KB).
I'll use a lightweight web server.
If there are too many users, I'll buy another server.
I'm looking for excellent uptime, a US-based server, and fast download speeds.
I've looked at some popular web hosts, but if you know better, let me know:
1and1
Price (Root Server): $99
CPU: Single Core AMD Athlon 64 3500+ 2.2 GHz
RAM: 1 GB
HD: 2 x 160 GB
Bandwidth: 2,000 GB
Bandwidth Cap: 100 Mbps
GoDaddy
Price: $68
CPU: Celeron 2.0GHz
RAM: 1GB
HD: 1 x 120GB
Bandwidth: 500GB
Bandwidth Cap: 10-20 Mbps
Also, the price difference between 1and1 and GoDaddy is $31, but 1and1 gives more on CPU, HD, bandwidth (4x), connection...
One of my questions is whether 1and1 offers an average price for a dedicated server, or whether GoDaddy spends part of the money on Amanda, Candice and Danica?
Do you have any suggestions for a dedicated server or web host?
I have a lot of questions here, so if you can't answer them all, I understand. Even pointing me somewhere I could get the answers would be appreciated: sites focusing on server hardware, forums on the subject, etc.
We plan to have three different types of servers:
- db server (self-explanatory: MySQL, for forums and MySQL-driven sites)
- file server (lots of files around ~2-10 MB, a consistent 70 Mbps right now, but we want more room for upgrades; needs a LOT of storage)
- web server (lots of PHP files, but also static things like plain HTML, images, etc.; also includes all misc services for the setup: DNS, etc.)
Could I be given a rundown of which hardware each of the three should have? I don't need specifics; even just knowing that more RAM is important here while CPU doesn't matter as much, or that the fastest disks available are a must, would all be valuable info for me. That said, I certainly wouldn't mind specific hypothetical hardware configs.
For the database server I'm assuming the more RAM the better. Not entirely sure about the CPU? Also not positive on disks...
For the file server, how much RAM would be practical or useful? Disk I/O will be an issue, I'm sure, because plenty of people will be pulling files at once, so the disk needs to read from multiple places. SCSI (and even Raptors) are not an option, as we need 750 GB+ of space on a reasonable budget. More RAM will take some load off the disks, but how much is necessary / reasonable?
For the web server I'm assuming CPU first, then RAM, but it'll likely need less RAM than the DB server?
I'm more lost on the disks than anything. SCSI on the file server is not an option under any circumstances due to $/GB. For the DB and web servers I'm willing to pay for SCSI if the performance increase really does warrant the extra money, but I'd like to be convinced before shelling it out. If you have benchmarks geared at server hardware when it comes to disks, I'd really appreciate it.
Also, what's the best way to network these together when colocated? Each one with a dual gigabit Ethernet port, with communications going to and from the router?
What's the best setup for a machine that will host web files that other servers have to read and serve to end users on the web?
For example, I have servers Web1 and Web2 serving the same content from Files1.
I assume it's best just to go with RAID 10 and be safe, or?
More importantly for me, what's the best way for these systems to communicate? I.e. what protocol should be used for the web servers to read the files from the file server?
I once used SSHFS for serving the same static files on a couple of machines from one location... but that's presumably very slow (at the time it wasn't a problem).
All systems will be running CentOS Linux.
Is it normal practice to have shared filespace that multiple web servers can access? Then I just give my developers access to that filespace (one server, instead of multiple users on multiple servers) to manage files for different sites easily... right? (Also meaning multiple servers can serve the same content.)
So... what would be the best way to do this? We're talking Linux systems here, by the way. What sort of specs would such a server need?
I *think* it's NAS that I'm trying to get at... unsure, though, if that's correct!
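For the protocol question, the common answer on an all-Linux setup like this is NFS: export the shared docroot from the file server and mount it on Web1 and Web2. A minimal sketch, with the IP addresses and paths as placeholders:

# On the file server (Files1), in /etc/exports:
/var/www/shared  10.0.0.0/24(rw,sync,no_subtree_check)
# then reload the export table: exportfs -ra

# On Web1 and Web2, in /etc/fstab:
10.0.0.5:/var/www/shared  /var/www/shared  nfs  rw,noatime  0 0

Shared filespace like this is indeed normal practice for multi-server setups; the trade-off is that the file server becomes a single point of failure, which is where RAID 10 plus good backups come in.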
Is there a low-end server designed just for receiving and sending large audio and media files?
The CPU could be minimal, as long as it can send and receive large files to and from a main server that has Apache + PHP + MySQL running on it...
Would this be cost-effective?
Zip the contents of a directory (/path/to/dir/*) and then include a file in that ZIP that is located here: /a/whole/different/path/file.jpg. Is there a zip parameter that allows me to include that JPG file in the ZIP file I am creating? Can someone give me the command?
Also, instead of zipping the contents inside the "dir" directory, can I zip the "dir" directory itself but exclude the full path (the /path/to/ part) when creating the zip? I can do this with the -j parameter, but then it also excludes the "dir" directory and only zips the content inside it.
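A sketch of one way to do both; the archive name /tmp/archive.zip is a placeholder, the paths are the ones from the question, and -r / -j are standard zip options:

# Zip the contents of the directory, then add the extra JPG to the same archive
cd /path/to/dir
zip -r /tmp/archive.zip .
zip -j /tmp/archive.zip /a/whole/different/path/file.jpg   # -j stores file.jpg without its directory path

# To keep the "dir" directory itself in the archive but drop the /path/to/ prefix,
# cd to the parent and zip from there instead of using -j:
cd /path/to
zip -r /tmp/archive.zip dir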
I have a laptop which was recently reset to factory defaults, and in the process I lost my ability to run and test my sites locally. So I installed IIS (5.1) and tested the default pages. So far I am having no luck getting anything but this:
-----------------------------
Server Application Error
The server has encountered an error while loading an application during the processing of your request. Please refer to the event log for more detail information. Please contact the server administrator for assistance.
-----------------------------
I have searched and searched and searched and I am just out of ideas on what to search for. So far, this is what I have tried: ......
I have been working on a client's site that is hosted at www.serve.com. Does anyone have experience with them? I find them very lousy on support; support tickets take days to get a response... sometimes no response...
My task is to install Drupal and customise it, but the clean URL feature does not work on their server, as it conflicts with their system settings. Wow.
I am trying to find someone who has successfully installed Drupal with clean URLs enabled on their server.
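For context, Drupal's clean URLs only need mod_rewrite plus permission for the .htaccess rules Drupal ships with to take effect; a sketch of the vhost side that hosts often have locked down (the directory path is a placeholder):

<Directory /var/www/html/drupal>
    AllowOverride All          # lets Drupal's .htaccess RewriteRules apply
    Options +FollowSymLinks    # required for mod_rewrite in per-directory context
</Directory>

If the host has AllowOverride set to None or has disabled mod_rewrite entirely, clean URLs will fail no matter what Drupal itself is set to.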
I have installed Apache 2.0.64 (Win32), and after repeated tries at adjusting the httpd.conf file, it is still refusing to serve CGI. So far, following the Apache documentation, I have set my designated CGI directory with the AddHandler directive and enabled Options +ExecCGI. I have also added the ScriptAlias directive as per the instructions. These directives should work in conjunction with the mod_alias and mod_cgi modules, both of which I have made sure are set to be loaded. The log files show no errors. There are no syntax errors being reported at start-up. The server stops and starts on command with no problems.
I will post some excerpts from my httpd.conf in the pastebin as soon as it is up and running; right now the authentication icon is missing. I have ActivePerl 5.16 running on this machine, but my Apache download is the binary without SSL, with the MSI installer.
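For comparison, a minimal working combination of those directives usually looks something like this on a Windows install (the directory path and script extensions are assumptions):

# Map /cgi-bin/ URLs to the script directory and allow execution there
ScriptAlias /cgi-bin/ "C:/Apache2.2/cgi-bin/"

<Directory "C:/Apache2.2/cgi-bin">
    Options +ExecCGI
    AddHandler cgi-script .cgi .pl
    Order allow,deny
    Allow from all
</Directory>

One Windows-specific gotcha worth checking: each Perl script needs a shebang line pointing at the actual interpreter, e.g. #!C:/Perl/bin/perl.exe, since Apache on Windows uses it to launch the script unless ScriptInterpreterSource Registry is set.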
I have a server with CentOS.
I need to edit the hidden file .htaccess from cPanel's file management tool,
but hidden files are not shown.
How can I change the setting so that hidden files are shown in cPanel's file management tool?
What is the best way to share files/folders between CentOS/Linux servers?
I have to reload a page several times, at least once. The templates show up fine, but the actual images and contents do not show up unless I reload the page... This is evident during peak hours but not during normal hours...
I've done a tracert; no lost packets.
I plan to build a huge image gallery, using lighttpd to serve the images. It would be easiest for me to have all these files (more than 1,000,000) in one directory. Is that OK?
I have no idea if this can cause any problems. Should I split my files into several directories? Does that make serving better/faster?
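If splitting does turn out to be necessary (very large directories are a classic pain point on older filesystems such as ext3 without dir_index, and for any tool that lists the directory), the usual trick is to shard by a hash prefix of the filename. A sketch in PHP; the helper function and base directory are hypothetical:

<?php
// Hypothetical helper: map an image name to a two-level sharded path,
// e.g. "cat.jpg" -> "/var/www/images/<xx>/<yy>/cat.jpg",
// where <xx> and <yy> are the first characters of the name's hash.
function shardedPath($filename, $base = '/var/www/images') {
    $hash = md5($filename);
    return $base . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2) . '/' . $filename;
}

echo shardedPath('cat.jpg');
?>

This keeps each directory to a manageable size while still letting the web server resolve a file's location from its name alone.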
Hello, we have a few web servers that run Windows Server 2003 and IIS for web page hosting. We develop custom applications and don't do "web hosting" per se.
What is the best way to do this in a load-balanced environment? We have a Cisco load balancer in front of these servers, but I'm curious about the following:
1) Is there a way to replicate IIS entries instead of having to configure the site on each server?
2) How does everyone handle file replication (ideally in real time) across all servers?
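For the file side, one low-tech option on Windows 2003 is a scheduled robocopy mirror from a designated master node to the others (server names and paths are placeholders; DFS Replication or a proper deployment pipeline is the more robust route):

robocopy \\WEB1\c$\inetpub\wwwroot \\WEB2\c$\inetpub\wwwroot /MIR /R:2 /W:5 /LOG:C:\logs\sync.log

The /MIR switch mirrors the tree (including deletions), so it should only ever be run in one direction, from the master outward.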
I am configuring a new Plesk 12.x server based on Linux. I already have a Plesk 10.x server on Windows, and I would like to transfer all of its data to the new server.
I tried to make a backup of the old server using the web interface, but the zip file it created was not compatible with the Linux server.