So I've recently ordered a Supermicro 4U server with 24x 1 TB hard drives in RAID 10 and 64 GB RAM. I'm running Debian 5.0 and have installed lighttpd. All the content I serve is video files (AVI, MP4, MKV, OGM), each about 100 to 500 MB in size. I'm wondering how I can optimize lighttpd to get the best performance out of it. I look forward to your replies.
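A possible starting point for lighttpd 1.4 on Debian 5.0, aimed at large static files; the values below are assumptions to tune against your actual workload, not drop-in answers:

```
# lighttpd.conf sketch for large static video files
server.event-handler     = "linux-sysepoll"   # epoll on 2.6 kernels
server.network-backend   = "linux-sendfile"   # zero-copy serving from disk
server.max-fds           = 8192               # many long-lived downloads
server.stat-cache-engine = "simple"           # cache stat() results
server.max-keep-alive-idle = 10
server.max-write-idle    = 60                 # slow clients on 500 MB files
```

With sendfile the kernel page cache handles readahead, so your 64 GB of RAM effectively becomes the hot-file cache.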
I have four servers, each with a quad-core Xeon, 4 GB RAM, and 2x 300 GB 15K SAS drives in RAID 0, pushing a total of 1.6 Gbit. They serve a lot of zip files with an average file size of 180 MB. My question is: how can I optimize lighttpd 1.4.19 to push its maximum with very low IO-wait? I've looked around and only found options that deal with lighttpd 1.5 and use Linux AIO for the network backend. Currently I use writev with 16 workers and a read/write idle timeout of 10 s. Logging is off, too.
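Since 1.5's Linux AIO backend isn't available in 1.4.19, the main levers left are the network backend and the worker/descriptor counts. A sketch with example values (not benchmarked on your hardware):

```
# lighttpd 1.4.x sketch -- reduce IO-wait on large zip downloads
server.network-backend = "linux-sendfile"  # instead of writev
server.max-worker      = 4                 # one per core is a common start
server.max-fds         = 4096
server.max-write-idle  = 60                # don't drop slow 180 MB downloads
```

With 16 workers on a quad-core box you may be adding contention rather than throughput; fewer workers plus sendfile is worth an A/B test.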
I run a file sharing service and bandwidth usage is climbing too high, around 300 Mbps. Does anyone know whether file hosting sites use bandwidth shaping? Any recommended settings for lighttpd?
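lighttpd does have built-in rate limiting, globally and per connection. The numbers below are illustrative, not a recommendation for your traffic:

```
# lighttpd rate limiting sketch (values in KB/s)
server.kbytes-per-second     = 32768   # cap the whole server at ~256 Mbit/s
connection.kbytes-per-second = 512     # cap each download at ~4 Mbit/s
```

Per-connection caps are what most file hosts use: they keep any single download from saturating the pipe while leaving total throughput high.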
My client's website needs to hold files that are around 60 or 70 MB, but the host only allows files up to 10 MB. Is that typical?
Right now I'm linking to a file storage service, but I would rather make the files available from my own site without sending visitors to a third-party site. He doesn't want his files zipped either; they should be a straight download.
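If the 10 MB cap comes from PHP on the host (a common source of such limits), these are the php.ini directives involved; on shared hosting the host may not let you raise them, and the values below are just examples:

```
; php.ini sketch -- directives that cap upload size
upload_max_filesize = 100M
post_max_size       = 100M   ; must be >= upload_max_filesize
memory_limit        = 128M
max_execution_time  = 300    ; large uploads take time
```

If the limit is enforced at the hosting-plan level rather than in PHP, only a plan change or a different host will fix it.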
I have been trying, quite unsuccessfully, to import a large SQL database file via phpMyAdmin for one of my clients. Since the file is about 250 MB, I get a server timeout error. How can I do this via SSH? I have a CentOS 6.5 64-bit server running Plesk 12.0.18.
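The usual approach is to copy the dump to the server and feed it straight to the mysql client, which has no web timeout. Database name and file path below are examples; on Plesk the MySQL admin password is typically readable from /etc/psa/.psa.shadow:

```
# copy the dump up, then import it directly with the mysql client
scp dump.sql root@your-server:/root/
ssh root@your-server
mysql -uadmin -p`cat /etc/psa/.psa.shadow` mydb < /root/dump.sql
```

A 250 MB import can still take several minutes; running it inside screen protects it from an SSH disconnect.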
I'm looking for advice on good hosting setups for getting large amounts of disk space.
I would like to be able to offer up to 2 GB of storage space each for hundreds, maybe up to a few thousand users, so any solution should scale well. The files would be static files that might be up to 400 MB in size.
It would be nice to be able to give users FTP access to their disk space, although it's not a core requirement.
A few days ago, a friend of mine studying in America recommended a popular new transfer tool, Qoodaa. He told me it is quite good software for downloading files and movies. At first I was skeptical, but after using it I found Qoodaa is a good choice. I have summarized some of its features:
1. It uploads movies faster than any other software I have used before.
2. It can download files quickly through download links; in a word, it is a time-saver with high efficiency.
3. No space limit. No matter where you are, it downloads fast.
4. Qoodaa is clean software, secure and easy to use.
It really can give you an unexpected surprise.
I am a person who likes to share with others, so if you have something good, please share it with me.
I'm currently running on a VPS. My site allows large file uploads and downloads, with files over 600 MB in size.
The server has issues when the site gets three or more concurrent requests for large file downloads. I'm trying to grow this site to thousands of users, and that is hard to do when it can't handle even three.
My host has told me I need to upgrade to a dedicated server. My VPS has only 512 MB of RAM, and a single large file download eats up that RAM, which is causing the issue.
I'm a newbie, and while I knew I was taking a risk by going with a VPS, I find it a bit annoying that these guys advertise 1 TB of bandwidth per month when I can't even serve 1 GB of concurrent downloads... maybe it's just me.
Anyway, I am now looking into moving the large files and the upload/download traffic over to Amazon S3. If I do this, I expect my RAM usage on the VPS to decrease greatly. Is this correct? If my PHP code runs on the VPS but the actual file download over HTTP comes from S3, that should not be a heavy load on my box, correct?
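Yes: if clients fetch the bytes directly from S3, the VPS only does lightweight bookkeeping. A minimal Python sketch of the pattern (bucket and key names are hypothetical): the app answers with a redirect to the S3 URL instead of streaming the file itself.

```python
def s3_redirect(bucket: str, key: str) -> dict:
    """Build an HTTP 302 response that points the client at S3.

    No file bytes pass through this server, so a 600 MB download
    costs the VPS only a few hundred bytes of response headers.
    """
    return {
        "status": 302,
        "headers": {"Location": f"https://{bucket}.s3.amazonaws.com/{key}"},
        "body": b"",  # empty: S3 serves the actual payload
    }

resp = s3_redirect("example-bucket", "uploads/big-file.zip")
print(resp["status"], resp["headers"]["Location"])
```

For private files the same idea applies, but you would hand out a time-limited pre-signed S3 URL instead of a plain public one.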
Today I need to run "OPTIMIZE TABLE", per a MySQLTuner recommendation.
I don't know how to run it, though. I tried entering it as a SQL query via phpMyAdmin:
SQL query:
OPTIMIZE TABLE
MySQL said: #1064 - You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 1
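The #1064 error is simply because OPTIMIZE TABLE requires at least one table name after it. Table names below are examples; substitute your own:

```sql
-- optimize a single table, or several at once
OPTIMIZE TABLE users;
OPTIMIZE TABLE users, posts, sessions;
```

You can run this in the same phpMyAdmin SQL box; just make sure the right database is selected first.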
I have purchased a "Due2Core" server with 4 GB RAM, and I have one problem I hope someone can help me with.
I host only 4 or 5 websites on the dedicated server. At most about 8,000 users visit my sites, and maximum bandwidth usage is 35 GB.
For 25 or 30 minutes a day my websites' pages open slowly; sometimes it happens once or twice. When I check my server status report, load is only 0.50 and memory usage only 35%.
When I check the server and MySQL usage reports, most of the HTTP load comes from my most-visited websites.
Some friends suggested I optimize Apache on the dedicated server, because something may be wrong in my Apache setup, since I have enough memory.
Can anybody suggest a suitable Apache optimization method, step by step?
Remember, at this time my server is running the default settings.
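On a default setup, the prefork MPM settings are the usual first step. A sketch with example numbers for a 4 GB box; these are assumptions to adjust after measuring your actual per-process memory use:

```
# httpd.conf sketch -- prefork MPM starting point
<IfModule prefork.c>
    StartServers          10
    MinSpareServers       10
    MaxSpareServers       25
    ServerLimit           256
    MaxClients            256
    MaxRequestsPerChild   4000   # recycle children to limit memory growth
</IfModule>
KeepAlive          On
KeepAliveTimeout   3
Timeout            60
```

Intermittent slow periods with low load often point to hitting MaxClients during bursts, so compare your busy-hour process count against that limit.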
I recently got my first dedicated server and now have it all up and running. I would like to optimize mySQL for improved performance.
Server running: CentOS 5, Tomcat, mySQL 5
Most queries: single-row queries (SELECT * FROM table WHERE name='John'). NOTE: most of my queries return at least one blob (100 KB to 1 MB in size).
Server specs: dual-core Intel Pentium 3 GHz, 4 GB RAM, 500 GB hard disk.
Below is my current my.cnf file:
Code:
[mysqld]
datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
# Default to using old password format for compatibility with mysql 3.x
# clients (those using the mysqlclient10 compatibility package).
old_passwords=1
max_allowed_packet=16M
skip-innodb
skip-bdb
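For a 4 GB MyISAM-only box (you have skip-innodb set) serving blob-heavy point lookups, a hedged starting point; the buffer sizes are assumptions to verify with mysqltuner against your real hit rates:

```
[mysqld]
key_buffer_size     = 512M   # MyISAM index cache -- the main MyISAM knob
table_cache         = 512
max_allowed_packet  = 16M    # already comfortably above your 1 MB blobs
query_cache_type    = 1
query_cache_size    = 64M    # can help repeated single-row SELECTs
tmp_table_size      = 64M
max_heap_table_size = 64M
thread_cache_size   = 8
```

Note that blob columns themselves are not cached by the key buffer (it holds indexes only), so fast disks and the OS page cache matter a lot for your workload.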
I've been a long-time follower of wht.com, from reading the hosting reviews to the optimization threads, and now it's come to the point where I need to post a question to get more accurate help than reading others' configurations, since every server and site has its own needs. So here it goes.
As I wrote in the topic, my main concern is my Apache config. I've tried many, many different configs, but I'm never sure when to stop optimizing. The site is a chat community (a MySpace-like site) with 1000-2000 simultaneous users online who do everything from posting images to talking in the forums. And yes, I host everything on the same server at the moment.
My worry is that at night, when the site has the most visitors, the number of processes gets 'cut', as it seems to reach a limit somewhere. Below is a Cacti graph of load and processes. Please ignore the load peaks; they are nothing more than my usual 'testing' periods for new functions.
[url]
This seems to be governed by MinSpareServers/MaxSpareServers, which I've played around with, but as I said, I could never find the sweet spot. The number of processes has always been controlled by these values; before I increased them a couple of days ago, it topped out around 600 processes. Now, as you can see on the latest graph, it's around 800.
I've tried the default values, but the site was too slow with those. I've since ended up with what I think are insane values, but they keep the server ticking at a nice pace.
So I'm asking you guys for help, as I've exhausted my trial-and-error sessions. I have no clue what 'OK' values are for the number of running processes, and I don't feel confident increasing the values further.
here is my apache config.
Timeout 60
KeepAlive On
MaxKeepAliveRequests 3500
KeepAliveTimeout 2
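A common way to stop guessing at the spare-server values is to size from memory: MaxClients is roughly free RAM divided by the average Apache process size. A prefork sketch with example numbers, to be scaled to your measurements:

```
# prefork sizing sketch -- MaxClients ~= available RAM / avg process RSS
<IfModule prefork.c>
    StartServers         50
    MinSpareServers      25
    MaxSpareServers      75
    ServerLimit          1024
    MaxClients           1024   # only if RAM actually allows this many
    MaxRequestsPerChild  10000
</IfModule>
```

Measure the per-process resident size under load (e.g. with ps or top), and treat the point where processes plateau in Cacti as your MaxClients ceiling being hit.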
I have noticed that if I enable KeepAlive, performance is much better. However, Apache then freezes a few times a day for no apparent reason, so I have to turn KeepAlive off. Any idea why, and how to fix it? This is the configuration:
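One common cause of this freeze pattern: with KeepAlive on, idle connections hold worker slots, and once every slot is pinned Apache appears hung until timeouts free them. A sketch of settings that usually keep the benefit without the lock-up (values are examples):

```
# keep the KeepAlive win, but release idle workers fast
KeepAlive            On
KeepAliveTimeout     2     # drop idle connections quickly
MaxKeepAliveRequests 100
# also verify MaxClients/ServerLimit are high enough that idle
# keep-alive connections can't exhaust every worker slot
```

If the freezes continue, a server-status snapshot taken during one will show whether all workers really are stuck in the keep-alive ("K") state.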
There are some great tools to optimize MySQL, however I have not been able to find the equivalent for Apache + PHP. I have a lot of experience tweaking these two in extreme traffic environments, but there is no beating a program that can simultaneously evaluate several inputs over time to calculate optimal settings.