Optimizing Lighttpd For Large Files (180MB Avg)
May 9, 2008
I have four servers, each with a quad-core Xeon, 4GB RAM, and 2x300GB SAS 15K drives in RAID0, pushing a total of 1.6Gbit/s. They serve a lot of zip files with an average filesize of 180MB. My question is: how can I optimize lighttpd 1.4.19 to push its maximum with very low IO-wait? I've looked up some stuff and only found options that deal with lighttpd 1.5 and use Linux-AIO for the network backend. Currently I use writev with 16 workers and a read/write idle timeout of 10s. Logging is off, too.
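One possible starting point for 1.4.x (a sketch only; the exact values are guesses to tune from, not measured figures):

```
server.network-backend = "linux-sendfile"  # zero-copy sendfile; usually cheaper than writev for large static files
server.max-worker      = 4                 # one worker per core is a common starting point
server.stat-cache-engine = "simple"        # cache stat() results between requests
server.max-keep-alive-requests = 4         # long downloads rarely benefit from long keep-alive
```

With files this large, most of the win comes from the sendfile backend and from the RAID readahead; the worker count matters less than on small-file workloads.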
View 14 Replies
Nov 2, 2009
I optimized a MySQL table of 2 million records and about 500MB; it took about 15 minutes. However, on the same DB I now have another huge table of 88 million records; it is 2.2GB in size and has about 30MB of overhead to optimize. My questions:
1. How can I speed up the optimization process so it takes as little time as possible? Any tweaks to my.cnf?
2. Should I repair it using phpMyAdmin or just from the shell?
3. Should I stop HTTP traffic during this optimization?
This is a dedicated MySQL database server that handles a large VB forum with 5-8 users online on average:
Code:
Intel Xeon 3.2 x4 procs, 32 bits, 4 GB ram
/etc/my.cnf
Code:
[mysqld]
datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
skip-locking
skip-innodb
skip-bdb
query_cache_limit=1M
query_cache_size=48M
query_cache_type=1
max_connections=1200
interactive_timeout=100
wait_timeout=300
connect_timeout=10
thread_cache_size=128
key_buffer=48M
join_buffer=8M
max_allowed_packet=16M
table_cache=2036
sort_buffer_size=1M
read_buffer_size=1M
read_rnd_buffer_size=2M
max_connect_errors=10
# Try number of CPU's*2 for thread_concurrency
thread_concurrency=4
myisam_sort_buffer_size=64M
# Add
max_heap_table_size = 48M
tmp_table_size = 48M
low_priority_updates=1
concurrent_insert=2
[mysqldump]
quick
max_allowed_packet=16M
[mysql.server]
user=mysql
#basedir=/var/lib
[mysqld_safe]
err-log=/var/log/mysqld.log
pid-file=/var/run/mysqld/mysqld.pid
log-slow-queries=/var/log/slow-queries.log
On the other side, I have the same hardware for the webserver.
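On question 2: running the optimization from the shell avoids phpMyAdmin's PHP execution-time limit, which matters on a table this size. A sketch (database and table names below are placeholders):

```shell
# -o is --optimize: equivalent to OPTIMIZE TABLE, but scriptable/cron-able.
# On MyISAM this rewrites the table, so expect it to be locked for the
# duration -- which also suggests an answer to question 3 for write traffic.
mysqlcheck -o forum_db posts -u root -p
```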
View 2 Replies
Apr 14, 2009
So I've recently ordered a Supermicro 4U server with 24x1TB HDs and 64GB RAM, set up in RAID 10. I'm running Debian 5.0 and have installed lighttpd. All the content I serve is video files (AVI, MP4, MKV, OGM), each about 100-500MB in size. I'm wondering how I can optimize lighttpd to get the best performance out of it. I look forward to your replies.
View 14 Replies
Nov 7, 2008
Linux Fedora 6, Apache 2 with Mod Security, MySQL.
Our mod_sec logs get incredibly large very quickly. In the configuration for mod_security, we have specified logging options as
SecAuditEngine RelevantOnly
SecAuditLogRelevantStatus "^[45]"
but the mod_sec.log gets to almost 10 GB (in a matter of 5-6 days) before it is truncated to mod_sec.log.1 and a new one is created.
Is there a way we can specify that a max size of one log file is 1 GB, for example?
Or another question: how come it gets so huge so quickly? We thought that logging "RelevantOnly" would record only requests that are deemed security risks.
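On the growth: `"^[45]"` matches every 4xx and 5xx response, so even plain 404s likely get a full audit entry, which adds up fast on a busy site. For the size cap, mod_security itself has no max-size directive, but logrotate can rotate on size; a sketch (the log and apachectl paths are assumptions for this setup):

```
/usr/local/apache/logs/mod_sec.log {
    size 1024M          # rotate as soon as the file exceeds ~1GB
    rotate 5
    compress
    missingok
    postrotate
        /usr/local/apache/bin/apachectl graceful > /dev/null
    endscript
}
```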
View 2 Replies
Oct 6, 2008
I have a customer who wants to sell access to videos of conferences he runs.
Each FLV vid is approx 1 to 1.5 hours long, approx 380MB each, and there will be about 12 videos per conference.
Approx 4-8 conferences per year.
My customer suggests 10 - 20 people will buy access to watch each video.
Access to watch the videos will be through a password protected webpage.
The issue: the current site's hosting company only allows uploads of up to 150MB per file.
Can I host the flash videos elsewhere and deliver them through the password-protected web page, without anyone else being able to see them via the server they are hosted on?
This would also reduce the bandwidth going through his current site server.
View 14 Replies
Jul 7, 2008
I am trying to locate which large files are filling up / on the server, but I am having trouble using the find command to do this.
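A common invocation for this (a sketch; the 100MB threshold is arbitrary):

```shell
# Files over 100MB on the root filesystem only; -xdev stops find from
# descending into other mounted filesystems (/proc, other partitions, etc.).
find / -xdev -type f -size +100M -exec ls -lh {} \; 2>/dev/null
```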
View 1 Replies
Nov 9, 2009
I'm working on a web site which will basically be a flash games portal. I have a dedicated server running Apache 2 on a 100Mbit dedicated line, but my download speed for large files (flash files of over 5MB) is really slow. I suspect this is because of Apache, but I don't know much about this. I've read that I should switch to a lighter HTTP server for serving static files. My server is set up with 2 virtual machines, one doing the PHP processing and the other serving static files, both running Apache, so if I have to change HTTP server for the static files it would be very easy. Although I am not sure if this is necessary, or if I can tune Apache to push files faster than this.
View 8 Replies
Jul 17, 2008
I'm facing a very strange FTP issue with one of my shared-hosting accounts. All of my other servers have no problems; only on this one, when I try to upload a file (any file) larger than 500KB from my local PCs, in most cases the file stops uploading partway through and hangs there until it times out.
There are 2 interesting things though:
The file transmission typically hangs when approximately 248KB of the file has been transferred; please see the attached screenshot for an example.
To be precise: if I randomly pick a file and attempt to upload it to my host 10 times, roughly 5 times it will hang once 248KB of the total size has been transferred, 3 times it will hang at other points *near* 248KB (typically 224KB or 280KB), 1 time it will hang at some other random point, and 1 time it might upload successfully (yes, there is still a tiny chance for the upload to succeed).
My normal internet upload speed is 80-100KB/s. Lately I found that when I limit the upload speed in my FTP client (e.g. to a max of 30KB/s), everything WILL work without any problem! No hangs, no interruptions. Whereas when I remove the limit and upload at my regular speed, the problem appears again.
It seems to me that the FTP hangs only when the uploading speed is higher than 60kb/s. However my host provider told me that they have customers uploading without any problem at over 400kb/s, and they said "there's no problem or limitations on the server at all".
Up until now, I have done following things to troubleshoot the issue but with no luck:
Contacted my host.
Disabled/Enabled the PASV mode on my FTP client.
Tried different FTP clients on different computers (FlashFXP and Filezilla).
Rebooted my router and reset everything to the factory default settings.
Contacted my ISP about the issue; they "did something", but nothing helped.
Rebooted all my PCs.
Disabled both firewalls on my PC and on the router.
Furthermore, I have asked another friend of mine in another city with another ISP to test the FTP uploading, but unfortunately he got the exact same problem. And I've done some search on the internet for hours but no one seemed to have the same problem..
View 12 Replies
Nov 22, 2008
I just logged into my VPS and was astonished by how much space I have in use.
8.09GB... but I can't figure out what's using up so much space!?
How can I find out where the large files are located, since it's increasing daily?
I use LXAdmin with HyperVM Control Panel
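From an SSH session, du can narrow this down directory by directory; for example:

```shell
# Disk usage per directory, two levels deep, largest first
du -h --max-depth=2 / 2>/dev/null | sort -rh | head -20
```

Common culprits on a VPS are logs under /var/log and backups kept by the control panel.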
View 10 Replies
May 20, 2007
Just noticed quite a few large Core. files within one of our websites (within a subfolder of public_html). Anyone know what these are and how they got there?
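They are most likely core dumps left behind by a crashing process (often a CGI/PHP script). The `file` command will say which binary crashed, and ulimit can stop new dumps (the filename below is an example, not one of the actual files):

```shell
# Show which binary produced the dump (hypothetical filename --
# substitute one of the actual Core.* files)
file core.12345

# Disable core dumps for processes started from this shell
ulimit -c 0
```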
View 3 Replies
Jun 15, 2008
I have a Debian box, and have archived a gallery into a .tar file, 5.77GB.
I have a CentOS box, and have used wget to bring the archive over to the new server.
However, when the download starts, wget only detects the file as 1.8GB.
I have terminal access to both servers, just trying to bring my files over from one server to another.
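The 1.8GB figure looks like a 32-bit size counter wrapping somewhere in the chain (5.77GB minus 4GB is roughly 1.77GB), so the size report may be wrong even if the transfer itself is fine. With shell access on both ends, resuming or switching tools is worth a try; a sketch (host and paths are placeholders):

```shell
# Resume the interrupted download where it stopped (-c = continue)
wget -c http://old-server.example.com/gallery.tar

# Or pull it over SSH with resume and progress (-P = --partial --progress)
rsync -avP user@old-server.example.com:/path/gallery.tar /home/user/
```

Afterwards, compare checksums (e.g. md5sum) on both boxes to confirm the archive arrived intact.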
View 4 Replies
Jul 16, 2008
I've been using Lypha for the past 4 years, but this was the last straw (gigabytes of backups went missing and they won't reply to emails as to why).
Looking for a web hosting package for under $10/month that has large enough disk-space/bandwidth to allow me to backup large audio / video files to it, as well as the normal site operation (I use it for portfolio website, as well as hosting additional domains)
View 17 Replies
Mar 30, 2007
I am developing a web application for a private investigative firm. They do surveillance work and therefore have surveillance videos. I would like the capabilities of uploading the videos online and allowing the client to login and view their surveillance video online.
Currently, we get the video from the PI, put it on a DVD and then mail it to the client.
This takes too long. We want the client to be able to view the video online.
Some of these videos can be up to 2 hours long.
First, is this even possible?
Second,
- how much bandwidth would a website like this take?
- Is there a host that can hold hundreds of GB of video?
I want to convert it to flash to save file size and also so I can stream it.
View 3 Replies
Mar 21, 2007
I have some hundreds of MBs to move, and I'm definitely not doing it by transferring via my PC / FTP.
I've seen all the tutorials on how to move your MySQL databases, but what about full folders etc.? How do I move those (PuTTY?)?
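With shell access on both servers, a whole folder can be streamed over SSH in one go, with no intermediate archive on disk; a sketch (host and paths are placeholders):

```shell
# Pack on the old server, unpack on the new one, in a single pipeline
tar czf - /home/user/folder | ssh user@newserver.example.com 'tar xzf - -C /destination/'
```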
View 1 Replies
Jul 13, 2008
Something weird is happening here. I have tried every command possible...
There are a number of folders I want to remove from my server, so I tried the good old and simple...
rm -r /folder/
And ended up with a screen full of prompts. No matter what I do, as it goes recursively into the directory it asks me whether I want to remove each file individually. No matter what command or action I take, it insists on asking as it goes to delete each file.
Could this be a configuration option in CentOS?
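This is almost certainly not a CentOS kernel setting but the distribution's default shell alias: root's ~/.bashrc ships with `alias rm='rm -i'`, which forces the per-file prompt. For example:

```shell
alias rm          # shows rm='rm -i' if the alias is the culprit
\rm -r /folder/   # a leading backslash bypasses the alias for one command
rm -rf /folder/   # or add -f, which overrides -i
```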
View 3 Replies
Oct 29, 2006
I just want to know: is it safe to do a remote daily backup of about 70,000 files?
File sizes are about 200KB, and every day I have about 1000 new files. rsync would first have to check the old files, because I delete about 30-50 of them daily, and then back up the 1000 new files.
So how long will it take each time to compare those 70,000 files?
I have 2 options now:
1. Using a second HDD and RAID 1.
2. Using rsync and backing up to my second server, so I can save about $70 each month.
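On the rsync option: the comparison pass is metadata-only (size and mtime by default), so scanning 70,000 small files typically takes seconds to a few minutes over a decent link, not a re-read of all the data. A sketch of such a nightly job (paths and host are placeholders):

```shell
# Only new/changed files are transferred; --delete propagates the
# 30-50 daily deletions to the backup side.
rsync -az --delete /var/files/ backup@second-server.example.com:/backups/files/
```

Note that RAID 1 and a remote backup protect against different failures (disk death vs. accidental deletion or server loss), so they are not strictly interchangeable.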
View 9 Replies
Jan 19, 2008
Does writing large files (i.e., 10GB backups in one archive) carry any risk of damaging a Linux filesystem?
View 1 Replies
Feb 8, 2007
I've got a client who wants to host audio files... Here are the sizes:
50 x 75MBs
300 x 10MBs
400 x 5MBs
That totals 8750MB or 8.75GB... If he gets hundreds of visitors, it could end up being thousands of GBs of bandwidth.
I don't know what to look for to support so much bandwidth... Do you buy bandwidth? Are there special companies out there that host it for you?
View 6 Replies
Jun 18, 2015
Domain has PHP settings in Plesk set to 2G and I get this error when uploading a 48MB file using WordPress. I assume I need to modify this manually in a conf file somewhere to allow uploading large files?
Requested content-length of 48443338 is larger than the configured limit of 10240000..
mod_fcgid: error reading data, FastCGI server closed connection...
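The 10240000-byte limit in that log line is mod_fcgid's request-size cap, not a PHP setting, which would explain why the 2G Plesk PHP values don't help. A sketch of the relevant directive (placed in the domain's Apache configuration; the value here is an example):

```
FcgidMaxRequestLen 268435456   # ~256MB, in bytes; older mod_fcgid spells it MaxRequestLen
```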
View 1 Replies
Sep 17, 2014
I have a 6GB backup file created with another Plesk Backup Manager. Now I'm trying to upload this backup file to my Plesk Backup Manager, but after uploading 3% I get a "413 Request Entity Too Large" error. I tried disabling nginx but still get the error.
How can I resolve this error, or is there another way to upload my file to Backup Manager?
I see that Backup Manager has a file size restriction of 2GB; how can I increase this?
View 2 Replies
Oct 23, 2009
In reference to my previous post, I want to transfer across 7GB of data, approximately 80,000 files I believe (due to a gallery script).
It's currently on another host (a webhosting account) which uses its own control panel with no options except managing databases. The only way I can see to do this is via FTP, but that would take me days. I've tried compression and backup scripts, but the execution time limit on the host's server is too low to allow the files to be zipped. Are there any other ways? Can I log in to my VPS via SSH and somehow pull the files off the other host's server?
View 6 Replies
Feb 13, 2007
I have a 512MB cPanel VPS and would like to try and save a bit more RAM. I have made the changes below to httpd.conf and have also turned off spamd, entropychat, melange, and mailman; the only webstats package I run is AWStats.
Code:
MinSpareServers 2
MaxSpareServers 5
StartServers 3
Can anyone recommend any other changes to save some more RAM?
For anyone wanting to know, the VPS is being used to serve my websites.
View 1 Replies
Apr 14, 2009
I want my VPS optimized to run at its best.
I also want MySQL on the VPS to run at its best.
Which company do you recommend for VPS and MySQL optimization?
View 1 Replies
Jun 27, 2008
I have a server facing 150+ requests/sec, using Apache 1.3.39. The current config is:
Timeout 15
KeepAlive Off
MaxKeepAliveRequests 10
KeepAliveTimeout 5
MinSpareServers 15
MaxSpareServers 20
StartServers 15
MaxClients 400
MaxRequestsPerChild 100000
I wonder if I can change anything to make the site faster? Currently it's very slow. It's a single Core 2 Duo box with 2GB RAM.
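At 150 req/s the config above forks constantly (only 20 spare servers, KeepAlive off). A sketch of directions to try — the numbers are starting points to measure against, not tuned values; note that in Apache 1.3, raising MaxClients above 256 requires recompiling with a larger HARD_SERVER_LIMIT:

```
KeepAlive On               # reuse connections instead of re-forking per request
MaxKeepAliveRequests 100
KeepAliveTimeout 2         # short, so idle clients don't pin workers
MinSpareServers 25
MaxSpareServers 50
StartServers 30
MaxClients 256             # bounded by RAM: roughly 2GB / per-child RSS
MaxRequestsPerChild 10000  # recycle children to contain memory growth
```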
View 14 Replies
Sep 6, 2008
A friend of mine has a VPS from GoDaddy; the site he is hosting on this VPS is quite slow and times out a lot.
What would be the best option? Move away from GoDaddy? Upgrade the plan? Or hire someone to optimize the server (Apache/MySQL)?
If you would recommend the third option, it would be great if you could let me know about companies that provide this service.
View 6 Replies
Aug 12, 2007
I run a popular forum with at least 1000-1200 users online at any given time, reaching 2000-2500 at peak times...
I am using Invision Power Board.
Problem:
The server seems to lose its connection to the MySQL server at peak times, with the server load going up to 30.
The server should be good enough for these types of loads...
Server Specs:
Code:
Dual AMD Opteron 248
4 GB RAM
CENTOS Enterprise 4.5 i686
Cpanel
The my.cnf file is as follows:
-----------------------------------
Code:
[mysqld]
datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
skip-locking
skip-innodb
long_query_time=4
key_buffer_size = 64M
query_cache_limit=32M
query_cache_size=256M
query_cache_type=1
max_connections=1024
max_user_connections=1024
interactive_timeout=20
wait_timeout=20
connect_timeout=6
thread_cache_size=256
key_buffer_size=64M
log-queries-not-using-indexes
join_buffer=8M
low_priority_updates=1
max_allowed_packet=16M
table_cache=2048
record_buffer=8M
sort_buffer_size=16M
read_buffer_size=4M
max_connect_errors=10
# Try number of CPU's*2 for thread_concurrency
thread_concurrency=4
myisam_sort_buffer_size=64M
[mysql.server]
user=mysql
basedir=/var/lib
[safe_mysqld]
err-log=/var/log/mysqld.log
#pid-file=/var/lib/mysql/mysql.pid
open_files_limit=8192
[mysqldump]
quick
max_allowed_packet=16M
[mysql]
no-auto-rehash
#safe-updates
[isamchk]
key_buffer=128M
sort_buffer=128M
read_buffer=32M
write_buffer=32M
[myisamchk]
key_buffer=128M
sort_buffer=128M
read_buffer=32M
write_buffer=32M
[mysqlhotcopy]
interactive-timeout
----------------------------------------------
Any changes that would bring down the server load?
Also, some entries from httpd.conf are as follows:
Code:
Timeout 30
KeepAlive Off
MinSpareServers 15
MaxSpareServers 40
StartServers 15
MaxClients 300
MaxRequestsPerChild 1000
View 14 Replies
May 26, 2008
I remember seeing a website/forum where you could post your server stats and httpd.conf settings and experts would give advice on the settings you should use.
Does anyone know the URL of this site, or of a similar one?
View 14 Replies
May 24, 2007
Are there some simple things that we can do, or perhaps request our server management company to do, to optimize our server? I'm not sure where to start or what to ask, so I would like to learn some more before speaking with the server management company. Oftentimes you need to know the right questions to ask.
View 3 Replies
Oct 22, 2009
MySQL seems to be taking a lot of load on my VPS; the configuration file is as follows:
[mysqld]
set-variable=local-infile=0
datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
user=mysql
# Default to using old password format for compatibility with mysql 3.x
# clients (those using the mysqlclient10 compatibility package).
old_passwords=1
skip-bdb
set-variable = innodb_buffer_pool_size=2M
set-variable = innodb_additional_mem_pool_size=500K
set-variable = innodb_log_buffer_size=500K
set-variable = innodb_thread_concurrency=2
[mysqld_safe]
log-error=/var/log/mysqld.log
pid-file=/var/run/mysqld/mysqld.pid
skip-bdb
set-variable = innodb_buffer_pool_size=2M
set-variable = innodb_additional_mem_pool_size=500K
set-variable = innodb_log_buffer_size=500K
set-variable = innodb_thread_concurrency=2
View 0 Replies
Jul 7, 2008
I run a Drupal-based flash arcade website with more than 5000 visitors per day. I'd like to optimize Apache & MySQL as best I can.
4GB RAM
Apache : 2.0.52
PHP : PHP 5.2.5
MySQL : 5.0.51
httpd.conf
KeepAlive Off
MaxKeepAliveRequests 50
KeepAliveTimeout 10
StartServers 16
MinSpareServers 8
MaxSpareServers 64
ServerLimit 1000
MaxClients 500
MaxRequestsPerChild 5000
ServerSignature On
my.cnf (separate box with 2GB RAM)
log-slow-queries = /var/log/mysql/mysql-slow.log
skip-external-locking
skip-locking
skip-innodb
skip-bdb
skip-name-resolve
max_connections = 800
max_user_connections = 800
key_buffer = 36M
key_buffer_size = 64M
myisam_sort_buffer_size = 64M
join_buffer_size = 2M
read_buffer_size = 2M
sort_buffer_size = 3M
thread_stack = 128K
table_cache = 1024
thread_cache_size = 286
interactive_timeout = 25
wait_timeout = 1800
connect_timeout = 10
max_allowed_packet = 16M
max_connect_errors = 999999
query_cache_limit = 8M
query_cache_size = 64M
query_cache_type = 1
tmp_table_size = 16M
old_passwords=1
user=mysql
quick
quote-names
max_allowed_packet = 16M
[isamchk]
key_buffer = 16M
View 9 Replies