Speed Up Serving Large Files

Nov 9, 2009

I'm working on a web site which will basically be a Flash games portal. I have a dedicated server running Apache 2 on a 100 Mbit dedicated line, but my download speed for large files (Flash files of over 5 MB) is really slow. I'm thinking this is because of Apache, but I don't know much about this. I've read that I should switch to a lighter HTTP server for serving static files. The way my server is set up, I have 2 virtual machines running: one doing the PHP processing and the other serving static files, both running Apache, so if I have to change the HTTP server for the static files it would be very easy. Although I am not sure if this is necessary, or if I can tune Apache to push files faster than this.
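Before switching servers, a few stock Apache 2 directives are worth checking. A hedged sketch of settings that commonly help with large static downloads; the directive names are standard Apache 2 core directives, but the values are only illustrative starting points:

```apache
# Illustrative Apache 2 tuning for large static downloads.
EnableSendfile On          # let the kernel stream files instead of copying through Apache
KeepAlive On               # reuse connections between requests
MaxKeepAliveRequests 100
SendBufferSize 65536       # larger socket send buffer for a high-bandwidth link
```

If throughput is still poor after this, the bottleneck is more likely disk I/O or the network path than the HTTP server itself.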

View 8 Replies



Serving & Playing Breeze FIles

May 15, 2007

Having trouble hosting Macromedia Breeze files on a brand new dedicated 2003 NT box.

Anything I need to adjust?

When someone visits the URL, it loads the presentation main screen, but it's supposed to auto-play several movies and it won't. It works fine locally and on 2 unix servers I tested it on, but not the new NT box.

The weird thing is, if I remote desktop to that server, I can't play that presentation locally on the server either, though it plays fine served through the Linux boxes. I figured it was an extensions thing, but like I said the intro still does appear, and if I go to Adobe.com and play a new "Adobe Presenter" presentation, it works fine.

View 1 Replies View Related

What Type Of Server And Os, Etc For Just File Serving- Small Files Like Under 10kb

Sep 26, 2008

I have a website that just serves small files, under 10kb most of them. I just need a server that lets me ftp the file to it, set up subdomains and domains for one website. Don't need to manage mysql or anything. Not even php. Just serve files.

A good fast OS? Something like lighttpd? Ioono?

I'm currently doing 600gb of bandwidth per month. I'm expecting to do about 1000gb by the end of the year. Would a small server like a pentium 4 be able to handle just serving files?
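If lighttpd turns out to be the choice, the stock configuration is nearly enough for this workload; a minimal, hedged sketch (paths and values are illustrative, not tuned):

```
# minimal lighttpd.conf sketch for serving small static files
server.document-root = "/var/www/files"
server.port          = 80

# gzip-compressing sub-10kb text files can cut bandwidth noticeably
server.modules      += ( "mod_compress" )
compress.cache-dir   = "/var/cache/lighttpd/"
compress.filetype    = ( "text/plain", "text/html" )
```

For files this small, CPU is rarely the limit; a Pentium 4 class box will usually saturate its network link long before it runs out of processor.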

View 14 Replies View Related

Mod_Security Log Files Too Large

Nov 7, 2008

Linux Fedora 6, Apache 2 with Mod Security, MySQL.

Our mod_sec logs get incredibly large very quickly. In the configuration for mod_security, we have specified logging options as
SecAuditEngine RelevantOnly
SecAuditLogRelevantStatus "^[45]"

but the mod_sec.log gets to almost 10 GB (in a matter of 5-6 days) before it is truncated to mod_sec.log.1 and a new one is created.

Is there a way we can specify that a max size of one log file is 1 GB, for example?
Is there a way we can specify that the max size of one log file is 1 GB, for example?
Or another question: how come it gets so huge so quickly? We thought that logging "RelevantOnly" would record only requests that are deemed security risks.
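Two things are worth noting here. First, `^[45]` matches every 4xx and 5xx response, so routine 404s alone can explain the volume. Second, mod_security has no built-in size cap for its serial audit log, but logrotate can enforce one; a hedged sketch, assuming the log path matches your SecAuditLog setting:

```
# /etc/logrotate.d/mod_sec -- illustrative; path and service name are assumptions
/var/log/httpd/mod_sec.log {
    size 1G          # rotate as soon as the file exceeds 1 GB
    rotate 5
    compress
    missingok
    postrotate
        /sbin/service httpd reload > /dev/null 2>&1 || true
    endscript
}
```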

View 2 Replies View Related

Hosting Large Flv Files

Oct 6, 2008

I have a customer who wants to sell access to videos of conferences he runs.

Each flv vid is approx 1 - 1 1/2 hours long, approx 380MB each, and there will be about 12 videos per conference.

approx 4 - 8 conferences per year.

My customer suggests 10 - 20 people will buy access to watch each video.

Access to watch the videos will be through a password protected webpage.

issue - the current site hosting company only allow uploads up to 150MB per file.

Can I host the flash videos elsewhere and deliver them through the password protected web page without anyone else being able to see them via server they are hosted on?

This would also reduce the bandwidth going through his current site server.

View 14 Replies View Related

Locate Large Files

Jul 7, 2008

I am trying to locate which large files are filling up / on the server, but I am having trouble using the find command to do this.
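The usual incantation is find with a -size filter and -xdev so it stays on the root filesystem; a sketch (the 100M threshold is arbitrary):

```shell
# List files over 100 MB on the root filesystem only (-xdev skips other
# mounts), with sizes, largest last. Run as root to avoid permission noise.
find / -xdev -type f -size +100M -exec du -h {} + 2>/dev/null | sort -h
```

From there, deleting or compressing the top few entries usually frees most of the space.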

View 1 Replies View Related

FTP :: Stop Uploading Large Files

Jul 17, 2008

I'm facing a very strange FTP issue with one of my shared-hosting accounts. All of my other servers are having no problems, but on this one, when I try to upload a file (whatever file) larger than 500kb from my local PCs, in most cases the file will stop uploading during the process and hang there until it times out.

There are 2 interesting things though:
The file transmission typically hangs when approximately 248kb of the file have been transferred; please see the attached screenshot for an example.

What I mean is that, for example, if I randomly pick a file and attempt to upload it to my host 10 times: 5 times it will hang when 248kb of the total size have been transferred, 3 times it will hang at other points *near* 248kb (224kb or 280kb typically), 1 time it will hang at another random point, and 1 time it might be uploaded successfully (yes, there is still a tiny chance for the file to be uploaded successfully).

My normal internet upload speed is 80kb/s-100kb/s. Lately I found that when I limit the upload speed in my FTP client (e.g. max. 30kb/s), everything WILL WORK without any problem! No hangs, no interruptions. Whereas when I remove the upload speed limit and let it upload at my regular speed, the problem appears again.

It seems to me that the FTP hangs only when the uploading speed is higher than 60kb/s. However my host provider told me that they have customers uploading without any problem at over 400kb/s, and they said "there's no problem or limitations on the server at all".

Up until now, I have done the following to troubleshoot the issue, but with no luck:

Contacted my host.
Disabled/enabled PASV mode in my FTP client.
Tried different FTP clients on different computers (FlashFXP and FileZilla).
Rebooted my router and reset everything to the factory default settings.
Contacted my ISP about the issue; they "did something" but nothing helped.
Rebooted all my PCs.
Disabled both firewalls, on my PC and on the router.

Furthermore, I have asked a friend of mine in another city with another ISP to test the FTP uploading, but unfortunately he got the exact same problem. And I've done some searching on the internet for hours, but no one seems to have the same problem.

View 12 Replies View Related

VPS :: Large Files And Taking Too Much Space?

Nov 22, 2008

I just logged into my VPS and was astonished by how much space I have in use.
8.09GB... but I can't figure out what's using up so much space!?

How can I find out where the large files are located, since it's increasing daily?
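du summarizing each top-level directory is usually the quickest way to narrow this down, independent of the control panel; a sketch:

```shell
# Show per-directory usage one level below /, largest last, then drill
# into whichever directory dominates by re-running du on it.
du -xh --max-depth=1 / 2>/dev/null | sort -h
```

Common culprits on a VPS are log directories and mail spools, so /var is a good place to look first.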

I use LXAdmin with HyperVM Control Panel

View 10 Replies View Related

Multiple Large Core Files

May 20, 2007

Just noticed quite a few large core files within one of our websites (within a subfolder of public_html). Anyone know what these are and how they got there?
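Files named core (or core.<pid>) are crash dumps written by the kernel when a process, often PHP or a CGI, segfaults in that working directory. They are safe to delete, and future dumps can be disabled; a sketch, with the search path as an assumption:

```shell
# Find and remove core dumps under the web roots (path is illustrative).
find /home/*/public_html -type f -name 'core*' -delete

# Disable core dumps for this shell and anything it launches.
ulimit -c 0   # core file size limited to 0 blocks, i.e. no dumps
```

The repeated crashes that produced them are worth investigating separately, e.g. via the Apache error log.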

View 3 Replies View Related

Transfer Large Data Files

Jun 15, 2008

I have a Debian box, and have archived a gallery into a .tar file, 5.77gb.

I have a CentOS box, and have used wget to bring the data file over to the new server.

However, upon doing so it only detects it as 1.8gb when it starts downloading.

I have terminal access to both servers, just trying to bring my files over from one server to another.
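1.8 GB is suspiciously close to 5.77 GB modulo 4 GB, which suggests a 32-bit file-size field somewhere in the wget/FTP path (older clients and servers had this limit). Whatever the cause, it helps to verify the copy with checksums and resume rather than restart; a sketch, with the URL as a placeholder:

```shell
# On the source (Debian) box: record a checksum of the archive.
sha256sum gallery.tar > gallery.tar.sha256

# On the destination (CentOS) box: resume the partial download (-c picks
# up where the last attempt stopped), then verify against the checksum.
wget -c http://old-server.example/gallery.tar
sha256sum -c gallery.tar.sha256
```

If wget keeps misreporting the size, scp or rsync over the existing SSH access sidesteps the FTP path entirely.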

View 4 Replies View Related

Large Media Files & Backups

Jul 16, 2008

I've been using Lypha for the past 4 years, but they've taken the last straw (gigabytes of backups went missing and they won't reply to emails as to why).

Looking for a web hosting package for under $10/month with enough disk space/bandwidth to let me back up large audio/video files to it, alongside normal site operation (I use it for a portfolio website, as well as hosting additional domains).

View 17 Replies View Related

Large Video Files Need To Be Streamed

Mar 30, 2007

I am developing a web application for a private investigative firm. They do surveillance work and therefore have surveillance videos. I would like the capabilities of uploading the videos online and allowing the client to login and view their surveillance video online.

Currently, we get the video from the PI, put it on a DVD and then mail it to the client.

This takes too long. We want the client to be able to view the video online.

Some of these videos can be up to 2 hours long.

First, is this even possible?

Second,
- how much bandwidth would a website like this take?
- Is there a host that can hold hundreds of GB of video?

I want to convert it to flash to save file size and also so I can stream it.

View 3 Replies View Related

How To Move Large Files To New Host

Mar 21, 2007

I have some hundreds of MBs to move, and I'm definitely not doing it by transferring via my PC / FTP.

I've seen all the tutorials on how to move your MySQL databases, but what about full folders etc.? How do I move those (PuTTY?)?
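Yes, PuTTY/SSH is the tool: piping tar through ssh moves a whole folder tree server-to-server in one stream, with no intermediate archive and nothing passing through your PC. A sketch, with host and paths as placeholders:

```shell
# Run on the old server: tar the folder to stdout, untar on the new host.
tar czf - public_html | ssh user@newhost.example 'tar xzf - -C /home/user/'
```

This preserves permissions and timestamps; for very large trees, rsync over SSH adds resumability on top.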

View 1 Replies View Related

Optimizing Lighttpd For Large Files (180mb Avg)

May 9, 2008

I have four servers with a quad Xeon, 4gb ram, and 2x300GB SAS 15K RAID0 harddrives, pushing a total of 1.6gbits. It serves a lot of zip files with an average filesize of 180mb. My question is: how can I optimize lighttpd 1.4.19 to push its max with very low IO-wait? I've looked up some stuff and only found options that deal with lighttpd 1.5 and use Linux-AIO for the backend network. Currently I use writev with 16 workers and a read/write idle timeout of 10s. Logging is off, too.
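On 1.4.x without AIO, the main levers are the network backend, worker count, and making sure no throttles are set; a hedged sketch of 1.4-era options close to the setup described (values illustrative):

```
server.network-backend       = "writev"   # as currently used; "linux-sendfile" is worth A/B testing
server.max-worker            = 16
server.max-read-idle         = 10
server.max-write-idle        = 10
server.kbytes-per-second     = 0          # ensure no global throttle
connection.kbytes-per-second = 0          # ensure no per-connection throttle
```

With 180 MB average files on RAID0, IO-wait is mostly seek-bound; more RAM for page cache tends to help more than any lighttpd option.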

View 14 Replies View Related

Removing Large Files Or Folders CentOS

Jul 13, 2008

Something weird happening here. I have tried every string possible...

There are a number of folders I want to remove off my server, tried the good old and simple...

rm -r /folder/

And then I ended up with a string of prompts as long as my screen. No matter what I do, as it goes recursive into the directory it asks me if I want to remove each file individually. No matter what string or action I take, it insists on asking me as it goes to delete each file.

Could this be a configuration option in CentOS?
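It is, in effect: CentOS/RHEL ships root's shell with `alias rm='rm -i'` (set in root's ~/.bashrc), so every delete prompts interactively. Bypass the alias or force the removal; a sketch:

```shell
# Either bypass the alias for one command with a leading backslash...
\rm -rf /folder/

# ...or drop the alias for the session and force with -f.
unalias rm 2>/dev/null
rm -rf /folder/
```

Removing the alias line from ~/.bashrc makes the change permanent, at the cost of losing the safety prompt.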

View 3 Replies View Related

Daily Backup + Rsync Ssh + Large Number Of Files

Oct 29, 2006

I just want to know: is it safe to do a remote daily backup of about 70,000 files?

File sizes are about 200kb, and every day I have about 1000 new files. So rsync first has to check the old files, because I am deleting about 30-50 of them daily, and then back up the 1000 new files.
So how much time will each run take to compare those 70,000 files?

i have 2 option now:

1 - using a second hdd and RAID 1
2 - using rsync and backing up to my second server, so I can save about $70 each month.

View 9 Replies View Related

Writing Large Files - Risk Of Damaging Filesystem

Jan 19, 2008

Does writing large files (ie, 10GB backups in one archive) cause any risk of damaging a linux filesystem?

View 1 Replies View Related

How Do You Host Large Video Files And Support The Bandwidth

Feb 8, 2007

I've got a client who wants to host audio files... Here are the sizes:

50 x 75MBs
300 x 10MBs
400 x 5MBs

That totals 8750MBs, or 8.75GBs... If he gets hundreds of visitors, it could end up being thousands of GBs of bandwidth.

I don't know what to look for to support so much bandwidth... Do you buy bandwidth? Are there special companies out there that host it for you?

View 6 Replies View Related

Does These Speed Test Files From Providers Help You Decide

Jul 3, 2008

I am in Europe, and during this week I've been doing some speed tests with the files most providers have on their websites, or that they gave me the link to here.

I am using a DSL 25Mbps net connection for the tests.

Here are some results from just 3 providers:

FDC (Chicago) - 250k
iWEB (Canada) - 1.2MB
Take2Hosting (San Jose, California) - 1.4MB

My point is: are these speed test files of any use in getting an idea of what speed you can expect from the servers?

The way I see it, there are so many variables (shared lines and how many servers share a port, your location, peak-hours traffic, etc).

So how do these tests influence you to go for one provider instead of another?

View 6 Replies View Related

NFS Slow Speed When Moving Many Small Files

Jan 20, 2007

I have mounted an NFS partition, but while copying a big file the speed is OK, like 5-6MB/s, when I start copying everything else (small files) the speed is like 20-200KB/s. What is the reason, and is there a way to improve the speed, or another way to mount the drive remotely and preserve the same permissions after backup?
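Small-file copies over NFS are dominated by per-file round trips (open, attribute checks, close), so raw link speed barely matters. The usual levers are larger read/write sizes and async writes; a hedged /etc/fstab sketch, with server and paths as placeholders:

```
# /etc/fstab -- illustrative NFS mount options for bulk copies
nfsserver:/export/backup  /mnt/backup  nfs  rw,async,rsize=32768,wsize=32768,noatime  0 0
```

An alternative that avoids the per-file chatter entirely is rsync or tar over SSH, both of which also preserve permissions.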

View 2 Replies View Related

Plesk 12.x / Linux :: Content-length Limit When Uploading Large Files

Jun 18, 2015

The domain has PHP settings in Plesk set to 2G, and I get this error when uploading a 48MB file using WordPress. I assume I need to modify this manually in a conf file somewhere to allow uploading large files?

Requested content-length of 48443338 is larger than the configured limit of 10240000..

mod_fcgid: error reading data, FastCGI server closed connection...
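The error log above points at mod_fcgid's own request-size cap, which is separate from the PHP limits set in Plesk; the configured limit of 10240000 bytes is about 10 MB, which is why a 48MB upload fails. Raising it in the Apache fcgid config is the usual fix; a sketch, with the file path as an assumption (it varies by distribution):

```apache
# e.g. in /etc/httpd/conf.d/fcgid.conf -- allow larger request bodies
<IfModule mod_fcgid.c>
    FcgidMaxRequestLen 2147483647
</IfModule>
```

Apache needs a restart afterwards for the new limit to take effect.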

View 1 Replies View Related

Plesk 12.x / Linux :: 413 Request Entity Too Large - Can't Upload Backup Files With Backup Manager

Sep 17, 2014

I have a 6GB backup file created with another Plesk Backup Manager, and now I'm trying to upload this backup file to my Plesk Backup Manager, but after uploading 3% I get a "413 Request Entity Too Large" error. I tried disabling NGINX but still get this error.

How can I resolve this error, or is there any other way to upload my file to Backup Manager?

I see that Backup Manager has a file size restriction of 2GB; how can I increase this?
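413 is the web tier's request-body cap, which in a Plesk stack usually means nginx's client_max_body_size in front of Apache. A hedged sketch of the nginx side (the exact conf file location varies by Plesk version):

```nginx
# In the relevant nginx server/vhost configuration:
client_max_body_size 8g;   # allow request bodies up to 8 GB
```

For a 6GB file, though, uploading over SFTP/SCP directly into the Plesk backup directory and letting Backup Manager pick it up from there sidesteps the HTTP limits entirely.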

View 2 Replies View Related

Transfer Large Many Files From Server To Server

Oct 23, 2009

In reference to my previous post, I want to transfer across 7GB of data, approximately 80,000 files I believe it is (due to a gallery script).

It's currently on another host (on a webhosting account) which uses their own control panel, which has no options except managing databases; the only way I can see to do this is via FTP, but that would take me days. I've tried using compression and backup scripts, but the execution time limit on the host's server is too low to allow the files to be zipped. Are there any other ways? Can I log in to my VPS via SSH and somehow pull the files off the other host's server?

View 6 Replies View Related

IIS Just Stopped Serving..

Aug 2, 2008

I have been hosting a site for the past number of weeks on IIS on Server 2003. I have a No-IP account which is working fine, along with port forwarding on my router.

The other day I restarted my server for something, but now I can't seem to access my site from across the internet, and according to downforeveryoneorjustme the website is down. I can, however, access the site using the domain name from within my LAN.

I'm not too hot on Server 2003 or IIS. What can I do to debug/diagnose the problem?

View 2 Replies View Related

10 Second Lag Serving Static HTML

Mar 21, 2007

this isn't my server, so I don't have a lot of information about it, other than it's a Linux/Apache Dedicated server at EV1. (cPanel/WHM)

There are other sites on the server, and they are running fine.

One site has a terrible lag. It takes about 10 seconds to serve up a static HTML file.

Now, it's not like the server is slow. It's like this: You request a small HTML file. The site sits and thinks about it for about 10 seconds, and then after that everything processes quickly.

The forum on the site is the same way. Everything you click on works fine and loads quickly after that initial delay passes.

Is there a common configuration problem that might be causing this?

If you want to see this phenomenon, here are a few test files:
[url]
[url]
[url]
[url]

View 7 Replies View Related

Servint - Serving Porn On My Sites

Jul 7, 2008

What is up with servint?

I have domains and a vps through them.

The VPS is now serving up porn on all the sites.

When I call ServInt they say I don't own the name, but a whois shows I DO INDEED own the name for another year; it is registered to me and they are the registrar!!!

The portal user/pass suddenly stopped working, so I can't even put in a support ticket.

All I get is an answering machine... does anyone know if ServInt staff ever come on this board?

It has probably been hacked, but it is pretty disturbing when they say I don't own the domain when it is CLEAR that I do own it.

View 14 Replies View Related

Serving Data From 2 Hard Disks

Mar 1, 2009

I have a server with 2 hard drives, say drive A and drive B. Right now all my files, database and data is on drive A, and drive B is empty. Since I have another drive available, I want to split the load between the two drives. I'm ok with having the web pages and the database on one drive. I mostly want to just have the data (I have about 500GB of data) split between the two drives. Note that I want to avoid duplicating the data. I want to have each file on either drive A XOR drive B.

Should I map a separate subdomain to drive B and then use that subdomain to serve the half of the data that's there? Is there something I can do with hard/soft links on the server so that even though the data is on 2 drives, users still use the same URL to access data on either drive? Any other options?
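Symlinks are the simplest route that keeps one URL space: mount drive B outside the web root and link part of the data back into place, so the web server follows the link transparently (Apache needs Options FollowSymLinks for the directory). A sketch with placeholder paths:

```shell
# Move half the data onto drive B, then link it back into the web tree so
# /var/www/html/data2 is served transparently from the second disk.
mv /var/www/html/data2 /mnt/driveB/data2
ln -s /mnt/driveB/data2 /var/www/html/data2
```

Hard links won't work here, since they cannot cross filesystems; symlinks can.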

View 6 Replies View Related

Media Serving And Cache Daemon

Dec 11, 2008

What I want to do: have a "node" somewhere serve media (static) files from a central server, but cache the static files the first time they are hit, so subsequent requests to the "node" don't require fetching the file from the central server.

Is there a readily available solution to this?
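Yes, this is exactly what a caching reverse proxy does: nginx with proxy_cache (or Squid/Varnish) on the node, pointed at the central server. A hedged nginx sketch; hostnames, sizes, and paths are all placeholders:

```nginx
# Goes in the http{} context of the node's nginx config.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=media:50m
                 max_size=20g inactive=30d;

server {
    listen 80;
    location / {
        proxy_pass        http://central.example;   # the central media server
        proxy_cache       media;
        proxy_cache_valid 200 30d;   # keep successful responses for 30 days
    }
}
```

The first request for each file is fetched from the origin and stored; every later hit is served from the node's local disk.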

View 1 Replies View Related

How To Tweak Linux For File Serving

Sep 14, 2007

I have a few servers that just serve files (1MB-100MB).

SUSE Linux on all.

The servers have minimal apps installed, and I already got a large performance increase by dumping Apache.

Now I'm looking at tweaking at the OS level.

Any settings in Linux itself to speed up sending files down the pipe?

net.ipv4.tcp... in /etc/sysctl.conf?
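Yes, those are the right knobs: the TCP buffer sizes are the usual starting point for bulk file serving. A hedged /etc/sysctl.conf sketch with values commonly suggested for this workload; tune to your RAM and link, then load with `sysctl -p`:

```
# /etc/sysctl.conf -- illustrative TCP tuning for bulk file transfers
net.core.rmem_max = 16777216
net.core.wmem_max = 16777216
net.ipv4.tcp_rmem = 4096 87380 16777216
net.ipv4.tcp_wmem = 4096 65536 16777216
net.ipv4.tcp_window_scaling = 1
```

Larger buffers mainly help high-bandwidth, high-latency clients; for local-ish clients the defaults are often already fine.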

View 0 Replies View Related

Static File Serving - Performance

Jun 24, 2008

I'm planning to setup a server ONLY for hosting of static binary files varying from few KB to few MB in size.

I've seen some of the litespeedtech performance benchmarks, which you can find here: [url]

From the "small static file" benchmark chart, I can see that IIS6 beats lighttpd in this test.

So I'm wondering whether IIS6 really has better performance at file hosting than lighttpd.

Actually, it does not matter which operating system I will be using on this server, since I will use it only for file serving, with lots of concurrent connections. Possibly thousands of connections.

I need some feedback on this, so I can decide: IIS or lighttpd.

A few more bucks for Win2k3 won't be an issue here, if its performance is better than lighttpd's for this kind of use.

View 11 Replies View Related

Preferred Operating System For Web Serving

May 13, 2007

I have been online since 95 and I've got a lot of information I could help others with regarding web development and interface design.

One thing I'm terrible at is System Administration, but I'm getting better and my first step is to pick a reliable operating system for both the webservers and the database servers. Would anyone be willing to answer the following questions or point me to a thread that already discusses these?

A little background, our site has 200k members, 30k active and is growing by about 5k a week. We still run MySQL 4.0.27 on the DB servers and they are running FC2. The webservers are Apache 1.3.37 and PHP 4.4.2 running FC2 and FC4 as well.

My questions

1. What OS would you prefer to upgrade to for the web servers? and a
"few" reasons why?

2. What OS would you prefer to upgrade to for the database servers?
and a "few" reasons why?

View 14 Replies View Related







Copyrights 2005-15 www.BigResource.com, All rights reserved