Premature Data Download Disconnections From Server
May 6, 2007
Recently our network of servers all started to prematurely disconnect in the middle of data downloads. This happens with downloads of both pictures and files.
Eg: [url]
You'll notice that on the first try it only downloads around 5% of the images. Then you have to keep pressing F5 many times for the images to finish downloading.
Having attempted many solutions to correct this (so that it downloads data 100%), I thought I'd ask on the forums, since there's always someone here who knows more than the admins.
Would you please share any solutions that would solve this challenge? Or any guidance on what to adjust.
Server specs:
Linux 2.6 kernel (latest)
Apache 2.0
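Not a confirmed fix, but on an Apache 2.0 setup like this, the core timeout and keepalive settings are the first things worth ruling out; a sketch with illustrative values (adjust for your own load):

```apache
# httpd.conf -- illustrative values, not a known fix
Timeout 300               # seconds Apache waits on network I/O before dropping a connection
KeepAlive On
KeepAliveTimeout 15
MaxKeepAliveRequests 100
# If transfers still die mid-download, also check SendBufferSize and any
# firewall or proxy idle timeouts sitting between the server and clients.
```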
View 1 Replies
View Related
May 3, 2007
My server (a hosting server) constantly cuts out data downloads of files and pictures. Pictures usually load only about 10%, and then you need to keep pressing refresh to see the whole picture.
As for downloads, I tested with an 8 MB file. I downloaded it with an ordinary Firefox browser: the first time it claimed the download had finished at the 3 MB mark, the second time at the 1 MB mark.
I have no idea how to fix this. Any solutions as to what to change or adjust in order to prevent this from happening?
Additional notes:
The server only serves PHP/static pages. It's capped at 150 Mbit/s and constantly pushes against that limit, but I can't raise the cap right now due to financial concerns.
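One quick way to quantify the cut-offs is curl's write-out variables, which report how many bytes actually arrived on each attempt (the URL here is a placeholder; substitute a real file on the affected server):

```shell
# Fetch the test file a few times and log received bytes vs. HTTP status.
URL="http://example.com/test-8mb.bin"
for i in 1 2 3; do
  curl -s -o /dev/null \
       -w "attempt $i: %{size_download} bytes received, HTTP %{http_code}\n" \
       "$URL"
done
```

If the byte counts vary between attempts while the status stays 200, the connection is being cut mid-transfer rather than the request failing outright.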
View 5 Replies
View Related
Nov 12, 2013
Apache 2.2.22
I may be overcomplicating this, but a friend downloaded a largish video file via my very basic web server.
Trying to interpret the data, I am getting different figures. Which one is correct, or am I reading it wrongly?
carole [10/Nov/2013:17:55:21 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 2879608
carole [10/Nov/2013:17:56:07 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 1902255
carole [10/Nov/2013:18:11:41 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 206 357
carole [10/Nov/2013:18:11:49 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 206 5317289
carole [10/Nov/2013:18:16:38 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 1834235652
200 totals = 1839017515
File properties on Ubuntu 2.0 GB (1,971,322,880 bytes)
server-status
1-0319210/3/3_ 82.6457740190482630.01880.001880.00 ip72-197-68-158.sd.sd.cox.net3lanesNULL
The other_vhost_access file says 1839017515
Ubuntu file properties says 2.0 GB (1,971,322,880 bytes)
Apache server-status says 1880.00
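Summing the bytes-sent field (the last column) of those access-log lines shows exactly where the 1839017515 figure comes from; a minimal sketch using the entries pasted above:

```python
# Sum the bytes-sent field of the access-log lines above, grouped by status.
log_lines = [
    '"GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 2879608',
    '"GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 1902255',
    '"GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 206 357',
    '"GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 206 5317289',
    '"GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 1834235652',
]

def bytes_by_status(lines):
    totals = {}
    for line in lines:
        # Everything after the closing quote is: <status> <bytes sent>
        status, sent = line.rsplit('"', 1)[1].split()
        totals[status] = totals.get(status, 0) + int(sent)
    return totals

print(bytes_by_status(log_lines))
# 200 entries sum to 1839017515; 206 (partial-content) entries to 5317646
```

Worth noting: the log field is bytes actually sent over the wire, not the file's size on disk, so a 200 entry smaller than the 1,971,322,880-byte file is consistent with the client aborting or the player closing the stream early, not with a logging error.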
View 1 Replies
View Related
May 31, 2015
I've been working on a project that I am trying to utilize some of the new functions from PHP 5.5 so I changed the PHP version from 5.3 to 5.5 on this domain, but I am getting a 500 error. I've found this kb: [URL] .... which is my exact issue, but the resolution doesn't work for me.
All of the permissions are set right, all of the modules are compiled and registered correctly, but I'm still getting a 500 error.
From error_log:
Code:
[Sun May 31 20:32:31 2015] [warn] [client] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server
[Sun May 31 20:32:31 2015] [error] [client] Premature end of script headers: adduser-exec.php
# /usr/local/psa/admin/bin/php_handlers_control --list :
[Code] ....
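Since the error comes from mod_fcgid rather than PHP itself, its timeout and size limits are worth ruling out alongside the permissions; a sketch of the directives to review (values are illustrative, not a confirmed fix):

```apache
# fcgid.conf -- illustrative values
FcgidIOTimeout 120              # seconds to wait on I/O from the PHP FastCGI process
FcgidConnectTimeout 20          # seconds to wait when connecting to it
FcgidBusyTimeout 300            # how long a request may keep a process busy
FcgidMaxRequestLen 1073741824   # max request body mod_fcgid will accept
```

A too-low FcgidIOTimeout in particular produces exactly the "Connection reset by peer ... error reading data from FastCGI server" pair when a script runs longer than the limit.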
View 5 Replies
View Related
Apr 12, 2015
My WordPress installation in Plesk sometimes gives me:
[Sun Apr 12 14:26:08 2015] [warn] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server
[Sun Apr 12 14:26:08 2015] [error] Premature end of script headers: index.php
How can I fix this? It's very random ....
View 11 Replies
View Related
Oct 7, 2009
I've been trying in vain to find out how to download files from online sites directly to my server hosted with Siteground. Basically, I figure: why bother downloading to my local HDD when I back everything up on my online server anyway?
I know this classes as a server-to-server file transfer, but I want to download things like software, particularly Linux distros, and archive them directly on my server. I just can't think how to word it any other way. Normally I use my browser and click the download link, and it asks me to save to my local HDD; instead I'd like to save it to my backup server, bypassing my local computer entirely.
Currently Siteground gives me "unlimited" bandwidth and "unlimited" storage space (overselling, obviously), but now I want to use some of my space. My home upload speed is about 20 KB/s on average, which is why I'd rather download and save interesting software directly onto my backup server.
I've looked briefly at wget and curl, but they look too complicated. I'd just like to be able to FTP all my 'normal' downloads right onto my server. I'd use an intermediary service for this as well, as this backup is for non-private files only, if only I knew how to word the concept.
I just don't know how to search for the right tools for what I want to do.
I just want to add a "save to my server" option to my normal download dialog, for want of a better description.
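If the host enables shell access (shared hosts like Siteground generally do on request), this really is a one-liner with wget run on the server itself, so the file never touches the local machine. A minimal sketch (the URL and directory are placeholders):

```shell
# On the server, over SSH -- download straight into a directory on the host.
# -c resumes a partial download if the connection drops.
wget -c -P ~/archive/ "http://example.com/some-distro.iso"

# curl equivalent (-L follows redirects):
curl -L -o ~/archive/some-distro.iso "http://example.com/some-distro.iso"
```

The workflow then becomes: copy the download link from the browser, paste it into the SSH session, done; no browser "save to my server" button needed.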
View 6 Replies
View Related
Oct 19, 2009
I'm starting a file download site. I've done a lot of research and I'm currently planning ahead for growth and scaling. I'm looking to serve around 250-500 thousand 5 MB files a day.
I would like some input from people that KNOW what they are talking about, hopefully people that have hosted/ran similar sites.
The main question I need answered is: what will be the first bottleneck a single download server runs into when delivering the following?
File size = 10MB.
Number of downloads = As many 1.2mbit download streams as possible.
Example Server, lets say...
Intel 5405
8GB Ram
2x640GB RE3 Hardware Raid 1
1000mbit
Am I correct in assuming the bottleneck will be the HDs here? So would I be right in assuming this could handle around 200 concurrent downloads @ 1.2 Mbit (~240 Mbit)?
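The link-level arithmetic can be sketched like this (idealized: it assumes full utilization, and in practice random-read seek behavior on the RAID 1 pair is usually the binding constraint well before the NIC is):

```python
# Back-of-the-envelope capacity for the example server (network side only).
link_mbit = 1000          # NIC capacity
stream_mbit = 1.2         # per-download cap

max_streams_network = int(link_mbit / stream_mbit)
print(max_streams_network)        # 833 streams before the NIC saturates

concurrent = 200
print(concurrent * stream_mbit)   # 240.0 Mbit/s -- well under the 1 Gbit link
```

So at 200 concurrent 1.2 Mbit streams the network is barely a quarter used; whether the disks can feed 200 simultaneous readers from different file offsets is the real question.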
View 6 Replies
View Related
Sep 30, 2009
I have two shared hosting accounts, one with HostGator and the other with GoDaddy.
I've uploaded a file (.flv) to HostGator and the same file to GoDaddy.
Here are links to both:
HostGator flv
GoDaddy flv
Now I use a download manager ("Free Download Manager") to download the files.
The HostGator file downloads at 17-35 KB/s.
The GoDaddy file downloads at 200 KB/s.
I am using a 2 Mbps DSL connection.
Also, my HostGator cPanel loads slowly, and other files download very slowly too, 40 KB/s at most.
What could the issue be? I've contacted HostGator support and they say that everything is OK on their end.
I have to stream videos on HostGator, but the speed is too slow and it buffers a lot.
View 14 Replies
View Related
Apr 9, 2009
Which is the most important? CPU, Memory or others?
Which company provides the most cost-efficient download servers?
View 14 Replies
View Related
Jun 18, 2009
I'm running a remote-download site where users request a file from various file-sharing sites like Rapidshare and Megaupload, and it lets them stream and download it. A few days ago everything was normal, but since yesterday all .rar downloads come through with corrupt names and sizes, and when trying to extract them they turn out to be corrupted. Zip files are working fine, though.
View 1 Replies
View Related
Feb 3, 2008
I have a server with the following specs:
Intel Core2Duo 6550
2x2 GB DDR2 RAM
2x500GB 15,000 RPM SCSI drives
Fedora Core 7
It's on a FDC Servers 1gbit connection with a 100mbit guarantee, their largest plan. I've used this plan (though with another server) before to push 241mbit/sec so there should be plenty of bandwidth.
The downloads will be anywhere from a few megabytes to CD-sized, so there may be quite a few large files; mostly, though, files will be around 20-50 megabytes.
Basically I'd like a sort of rapidshare setup with two types of members, premium and non-premium.
Non-premium members should be limited to a certain speed across all connections but not necessarily limited to one connection only since Asia and Europe traffic have a hard time getting really good speeds without using several connections.
But I'd still like to be able to limit them at 4 connections or so, premium members should just be limited at some other value, that part shouldn't be the problem.
Currently I can limit the speed through the PHP script that checks whether a user is premium, but that limit only works per connection, not across all of a user's connections.
I'm looking for a software configuration setup, httpd etc. Hardware upgrade suggestions are also welcome for the future.
So far I'm thinking setting up two virtual hosts, one for premium and one for non-premium and then use mod_limitipconn.c. But maybe there's a better way without using apache?
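The two-vhost plus mod_limitipconn idea is workable; a sketch of the non-premium vhost (hostname, paths, and numbers are illustrative):

```apache
# Non-premium vhost -- cap simultaneous connections per client IP.
<VirtualHost *:80>
    ServerName downloads-free.example.com
    DocumentRoot /var/www/downloads
    <IfModule mod_limitipconn.c>
        <Location /files>
            MaxConnPerIP 4    # matches the 4-connection limit described above
        </Location>
    </IfModule>
</VirtualHost>
```

For the "without Apache" question: nginx's limit_conn and limit_rate directives, or lighttpd's mod_evasive, give the same per-IP connection caps with lower per-connection memory overhead, which matters when holding thousands of slow download streams open.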
View 0 Replies
View Related
May 25, 2008
When using a download manager to download files from my server, the download manager reports that the server does not support resuming or parallel downloads. Is there a way to configure the server to allow these?
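Resume and parallel ("segmented") downloads both rely on HTTP Range requests, which Apache enables by default for static files; you can check whether the server actually advertises them with a quick probe (URL is a placeholder):

```shell
# Look for "Accept-Ranges: bytes" in the response headers.
curl -I "http://example.com/files/bigfile.zip"

# Or request a byte range directly: a 206 status means resume will work.
curl -s -o /dev/null -w "%{http_code}\n" \
     -H "Range: bytes=0-1023" "http://example.com/files/bigfile.zip"
```

If downloads are routed through a PHP/CGI gateway script rather than served as static files, the script itself must implement Range handling; that is the most common reason download managers report no resume support.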
View 3 Replies
View Related
Aug 8, 2008
I'm looking into getting a second server, specifically for downloads for my site. Without going into too much detail, my website requires that I get a separate server for downloads and another for the webserver.
I've already gone ahead and configured the server I want; however, I'm trying to do something different from what I'm used to. I'd like to use Lighttpd, instead of apache, as well as not having cPanel installed on the server. I'd still have cPanel and Apache on my regular webserver, but I'd rather keep my download server relatively clean.
So, once I get my server and install Lighttpd, how do I go about setting everything up so that my two servers can work together? I'd like the download server to be something like download.mydomain.com, so would I have to set up an A record for it on my webserver's DNS?
What are the steps I should take?
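Yes, this is essentially just an A record for the subdomain pointing at the download server's IP, plus a matching host entry in lighttpd; a sketch (the IP and paths are hypothetical):

```
# DNS zone for mydomain.com -- point the subdomain at the download box:
#   download.mydomain.com.  IN  A  203.0.113.10

# /etc/lighttpd/lighttpd.conf on the download server:
$HTTP["host"] == "download.mydomain.com" {
    server.document-root = "/var/www/downloads"
}
```

The two servers don't otherwise need to "communicate": the webserver's pages simply link to download.mydomain.com URLs, and browsers resolve the subdomain straight to the second machine.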
View 8 Replies
View Related
Jun 10, 2007
My server is Fedora Linux. I access it via Putty using SSH.
I find it a pain to look through the logs using pico, since it won't scroll, and I'm only a casual Linux user (mainly a Windows guy).
Is there a way to either copy the whole log at once over to Notepad or something, or to download the file to my local PC?
Then I could go through it much more easily.
I tried using copy in Putty, but that only copies the screen, and these logs can be huge sometimes.
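Both options are a one-liner. To page through a log on the server, less scrolls freely (unlike pico); to pull the whole file down to Windows, pscp (the command-line copy tool that ships with Putty) works with the same SSH credentials (paths are illustrative):

```shell
# On the server -- page through the log; /pattern searches, G jumps to the end.
less /var/log/httpd/access_log

# From the Windows PC, using Putty's pscp:
pscp user@myserver:/var/log/httpd/access_log C:\logs\access_log
```

Once the file is local, Notepad (or better, a large-file-capable editor) can open it normally.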
View 4 Replies
View Related
Feb 25, 2007
I was wondering if I could use my Windows dedicated server to download backups from my website on a daily basis, maybe with a scheduled job or something.
Here's the setup:
Dedicated server with Windows 2003 - want to use it for downloading the backups
Shared server hosting my website, with cPanel - want to make backups of it
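cPanel can generate full-backup archives retrievable over FTP or HTTP, so on the Windows box a Scheduled Task running a small batch script with wget for Windows can pull them nightly; a sketch (the URL, credentials, and paths are placeholders, and the actual backup URL depends on how the host exposes cPanel backups):

```
REM fetch-backup.bat -- run daily via Windows Task Scheduler, e.g.:
REM   schtasks /create /tn "SiteBackup" /tr C:\scripts\fetch-backup.bat /sc daily /st 03:00
wget --user=cpaneluser --password=secret ^
     -P C:\backups\ ^
     "ftp://mysite.example.com/backup-latest.tar.gz"
```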
View 0 Replies
View Related
May 15, 2008
I have a website where I provide streaming videos. Recently I read somewhere that streaming video or audio files increases server load a lot. Is providing a direct download of the files a better option than streaming the videos? Please let me know.
View 14 Replies
View Related
Sep 24, 2008
Does anyone know of a "Windows Server 2003 Standard" download link? It seems that Microsoft has taken this OFF their 2003 download page, or has somehow "hidden" the link to only show the newer "2008" links (and the updates to 2003).
The trial software link here ...
[url]
...works for 2003 to begin the download process, but in the end no download URL is provided.
View 12 Replies
View Related
Feb 10, 2008
It's been a LONG time since I've used Perl scripts, since almost everything these days is PHP-based. But there is one old Perl script that I dusted off and decided to attempt to use today on my server, and to my surprise I can NOT get it working. I've come to discover it's not just this script, it's ALL Perl scripts on the server right now that aren't working (at least within Apache).
Here's what I know for sure:
1. Path to Perl on all scripts IS correct (/usr/local/bin/perl)
2. File permissions on the Perl scripts are correct (755, also tried 777 for testing)
3. Scripts are not C compiled scripts, they're straight Perl code and they have been uploaded properly (ASCII mode).
So now for whatever reason, I'm getting the following Apache error each time I try to run any simple Perl script: "Premature end of script headers".
Here's the simplest of scripts I'm attempting to run:
Code:
#!/usr/local/bin/perl
print "Content-type: text/html\n\n"; # header block must be terminated by a blank line
print "Hello, World.";
Something as simple as this script is even triggering the error. I've searched through this forum and on Google for an easy solution, but I can't seem to come across anything.
Any possible ideas on this one? Again, it's been so long since I've run anything Perl that it's very likely my server has been this way since day one - but I just didn't realize it.
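When every script fails with "Premature end of script headers", running one by hand from the SSH shell usually narrows it down fast: a compile error, a missing module, or CRLF line endings from a non-ASCII upload will all surface immediately. A sketch (the path is a placeholder):

```shell
# Run the script from the shell first -- CGI problems show up immediately.
perl -c /home/user/cgi-bin/test.cgi    # syntax check only
perl /home/user/cgi-bin/test.cgi       # should print the header, a blank line, then the body
```

Note also that the CGI header block must end with a blank line (i.e. print "Content-type: text/html\n\n"); if only a single newline follows the header, Apache treats the body as a malformed header and raises exactly this error.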
View 14 Replies
View Related
Oct 7, 2007
I just got a VPSlink account about a month ago and for all testing purposes, it has been great so far [my first VPS]. But I haven't transferred my main site to it yet mainly due to one specific question about download speeds. Quickly, here's what I've found about their network...
VPSLink is owned by Spry, so from all I can tell, they're using the same network speeds. Which are:
Unmetered: 1.5 megabits per sec in total traffic.
Metered: Spry's site says "throughput of up to 8Mbps"
Ok, so here's what I'm trying to find out:
I'm running a small software business with some downloads (all legal of course - they're mine ) and would like to know what the real-world download speeds that users would normally see when downloading my files. Here are the knowns:
a) Let's assume for this question that all of the users' computers are on very quick connections, so their speed wouldn't be an issue.
b) An example file would be say, 20 MB.
I'm most looking to see how speeds would be (in Kbps) if say, 20 people are downloading a 20 MB file at the same time (again, with their connections being extremely fast; fast enough for the sake of this example to not be a factor - so the max speed would only depend on my VPSlink server's connection).
I don't know of a good way to really test this since I only have access to 1 physical connection. I did try a test with a friend. Here's what happened:
I normally get about 500-600kbps sustained on my cable modem, as does my friend. When downloading a test file on my server I was able to get about that when downloading all alone. But, when my friend and I downloaded it at the same time, both of us got around 350kbps -- so the total speed about dropped in half. Again since my site is for software downloads, and when releasing an update to one of our products, sometimes I'm guessing 20+ users download simultaneously for a couple days.
This currently seems to work OK on Dreamhost, but for many other reasons we need a VPS. But would this vastly decrease our file transfer download speeds to users? If users got something horrible like 2 KB/s, that would most certainly lose us a lot of business, since they'd just get frustrated and cancel, for the most part.
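The worst case on the shared link is easy to bound, assuming the bandwidth divides evenly among simultaneous downloaders (real traffic is burstier, so sustained downloads rarely all peak at once, as the two-person test above already showed):

```python
# Worst-case per-user throughput if N users share the link evenly.
def per_user_kbytes(link_mbit: float, users: int) -> float:
    bits_each = link_mbit * 1_000_000 / users
    return bits_each / 8 / 1024      # KB/s per user

print(round(per_user_kbytes(1.5, 20), 1))   # ~9.2 KB/s on the 1.5 Mbit unmetered plan
print(round(per_user_kbytes(8.0, 20), 1))   # ~48.8 KB/s on the 8 Mbit metered plan
```

So with 20 truly simultaneous downloaders, the 1.5 Mbit unmetered plan lands near the "horrible" range described above, while the 8 Mbit metered tier stays closer to tolerable.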
Does anyone know of a rough estimate on how to figure this out, or even better, are there any VPSLink customers out there that have experience with this?
View 8 Replies
View Related
Feb 20, 2007
How can I find the data transfer rate on the server? I have run ifconfig -a, and it displays the amount of data that has been received and transmitted. But I want to see the live data transfer rate. Is there a way to check that?
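There's no live-rate field in ifconfig output, but sampling the interface byte counters twice (on Linux they're in /proc/net/dev, the same counters ifconfig reads) and dividing by the interval gives the live rate. A sketch of the arithmetic, with made-up sample values:

```python
def rate_mbps(bytes_before: int, bytes_after: int, seconds: float) -> float:
    """Transfer rate in Mbit/s between two byte-counter samples."""
    return (bytes_after - bytes_before) * 8 / seconds / 1_000_000

# e.g. RX byte counters read from /proc/net/dev one second apart (hypothetical):
print(rate_mbps(1_000_000_000, 1_012_500_000, 1.0))   # 100.0 Mbit/s
```

Ready-made tools do the same sampling for you: iftop, vnstat -l, or sar -n DEV 1 all show live per-interface throughput.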
View 6 Replies
View Related