The problem is that HTTP file download speed is nearly 10 times lower than FTP download speed for the same files. What could be limiting it? We get about 7 Mbit/s over FTP but only 70-100 KB/s over HTTP. Strangely, download speed is fine when browsing from the server itself (e.g. via RDP).
We're running a departmental intranet site on Apache 2.2 (on Windows Server 2003, if that matters). I'm trying to figure out how to post .exe files on the site so I can create a link and users can download the programs. I've been trying various changes to my httpd.conf file, but no matter what I try I get a 403 Forbidden error, and error.log shows "Options ExecCGI is off in this directory: ...".
1) I've tried putting the EXE in various folders.
2) I don't have a .htaccess file, and I don't see anything in httpd.conf that specifically relates to access for this file type.
3) If I put a zip file in the same folders, I am able to download or open it.
4) The httpd.conf file should be whatever was created when we installed Apache.
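A possible fix, sketched as httpd.conf directives (hypothetical: the directory path is an assumption, and this presumes something in the config is mapping .exe to the CGI handler, which is what the ExecCGI error suggests):

    # Serve .exe files as plain downloads instead of handing them to CGI.
    # The path is a placeholder; point it at the folder holding the files.
    <Directory "C:/Apache2.2/htdocs/downloads">
        <Files "*.exe">
            # Override any inherited cgi-script handler mapping
            SetHandler default-handler
        </Files>
    </Directory>
    # Make sure browsers treat .exe as a binary download
    AddType application/octet-stream .exe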
I am using a single Apache HTTP Server (2.2.23) as a load balancer with two IBM WebSphere application server nodes (on other machines). I deployed a simple text-based helloWorld application and it works fine through the load balancer. But when I deploy the real application, which contains images, CSS files, and JavaScript files, it loads the page without the images, shows only the plain text, and logs the following exception (and similar ones) in error_log:
[error] [client 192.217.71.77] File does not exist: /usr/local/apache2/htdocs/application, referer: http://192.168.141.17/application/faces/test.jsp
Interestingly, when I access the application directly, bypassing the load balancer, it works fine.
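One likely cause is that only part of the URL space is being proxied, so requests for images/CSS/JS fall through to Apache's own htdocs, matching the "File does not exist" error above. A minimal mod_proxy_balancer sketch for Apache 2.2 (node hostnames and WebSphere's default 9080 port are assumptions; mod_proxy, mod_proxy_http, and mod_proxy_balancer must be loaded):

    <Proxy balancer://wascluster>
        BalancerMember http://was-node1:9080
        BalancerMember http://was-node2:9080
    </Proxy>
    # Proxy the whole application path, including its static resources
    ProxyPass /application balancer://wascluster/application
    ProxyPassReverse /application http://was-node1:9080/application
    ProxyPassReverse /application http://was-node2:9080/application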
I want to ask: how can I find out my server's download speed from Rapidshare? I have a Windows dedicated server with a 1 Gbit port, and when I download something from Rapidshare I only get about 2 MB/s. Is that normal? I am now thinking of buying another server, so how can I test the download speed from the Rapidshare site beforehand? I asked some companies about this, but no one would give me a test or anything like that.
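A rough way to measure this yourself, assuming a wget build is installed on the server (the URL is a placeholder for any large test file); wget prints the average transfer rate when the download finishes:

    # Discard the data, keep the speed report (use -O /dev/null on Linux)
    wget -O NUL http://example.com/largefile.bin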
I have two shared hosting accounts, one with HostGator and the other with GoDaddy.
I've uploaded a file (.flv) to HostGator and the same file to GoDaddy.
Here are links to both:
Hostgator flv
Godaddy flv
Now, using a download manager (Free Download Manager), the HostGator file downloads at 17-35 KB/s while the GoDaddy file downloads at 200 KB/s.
I am on 2 Mbps DSL.
My HostGator cPanel also loads slowly, and other files download very slowly too, 40 KB/s at most.
What could the issues be? I've contacted HostGator support and they say everything is OK on their end.
I have to stream videos from HostGator, but the speed is too slow and playback buffers a lot.
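For an apples-to-apples check, a sketch using curl from the same machine (the URLs are placeholders for the two .flv links above; %{speed_download} is bytes per second):

    curl -s -o /dev/null -w 'hostgator: %{speed_download} B/s\n' http://hostgator-example.com/video.flv
    curl -s -o /dev/null -w 'godaddy:   %{speed_download} B/s\n' http://godaddy-example.com/video.flv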
We have several sites that serve a lot of downloads. How can I limit these sites' bandwidth, cap their download speed, limit their connections, and so on? These sites generate very heavy download traffic.
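One option, assuming Apache 2.4 or later where mod_ratelimit ships with the server (the path is a placeholder; older 2.2 setups would need a third-party module such as mod_bw):

    # Throttle downloads under /downloads to roughly 400 KB/s per connection
    <Location "/downloads">
        SetOutputFilter RATE_LIMIT
        SetEnv rate-limit 400
    </Location>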
I have just gotten a new server and the ping results were pretty good. However, when it comes to download speed, it's horrible. What could be the issue?
If you have encountered any problems downloading from the Odin servers (autoinstall.plesk.com or autoinstall-win.pp.parallels.com), please describe the problem briefly and specify the geographic location (country, city) of your Plesk server.
Currently I am on OVH, and OVH's speeds are amazing when it comes to torrenting and the like; no problems whatsoever so far. But I am looking for a modest-budget server that will give me good HTTP speeds, averaging 400-500 KB/s. I don't care much about hard drive space; even a 40 GB drive will do. What I do need is plenty of RAM: I'm expecting 300-400 users at most downloading from the server over HTTP, with probably no more than 150 concurrent connections, so I definitely need good RAM, 3 GB I presume? Budget: preferably less than 100 dollars.
I have just performed an FTP backup of my entire server. I then zipped it up in cPanel.
The file itself is 3,322 MB (about 3.3 GB). If I click it to download, it says:
Forbidden
You don't have permission to access /backup/sepmid07/sepmid07.zip on this server.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
Apache/1.3.37 Server at backup.xxx*********** Port 80
I have made sure that I have the right permissions, and that the file belongs to the correct user. Also, in the column that should show how big the file is, it just says "-".
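One workaround sketch, on the assumption that the server build simply cannot serve a file this large (Apache 1.3 predates large-file support, and a 32-bit size limit would also explain the "-" shown instead of a size): split the archive into pieces under 2 GB and reassemble after downloading.

    split -b 1024m sepmid07.zip sepmid07.zip.part-
    # after downloading the parts, reassemble:
    cat sepmid07.zip.part-* > sepmid07.zip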
We have a video streaming server. Sometimes the server gets really slow, and when we dig into it, we see that the same IP is trying to download the same file many, many times. For example, I either run this command
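The exact command isn't quoted in the post; a common sketch for counting requests per client IP from an access log (the log path is an assumption) looks like:

    awk '{print $1}' /var/log/httpd/access_log | sort | uniq -c | sort -rn | head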
I am having a big problem moving my sites over to another server. WHM is timing out while transferring some large sites that are 4-16 GB in size, so I tried running pkgacct manually and was able to package one of the accounts as root. I moved the archive into public_html so I could wget it from the second server. No luck: permission denied. I tried to change the permissions over FTP and found I couldn't, so I chmodded the file as root; still no luck.
From there I chowned the file to the FTP account's user and was then able to modify it over FTP. But I'm still getting a Forbidden message when I try wget or a browser download.
Forbidden
You don't have permission to access /backup/cpmove-azcorp.tar.gz on this server.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
At this point I'm clueless. I chmodded the file to 777, so what else could be causing this?
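If Apache keeps refusing the download regardless of permissions (a server-level deny on /backup or a security module could still block it), one way to sidestep the web server entirely, assuming root SSH access between the two machines (hostname and paths are placeholders):

    scp root@old-server:/home/cpmove-azcorp.tar.gz /home/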
Users of my web site, running MD-Pro, a PostNuke (PHP-based) clone, suddenly started to report difficulties downloading files from the download module. We assumed it was the download module and spent ages diagnosing it, uploading fresh versions, etc. After some time someone had the bright idea to see if the files could be downloaded directly (using their URLs) to eliminate the web application.
We found: Some users can download all files from the server with no problems. Some users can download some files but not others.
So far as we can tell only MS Word files are affected, but not all.
The files on the server are not corrupted, and there are no permissions issues.
Disabling antivirus and firewalls locally makes no difference.
Reloading fresh originals to the server does not help.
The host reports that no changes to the server have been made recently.
We have discovered that files in the web root are OK. The further down the directory structure they are, the less likely they are to download. Performance varies according to the browser in use. According to the host's technical support, 'Firefox appears to be returning the data from its own cache. IE is only doing so for root. For the other places IE tries to download the file but stops receiving at exactly the same number of bytes from different locations.'
Using another server on a different host, the problem disappears so it must be due to the host setup in some way. Diagnosis is difficult because the host technical support can't reproduce it.
Platforms include Mac, Windows and Linux and browsers include IE7, Firefox and Safari.
One user with Mac isn't having problems, another with Mac is. All others reporting problems are on Windows. Users are at different locations, using different ISPs so it is unlikely there are common local problems.
The only common element I can see is that the only files causing trouble are MS Word documents, though some of those do download OK.
We have run out of ideas why this should be happening. How can some users have problems with the same files on the same server and others not?
What could have happened to cause this problem on a site that has been functioning correctly for several months on this server?
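A diagnostic sketch that takes browsers out of the picture (the URL is a placeholder for one of the failing Word files): compare the advertised Content-Length with the bytes actually received.

    curl -sI http://example.com/files/sub/dir/report.doc | grep -i content-length
    curl -s -o report.doc http://example.com/files/sub/dir/report.doc
    ls -l report.doc    # does the size match the Content-Length above?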
I just want to use the wget command to transfer data from a shared host to a dedicated server. Does anybody know how to get wget to download the .htaccess file and keep each file's and directory's permissions the same as they were on the old server? So far I only know this much: wget -b -c -r -l0 -np -nH -t0
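wget over plain HTTP cannot do either part: servers normally refuse to serve .htaccess, and HTTP carries no ownership or permission metadata. If SSH access to the shared host is available, rsync preserves both (user, host, and paths are placeholders):

    # -a keeps permissions and times, and dotfiles like .htaccess come along
    rsync -avz -e ssh user@old-host:public_html/ ~/public_html/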
On our production service, we've been getting numerous malformed POST requests to some of our CGI scripts that are showing up as 500 errors in our logs. They are malformed in the sense that the actual content length doesn't match the Content-Length specified in the request.
Here's the most trivial example I can come up with that reproduces the problem for us:
In addition to the 500 error in the access log, we see the corresponding error in the error log:
(70014)End of file found: Error reading request entity data
Based on the nature of the POST request and the error response, it does appear that Apache is doing the right thing here.
The POST never actually makes it as far as the script being targeted (/some_valid_alias in the above example); in other words, Apache returns 500 to the client, writes the error to the error log and never executes the script.
Is there a way to capture/avoid internal Apache errors like 70014, and return some other HTTP status besides 500 (like 403)? It's particularly annoying in our case, because our server sends us an email for all 500 errors.
So far, our best "defense" against these 500 errors is to disallow POST for these aliases, which normally just ignore the POST data anyway (when the request is not malformed):
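The post's actual directives aren't shown; a typical Apache 2.2-style way to reject POST outright for such an alias (returning 403 rather than letting the malformed body trigger a 500) would be:

    <Location /some_valid_alias>
        <LimitExcept GET HEAD>
            Order allow,deny
            Deny from all
        </LimitExcept>
    </Location>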
I'm currently running on a VPS. My site allows large file uploads and downloads, with files over 600 MB in size.
The server has issues whenever the site gets three or more requests for large file downloads at once. I'm trying to grow this site to thousands of users, and that is hard to do when it can't handle even three.
I've been told by my host that I need to upgrade to a dedicated server. My VPS only has 512 MB of RAM, and a single large file download eats up that RAM. This is what's causing the issue.
I'm a newbie, and while I knew I was taking a bit of a risk by going with a VPS, I do find it a bit annoying that these guys advertise 1 TB of bandwidth per month but I can't even support downloading 1 GB at the same time... maybe it's just me.
Anyway, I am now looking into moving the large files and the upload/download handling over to Amazon S3. If I do this, I expect my RAM usage on the VPS to decrease greatly. Is this correct? If my PHP code is running on the VPS but the actual file download via HTTP comes from S3, that should not be a heavy load on my box, correct?
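Broadly yes, provided clients fetch the bytes from S3 directly rather than having PHP stream them through the VPS. A minimal hand-off sketch in httpd.conf (the bucket URL is an assumption, and it presumes the files can be served publicly from S3):

    # Send /downloads requests straight to S3 so the VPS only issues a
    # redirect and never holds the file in memory
    Redirect 302 /downloads https://my-bucket.s3.amazonaws.com/downloads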