"Download Test File" From Theplanet
May 20, 2008: Can someone give me a link to a "Download Test File" from ThePlanet?
Anyone have a good download test in Chicago other than FDC's?
I was told by my host that I was upgraded to a 1 Gbps port for free (from a 10 Mbit one), which is sort of weird, so I want to confirm it. At the moment FDC's 100MB file takes me about 3 seconds to download, coming in at 44.7 MB/s.
The file is too small for me to try to pull 1 Gbit; by that speed I obviously have a 500 Mbit line, but I want to test against something outside of FDC.
[url]
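One quick sanity check, as a sketch (the URL is a placeholder for any large test file outside FDC's network): time a big download without writing it to disk, and let wget report the average speed.
# download straight to /dev/null so local disk speed doesn't skew the result
# (placeholder URL; substitute any 500MB-1GB test file off-network)
wget -O /dev/null http://speedtest.example.net/1000mb.bin
For what it's worth, 44.7 MB/s works out to about 358 Mbit/s, so a 100MB file really is too small to tell a 500 Mbit line from a 1 Gbit port; the transfer is over in 3 seconds, before TCP has fully ramped up.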
What download speed do you get
I have a 100 Mbps port and normally always get around 500-600 kbps download speed with DedicatedNOW, but today I am only getting around 170 kbps.
I have a 10 Mb connection, and when downloading a file from here [url] I get a 1.2 MB/sec download speed.
I asked support to confirm that I have a 100 Mbps port and got a reply with:
"Your server is on a 100mbps port:
-bash-3.2# ethtool eth0 |grep Speed:
Speed: 100Mb/s
-bash-3.2#"
Sounds like a local issue. I am able to download at:
Response: 226-File successfully transferred
Response: 226 32.674 seconds (measured here), 6.12 Mbytes per second
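If you want a number rather than eyeballing an FTP client, curl can report the average transfer rate directly; a minimal sketch, with a placeholder URL:
# curl prints the mean download speed in bytes/sec after the transfer finishes
curl -o /dev/null -w 'average: %{speed_download} bytes/sec\n' http://example.com/100mb.test
Keep in mind a 100 Mbps port tops out around 12 MB/s in theory, so 6.12 Mbytes per second from a single FTP transfer is not unreasonable.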
I just wrote a nice little page on the best location in the US for a VPS server for my Australian and New Zealand customers...
Logwatch reset my server logs this morning, and there's an event I need to check for in /var/log/kern.log.
How can I download the gzipped log:
kern.log.1.gz
..to my PC desktop, using PuTTY / SSH?
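PuTTY ships with a command-line copy tool, pscp, which runs over SSH; a sketch, with placeholder hostname and paths:
# run from a Windows command prompt on the desktop, not on the server
pscp root@your.server.ip:/var/log/kern.log.1.gz C:\Users\you\Desktop\
# or, from any machine with OpenSSH installed:
scp root@your.server.ip:/var/log/kern.log.1.gz ~/Desktop/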
I have just performed an FTP backup of my entire server. I then zipped it up in cPanel.
The file itself is 3322 MB (about 3.3 GB). If I try to download it, it says:
Forbidden
You don't have permission to access /backup/sepmid07/sepmid07.zip on this server.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
Apache/1.3.37 Server at backup.xxx*********** Port 80
I have made sure that I have the right permissions, and I have also made sure that it belongs to the correct user. Also, next to it where it would say how big it is, it just says "-".
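One thing worth knowing: Apache 1.3 predates large file support, and many 1.3 builds cannot serve files over 2 GB at all, which would fit both the 403 and the size showing as "-". If that is the cause, permissions won't fix it; one workaround sketch is to split the archive into chunks Apache can serve (filenames follow the ones above):
# on the server, via SSH
split -b 1500m sepmid07.zip sepmid07.zip.part-
# after downloading every part, reassemble locally:
cat sepmid07.zip.part-* > sepmid07.zip
Downloading the unsplit zip over FTP or scp instead of HTTP also sidesteps the limit.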
Can anyone provide me with the output of a 2 GB file copy test on a SAS HDD under FreeBSD 7? I got something strange and just need to compare results.
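In case it helps to have a common baseline, a simple sequential test with dd (paths are examples; the target directory must sit on the SAS drive for the numbers to mean anything):
# write 2 GB of zeros, then read it back; dd prints throughput for each pass
dd if=/dev/zero of=/tmp/ddtest bs=1m count=2048
dd if=/tmp/ddtest of=/dev/null bs=1m
rm /tmp/ddtest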
We have a video streaming server. Sometimes the server gets really slow, and when we dig into it, we see that the same IP is trying to download the same file many, many times. For example, I either run this command
netstat -n | grep :80 | awk '{ print $5 }' | awk -F: '{ print $1 }' | sort | uniq -c | sort -n | tail
or go to WHM and look at the Apache stats, and I see 100 HTTP connections from the same IP, trying to download the same video 100 times.
What is this? Is this some sort of attack? Could it be 100 different people using a proxy? Or what is going on?
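If it turns out to be abuse rather than a proxy, one stopgap sketch at the firewall level (the threshold of 20 is a guess, tune it, and remember that big proxies put many real users behind one IP):
# drop new port-80 connections from any single IP above 20 concurrent
iptables -I INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j DROP
Apache-side modules such as mod_limitipconn do roughly the same thing per vhost.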
I am having a large problem moving my sites over to another server. WHM is timing out transferring some large sites that are 4-16 GB in size, so I tried to run pkgacct manually and was able to get one of the accounts packaged as root. I moved it over to public_html so I could wget it from the 2nd server. No luck: permission denied. I tried to change permissions in FTP and found out I couldn't, so I chmodded the file as root. Still no luck.
From there I chowned the file to the user of the FTP account and was then able to modify it from FTP. But I'm still getting a forbidden message when I try wget or download it from a browser.
Forbidden
You don't have permission to access /backup/cpmove-azcorp.tar.gz on this server.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
At this point I'm clueless. I chmodded the file to 777, so what else could be causing this?
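Since both boxes are yours, one way around whatever Apache rule is sending the 403 is to skip HTTP entirely and pull the package over SSH; a sketch with placeholder host and path:
# run on the new server; copies the cpmove file without touching Apache
scp root@old.server.ip:/home/cpmove-azcorp.tar.gz /home/
A 403 despite 777 permissions usually means the denial is coming from Apache config (a Deny rule, mod_security, or similar) rather than the filesystem, which fits what you're seeing.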
Why are all the PHP files popping up for download instead of executing? Am I missing anything here?
Users of my web site, running MD-Pro, a PostNuke (PHP-based) clone, suddenly started to report difficulties downloading files from the download module. We assumed it was the download module and spent ages diagnosing it, uploading fresh versions, etc. After some time someone had the bright idea to see if the files could be downloaded directly (using the URLs) to eliminate the web application.
We found: Some users can download all files from the server with no problems.
Some users can download some files but not others.
So far as we can tell only MS Word files are affected, but not all.
The files on the server are not corrupted, and there are no permissions issues.
Disabling antivirus and firewalls locally makes no difference.
Reloading fresh originals to the server does not help.
The host reports that no changes to the server have been made recently.
We have discovered that files in the web root are OK. The further down the directory structure they are, the less likely they are to download. Performance varies according to the browser in use. According to the host's technical support, 'Firefox appears to be returning the data from its own cache. IE is only doing so for root. For the other places IE tries to download the file but stops receiving at exactly the same number of bytes from different locations.'
Using another server with a different host, the problem disappears, so it must be due to the host setup in some way. Diagnosis is difficult because the host's technical support can't reproduce it.
Platforms include Mac, Windows and Linux and browsers include IE7, Firefox and Safari.
One user with Mac isn't having problems, another with Mac is. All others reporting problems are on Windows. Users are at different locations, using different ISPs so it is unlikely there are common local problems.
The only common element I can see is that the only files causing trouble are MS Word files, though some of these do download OK.
We have run out of ideas why this should be happening. How can some users have problems with the same files on the same server and others not?
What could have happened to cause this problem on a site that has been functioning correctly for several months on this server?
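One test that may narrow it down: fetch a problem file directly with curl and compare the bytes that arrive against the Content-Length the server promises (URL is a placeholder):
# dump the response headers to a file, the body to test.doc
curl -s -D headers.txt -o test.doc http://www.example.com/downloads/file.doc
grep -i content-length headers.txt
ls -l test.doc
If the byte counts differ, the transfer is being cut short somewhere between Apache and the client, which matches what the host saw with IE stopping at the same byte count.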
I'd appreciate it if you can share a 100MB download link that I can use to test Cogent's speed to my network, hopefully plugged into a 100 Mbps port at the switch, to see if it will max out or not.
We are running a file sharing service and use lighttpd as the web server.
The problem is that in Internet Explorer some files (e.g. .mpg, .pdf) are opened directly in the browser.
Is there a way to prevent this behavior, i.e. to force the browser to download the file, by setting headers (or something else) in lighttpd.conf?
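One approach that should work: attach a Content-Disposition header so browsers save the file instead of rendering it. A lighttpd.conf sketch, assuming mod_setenv is available (extend the extension list as needed):
# force download for matching extensions
server.modules += ( "mod_setenv" )
$HTTP["url"] =~ "\.(mpg|pdf)$" {
    setenv.add-response-header = ( "Content-Disposition" => "attachment" )
}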
We're running a departmental intranet site on Apache 2.2 (on Windows Server 2003, if that matters). I'm trying to figure out how to post .exe files on the web site so I can create a link and users can download the programs. I've been trying various changes to my httpd.conf file, but no matter what I try, I get a 403 Forbidden error and the error.log file shows "Options ExecCGI is off in this directory:...".
1) I've tried putting the EXE in various folders.
2) I don't have a .htaccess file and I don't see anything in the httpd.conf file that seems to specifically relate to access for this file type.
3) If I put a zip file in the same folders, I am able to download or open it.
4) The httpd.conf file should be whatever was created when we installed Apache.
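The "Options ExecCGI is off" error suggests something in the config is mapping .exe to the CGI handler (for example an "AddHandler cgi-script .exe" line, or a ScriptAlias covering that folder), so Apache tries to execute the file instead of serving it. A sketch of one httpd.conf fix that hands .exe back to the plain file handler:
# serve .exe as ordinary static files instead of running them as CGI
<FilesMatch "\.exe$">
    SetHandler default-handler
</FilesMatch>
Searching httpd.conf for "cgi" to find whatever directive claims .exe in the first place would confirm the cause.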
I just want to use the wget command to transfer data from a shared host to a dedicated server. Does anybody know how to get wget to download the .htaccess files and keep the file/directory permissions the same as they were on the old server?
The only flags I know are: wget -b -c -r -l0 -np -nH -t0
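Two caveats worth knowing: wget only picks up .htaccess over FTP if the server includes dot-files in its directory listings, and neither HTTP nor FTP retrieval preserves ownership or permissions. A sketch of both routes (host, user, and paths are placeholders):
# FTP mirror; fetches .htaccess only if the FTP server lists dot-files
wget -b -c -r -l0 -np -nH -t0 ftp://user:password@old.host/public_html/
# if the shared host allows SSH, tar over SSH keeps permissions intact:
ssh user@old.host 'tar czf - -C /home/user public_html' | tar xzf - -C /home/user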
The problem is that HTTP file download speed is nearly 10 times lower than FTP download speed. What could be limiting it? It's about 7 Mb for FTP and 70-100 Kb for HTTP. Strangely, download speed is OK when browsing from the server itself (e.g. via RDP).
I'm currently running on a VPS. My site allows for large file uploads and downloads, with files over 600 MB in size.
The server has issues when the site gets three or more requests for large file downloads. I'm trying to grow this site to thousands of users and it is hard to do when the site can't handle even three.
I've been told by my host that I need to upgrade to a dedicated server. My VPS only has 512 MB RAM, and one large file download eats up that RAM. This is causing the issue.
I'm a newbie, and while I knew I was risking a bit by going with a VPS, I do find it a bit annoying that these guys advertise 1 TB of bandwidth per month but I can't even support 1 GB being downloaded at the same time... maybe it's just me...
Anyway, I am now looking into moving the large files and the upload/download over to Amazon S3. If I do this I am expecting my RAM usage on the VPS to greatly decrease. Is this correct? If my PHP code is running on the VPS, but the actual file download via HTTP is coming from S3, that should not be a heavy load on my box, correct?
Any opinions on S3?
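For what it's worth, the usual pattern is to have the PHP on the VPS issue a redirect to a time-limited S3 URL, so the file bytes flow from S3 and never touch your box; the VPS only serves the redirect. A sketch using the modern AWS CLI (bucket and key names are hypothetical):
# print a pre-signed URL valid for one hour; hand this to the browser
aws s3 presign s3://my-bucket/files/big-video.mp4 --expires-in 3600
Under that pattern, yes, RAM usage from downloads on the VPS should drop to almost nothing.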
Apache 2.2.22
I may be making something here too complicated, but a friend downloaded a largish video file via my very basic web server.
Trying to interpret the data, I am getting different figures. Which one is correct, or am I reading it wrongly?
carole [10/Nov/2013:17:55:21 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 2879608
carole [10/Nov/2013:17:56:07 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 1902255
carole [10/Nov/2013:18:11:41 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 206 357
carole [10/Nov/2013:18:11:49 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 206 5317289
carole [10/Nov/2013:18:16:38 +0000] "GET /cwgl/files/vidlink/videos/vid01.ts.mpg HTTP/1.1" 200 1834235652
200 totals = 1839017515
File properties on Ubuntu: 2.0 GB (1,971,322,880 bytes)
server-status showed the connection from ip72-197-68-158.sd.sd.cox.net with 1880.00 in its megabytes-transferred column (the rest of the row got mangled when I pasted it here).
So:
The other_vhost_access file says 1839017515.
Ubuntu file properties say 2.0 GB (1,971,322,880 bytes).
Apache server-status says 1880.00.
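Two things may reconcile the figures. First, 1880 MiB is exactly 1,971,322,880 bytes, so the server-status number appears to be the full file size expressed in megabytes. Second, the last field of each access-log entry is the bytes actually sent for that one response, so aborted or resumed transfers log less than the file size. To total what the log says went out, assuming the trimmed format above where status and size are the last two fields:
# sum the byte counts of all 200 responses
awk '$(NF-1) == 200 { total += $NF } END { print total }' other_vhosts_access.log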
I did not put this under the tutorial section because it is not comprehensive enough. It's just a simple rant.
Those of you shopping for a host come to this forum, and are often given the advice to ask for a "test file" to download.
1) Hosts who offer test files will most likely put the file on their fastest server, not the server where your site will actually be hosted.
2) If they have servers in multiple data centers, they will use the one that is the most well-connected, not necessarily the one where your site is going to be.
3) Even if your account is assigned to the same server as the test file, what does your ability to download a static file actually prove? A test file does not show how well a server is going to perform, which is usually the biggest factor in page load times.
4) Barring all the above, and assuming your site is going to consist only of static files, web site visitors are not all in the same area, so your results may differ from those of the rest of the people visiting your web site.
A test file can be helpful in rare circumstances, but as a potential customer you would have no way to really know whether your download is a true test of what you can expect, so it is best not to rely on something like that unless you are downloading it from your own hosting account with that host.
The only way to really know how a host is going to perform is to try it out. This is why hosts offer full money-back guarantees and free trials.
I'm building Plesk Panel for Linux and Presence Builder. I don't want my users to be able to upload their websites to hosting via File Manager. How can I do it?
ThePlanet vs.......... ?
I just ordered a dual Xeon 3.2, 4 GB RAM, 2x 250 GB SATA server from ThePlanet and it... sucks. This damn thing doesn't even want to run a single UrbanTerror server (which, mind you, is an 8-year-old game that should be running like butter on this system). Every 5 minutes this thing lags out like you wouldn't believe, so it seems I'm wasting nearly $200 a month on nothing but a web host now, which isn't something I need to do. ThePlanet of course doesn't support any of this, not that I asked them to; I just told them this box sucks and something is wrong, but they don't really seem to care. So I'm looking for another place. Does anyone have any good recommendations?
I have had a dedicated P4 3.2 GHz server with ThePlanet for 4 years now. Since I want to upgrade the server, I started to wonder if it would be a good idea to change to Rackspace.
They offer an AMD Opteron 246 for pretty much the same price I am paying for my P4 3.2 GHz at ThePlanet. Is that a faster processor?
I do not need a faster CPU; I want to upgrade the hard drive, but if I am getting a faster CPU, that's good.
Several points here.
1. I cannot complain about ThePlanet. Never had a problem. And whenever I opened a ticket, it was answered in the time I expected.
2. I need a good/fast network.
Should I switch?
I am going to start a social network, and for the beginning I just need a 100 GB hard disk, 2 GB of RAM, and 200 TB monthly. I need a web host with potential for big traffic and scalability in the near future.
I have read what people from big social networks have said about both companies, and they can handle big traffic, though MediaTemple has better-ranked customers than ThePlanet according to Alexa (I checked their big clients one by one in Alexa).
My website is going to be about video streaming, webcam streaming and other common things in social networks.
What do you recommend? Do you have a better company in mind that can handle big traffic and scalability?
My budget is between $500 and $800 monthly.
I definitely like ThePlanet. They have good support and a good network.
But it seems I need to know more of the world.
So could anyone please tell me about other DCs that can be compared with TP *in full* (i.e. not "good support but bad network"; both should be good)?
ThePlanet did a mailout the other day about their servers in the UK. Does anyone have servers there? Good pings?
ThePlanet / Softlayer
Does anyone know of any hosts that are reselling servers from ThePlanet or SoftLayer (or anyone else in Texas that I've missed)?
A few days ago my server started to return a bunch of input/output errors. I scrambled all over Google to find out what the problem could be, and most results pointed to a failing hard drive.
I contacted ThePlanet roughly 4 hours after I first saw the problem, and they recommended that I run a few diagnostic programs on the hard drive to find out if there were any problems. I agreed, and we picked a good 4-hour time frame to do the work.
Luckily for me this was the 2nd HDD on the server, but the downside was that it's a 200 GB drive of which I am using 165 GB. So I tried to do backups, and at about 35% done, the 2nd HDD became virtually inaccessible.
ThePlanet started the diagnostic at roughly 12:00pm PST and updated me nearly every 10 minutes on the progress. At 12:50 they indicated that the HDD diagnostic wasn't able to find any problems, then rebooted the server. I'm not sure what they did, but after the server came back up roughly 5 minutes later, the 2nd HDD is working perfectly fine.
So I'd like to send out some praise, as I'm very happy that the HDD is working again, and I'm happy with the updates and professionalism from ThePlanet.
I've had some rough times with ThePlanet, but the majority of my situations have been resolved in a timely and professional manner. Thanks again, TP, for saving my butt!
If I have to choose between ThePlanet and SoftLayer for a dedicated server which will be used for shared hosting, which one should I choose?