I've come across something that has stumped me. A client's PDF files download fine in Firefox, but IE 6 and 7 prompt to save a tar file which does not open properly.
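For what it's worth, the quickest way to see what the server is actually sending is to check the Content-Type header on one of the affected PDFs; if it isn't application/pdf, IE tends to guess badly while Firefox falls back on the extension. A small sketch, with a placeholder URL standing in for one of the client's real links:
Code:
# check the Content-Type header the server returns (URL is a placeholder)
curl -sI http://www.example.com/files/document.pdf | grep -i content-type

# if the type is wrong and .htaccess overrides are allowed, one common fix is:
#   AddType application/pdf .pdf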
My clients in LA - those on AT&T and Time Warner in particular - are having bad download speeds from my server with Softlayer. Anyone else having similar trouble? Clients who normally get 500-700 kb/s down are getting < 50 kb/s down, even early in the morning when the total traffic on my 100 Mbps port is under 1.5 Mbps.
Basically, the east coast is fine, and anyone with a T1 in LA is fine as well - it's just DSL and cable modems in LA that are screwed up. In fact, one of our clients is using Apple Remote Desktop to connect to a remote client with a T1; it's faster to go through Time Warner, VPN to Verizon, and then connect to Softlayer and back again than to go directly from Time Warner to Softlayer, which is really weird.
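If it helps anyone compare notes, a traceroute from an affected LA DSL/cable connection versus a working one is the fastest way to show Softlayer which hop goes bad. A rough sketch, using a placeholder destination IP and assuming mtr is installed for the second command:
Code:
# from an affected Time Warner / AT&T connection in LA (destination IP is a placeholder)
traceroute 67.228.0.1

# mtr combines traceroute and ping and reports per-hop packet loss over time
mtr --report --report-cycles 60 67.228.0.1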
I own the website [url]and it's hosted at Host Excellence. Recently there have been some downtime periods, and most importantly, I have noticed that downloads are very slow. For instance, when I download a file such as [url]I get an approximate transfer rate of 100-150 KB/sec (whereas most other sites download at 500 KB/sec or more). First of all I'd like to know if you guys get similar results, and second, what do you recommend? I have contacted my host about it, but they seem to deny the problem. Should I switch? To what? I need something reliable and fast.
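If anyone wants to test it without a browser, a timed curl fetch gives a number that's easy to compare across locations. A small sketch, with the file URL as a placeholder for the one linked above:
Code:
# download to /dev/null and report the average transfer speed
curl -o /dev/null -w "average speed: %{speed_download} bytes/sec\n" \
    http://www.example.com/files/big-download.zip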
My site is hosted on SiteGround, and I offer a few mp3s for download on my site. But when my visitors download the songs, only part of each song is downloaded, not the whole file. SiteGround says this problem is due to Apache server limitations on HTTP downloads.
Can a user-defined Apache handler be configured to increase the HTTP timeout value? I am on shared hosting.
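One workaround people use on shared hosting is to serve the file through a small PHP script instead of a direct link, so data keeps flowing and the connection never sits idle long enough to be cut off. A rough sketch, with made-up paths and filenames; whether set_time_limit actually takes effect depends on how SiteGround has PHP configured:
Code:
<?php
// download.php - stream an mp3 in small chunks (path and filename are placeholders)
$file = '/home/username/public_html/music/song.mp3';

set_time_limit(0);                                  // may be ignored on some shared hosts
header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="song.mp3"');

$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);                          // send 8 KB at a time
    flush();                                        // push it out to the client
}
fclose($fp);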
I ordered a new server from iWeb a few days ago, and after I transferred everything and was starting to set everything up, I noticed that I was only getting about half of my usual download speed from both of my iWeb servers.
Is anyone else experiencing slower than normal downloads from iWeb?
Normally I could max my connection out at 2.1 MB/s; now I can only get about 800 KB/s - 1.0 MB/s.
I'm selling downloads of music files. The zip files are quite large. I've had several people complain that they get a message that the server resets their connection before the download finishes.
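If it turns out to be Apache dropping the connection, two things worth looking at in httpd.conf are the Timeout value and whether mod_deflate is trying to recompress the zips (already-compressed files gain nothing and some clients handle it badly). A sketch of the kind of settings I mean; the numbers are only examples:
Code:
# httpd.conf - example values only
Timeout 600                                 # seconds a connection may sit without activity
KeepAlive On

# tell mod_deflate to skip zip downloads, since they are already compressed
SetEnvIfNoCase Request_URI \.zip$ no-gzip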
My new server's speed seems very low. The downloads are taking too much time. While downloading the OS templates I am getting download speeds in the range of 30 KB/s - 40 KB/s (max 100 KB/s).
I have another server in this DC and don't have this problem on it.
I sent an email to the datacenter about this issue and they said:
Code:
Dear Mr Yarmohammadi,
I just checked the switch configuration. Could you please check the configuration of the network card in your server and see whether it is in 'Full Duplex' mode?
If not, please try to configure it that way - it may already be - because the switch just changed to 100 MBit FD in automatic configuration mode.
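For what the datacenter is asking, ethtool will show (and force) the speed and duplex setting on Linux. A quick sketch, assuming the card is eth0:
Code:
# show current speed / duplex / autonegotiation state
ethtool eth0

# force 100 Mbit full duplex if autonegotiation picked the wrong mode
ethtool -s eth0 speed 100 duplex full autoneg off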
I am wondering if simultaneous downloads could take up a lot of CPU/RAM. Could a Celeron server with 512 MB handle simultaneous downloads, and how many users could it support simultaneously? The server will be used purely for downloads: no database, no PHP, no CGI, nothing else. And what is the highest Mbps this server could potentially reach?
I've ordered a 1 Gbit/s port with one of my dedicated servers, but I am still unhappy with the download speed.
I have a 2 Mbit DSL connection at home and I can download files from the server at 90 kb/s. I see the same speed from a server on a 100 Mbit port. But I can download files from RapidShare at 210 kb/s.
What do you recommend I do to make downloads from the server faster, on the server side?
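One server-side thing worth checking before blaming the port is the TCP window configuration, since a single long-distance connection is limited by window size times latency. A sketch of the usual sysctl knobs; the numbers are examples, not tuned values, and they may not be the cause here:
Code:
# /etc/sysctl.conf - example values only
net.core.rmem_max = 16777216
net.core.wmem_max = 16777216
net.ipv4.tcp_rmem = 4096 87380 16777216
net.ipv4.tcp_wmem = 4096 65536 16777216
net.ipv4.tcp_window_scaling = 1

# apply without a reboot:
#   sysctl -p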
I have a problem with some mp4 videos and downloads made from cellphones.
I have a plain LAMP server (CentOS 5, Apache 2, PHP 5); customers download the videos from a web page (the mobile section) and play them on their cellphones.
The strange thing is that when they open a video to play it on the cellphone, it shows up as binary data, although the extension remains mp4.
I tried moving the same video to another server, and it played fine without any changes. So I tested a third server, and after making the following change it was able to play the mp4 format:
I changed the default MIME type from text/plain to application/octet-stream:
TypesConfig /etc/mime.types
# DefaultType is the default MIME type the server will use for a document
# if it cannot otherwise determine one, such as from filename extensions.
# If your server contains mostly text or HTML documents, "text/plain" is
# a good value. If most of your content is binary, such as applications
# or images, you may want to use "application/octet-stream" instead to
# keep browsers from trying to display binary files as though they are
# text.
#
DefaultType application/octet-stream
When the videos came out as binary, with a lot of strange characters, it came down to just this: application/octet-stream.
I looked in /etc/mime.types and there is support for many formats, including .mp4.
However, on the original server, even if I make the same change as above, I still cannot play the mp4 files.
From any laptop or PC I can play the videos from all three servers; the problem is only on one server, and only when playing from cellphones.
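Rather than changing DefaultType globally, explicitly mapping the mp4 extension is probably the safer fix, since DefaultType only applies when nothing else matches. A sketch for httpd.conf or .htaccess on the problem server; I can't confirm this is actually what differs between the three servers:
Code:
# make sure Apache sends the right Content-Type for the video files
AddType video/mp4 .mp4
AddType video/3gpp .3gp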
I just received a fairly scary WHMCS notice, you can view the details here:
<<please don't paste the file names, there are accounts that may have these on them>>
What are your thoughts on the entire situation? Personally, I'm a tad fearful (luckily, I hadn't upgraded to the next version yet as I was letting the other users play beta-testers) given the fact that there wasn't any versioning / modification 'notification' system in place on their end.
I'm fearing further updates. In essence, my concern is that the WHMCS development team isn't entirely certain how they were backdoored or to what scale they were backdoored.
Are their own billing systems & servers hosted in the same environment? Were our billing details also released? And so on. I want to know the scale of the attack.
I have a little project for a website that contains some downloadable files and media files.
By my calculation, I may use about 50 GB of storage, and if the number of visitors is what I expect, I'll need 1 TB of bandwidth or a bit less.
Let's talk about the hosting part here. I know that there are some companies that offer more than 300 GB of storage and more than 3 TB of traffic,
but I don't think they are going to work for my project, so I searched for a VPS to start the project with. I'm thinking about a dedicated server in the future, so let's not talk about the future right now.
I want to talk about whether a VPS can handle download hosting. The VPS I've found is this one:
Disk Space: 60,000 MB (60 GB)
Number of Domains: Unlimited
CPU Limit: Equal Share
Guaranteed Memory: 512 MB
Burstable Memory: 2048 MB
Monthly Bandwidth: 2,000 GB included
Control Panel: cPanel / WHM
I thought some might find this short review useful. I recently found my website - vladstudio.com - pushing its shared hosting limits fast (powweb.com at that time). After long days of (unsuccessful) searching for the ideal hosting (fast, with lots of traffic, managed, and of course cheap :-) ), I got a good idea from my wife and split my website into 2 pieces hosted separately.
For the domain, PHP and MySQL, I've been using MediaLayer Application (shared) hosting. I may have already mentioned it here before - I'm extremely happy with them; once my site hit the digg.com front page and I got 500,000 visitors in a day, and the hosting did not go down! Right now it serves about 25,000 visits (500,000 hits) a day, every hit being a dynamic PHP page with several MySQL queries.
For hosting images and downloadable files, I use a PacificRack dedicated server (the most basic one, but with 2,000 GB/month of traffic included). I am very bad at managing servers, but the default configuration and my modest knowledge of Linux were enough - it is only used to store files that visitors download from vladstudio.com.
So, if your site is growing out of shared hosting too, consider 2 separate hosting accounts for application and files - it works very well for me. And once again, big thanks to MediaLayer and PacificRack.
I've tried Lighttpd and Apache. The server is a single quad-core CPU.
With either one, when I have one long curl download running in Firefox and open up another window to connect to the site, it waits until the curl process completes.
In Lighttpd I have PHP 5.2.5 with FastCGI and XCache.
There are no errors. It simply waits until the first process is completed, and one second later the second request (2nd browser) starts.
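In case it is relevant: if both requests are PHP pages sharing the same session, PHP's file-based session locking will serialize them, so the second request waits until the first one releases the session file. That would explain it happening under both Apache and Lighttpd. A sketch of the usual workaround, assuming the long download is produced by a PHP script (the filename is made up):
Code:
<?php
// long-download.php - release the session lock before the slow work,
// so other requests from the same browser are not blocked
session_start();
// ... read anything you need from $_SESSION here ...
session_write_close();   // other requests can now grab the session

// the long-running curl / download work continues below without holding the lock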
I just tried to install it, but as the title says, I get this error:
Code:
./chkrootkit.sh: line 2: cd: /downloads/chkrootkit-0.48/: No such file or directory
./chkrootkit.sh: line 3: ./chkrootkit: No such file or directory
I edited /etc/cron.daily/chkrootkit.sh with pico and set it to:
Code:
#!/bin/bash
cd /downloads/chkrootkit-0.48/
./chkrootkit | mail -s "Daily chkrootkit from Servername" ****@****.com
Then I tried to test it with:
Code:
cd /etc/cron.daily/
./chkrootkit.sh
and it gives me the same error:
Code:
./chkrootkit.sh: line 2: cd: /downloads/chkrootkit-0.48/: No such file or directory
./chkrootkit.sh: line 3: ./chkrootkit: No such file or directory
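The error just means /downloads/chkrootkit-0.48/ doesn't exist on this box, so the script has nothing to cd into. A sketch of putting it there so the cron script's paths line up; I'm only assuming you already have the tarball downloaded somewhere:
Code:
mkdir -p /downloads
cd /downloads
# extract the tarball you downloaded (path is a placeholder)
tar xzf /root/chkrootkit-0.48.tar.gz
cd chkrootkit-0.48
make sense       # chkrootkit's build target
./chkrootkit     # run it once by hand to confirm it works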
I have tried many OSes but none of them worked as it should:
- Windows 2000: install aborted
- Windows XP: install aborted
- CentOS 4.4: install OK, but kernel panic on start-up
- CentOS 3.8: install OK, but only 3.8 GB identified by the OS out of 8 GB
- CentOS 3.8 64-bit: couldn't install, the CPUs support only 32 bits
Add to this, the machine boots with the EL kernel on CentOS 4.4 but not with the SMP kernel!
How can I run this machine on Linux with 8 GB of RAM?
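A 32-bit kernel only sees memory above 4 GB with PAE, and on CentOS 3/4 that is the hugemem kernel variant rather than the plain or SMP one. A sketch, assuming yum can reach the base repos; I can't promise this is also why the SMP kernel refuses to boot:
Code:
# install the 32-bit hugemem (PAE) kernel
yum install kernel-hugemem

# make sure grub's default entry points at the hugemem kernel, reboot,
# then confirm the full 8 GB is visible:
free -m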
I don't want clients taking the server's I/O and the server load over 4.00 when they do major updates etc. (queries on SQL). Is there a way to limit the amount they can do?
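MySQL itself can cap how much an individual account is allowed to do per hour, which at least stops one client's update queries from eating the whole server; note it limits query volume rather than I/O directly. A sketch with placeholder account details:
Code:
-- example per-account limits (user/host are placeholders)
GRANT USAGE ON *.* TO 'clientuser'@'localhost'
    WITH MAX_QUERIES_PER_HOUR 10000
         MAX_UPDATES_PER_HOUR 2000
         MAX_USER_CONNECTIONS 10;
FLUSH PRIVILEGES;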
I am going to run a VPS as a VPN proxy server, and therefore I was asking myself if it is possible to freeze or shut down the VPS before it exceeds its bandwidth limit of 100 GB a month.
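There usually isn't a built-in cutoff on the VPS itself, but a cron job that watches the interface counters and powers the box off near the limit is simple enough. A rough sketch reading the kernel's byte counters; it assumes the interface is eth0, needs a baseline file written at the start of each month, and the counters reset if the VPS reboots:
Code:
#!/bin/bash
# check-bandwidth.sh - run from cron; limit and paths are only examples
LIMIT_BYTES=$((95 * 1024 * 1024 * 1024))   # stop a little before the 100 GB cap
BASELINE=/root/bw-baseline                  # holds "rx tx" counters from the 1st of the month

rx=$(cat /sys/class/net/eth0/statistics/rx_bytes)
tx=$(cat /sys/class/net/eth0/statistics/tx_bytes)

# on the 1st of the month, reset the baseline instead of checking:
#   echo "$rx $tx" > /root/bw-baseline

read base_rx base_tx < "$BASELINE"
used=$(( (rx - base_rx) + (tx - base_tx) ))

if [ "$used" -ge "$LIMIT_BYTES" ]; then
    shutdown -h now "monthly bandwidth limit reached"
fi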