Mget Is Slower Than Wget
Oct 29, 2009

I am trying to back up a folder with many .tar files. When I use mget, the speed is about 1 MB/s, but with wget it is about 5-7 MB/s.
A cron job is not working for a script on a client's cPanel account.
He set it up to fetch from the path below, and it gives this error:
/bin/sh: /usr/bin/wget: Permission denied
The cron job being run is:
/usr/bin/wget -O - -q [url]
This is a cPanel box running CentOS 5.
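That error means the user the cron job runs as lacks execute permission on /usr/bin/wget. A minimal sketch of checking and restoring the execute bit, simulated on a scratch file (on the real box you would inspect /usr/bin/wget itself, as root):

```shell
# simulate the broken state on a scratch file, then restore world-execute;
# on the real server the target would be /usr/bin/wget
touch /tmp/fake_wget
chmod 700 /tmp/fake_wget   # only the owner may run it -> other users get "Permission denied"
chmod 755 /tmp/fake_wget   # restore execute permission for everyone
test -x /tmp/fake_wget && echo "executable"
rm /tmp/fake_wget
```

`ls -l /usr/bin/wget` will show whether the real binary has the `x` bit for "other"; some hardened cPanel boxes deliberately remove it for non-root users.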
I had a P4 2.8 GHz database server running CentOS 4 and MySQL 4.1. CPU usage went up to 9X% at peak time (web pages loaded within 2-3 seconds), so I decided to switch to a new server with a faster CPU. The new box is a Core 2 Duo E6600 running CentOS 5 and MySQL 5.0, with the same RAM and hard drive as the old box. Surprisingly, CPU usage stays above 150% almost all the time, and web pages take 10+ seconds to load at peak time.
How is that possible? What is the problem here: the CPU, the OS, or MySQL?
Web servers might prioritize servicing a regular GET over a conditional GET.
I have seen busy servers take 9 seconds to respond with a 304 Not Modified. On fast networks (LANs), the file size is no issue, and it seems that a 200 OK with the object downloads faster than the 304 Not Modified response alone.
Wiseowl. Not so wise on this one.
I seem to be experiencing slower bandwidth throughput in speed tests on my server today, on a domain that sits on a shared IP as a VirtualHost in Apache 2.2.x on Fedora Core 6.
Right now this domain is pushing somewhere around 35 Mbit out and 1 Mbit in.
When I download from this domain I see pausing, and overall performance is roughly half of what it was yesterday (400 KB/s versus ~800 KB/s yesterday, when the site was only doing about 3 Mbit).
However, when I test another domain on this same server that is not on a shared IP, I get my usual 100% download speed.
My server is not under heavy load: load averages are 0.02, 0.06, 0.00 with 95% idle CPU.
I'm just wondering: should sites with high bandwidth use definitely have their own IP, given that shared/virtual hosting is said to cause slowness?
If I use uk2.net's dedicated servers, will my US users see a slower connection?
If I host my site on uk2.net, will the site load more slowly for US visitors because the server is in the UK?
What about FDCServers and ServerPronto; any opinions on them?
uk2.net appeals because of the price.
Say I have a lot of traffic from the USA: does my site load more slowly if hosted in the Netherlands or the UK?
We used Red Hat with ext2 as our file system on an old server with 100k+ image files in a single directory. This seemed to perform OK until we switched to FreeBSD using UFS. Now images load very slowly. I have read that ext2 uses an internal hash to speed up lookups, while UFS does linear searches.
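If the FreeBSD box has to stay, UFS does have an optional directory hash (UFS_DIRHASH) that avoids the linear scan in large directories. A sketch of checking and raising its memory limit, assuming a reasonably modern FreeBSD (the value shown is an arbitrary example):

```shell
# is dirhash available, and how much memory may it use?
sysctl vfs.ufs.dirhash_maxmem
# very large directories need more dirhash memory; raise the cap (example value)
sysctl vfs.ufs.dirhash_maxmem=8388608
```

Splitting the 100k images into hashed subdirectories (e.g. images/a/ab/...) sidesteps the problem on any filesystem.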
I have this big file I want to transfer to my server. The direct link to the file is masked by PHP.
The URL is "/download.php?file=1" and requires authentication.
Is there any way I can wget or otherwise download the file to the server?
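If the authentication is HTTP basic auth, wget can pass the credentials directly; if it is a login form, you would need a cookie-based approach instead. A sketch with a placeholder host and credentials:

```shell
# quote the URL so the shell doesn't eat the "?"; -O names the saved file
wget --user=USER --password=PASS -O backup.dat "http://example.com/download.php?file=1"
```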
A customer is currently using wget on my server, and I read that it's not secure to leave this enabled.
I was wondering how to disable it, and then re-enable it only when this specific customer needs to use it.
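One common pattern (a sketch; the group and user names are assumptions) is to restrict wget to a dedicated group, so access is toggled by group membership instead of re-chmodding the binary each time:

```shell
groupadd wgetusers                  # a group allowed to run wget
chown root:wgetusers /usr/bin/wget
chmod 750 /usr/bin/wget             # owner and group may execute; everyone else denied
usermod -a -G wgetusers customer1   # grant this customer access
gpasswd -d customer1 wgetusers      # revoke it again later
```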
I am trying to move filename.tar.gz from server A to server B, with root on both. I have moved the file into a web-accessible folder and chmodded it to 755, but when I go to wget the file from server A I get...
I just accidentally removed wget on my server via rpm.
How do I reinstall it using rpm?
I tried yum install wget but it doesn't work.
I'm using CentOS 5.
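A couple of things worth trying, sketched here (the rpm filename pattern is an example; the exact version on your mirror will differ):

```shell
# clear stale yum metadata first, then retry the install
yum clean all
yum install wget
# or download the rpm on another working box, copy it over, and install directly
rpm -ivh wget-*.el5*.rpm
```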
What I'm trying to do is wget images, but I'm not sure how to do it 100% right.
What I've got is an index.html page with thumbnail images that link to the full-size images. How do I grab the full-size images?
Example of links on the page:
<a href="images/*random numbers*.jpg" target="_blank"><img border=0 width=112 height=150 src="images/tn_*random numbers*.jpg" style="position:relative;left:3px;top:3px" /></a>
I tried:
wget -A.jpg -r -l1 -np URLHERE
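Given the markup above, the full-size image is just the thumbnail name without the tn_ prefix, so instead of relying on the crawl you can derive each full-size path directly. A sketch (the filename is made up):

```shell
# thumbnails are images/tn_<number>.jpg and link to images/<number>.jpg,
# so stripping the "tn_" prefix yields the full-size path
thumb="images/tn_12345.jpg"                     # hypothetical thumbnail path
full=$(printf '%s\n' "$thumb" | sed 's/tn_//')  # -> images/12345.jpg
echo "$full"
```

Each derived path could then be fetched with wget against the site's base URL. That said, since the full-size images are linked directly from index.html, the -r -l1 -A.jpg attempt above should normally pick them up too; it's worth checking that -np isn't excluding the directory they live in.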
Why does the load on the server go above 2.00 when I wget large files around 300-500 MB in size?
I've just done a wget on a file that's quite large (2.3 GB), and now I want to serve the file through one of my accounts. The ownership and permissions of the file have already been changed to reflect the specific account; however, when I browse to the file through the web it will not pick up the file size, nor allow me to download it, giving 403 Forbidden Access.
Is there some setting that needs to be changed to allow users to download a file of this size, or am I missing a permission command?
Does anyone know of any Linux software that will chop up a video file for you?
My crontab uses wget to access a PHP file. It works just fine; however, each time the cron job runs it creates a file in the root of my site called filename.php.10XX,
where filename is the name of the file that wget fetches and 10XX is just the number of times the job has run.
How can I prevent these files from being created?
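wget saves a fresh copy on every run, appending .1, .2, and so on (the 10XX suffix) when the name already exists. Sending the body to /dev/null stops the clutter; a sketch of the crontab line (the schedule and URL are placeholders):

```shell
*/5 * * * * /usr/bin/wget -q -O /dev/null http://example.com/filename.php
```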
It's been so long since I last used it that I've forgotten. I need to transfer backups to my new server so I can then install them,
but I can't remember which wget command to use.
I can't find wget on a hosting account: the SSH command find / -name wget returns nothing, yet wget works properly there. What could the problem be?
Hello, I don't know how to use crontab to run PHP without using wget or lynx.
1) The PHP script runs fine via the SSH command line.
2) I can use crontab to run the PHP script with wget or lynx.
However,
3) the script will not run if I use any of the entries below:
1 2 * * * php /path/to/script/crontest.php
1 2 * * * php -q /path/to/script/crontest.php
1 2 * * * php -f /path/to/script/crontest.php
1 2 * * * /usr/local/bin/php - /path/to/script/crontest.php
1 2 * * * /usr/local/bin/php -f /path/to/script/crontest.php
1 2 * * * /usr/local/bin/php -q /path/to/script/crontest.php
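Cron runs with a minimal environment, so a bare `php` is often not on its PATH, and any error output vanishes because there is nowhere for it to go. One common diagnostic (a sketch; paths are assumptions) is to confirm the CLI binary's location with `which php` over SSH, then use that absolute path and capture the output so the real error becomes visible:

```shell
1 2 * * * /usr/local/bin/php -q /path/to/script/crontest.php >> /tmp/crontest.log 2>&1
```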
I have been under DDoS attacks in which different servers wget a certain file, so it's all pretty much over HTTP.
For example, I had 10,000 wget requests for site.com/file.rar from IP x.x.x.x,
and then the same wget from IP y.y.y.y.
Now the question is: how can I block this?
Is there a way in Apache 2 to limit downloads per IP (for example, 1 GB per IP)?
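Apache itself has no built-in per-IP byte quota, but you can cap concurrent connections per source address at the firewall. A sketch using iptables' connlimit match (the threshold of 20 is an arbitrary example):

```shell
# reject a source IP once it holds more than 20 simultaneous port-80 connections
iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j REJECT
```

For request-level limits, third-party Apache modules such as mod_evasive or mod_limitipconn are the usual route.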
Hi, I was following this guide:
[url]
Now the wget command doesn't work anymore... any ideas what is wrong with this guide? I did exactly as it said.
I have one download site with direct links.
I want to stop other servers from running wget [url]
to transfer the files to their own servers.
How can I do that?
My server runs cPanel and Apache.
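One common answer is referer-based hotlink protection in .htaccess, so only requests coming from your own pages get the files. A sketch, with example.com standing in for your domain and .rar/.zip/.gz as example extensions:

```apache
RewriteEngine On
# allow empty referers (some clients send none) and your own site
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# forbid everyone else from fetching the archives
RewriteRule \.(rar|zip|gz)$ - [F]
```

Note that wget can fake the Referer header (--referer), so this deters casual leeching rather than blocking it outright.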
I've been trying to get this file using wget, but the site uses some sort of advanced authentication that I can't get around with wget; it doesn't use cookies, it uses some other authentication method.
How can I log in to that website using wget? The form field names are usernamlc and passlc; if I can somehow POST those two using wget, I can get the download link.
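Even when a site doesn't appear to use cookies, a POST login usually sets a session cookie behind the scenes. A sketch of submitting the two form fields and reusing the session (the URLs are placeholders; the field names come from the post above):

```shell
# log in: POST the form fields, saving any session cookie the site sets
wget --save-cookies=cookies.txt --keep-session-cookies \
     --post-data='usernamlc=YOURUSER&passlc=YOURPASS' \
     -O /dev/null "http://example.com/login.php"
# fetch the protected download with that cookie jar
wget --load-cookies=cookies.txt "http://example.com/download.php"
```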
I did a wget on a file today and didn't specify where I wanted it to go (e.g. by first doing cd /dir1/dir2).
Where does wget keep the files?
I am running cPanel/WHM.
I'm making a reasonably uninformed comparison here. Since Windows Vista is noted to be more resource-intensive and slower than Windows XP, are we right to assume that Windows Server 2008 is slower than Windows Server 2003?
For instance, given two boxes with identical hardware but the two different server OSes, will the same application, say MySQL, run slower on the Windows 2008 machine?
Will there be a prominent security issue if I enable wget for a user?
And where would I find the user/group file to add that user?
I have a backup file on my old server, like:
[url]
What command line do I need to fetch this file to my new server with wget?
Please keep in mind that I have to supply my old server's username and password on that command line.
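Assuming the backup is reachable over FTP or HTTP auth, wget accepts the credentials on the command line; two equivalent sketches with placeholder values:

```shell
# credentials embedded in the URL (FTP)
wget "ftp://OLDUSER:OLDPASS@old-server.example.com/backup.tar.gz"
# or as explicit options (works for HTTP auth too)
wget --user=OLDUSER --password=OLDPASS "http://old-server.example.com/backup.tar.gz"
```

Quoting the URL keeps the shell from interpreting any special characters in the password.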
I am trying to move a client over from Verio's hosting to my VPS. The Verio control panel looks very amateurish, and there is no option to do a full account backup. There is no SSH either.
I tried wget -r ftp://user:pass@hostname/directory/*, which was mentioned in another WHT thread, but after it connects, it shows the following and stops there:
Quote:
Logging in as accountname ... Logged in!
==> SYST ... done. ==> PWD ... done.
==> TYPE I ... done. ==> CWD /directoryname ... done.
==> PASV ...
Can anyone suggest a command that will work? There are only about 35 MB of files there, but I spent hours yesterday trying to download them through FTP.
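Hanging right after "==> PASV ..." usually means a firewall is blocking the passive-mode data ports. A sketch of the usual workaround, forcing active-mode FTP and mirroring the whole directory:

```shell
wget -m --no-passive-ftp "ftp://user:pass@hostname/directory/"
```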
The most popular way of downloading these scripts is the wget command, so I want to change the wget command on our server, since it is used to download destructive scripts and run them through server-side programs such as PHP or Perl.
I tried to start by opening it:
Code:
pico /usr/bin/wget
It shows me this:
Code:
?ELF^A^A^A^@^@^@^@^@^@^@^@^@^B^@^C^@^A^@^@^@ ... (the rest is unreadable binary data)
Now, how can I change this command, or change anything in this file (wget)?
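/usr/bin/wget is a compiled ELF executable, which is why pico shows garbage; there is nothing in it to edit. The usual approaches are renaming it or locking down its permissions; a sketch (the new name is an assumption):

```shell
mv /usr/bin/wget /usr/bin/wget-hidden   # scripts blindly calling "wget" now fail
chmod 700 /usr/bin/wget-hidden          # and only root can run the renamed copy
```

Keep in mind attackers can still fetch files with curl, lynx, fetch, or PHP itself, so this is a deterrent rather than a complete fix.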
I have this strange problem with the server.
Whenever I try to wget a file onto the server it gives an error:
Quote:
Resolving superb-west.dl.sourceforge.net... failed: Name or service not known.
If I use the ftp command over SSH, it also gives an error:
Quote:
"ftp: xxxx.xxxx.net unknown host
Not connected.
Not connected.
Interactive mode off.
Not connected."
Even PayPal can't call back: some users have billing scripts, and the scripts can't trace the callback from PayPal.
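"Name or service not known" and "unknown host" both point at broken DNS resolution on the box, which would also break the billing scripts, since verifying a PayPal callback requires the server to connect back to paypal.com. A sketch of the usual checks (run as root; the fallback nameserver assumes outbound port 53 is open):

```shell
# is a resolver configured at all?
cat /etc/resolv.conf
# can the box resolve anything?
host superb-west.dl.sourceforge.net
# temporary fallback resolver while the real one is being fixed
echo "nameserver 8.8.8.8" >> /etc/resolv.conf
```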