Reinstalling Wget
Jul 2, 2009
I just accidentally removed wget on my server via rpm. How do I reinstall it using rpm? I tried yum install wget but it doesn't work. I'm using CentOS 5.
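If yum can't find the package, the repository metadata may be stale or the repo disabled. A minimal sketch of things to try on CentOS 5; the mirror URL and exact package version below are assumptions, so substitute ones matching your release and architecture:

```shell
# Refresh yum's cache and retry
yum clean all
yum install wget

# If yum itself is broken, fetch the RPM another way and install it directly
# (mirror path and package version are placeholders -- check your mirror)
curl -O http://vault.centos.org/5.11/os/i386/CentOS/wget-1.11.4-3.el5.i386.rpm
rpm -ivh wget-1.11.4-3.el5.i386.rpm
```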
A cron job is not working for a script on a client's cPanel account.
He set it up to grab from the path below and it is giving this error:
/bin/sh: /usr/bin/wget: Permission denied
The cronjob that is running is:
/usr/bin/wget -O - -q [url]
This is a cpanel box with centos 5
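"Permission denied" on /usr/bin/wget usually means the binary's execute bits were tightened so ordinary users can't run it, which is a common cPanel hardening step. A minimal check and fix as root; whether you want every user able to run wget is a policy call:

```shell
# See who is allowed to execute wget
ls -l /usr/bin/wget
# e.g. -rwx------ 1 root root ...  means only root can run it

# Restore the usual permissions so the client's cron job can execute it
chmod 755 /usr/bin/wget
```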
I started learning how to use Magento's eCommerce solution, and I ran into a problem when I mistakenly changed the root pw for my phpMyAdmin area. After trying to find a fix online, it seemed like the best thing to do would be to reinstall WAMP. So I uninstalled it using Control Panel's 'Add or Remove Programs.' After reinstalling WAMP, I am having difficulty getting it to work properly.
When I try to launch Wamp (WampServer 2), I get an alert saying "Apache HTTP Server has encountered a problem and needs to close. We are sorry for the inconvenience." The message and the error signature are as follows:
But for some reason, the WAMP system tray icon says that it's online (it shows a semicircle with 3/4 of it colored yellow). However, when I try to connect to localhost in my browser, I get the following message: "Failed to Connect. Firefox can't establish a connection to the server at localhost."
I had to reinstall Apache and PHP because something went corrupt with them on the server.
After I successfully reinstalled, a lot of bugs and SQL/PHP errors started showing up server-wide.
So I am wondering: after the reinstall, what else needs to be done? Is my past configuration lost? Do modules/addons need to be reinstalled and configured as well?
I'm a Linux user trying to install a few packages on a new dedicated Debian server.
Following some tutorials, I installed Apache 1.3 & PHP 4 using apt-get install and had them up and running, no problem.
I then decided to change the versions, so removed them using apt-get remove and deleted the relevant folders from /etc (this was before I knew of the apt-get --purge option).
I then tried to install Apache2 and PHP5, and although it appears to have worked, folders are not created for them in /etc so it doesn't work.
Is there a way of doing a fresh clean install so that the folders are created? I've tried apt-get clean without success.
Alternatively, is there a way to roll Debian back to its previous state, before I got my grubby little hands on it?
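apt-get clean only empties the downloaded-package cache; it never touches configuration. Because the packages were removed without --purge, dpkg still believes their config files exist, so reinstalling skips recreating the /etc trees. A sketch of the usual fix; the exact package names are assumptions for a Debian box of that era:

```shell
# Purge the removed-but-not-purged packages so dpkg forgets their config state
apt-get remove --purge apache2 apache2.2-common php5

# Reinstall from scratch; /etc/apache2 and /etc/php5 should now be
# recreated from the package defaults
apt-get install apache2 php5
```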
After reinstalling the MTA / Postfix / SMTP (because I couldn't send mail), my Plesk installation crashed.
I wanted to log in as admin but it doesn't accept my password. When I log in as root, it wants me to accept the license.
Now I'm copying all of /var/www/vhosts/, in case doing the setup steps in Plesk overwrites all my website content... I hope not all my Plesk settings are gone.
Why do I pay money every month for a license? The trouble and work I have with Plesk...
I have this big file i want to transfer to my server. The Direct Link to file is being masked by PHP.
The URL is "/download.php?file=1" and requires authentication.
Is there any way I can wget or download the file to the server?
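If the authentication is HTTP basic auth, wget can supply the credentials directly; if it's a login form, you'd need cookie handling instead. A minimal basic-auth sketch; the host and credentials are placeholders:

```shell
# HTTP basic auth; quote the URL so the shell ignores ? and &
wget --http-user=myuser --http-password=mypass \
     -O file.bin 'http://example.com/download.php?file=1'
```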
A customer is currently using wget on my server. I read that it's not secure to leave this enabled.
I was wondering how to disable it, and then re-enable it only when this specific customer needs to use it?
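One common approach is to toggle the execute bits on the wget binary rather than uninstalling it. A sketch, run as root; "customer1" is a placeholder group name:

```shell
# Disable wget for everyone except root
chmod 700 /usr/bin/wget

# Re-enable it while the customer needs it, then lock it down again
chmod 755 /usr/bin/wget
chmod 700 /usr/bin/wget

# Finer-grained alternative: allow only one user's group to execute it
chgrp customer1 /usr/bin/wget
chmod 750 /usr/bin/wget
```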
I am trying to move filename.tar.gz from server A to server B, with root on both. I have moved the file into a web-accessible folder and chmodded it to 755; when I go to wget the file from server A, I get...
What I'm trying to do is wget images; however, I'm not sure how to do it 100% right...
What I've got is an index.html page with thumbnail images that link to the full-size images. How do I grab the full-size images?
Example of links on the page:
<a href="images/*random numbers*.jpg" target="_blank"><img border=0 width=112 height=150 src="images/tn_*random numbers*.jpg" style="position:relative;left:3px;top:3px" /></a>
I tried
wget -A.jpg -r -l1 -np URLHERE
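That command is close: -r -l1 -np with an accept list of .jpg should follow the <a href> links one level down and keep only the JPEGs. To drop the tn_* thumbnails and keep just the full-size files, a reject pattern can be added. A sketch; the URL is a placeholder:

```shell
# Recurse one level from the index page, keep .jpg, skip tn_* thumbnails,
# and dump everything into the current directory (-nd)
wget -r -l1 -np -nd -A '.jpg' -R 'tn_*' http://example.com/gallery/index.html
```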
Why, when I wget large files around 300-500 MB in size, does the load on the server go above 2.00?
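If the load spike comes from the transfer saturating disk and network I/O, throttling the download usually helps. A sketch using wget's built-in rate limit; the URL and the 2 MB/s cap are arbitrary examples:

```shell
# Cap the download so it doesn't monopolize disk/network bandwidth
wget --limit-rate=2m http://example.com/large-file.iso

# Optionally also lower the process's scheduling priority
nice -n 19 wget --limit-rate=2m http://example.com/large-file.iso
```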
I've just done a wget on a file that's quite large (2.3GB), and now I want to serve the file through one of my accounts. The ownership and permissions of the file have already been changed to reflect the specific account; however, when I browse to the file through the web it will not pick up the filesize, nor will it let me download the file, stating 403 - Forbidden Access.
Is there some setting that needs to be changed to allow users to download a file of this size? Or am I missing a permission command?
Does anyone know any Linux software that will chop up a video file for you?
My crontab uses wget to access a PHP file. It works just fine; however, the only problem is that each time the cron job runs, it creates a file in the root of my site called filename.php.10XX.
filename is the name of the file that wget fetches, and 10XX is just the number of times the job has been run.
How can I prevent these files from being created?
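wget saves each response to a file, and when the name already exists it appends .1, .2, and so on, which is where filename.php.10XX comes from. Telling wget to discard the output stops the files from appearing. A sketch of the crontab line; the path and schedule are placeholders:

```shell
# Fetch the page quietly and throw the body away instead of saving it
*/5 * * * * /usr/bin/wget -q -O /dev/null http://example.com/filename.php
```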
It's been so long since I used it that I have forgotten. I need to transfer backups to my new server so I can then restore them.
But I have forgotten which wget command to use.
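The basic form is just wget plus the URL of the backup on the old server; -c lets you resume if the transfer drops. A sketch with a placeholder hostname and path:

```shell
# Resume-capable download of a backup archive from the old server
wget -c http://old-server.example.com/backups/backup.tar.gz

# Then unpack it on the new server
tar -xzf backup.tar.gz
```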
I can't find wget on a hosting account. The SSH command find / -name wget returns nothing; however, wget works properly on the host. What could the problem be?
Hello, I don't know how to use crontab to run PHP without using wget or lynx.
1) The PHP script runs fine via the SSH command line.
2) I can use crontab to run the PHP script with wget or lynx.
However,
3) The script will not run if I use any of the entries below:
1 2 * * * php /path/to/script/crontest.php
1 2 * * * php -q /path/to/script/crontest.php
1 2 * * * php -f /path/to/script/crontest.php
1 2 * * * /usr/local/bin/php - /path/to/script/crontest.php
1 2 * * * /usr/local/bin/php -f /path/to/script/crontest.php
1 2 * * * /usr/local/bin/php -q /path/to/script/crontest.php
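When the same command works in an interactive shell but not from cron, the usual culprits are cron's minimal PATH (so a bare "php" may not resolve) and errors being silently discarded. A sketch of things to check; /usr/local/bin/php is assumed from the entries above:

```shell
# Find where the CLI php binary actually lives on this box
which php

# Use the full path in the crontab entry and log stdout/stderr so
# any error message becomes visible instead of vanishing
1 2 * * * /usr/local/bin/php -q /path/to/script/crontest.php >> /tmp/crontest.log 2>&1
```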
I have been under DDoS attacks, and what they do is have different servers wget
a certain file, so it's all pretty much over HTTP.
For example: I had 10000 wget requests for site.com/file.rar from IP x.x.x.x,
and then the same wget from IP y.y.y.y.
Now the question is, how could I block this?
Is there a way in Apache 2 to limit downloads per IP (for example, 1 GB per IP)?
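Stock Apache doesn't ship per-IP bandwidth quotas, but capping simultaneous connections per source IP at the firewall will blunt this kind of HTTP flood. A sketch using the iptables connlimit match; the threshold of 20 is an arbitrary example:

```shell
# Drop new HTTP connections from any single IP that already has
# more than 20 connections open
iptables -A INPUT -p tcp --dport 80 --syn \
         -m connlimit --connlimit-above 20 -j DROP

# A known-abusive IP can also be blocked outright
iptables -A INPUT -s x.x.x.x -j DROP
</imports>
```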
I'm trying to back up a folder with many .tar files.
When I use mget, the speed is about 1MB/s,
but wget gets about 5-7MB/s.
Hi I was following this guide:
[url]
Now the "wget" command doesn't work anymore... any ideas what is wrong with this guide? I followed it exactly as written.
I have a download site with direct links.
I want to stop other servers from running wget [url]
to transfer my files to their own servers.
How can I do this?
My server runs cPanel and Apache.
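A common cPanel/Apache approach is referer-based hotlink protection in .htaccess: requests that don't come from your own pages are refused, and wget sends no referer by default. A sketch; the domain and path are placeholders, and note that referer headers can be spoofed, so this is a deterrent rather than real security:

```shell
# Create hotlink protection in the download directory (uses Apache mod_rewrite)
cat > /home/user/public_html/downloads/.htaccess <<'EOF'
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(rar|zip|tar\.gz)$ - [F]
EOF
```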
I've been trying to get this file using wget, but the site uses some sort of advanced authentication that I can't get around with wget. It doesn't use cookies; it uses some other authentication method.
How can I log in to that website using wget? The form field names are usernamlc and passlc; if I can somehow POST those two using wget, I can get the download link.
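If the login really is a plain POST form, wget's --post-data plus its cookie options can often get through. A sketch using the two field names from the post; the login and download URLs and the credentials are placeholders:

```shell
# POST the two form fields and keep whatever session cookie comes back
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'usernamlc=myuser&passlc=mypass' \
     -O /dev/null http://example.com/login

# Then request the download link with the saved cookie
wget --load-cookies cookies.txt -O file.zip http://example.com/download
```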
I did a wget on a file today and I didn't specify where I wanted it to go, e.g. with cd /dir1/dir2 first.
Where does it keep the files I wget?
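By default wget writes into whatever directory you ran it from, i.e. the shell's current working directory. A short sketch; URLs and paths are placeholders:

```shell
pwd                                              # the file landed here
wget http://example.com/file.tar.gz              # saves into the current directory
wget -P /home/me/downloads http://example.com/file.tar.gz   # choose a directory
wget -O /tmp/renamed.tar.gz http://example.com/file.tar.gz  # choose full path/name
```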
I am running cPanel/WHM.
Will there be a prominent security issue if I enable wget for a user?
And where would I find the user/group file to add that user?
I have a backup file on my old server, like:
[url]
What is the command line I have to use to get this file with wget onto my new server?
Please keep in mind that I have to give my old server's user/password in this command line.
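Credentials can go straight into the URL or into dedicated flags. A sketch with a placeholder host, path, and credentials; note the password ends up in your shell history and in "ps" output either way:

```shell
# Credentials embedded in the FTP URL
wget 'ftp://myuser:mypass@old-server.example.com/backups/backup.tar.gz'

# Or supplied as separate options
wget --ftp-user=myuser --ftp-password=mypass \
     'ftp://old-server.example.com/backups/backup.tar.gz'
```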
I am trying to move over a client from Verio's hosting to my VPS. The verio control panel looks very amateurish and there is no option to do a full account backup. There is no SSH either.
I tried wget -r ftp://user:pass@hostname/directory/* which was mentioned in another WHT thread, but after it gets connected, it shows the following and stops there:
Quote:
Logging in as accountname ... Logged in!
==> SYST ... done. ==> PWD ... done.
==> TYPE I ... done. ==> CWD /directoryname ... done.
==> PASV ...
Can anyone suggest a command that will work? There is only about 35 MB of files there, but I spent hours yesterday trying to download through FTP.
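Hanging right at "PASV" usually means passive-mode FTP data connections are being blocked by a firewall somewhere along the path. A sketch of what to try; the URL pieces are the same placeholders as in the original command:

```shell
# Force active-mode FTP instead of passive
wget -r --no-passive-ftp 'ftp://user:pass@hostname/directory/'

# Mirror the whole directory tree; --no-parent stops it climbing upward
wget -r -l0 --no-parent --no-passive-ftp 'ftp://user:pass@hostname/directory/'
```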
The most popular way of downloading these scripts is to use the wget command, so sometimes we need to change the wget command on our server to stop people downloading destructive scripts and running them through a server-side program such as PHP or Perl...
Now I try to start by opening it with this command:
Code:
pico /usr/bin/wget
It shows me this:
Code:
[binary ELF data: /usr/bin/wget is a compiled executable, not a text script]
Now how can I change this command, or change anything in this file (wget)?
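Since /usr/bin/wget is a compiled binary, editing it in pico is a dead end. The usual way to "change" it is to rename it or restrict who can run it. A sketch:

```shell
# Rename the binary so scripts that call 'wget' simply fail
mv /usr/bin/wget /usr/bin/wget-disabled

# Restore it when legitimately needed
mv /usr/bin/wget-disabled /usr/bin/wget

# Or keep the name but remove execute permission for non-root users
chmod 700 /usr/bin/wget
```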
I have this strange problem with the server.
Whenever I try to wget a file onto the server, it gives an error:
Quote:
Resolving superb-west.dl.sourceforge.net... failed: Name or service not known.
If I use the FTP command over SSH, it still gives an error:
Quote:
"ftp: xxxx.xxxx.net unknown host
Not connected.
Not connected.
Interactive mode off.
Not connected."
Even PayPal can't call back. Some users with billing scripts are facing problems where the scripts can't receive the callback from PayPal.
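"Name or service not known" from both wget and ftp, plus failing PayPal callbacks, points at broken DNS resolution on the server itself rather than at wget. A sketch of the usual checks; the nameserver IPs are examples, so prefer your provider's resolvers:

```shell
# Can the box resolve anything at all?
host sourceforge.net        # or: nslookup sourceforge.net

# Check which resolvers are configured
cat /etc/resolv.conf

# If the file is empty or wrong, point it at working nameservers
echo "nameserver 4.2.2.1" >> /etc/resolv.conf
echo "nameserver 4.2.2.2" >> /etc/resolv.conf
```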
I just want to use the wget command to transfer data from a shared host to a dedicated server. Does anybody know how to set wget to download the .htaccess file and keep the file/directory permissions the same as they were on the old server?
I only know these options: wget -b -c -r -l0 -np -nH -t0
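wget itself can't preserve ownership or Unix permissions, and over HTTP it won't even see dotfiles like .htaccess unless something links to them. If SSH is available on both sides, a tar pipe keeps permissions, dotfiles, and timestamps intact. A sketch with placeholder host and paths:

```shell
# Pull everything (including .htaccess files) from the shared host,
# preserving permissions and timestamps
ssh user@old-host.example.com 'tar -czf - -C /home/user public_html' | tar -xzf -

# If only FTP is available, wget -r over ftp:// can fetch .htaccess when the
# FTP server lists it, but permissions must be restored by hand afterwards
wget -r -l0 -nH 'ftp://user:pass@old-host.example.com/public_html/'
```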