I am trying to move a client over from Verio's hosting to my VPS. The Verio control panel looks very amateurish and there is no option to do a full account backup. There is no SSH either.
I tried wget -r ftp://user:pass@hostname/directory/*, which was mentioned in another WHT thread, but after it connects, it shows the following and stops there:
Quote:
Logging in as accountname ... Logged in!
==> SYST ... done. ==> PWD ... done.
==> TYPE I ... done. ==> CWD /directoryname ... done.
==> PASV ...
Can anyone suggest a command that will work? There is only about 35 MB of files there, but I spent hours yesterday trying to download through FTP.
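If wget stalls right at PASV, the usual culprit is a firewall blocking passive-mode FTP data connections. Forcing active FTP with --no-passive-ftp is worth trying; the credentials, hostname and path below are placeholders, as in the original command:
Code:
wget -r --no-passive-ftp "ftp://user:pass@hostname/directory/"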
Is there a command I can type into the SSH console to stop a current transfer that I started with the wget command?
The file I'm wgetting always stuffs up at 51%, but then the server just retries and starts again. It's done it 3 times so far and I just want to cancel the process completely if possible.
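If the transfer is still running in your foreground SSH session, Ctrl+C will kill it. If it's running in the background, find the process and kill it; these are standard commands on any Linux box:
Code:
ps aux | grep wget    # note the PID in the second column
kill 12345            # replace 12345 with the actual PID
pkill wget            # or kill every running wget in one go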
The most popular way of downloading these scripts is to use the "wget" command, so at times we need to change the wget command on our server to stop someone downloading a destructive script and running it through a server-side program such as PHP or Perl ...
Now I try to start this session with this command ...
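One common hardening step, assuming you have root, is to take execute permission on the wget binary away from ordinary users, so scripts running as the web-server user can no longer call it (the binary's path may differ on your system):
Code:
chmod 700 /usr/bin/wget    # now only root can run wget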
I just want to use the wget command to transfer data from a shared host to a dedicated server. Does anybody know how to set wget to download the .htaccess file and keep the files' and directories' permissions the same as they were on the old server? I only know these: wget -b -c -r -l0 -np -nH -t0
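For what it's worth, wget over FTP can't preserve ownership or permissions at all, and many FTP servers hide dotfiles from directory listings, so a recursive grab will skip .htaccess. A rough sketch, with placeholder credentials and paths:
Code:
# fetch the hidden file explicitly, since it won't show in the listing
wget ftp://user:pass@oldhost.example.com/public_html/.htaccess
# permissions then have to be restored by hand on the new server, e.g.:
chmod 644 .htaccess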
After having my site on a Win2k3 virtual server for the last 10 years, I've finally bitten the bullet and purchased a VPS account.
My site is an old legacy of MS FrontPage, complete with FPSE, Access databases, ASP, HTML pages and lots of images. Total size is just over 4 GB.
I've tried publishing the site using MS Expression Web directly from the virtual server to the VPS, and I've tried publishing it from the backup on my PC. Nothing works; I just get error after error after error.
EW starts to publish and may run for 4 hours before throwing up an error saying that it can't find the web server or that FPSE isn't installed. They are basically the same old error messages that all of Microsoft's FrontPage and Expression Web software has been throwing up since the day FrontPage was first marketed back in 1998! If they can't solve the problem, you'd at least think they could change the error messages.
I've tried uploading the files via FTP, but without the MS server extensions the child webs aren't created, so I just end up with permission problems and nothing works properly.
I have had multiple websites hacked and need to do a cleanup. I need to run a command that will log all files (including the path to each file) that contain <!-- ~ --> to a text file, searching from the /home/* directory.
So far I have received 2 different ways to do it, but neither of them has worked.
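A recursive, fixed-string grep should do this in one shot; -l prints just the matching file names with their full paths. The output file name here is only an example:
Code:
grep -rlF '<!-- ~ -->' /home/ > /root/infected_files.txt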
I got our server staff to install BackupPC on our dedicated server and run daily backups. All is well (from what I can tell), but I've come to a point where I want to actually use an archived file.
So I jump into my FTP client, navigate to the backup folder, and download the file. The problem is, the file reads as gibberish. I'm assuming that BackupPC has compressed it.
So the question is two-fold: 1. How do I decompress it on Windows (command-line stuff is well beyond me)?
2. Are there any browser/Windows apps that I can use to manage the backups?
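BackupPC stores files in its own zlib-based compressed format, so ordinary Windows unzip tools won't read them. The usual route is to decompress on the server with the bundled BackupPC_zcat utility (the install path varies by distribution; the one below is a common default), and to manage restores through BackupPC's own web interface, which is exactly the browser app you're after:
Code:
/usr/share/BackupPC/bin/BackupPC_zcat compressed_file > original_file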
I have a small issue that's probably easy to answer. If I upload a zip file to a Linux server and run this command via SSH:
Code:
unzip -a name_of_zip.zip
It does unzip the directories as expected, but it makes all file names and folders lowercase. This is a problem when trying to install software that relies on case-sensitive names.
Does anyone know what command tells the server to retain the file names and not alter them?
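The -a switch only converts text-file line endings, so the lowercasing is probably coming from somewhere else; unzip's -L / -LL options are what force lowercase names, typically for archives made on old uppercase-only systems. A plain extract, plus a check that unzip isn't aliased, is worth trying:
Code:
unzip name_of_zip.zip    # no -a, no case conversion
alias | grep unzip       # make sure nothing is silently adding -L or -LL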
I am trying to move filename.tar.gz from server A to server B with root on both. I have moved the file into a web-accessible folder and chmodded it 755, but when I go to wget the file from server A I get...
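Since you have root on both machines, you can skip the web server entirely and copy over SSH; otherwise a plain wget over HTTP should work once the file is world-readable. Hostnames and paths below are placeholders:
Code:
# on server B, pull the file over HTTP:
wget http://serverA.example.com/filename.tar.gz
# or copy it straight over SSH, no web-accessible folder needed:
scp root@serverA.example.com:/root/filename.tar.gz /root/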
I've just done a 'wget' on a file that's quite large (2.3 GB), and now I want to serve the file through one of my accounts. The ownership and permissions of the file have already been changed to reflect the specific account; however, when I browse to the file through the web it will not pick up the file size, nor will it allow me to download the file, stating 403 - Forbidden Access.
Is there some setting that needs to be changed to allow users to download a file of this size, or am I missing a permission command?
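Two things worth checking: older 32-bit Apache builds without large-file support can choke on anything over 2 GB, and a 403 often just means the web-server user can't read the file or traverse one of its parent directories. A quick permission sanity check, with placeholder paths:
Code:
chmod 644 /home/account/public_html/bigfile.tar.gz   # file must be readable
chmod 755 /home/account/public_html                  # every parent dir needs +x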
My crontab uses wget to access a PHP file. It works just fine; however, the only problem is that each time the cron job runs, it creates a file in the root of my site called filename.php.10xx,
where filename is the name of the file that wget fetches and 10xx is just the number of times the job has been run.
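wget writes every response to disk and numbers duplicates .1, .2 and so on, which is where those files come from. Sending the output to /dev/null stops them piling up; the URL here is a placeholder for whatever your cron job fetches:
Code:
wget -q -O /dev/null http://example.com/filename.php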
I can't find wget on a hosting account. The SSH command find / -name wget returns nothing; however, wget works properly on the account. What could the problem be?
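As a non-root user, find / silently skips every directory you aren't allowed to read, so it can easily miss the binary. Asking the shell where the command actually resolves is more reliable:
Code:
which wget     # print the path the shell executes
type -a wget   # also shows aliases or shell functions named wget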
I've been trying to get this file using wget, but the site uses some sort of advanced authentication that I can't get around with wget. It doesn't use cookies; it uses some other authentication method.
How can I log in to that website using wget? The form field names are usernamlc and passlc. If I can somehow POST those two using wget, I can get the download link.
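If the login form simply POSTs those two fields, wget can submit them and reuse whatever session comes back. A sketch with a placeholder login URL; note that --save-cookies/--load-cookies only help if the site does hand back a session cookie after all:
Code:
# submit the form and keep any session cookie
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'usernamlc=USERNAME&passlc=PASSWORD' \
     http://example.com/login.php
# then fetch the protected file with the saved session
wget --load-cookies cookies.txt http://example.com/path/to/file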