I just want to use the wget command to transfer data from a shared host to a dedicated server. Does anybody know how to set wget to download the .htaccess file and keep the file/directory permissions the same as they were on the old server? I only know this: wget -b -c -r -l0 -np -nH -t0
For example, my current path on the server is root@server [/home/user1/public_html/upload]# and I want to copy everything inside the directory upload to /home/user1/public_html/, but when I used this command: root@server [/home/user1/public_html/upload]# cp -r -f *.* /home/user1/public_html/
it only copied files. Is there any way to copy the folders as well?
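The likely culprit is the *.* glob, which only matches names containing a dot, so most directories (and dot-less files) are skipped. Copying the current directory itself gets everything. A sketch using scratch paths in /tmp:

```shell
# *.* misses directories like images/; copying "." brings them along.
mkdir -p /tmp/demo/upload/images /tmp/demo/public_html
echo x > /tmp/demo/upload/index.html
echo x > /tmp/demo/upload/images/logo.png
cd /tmp/demo/upload
# -a (archive) recurses and preserves permissions, ownership, and times.
cp -a . /tmp/demo/public_html/
ls -R /tmp/demo/public_html
```

On the real server that would be `cp -a . /home/user1/public_html/` run from inside upload.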
I'm using the 'top' command on my server to view memory and CPU usage. To save me sitting in front of my PC for hours looking at it, is there any way I can save the output to a text file for viewing later on?
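top has a batch mode intended exactly for this. A sketch (the file path and cron schedule are just examples):

```shell
# -b (batch) writes plain text instead of the interactive screen;
# -n sets how many refreshes to capture before exiting.
top -b -n 1 > /tmp/top-snapshot.txt
head -5 /tmp/top-snapshot.txt
# To log continuously, e.g. one snapshot per minute from cron:
# * * * * * top -b -n 1 >> /var/log/top-log.txt
```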
Can you please provide a clamscan SSH command to scan every site's public_html folder? I know "clamscan -i -r --remove /home/" can scan all of /home and its subdirectories, but it causes heavy CPU usage and server load!
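One way to limit the impact is to scan one account's public_html at a time at minimum CPU priority. A sketch, assuming a cPanel-style /home/USER/public_html layout (adjust HOMEROOT for your server; on Linux you can also prepend `ionice -c 3` to lower disk priority):

```shell
HOMEROOT=${HOMEROOT:-/home}
SCANNER=${SCANNER:-clamscan}
# Loop over each account's public_html only, niced so other processes win.
for dir in "$HOMEROOT"/*/public_html; do
    [ -d "$dir" ] || continue
    nice -n 19 "$SCANNER" -i -r --remove "$dir" || echo "scan of $dir exited non-zero"
done
```

clamscan exits non-zero when it finds infections, hence the trailing message rather than aborting the loop.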
Code: zip ../d/db/backup.zip ../d/db/09-02-15.sqlite
backup.zip never appears. Instead, I get some random filename in the directory, like ziOHokOw.
If I try to zip a smaller file (last week's backup), everything runs fine:
Code: zip ../d/db/backup.zip ../d/db/09-02-08.sqlite
So the 134 MB file zips fine, but the 200 MB one seems to be failing, and I'm left with some kind of temporary file. I tried downloading the weird filename and unzipping it. It has partial info like the directory structure and the filename, but the actual file inside is corrupt.
I have recently bought a VPS hosting package. At the moment I am going through the tutorials on the net that I researched before getting the VPS, to give me some understanding of what I need to do to secure the server and how to install the software I require.
For most of today, I have been trying to sort out a problem I am currently having: a part of a tutorial from a website requires the use of apt commands.
But for every command I get the message back: apt..... command not found. I am using the Ubuntu operating system, and through some research I have got the feeling that my server might have only a bare installation, just enough to make it work.
Would I be right, and would a bare installation be missing the apt commands?
If so, how would I go about installing the apt commands and anything else I might require?
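Before installing anything, it may be worth confirming what the VPS template actually put on the box: "apt: command not found" often means either an older Ubuntu release where the command is apt-get rather than apt, or a template that isn't Ubuntu at all (CentOS/RHEL use yum). A quick check:

```shell
# Identify the distribution the template actually installed.
cat /etc/os-release 2>/dev/null || cat /etc/issue
# Report the first familiar package manager found on this system:
for pm in apt apt-get yum dnf; do
    if command -v "$pm"; then break; fi
done
```

Whichever path it prints is the tool your tutorial's `apt` commands map onto (e.g. `apt-get install ...` or `yum install ...`).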
I am trying to find the right command to back up files but exclude certain ones and whole folders. I have experimented with various --exclude= and --exclude-from= combinations in my tar backups of my /var/www/html/ folder, but to no avail. I either get errors about non-existent files when it checks the excludes, or it just ignores what I put and backs up the folders/files anyway.
If someone could give me the proper syntax to say... backup
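A common gotcha is that exclude patterns must match the paths exactly as tar sees them, so when archiving /var/www/html an exclude needs the same prefix (or a wildcard). A sketch using scratch paths in /tmp standing in for the real site:

```shell
# Build a fake site: a folder to skip entirely and a *.log file to skip.
mkdir -p /tmp/site/cache /tmp/site/pages
echo x > /tmp/site/pages/index.html
echo x > /tmp/site/pages/debug.log
echo x > /tmp/site/cache/tmpfile
# Excludes come before the paths being archived; the directory pattern
# carries the same prefix as the path given to tar.
tar czf /tmp/site.tar.gz \
    --exclude='/tmp/site/cache' \
    --exclude='*.log' \
    /tmp/site
tar tzf /tmp/site.tar.gz
```

For many patterns, put them one per line in a file and use `tar czf out.tar.gz -X exclude.txt /var/www/html` (-X is the same as --exclude-from=).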