To automate a task, I need a shell script that copies the folder blank/ to a new location and renames it to news/, renames the files inside that directory to match, and, if possible, changes the content of one of those files as well.
For example, I need to copy this:
Code:
blank/
    templates/
        frontend/
            index.html
    Blank.php

...to a new folder and rename it to this:
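A minimal sketch of how this could work, assuming GNU sed and that the new name is passed as an argument; the capitalization rule (Blank.php becomes News.php) is my assumption from the example above:

Code:
#!/usr/bin/env bash
# Usage: ./clone_template.sh news
set -euo pipefail

name="$1"                                                        # e.g. "news"
Name="$(tr '[:lower:]' '[:upper:]' <<< "${name:0:1}")${name:1}"  # e.g. "News"

cp -r blank/ "$name"                        # copy the whole template folder
mv "$name/Blank.php" "$name/${Name}.php"    # rename the file inside

# Replace occurrences of the old name inside the copied files (GNU sed)
grep -rl 'Blank' "$name" | xargs sed -i "s/Blank/${Name}/g"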
I have a large directory that I want to copy to another account on the same server. It's one folder containing 20,000+ files, around 2 GB in size.
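A hedged sketch, assuming root access and that the other account's home is /home/otheruser (placeholder names):

Code:
# rsync preserves permissions and timestamps, and can resume if interrupted
rsync -a /home/thisuser/bigfolder/ /home/otheruser/bigfolder/
# hand ownership to the destination account
chown -R otheruser:otheruser /home/otheruser/bigfolder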
Can anyone tell me a simple way in Bash to copy all of the contents of a directory (and only the contents), including hidden files, into another, existing directory?
E.g.
Code:
# I have this directory structure
directory_A/
    existing_file
directory_B/
    some_file
    some_subdirectory/
    .some_hidden_file
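A minimal sketch of one common answer, assuming GNU cp and that directory_B's contents should land in directory_A:

Code:
# The trailing "/." makes cp copy the contents of directory_B,
# hidden files included, into the existing directory_A
cp -a directory_B/. directory_A/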
I have tried copying files from a Linux server to a Windows server using WinSCP3, and it was very fast.
I am not sure how to copy files from Windows to Windows at the same speed. I tried using RDC between the Windows servers and sharing one drive, but it wasn't as fast as WinSCP3.
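One hedged option is robocopy from a command prompt on the destination server; the share and target paths below are placeholders:

Code:
:: /E copies subdirectories (including empty ones), /Z enables restartable mode
robocopy \\sourceserver\share D:\dest /E /Z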
I want to copy my friend's files from /home/admin/dc to /home/, as I am planning to rebuild Kloxo for him. This could kill all the files, so I would like to know the command to copy a few files from the above location to the home directory.
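A hedged sketch (the file names are placeholders for whichever files matter):

Code:
# copy the whole directory as a safety net
cp -a /home/admin/dc /home/dc-backup
# or just a few specific files
cp -a /home/admin/dc/file1 /home/admin/dc/file2 /home/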
I need to move the httpdocs folders of multiple sites on my Plesk server to another server (the share on the other server is already mounted, so that's not an issue), but there are 31 different domains I need to copy. Does anyone know a way to batch-automate a cp process to copy each domain's files to a new folder on the remote server? (This is for backup purposes, not live sites, so I can't use Plesk migration, etc.)
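A hedged sketch, assuming the standard Plesk vhost layout and that the share is mounted at /mnt/backup (placeholder path):

Code:
# loop over every domain's httpdocs and copy it into a per-domain folder
for d in /var/www/vhosts/*/httpdocs; do
    domain=$(basename "$(dirname "$d")")
    cp -a "$d" "/mnt/backup/$domain"
done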
What tool is there for copying a lot of files from one webhost (FTP site) to another webhost, for example when migrating to another host?
I'm thinking of a tool where I can specify the FTP access information for both sites, and it would perform the copy without needing to store the files on my local hard disk first.
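A hedged sketch with lftp, whose mirror command accepts a URL as the target and which can use direct server-to-server (FXP) transfers when both servers allow it; the hosts and credentials below are placeholders:

Code:
lftp -u user1,pass1 -e '
  set ftp:use-fxp true
  mirror / ftp://user2:pass2@new.example.com/
  quit
' ftp://old.example.com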
I have been renting a Windows dedicated server for just about a month, and I never had a problem before.
When I copy files on my home computer and then paste them onto the server's drive, the files begin copying, but after some time they stop and I hear a double beep sound.
I need to transfer a client's site files (over 220 MB) to my server. The client does not use cPanel or have SSH access.
FTP is horribly tedious. I have created the account on my server and have SSH enabled. I have a feeling I can use wget to download the files to the account's home directory, but I am not sure of the correct syntax to recursively download all the directories and files.
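A hedged sketch, run from the new account's home directory over SSH; the host and credentials are placeholders:

Code:
# -m (mirror) recurses through the whole FTP tree and keeps timestamps
wget -m ftp://ftpuser:ftppass@clientsite.example.com/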
I'd like to know how to set up a cron job to copy files & directories from one folder to the root folder. I have cPanel X.
My root directory is public_html/. I have another directory, public_html/uploads, containing both files and directories.
I need a cron job that will copy all the files & directories from public_html/uploads to the root public_html/
If it helps, here is some system info:
General server information:
Operating system: Linux
Kernel version: 2.6.22_hg_grsec_pax
Apache version: 1.3.39 (Unix)
PERL version: 5.8.8
Path to PERL: /usr/bin/perl
Path to sendmail: /usr/sbin/sendmail
PHP version: 4.4.4
MySQL version: 4.1.22-standard
cPanel Build: 11.17.0-STABLE 19434
Theme: cPanel X v2.6.0
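For the cron question above, a hedged crontab sketch, assuming an hourly run and that /home/user/public_html is the account's document root (placeholder path):

Code:
# Run hourly: copy everything (hidden files included) from uploads/ up to the docroot
0 * * * * cp -a /home/user/public_html/uploads/. /home/user/public_html/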
I am having problems with two accounts; I get the following errors:
Code:
Failed to copy files storage to destination path.
stderr: filemng: Cannot open destination file '/var/www/vhosts/domain.tld/httpdocs/index.html.Chn3rn'
System error 122: Disk quota exceeded
stdout: filemng: Cannot open destination file '/var/www/vhosts/domain.tld/httpdocs/index.html.Chn3rn'
System error 122: Disk quota exceeded
I'll mention this is a cPanel/WHM server with about 15 accounts on it.
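Since the error points at an exceeded disk quota, a hedged way to check where an affected account stands (the username is a placeholder):

Code:
# show the user's block/inode usage against their limits
quota -u username
# or report quota usage for all users on quota-enabled filesystems
repquota -a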
I get an e-mail every day from my server's backup script telling me what new or changed files it found. Yesterday this summary was massive, and it showed new or changed files in these directories:
The /home/virtfs directory is something I haven't seen before, and it contained two directories belonging to two accounts on the server.
Does anyone know why these directories would be created or what they are for? The bit of reading I've done on this so far mentions that these directories are created as a result of jailed SSH, but neither of those two accounts has ever had jailed SSH enabled.
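A hedged way to inspect what's under there (cPanel uses /home/virtfs to bind-mount system directories for jailed shells):

Code:
# list any bind mounts still active under /home/virtfs
mount | grep virtfs
# see how much space the leftover trees actually occupy
du -sh /home/virtfs/*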
Is it possible to allow only one IP at a time to download files from a directory? The web server is Apache; maybe there is an Apache module for this, or some built-in configuration that can be applied via .htaccess.
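The closest approach I know of is the third-party mod_limitipconn module, which caps simultaneous connections per client IP (it won't serialize downloads across different IPs, and I'm not certain its directives are honored in .htaccess rather than the vhost config):

Code:
<IfModule mod_limitipconn.c>
    # allow each client IP at most one simultaneous connection to this area
    MaxConnPerIP 1
</IfModule>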
We used Red Hat with ext2 as our file system on an old server with 100k+ image files in a single directory. This seemed to perform OK until we switched to FreeBSD using UFS. Now images load very slowly. I have read that ext2 uses an internal hash to speed lookups, while UFS does linear searches for lookups.
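One hedged thing to check on the FreeBSD side: UFS has an in-memory directory hash (dirhash) that avoids linear scans, but only for directories that fit within its memory cap:

Code:
# show the current dirhash memory limit (bytes)
sysctl vfs.ufs.dirhash_maxmem
# raise it so a 100k-entry directory can be hashed (value is an example)
sysctl vfs.ufs.dirhash_maxmem=16777216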