I have a large directory that I want to copy to another account on the same server. It's one folder containing 20,000+ files, around 2GB in size.
Can anyone tell me a simple way in Bash to copy all of the contents of a directory (and only the contents), including hidden files, into another, existing directory?
E.g.
Code:
# I have this directory structure
directory_A
--- existing_file
directory_B
--- some_file
--- some_subdirectory
------ .some_hidden_file
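A sketch of two common approaches, assuming the goal is to land directory_B's contents inside directory_A (the "/." suffix and the trailing slash are what make the hidden files come along):

Code:
# GNU cp: the /. suffix copies the contents, dotfiles included
cp -a directory_B/. directory_A/

# rsync: a trailing slash on the source means "contents of", not the directory itself
rsync -a directory_B/ directory_A/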
In a previous thread we did a few manual transfers of our domains.
We also ran some automated migrations of a few domains/sites using Web Host Manager's "Copy an account from another server" feature.
All the files and other settings were transferred properly from the old server to the new one, but the MySQL databases are not visible on the new server. I am unsure whether they got copied at all.
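A couple of quick checks you could run on the new server to see whether the databases actually arrived (cPanel names them with the account's user_ prefix):

Code:
# list every database the MySQL server knows about
mysql -e "SHOW DATABASES;"

# or look at the raw data directory (the default MySQL location)
ls /var/lib/mysql/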
There is a directory, say "Master", and inside "Master" there is a sub-directory, "Slave". A user who has access to "Master" should automatically be able to access "Slave". However, a user who has access to "Slave" should not have access to "Master". This type of protection is not possible inside cPanel.
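One way to do this at the filesystem level is with POSIX ACLs: give the restricted user traverse-only access to Master (execute without read), so they can pass through it to reach Slave but cannot list or read Master itself. A sketch, with hypothetical usernames and paths:

Code:
# 'slaveuser' may pass through Master but not list its contents
setfacl -m u:slaveuser:--x /path/to/Master
# full read access inside Slave (capital X = execute on directories only)
setfacl -R -m u:slaveuser:rX /path/to/Master/Slave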
I want to move the entire contents of a directory tree to another directory.
So for example we may have a directory with 15 directories inside, each containing files of its own. I want to copy all the files from the directory tree into another directory located somewhere else on the filesystem. I want only the files to end up in the other directory, not the directory structure too.
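A minimal sketch using find, with /source and /dest as placeholders. One caveat: once the tree is flattened, files with identical names will overwrite each other.

Code:
# copy every regular file in the tree into /dest, discarding the hierarchy
find /source -type f -exec cp {} /dest/ \;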
I have around 40 domains and I'm trying to avoid copying each one over manually. I can't remember which files need to be copied or the best command to do it.
I have written a small script that copies files whose names match *default* from the backup to each user's main folder. It works fine with one folder, but it doesn't copy files from sub-folders.
How can I make it work with all the sub-folders as well?
Code:
#!/bin/bash
for i in $(ls /var/cpanel/users); do
    rsync -vrplogDtH /backup/cpbackup/weekly/$i/homedir/public_html/*default* /home/$i/public_html
done
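The shell expands the *default* glob at the top level only, which is why sub-folders are skipped. One hedged fix is to let rsync do the matching itself with include/exclude filters; a sketch (the -m flag stops rsync from recreating sub-folders that contain no matching files):

Code:
#!/bin/bash
for i in $(ls /var/cpanel/users); do
    rsync -vrplogDtH -m --include='*/' --include='*default*' --exclude='*' \
        /backup/cpbackup/weekly/$i/homedir/public_html/ /home/$i/public_html/
done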
I want to use dd to fully duplicate a 120GB hard drive; however, the drive also has a mount point with a 2TB volume mounted on it. If I use dd, will it also try to copy that 2TB onto my second drive?
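For what it's worth, dd on a block device copies only that device; a filesystem mounted from a different disk is not part of it. A sketch, with the device names as pure placeholders you should verify (e.g. with fdisk -l) before running anything:

Code:
# clone the 120GB disk only; the 2TB volume lives on a different device
dd if=/dev/sda of=/dev/sdb bs=64K conv=noerror,sync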
I wasn't sure if this should go into "technical" or "programming", but I figure someone here will instantly be able to tell me...
(details: I'm on a shared host with cPanel and CentOS 3)
My hosting service's techs refuse to update GNU tar on the server, even though it's several years old at 1.13 and has some known bugs that are annoying for complex calls. (Basically the techs use any request like this to try to upsell me to dedicated or VPS, which I don't need at my low level.)
So I'd like to install a "local" copy of the newer GNU tar 1.14 or 1.15 (which I *think* is possible?) into my account and have it run instead of the shared copy.
Unfortunately this is beyond my knowledge, so thanks for any help in the right direction...
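A sketch of the usual user-space build, assuming you have shell access and a compiler on the host (the version number and URL are illustrative; adjust to whichever release you want):

Code:
# download and unpack the source
wget https://ftp.gnu.org/gnu/tar/tar-1.15.tar.gz
tar xzf tar-1.15.tar.gz
cd tar-1.15
# install into your home directory instead of system paths
./configure --prefix=$HOME/local
make && make install
# put your copy ahead of the system tar (add to ~/.bashrc to make it stick)
export PATH=$HOME/local/bin:$PATH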
I was trying to copy packages from one cPanel server to another using the Copy function in WHM. The accounts moved easily, but the packages failed. I tried 4-5 times but they still failed.
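One workaround, assuming a stock cPanel layout where package definitions live as plain files under /var/cpanel/packages (worth verifying on your version first), is to copy that directory over by hand:

Code:
# run from the old server; 'newserver' is a placeholder hostname
scp -r /var/cpanel/packages/ root@newserver:/var/cpanel/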
What's the biggest site you've copied using the Copy Account function in WHM? I have a site of about 5GB, containing a forum with a gallery. The database is ~600MB. Do you think it'll time out during the copying process?
I tried to do a cPanel move to my server here. [url]
On the new server above I keep getting the errors below. I just don't know what any of this means. Can anyone help?
Code:
Warning: fopen(/usr/local/apache/htdocs/online.txt) [function.fopen]: failed to open stream: Permission denied in /home/cometora/public_html/onlineold.php on line 14
Warning: fclose(): supplied argument is not a valid stream resource in /home/cometora/public_html/onlineold.php on line 15
Warning: chmod() [function.chmod]: No such file or directory in /home/cometora/public_html/onlineold.php on line 16
Warning: fopen(/usr/local/apache/htdocs/online.txt) [function.fopen]: failed to open stream: No such file or directory in /home/cometora/public_html/onlineold.php on line 19
Warning: flock(): supplied argument is not a valid stream resource in /home/cometora/public_html/onlineold.php on line 20
Warning: feof(): supplied argument is not a valid stream resource in /home/cometora/public_html/onlineold.php on line 22
Warning: fgets(): supplied argument is not a valid stream resource in /home/cometora/public_html/onlineold.php on line 24
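These all cascade from the first one: the script has /usr/local/apache/htdocs/online.txt hardcoded, a path from the old server that either doesn't exist or isn't writable on the new one, so fopen() fails and every later call operates on an invalid handle. A sketch of one fix, taking the simple route of recreating the file with permissions the account can write to:

Code:
# recreate the file the script expects and let the account's user write to it
mkdir -p /usr/local/apache/htdocs
touch /usr/local/apache/htdocs/online.txt
chown cometora: /usr/local/apache/htdocs/online.txt
chmod 664 /usr/local/apache/htdocs/online.txt

Alternatively, edit onlineold.php to point at a file under /home/cometora/public_html/ instead of the old server's path.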
I'm having a lengthy issue where my databases are too large to import in phpMyAdmin using Plesk. Unfortunately I don't have direct access to phpMyAdmin and can only access it as a DB user through Plesk.
I have tried to edit php.ini with the following settings:
Code:
upload_max_filesize = 64M
post_max_size = 32M
max_execution_time = 300
max_input_time = 300
Why am I still not able to import my DBs, which are only about 8MB each?
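Two things worth checking. First, confirm the php.ini you edited is the one Plesk's phpMyAdmin actually loads; phpinfo() shows the "Loaded Configuration File". Second, if you can get any shell access, the mysql client bypasses the upload limits entirely. A sketch with placeholder credentials:

Code:
# import the dump directly; no PHP upload limit is involved
mysql -u db_user -p db_name < dump.sql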
Is it possible to set up a [cron job] that sends a backup copy of a single user on the server to another host via FTP, given the [domain name / user / password], either weekly or 3 times a week?
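A sketch of how that could look in root's crontab, assuming cPanel's stock pkgacct script; the destination host and credentials are placeholders, and 1,3,5 means Mon/Wed/Fri at 3am:

Code:
# package the account, then push the archive to the remote FTP host
0 3 * * 1,3,5 /scripts/pkgacct username /home/backups && curl -T /home/backups/cpmove-username.tar.gz ftp://ftp.example.com/ --user ftpuser:ftppass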
I have tried copying files from a Linux server to a Windows server using WinSCP3 and it was very fast.
I am not sure how to copy files from Windows to Windows with the same speed. I tried using RDC between the Windows servers and sharing one drive, but it wasn't as fast as WinSCP3.
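For Windows-to-Windows copies, robocopy (built into newer Windows versions, or available in the Resource Kit on older ones) over a UNC path is usually much faster than dragging files through an RDC session. A sketch with placeholder paths:

Code:
robocopy \\source-server\share D:\destination /E /Z /R:2 /W:5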
I want to copy my friend's files from /home/admin/dc to /home/, as I am planning to rebuild Kloxo for him. This could kill all the files, so I would like to know the command to copy the files from the above location to the home directory.
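A minimal sketch; -a preserves permissions, ownership, and timestamps, which matters if the files are going back into a control-panel-managed account later:

Code:
cp -a /home/admin/dc /home/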
I need to move the httpdocs folders of multiple sites on my Plesk server to another server (the share on the other server is already mounted, so that's not an issue), but there are 31 different domains I need to copy. I'm wondering if anyone knows a way to batch-automate a cp process to copy each domain's files to a new folder on the remote server (this is for backup purposes, not live sites, so I can't use the Plesk migration tools, etc.).
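A sketch of a loop over the stock Plesk vhost layout; /mnt/backup is an assumed mount point for your share, so adjust both paths to match your setup:

Code:
#!/bin/bash
# copy each domain's httpdocs into its own folder on the mounted share
for d in /var/www/vhosts/*/httpdocs; do
    domain=$(basename "$(dirname "$d")")
    mkdir -p "/mnt/backup/$domain"
    cp -a "$d/." "/mnt/backup/$domain/"
done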
What tool is there for copying a lot of files from one webhost (FTP site) to another webhost? As in, for migrating to another host.
I'm thinking of a tool where I can specify the FTP access information for both sites, and it would perform the copy without needing to store the files on my local hard disk first.
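lftp can do this kind of site-to-site transfer: FXP when both servers allow it, otherwise streaming the data through the machine running lftp without writing it to disk. A sketch with placeholder hosts and credentials; newer lftp versions accept URLs for both ends of mirror, so check what yours supports:

Code:
lftp -e "mirror --verbose ftp://user1:pass1@old-host.example.com/public_html ftp://user2:pass2@new-host.example.com/public_html; quit"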
I seem to be having a problem where periodically the data in one file gets corrupted. I haven't been able to figure out a pattern to it, so I wanted to run a command via crontab that would create a copy of the file each day. To avoid overwriting previous backups, the filename of each day's copy would have to be unique, like...
Code:
cp filename filename-2008-07-03
Is there a way to include the current year, month, and day as variables in a Linux command?
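Yes, command substitution with date does exactly this. One wrinkle: inside a crontab, % characters must be backslash-escaped, or cron treats them as newlines. A sketch:

Code:
# from an interactive shell
cp filename filename-$(date +%Y-%m-%d)

# the same thing as a daily crontab entry (2am), with % escaped
0 2 * * * cp /path/to/filename /path/to/filename-$(date +\%Y-\%m-\%d)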