Any Daily File-Moving Solution (Rsync)?
Mar 5, 2008
I have moved all my vBulletin images to a separate server to decrease the load on the main server, but I am still having a problem. I have three directories that are updated daily with new images:
/home/mark/public_html/vb/customavatars
/home/mark/public_html/vb/customprofilepics
/home/mark/public_html/vb/signaturepics/
I am searching for a way to move only the new files uploaded to these three directories over to the other server, like this:
/home/mark/public_html/vb/customavatars (new files) ===> /home/mark/public_html/images/customavatars
/home/mark/public_html/vb/customprofilepics (new files) ===> /home/mark/public_html/images/customprofilepics
/home/mark/public_html/vb/signaturepics/(new files) ===> /home/mark/public_html/images/signaturepics
I have tried to use rsync like this, since I have an internal connection between my servers, but it's not working:
rsync -a rsync://192.168.0.2/vb/customprofilepics/ /home/mark/public_html/images/customprofilepics
rsync -a rsync://192.168.0.2/vb/customavatars/ /home/mark/public_html/images/customavatars
rsync -a rsync://192.168.0.2/vb/signaturepics/ /home/mark/public_html/images/signaturepics
with a daily cron job.
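Assuming the images are pulled from an rsync daemon on the image server, a cron-driven script along these lines might work. The module name "vb" and its root are assumptions that must match the rsyncd.conf on 192.168.0.2, and --ignore-existing limits the transfer to files not yet present on the destination:

```shell
#!/bin/sh
# Hedged sketch: pull each image directory from the source server's rsync
# daemon. Assumes an rsync module named "vb" on 192.168.0.2 whose root is
# /home/mark/public_html/vb; adjust to match your rsyncd.conf.

SRC="rsync://192.168.0.2/vb"
DST="/home/mark/public_html/images"

for dir in customavatars customprofilepics signaturepics; do
    # -a preserves permissions and timestamps; --ignore-existing skips
    # files already on the destination, so only new uploads transfer.
    rsync -a --ignore-existing "$SRC/$dir/" "$DST/$dir/"
done
```

A daily crontab entry such as `0 3 * * * /usr/local/bin/sync-images.sh` (path and time are placeholders) runs it automatically. Note this copies rather than moves; rsync's --remove-source-files option deletes the source copies after a successful transfer, if a true move is wanted.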
View 4 Replies
Oct 29, 2006
I just want to know: is it safe to do a remote daily backup of about 70,000 files?
Each file is about 200 KB, and every day I get about 1,000 new files. rsync would first have to check the old files, because I delete about 30-50 of them daily, and then back up the 1,000 new ones.
So how long will it take each time to compare those 70,000 files?
I have two options now:
1. Use a second HDD in RAID 1.
2. Use rsync and back up to my second server, which would save me about $70 each month.
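By default rsync compares only file size and modification time, not contents, so walking 70,000 entries is mostly a metadata scan, typically seconds to a couple of minutes depending on disk and network latency. A hedged sketch of the nightly job (paths and hostname are assumptions):

```shell
# Mirror the data directory to the second server over ssh. --delete
# removes the 30-50 files deleted locally each day, and --stats reports
# how long the file-list comparison took and how much data transferred.
rsync -a --delete --stats /var/data/files/ backup@backup2.example.com:/srv/backup/files/
```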
View 9 Replies
May 16, 2008
I have a client machine that requires backing up, and the server. For some reason, rsync doesn't want to transfer the files.
I've already set up keys, and it logs in fine without asking for a password.
server config:
max connections = 2
log file = /var/log/rsync.log
timeout = 300
[backup]
comment = Rsync backups
path = /secondary/client
read only = no
list = yes
uid = nobody
gid = nobody
auth users = root
client command:
rsync -vz --recursive --progress --stats -e "ssh -i /root/.ssh/rsynckey" root@server.com:/backup/ /home/usertobackup
It keeps saying it cannot find the backup folder, even though the folder does exist on the server and is what is specified in the path above.
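One likely cause, assuming the config shown is an rsyncd.conf: the client command uses remote-shell syntax (a single colon with `-e ssh`), which bypasses the rsync daemon entirely, so `/backup/` is treated as a literal filesystem path rather than the `[backup]` module. Two hedged fixes:

```shell
# Option 1 (plain ssh, no daemon involved): point at the real path that
# the [backup] module maps to.
rsync -vz --recursive --progress --stats \
    -e "ssh -i /root/.ssh/rsynckey" \
    root@server.com:/secondary/client/ /home/usertobackup

# Option 2 (daemon features over ssh): the double colon makes "backup"
# resolve as the [backup] module from the server's rsyncd.conf.
rsync -vz --recursive --progress --stats \
    -e "ssh -i /root/.ssh/rsynckey" \
    root@server.com::backup/ /home/usertobackup
```

With option 1, the daemon settings (auth users, the uid/gid mapping) are simply not applied; option 2 keeps them in effect by starting a single-use daemon over the ssh transport.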
View 7 Replies
Mar 28, 2007
At work, we have a Windows 2003 server.
We have a set of files that get modified from day to day.
Since they know I own a dedicated server (Linux) they ask me if it would be possible to have an offsite backup on my server of the set of files on the windows 2003 server.
Also, I don't want the whole file set to be uploaded every day.
I am aware of rsync on *nix systems, but do you have any suggestions on how to back up from a Windows 2003 server to a Unix server?
View 3 Replies
Jun 3, 2009
I'm currently using (amongst other backup systems) rsync to an offsite space (I'm using BQBackup at the moment).
I'm just wondering - apart from backing up all of /home/, /var/lib/mysql/ and the important config files (httpd.conf, php.conf, etc etc) is there anything else that *needs* to be backed up?
Obviously in a worst case scenario, a new machine would be deployed with a fresh OS install (and a fresh WHM/cPanel install) so I wouldn't worry about backing up OS files or cPanel core files, although I'm wondering if there's anything apart from the /home/ directory and the MySQL databases which would be lost (and so need backing up) in the event of a crash?
View 10 Replies
Jun 28, 2007
Does anybody know how I can force rsync to back up open files on Windows 2003 Standard and Vista Ultimate?
View 4 Replies
Jan 31, 2007
How would you recommend moving files from one ftp to another ftp? (Keep in mind this is a large amount of data)
View 1 Replies
Aug 7, 2008
I'm trying to migrate my VPS to another server, and basically I need to move the entire public_html directory from the first server to the second. As it's almost 2.5 GB, downloading and re-uploading over FTP isn't really an option, so I'm thinking I need to tar and gzip it and then transfer it directly somehow.
So my question is: what is the proper tar and gzip command to use in this case, and what method should I use to transfer it? Should I use wget, or is there a better way? Once I have it on my destination server, how do I untar it, and what path will it untar to?
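A hedged sketch of the whole round trip; the username and hostname are placeholders:

```shell
# On the source server: create a gzip-compressed tar archive. -C changes
# into /home/user first, so the archive contains "public_html/..." paths.
tar -czf public_html.tar.gz -C /home/user public_html

# Push it straight to the destination server (scp; rsync also works).
scp public_html.tar.gz user@newserver.example.com:/home/user/

# On the destination server: extract. -C controls where it lands;
# this unpacks to /home/user/public_html.
tar -xzf /home/user/public_html.tar.gz -C /home/user
```

wget would only help if the archive were exposed over HTTP; scp (or rsync over ssh) avoids making the tarball publicly downloadable.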
View 8 Replies
Jan 28, 2007
I have about 20 gigs of pictures that I have already uploaded to my server, but they're just sitting there in a directory listing tree.
How can I import all of those pictures into, say Coppermine or 4images or the like, without having to do each one manually?
View 2 Replies
Aug 15, 2007
I'm using a Windows VPS service, and I'm wondering: if I have a folder on my computer's desktop, how can I move it to my VPS over Remote Desktop?
View 1 Replies
Jul 18, 2009
Hi guys, I've been having problems trying to edit my php.ini file which I think I've now fixed.
The whole reason I wanted to do this was because I've just moved to Media Temple from another hosting company and I'm having a couple of problems with the switchover.
Basically, I use a CMS to add properties which appear on the main website. I also upload PDFs and images. With the old hosting company, the PDFs and images went into folders called dnDir/pdf and dnDir/images, but on Media Temple they are going into a folder called tmp. I really want them to go to the same place as they used to.
Is this an issue with php.ini that I need to rectify? If so, could you point me in the right direction?
View 5 Replies
Apr 6, 2008
I have a lot of files in about 100-200 subfolders, and I'm wondering if anyone has a script I can run to move the files (mostly .zip files) from the subfolders to the root folder, then extract the zips (which will normally produce a .rar file), then extract that file and leave the extracted files in the same folder so I can easily access them. The folders/files are old backups of one of my computers that I need to get data out of. I have access to Linux and Windows systems, whichever is easiest.
View 2 Replies
Jan 20, 2007
I have mounted an NFS partition. When copying a big file the speed is fine, around 5-6 MB/s, but when copying lots of small files the speed drops to 20-200 KB/s. What is the reason? Is there a way to improve the speed, or another way to mount the drive remotely that preserves the same permissions after backup?
View 2 Replies
Mar 30, 2009
I want to set up my server (a dedicated Linux server) to automatically create daily backups of the POP3 mail, MySQL databases, and web files. I want them to go to another server I have purchased with the exact same specifications.
I am not very good at the Unix command line or scripting, so I need someone to help me define the backup strategy, select the scripts, and tell me how to make sure the backup server is secure.
View 7 Replies
Apr 21, 2009
I am running a dedicated server.
My Apache crashes daily and I am investigating the cause.
I have found this strange message in my Apache error_log....
View 12 Replies
Apr 8, 2009
Our website is receiving a daily attack from a French network called Neuf Cegetel. The IP is different each day, but the network is always the same. The attack happens daily and lasts several hours.
The website does not use Ajax (the request is an Ajax request) and there is no URL /0_0?_=... But the attacker uses a random URL similar to /0_0?_=1238873869634. Since the URL is always different, the page is never cached, so it is compressed by mod_deflate every time, which makes the attack more harmful. The User-Agent and the cookies change quite a lot, but it is always an Ajax request. Given that it is the only Ajax request on the server, that would be the easiest way to stop it. But it seems that when we try to stop the attack, the attacker tries another way, which makes me think the attack is deliberate (not a virus or anything like that).
Since it seems the attacker could easily be identified (we are a Spanish website and the attacker always comes from the same French network), should we report this? If it were a virus on a remote server, the solution might just be to contact the network's abuse department, but if it is deliberate, I think we should find out who is behind the attack, since it might be a company that wants to bother us, a competitor, or something like that. What do you think?
This is a very small excerpt from the logs containing a few examples:
Code:
4087 ReqStart c XX.XXX.42.189 52592 517548693
4087 RxRequest c GET
4087 RxURL c /0_0?_=1238873869634
4087 RxProtocol c HTTP/1.1
4087 RxHeader c x-requested-with: XMLHttpRequest
4087 RxHeader c Accept-Language: fr
4087 RxHeader c Referer: http://thewebsite.com/
4087 RxHeader c Accept: application/xml, text/xml, */*
4087 RxHeader c x-requested-handler: ajax
4087 RxHeader c UA-CPU: x86
4087 RxHeader c Accept-Encoding: gzip, deflate
4087 RxHeader c User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; .NET CLR 3.5.30729; .NET CLR 3.0.30618; FDM; OfficeLiveConnector.1.3; OfficeLivePatch.0.0)
4087 RxHeader c Host: thewebsite.com
4087 RxHeader c Connection: Keep-Alive
4087 RxHeader c Cookie: __utma=9819446.1354119376.1238785835.1238785835.1238865537.2; __utmz=9819446.1238865537.2.2.utmccn=(organic)|utmcsr=msn|utmctr=thewebsite|utmcmd=organic; __utmc=9819446; /=
4087 VCL_call c recv lookup
4087 VCL_call c hash hash
4087 VCL_call c miss fetch
4087 Backend c 3052 default default
4087 ObjProtocol c HTTP/1.1
4087 ObjStatus c 404
4087 ObjResponse c Not Found
4087 ObjHeader c Date: Sat, 04 Apr 2009 19:37:47 GMT
4087 ObjHeader c Server: Apache/2.2.3 (CentOS)
4087 ObjHeader c Vary: Accept-Encoding
4087 ObjHeader c Content-Encoding: gzip
4087 ObjHeader c Content-Type: text/html; charset=iso-8859-1
4087 TTL c 517548693 RFC 120 1238873867 0 0 0 0
4087 VCL_call c fetch
4087 TTL c 517548693 VCL 3600 1238873868
4087 VCL_return c deliver
4087 Length c 235
4087 VCL_call c deliver deliver
4087 TxProtocol c HTTP/1.1
4087 TxStatus c 404
4087 TxResponse c Not Found
4087 TxHeader c Server: Apache/2.2.3 (CentOS)
4087 TxHeader c Vary: Accept-Encoding
4087 TxHeader c Content-Encoding: gzip
4087 TxHeader c Content-Type: text/html; charset=iso-8859-1
4087 TxHeader c Content-Length: 235
4087 TxHeader c cache-control: max-age = 300
4087 TxHeader c Date: Sat, 04 Apr 2009 19:37:47 GMT
4087 TxHeader c X-Varnish: 517548693
4087 TxHeader c Via: 1.1 varnish
4087 TxHeader c Connection: keep-alive
4087 TxHeader c age: 0
4087 ReqEnd c 517548693 1238873867.757586718 1238873867.758437872 0.936849117 0.000804424 0.000046730
View 6 Replies
Aug 27, 2008
I'm a customer of HostV's VPS hosting, and for the past 3 days, at almost exactly 01:20 GMT, the CPU load jumps from an average of about 0.10 to 2.5+, stays there for over an hour, then drops back down.
During this time, there are NO processes on my virtual server using any significant amount of CPU time, memory, or IO. No cron jobs are running on my server, etc.
Note the output from 'uptime' below (I was monitoring it waiting for the problem to occur, which it did at exactly the time I expected):
00:32:05 up 22:01, 2 users, load average: 0.09, 0.11, 0.08
00:32:07 up 22:01, 2 users, load average: 0.08, 0.11, 0.08
01:09:49 up 22:39, 2 users, load average: 0.06, 0.03, 0.00
01:10:03 up 22:39, 2 users, load average: 0.05, 0.03, 0.00
01:19:26 up 22:48, 2 users, load average: 0.46, 0.16, 0.04
01:20:42 up 22:50, 2 users, load average: 1.53, 0.55, 0.18
01:21:39 up 22:51, 2 users, load average: 1.40, 0.67, 0.24
01:46:04 up 23:15, 2 users, load average: 3.06, 2.02, 1.52
Also note the output from 'top', taken when the load average was at the 3.06 shown on the last line above:
Cpu(s): 0.1% us, 0.0% sy, 0.0% ni, 91.0% id, 9.0% wa, 0.0% hi, 0.0% si
My cpu usage is very low (0.1%) but wait time is at 9.0%, and I've seen this go as high as 70% during these times.
So, basically, there is a problem somewhere on the host node that is causing my site to become effectively unresponsive (page loads of 20+ seconds, measured), and it happens every single day at the same time.
So, why am I posting it here instead of logging a trouble ticket? I have logged a trouble ticket, but when I encountered the problem yesterday, despite logging it as "CRITICAL", I had to wait nearly 5 hours for a response, which effectively said not much beyond "we noticed the problem and fixed it and we're monitoring it". So I don't have a lot of faith that today's response will be any better.
I moved to HostV because of similar problems I was encountering with shared hosting, and was assured before signing up that the kind of problem I'm seeing doesn't happen. So now I'm outlaying more than 10 times the cost for almost exactly the same problems and a similarly unhelpful response to it.
By publicly posting the problem, I would hope that someone at HostV will ensure the problem is addressed PROPERLY, rather than bandaided again, and that hopefully we will all be able to see just how good HostV's support CAN be (as evidenced in another similar post).
I await HostV/Cirtex's response.
As shown in the uptime information above, server uptime is 23:15, because I rebooted the virtual server yesterday to see if that helped. It didn't. In fact, it took over 20 minutes for the server to come back up, which is why I'm not going to do it again.
View 14 Replies
Oct 13, 2007
Our VPS is being hit several times a day with hacking attempts. We have been actively monitoring the error logs and can see the failed attempts. I was just wondering if there is a better way to track such attempts, or another system log that would provide additional info on these attacks? Or maybe some 3rd-party logging scripts?
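For SSH brute-force attempts specifically, the standard system logs already record more than the web server's error log. A couple of hedged examples (the log path differs by distro, and the awk field position assumes sshd's usual "Failed password ... from IP port N ssh2" line format):

```shell
# Count failed SSH logins per source IP from the auth log.
# RHEL/CentOS log to /var/log/secure; Debian/Ubuntu to /var/log/auth.log.
grep 'Failed password' /var/log/secure \
    | awk '{print $(NF-3)}' | sort | uniq -c | sort -rn | head

# lastb summarizes recent bad login attempts recorded in /var/log/btmp.
lastb | head -20
```

Tools like fail2ban automate this kind of log-watching and block offending IPs after repeated failures, which may be closer to what you're after than a reporting script.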
View 13 Replies
Mar 7, 2007
I just had a quick question about backup solutions. What advantage would I have in setting up two HDs in a RAID-1 array, as opposed to just doing daily automated backups to one of the drives?
The way I see it, if I have automated backups, HD use for that backup drive is limited to say 20 minutes a day. In a RAID-1 array however, both drives are used at the same rate. Wouldn't this provide better life expectancy for the backup drive, granted it is at the expense of having a guaranteed instant replacement for that original drive?
Reason I'm asking is because I'm setting up a Mac Mini for a friend as a web server and he would like to have data backups. The only way to add space is to install an external hard drive, so my options are a bit limited.
View 7 Replies
Jun 2, 2007
I want to run /scripts/cpbackup every day beginning at 12:00 AM, and I put this line in the /tmp/crontab.XXXXwuxGUI file:
0 0 * * * /scripts/cpbackup
but the backup doesn't run.
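One likely cause: the /tmp/crontab.XXXX file is only the scratch buffer that `crontab -e` opens; writing to it directly does not install the job. Install the entry through the crontab command itself, for example:

```shell
# Append the entry via the crontab command so cron actually loads it.
( crontab -l 2>/dev/null; echo "0 0 * * * /scripts/cpbackup" ) | crontab -

# Confirm the entry is installed:
crontab -l | grep cpbackup
```

Alternatively, run `crontab -e` and add the line in the editor it opens; the temp file in /tmp is managed for you in that case.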
View 8 Replies
Mar 22, 2008
I would like to create an exact copy of my live drive on a daily basis via cron. Is there a good mechanism for doing this *without* taking the main drive offline? It seems like the two common backup solutions, dd and rsync, both have issues here: I don't think rsync can create an exact mirror (including partitions), and dd looks like it needs the drive(s) unmounted first.
Both drives are identical in size and attached via the IDE controller.
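One hedged compromise, since dd on a mounted drive risks a torn copy and rsync alone ignores partition tables: replicate the partition layout once with sfdisk, then mirror the filesystem contents daily with rsync. The device names and mount point below are assumptions:

```shell
# One-time (repeat only if the layout changes): copy the partition table
# from the live disk /dev/sda to the backup disk /dev/sdb.
sfdisk -d /dev/sda | sfdisk /dev/sdb

# Daily via cron: mirror the running filesystem onto the backup disk,
# assumed mounted at /mnt/backup. -x stays on one filesystem (so /proc,
# /sys, and the backup mount itself are not descended into); -H keeps
# hard links; --delete keeps the mirror exact.
rsync -aHx --delete / /mnt/backup/
```

Caveats: the mirror is not automatically bootable (a bootloader would still need installing on the second drive), and files being written during the sync can land half-copied, so this is a best-effort live copy rather than a true snapshot.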
View 7 Replies
May 5, 2009
I recently got a dedi from Hivelocity, and they installed CSF/LFD. On my previous hosts, I didn't have this, just cPHulk. With this dedi, I'm receiving nearly a dozen daily emails from LFD with IPs that have been blocked for multiple failed logins, mostly with username root, but also sales, staff, admin, system, etc., and a few for port scanning.
Is this normal? I've already disabled direct root login via SSH, and I'm not really worried about anyone actually managing to gain access, I'm just curious about the high number of attempts. On previous hosts, where I actually had active sites and forums, with links posted on other forums that are indexed and nicely ranked by Google, I rarely received any emails from cPBrute at all.
View 1 Replies
Jun 22, 2008
Is there an Apache module that can limit a user's daily downloads, e.g. userA can download XX GB per day?
I am using mod_cband, but it seems it can't do something like that.
View 2 Replies
May 5, 2008
I have my WHM/cPanel installation configured with daily and weekly backups. I checked at what time of the day the server load was at the minimum and configured the cPanel backup cron to run then.
The problem now is: Backing up a few hundred accounts results in a high server load. My server configuration:
Dual Processor Quad Core Xeon 5335 2.0GHz with 4GB RAM and 2 x 250GB SATA HDD hosted at SoftLayer.
The accounts are located on the first HDD and the backup archives are placed on the second HDD.
What can I do about this? I'd like to take daily backups of all accounts but not if my server load increases up to 10... That kind of renders the cPanel backup feature useless if it doesn't even work on a powerful server like this one...
Would it help if I use an application such as Auto Nice Daemon to give the backup process a lower priority? But then again that won't work on the MySQL dumps? And I think it's not a CPU problem but an I/O wait problem? Other processes have to wait for disk access because the disk-intensive backup process is running?
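Lowering the backup's priority is worth trying, and it can cover the MySQL dumps too: child processes inherit both the CPU priority (nice) and the I/O scheduling class (ionice) of the backup script that spawns them. A hedged crontab sketch, where the hour is a placeholder for your low-load window:

```shell
# Hedged sketch of a crontab entry: run the cPanel backup in the "idle"
# I/O class (ionice -c3, serviced only when the disk is otherwise idle)
# and at the lowest CPU priority (nice -n 19). The mysqldump children
# inherit both settings.
0 4 * * * /usr/bin/ionice -c3 /usr/bin/nice -n 19 /scripts/cpbackup
```

Since the symptoms point at I/O wait rather than CPU, the ionice idle class is likely the part that helps; nice alone would do little for a disk-bound job.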
View 9 Replies
Feb 27, 2008
How can I set up daily Exim statistics?
From WHM, it shows Exim statistics for about one month.
Is there any way to get daily Exim statistics?
View 2 Replies
Jul 3, 2008
I seem to be having a problem where the data in one file periodically gets corrupted. I haven't been able to figure out a pattern to it, so I wanted to run a command via crontab that would create a copy of the file each day. To avoid overwriting previous backups, the filename of each day's copy would have to be unique, like...
cp filename filename-2008-07-03
Is there a way to include year, month, and day variables in a Linux command?
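Yes, the shell's `date` command can substitute the current date into the filename. A sketch (the crontab variant is shown as a comment because `%` is special inside crontab entries and must be escaped; paths are placeholders):

```shell
# Date-stamped copy: $(date +%Y-%m-%d) expands to e.g. 2008-07-03.
cp filename "filename-$(date +%Y-%m-%d)"

# The same thing as a daily 02:00 crontab entry; note that % must be
# escaped as \% inside crontab, or cron treats it as a newline:
# 0 2 * * * cp /path/to/filename "/path/to/filename-$(date +\%Y-\%m-\%d)"
```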
View 6 Replies
Apr 6, 2007
I have seen that ResellerZoom provides daily full backups to their customers. How should I configure my WHM so that it creates daily backups and deletes the old ones?
View 4 Replies
Apr 5, 2007
I want to know how I can back up my Linux server's data to a Windows server. Due to the high number of files and daily updates, I can't use FTP.
View 3 Replies