Can anyone tell me a simple way in Bash to copy all of the contents of a directory (and only the contents), including hidden files, into another, existing directory?
E.g.
Code:
# I have this directory structure
- directory_A
--- existing_file
- directory_B
--- some_file
--- some_subdirectory
--- .some_hidden_file
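Something like this might do it (a sketch using the names from the example above; the trailing "/." on the source is what pulls the hidden files along):

Code:
# copy everything inside directory_B, dotfiles included, into directory_A
cp -a directory_B/. directory_A/

# or with rsync (the trailing slash on the source means "contents of")
rsync -a directory_B/ directory_A/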
I could use a bash script, run from crontab, that does a regular backup of my MySQL database.
Unfortunately I can't use one of the ready-made backup scripts based on mysqldump, because I need mysqlhotcopy (I need the raw table data due to a charset mess in MySQL with some foreign languages not stored as utf8 .. long story), and I know next to nothing about Perl and bash scripting.
The script (which will be called via cron) has to:
- remove all the .tar.gz files older than X days in the folder /xxx/backup, if the folder contains more than X tar.gz files
- create a folder /xxx/backup/$todaydate
- call the command "mysqlhotcopy --bla -bla -bla" that will copy the database into the previously created /xxx/backup/$todaydate folder
- at the end of the previous operation (if successful), compress the $todaydate folder into a $todaydate-sqlbackup.tar.gz file
- at the end of the previous operation (if successful), delete the uncompressed folder
- launch the "rsync -bla -bla" command to synchronize this folder with a remote server

I thought it would be something like a 10-line script, and I'll be glad to hand you a couple of virtual beers (via PayPal) as a thank-you, but if the script is not trivial and you're willing to help anyway, of course I'm willing to pay more.
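Something along these lines might work as a starting point (a rough sketch: the paths, retention count, database name, mysqlhotcopy options and rsync target are all placeholders to adjust):

Code:
#!/bin/bash
BACKUP_DIR=/xxx/backup
KEEP_DAYS=7                        # "X" from the description above
TODAY=$(date +%Y-%m-%d)

# 1. remove tarballs older than X days, but only if more than X of them exist
if [ "$(ls -1 "$BACKUP_DIR"/*.tar.gz 2>/dev/null | wc -l)" -gt "$KEEP_DAYS" ]; then
    find "$BACKUP_DIR" -name '*.tar.gz' -mtime +"$KEEP_DAYS" -delete
fi

# 2. create today's folder and copy the raw tables into it
mkdir -p "$BACKUP_DIR/$TODAY"
if mysqlhotcopy --user=USER --password=PASS DBNAME "$BACKUP_DIR/$TODAY"; then
    # 3. compress the folder, and only if that succeeds, delete the uncompressed copy
    if tar -czf "$BACKUP_DIR/$TODAY-sqlbackup.tar.gz" -C "$BACKUP_DIR" "$TODAY"; then
        rm -rf "${BACKUP_DIR:?}/$TODAY"
    fi
fi

# 4. sync the backup folder to the remote server
rsync -az "$BACKUP_DIR/" user@remote:/path/to/backups/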
I'm having a problem with LiteSpeed and Apache: every time I reboot the server, both LiteSpeed and Apache are started and the server uses Apache. I configured LiteSpeed to use Apache's conf file, so I can't just remove Apache. I need a way to make sure that when I restart the server, LiteSpeed is started and Apache is stopped.
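On a RHEL/CentOS-style box (an assumption; adjust for your init system) something like this might be all that's needed, where "lsws" is the usual name of LiteSpeed's init script:

Code:
# stop Apache from starting at boot, make LiteSpeed start instead
chkconfig httpd off
chkconfig lsws on

# and for the current session, without rebooting
service httpd stop
service lsws start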
Second problem: how do I do this in a batch script? The script will back up a directory every month, and every backup needs a date in its name, like directoryname-date (day/month/year).
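A minimal sketch for the monthly backup (the source and destination paths are placeholders; dashes are used in the date because slashes aren't allowed in file names):

Code:
#!/bin/bash
SRC=/path/to/directoryname
DEST=/path/to/backups
STAMP=$(date +%d-%m-%Y)            # day-month-year
tar -czf "$DEST/$(basename "$SRC")-$STAMP.tar.gz" -C "$(dirname "$SRC")" "$(basename "$SRC")"

Run it from a monthly cron entry, e.g. 0 3 1 * * /path/to/backup.sh (03:00 on the 1st of each month).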
I am currently trying to create a bash script, run in a loop with a sleep interval, that queries tcpdump (UDP packets only) on a network interface and looks for packets of length 10.
So far so good, not that hard to code, I know (I've already made it / coded it this far perfectly). Now here is the tricky part: I only want the bash script to identify IPs that have sent over 15 packets with a length of 10. (This is the part that I can't seem to find a way to code.)
I was thinking that, from the output, I could maybe count the number of lines with the same IP.
Once this script identifies that, it will automatically run a command which I have set. (Quite easy, and I can do this).
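A sketch of that counting step (the interface, packet count, grep pattern and awk field are assumptions about what your tcpdump output lines look like; adjust them to your capture):

Code:
#!/bin/bash
# capture a batch of UDP packets, keep only "length 10" lines,
# extract the source IP (with -nn, field 3 is ip.port), count per IP,
# and keep the IPs seen more than 15 times
tcpdump -nn -i eth0 -c 1000 udp 2>/dev/null \
  | grep 'length 10$' \
  | awk '{ split($3, a, "."); print a[1]"."a[2]"."a[3]"."a[4] }' \
  | sort | uniq -c \
  | awk '$1 > 15 { print $2 }' \
  | while read -r ip; do
        echo "$ip sent more than 15 length-10 packets"
        # run your command here, e.g. your-command "$ip"
    done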
I am looking for someone to help me with this. It is a fairly simple and quick job (editing the script I have at the moment). I am also willing to pay (if needed) for this to be completed. Obviously not that much, but still something I am sure we can work out.
I need to make a bash script; when I run it (./script.sh) it needs to visit a website - [url]/something.php - where something.php has some function, e.g. for emails, so when someone visits something.php it sends an email to my email address. I just need a way to visit it from a bash script.
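Either curl or wget can do the visit; this sketch uses a placeholder in place of the real URL:

Code:
#!/bin/bash
URL="http://example.com/something.php"    # placeholder - put the real URL here

# fetch the page and throw the output away; hitting it is enough to trigger the PHP
curl -s -o /dev/null "$URL"
# or, if curl isn't installed:
# wget -q -O /dev/null "$URL"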
--05:03:39-- http://%0D/
Resolving 15... failed: Name or service not known.
FINISHED --05:03:39--
Downloaded: 1 files, 423K in 1.3s (338 KB/s)
tar: : Not found in archive
tar: Error exit delayed from previous errors
./cPanelServer.sh: line 9: cd: csf: No such file or directory
sh: install.sh: No such file or directory
./cPanelServer.sh: line 13: unexpected EOF while looking for matching `"'
./cPanelServer.sh: line 18: syntax error: unexpected end of file

Script contents:
Code:
./yum.sh
cd /home
wget layer1.cpanel.net/latest
sh latest
cd /
rm -fv csf.tgz
wget [url]
tar -xzf csf.tgz
cd /csf
sh install.sh
echo -n "TESTING = "1"
read word
sed "s/$word/TESTING = "0"/g" /etc/file.conf > /etc/file.conf.new
mv /etc/file.conf.new /etc/file.conf
service csf restart
done
It seems the script is having issues extracting "csf.tgz", but I have no clue why.
TBH, this is my first ever script and I'm surprised any of it works!
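Two likely culprits, going by the log: the wget line is fetching "http://%0D/", which suggests a stray carriage return (Windows line endings - running dos2unix on the script may help), so csf.tgz never gets downloaded and tar has nothing to extract; and the echo/sed lines have unbalanced double quotes, which is what produces the "unexpected EOF while looking for matching" error. A rough cleanup of the csf part might look like this (the [url] placeholder and /etc/file.conf are kept from the post; adjust both to your setup):

Code:
#!/bin/bash
cd / || exit 1
rm -fv csf.tgz
wget -O csf.tgz "[url]"           # substitute the real csf download URL
tar -xzf csf.tgz
cd csf || exit 1
sh install.sh

# flip TESTING = "1" to TESTING = "0"; single quotes around the sed
# expression avoid the quoting problems in the original
sed 's/TESTING = "1"/TESTING = "0"/g' /etc/file.conf > /etc/file.conf.new
mv /etc/file.conf.new /etc/file.conf
service csf restart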
My VPS provider didn't enable a lot of modules, and that's why I can't use a firewall (csf or apf) or a DoS Deflate script.
I need a simple script for it.
First, it has to call this:

netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n

The output will be something like this (number of connections, then IP address):

20 1.2.3.4
40 1.2.3.5
80 1.2.3.6

Then the bash script has to ban the IPs with more than 30 connections (in our case: 1.2.3.5 and 1.2.3.6) with this:

iptables -A INPUT -s IP_FOR_BLOCK -j DROP
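A small sketch that glues those two pieces together (the 30-connection limit is taken from the description above; be careful to whitelist your own IP before using anything like this):

Code:
#!/bin/bash
LIMIT=30

netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n \
  | while read -r count ip; do
        # only ban entries over the limit (header lines only appear once, so they fall below it)
        if [ -n "$ip" ] && [ "$count" -gt "$LIMIT" ]; then
            iptables -A INPUT -s "$ip" -j DROP
        fi
    done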
#!/bin/bash
sitepoint=`ps aux | grep -v grep | grep -c 'process'`
if [ $sitepoint -le "0" ]; then
    command
fi
I wonder if it can be extended to monitor 3 processes rather than making 3 separate scripts like that, or if there is some other solution for checking whether more than one process is running or dead.
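One way to extend the same check to several processes is to loop over their names (the process names and the action are placeholders):

Code:
#!/bin/bash
for proc in process1 process2 process3; do
    count=$(ps aux | grep -v grep | grep -c "$proc")
    if [ "$count" -le 0 ]; then
        echo "$proc is not running"    # replace with the command you want to run
    fi
done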
Trying to install yum on RedHat EL4 with Python 2.3.4. I have downloaded [url] and untarred it. I cd'd into the untarred directory. I then went to run ./configure and it gave me:

-bash: ./configure: No such file or directory
I'm writing an inode and directory size counter, but hit a snag with directories that contain a space.
Simple code: it finds all directories within a folder and lists each one:
for i in `find . -type d`; do ls $i; done
Looks correct? It works great, until you hit directories with spaces. So I try the following methods:
for i in `find . -type d|sed 's/ /\\ /g'`; do ls $i; done
for i in `find . -type d|sed 's/ /\\ /g'`; do ls "$i"; done
What is happening is that the `for i in` splits the list at every bit of whitespace, whether it's a newline or a ' '. Is there a flag I can set to make it split only on newlines? When I pipe the data through, each chunk of the directory name gets sent through separately.
[root@home /home/mindbend/dev_html]# for i in `find . -type d|sed 's/ /\\ /g'`; do ls $i; done
ls: ./test: No such file or directory
ls: ing: No such file or directory
ls: 12: No such file or directory
ls: 3: No such file or directory
ls: ./test: No such file or directory
ls: ing: No such file or directory
ls: 12: No such file or directory
ls: 3/test: No such file or directory
ls: 2: No such file or directory
# ls -d test\ ing\ 12\ 3/
test ing 12 3/
# ls -d test\ ing\ 12\ 3/test\ 2/
test ing 12 3/test 2/
GNU bash, version 3.2.39(1)-release (i386-portbld-freebsd7.1) Copyright (C) 2007 Free Software Foundation, Inc.
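Two space-safe alternatives to the loop above, for what it's worth (both avoid word-splitting entirely and should work in bash 3.2):

Code:
# 1. let find run the command itself
find . -type d -exec ls {} +

# 2. read NUL-delimited names, which survive spaces (and even newlines) in paths
find . -type d -print0 | while IFS= read -r -d '' dir; do
    ls "$dir"
done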
We have Plesk Panel 11.5 in Virtuozzo containers (CentOS 6 x86_64), and we often provide our customers with SSH access chrooted via /usr/local/psa/bin/chrootsh. We all know about the Shellshock vulnerability, and we have already installed all the fixes to bash, but the version under chrootsh is still vulnerable. Here are the results of BashCheck from [URL] under a chrooted user:
Vulnerable to CVE-2014-6271 (original shellshock)
Vulnerable to CVE-2014-7169 (taviso bug)
bashcheck: line 15: 19226 Segmentation fault      bash -c "true $(printf '<<EOF %.0s' {1..79})" 2> /dev/null
Vulnerable to CVE-2014-7186 (redir_stack bug)
Test for CVE-2014-7187 not reliable without address sanitizer
Variable function parser still active, likely vulnerable to yet unknown parser bugs like CVE-2014-6277 (lcamtuf bug)

Do you plan to release updates for chrootsh?
Because I have a couple of sites hosted with both hosting companies (DTH = DownTownHost and TCH = TotalChoiceHosting) and have had positive experiences with them, I needed to decide which one would host my new project.
The problem was that with both companies (and with 99% of others) the traceroute command is not enabled by default, and it is essential for my new project.
After exchanging a couple of emails with both companies, DTH was the winner again, because DTH allows the traceroute command on shared hosting (in situations where you need it).
The answer from TCH was that the traceroute command is only possible on a VPS (semi-dedicated) and not in a shared environment.
I must again praise DTH for the way they handle their customers and for being ready to do everything for them. No wonder they are one of the greatest hosting companies.
Where can I get a simple SSL certificate for my web hosting business, so my customers feel safe submitting their information? Is the standard SSL from GoDaddy pretty good?
So I was trying to run a backup process in Plesk 8.1 and the whole panel froze up on me (it's happened numerous times before).
Anyway, since the panel was all frozen up, I just went into SSH and did a simple "reboot" (also done many times before). The only problem is, this time after I did the reboot the server never actually came back online... it seems to be locked up or something, I have no idea what.
I called my host and they are looking into it, but they have no idea what's going on either, and it's taking them forever to figure out, all the while my sites are down... this isn't good.
Does anyone have any suggestions or advice as to why this could be occurring?