I need to make a bash script; when I run it with ./script.sh it needs to visit a website - [url]/something.php, where something.php has some function, e.g. for emails, so when someone visits something.php it sends an email to my email address. I just need a way to visit it via a bash script.
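A minimal sketch of such a script, assuming the page is reachable over plain HTTP and that http://example.com stands in for the real [url] (the flags just discard the page output, since only the visit matters):

Code:
#!/bin/bash
# Hit something.php so its server-side code (e.g. the email function) runs.
# Replace http://example.com with your actual site address.
curl -s -o /dev/null "http://example.com/something.php"

# Or, if curl isn't installed, wget does the same job:
# wget -q -O /dev/null "http://example.com/something.php"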
My website is hosted locally on my network, so I'm unable to view the .com address for it (I can only access it through the local IP address). This makes it really hard to test thoroughly.
Is there any way to view this website the way it would appear from outside the network? I know I can do it from another location, but I need a solution that works right here. I don't have an off-site computer I can remote desktop to either.
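One hedged workaround, assuming the goal is to test the name-based virtual host rather than the actual external routing: force the public hostname to resolve to the LAN IP, which simulates what an outside visitor's DNS lookup would produce. example.com and 192.168.1.10 are placeholders here:

Code:
# Request the site by its public name, but connect to the local IP.
curl --resolve example.com:80:192.168.1.10 http://example.com/

# The same effect system-wide: a hosts-file entry (remove it when done testing).
echo "192.168.1.10 example.com www.example.com" | sudo tee -a /etc/hosts

Note this only tests the vhost and site config; confirming that the router's port forwarding works from the outside still needs a genuinely external vantage point, such as a web-based proxy.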
I've been having trouble with my website recently. I installed a forum software package (MyBB), and the problem started happening shortly after. I've talked to a few site admins in the MyBB community, but they all say that none of MyBB's scripts are capable of doing this.
Every single webpage I visit redirects me to a page which then has a redirect loop. I've removed all recent changes made to my website via FTP and restarted Apache, but I'm still receiving the same error. I'm not quite sure what to do.
Link to my website: ript.onl Screenshots: [URL] ....
Every single webpage does this. I don't have any .htaccess files around, and I've checked a few config files for anything out of the ordinary, but haven't seen anything.
I could use a bash script, run from a crontab, that does a regular backup of my MySQL database.
Unfortunately I can't employ one of the ready-made backup scripts using mysqldump, because I need to use mysqlhotcopy (that's because I need the raw data due to a charset mess in MySQL, with some foreign languages not classified as utf8 .. long story), and I'm totally ignorant of Perl and bash scripting.
The script (which will be run via cron) has to:
- remove all the .tar.gz files older than X days in the folder /xxx/backup, if the folder contains more than X tar.gz files
- create a folder /xxx/backup/$todaydate
- call the command "mysqlhotcopy --bla -bla -bla" so that it copies the database into the previously created /xxx/backup/$todaydate folder
- at the end of the previous operation (if successful), compress the $todaydate folder into a $todaydate-sqlbackup.tar.gz file
- at the end of the previous operation (if successful), delete the uncompressed folder
- launch the "rsync -bla -bla" command to synchronize this folder with a remote server

A sketch of this is given below. I thought it would be something like a 10-line script, and I'll be glad to hand you a couple of virtual beers (via PayPal) as a thank-you sign, but if the script is not trivial and you're willing to help anyhow, of course I'm willing to pay more.
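A minimal sketch of such a script, assuming the mysqlhotcopy and rsync arguments (USER, PASS, DBNAME, and the remote destination are placeholders, in the spirit of the --bla -bla above) get filled in, and that KEEP_DAYS is the "X" from the first requirement:

Code:
#!/bin/bash
set -e                                  # stop at the first failing step

BACKUP_DIR="/xxx/backup"
KEEP_DAYS=7                             # the "X days" above; adjust to taste
TODAY=$(date +%Y-%m-%d)

# 1. Prune old archives once more than KEEP_DAYS .tar.gz files have piled up.
count=$(ls -1 "$BACKUP_DIR"/*.tar.gz 2>/dev/null | wc -l)
if [ "$count" -gt "$KEEP_DAYS" ]; then
    find "$BACKUP_DIR" -maxdepth 1 -name '*.tar.gz' -mtime +"$KEEP_DAYS" -delete
fi

# 2. Make today's folder and 3. copy the raw database files into it.
mkdir -p "$BACKUP_DIR/$TODAY"
mysqlhotcopy --user=USER --password=PASS DBNAME "$BACKUP_DIR/$TODAY"

# 4. Compress the folder, then 5. remove the uncompressed copy.
tar -czf "$BACKUP_DIR/$TODAY-sqlbackup.tar.gz" -C "$BACKUP_DIR" "$TODAY"
rm -rf "$BACKUP_DIR/$TODAY"

# 6. Mirror the archives to the remote server (placeholder destination).
rsync -az "$BACKUP_DIR"/ user@remote.example.com:/path/to/backup/

Because of set -e, each step only runs if the previous one succeeded, which covers the "if successful" conditions in the list.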
I'm having a problem with LiteSpeed and Apache: every time I reboot the server, LiteSpeed and Apache are both started, and the server ends up using Apache. I configured LiteSpeed to read an Apache conf file, so I can't just remove Apache. I need a way so that when I restart the server, LiteSpeed is started and Apache is stopped.
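On a chkconfig-based system (RHEL/CentOS style), one hedged fix is to drop Apache from the boot runlevels and leave the LiteSpeed init script enabled. The service names httpd and lsws are assumptions; check /etc/init.d for the actual ones:

Code:
# Stop Apache from starting at boot, but leave its config files in place
# so LiteSpeed can still read them.
chkconfig httpd off
# Make sure LiteSpeed's init script is enabled instead.
chkconfig lsws on
# Apply the swap immediately without a reboot.
service httpd stop && service lsws restart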
Second problem: how do I do this in a script? The script will back up a directory every month, and every backup needs a name with a date in it, like directoryname-date (day/month/year).
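Assuming this is a bash script run from a monthly cron entry, a minimal sketch (slashes can't appear in file names, so the day/month/year parts are joined with dashes instead):

Code:
#!/bin/bash
# Archive the directory with today's date appended,
# producing e.g. directoryname-01-05-2024.tar.gz
STAMP=$(date +%d-%m-%Y)
tar -czf "directoryname-$STAMP.tar.gz" directoryname

Scheduled monthly with a crontab line such as: 0 3 1 * * /path/to/backup.sh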
I am currently trying to create a bash script, run in a loop with a sleep interval, that queries tcpdump (UDP packets only) on a network interface and looks for length-10 packets.
So far so good, not that hard to code, I know (I've already coded it this far perfectly). Now here is the tricky part: I only want the bash script to identify IPs that have sent over 15 packets with a length of 10. (This is the part that I can't seem to find a way to code.)
I was thinking that, from the output, maybe I could count the number of lines with the same IP.
Once this script identifies that, it will automatically run a command which I have set. (Quite easy, and I can do this).
I am looking for someone to help me with this. It is a fairly simple and quick job (editing the script I have at the moment). I am willing to pay (if needed) for this to be completed, too. Obviously not that much, but still something I am sure we can work out.
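A hedged sketch of the counting step, assuming the capture is already narrowed to UDP and that the source address is the third field of each tcpdump line (field positions vary with tcpdump options, so the awk column and the eth0 interface are assumptions to adjust):

Code:
#!/bin/bash
# Capture a burst of UDP packets, keep only length-10 ones,
# then count packets per source IP and flag heavy senders.
tcpdump -nn -i eth0 -c 500 udp 2>/dev/null | grep 'length 10$' |
awk '{print $3}' | cut -d. -f1-4 | sort | uniq -c |
while read count ip; do
    if [ "$count" -gt 15 ]; then
        echo "Offender: $ip ($count packets)"   # run your chosen command here
    fi
done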
--05:03:39-- http://%0D/
Resolving 15... failed: Name or service not known.
FINISHED --05:03:39-- Downloaded: 1 files, 423K in 1.3s (338 KB/s)
tar: : Not found in archive
tar: Error exit delayed from previous errors
./cPanelServer.sh: line 9: cd: csf: No such file or directory
sh: install.sh: No such file or directory
./cPanelServer.sh: line 13: unexpected EOF while looking for matching `"'
./cPanelServer.sh: line 18: syntax error: unexpected end of file

Script contents:
Code:
./yum.sh
cd /home
wget layer1.cpanel.net/latest
sh latest
cd /
rm -fv csf.tgz
wget [url]
tar -xzf csf.tgz
cd /csf
sh install.sh
echo -n "TESTING = "1"
read word
sed "s/$word/TESTING = "0"/g" /etc/file.conf > /etc/file.conf.new
mv /etc/file.conf.new /etc/file.conf
service csf restart
done
It seems the script is having issues extracting "csf.tgz", but I have no clue why.
TBH, this is my first ever script and I'm surprised any of it works!
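For what it's worth, a hedged reading of that log: the wget attempt on http://%0D/ suggests the script was saved with Windows-style line endings (%0D is a carriage return), so running it through dos2unix may clear several of the errors at once, including the failed download that leaves no csf.tgz for tar to extract. The "unexpected EOF while looking for matching `"'" points at the nested double quotes around TESTING; a sketch of just that fragment with the inner quotes escaped (the surrounding lines are unchanged from the script above):

Code:
# Inner quotes must be escaped, otherwise the shell ends the string early
# and keeps scanning for a closing " until it hits end of file.
echo -n "TESTING = \"1\""
read word
sed "s/$word/TESTING = \"0\"/g" /etc/file.conf > /etc/file.conf.new
mv /etc/file.conf.new /etc/file.conf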
My VPS provider didn't enable a lot of modules, and that's why I can't use a firewall (CSF or APF) or a DOS-Deflate script.
I need a simple script for it.
First, it has to call this:
netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
Then the output will be something like (number : IP address):
20 1.2.3.4
40 1.2.3.5
80 1.2.3.6
And then the bash script has to ban IPs with more than 30 connections (in our case, 1.2.3.5 and 1.2.3.6) with this:
iptables -A INPUT -s IP_FOR_BLOCK -j DROP
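Putting those two pieces together, a minimal sketch; the threshold of 30 comes straight from the description above, and the script must run as root since iptables needs it:

Code:
#!/bin/bash
LIMIT=30

netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n |
while read count ip; do
    # Skip empty fields and only ban addresses over the connection limit.
    if [ -n "$ip" ] && [ "$count" -gt "$LIMIT" ]; then
        iptables -A INPUT -s "$ip" -j DROP
    fi
done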
#!/bin/bash
sitepoint=$(ps aux | grep -v grep | grep -c 'process')
if [ "$sitepoint" -le 0 ]; then
    command
fi
I wonder if it can be extended to monitor three processes, rather than making three separate copies of that script, or whether there is some other way to check that more than one process is running or dead.
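One hedged way to generalize it is to loop over a list of process names and restart each one that isn't found; the names and the restart command below are placeholders to swap for your own:

Code:
#!/bin/bash
# Process names to watch (placeholders; use the exact names you'd grep for).
for proc in httpd mysqld sshd; do
    # pgrep -x matches the process name exactly, avoiding the grep -v grep dance.
    if ! pgrep -x "$proc" > /dev/null; then
        # Replace with whatever restart command suits each process.
        service "$proc" restart
    fi
done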
Trying to install yum on RedHat EL4 with Python 2.3.4. I have downloaded [url] and untarred it. I cd'd into the untarred directory. I then ran ./configure and it gave me:
./configure
-bash: ./configure: No such file or directory
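That error just means there is no configure script in the current directory; tarballs usually extract into a versioned subdirectory, so it's worth confirming where the script actually landed before assuming the download is broken. A hedged check:

Code:
# See what the tarball actually created; the configure script often lives
# one level down, in a directory like yum-x.y.z/.
ls
find . -maxdepth 2 -name configure
cd yum-*/ && ./configure    # assuming a single versioned subdirectory exists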
I'm writing an inode and directory size counter, but hit a snag with directories that contain a space.
Simple code: it finds all directories within a folder and lists each one:
for i in `find . -type d`; do ls $i; done
Looks correct? It works great, until you hit directories with spaces. So I try the following methods:
for i in `find . -type d|sed 's/ /\\ /g'`; do ls $i; done
for i in `find . -type d|sed 's/ /\\ /g'`; do ls "$i"; done
What is happening is that, at the `for i in`, it treats each item as ending at the first break, whether that's a \n or a ' '. Is there a flag I can set to make it use only \n? When I pipe the data, it sends each chunk of the directory name through separately.
[root@home /home/mindbend/dev_html]# for i in `find . -type d|sed 's/ /\\ /g'`; do ls $i; done
ls: ./test: No such file or directory
ls: ing: No such file or directory
ls: 12: No such file or directory
ls: 3: No such file or directory
ls: ./test: No such file or directory
ls: ing: No such file or directory
ls: 12: No such file or directory
ls: 3/test: No such file or directory
ls: 2: No such file or directory
# ls -d test\ ing\ 12\ 3/
test ing 12 3/

# ls -d test\ ing\ 12\ 3/test\ 2/
test ing 12 3/test 2/
GNU bash, version 3.2.39(1)-release (i386-portbld-freebsd7.1) Copyright (C) 2007 Free Software Foundation, Inc.
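For the record, the usual fix is to avoid the shell's word splitting entirely by streaming NUL-terminated names out of find; a hedged sketch:

Code:
# -print0 and read -d '' delimit names with NUL bytes, so embedded spaces
# (and even newlines) in directory names survive intact.
find . -type d -print0 | while IFS= read -r -d '' dir; do
    ls "$dir"
done

# Or let find invoke ls directly, with no shell loop at all:
find . -type d -exec ls {} \;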
Can anyone tell me a simple way in Bash to copy all of the contents of a directory (and only the contents), including hidden files, into another, existing directory?
E.g.
Code:
# I have this directory structure
- directory_A
--- existing_file
- directory_B
--- some_file
--- some_subdirectory
--- .some_hidden_file
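Assuming the goal is to end up with everything from directory_B (including .some_hidden_file) inside directory_A, the trailing /. idiom covers hidden files, which a bare * glob would miss:

Code:
# The /. suffix means "the contents of directory_B", hidden files included,
# and -a preserves permissions, timestamps, and subdirectories.
cp -a directory_B/. directory_A/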
We have Plesk Panel 11.5 in Virtuozzo containers (CentOS 6 x86_64), and we often provide our customers with SSH access chrooted via /usr/local/psa/bin/chrootsh. We all know about the Shellshock vulnerability, and we have already installed all fixes to bash, but the chrootsh version is still vulnerable. Here are the results of BashCheck from [URL] ..... under a chrooted user:
Vulnerable to CVE-2014-6271 (original shellshock)
Vulnerable to CVE-2014-7169 (taviso bug)
bashcheck: line 15: 19226 Segmentation fault bash -c "true $(printf '<<EOF %.0s' {1..79})" 2> /dev/null
Vulnerable to CVE-2014-7186 (redir_stack bug)
Test for CVE-2014-7187 not reliable without address sanitizer. Variable function parser still active, likely vulnerable to yet unknown parser bugs like CVE-2014-6277 (lcamtuf bug). Do you plan to release updates for chrootsh?
If I type google.com in my address bar, it forwards me to www.google.com. This is not happening for my website right now. I think it's a good idea to do this, since then search engines will have only one main URL for the website to index.
My question is:
How do I implement this? I think this may involve mucking with CNAME settings...
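Two pieces are usually involved: a DNS record so the www name resolves at all (often a CNAME pointing at the bare domain), and a permanent redirect on the web server so one form always wins. A hedged Apache sketch, assuming mod_rewrite is available and example.com is a placeholder (the same lines work in a .htaccess file):

Code:
# DNS zone: make www an alias of the bare domain (at your DNS provider).
#   www   IN   CNAME   example.com.

# Apache: send bare-domain requests to the www form with a 301 redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The 301 status matters for the search-engine concern: it tells crawlers the www form is the canonical URL.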