I'm trying to compare the speed of my website against others. I know about the ping function, but how do I tell whether a result is fast or slow, for example 200+ ms? Can someone enlighten me on speed and performance?
I believe it's ev1server.net, but I'm not sure if that's correct. This website is jacking some of my cursors. He/she doesn't even rename the files that he/she takes; it's the exact same filename as the ones I have on my website. Also, when you look inside the files you can see information about the cursors. It says something like "created by", "animated by", etc.
I just want to contact his webhost to remove my cursors.
I would like to know how I can find out who hosts a particular website. I checked the whois info at whois.sc, but it did not show the host. Is there another way to do this?
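One approach that often works is to resolve the domain to its IP address and then run whois against the IP; the owner of the netblock is usually the hosting provider (or its upstream). A rough sketch from an SSH shell (example.com and the IP below are just placeholders):

Code:
dig +short example.com      # get the IP the domain resolves to
whois 198.51.100.10         # whois the IP; the netblock owner is usually the host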
I just started using Plesk for my blogs and websites. While testing and finding my way around, I noticed that every site I add, instead of having its own folder like in cPanel (shared hosting), gets added under the primary domain.
How can I do a search (probably using a regex) for all files whose names consist purely of numbers?
For example, find:
53243.php 24353.php 24098.php
(the names always have five digits).
It seems one of my accounts has had some script run which generated a bunch of these in various subfolders, and each PHP file basically does a callback to www3.rssnews.ws and www3.xmldata.info, which seem to be some sort of spyware servers.
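A minimal sketch of the search, assuming GNU find is available and the account lives under /home/username (a placeholder path):

Code:
# list every file whose name is exactly five digits followed by .php
find /home/username -regextype posix-extended -type f -regex '.*/[0-9]{5}\.php'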
There is a directory, say "Master", and inside "Master" there is a sub-directory, "Slave". A user who has access to "Master" should automatically be able to access "Slave". However, a user who has access to "Slave" should not have access to "Master". Inside cPanel this type of protection is not possible.
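Outside of cPanel this can usually be done with plain Apache .htaccess files: the parent directory requires only the "master" user, and the sub-directory overrides that with a Require line listing both users. A rough sketch, assuming htpasswd is available and using placeholder paths and usernames:

Code:
# one password file holding both users
htpasswd -c /home/user/.htpasswd masteruser
htpasswd /home/user/.htpasswd slaveuser

# Master/.htaccess - only masteruser may enter (and, by inheritance, Slave)
cat > /home/user/public_html/Master/.htaccess <<'EOF'
AuthType Basic
AuthName "Master"
AuthUserFile /home/user/.htpasswd
Require user masteruser
EOF

# Slave/.htaccess - overrides the parent Require so both users get in
cat > /home/user/public_html/Master/Slave/.htaccess <<'EOF'
AuthType Basic
AuthName "Slave"
AuthUserFile /home/user/.htpasswd
Require user masteruser slaveuser
EOF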
I want to move the entire contents of a directory tree to another directory.
So for example we may have a directory with 15 directories inside, each of which contains files. I want to copy all the files from the directory tree into another directory located somewhere else on the file system. I want only the "files" to end up in the other directory, not the directory structure.
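A sketch of the usual way to flatten a tree with find, using placeholder paths (swap mv for cp to copy instead of move; note that files with identical names will clobber each other in the destination):

Code:
# move every regular file in the tree into one flat destination directory
find /path/to/source -type f -exec mv {} /path/to/dest/ \;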
If I type google.com in my address bar, it forwards me to www.google.com. This is not happening for my website right now. I think it's a good idea to do this, since then search engines will have only one main URL for the website to index.
My question is:
How do I implement this? I think this may involve mucking with CNAME settings...
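For what it's worth, the DNS side only needs both names to resolve (typically an A record for the bare domain and a CNAME or A record for www); the redirect itself is normally done in the web server rather than in DNS. A minimal sketch for Apache with mod_rewrite, appending to the site's .htaccess and using example.com and a placeholder docroot:

Code:
cat >> /home/user/public_html/.htaccess <<'EOF'
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
EOF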
I can't seem to figure out why this is not working. I want to cd into a directory and only compress certain files. However, what I end up with is a file called ..tgz (I am not sure why it is adding that other dot).
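Without seeing the exact command it's hard to say, but a shell variable in the archive name expanding to "." or to nothing (e.g. tar czf $NAME.tgz with $NAME unset) would produce a file named that way. A sketch of one way to do it, with placeholder paths:

Code:
cd /path/to/dir || exit 1
# name the archive explicitly and list only the files you want in it
tar czf /path/to/backup.tgz *.php *.html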
I have a CentOS web server at my company. It's a dual Opteron. That server also acts as a router, and about 5 computers are connected through it. My web server has slowed down; can I find out who is using my bandwidth? Those 5 computers only have local IPs (10.10.0.x). The website is just for me; no one else knows about it.
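One quick way, since the box is already doing the routing, is to add per-host accounting rules to the FORWARD chain and watch the byte counters. A sketch assuming iptables and a 10.10.0.x LAN (the exact addresses are placeholders):

Code:
# one counting rule per LAN host; no -j target, so traffic is only counted
for ip in 10.10.0.1 10.10.0.2 10.10.0.3 10.10.0.4 10.10.0.5; do
    iptables -I FORWARD -s $ip
    iptables -I FORWARD -d $ip
done

# read the per-rule packet/byte counters to see who moves the most traffic
iptables -L FORWARD -v -n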
Someone posted some code similar to what's below; I made a modification or two while trying to detect PHP "nobody" users. After dumping a few printenv calls I found that PHP exports PWD when calling an external program such as sendmail. Basically, PWD shows the user directory the call is coming from, which is enough to detect who is sending spam even as nobody! It's not 100% secure, in that they could wipe /var/log/formmail, but I don't imagine any spammer will notice the logger; they presume any cPanel server (or any other control panel, for that matter) is the same.
mv /usr/sbin/sendmail /usr/sbin/sendmail2
pico /usr/bin/sendmail (paste the code below into it)
chmod +x /usr/bin/sendmail
echo > /var/log/formmail.log
chmod 777 /var/log/formmail.log
#!/usr/local/bin/perl
# use strict;
use Env;
my $date = `date`;
chomp $date;
# log every invocation; PWD reveals which account called sendmail
open (INFO, ">>/var/log/formmail.log") || die "Failed to open file ::$!";
my $uid = $>;
my @info = getpwuid($uid);
if ($REMOTE_ADDR) {
    print INFO "$date - $REMOTE_ADDR ran $SCRIPT_NAME at $SERVER_NAME\n";
}
else {
    print INFO "$date - $PWD - @info\n";
}
close (INFO);
# reconstructed tail: hand the message off to the renamed sendmail2 so mail still flows
my $arg = join(' ', @ARGV);
open (MAIL, "|/usr/sbin/sendmail2 $arg") || die "cannot open /usr/sbin/sendmail2: $!\n";
while (<STDIN>) {
    print MAIL;
}
close (MAIL);
Also, I'm looking for a specific cron job right now (xbt_cron). Once I find it, what command do I use to run it manually? It's supposed to run by itself. I just moved to a new server last week and now it has stopped working.
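A rough sketch of how I'd track it down, assuming it's in a user crontab or one of the system cron directories (username is a placeholder):

Code:
crontab -l -u username | grep -i xbt
grep -ri xbt /etc/crontab /etc/cron.d /etc/cron.daily /etc/cron.hourly 2>/dev/null

Whatever appears after the time fields in the matching line is the actual command; pasting that into the shell (as the same user) runs it manually.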
I tried searching for it on Google but couldn't find any server company offering a VPS using LiteSpeed instead of Apache. Yes, I know that Apache can be optimized, but I would like to try a VPS or dedicated server with LiteSpeed, just for testing and learning how to use and troubleshoot it. Does anybody here know of a supplier? I would need less than 10 GB of space, 250+ MB of RAM, as well as cPanel.
I run a site that does a lot of transfers AND uses a lot of CPU resources. I think I would like to get two different hosting plans to deal with these different patterns of usage, but where can I find a good host offering lots of bandwidth that is content to have it actually used? I've been with DreamHost for a while but they don't allow "data archiving" or whatever (and I have arguments against their claims but it's neither here nor there). So, really, where the heck can a guy find a good host offering plenty of space and bandwidth? Keep in mind I need pretty much no CPU power with such a plan; the web server can be stone cold stupid for all I care, as I can just get a VPS to run the CPU-intensive part of my site!
If I have PID 1122, an Apache httpd process using 99% CPU, how can I find the corresponding website that is using that much CPU, so I can suspend it?
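Two things that usually narrow it down, assuming mod_status with ExtendedStatus On is enabled (1122 is just the PID from the example above):

Code:
# the extended status page lists PID, client, vhost and request per child
apachectl fullstatus | grep 1122

# open files and the working directory often point straight at the account's docroot
lsof -p 1122 | grep /home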
Code:
root@server [~]# service exim restart
Shutting down clamd:       [  OK  ]
Shutting down exim:        [  OK  ]
Shutting down antirelayd:  [FAILED]
Shutting down spamd:       [  OK  ]
Starting clamd:            [  OK  ]
Starting exim-26:          [  OK  ]
Starting exim:             [  OK  ]
Starting exim-smtps:       [  OK  ]
Starting antirelayd: Cannot find the maillog at /usr/sbin/antirelayd line 26.  [FAILED]
Starting spamd:            [  OK  ]
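For what it's worth, antirelayd works by reading the mail log that syslog writes, so that error usually means /var/log/maillog is missing or syslog isn't writing to it. A sketch of the usual quick check on a CentOS/cPanel box (paths assume the stock layout):

Code:
ls -l /var/log/maillog           # does the maillog exist at all?
touch /var/log/maillog
/etc/init.d/syslog restart       # make sure syslog is logging mail.* there again
service exim restart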
Assuming that one was to get a local office in a town, how would someone find a building or area that had high availability of fiber nearby but was not a datacenter? Are there fiber maps for each big city? Does anyone have fiber maps for Houston, Texas? I would be interested in seeing these maps if possible.
I can't find wget on a hosting account. The SSH command find / -name wget returns nothing, yet wget works fine there. What could the problem be?
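A couple of quick checks that usually explain it; on shared hosting, find often silently skips directories it isn't allowed to read, and wget can also be an alias or wrapper:

Code:
type -a wget    # alias, shell function, or real binary?
which wget      # where in $PATH does it actually live?
echo $PATH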
Some of our servers are complaining that they can't access the website. How can we check the blocked IPs in the iptables rules? Is there any special command to check?
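A sketch of the usual commands, with 203.0.113.10 standing in for the IP you suspect is blocked:

Code:
# list all rules with counters and line numbers
iptables -L -n -v --line-numbers

# look for DROP/REJECT rules matching a specific source IP
iptables -L -n | grep -E 'DROP|REJECT' | grep 203.0.113.10

If the server happens to run CSF, I believe csf -g 203.0.113.10 performs the same lookup.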