Somebody obtained my forum's database password (I don't know how), then used it to log in, modify the forum style templates, and inject iframe code. I took the following actions, but the problem persists:
Initially I found a PHP shell uploaded to my site; I deleted it and verified there were no others.
1- I changed the database password and the FTP password.
2- I encoded the config file with Zend.
3- I set chmod 751 on directories and 644 on files.
I did all of this, yet I am still being hacked on a daily basis.
How are these hackers getting into my server?
How can I close this hole?
Which log files would show me the details of how the database was accessed?
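A practical starting point is the web server's access log: a re-uploaded shell almost always arrives as a POST to a PHP script, and execution of PHP from a writable directory is a strong sign of compromise. The sketch below assumes a typical Apache layout; the log path and directory names are assumptions you must adjust for your server.

```shell
#!/bin/sh
# Sketch: scan an Apache access log for the usual re-infection patterns.
# The log path and the upload-directory names are assumptions.
LOG="${1:-/var/log/apache2/access.log}"

# 1) POST requests to PHP scripts -- re-uploaded shells usually arrive this way.
grep '"POST [^ ]*\.php' "$LOG"

# 2) Requests that execute a PHP file from a writable upload directory,
#    where no PHP should ever run.
grep -E '"(GET|POST) /(uploads|images|avatars)/[^ ]*\.php' "$LOG"
```

Requests that appear in both passes, or POSTs from a single IP right before each defacement, usually identify the entry point.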
Can anyone recommend an Apache and MySQL optimization expert? I just had some optimization done on MySQL and upgraded Apache... but Apache wouldn't restart with the following entries:
If I remove the above, it starts... but my CPU spikes and I am getting a ton of errors emailed to me:
Invalid SQL: SELECT COUNT(DISTINCT(userid)) AS count FROM vb_session WHERE vb_session.userid>0 AND vb_session.lastactivity>1245294346;
MySQL Error : Out of memory (Needed 8388580 bytes)
Error Number : 5
Request Date : Wednesday, June 17th 2009 @ 09:39:06 PM
Error Date : Wednesday, June 17th 2009 @ 09:39:06 PM
It's constantly complaining about this same query, and I thought I had removed all of the vBulletin options...
I am pulling my hair out... because my server isn't swapping and there are over 8 GB in cache... but my CPU usage still goes up.
Here is the original problem thread...
[url]
The 2nd post has the recommended config... and my subsequent problems.
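One hedged observation: the failed allocation of 8,388,580 bytes is just under 8M, a size that usually corresponds to a per-connection or per-sort buffer rather than the main caches, so the "recommended config" may have set a per-connection value far too high for the available RAM. The fragment below is a sketch of the my.cnf values worth re-checking; the numbers are illustrative placeholders, not tuning advice.

```ini
# my.cnf sketch -- illustrative values only; compare against what the
# recommended config actually set. Per-connection buffers are multiplied
# by max_connections, so "bigger" here is often what exhausts memory.
[mysqld]
sort_buffer_size     = 2M
read_buffer_size     = 1M
read_rnd_buffer_size = 1M
tmp_table_size       = 64M
max_heap_table_size  = 64M
```

Since the failing query scans vb_session by lastactivity, also confirm that column is indexed; a full scan of a busy session table on every page view would explain the CPU spikes as well.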
I'm not sure how many people here use VMware, but I'll give this question a shot... I'm looking for a solution, similar to WinRAR, that can view the contents of a .vmdk file: either to extract files, or just to view the contents without having to bring the virtual machine online. I checked Google as well as VMware's forums/website to no avail.
I have a dedicated server with a Swedish hosting company. We have a network/community website that can have about 300 people online at the same time. But as soon as it reaches 240-280 people, users get logged out (sessions are closed) and the website becomes extremely slow.
What can we do to have at least 2,000 people online at the same time without any sessions being killed?
Do we have to upgrade the memory from 1 GB to 2 GB? Or do we need to go from a 10 Mbit to a 100 Mbit port? An unmetered server?
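A hedged note on where that ceiling likely comes from: a hard wall at a few hundred concurrent users is more typical of a memory/process limit than of bandwidth, because with 1 GB of RAM Apache runs out of room for worker processes long before a 10 Mbit port saturates. The fragment below is only capacity arithmetic expressed as a config sketch; every number is an illustrative assumption, not a recommendation.

```apache
# httpd.conf sketch (prefork MPM) -- illustrative arithmetic, not tuning advice.
# Assume ~1 GB RAM, of which MySQL and the OS take perhaps 400 MB, leaving
# ~600 MB for Apache. At a rough 25 MB per PHP-enabled child, that is only
# about 600 / 25 = 24 workers before the box swaps -- and swapping is exactly
# what makes the site crawl and drop sessions under load.
<IfModule prefork.c>
    StartServers          8
    MaxClients           24
    MaxRequestsPerChild 2000
</IfModule>
```

By that arithmetic, more RAM (and checking what MaxClients is currently set to) is the first upgrade to consider; the port speed only matters once the server itself can keep up.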
We have LiveZilla tracking on our site, and we saw some suspicious activity this afternoon from the Netherlands: about 5 connections from the same IP address. They now appear to have put in a CNAME record / copied our site, pointing http://wiiee.nl/design.html at our site. Does anyone know what they're doing or trying to do?
So I've just been browsing around and checked out a couple of people's portfolios. Some of you have hundreds of domain names registered, waiting for someone to buy them.
So that's my question: how can you have so many domains registered? Isn't that very expensive?
I'm having my first DDoS attack. Do people DDoS servers for no reason? Because they are attacking an IP on a server that hasn't hosted any sites or seen any use for 4 years.
So we just switched servers, but some people are reporting that they either cannot access the site or are not being shown the newest content. What is causing this?
I just switched to a new server last month and I'm having a problem where a lot of my members, and even some of my mods, are unable to connect to my website. They just get a "page cannot be displayed" error. The website is ftascene(dot)com. Is everyone here able to connect to the site, and does anyone have an idea why not everyone can?
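After an IP change, members who still can't connect are almost always behind resolvers or ISP caches that are holding the old A record until its TTL expires. As a stopgap, an affected member can bypass DNS entirely with a hosts-file entry; the IP below is a placeholder for the new server's real address.

```ini
# /etc/hosts sketch (on Windows: C:\Windows\System32\drivers\etc\hosts).
# 203.0.113.10 is a placeholder -- substitute the new server's actual IP.
203.0.113.10   ftascene.com   www.ftascene.com
```

This only helps the individual who edits the file; the general fix is simply waiting out the old record's TTL.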
I recently moved a client of mine to a dedicated server with a hosting company called the NYNOC. After losing a few weeks of data for my client in the incident described below, I am wondering whether what happened is industry standard, or whether this was a case of choosing the wrong hosting company.
After signing up and paying the initial hosting charge, I received two notices, one on 18 July and the second on 26 July, for a bill that was due on 25 July.
On 2 August I got a complaint from my client saying the website was down. I figured the hosting company had cut me off for a late bill. Fine, I figured I would go and pay the bill plus the reconnection charge and they would put me back online.
After waiting 8 hours to reach a support person (they don't have 24-hour customer service), I was told that I no longer had an account with them! No client login. No server. Nothing. The disk had been wiped clean and the server sold to another customer, without any warning that this would happen.
So my client lost a few weeks of customer data and I lost my credibility.
I was just wondering if this is standard practice in the dedicated hosting industry, or if these guys are just bad news.
One more thing: when you talk to these people (they have only one customer support person), it's almost as if they really, really dislike their customers. They talk down to you even as they exhibit complete disregard for your interests. This, I know, is not an industry standard. I have several accounts at Hurricane Electric, and they are tops. Unfortunately they do not offer well-priced dedicated hosting.
I think I will try Interserver next. I spoke with one of their support guys on AOL late at night, and they seem good and, more importantly, are able to have a pleasant conversation with their customers.
I have a folder chmodded to 777. It needs to be writable so that legitimate users can upload their pictures. However, I do not want people to be able to upload .php files (or double extensions like .php.pjpeg, etc.) to the server.
Sometimes they do not even use the upload form on my site to upload the PHP file. How can they do that? Via a Perl command? And how do I prevent such a thing from happening?
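A world-writable directory can be written to through any compromised script on the server, not just your form, so rather than trusting the upload path, a common mitigation is to make the directory incapable of executing scripts at all. A sketch, assuming Apache 2.2 with mod_php (adjust the directive syntax if your setup differs):

```apache
# .htaccess sketch for the 777 upload directory -- assumes Apache 2.2 + mod_php.
# Refuse to serve anything whose name contains a PHP-like extension; matching
# anywhere in the name also catches double extensions like picture.php.jpeg.
<FilesMatch "\.ph(p[0-9]?|tml)">
    Order Allow,Deny
    Deny from all
</FilesMatch>
# Belt and braces: turn the PHP engine off entirely for this directory.
php_flag engine off
```

With this in place, an uploaded shell is just an inert file: it can sit on disk, but Apache will never run it.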
I migrated my server on Friday and changed the IP address in my DNS. A few customers are still accessing the old server. What can I do? Is it possible to flush some persistent DNS cache somewhere?
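Unfortunately you cannot flush caches you do not control: every resolver between you and your customers is entitled to keep the old A record until its TTL expires. What you can do is keep the old server answering (or forwarding) until the old TTL has run out, and for the next migration, lower the TTL a day or two in advance. A BIND-style zone sketch with placeholder name and IP:

```ini
; Zone file sketch -- the name and 203.0.113.10 are placeholders.
$TTL 300                        ; 5 minutes during the migration window
www   IN   A   203.0.113.10    ; the new server's address
; Raise $TTL back to something like 86400 once the move has settled.
```

Remember the lowered TTL only takes effect for resolvers that re-fetch the record, so it has to be in place at least one old-TTL period before the actual switch.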
I have a free hosting server and a rule that uploads may be 3 MB max. It works for FTP, but somehow it doesn't work for PHP. For PHP, the limit on my server seems to be 100 MB (no idea why).
I use the following directives to limit file size in php.ini:
; Maximum size of POST data that PHP will accept.
post_max_size = 4M
(4M just for some margin)
; Maximum allowed size for uploaded files.
upload_max_filesize = 3M
And I can still find 100 MB files on disk. This is part of the Apache log from the account that uploaded one:
HTTP/1.1" 302 188 [url] "Mozilla/5.0 (Windows; U; Windows NT 5.1; pl; rv:1.8.1.3) Gecko/20070309 Firefox/2.0.0.3"
As a result of this (at least I think so), there was a 100 MB file in his home directory.
Any idea how he can POST such big files even with those two directives in place?
I have also set LimitRequestBody to 5194304 and LimitXMLRequestBody to 5194304 in apache2.conf, which should also stop bodies as big as 100 MB from being POSTed.
I have PHP 4.4.4-9 on Debian Linux, with Apache 2.2.3 running the worker MPM and PHP as FastCGI.
P.S. I removed server details like the IP, directories, and addresses so as not to expose specifics about my server in public; I put [] there instead.
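One thing worth ruling out, given that PHP runs as FastCGI: edits to the global php.ini do not reach already-running php-cgi processes until they are restarted, and on many shared setups each account has its own FastCGI wrapper whose PHPRC points at a user-controlled php.ini that silently overrides the global limits. The wrapper below is entirely hypothetical; every path in it is an assumption, shown only so you know what to look for in your own configuration.

```ini
#!/bin/sh
# Hypothetical per-account FastCGI wrapper -- all paths are assumptions.
# If your server spawns PHP through wrappers like this, PHPRC decides which
# php.ini is loaded; a user-writable php.ini in that directory would override
# the global 3M/4M limits and would explain the 100 MB uploads.
PHPRC=/var/www/user1/etc        # directory searched for php.ini
export PHPRC
PHP_FCGI_CHILDREN=4
export PHP_FCGI_CHILDREN
exec /usr/bin/php-cgi
```

Checking which php.ini the web SAPI actually loads (e.g. via a phpinfo() page under that account) should confirm or eliminate this quickly.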
I installed mod_security and the 403security rules on my VPS (CentOS 4.1, release version of WHM).
Several vBulletin files, including the AJAX quick editor and some vbulletin.org add-ons, are triggering this rule and getting members' IPs banned in CSF:
# Restrict which content encodings we accept.
#
# TODO Most applications support only two encodings for request bodies
# because that is all browsers know how to produce. If you are using
# automated tools to talk to the application you may be using other
# content types and would want to change the list of supported encodings.
#
# Note though that ModSecurity parses only three content encodings:
# application/x-www-form-urlencoded, multipart/form-data request and
# text/xml. The protection provided for any other type of encoding is
# inferior.
#
# TODO There are many applications that are not using multipart/form-data
# encoding (typically only used for file uploads). This content type
# can be disabled if not used.
#
# NOTE We allow any content type to be specified with GET or HEAD
# because some tools incorrectly supply content type information
# even when the body is not present. There is a rule further in
# the file to prevent GET and HEAD requests from having bodies, so we're
# safe in that respect.
#
# NOTE Use of WebDAV requires "text/xml" content type.
#
# NOTE Philippe Bourcier (pbourcier AT citali DOT com) reports
# applications running on the PocketPC and AvantGo platforms use
# non-standard content types:
#
# M-Business iAnywhere    application/x-mal-client-data
# UltraLite iAnywhere     application/octet-stream
#
SecRule REQUEST_METHOD "!^(?:get|head|propfind|options)$" "chain, t:lowercase, deny,log,auditlog,status:501,msg:'Request content encoding is not allowed by policy',id:'960010',severity:'4'"
SecRule REQUEST_HEADERS:Content-Type "!(?:^(?:application/x-www-form-urlencoded$|multipart/form-data;)|text/xml)"
I don't know how to decipher this rule well enough to tell whether simply removing it is OK, or whether it serves an important purpose. During the couple of hours it was enabled, the rule only seemed to trigger false alarms.
The above was triggered by calls such as [uri "/forums/ajax.php?do=usersearch"] and [uri "/forums/newreply.php?do=postreply&t=11057"].
What I really don't understand is that I have an .htaccess in place to turn off mod_security for the /forums directory:
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
I have also had this rule triggered today when someone tried to access: ...
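A likely explanation for the .htaccess having no effect, offered as an assumption to verify rather than a certainty: SecFilterEngine and SecFilterScanPOST are mod_security 1.x directives, while the rule that is firing (SecRule ... id:'960010') is 2.x Core Rules syntax, and mod_security 2.x ignores the old 1.x directives. If that matches your setup, a narrower fix than deleting the whole rule is to remove it by ID for the forum paths in the server or vhost config:

```apache
# Sketch for the vhost or main httpd.conf -- assumes mod_security 2.x.
# Drop only rule 960010 (the content-encoding policy check) for vBulletin
# URLs, leaving the rest of the ruleset active for them.
<LocationMatch "^/forums/">
    SecRuleRemoveById 960010
</LocationMatch>
```

That keeps the encoding check for the rest of the site while stopping the vBulletin AJAX false positives from feeding IP bans into CSF.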