Is there a way to detect traffic spikes using WebLogic Server logs and, depending on the traffic load, redirect to a static page or load the dynamic page?
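WebLogic access logs are plain text, so one approach (a sketch, not WebLogic-specific functionality) is a cron'd script that counts recent requests and flips a flag file; a front-end rewrite rule can then check that flag and serve the static page. The log path, threshold, and flag file below are all illustrative assumptions, and a tiny sample log is generated so the sketch is self-contained:

```shell
# Sketch: count requests in the sampling window of an access log and
# flip a "static mode" flag when the rate crosses a threshold.
# LOG, FLAG, and THRESHOLD are assumptions for illustration.
LOG=./access.log
FLAG=./static_mode.flag
THRESHOLD=5

# Tiny sample log so the sketch runs anywhere; in practice you would
# tail only the lines logged since the last cron run.
printf 'GET /a\nGET /b\nGET /c\nGET /d\nGET /e\nGET /f\n' > "$LOG"

COUNT=$(wc -l < "$LOG")   # requests seen in the window

if [ "$COUNT" -gt "$THRESHOLD" ]; then
    touch "$FLAG"         # front-end rule serves the static page
else
    rm -f "$FLAG"         # back to the dynamic page
fi
```

With six sample requests against a threshold of five, the flag file gets created; your web tier just needs a rule that tests for its existence.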
I currently pay $9.99 USD per month for a dedicated server to host my little news site.
The specs are 667 MHz / 256 MB RAM / CentOS / roughly 20-40 Mbit of bandwidth on a shared 100 Mbit port.
Anyway, one of my articles may get dugg soon and become popular, and I am wondering: is there any way I can tweak this box to survive Digg?
I'm using WordPress at the moment, with lighttpd and MySQL, and I could probably just serve the article as a static HTML page for the digging.
I think that even with lots of tweaking and serving a static HTML page, it may still die.
So, anyway...
My maximum budget is approximately $25 USD per month, but I'd like to spend less, ideally.
Do you think I should A) get a 2800+ Sempron / 2.8 GHz P4 with 1 GB of RAM and either unmetered 10 Mbit or 100 Mbit with 1.5 TB of transfer,
B) shared hosting? Or a VPS? Or something else?
Or just stick with what I have now?
Basically it will just be news articles / blog type stuff + some images.
The reason I say 10 Mbit unmetered is that at least I know what my bill is going to be, etc.
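Since the post mentions falling back to static HTML on lighttpd, one cheap way to prepare (a sketch, with example.com and the output directory as placeholders) is to mirror the whole WordPress site to flat files in advance:

```shell
# Sketch: snapshot the WordPress site to static HTML before the spike.
# example.com and the output directory are placeholders.
wget --mirror --convert-links --page-requisites \
     --directory-prefix=/var/www/static http://example.com/
```

You would then point lighttpd's document root at the snapshot for the day of the digging; lighttpd serving flat files is far cheaper per request than PHP + MySQL.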
What type of hosting account do I need for a Digg-like news service (or social bookmarking site)? Is Bluehost enough? They offer unlimited data transfer/domains/space for about $6 per month. What other factors should be considered for a file hosting or image hosting site?
Please share your experience about having your site dugg.
What hosting solution would be the best shot for a traffic-intensive site? Are there any shared hosts that let you survive the Digg effect? Might a shared plan with unmetered bandwidth and disk space be the way out? I'm talking about something like the Deluxe Plus plan from webhostgiant.com.
I learned the hard way last year when my website (on GoDaddy shared hosting) made the front page of Digg. GoDaddy suspended my account in a hurry (and didn't bother to inform me, but that's another story). I'm planning to get a VPS account with SLHost to prepare for future traffic growth.
How should I configure the server to best handle a huge spike in traffic? From what I can gather, there are a number of factors:
- Max HTTP connections (MaxClients in Apache)
- Max number of open file handles allowed (a kernel thing)
- Virtuozzo allowed TCP connections
This post at webhostingtalk.com/showthread.php?p=4552677#post4552677 by Josh at SLHost outlines the defaults for their VPS servers:
Quote:
Are you referring to HTTP connections or other? By default, the MaxClients setting is at 256 clients and would need a recompile if you want more. The number of open files allowed is set to 1024 by default and can be raised. There are also Virtuozzo allowed TCP connections, which is set at 1200 and we've noticed that anything more than that should either be on an Enterprise VPS package or low end dedicated server at least.
Should I do any tweaking to the defaults if I want to survive another Digg onslaught?
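One note on the quoted defaults: the "recompile" limit applied to Apache 1.3; on Apache 2.x prefork you can raise the ceiling with the ServerLimit directive in httpd.conf instead. The values below are an illustrative sketch, not tuned numbers for any particular VPS:

```apache
# Sketch (httpd.conf, prefork MPM) - values are illustrative, not tuned.
# ServerLimit must be >= MaxClients and must appear before it.
<IfModule prefork.c>
    StartServers          8
    MinSpareServers       5
    MaxSpareServers      20
    ServerLimit         512
    MaxClients          512
    MaxRequestsPerChild 4000
</IfModule>
```

For the file-handle side, the per-process limit can be raised in /etc/security/limits.conf (or with `ulimit -n` in the init script); on a Virtuozzo VPS the TCP connection ceiling is a container-level setting only the host can raise, so that one is a support ticket, not a config edit.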
How can I control or cap traffic on a per-server basis? In other words, I have 15 servers in one cabinet, all fed by a single switch (a Dell 3448). One of the servers is eating up almost all the traffic for the cabinet. Is there a way I can cap or limit the traffic quota on a per-port basis at the switch level? Or what is the best way to manage this?
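If per-port rate limiting turns out not to be available (or practical) on the switch, a fallback, assuming the greedy machine runs Linux and you have root on it, is to shape its NIC directly with `tc`. The interface name and rate below are placeholders:

```shell
# Sketch: cap one server's NIC to ~10 Mbit with a token bucket filter.
# eth0 and the rate are placeholders; run this on the greedy server.
tc qdisc add dev eth0 root tbf rate 10mbit burst 32kbit latency 400ms
```

This only shapes traffic the server sends; inbound flood from outside still has to be handled upstream.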
I'm setting up Windows game-server VPSes with VMware ESXi and wonder whether there is some option to control the traffic of each IP. I thought about using a Cisco ASA 5500, but I do not know if it has this option:
Imagine you want a set of servers (VPSes would be a cheaper choice, which is why I am posting here) that do not generate much outbound traffic but download heavily from other servers (more or less like spiders, though I am not trying to create a web index). Disk space and memory size are not important, but port speed and monthly transfer should be as high as possible. Since inbound-heavy traffic is the less common profile, I wonder if any provider offers cheaper rates for it.
I have been searching the forums and have not found much about this topic (a fairly related post named "I want to download the Internet" or something similar did not reach a conclusion).
I am not sure if my dedicated server is being attacked or if it is legitimate traffic. I need help figuring out the difference and if it is an attack, how to prevent it, and if it is legitimate traffic, how to configure the server to handle the load.
Software: CentOS 5.3 (32-bit), Apache 2, MySQL 5, PHP 5. When I run `ps aux | grep httpd | wc -l` I get a count of 259 currently connected clients, which is always maxing out my MaxClients of 256. I increased it to 512 and it maxed out; I increased it to 1024 and it maxed out; finally I set it to 2048, which works but slows the entire server down.
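A quick way to tell an attack from legitimate load is to look at how many connections each remote IP is holding: real traffic is spread across many addresses, while a simple flood is dominated by a few. The classic pipeline is `netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn`; below it is fed three sample netstat-style lines so the sketch is self-contained and runnable:

```shell
# Sketch: count established connections per remote IP.
# In production, pipe real output:  netstat -ntu | awk ... | sort -rn
# Three sample lines stand in here so the pipeline can be run as-is.
netstat_sample() {
cat <<'EOF'
tcp 0 0 10.0.0.1:80 203.0.113.5:40001 ESTABLISHED
tcp 0 0 10.0.0.1:80 203.0.113.5:40002 ESTABLISHED
tcp 0 0 10.0.0.1:80 198.51.100.7:51000 ESTABLISHED
EOF
}

# Column 5 is the foreign address; strip the port and tally per IP.
netstat_sample | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn
```

Here 203.0.113.5 shows up twice and tops the list; on a real box, a handful of IPs each holding dozens or hundreds of connections is a strong sign of a flood, and those are the addresses to block or rate-limit.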
Recently I noticed the load on one of my servers was way beyond what I would expect it to be. I run multiprocessor servers, and even during a backup the load is only around 1.5.
But lately I noticed peak loads that high under normal web traffic.
I know 1.5 is low on a multiprocessor server, but I am hoping to add much more to those machines, and with sustained load that high there is no room for expansion. The servers are not cheap, so adding another server to the cluster can only be done if I make money from the last one I added.
I checked the traffic levels and they were very high. After further review I had some bots hitting sites at over 1200 pages a minute. Multiply that by a few hundred bots and clearly I could have a load issue. The potential is there to bring any server to its knees when delivering those volumes.
I wrote a program to watch connections and block the abusive bots. While logging I became aware of over 600 bots crawling my servers, many from Japan, China, Germany and so on, useless to my customers even if they are legitimate search indexes.
Another problem I see is that the bots are running from many IP addresses and hitting the same sites from multiple IPs at the same time. Why would they need to do that?
Among other things I decided to validate googlebot, msn and yahoo with dns lookups so I could determine that they were actually their bots and not imposters. In 24 hours I found valid bots from the big three hitting one server from 1100 different ips.
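For anyone wanting to do the same validation, the usual round trip (this is a sketch; the address below is just an illustrative one from Google's crawl ranges) is a reverse lookup followed by a forward lookup on the name it returns:

```shell
# Sketch: two-step validation of a claimed Googlebot address.
# 66.249.66.1 is an illustrative crawler address, not a fixed one.
host 66.249.66.1                       # reverse: name should end in googlebot.com
host crawl-66-249-66-1.googlebot.com   # forward: should return the same IP
```

Only treat the client as Googlebot if the forward lookup resolves back to the original IP; an impostor can forge User-Agent and even the reverse PTR record, but not the forward record in googlebot.com's zone. The same trick works for MSN/Bing and Yahoo with their respective crawl domains.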
Now we are looking at thousands of valid bots and thousands more email harvesters and content thieves.
As a host, the number of sites I can host on a server is greatly reduced by the bot traffic. My customers do not want to hear that their website was being crawled at 3,000 pages a minute and that is why they could not access it. Of course they will blame it on me.
I was able to filter the bots at the firewall level, dropping connections based on reverse DNS lookups and site crawl rates, and my server load now sits around 0.05 most of the time, even with hundreds of pages a minute being served.
I am wondering how the rest of you hosts deal with this problem. Do you leave it up to your hosting customers? Or do you have some type of filter to get rid of the bots?
When you have a few sites it is not really a problem, but as you grow it spirals exponentially out of control.
I have a VPS with iptables, but I'm getting too much inbound traffic (RX): too many packets received on random ports, over both UDP and TCP. Today, in just 14 hours, I received 2.8 GiB of traffic without any web, email, or other connections (I stopped all the services). How can I stop this? It is going to burn all my monthly traffic.
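A default-deny inbound policy is the usual first step; a minimal sketch (assuming SSH is the only service you still need reachable) looks like this:

```shell
# Sketch: default-deny inbound, allow only loopback, established
# flows, and SSH. Adjust ports to whatever you actually run.
iptables -P INPUT DROP
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j ACCEPT
```

One caveat: packets dropped by your own iptables have already crossed your uplink, so if the provider bills RX at the network edge, this stops your host from responding but may not reduce the metered inbound bytes; in that case, ask the provider to filter the flood upstream (some null-route or firewall the source for free on request).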
I've only ever had a shared hosting account with Hostgator, plus a few freebie hosts. However, I'm now pulling some heavy traffic and I'm concerned that Hostgator is going to suspend me soon.
My traffic on Saturday, for example, was ~2,600 unique visitors and ~5,000 page views. All of this traffic went to WordPress blogs and a small SMF forum. I've since converted one of the blogs to a static site to limit my CPU usage, and I've set up caching for my other WordPress blogs. Advice I've seen on the Hostgator forums is that 7,000 page views per day for a database-driven site is around the time you should be upgrading; based on my traffic from Saturday (which admittedly was a bit of a spike) I could be receiving around 150,000 page views/month, which puts me close to the ~210,000/month that threshold works out to.
Anyhow, in a nutshell, I need to upgrade or risk Hostgator throwing a tantrum at me... but I don't have a lot of cash to pay for an upgrade. Due to my lack of cashflow I've been considering moving to a VPS. The company that has interested me the most is HostV.com, which offers a 256 MB plan (with 1000 MB 'burst' RAM) for US$39.99, which seems quite reasonable to me.
They say that their 256 MB plan should be able to handle over 5,000 page views per day for a WordPress site, but I'm a little suspicious. Do any of you know if this is a reasonable expectation for a 256 MB slice of a virtual server? I have no idea, and I'm always wary of believing the sales pitch of a random company on the other side of the world.
I just want to ask: my ISP told me my server is generating high traffic from outside, and pasted me their traffic log with one IP address (xx.xx.xx.xx).
They rebooted my server and the problem disappeared, but I need to check what has been going on. Where do I start? The only information I have is the IP xx.xx.xx.xx.
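The logs are the first place to look. A sketch of the approach: grep every log you have for the address the ISP reported. The xx.xx.xx.xx in the post is redacted, so the address below is a stand-in, and a sample log line set is generated so the sketch is self-contained; in practice you would grep /var/log/secure, /var/log/messages, and your web server's access logs:

```shell
# Sketch: search logs for the address your ISP reported.
# 203.0.113.9 is a stand-in for the redacted xx.xx.xx.xx.
SUSPECT=203.0.113.9

# Sample log so the sketch runs anywhere; in practice grep the real
# files: /var/log/secure, /var/log/messages, Apache access logs, etc.
printf '%s\n' \
  'Failed password for root from 203.0.113.9 port 22' \
  'Accepted password for bob from 198.51.100.2 port 22' \
  'Failed password for root from 203.0.113.9 port 22' > sample_secure.log

grep -c "$SUSPECT" sample_secure.log   # how many hits in this log
grep "$SUSPECT" sample_secure.log      # and what they were
```

Repeated failed logins point to brute-forcing; hits only in web access logs point to scraping or a flood; no hits at all suggest the traffic was outbound (a compromised process), in which case `last`, `netstat -ntup`, and a rootkit scan are the next steps.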
I just recently upgraded my website from WordPress to WordPress Mu.
Everything went smoothly except for one problem. On WordPress, all my posts would appear at [url], but with WordPress Mu they now appear at [url]. So whenever someone visits [url] or [url] they get a 404 error, because the post no longer exists at that location.
I know there is a way, like a wildcard or something, to make it so that whenever anyone visits [url]anything it redirects to [url]whatever else was typed, no? I can't figure out how to search for that exactly, and I've tried reading through the .htaccess docs but can't figure out how to make this work.
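Since the old and new URL structures are scrubbed from the post, this can only be sketched with a guessed layout: the example below assumes the Mu install moved posts under a /blog/ prefix, which is hypothetical and must be adjusted to the real paths. The mechanism itself is standard mod_rewrite in .htaccess:

```apache
# Sketch (.htaccess): 301-redirect old permalinks to the new location.
# The /blog/ prefix is a GUESS at the new WPMU structure - change it.
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/blog/
RewriteRule ^(.+)$ /blog/$1 [R=301,L]
```

The RewriteCond stops the rule from looping on URLs that already have the new prefix, and R=301 tells search engines the move is permanent so the old links keep their ranking.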
I have 4 subdomains on qisoftware.com, and most show network traffic between 30-34%, with unresolved traffic around 12-14%. Is that network traffic statistic high? What would be considered normal?
A proxy server can mask an IP address, right? Does a proxy server show up as network traffic in site statistics reports?
Okay, maybe that's enough questions for right now. I have been researching the internet for terms but I am not finding what would be considered normal.
An ad network requires my website to have a certain amount of traffic for X days to qualify, but they won't provide stats and have asked me to log the stats myself.
For incoming traffic stats I already use AWStats etc., but is there anything available for logging outgoing traffic as well?
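Two common options, sketched here under the assumption of a Linux box with root access: a tool like vnstat keeps per-interface in/out totals, and iptables rule counters give a free byte count without any extra software:

```shell
# Sketch: outbound byte accounting with an iptables rule counter.
# A rule with no -j target matches and counts but takes no action.
iptables -A OUTPUT -o eth0
iptables -nvxL OUTPUT      # the "bytes" column is the outbound total
```

vnstat (`vnstat -d` for daily in/out totals) is the more convenient long-term answer since it survives reboots and keeps history; the iptables counters reset when the rules are flushed.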
Don't know if this is the right place to ask for this but here goes.
I have a 20U rack space and I use 4U of it (1U + 2U servers and a 1U switch).
There is a 20Mbit internet connection on a 100Mbit network.
But here comes my problem: some friends have their servers in my rack space too. There are 12 servers that belong to my friends, and I want to know how much traffic they use.
I want to know how many GB of traffic they use so I can charge them (they don't pay right now).
Two of my friends' servers must use at most 50 GB of traffic!
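Since each friend's server hangs off its own switch port, per-port SNMP counters on the switch are the natural way to bill them. A sketch, assuming SNMP has been enabled in the switch's management interface (the switch IP and the "public" community string below are placeholders):

```shell
# Sketch: read per-port byte counters from the switch over SNMP.
# 192.0.2.10 and community "public" are placeholders.
snmpwalk -v2c -c public 192.0.2.10 IF-MIB::ifHCInOctets
snmpwalk -v2c -c public 192.0.2.10 IF-MIB::ifHCOutOctets
```

Poll these counters on a schedule with something like MRTG or Cacti and the difference between readings gives you per-port GB per month; that also makes it easy to spot when the two capped servers approach their 50 GB limit.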
I have a VPS account at SolarVPS. I have had some bad days lately. When I go to Virtuozzo in my Plesk control panel, I see this:
day - in - out
12 - 0.00 - 0.01
13 - 0.07 - 0.48
14 - 0.06 - 99.61
Why is it like this? I didn't even use 20 GB last month (for all of my sites), but now it jumps to nearly 100 GB in a day? Why is this happening? How can I stop it?
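A nearly-100 GB outbound day usually means either a hot-linked file or a compromised process pushing data out. Two quick checks, sketched for a Linux VPS where you have root (the interface name is a placeholder):

```shell
# Sketch: find out what is moving the bytes right now.
tcpdump -n -i eth0 -c 200   # sample 200 packets; look for a dominant peer
netstat -ntup               # map active connections to owning processes
```

If netstat ties the traffic to httpd, check the access logs for one huge file being fetched repeatedly; if it ties to an unfamiliar process, treat the VPS as compromised and involve SolarVPS support.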
I already have a dedicated server, but with a tight traffic limit, so I am searching for web space (file space) for my thumbnail images. The space is only for the images; no PHP scripts or databases are needed, just image space.
The host should tolerate adult content and have no problem with 500 GB of bandwidth per month. PayPal should be accepted as payment.
It's important to me that the web host honors their advertised offers and doesn't kick me off if I actually use that much bandwidth...