What have you found to be the best tuning sites for MySQL?
I'm getting into a bit of trouble. We have a weather site, and with all of the traffic we're getting a little tapped out. When the load hits between 134 and 160, the mail clients start to time out. Apache is still pretty fast, although pages take a little longer once the load crosses 80 (about 5-second page loads); between 130 and 160 I'm seeing 15-20 second page loads. DA is impossible above 80, but SSH is still very workable. Apache is tweaked to the max. I kicked up some of the buffer sizes in MySQL several weeks ago, and that did it then. However, we're taking on about 22,000 to 25,000 uniques an hour now. We can normally handle that no problem, but people are asking for maps a lot more now with the flooding and all, and that means a lot of MySQL lookups plus the CPU generating a lot more maps. I already cache the maps for the duration, which is 15 minutes. The only horse I have left to whip is MySQL. After that it will probably be a move to FreeBSD 7, but I'd like to throw in a few tweaks before we do that.
I have a Linux server with WHM/cPanel hosting 2000 domains. My problem: MySQL is using 90-100% CPU, with 1500-2000 queries running at a time. Please guide me on how to optimize it and tune the MySQL server so the load doesn't go so high.
I have configured my.cnf as follows:

max_allowed_packet = 4M
set-variable = max_connections=100
safe-show-database
query_cache_limit = 1M
query_cache_size = 128M
query_cache_type = 1
key_buffer_size = 256M
long_query_time = 3
table_cache = 9092
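Whether a 128M query cache is actually helping can be estimated from the server's own counters before touching anything else. A minimal sketch with hypothetical numbers; on a live box you would substitute the real `Qcache_hits` and `Com_select` values from `SHOW GLOBAL STATUS`:

```shell
# Hypothetical counter values; on the server itself, fetch them with e.g.:
#   mysql -e "SHOW GLOBAL STATUS LIKE 'Qcache_hits'"
#   mysql -e "SHOW GLOBAL STATUS LIKE 'Com_select'"
QCACHE_HITS=140000   # SELECTs answered from the cache
COM_SELECT=60000     # SELECTs that actually executed (cache misses)
# hit ratio = hits / (hits + misses)
awk -v h="$QCACHE_HITS" -v s="$COM_SELECT" \
    'BEGIN { printf "query cache hit ratio: %.1f%%\n", 100 * h / (h + s) }'
```

A low ratio under a write-heavy 2000-domain load would suggest shrinking query_cache_size rather than growing it, since every write to a table invalidates that table's cached results.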
I am reading the BYO Database Driven Website Using PHP and MySQL book and I am ready to play around a bit. I was wondering if anyone could recommend a quality provider that supports this stack well. I have had problems with some hosts in the past when using Joomla, so I am trying to alleviate that problem from the get-go.
I'm no SQL expert, so I'm not exactly sure what to look for in depth here. However, I noticed that MySQL would periodically spike to 98% CPU usage in top. So I checked it out, and it turns out that every time you load the front page of a site I host, it takes about 30 seconds to load, and it's during that time that MySQL is freaking out.
I know the user relies heavily on MySQL, but I want to narrow down the problem as much as possible for them. Any recommendations on what else to look for?
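One way to narrow it down is to let MySQL name the guilty queries itself via the slow query log. A hedged sketch of the relevant my.cnf lines (the log path and thresholds here are just examples; the `log-slow-queries` spelling is the pre-5.1 syntax):

```ini
# /etc/my.cnf -- [mysqld] section (example values)
log-slow-queries = /var/log/mysql-slow.log   # enable the slow query log
long_query_time  = 5                         # log anything running 5s or more
log-queries-not-using-indexes                # optional: also log unindexed queries
```

Restart mysqld, load the slow front page a few times, and the 30-second query should show up in the log; running `mysqladmin processlist` while the page loads will also show it live, along with its state.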
I just switched to a new server - I went from a Dual Opteron 248 with 3GB of RAM to a Dual Clovertown with 4GB of RAM. Yet, despite having mild traffic right now, my load is rather high and my memory usage is like 80% (which is ridiculous, since that would mean I would be maxing out on the old box).
For some reason, my servers usually become unreachable when I get around 400 people on TEXT-BASED, non-MySQL pages. How is that possible? I have really solid servers, yet people running basic Pentiums with 1 or 2GB of RAM never run into this issue.
Is there any good, cheap service that would be able to properly optimize MySQL and Apache to lower the load and figure out why the memory usage is so high? Platinum Server Management never really got this done, so even though they're great for other support issues, please don't recommend them.
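Before paying anyone, it's worth checking whether that 80% figure is real: Linux counts filesystem cache as "used" memory, so `free` output is routinely misread. A small sketch using made-up numbers shaped like a `free -m` Mem: line:

```shell
# Hypothetical `free -m` numbers for a ~4 GB box:
#         total  used  free  shared  buffers  cached
# Mem:     3950  3160   790       0      210    2300
# Memory actually held by applications = used - buffers - cached
echo "3950 3160 790 0 210 2300" | \
  awk '{ printf "apps really use: %d MB of %d MB\n", $2 - $5 - $6, $1 }'
```

If the application figure is modest, the "missing" RAM is just cache and gets handed back on demand; only sustained high application usage, or actual swapping visible in `vmstat`, points at a real problem.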
I have 2 ecommerce sites now hosted at 2 different shared locations, both sites offer same exact products and pricing. I am in the process of elancing out the recode/rebuild for both sites to share one mysql database, cart and to be housed together in one VPS.
I plan to put them both on 1 windows VPS (they are asp) each with its own separate IP but have heard this can cause problems with search engines especially google. I have great natural organic right now (1st position for 5 of my demo keywords) and dont want to put that at risk but getting really tired of 2 separate backends, carts, hosting etc..
As part of a project I have lately been looking into various aspects of kernel tuning, most recently tuning the TCP stack for more efficient memory usage and throughput.
Thought I would start this thread to mention some of the tools I'd found for doing testing and see what anyone else had to recommend.
So far my favorite of the bunch is nuttcp. It's easy to use and gives a very good idea of how much of your bandwidth you are able to utilize.
A few interesting web pages are as follows for anyone interested in the topic:
[url]- Tuning TCP for High Bandwidth Delay networks
[url]- TCP Tuning Cookbook, some interesting information in there as well
[url]...formanceTuning - Performance Tuning TWiki. Has a list of useful tools, flags for existing tools and ways to monitor network performance from a system level, along with some suggestions of things to correct
I have a couple of Windows 2003 servers colocated in a data center. I need to improve download speeds for our customers, who are at least 200ms away; the end users are not using download accelerators.
Is there any setting that can be changed on the server so that per-connection speed is increased? In this case both the server and the client are able to sustain more than a megabit per connection. I did some searching, but all the articles point at the end user rather than the server, saying to increase the TCP window size, etc. I'm not sure whether those articles apply to server-side changes.
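The window-size advice does apply server-side: the sender's buffers cap how much unacknowledged data can be in flight, and at a 200 ms RTT that cap becomes the per-connection speed limit (throughput ≈ window / RTT). On Linux the knobs are sysctls; a hedged sketch with example values, not a drop-in recipe:

```ini
# /etc/sysctl.conf -- example values for long-RTT paths
net.core.rmem_max = 4194304              # max receive buffer (4 MB)
net.core.wmem_max = 4194304              # max send buffer (4 MB)
net.ipv4.tcp_rmem = 4096 87380 4194304   # min / default / max per socket
net.ipv4.tcp_wmem = 4096 65536 4194304
net.ipv4.tcp_window_scaling = 1          # RFC 1323 scaling, needed above 64 KB
```

On Windows 2003 the analogous settings live in the registry under Tcpip\Parameters (TcpWindowSize, plus Tcp1323Opts to enable window scaling). For scale: a 1 Mbit/s stream at 200 ms only needs about 25 KB in flight, so if you can't even reach that, look for packet loss before tuning buffers.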
Uptime = 0 days 0 hrs 4 min 15 sec
Avg. qps = 17
Total Questions = 4479
Threads Connected = 1

Warning: Server has not been running for at least 48hrs.
It may not be safe to use these recommendations

To find out more information on how each of these runtime variables affects performance visit: [url]

SLOW QUERIES
Current long_query_time = 10 sec.
You have 1 out of 4491 that take longer than 10 sec. to complete
The slow query log is NOT enabled.
Your long_query_time may be too high, I typically set this under 5 sec.

WORKER THREADS
Current thread_cache_size = 128
Current threads_cached = 6
Current threads_per_sec = 0
Historic threads_per_sec = 0
Your thread_cache_size is fine

MAX CONNECTIONS
Current max_connections = 2000
Current threads_connected = 1
Historic max_used_connections = 7
The number of used connections is 0% of the configured maximum.
You are using less than 10% of your configured max_connections.
Lowering max_connections could help to avoid an over-allocation of memory
See "MEMORY USAGE" section to make sure you are not over-allocating

MEMORY USAGE
Max Memory Ever Allocated : 96 M
Configured Max Per-thread Buffers : 10 G
Configured Max Global Buffers : 58 M
Configured Max Memory Limit : 10 G
Total System Memory : 3.95 G
Max memory limit exceeds 85% of total system memory

KEY BUFFER
Current MyISAM index space = 78 M
Current key_buffer_size = 16 M
Key cache miss rate is 1 : 735
Key buffer fill ratio = 8.00 %
Your key_buffer_size seems to be too high.
Perhaps you can use these resources elsewhere

QUERY CACHE
Query cache is enabled
Current query_cache_size = 32 M
Current query_cache_used = 4 M
Current query_cache_limit = 1 M
Current Query cache fill ratio = 14.83 %
Your query_cache_size seems to be too high.
Perhaps you can use these resources elsewhere
MySQL won't cache query results that are larger than query_cache_limit in size

SORT OPERATIONS
Current sort_buffer_size = 2 M
Current record/read_rnd_buffer_size = 256 K
Sort buffer seems to be fine

JOINS
Current join_buffer_size = 1.00 M
You have had 0 queries where a join could not use an index properly
Your joins seem to be using indexes properly

OPEN FILES LIMIT
Current open_files_limit = 10000 files
The open_files_limit should typically be set to at least 2x-3x that of table_cache if you have heavy MyISAM usage.
Your open_files_limit value seems to be fine

TABLE CACHE
Current table_cache value = 1024 tables
You have a total of 721 tables
You have 93 open tables.
The table_cache value seems to be fine

TEMP TABLES
Current max_heap_table_size = 16 M
Current tmp_table_size = 32 M
Of 212 temp tables, 0% were created on disk
Effective in-memory tmp_table_size is limited to max_heap_table_size.
Created disk tmp tables ratio seems fine

TABLE SCANS
Current read_buffer_size = 1 M
Current table scan ratio = 17754 : 1
You have a high ratio of sequential access requests to SELECTs
You may benefit from raising read_buffer_size and/or improving your use of indexes.

TABLE LOCKING
Current Lock Wait ratio = 1 : 76
You may benefit from selective use of InnoDB.
If you have long running SELECTs against MyISAM tables and perform frequent updates consider setting 'low_priority_updates=1'
How do I make the changes flagged in red? My server works well for a while, but then gets REALLY REALLY slow.
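The flagged items translate into a handful of my.cnf edits under the [mysqld] section. A hedged sketch; the values are reasonable starting points derived from the numbers in the report above, not gospel:

```ini
# [mysqld] -- addresses the items flagged in the report
max_connections  = 100        # historic peak was only 7; 2000 over-allocates per-thread buffers
long_query_time  = 3          # report suggests setting this under 5 sec
log-slow-queries = /var/log/mysql-slow.log   # slow query log was NOT enabled
read_buffer_size = 2M         # 17754:1 table scan ratio suggests a modest bump
low_priority_updates = 1      # eases the 1:76 lock wait ratio on MyISAM tables
```

Restart MySQL afterwards. The "seems too high" notes on key_buffer_size and query_cache_size are best revisited after 48 hours of uptime, since the report itself warns its numbers aren't trustworthy yet, and the durable fix for a scan-heavy workload is adding indexes to the scanned tables rather than growing buffers.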
I have a VPS on the west coast of the US, and access it from the east coast. Sometimes I can get 1 MByte/sec downloads, and other times it is as bad as 250 KB/sec.
I have done some pings and have not seen any packet loss. I've experimented with sysctl and changed some parameters hoping to help, but really haven't seen much of a difference.
Does anyone have a recommendation as to what I could do different to squeeze a little more speed out of the connection? The problem is that from both sides of the US, I see ping times (depending on different ISPs on the east coast) from 80ms-120ms.
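Those ping times are probably the whole story: per-connection TCP throughput is capped at roughly window / RTT, so with a default 64 KB window the RTT alone explains speeds in this range. A quick back-of-envelope, assuming a 100 ms RTT:

```shell
# Throughput ceiling with a 64 KB window at 100 ms RTT:
awk 'BEGIN { printf "max: %.0f KB/s\n", 64 / 0.100 }'
# Window needed to sustain 1 MB/s (1024 KB/s) at the same RTT:
awk 'BEGIN { printf "need: %.0f KB window\n", 1024 * 0.100 }'
```

So the 1 MB/s runs likely happen when both ends negotiate window scaling and the path is clean, while the 250 KB/s runs fit a smaller effective window or minor loss shrinking the congestion window. Raising the tcp_rmem/tcp_wmem maximums on both ends is the sysctl lever that matters here.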
I installed the MySQL binary packages in /usr/local/mysql/ after removing the MySQL RPM package. MySQL works when I start it with /usr/local/mysql/bin/safe_mysqld. I reinstalled MySQL before I installed PHP. When I use a PHP script to access a MySQL database, it outputs an error:
Code:
Warning: mysqli::mysqli() [function.mysqli-mysqli]: (HY000/2002): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' in index.php on line 2
However, I installed MySQL in /usr/local/mysql, not in /var/lib/mysql. How do I fix MySQL?
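The error means PHP is looking for the socket where the RPM used to put it, while the binary install creates it somewhere else (commonly /tmp/mysql.sock). Either tell PHP the real path, or pin one path in my.cnf so the server and all clients agree. A sketch of both config options; the paths are examples, so first check where your mysqld actually creates the socket (e.g. `netstat -ln | grep mysql`):

```ini
; Option 1: php.ini -- point mysqli at the real socket (example path)
mysqli.default_socket = /tmp/mysql.sock

; Option 2: /etc/my.cnf -- make server and clients agree on one path
[mysqld]
socket = /var/lib/mysql/mysql.sock
[client]
socket = /var/lib/mysql/mysql.sock
```

A third option is a symlink (`ln -s /tmp/mysql.sock /var/lib/mysql/mysql.sock`), but fixing the config survives restarts more reliably. Restart mysqld and the web server after either change.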
I'm having the oddest issue. For some reason, some of the websites on my server load fine, and some take a really long time to load (2 minutes).
Now, the server load is fine, and the size of the sites isn't the issue either. I've restarted Apache and a couple of other services, and still the same sites load very slowly.
What could be causing this, since it's only affecting certain websites?
I have a pretty beefy VPS (1 GB RAM, equal share of an Intel Xeon quad-core processor), but I have no idea how many other VPSes are sharing that processor.
Is there any way to know that? I'm guessing the hosting company (Future Host - very happy with them btw) isn't going to tell me.
Right now my stats are pretty low, but how many individual cPanel accounts (1 site each) can I add before it starts to bog down? I know it depends largely on the traffic, but is 20-30 low-volume sites a lot for a VPS with 1 GB of RAM?
I have a few personal sites on one server and my business sites on another. I was thinking of moving the personal ones to my business VPS (to save money), but I don't want someone to be able to do an NS or IP search on one group and find the others, or see that they're related in any way. (I had a weird experience with a business contact reaching me via a personal site that he found by an NS search. That's why they're on another server now.)
Is it possible to add a second NS (another domain) on separate IPs to handle the personal sites, and have my current NS handle the biz sites? Or should I just keep them on two different servers?
I thought maybe I could make the personal sites into their own reseller account and do a new NS that way, but if there's an easier way I'd rather do that.
I know a guy who has about 10 domains on the same IP, and each of the domains uses its own domain name as its NS. I think the term he used was NS aliasing. I think it was all done by changing the DNS settings in WHM. I tried to do what he said, but it didn't work for me. Also, I'm not sure how to handle the registrar part: they want two separate IPs for the NS, but he's using only one.
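What he's describing is usually called vanity (or private) nameservers: one physical DNS server answering under several names. In the zone file it's just NS records plus matching A records, sketched here with placeholder names and documentation addresses:

```text
; hypothetical zone fragment for personalsite.com,
; served by the same machine as the business nameservers
personalsite.com.      IN NS  ns1.personalsite.com.
personalsite.com.      IN NS  ns2.personalsite.com.
ns1.personalsite.com.  IN A   192.0.2.10   ; may be the same box as ns1.bizdomain.com
ns2.personalsite.com.  IN A   192.0.2.11
```

The registrar side is the catch: when you register ns1/ns2 as host (glue) records, many registrars insist on two distinct IPs even if both names really resolve to one machine, which is probably why your friend's single-IP setup didn't reproduce for you. WHM's zone editor can add the records, but the glue registration happens at the registrar.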
For some reason MySQL won't start. I have tried restarting it, but it just says FAILED. The mysql.sock file seems to have disappeared and I cannot find it anywhere.
I recently had a hard drive failure, and luckily I can still access certain directories on the failed drive. I can still access the /var/lib/mysql/ directory, which holds all the users' databases, and I have backed these all up separately using tar.
Now what I need to know is: how do you restore these database files to another server? I tried simply untarring one of them into the new server's /var/lib/mysql/ directory, and it broke MySQL - it went offline. I had to get a cPanel tech to bring MySQL back online.
how can I get these database files to fully work on a new server?
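Raw files from /var/lib/mysql can be moved to another server, but only if each database keeps its own subdirectory inside the datadir, ownership goes back to the mysql user, and mysqld is stopped during the copy (untarring over a live datadir is very likely what knocked it offline). A sketch, simulated in a scratch directory with made-up table files; the commented commands at the end are what you'd run on the real box:

```shell
set -eu
# Simulate: build a backup tarball like the one taken from the failed drive
mkdir -p backup/userdb
touch backup/userdb/posts.frm backup/userdb/posts.MYD backup/userdb/posts.MYI
tar -czf userdb.tar.gz -C backup userdb

# Restore into a scratch datadir (stand-in for /var/lib/mysql on the new server)
DATADIR=$(mktemp -d)
tar -xzf userdb.tar.gz -C "$DATADIR"   # userdb/ lands as its own subdirectory
ls "$DATADIR/userdb"

# On the real server, with mysqld STOPPED, you would then run as root:
#   chown -R mysql:mysql /var/lib/mysql/userdb
#   myisamchk --check --force /var/lib/mysql/userdb/*.MYI
#   /etc/init.d/mysql start
```

This file-level copy only works for MyISAM tables (the .frm/.MYD/.MYI trio) moved between compatible MySQL versions; anything InnoDB needs a logical mysqldump-style restore instead.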
I currently have two web hosting providers but want to consolidate to one account at one host, with somewhat better load times...
1. JaguarPc - i have two accts with them
a. I have a shared acct with them currently - I believe it's called the gigadeal (something like $10 a month). I have been with them since 2000; pretty decent host, not too much downtime, and support is "ok" when needed. Currently I have 3 smallish websites hosted on this one acct, and they don't get a lot of traffic. Two of them are using WordPress, one is just a static HTML site. I did a lookup and found the server has about 130 sites hosted on it, so not too bad in regards to overselling. However, my big problem is that the site takes about 900-1500ms to generate a page, and this happens pretty often. Again, the sites don't really draw that much traffic, so that's not the problem here.
b. I also have one of their Freedom VPS accts with upgraded RAM and bandwidth. I only have one site hosted on it. This site used to get about 500K+ unique visitors a month; at its peak we were serving around 1TB of bandwidth a month. I know we ran the VPS hard, but considering we wanted to stay under $50 it worked well. There was of course some downtime due to the massive traffic [we serve up a popular flash cartoon website].
2. Hostgator - babygator plan. This host only serves one WordPress site. The site isn't very well known yet, but it's growing each month. The funny thing is that when I looked up this server, there were about 830 websites hosted on it - obviously oversold and crowded. Support really sucks here, IMHO. However, the page load time is anywhere between 250ms and 500ms, a lot faster than JaguarPC's, which is crazy since Jag has far fewer sites on its server. I'm looking to basically consolidate the websites that are on both shared plans. My original thought was to keep them all on my shared hosting acct at JaguarPC (the one with the 3 sites). Obviously I cannot add them to the VPS, since it's pretty active; also, the VPS is business and the other sites are all personal, and I don't want them to mix, so to speak.
I am currently spending about $20 a month between the two shared host plans. I'm looking for recommendations on where to move - somewhere speed isn't such a big problem and I can maintain one acct. It would be great if I could host all these sites for around $20, hopefully without much lag.
I was thinking MediaTemple, but after reading so many negative posts here about them, I'm not sure...
I did a search on the forum but it exploded, so I thought I would post the question - sorry if it's been asked.
I would like to know how many sites are commonly hosted on a dedicated server..
Say - 4 gigs of RAM, and a Q9300 (for example).
I know that some companies put 600-800 sites on a server, and then customers complain about load, etc. So I just wanted to get a feel for how many sites per server is typical.
I am interested in your suggestions as to what I should do. I need a setup that will allow me to host several sites. They are mostly my own and low traffic; I do not offer hosting to anyone else, and if I did, I would set the sites up myself. I need MySQL, email with a virus checker, PHP, and a firewall. I would like Unix/Linux hosting and am used to cPanel. I would prefer east coast hosting, since many of the sites pertain to local areas there. I was thinking of a reseller or webmaster package.
In terms of the secure domain: if I had a secure site and wished to access some information on a web page from a NON-SECURE domain, or at least duplicate the non-secure information onto the secure page, does the user need to click acknowledge buttons to go in and out of the secure areas? Can I copy or transfer information [e.g. Google search results] onto the secure page without this?
I have a godaddy deluxe Linux hosting plan for the next year. I'm wondering how I can host my two sites on that plan. According to GoDaddy it's possible but how can I do that and have each site separately? It says the only way would be to have it like...Url.com/keyword for other site but I don't want that. I want it to be so you visit one site and it takes you to that exact site.
Now, I've been noticing in my logs that I have been getting referrals from some other sites, but they appeared to be referring from pages with the same names as mine. I thought nothing of it until, sitting on live support one evening, I noticed a few visitors were viewing other domains - which isn't right, as my system shows me visitors and which pages they are viewing on my site.
Now, all their links actually point to my own site whenever you click one, but it seems a little odd, as it appears they have downloaded my entire site to their server - almost like mirroring it without my permission or request.
I just want a rough idea; How many sites could safely fit on a server with the following specs?
Pentium 4 1.5 GHz, 512 MB DDR RAM, CentOS Linux 5.x
I am thinking in terms of CPU power and RAM here. The sites I have in mind are all low usage (at most 300MB storage and 1GB bandwidth per month), so bandwidth and storage are irrelevant here.
I am having some issues here. I had a reseller account and decided to move to a dedicated box. My question is: what would be the easiest way to transfer all the accounts over?
I do not have root access on the resellers account so it is making it kind of difficult.
I did some research and found a way I think you can do but please correct me if I am wrong.
Can it be done by going under my dedicated roots access and clicking on the "transfer an account from another server using password"?
Also, would I have to wait until all files are transferred before setting my custom nameservers' IP addresses?
Is there a general rule of thumb on the number of web sites that can be loaded onto an Apache server with 4 GB RAM and RAID? Each site will use one database of about 5 MB max. Preferably cPanel - otherwise, is there something else recommended for managing lots of sites?
Can I conceivably get 1000 sites per server or more?
What/where in httpd.conf do I tweak to allow more files to be open?
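The open-file ceiling actually lives in the OS rather than httpd.conf: Apache inherits the per-process file descriptor limit from whatever starts it. A hedged sketch of the usual places to raise it (the 32768 figure is just an example):

```shell
# Check the current per-process descriptor limit in the shell that launches Apache:
ulimit -n

# Raise it for this boot in Apache's init script, just before httpd starts:
#   ulimit -n 32768

# Or make it permanent for all processes in /etc/security/limits.conf:
#   *    soft    nofile    32768
#   *    hard    nofile    32768
```

Inside httpd.conf the related knobs are MaxClients and, for prefork builds, ServerLimit, but if you are seeing "too many open files" errors it is the nofile limit above that needs raising; restart Apache from a shell with the new limit in effect.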