Server Load High / MySQL death
By Paul
Jun 21, 2005 (Edited May 06, 2014)
There are basically three types of sites with server load problems: extremely popular sites, sites where something broken is causing high loads, and sites being overrun by periodic bursts of traffic or uncontrolled spiders.
When your server load becomes high, pages slow down and a chain reaction begins: the load keeps climbing as people retry the slow pages, until eventually Apache gives up and dies altogether, staying down until restarted. It can be useful to have an emergency method of halting this chain reaction before it reaches a critical stage, and that's one of the things this article is about.
Before continuing with this article, please be sure that this is actually your problem. Enabling debug mode to show totals and execution time will let you view the server load on a Unix system (though not on a Windows server). Generally, a server load over 5 is a danger zone where the server is likely to suffer significantly and MySQL could stop responding; anything over 3 is likely to cause noticeable slowing. If you can't check your server load, page load times that increase dramatically when you have lots of guests online are a clue. A "cannot connect to MySQL through socket" style error message, when you're sure your config.php is correct and your host isn't doing maintenance, is also an indication.
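If you have shell access, the "uptime" command prints the three load averages directly. From PHP on a Unix server, a minimal sketch like the one below reports the same numbers; sys_getloadavg() is a standard PHP function (not available on Windows), and the cutoffs of 3 and 5 are just the rough levels discussed above:

    <?php
    // Print the 1, 5 and 15 minute load averages (Unix only).
    $load = sys_getloadavg();
    printf("Load: %.2f (1 min), %.2f (5 min), %.2f (15 min)\n",
        $load[0], $load[1], $load[2]);

    // Rough danger check using the levels discussed above.
    if ($load[0] > 5) {
        echo "Danger zone: MySQL may stop responding.\n";
    } elseif ($load[0] > 3) {
        echo "Elevated load: expect noticeable slowing.\n";
    }
    ?>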
Sometimes search engine spiders can hit with the same force as a DDoS attack, causing MySQL to slow and finally stop responding. Check your robots.txt to be sure you have a reasonable crawl-delay value (at least 30 seconds); see the example below. Some bots won't obey it, and other sorts of traffic floods can also kill MySQL on a large site. You can limit the threat of such activity at Admin -> Settings -> SEO by choosing to block spiders from non-essential pages (pages you really don't need indexed anyhow) or from non-rewritten pages.
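For reference, a minimal robots.txt along these lines asks crawlers to wait 30 seconds between requests. Note that Crawl-delay is a non-standard directive: some crawlers (such as Bing and Yandex) honor it, while Google ignores it:

    User-agent: *
    Crawl-delay: 30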
As a precaution in severe circumstances, you may wish to lock out spiders or guests entirely whenever the server load rises above a certain level. If you have a Unix (Linux/BSD) server and PHP is not in safe mode, you can do this by going to Admin -> Settings -> System and setting the level for rejecting spiders/guests where it asks; 5 could be a good level. The load should drop quickly once its cause is locked out.
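For illustration only, the guard behind such a setting could look something like the sketch below; this is not the software's actual code. It checks the one-minute load average early in page startup and answers guests with a 503 so that well-behaved clients retry later. The is_guest() helper and the threshold of 5 are assumptions you would adapt to your own setup:

    <?php
    // Hypothetical emergency guard: reject guests while the server is overloaded.
    // is_guest() is an assumed helper; replace it with your software's own
    // logged-in check.
    function is_guest() {
        return empty($_SESSION['user_id']);
    }

    $threshold = 5; // load level above which guests are turned away

    // sys_getloadavg() exists only on Unix-like systems, so check first.
    if (function_exists('sys_getloadavg')) {
        $load = sys_getloadavg(); // 1, 5 and 15 minute averages
        if ($load !== false && $load[0] > $threshold && is_guest()) {
            header('HTTP/1.1 503 Service Unavailable');
            header('Retry-After: 300'); // ask well-behaved clients to retry in 5 minutes
            exit('The server is busy. Please try again shortly.');
        }
    }
    ?>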