From: Robert M. <mra...@gm...> - 2010-02-12 10:39:03
|
Coders, I am not a coder. I'm not even any good at server maintenance. But SMW is taking my site down several times a day now. My wiki is either the biggest, or nearly the biggest SMW wiki (according to http://www.semantic-mediawiki.org/wiki/Sites_using_Semantic_MediaWiki) with 250,000 pages. My site runs out of memory and chokes all the time. I looked in /var/log/messages and it is full of things like httpd: PHP Fatal error: Out of memory (allocated 10747904) (tried to allocate 4864 bytes) in /home/reformedword/public_html/includes/AutoLoader.php on line 582 but the PHP file in question is different every time. I'm getting one of these kinds of errors every half hour or more. Before you say, "Up your PHP memory", know that I did! I went up from 64MB to 128MB to 256MB. Same story. So I switched to babysitting "top -cd2". When I change a page without semantic data, HTTPD and MYSQLD requests come, linger and go. But when I change a page with Semantic::Values, the HTTPD and MYSQLD processes take a VERY long time to die, sometimes never. Eventually the site runs out of memory. Like I said, php.ini has 128MB memory and a 60 second timeout for MySQL. Apache has a 60 second timeout too. Any help? -Robert |
From: Marco M. <ma...@fz...> - 2010-02-12 12:56:55
|
Hi Robert, I am also still new to SMW, but did you also adjust your settings in MediaWiki's LocalSettings.php? Try to set the line ini_set( 'memory_limit', '32M' ); to some higher value. Greetings Marco -----Original Message----- From: Robert Murphy [mailto:mra...@gm...] Sent: Friday, February 12, 2010 11:39 AM To: Semantic MediaWiki Developers List Subject: [SMW-devel] SMW scalability > [quoted message trimmed] |
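[Editor's note: the LocalSettings.php change Marco describes is a one-line override; the sketch below is illustrative only — the 256M figure is an example, not a recommendation, and should match what the host can actually spare.]

```php
<?php
// In LocalSettings.php — overrides php.ini's memory_limit for MediaWiki
// requests only. The value here is an example, not a recommendation.
ini_set( 'memory_limit', '256M' );
```

Note that if this line is commented out (as in Robert's case), PHP falls back to whatever php.ini specifies.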
From: Robert M. <mra...@gm...> - 2010-02-12 14:24:29
|
It's commented out, so the system is going with what's in php.ini, right? On Fri, Feb 12, 2010 at 4:55 AM, Marco Mauritczat <ma...@fz...> wrote: > [quoted message trimmed] |
From: Ryan L. <rl...@gm...> - 2010-02-12 14:38:41
|
Robert, Did you restart the web server after upping the memory? Those settings won't take effect otherwise. V/r, Ryan Lane On Fri, Feb 12, 2010 at 8:24 AM, Robert Murphy <mra...@gm...> wrote: > It's commented out, so the system is going with what's in php.ini, right? > > [quoted message trimmed] > ------------------------------------------------------------------------------ > SOLARIS 10 is the OS for Data Centers - provides features such as DTrace, > Predictive Self Healing and Award Winning ZFS. Get Solaris 10 NOW > http://p.sf.net/sfu/solaris-dev2dev > _______________________________________________ > Semediawiki-devel mailing list > Sem...@li... > https://lists.sourceforge.net/lists/listinfo/semediawiki-devel > > |
From: Robert M. <mra...@gm...> - 2010-02-12 14:49:44
|
Does /etc/init.d/httpd restart do enough? That's what I did. On Fri, Feb 12, 2010 at 6:38 AM, Ryan Lane <rl...@gm...> wrote: > Robert, > > Did you restart the web server after upping the memory? Those settings > won't take effect otherwise. > > V/r, > > Ryan Lane > > [quoted message trimmed] |
From: Markus K. <ma...@se...> - 2010-02-12 15:19:02
|
On Freitag, 12. Februar 2010, Robert Murphy wrote: > [quoted message trimmed] Great, finally someone has a performance-related request (I sometimes feel that I am the only one who is concerned about performance). Regarding PHP, I don't think that a memory limit of more than 50MB, or maximally 100MB, can be recommended for any public site. Whatever dies beyond this point cannot be saved. On the other hand, PHP out-of-memory issues are hard to track, since their cause is often not the function that adds the final byte that uses up all memory. You have seen this in your logs. One general thing that should be done on larger sites (actually on all sites!) is bytecode caching, see [1]. This significantly reduces the impact that large PHP files as such have on your memory requirements. Out-of-memory issues usually result in blank pages that can only be edited by changing the URL manually to use the edit action. Finding these pages is crucial to tracking down the problem. In the context of SMW, I have seen memory issues when inline queries return a long list of results, each of which contains a lot of values. This problem is worse when using templates for formatting, but it also occurs with tables. I have tracked this problem down to MediaWiki in my tests: manually writing a page with the contents produced by the large inline query also used up all memory, even without SMW being involved. If this is the case on your wiki, then my only advice is to change the SMW settings to restrict the size of query outputs so that pages cannot become so large. If this is not the problem you have, then it is important to find out which pages cause the issues in your wiki. Note that problems caused by MediaWiki jobs could also appear for random pages, since they do not depend on the page contents. Regarding MySQL, you should activate and check MySQL's slow query logging. It will create log files that show you which queries took particularly long. This can often be used to track down problematic queries and to do something to prevent them. If you experience general site overload in a burst-like fashion, it might be that some over-zealous crawler is visiting your site, possibly triggering complicated activities. Check your Apache logs to see if you have high loads for certain robots or suspicious user agents, especially on special pages like Ask. Update your robots.txt to disallow crawlers from browsing all results of an inline query (crawlers have been observed to do this). -- Markus [1] http://www.mediawiki.org/wiki/User:Robchurch/Performance_tuning -- Markus Krötzsch <ma...@se...> * Personal page: http://korrekt.org * Semantic MediaWiki: http://semantic-mediawiki.org * Semantic Web textbook: http://semantic-web-book.org -- |
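[Editor's note: the two server-side measures Markus suggests usually amount to small config changes. Both fragments below are sketches for the MySQL 5.0-era stack discussed in this thread; the file paths, threshold, and URL patterns are assumptions that vary by distribution and by how the wiki's URLs are laid out.]

```ini
; my.cnf — enable the slow query log (MySQL 5.0-era option names).
; Path and threshold are examples only.
[mysqld]
log-slow-queries = /var/log/mysql/mysql-slow.log
long_query_time  = 2
```

```
# robots.txt — keep crawlers off query results and the Ask special page.
# The exact paths depend on the wiki's URL configuration.
User-agent: *
Disallow: /index.php?title=Special:Ask
Disallow: /wiki/Special:Ask
```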
From: Thomas F. <tho...@gm...> - 2010-02-12 15:39:24
|
Just as a reference, I have a wiki (MW 1.13.2, SMW 1.3, PHP 5.2.3, MySQL 5.0.45) with ~210,000 pages and ~5.2 million property values (across 51 defined properties). PHP memory is set to 128M, and I believe it is using APC. No problems in terms of memory limits being reached -- how many properties/page and queries/page do you have? What types of queries? As always, Markus' suggestions are right on. -Tom On Fri, Feb 12, 2010 at 10:18 AM, Markus Krötzsch <ma...@se...> wrote: > [quoted message trimmed] |
From: Robert M. <mra...@gm...> - 2010-02-13 15:01:11
|
I installed APC and it made some difference in speed. I turned on slow-query logging for MySQL and that log file is filling up fast! Some examples: # User@Host: reformedword[reformedword] @ localhost [] # Query_time: 9 Lock_time: 2 Rows_sent: 5 Rows_examined: 5 SELECT /* SMW::deleteSubject::Nary Aquatiki */ smw_id FROM `rw_smw_ids` WHERE smw_title='' AND smw_namespace='712136' AND smw_iw=':smw'; # User@Host: reformedword[reformedword] @ localhost [] # Query_time: 46 Lock_time: 0 Rows_sent: 0 Rows_examined: 0 DELETE /* SMW::deleteSubject::Atts2 Aquatiki */ FROM `rw_smw_atts2` WHERE s_id = '230051'; # Time: 100213 3:02:55 # User@Host: reformedword[reformedword] @ localhost [] # Query_time: 55 Lock_time: 1 Rows_sent: 0 Rows_examined: 0 INSERT /* SMW::updateRel2Data 216.129.119.43 */ INTO `rw_smw_rels2` (s_id,p_id,o_id) VALUES ('202900','4987','202901'),('202900','5064','5065'),('202900','5463','202899'),('202900','135','289060'),('289060','225879','304'),('202900','135','289061'),('289061','225879','118784'),('289061','225881','331'),('202900','135','289062'),('289062','225879','119703'),('289062','225881','386'),('202900','135','289065'),('289065','225879','277'),('202900','135','289066'),('289066','225879','207525'),('289066','225881','385'),('202900','135','289067'),('289067','225879','119048'),('289067','225881','225124'),('202900','135','601368'),('601368','225879','584'),('601368','225881','387'),('202900','1011','1417340'),('1417340','225879','1425'),('202900','1011','1417342'),('1417342','225879','1446'),('1417342','225881','1174937'),('202900','1011','1663477'),('1663477','225879','1182228'),('1663477','225881','209798'),('202900','1464','1663478'),('1663478','225879','926'),('1663478','225881','375867'),('202900','1464','1663479'),('1663479','225879','1490'),('202900','1464','1663480'),('1663480','225879','223216'),('1663480','225881','687547'),('202900','1464','1663481'),('1663481','225879','845'),('202900','1464','2224870'),('2224870','225879','1651'),('202900','6840','4980'),('202900','6906','5727'),('202900','223343','6503'); # Time: 100213 3:03:12 # User@Host: reformedword[reformedword] @ localhost [] # Query_time: 36 Lock_time: 19 Rows_sent: 1 Rows_examined: 1948 SELECT /* SMW::getQueryResult 216.129.119.43 */ COUNT(DISTINCT t0.smw_id) AS count FROM `rw_smw_ids` AS t0 INNER JOIN `rw_smw_rels2` AS t2 ON t0.smw_id=t2.s_id INNER JOIN `rw_smw_rels2` AS t5 ON t2.o_id=t5.s_id INNER JOIN `rw_smw_inst2` AS t7 ON t2.s_id=t7.s_id WHERE t2.p_id='1464' AND t5.p_id='225879' AND t5.o_id='177656' AND t7.o_id='3304' LIMIT 10001; # Time: 100213 3:03:30 > [quoted message trimmed] |
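[Editor's note: slow logs like the ones above fill up fast, and eyeballing them is painful. A small script can rank entries by Query_time so the worst offenders stand out. This is a rough sketch that assumes the 5.0-era log format shown in the thread — a `# Query_time:` header line followed by the SQL statement; real logs also have multi-line statements and `use`/`SET` preambles that this does not handle.]

```python
import re

# Matches the timing header of a 5.0-era slow-log entry, e.g.
# "# Query_time: 46  Lock_time: 0  Rows_sent: 0  Rows_examined: 0"
HEADER = re.compile(r"# Query_time: (\d+)\s+Lock_time: (\d+)")

def parse_slow_log(text):
    """Return (query_time, lock_time, statement) tuples, slowest first."""
    entries = []
    current = None
    for line in text.splitlines():
        m = HEADER.search(line)
        if m:
            # Remember the timings until we see the statement they describe.
            current = (int(m.group(1)), int(m.group(2)))
        elif current and line and not line.startswith("#"):
            query_time, lock_time = current
            entries.append((query_time, lock_time, line.strip()))
            current = None
    return sorted(entries, key=lambda e: e[0], reverse=True)

# Sample input in the format quoted in the thread (statements shortened).
SAMPLE = (
    "# Query_time: 9  Lock_time: 2  Rows_sent: 5  Rows_examined: 5\n"
    "SELECT smw_id FROM `rw_smw_ids` WHERE smw_iw=':smw';\n"
    "# Query_time: 46  Lock_time: 0  Rows_sent: 0  Rows_examined: 0\n"
    "DELETE FROM `rw_smw_atts2` WHERE s_id = '230051';\n"
)

if __name__ == "__main__":
    for qt, lt, sql in parse_slow_log(SAMPLE):
        print(f"{qt:>4}s (lock {lt}s)  {sql[:50]}")
```

The same job can be done by the stock `mysqldumpslow` tool that ships with MySQL; the script is only useful when you want custom grouping, e.g. by the `/* SMW::... */` comment SMW embeds in its queries.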