From: Stephan Robotta <robotta@on...> - 2010-03-31 10:27:42
We have a wiki installation with some queries that produce quite a lot
of result data. A typical case is a query with up to 10 property
printouts and a result set of 4,000 pages.
In your installations, which php.ini limits have proven sufficient for
such queries while still handling all incoming requests? The
current settings are:
max_execution_time = 30
memory_limit = 128M
which have proven far too low to serve the queries mentioned above.
I cannot yet estimate the expected page access numbers, so any
real-life experience is very welcome.
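For reference, a raised configuration sometimes used for query-heavy installations might look like the following; the exact values here are assumptions on my part, not a recommendation, and would need tuning against the available RAM and the number of concurrent requests:

```ini
; php.ini - illustrative values only
max_execution_time = 120   ; seconds per request; heavy queries need more than 30
memory_limit = 512M        ; per-request limit; total must fit RAM at peak concurrency
```

Note that memory_limit applies per request, so raising it also raises the worst-case memory footprint under many simultaneous queries.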
Does anyone know a way to process a query via a batch script, without
running into the web server limits? Some of the queries are for
administrative purposes only, so it would be sufficient to store the
result in a CSV file that can be mailed or downloaded later.
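One possible workaround (a sketch, not SMW-specific advice): run the export through the PHP command-line interpreter, where the web server's request timeout does not apply and php.ini values can be overridden per invocation with -d. The script name exportQuery.php below is a hypothetical placeholder for whatever script runs the query and prints CSV:

```shell
# Run under the PHP CLI instead of the web server; -d overrides php.ini
# for this invocation only. exportQuery.php is a hypothetical placeholder.
php -d memory_limit=512M -d max_execution_time=0 exportQuery.php > result.csv

# The same command can be scheduled nightly via cron (crontab -e), e.g.:
# 0 3 * * * php -d memory_limit=512M -d max_execution_time=0 /path/to/exportQuery.php > /var/www/exports/result.csv
```

Setting max_execution_time to 0 removes the time limit entirely, which is usually acceptable for a batch job but should not be done for web requests.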
ontoprise GmbH - know how to use Know-how
- - -
Halo Extension - Want to get involved? http://smwforum.ontoprise.com/development
- - -
An der RaumFabrik 29; 76227 Karlsruhe; Germany
Tel.: +49 721 509809-10; Fax: +49 721 509809-11
email: robotta@..., www: http://www.ontoprise.com
Registered Office: Karlsruhe, Germany, HRB 109540
Managing Directors: Prof. Dr. Jürgen Angele, Hans-Peter Schnurr
- - -