From: Aaron H. <aaron@MetroNY.com> - 2002-09-23 19:41:59
I asked on the pg lists and it kicked off a thread that is winding down
with a decision about marking the start time of a query in
pg_stat_activity. There does not seem to be any easy way to monitor the
progress. The best I can do is get the procID from postgres and then use
Python to monitor its stats (CPU, disk, RAM use...).

Thanks,
-Aaron

Gerhard Häring wrote:
> * Aaron Held <aaron@MetroNY.com> [2002-09-20 16:03 -0400]:
>
>>>> Is there a way to monitor the progress of a query?
>>>
>>> "Chapter 10. Monitoring Database Activity"
>>
>>> What else would they do except working?
>>
>> I have optimized the system as much as possible, but during high loads
>> the queries can take a long time. I would like to give the user some
>> indication of how long the search will take.
>
> I don't think that's possible for a single SQL statement (if splitting
> it up by rewriting it as a stored procedure isn't possible). At least I
> don't know of any way under PostgreSQL. Or Oracle, for that matter.
>
> pyPgSQL has some support for asynchronous query processing in its
> low-level module libpq, but I don't see how that could help in your
> case.
>
> I'd try to ask on one of the PostgreSQL lists, perhaps there's still
> some creative possibility to achieve your goal.
>
> -- Gerhard
>
> _______________________________________________
> Pypgsql-users mailing list
> Pyp...@li...
> https://lists.sourceforge.net/lists/listinfo/pypgsql-users
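[Editor's sketch] The approach the thread converged on, recording each query's start time in pg_stat_activity, would let a client at least report elapsed time rather than true progress. A minimal sketch, assuming a later PostgreSQL release where pg_stat_activity exposes a query_start column; the SQL string and the elapsed_seconds helper are illustrative names, not from the thread:

```python
from datetime import datetime, timezone

# Illustrative query: fetch the start time of the backend whose
# procID we got from postgres (column names are an assumption).
ELAPSED_SQL = """
SELECT procpid, query_start, now() - query_start AS elapsed
FROM pg_stat_activity
WHERE procpid = %s;
"""

def elapsed_seconds(query_start, now=None):
    """How long a query has been running, given its recorded start time.

    `query_start` is a timezone-aware datetime (as a DB-API driver
    would return for a timestamptz column); `now` defaults to the
    current UTC time and is overridable for testing.
    """
    now = now or datetime.now(timezone.utc)
    return (now - query_start).total_seconds()
```

This only tells the user how long the search has run so far, not how long it will take; an estimate would still need something like Aaron's per-process CPU/disk/RAM polling on top of it.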