From: Geoff H. <ghu...@us...> - 2003-04-27 07:15:23
STATUS of ht://Dig branch 3-2-x
RELEASES:
3.2.0b5: Next release, First quarter 2003???
3.2.0b4: "In progress" -- snapshots called "3.2.0b4" until prerelease.
3.2.0b3: Released: 22 Feb 2001.
3.2.0b2: Released: 11 Apr 2000.
3.2.0b1: Released: 4 Feb 2000.
(Please note that everything added here should have a tracker PR# so
we can be sure they're fixed. Geoff is currently trying to add PR#s for
what's currently here.)
SHOWSTOPPERS:
* Mifluz database errors are a severe problem (PR#428295)
-- Does Neal's new zlib patch solve this for now?
KNOWN BUGS:
* Odd behavior: $(MODIFIED) and scores do not work with
wordlist_compress set, but work fine without wordlist_compress.
(The date is definitely stored correctly, even with compression on,
so this must be some sort of weird htsearch bug.) PR#618737.
* META descriptions are somehow added to the database as FLAG_TITLE,
not FLAG_DESCRIPTION. (PR#618738)
Can anyone reproduce this? I can't! -- Lachlan
PENDING PATCHES (available but need work):
* Additional support for Win32.
* Memory improvements to htmerge. (Backed out b/c htword API changed.)
* Mifluz merge.
NEEDED FEATURES:
* Quim's new htsearch/qtest query parser framework.
* File/Database locking. PR#405764.
TESTING:
* httools programs:
(htload a test file, check a few characteristics, htdump and compare)
* Tests for new config file parser
* Duplicate document detection while indexing
* Major revisions to ExternalParser.cc, including fork/exec instead of popen,
argument handling for parser/converter, allowing binary output from an
external converter.
* ExternalTransport needs testing of changes similar to ExternalParser.
DOCUMENTATION:
* List of supported platforms/compilers is ancient. (PR#405279)
* Document all of htsearch's mappings of input parameters to config attributes
to template variables. (Relates to PR#405278.)
Should we make sure these config attributes are all documented in
defaults.cc, even if they're only set by input parameters and never
in the config file?
* Split attrs.html into categories for faster loading.
* Turn defaults.cc into an XML file for generating documentation and
defaults.cc.
* require.html has not been updated to list the new features and disk space
requirements of 3.2.x (e.g. regex matching, database compression).
PR#405280, PR#405281.
* TODO.html has not been updated for current TODO list and
completions.
I've tried. Someone "official" please check and remove this -- Lachlan
* Htfuzzy could use more documentation on what each fuzzy algorithm
does. PR#405714.
* Document the list of all installed files and default
locations. PR#405715.
OTHER ISSUES:
* Can htsearch actually search while an index is being created?
* The code needs a security audit, esp. htsearch. PR#405765.
From: Lachlan A. <lh...@us...> - 2003-04-27 04:13:15

On Sun, 27 Apr 2003 12:54, Jim Cole wrote:
> Hi - Using the patch with current CVS code (as of this afternoon),
> I am still running into fatal problems with OS X.

Thanks for that. I have been able to replicate the problem you reported,
and have found a kludgy work-around, but I am looking for the source of
the problem and will let you know once I fix it.

Thanks again,
Lachlan

From: Jim C. <li...@yg...> - 2003-04-27 02:54:34

On Thursday, April 24, 2003, at 06:32 PM, Lachlan Andrew wrote:
> On Fri, 25 Apr 2003 00:30, Ted Stresen-Reuter wrote:
>> I meant to express my interest in testing your patch too. I
>> have OS X as well. I would need instructions for
>> how to apply a patch.
>
> Thanks. The revised patch is attached.

Hi - Using the patch with current CVS code (as of this afternoon), I am
still running into fatal problems with OS X. I no longer get the
segfault, but instead see the output shown below. The error messages
repeat with a different page number each time.

Although it hasn't yet run to completion, it seems that increasing the
value of wordlist_page_size solves the immediate problem; it was
originally set to 8192 and I doubled it. I run into the same problem if
the attribute is left unset in the configuration file.

Jim

WordDB: CDB___memp_cmpr_alloc: unexpected error from weakcmpr base
WordDB: PANIC: Cannot allocate memory
WordDB: PANIC: DB_RUNRECOVERY: Fatal error, run database recovery
WordDB: /Users/greyleaf/local/htdig_cvs/var/htdig-crash1/db.words.db: write failed for page 13982
WordDB: Unable to allocate 8247 bytes from mpool shared region: Cannot allocate memory

From: Lachlan A. <lh...@us...> - 2003-04-25 00:40:48

Greetings Jim,

What was the command which produced this error? Did ./configure report
any problems? Which directory was make in at the time?

Regards,
Lachlan

On Tue, 22 Apr 2003 05:21, Jim Gifford wrote:
> ---Begin Error Message---
> ./libtool: line 1: cd: yes/lib: No such file or directory
> libtool: link: cannot determine absolute directory name of `yes/lib'
> make[3]: *** [libhtdb.la] Error 1
> make[2]: *** [all] Error 2
> make[1]: *** [all-recursive] Error 1
> ---End Error Message---
>
> Every time I try to compile the b4 snapshots, I receive this error
> message. I am using gcc 3.2.2, glibc 2.3.2, binutils 2.13.2.1, and
> libtool 1.5.

From: Lachlan A. <lh...@us...> - 2003-04-25 00:32:35

On Fri, 25 Apr 2003 00:30, Ted Stresen-Reuter wrote:
> I meant to express my interest in testing your patch too. I
> have OS X as well. I would need instructions for
> how to apply a patch.

Thanks. The revised patch is attached.

To apply it, start from a freshly checked out CVS, and cd to the top
directory. Then type
    patch -p0 < your/path/to/dbase.patch1
et voila.

The '-p0' is because this particular patch specifies the destination
files (e.g., the last path on the first line of the patch) exactly the
way you would specify it from where you have 'cd'd to. If you want it
to ignore the first leading path component (say the file said
"test/db/mp.h"), then you would say '-p1'.

Let me know if you have any problems.

Thanks again,
Lachlan

> "How wonderful the world would be without Saddam and without Bush!"

Hear, Hear!!

From: Ted Stresen-R. <ted...@ma...> - 2003-04-24 14:32:32

BTW, I meant to express my interest in testing your patch too. I have
OS X as well. One thing tho... I would need instructions for how to
apply a patch. I know it's not too difficult, but have little experience
doing it and would need some assistance.

Ted Stresen-Reuter

On Thursday, April 24, 2003, at 08:56 AM, Lachlan Andrew wrote:
> On Wed, 23 Apr 2003 00:20, Alban MEUNIER wrote:
>> Is there anybody working on htdig?
>> When is a new release of htdig planned for live use?
>
> Yes, we're here, and doing our best...
>
> I'm not sure when the next release is planned, but the next beta
> should be out Any Day Now(tm). My guess would be another month.
> This beta should be reliable enough for "live" use, albeit a bit
> slow.
>
> My advice is to go with the current snapshot of 3.2.0b4 rather
> than waiting for a release.
>
> Regards,
> Lachlan
>
> "How wonderful the world would be without Saddam and without Bush!"

From: Lachlan A. <lh...@us...> - 2003-04-24 13:57:08

On Wed, 23 Apr 2003 00:20, Alban MEUNIER wrote:
> Is there anybody working on htdig?
> When is a new release of htdig planned for live use?

Yes, we're here, and doing our best...

I'm not sure when the next release is planned, but the next beta should
be out Any Day Now(tm). My guess would be another month. This beta
should be reliable enough for "live" use, albeit a bit slow.

My advice is to go with the current snapshot of 3.2.0b4 rather than
waiting for a release.

Regards,
Lachlan

From: Lachlan A. <lh...@us...> - 2003-04-22 22:14:06

Thanks for the reminder, Geoff.

I've also found several problems with the patch I sent, and will post a
new one once I've sorted them out. I hope that hasn't wasted too much of
your time!

Apologies,
Lachlan

On Tue, 22 Apr 2003 23:32, Geoff Hutchison wrote:
> use the SF compile farm for compile/debug/testing sorts of things.
> (That said Lachlan, I'll give you feedback on your patches on OS X
> and RH 8.0 later today.)

From: <alb...@ya...> - 2003-04-22 14:20:37

Hi,

Is there anybody working on htdig? When is a new release of htdig
planned for live use?

Regards

From: Geoff H. <ghu...@ws...> - 2003-04-22 13:32:31

I thought I should point out to ht://Dig developers that we can all use
the SF compile farm for compile/debug/testing sorts of things. This is
of course extremely useful for testing on machines that you don't have
personal access to. (That said Lachlan, I'll give you feedback on your
patches on OS X and RH 8.0 later today.)

http://sourceforge.net/docman/display_doc.php?docid=762&group_id=1

-Geoff

From: Jim G. <ji...@jg...> - 2003-04-21 19:22:24

---Begin Error Message---
./libtool: line 1: cd: yes/lib: No such file or directory
libtool: link: cannot determine absolute directory name of `yes/lib'
make[3]: *** [libhtdb.la] Error 1
make[2]: *** [all] Error 2
make[1]: *** [all-recursive] Error 1
---End Error Message---

Every time I try to compile the b4 snapshots, I receive this error
message. I am using gcc 3.2.2, glibc 2.3.2, binutils 2.13.2.1, and
libtool 1.5.

From: Lachlan A. <lh...@us...> - 2003-04-21 11:33:50

Greetings all,

Attached is a patch to the database code which *should* fix the infinite
loop problem Jim reported a couple of months ago.

It is a combination of explicitly denying excess recursion in
CDB___memp_alloc(), a patch from BDB 3.3.11 which handles failure of
writing a dirty page in CDB___memp_alloc(), and a new function
CDB___memp_clean_page() to ensure that there are sufficient clean pages
to allow dirty compressed pages to be written.

Before I commit the patch, could people please test and/or read it, and
give me feedback?

I *hope* that this is the final problem fixed (until we tackle the
native compression...)

Cheers,
Lachlan

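As a rough, illustrative sketch of the clean-page idea in that patch description (a toy pool structure with hypothetical names, not the mifluz code or the patch itself): before the compression layer needs pages, make sure enough clean ones already exist, so that writing a dirty compressed page never has to recurse back into allocation.

    #include <cstddef>

    // Toy pool: 'clean' pages can be handed out immediately, 'dirty' pages
    // must be written back first.  (Illustrative only -- not mifluz code.)
    struct PagePool {
        std::size_t clean = 0;
        std::size_t dirty = 0;

        // Write back one dirty page, turning it into a clean one.
        bool flush_one_dirty_page() {
            if (dirty == 0) return false;
            --dirty;
            ++clean;
            return true;
        }
    };

    // Analogue of the CDB___memp_clean_page() idea: reserve 'needed' clean
    // pages up front, and report failure instead of looping or recursing
    // forever when no dirty page can be written back.
    bool ensure_clean_pages(PagePool& pool, std::size_t needed) {
        while (pool.clean < needed) {
            if (!pool.flush_one_dirty_page())
                return false;
        }
        return true;
    }

The actual patch, as described above, also caps recursion in CDB___memp_alloc() and copes with a failed dirty-page write; the sketch only hints at those through the failure return.
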
From: Lachlan A. <lh...@us...> - 2003-04-20 07:45:31

Greetings all,
Among other problems, I think I've found a (slow) memory leak in
mp_cmpr.c -- does anyone know where the memory allocated using
CDB___memp_cmpr_alloc_chain(dbmfp->dbmp, bhp, BH_CMPR_POOL);
is (or should be) freed?
Thanks,
Lachlan
On Sat, 12 Apr 2003 02:41, Geoff Hutchison wrote:
> Further pointers from you and/or Neil would really help
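
To make the question concrete, here is an illustrative sketch of the pairing one would expect (toy structures and a hypothetical free_chain(); not the actual mp_cmpr.c code): every chain allocated for an over-sized compressed page needs a matching release when its buffer header is retired, otherwise the pool leaks a little on each eviction.

    #include <cstddef>
    #include <cstdlib>
    #include <vector>

    // Toy stand-ins for the real mpool structures (illustrative only).
    struct OverflowPage { void* data; };
    struct BufferHeader {
        bool chain_in_use = false;
        std::vector<OverflowPage*> chain;
    };

    // Rough analogue of CDB___memp_cmpr_alloc_chain(): reserve overflow
    // pages for a page that did not compress down to a single pool page.
    void alloc_chain(BufferHeader& bh, std::size_t pages) {
        bh.chain_in_use = true;
        for (std::size_t i = 0; i < pages; ++i)
            bh.chain.push_back(new OverflowPage{std::malloc(8192)});
    }

    // The counterpart the question is about: without a call like this on
    // every path that retires a buffer header, each eviction leaks the chain.
    void free_chain(BufferHeader& bh) {
        for (OverflowPage* p : bh.chain) {
            std::free(p->data);
            delete p;
        }
        bh.chain.clear();
        bh.chain_in_use = false;
    }
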
From: Geoff H. <ghu...@us...> - 2003-04-20 07:14:12
STATUS of ht://Dig branch 3-2-x
RELEASES:
3.2.0b5: Next release, First quarter 2003???
3.2.0b4: "In progress" -- snapshots called "3.2.0b4" until prerelease.
3.2.0b3: Released: 22 Feb 2001.
3.2.0b2: Released: 11 Apr 2000.
3.2.0b1: Released: 4 Feb 2000.
(Please note that everything added here should have a tracker PR# so
we can be sure they're fixed. Geoff is currently trying to add PR#s for
what's currently here.)
SHOWSTOPPERS:
* Mifluz database errors are a severe problem (PR#428295)
-- Does Neal's new zlib patch solve this for now?
KNOWN BUGS:
* Odd behavior: $(MODIFIED) and scores do not work with
wordlist_compress set, but work fine without wordlist_compress.
(The date is definitely stored correctly, even with compression on,
so this must be some sort of weird htsearch bug.) PR#618737.
* META descriptions are somehow added to the database as FLAG_TITLE,
not FLAG_DESCRIPTION. (PR#618738)
Can anyone reproduce this? I can't! -- Lachlan
PENDING PATCHES (available but need work):
* Additional support for Win32.
* Memory improvements to htmerge. (Backed out b/c htword API changed.)
* Mifluz merge.
NEEDED FEATURES:
* Quim's new htsearch/qtest query parser framework.
* File/Database locking. PR#405764.
TESTING:
* httools programs:
(htload a test file, check a few characteristics, htdump and compare)
* Tests for new config file parser
* Duplicate document detection while indexing
* Major revisions to ExternalParser.cc, including fork/exec instead of popen,
argument handling for parser/converter, allowing binary output from an
external converter.
* ExternalTransport needs testing of changes similar to ExternalParser.
DOCUMENTATION:
* List of supported platforms/compilers is ancient. (PR#405279)
* Document all of htsearch's mappings of input parameters to config attributes
to template variables. (Relates to PR#405278.)
Should we make sure these config attributes are all documented in
defaults.cc, even if they're only set by input parameters and never
in the config file?
* Split attrs.html into categories for faster loading.
* Turn defaults.cc into an XML file for generating documentation and
defaults.cc.
* require.html has not been updated to list the new features and disk space
requirements of 3.2.x (e.g. regex matching, database compression).
PR#405280, PR#405281.
* TODO.html has not been updated for current TODO list and
completions.
I've tried. Someone "official" please check and remove this -- Lachlan
* Htfuzzy could use more documentation on what each fuzzy algorithm
does. PR#405714.
* Document the list of all installed files and default
locations. PR#405715.
OTHER ISSUES:
* Can htsearch actually search while an index is being created?
* The code needs a security audit, esp. htsearch. PR#405765.
From: Lachlan A. <lh...@us...> - 2003-04-19 15:52:23

On Sat, 19 Apr 2003 06:16, Jim Cole wrote:
> I might actually have a few extra
> minutes a day free to apply toward the next release.

I've just fixed the database bug that was causing me grief with
'make check'. It was simply that htdig -i wasn't removing
db.words.db_weakcmpr (which I had reported earlier but not fixed --
Doh!!), which was corrupting the free-list of pages used when
compression was not as efficient as predicted.

The only other DB problem I know of is the problem of infinite recursion
when "weak compression" needs to allocate a page, but mp_alloc needs to
write out a dirty page. Have you tried the hack I suggested a couple of
months ago?

Getting make check to work under RedHat and OS X would also be great.

> I picked up a copy of Sleepycat Software's Berkeley DB
> book on the discount rack for six bucks.

Bargain!

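A minimal sketch of the kind of cleanup just described, assuming the stale companion file simply needs to be unlinked when an initial dig rebuilds the word database (the function name and call site are hypothetical; the real fix lives in htdig itself):

    #include <cstdio>
    #include <string>

    // Illustrative only: on 'htdig -i' the word database is rebuilt from
    // scratch, so the leftover "weak compression" companion file must go
    // too, or its stale page free-list can corrupt the new database.
    void remove_stale_weakcmpr(const std::string& word_db_path, bool initial_dig) {
        if (!initial_dig)
            return;
        const std::string weakcmpr = word_db_path + "_weakcmpr";  // e.g. db.words.db_weakcmpr
        std::remove(weakcmpr.c_str());  // failure just means the file was not there
    }
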
From: Jim C. <li...@yg...> - 2003-04-18 20:16:46

Hi - I am to a point where I might actually have a few extra minutes a
day free to apply toward the next release. I have more or less kept
track of where things stand, but thought I would seek input on how my
time might most effectively be spent. I have OS X and Linux boxes handy
and an old Sparc that I could dust off if necessary.

I picked up a copy of Sleepycat Software's Berkeley DB book the other
day, if that is of any help to anyone; hard to turn something like that
down when you find it on the discount rack for six bucks.

Jim

From: Gabriele B. <g.b...@co...> - 2003-04-18 13:15:04

Ciao Neal!

> 1) The function reparses 'bad_extensions' & 'valid_extensions' each time
> through. This seems wasteful. Any good reason to do this?

As Geoff pointed out, we need to check this for the block feature.
However, an optimized version of this structure would be good. But ...
any hints on how to make this?

> 2) Toward the end of the function, just before we test the URL against
> 'limits' & 'limit_normalized', we check the server's robots.txt file.
> Wouldn't it make sense to do the robots.txt check AFTER the limits
> check, so as not to waste network connections on servers that will get
> rejected by the next two tests?

If I am not wrong, the robots.txt is not retrieved at this stage, but
after the URL is considered to be valid according to our 'limits'.
Indeed, the robots.txt file is retrieved in the Server class's
constructor. Please correct me if I am wrong.

Ciao,
-Gabriele

P.S.: Happy Easter to everyone.

--
Gabriele Bartolini - Web Programmer
Comune di Prato - Prato - Tuscany - Italy
g.b...@co... | http://www.comune.prato.it
> find bin/laden -name osama -exec rm {} ;

From: Neal R. <ne...@ri...> - 2003-04-18 00:21:13

On Thu, 17 Apr 2003, Geoff Hutchison wrote:
> > 1) The function reparses 'bad_extensions' & 'valid_extensions' each time
> > through. This seems wasteful. Any good reason to do this?
>
> Depends. Once upon a time, we thought that these should be configurable on
> a per-URL basis. (Which is why they're "reparsed.") Now maybe it's better
> to re-think this in terms of improved performance?

It would certainly be easy to add two hash/lists, fill them during
'Initial', and ping against these in IsValidURL. We could also preserve
the present functionality by keeping local copies of the
'bad_extensions' & 'valid_extensions' strings, checking the current ones
against the local ones for changes, and reparsing if necessary....

Is there a good example of the utility of per-URL changes to these? How
would they change in the middle of a spidering run?

> > 2) Toward the end of the function, just before we test the URL against
> > 'limits' & 'limit_normalized', we check the server's robots.txt file.
> > Wouldn't it make sense to do the robots.txt check AFTER the limits
> > check, so as not to waste network connections on servers that will get
> > rejected by the next two tests?
>
> Good point.

I fixed this and checked it in. I also ran 'GNU indent' on the file and
did a second commit to clean up the formatting. I did this in two steps
so the first change is easily readable.

Thanks.

Neal Richter
Knowledgebase Developer
RightNow Technologies, Inc.
Customer Service for Every Web Site
Office: 406-522-1485

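A minimal sketch of the caching idea described in that reply, using standard C++ containers rather than ht://Dig's own String/List classes (the class name and the usage shown in the trailing comment are assumptions for illustration, not the project's actual API):

    #include <set>
    #include <sstream>
    #include <string>

    // Parse a space-separated extension list once, and re-parse only when
    // the raw configuration string itself changes -- this keeps the
    // per-URL reconfiguration behaviour while avoiding a re-parse on every
    // call to IsValidURL.  (Illustrative only.)
    class ExtensionCache {
    public:
        const std::set<std::string>& get(const std::string& raw) {
            if (raw != last_raw_) {
                last_raw_ = raw;
                parsed_.clear();
                std::istringstream in(raw);
                std::string ext;
                while (in >> ext)
                    parsed_.insert(ext);
            }
            return parsed_;
        }
    private:
        std::string last_raw_;
        std::set<std::string> parsed_;
    };

    // Hypothetical usage inside the URL check:
    //   static ExtensionCache bad_ext;
    //   if (bad_ext.get(config_bad_extensions).count(url_extension))
    //       return false;   // reject the URL
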
From: Geoff H. <ghu...@ws...> - 2003-04-17 18:15:20

> 1) The function reparses 'bad_extensions' & 'valid_extensions' each time
> through. This seems wasteful. Any good reason to do this?

Depends. Once upon a time, we thought that these should be configurable
on a per-URL basis. (Which is why they're "reparsed.") Now maybe it's
better to re-think this in terms of improved performance?

> 2) Toward the end of the function, just before we test the URL against
> 'limits' & 'limit_normalized', we check the server's robots.txt file.
> Wouldn't it make sense to do the robots.txt check AFTER the limits
> check, so as not to waste network connections on servers that will get
> rejected by the next two tests?

Good point.

--
-Geoff Hutchison
Williams Students Online
http://wso.williams.edu/

From: Neal R. <ne...@ri...> - 2003-04-17 18:06:06

Hey all,
I'm looking through IsValidURL and I have a couple of questions.
1) The function reparses 'bad_extensions' & 'valid_extensions' each time
through. This seems wasteful. Any good reason to do this?
2) Toward the end of the function, just before we test the URL against
'limits' & 'limit_normalized', we check the server's robots.txt file.
Wouldn't it make sense to do the robots.txt check AFTER the limits
check, so as not to waste network connections on servers that will get
rejected by the next two tests?
Thanks.
Neal Richter
Knowledgebase Developer
RightNow Technologies, Inc.
Customer Service for Every Web Site
Office: 406-522-1485
From: SourceForge.net <no...@so...> - 2003-04-17 07:13:52

Patches item #722980, was opened at 2003-04-17 00:13
Message generated for change (Tracker Item Submitted) made by Item Submitter
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=304593&aid=722980&group_id=4593

Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: new template variable bad_words

Initial Comment:
This patch allows htsearch to display the keywords which were ignored
during the search because they were on the bad word list. Further
information may be found in the README contained within the tar file.

----------------------------------------------------------------------

You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=304593&aid=722980&group_id=4593

From: Lachlan A. <lh...@us...> - 2003-04-16 23:02:05

Yes, the 3.2.0b4 snapshot does this.

On Tue, 15 Apr 2003 01:35, Simon Gauthier wrote:
> Hi, I would like to know if with ht://Dig we can do exact phrase
> search, and if so, in which version?

From: SourceForge.net <no...@so...> - 2003-04-16 14:54:50

Patches item #722536, was opened at 2003-04-16 08:10
Message generated for change (Tracker Item Submitted) made by Item Submitter
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=304593&aid=722536&group_id=4593

Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: new template variable bad_words

Initial Comment:
This patch allows htsearch to display the keywords which were ignored
during the search because they were on the bad word list. Further
information may be found in the README contained within the tar file.

----------------------------------------------------------------------

You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=304593&aid=722536&group_id=4593

From: SourceForge.net <no...@so...> - 2003-04-16 14:39:43

Patches item #722528, was opened at 2003-04-16 07:56
Message generated for change (Tracker Item Submitted) made by Item Submitter
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=304593&aid=722528&group_id=4593

Category: None
Group: None
Status: Open
Resolution: None
Priority: 5
Submitted By: Nobody/Anonymous (nobody)
Assigned to: Nobody/Anonymous (nobody)
Summary: new multidig script

Initial Comment:
Here is a new, extended version of multidig; see the README for details.

----------------------------------------------------------------------

You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=304593&aid=722528&group_id=4593

From: Lachlan A. <lh...@us...> - 2003-04-15 11:58:23

Greetings all,

I apologise for the rudeness of my previous email, and thanks for not
flaming me :) I didn't mean to sound unappreciative of the efforts that
everyone is putting in -- I was just having a bad day... Thanks Jim,
Neal and Geoff for replying.

The trouble that I'm having seems to come from the list of free disk
pages getting corrupted. What do other people get when they run
    make TESTS=t_htdig_local check
after applying the patch at
http://home.iprimus.com.au/lachlan_andrew/htdig/freelist.patch ?
The output I get (Mandrake 8.2, gcc 3.1) is in
http://home.iprimus.com.au/lachlan_andrew/htdig/debug.first.gz and
http://home.iprimus.com.au/lachlan_andrew/htdig/debug.second.gz
(The output differs because make check doesn't clean up after itself.)

My hunch is that this (and the infinite recursion problem earlier) comes
from the fact that the new "transparent compression" routines are not
re-entrant, but are being called recursively. They call the original
"allocate page" routines. These "allocate page" routines sometimes need
to write dirty memory pages, which uses the on-the-fly compression. If
anyone can think of a way to avoid this recursion (or to make the
compression code properly re-entrant), please post it.

Thanks,
Lachlan

On Sat, 12 Apr 2003 02:41, Geoff Hutchison wrote:
> I've certainly been putting in what time I have, but as I'm not
> having much luck reliably reproducing it, it's a bit difficult for
> me.
>
> Further pointers from you and/or Neil would really help in my
> search.
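
A toy illustration of the cycle just described and one way to break it with a re-entrancy flag (a sketch under the stated assumptions, not htdig code; a real fix would still need to handle the uncompressed fallback and locking properly):

    #include <cstdio>

    // Toy model: compressing a page may need to write out a dirty page,
    // and writing a dirty page normally goes back through the compression
    // layer.  A re-entrancy flag breaks the cycle by refusing to start a
    // nested compression pass.  (Single-threaded sketch only.)
    static bool in_compression = false;

    bool compress_and_store_page();

    bool write_dirty_page() {
        // In the real code the allocator may call back into on-the-fly
        // compression here; a 'false' return tells it to fall back to an
        // uncompressed write instead of recursing without bound.
        if (!compress_and_store_page())
            std::puts("nested compression refused; writing page uncompressed");
        return true;
    }

    bool compress_and_store_page() {
        if (in_compression)
            return false;              // would recurse -- let the caller fall back
        in_compression = true;
        bool ok = write_dirty_page();  // may need to evict another page
        in_compression = false;
        return ok;
    }

    int main() {
        std::printf("top-level compression ok: %d\n", compress_and_store_page());
        return 0;
    }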