From: <Dan...@em...> - 2008-03-07 09:26:46
Hi Aperture team,

We are currently designing concepts for EILF (http://www.eclipse.org/proposals/eilf/) and found that both technologies provide some similar features. I read that your crawlers support incremental crawling (we call it delta indexing), and I have some questions about that:

1) How and when do you report removed objects?

2) Information about crawled objects is stored in AccessData implementations. What exactly is stored in AccessData? Which implementation and storage would you suggest for handling a high volume of data (>> 1.000.000)?

Bye, Daniel
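To make question 1 concrete: here is a minimal sketch of how we picture delta indexing on our side, with a plain in-memory map standing in for whatever AccessData stores. All names here are hypothetical and this is not your API; "removed" simply means an id seen in an earlier crawl that is absent from the current one.

```java
import java.util.*;

/**
 * Minimal delta-indexing sketch (hypothetical, NOT the Aperture API).
 * A map from object id to a change token (e.g. last-modified timestamp
 * or content hash) plays the role of AccessData.
 */
public class DeltaIndexSketch {

    // id -> change token remembered from the previous crawl
    private final Map<String, String> store = new HashMap<>();

    /**
     * Compares the current crawl snapshot (id -> change token) against
     * the stored state and reports added, changed, and removed ids.
     */
    public Map<String, List<String>> crawl(Map<String, String> snapshot) {
        List<String> added = new ArrayList<>();
        List<String> changed = new ArrayList<>();
        List<String> removed = new ArrayList<>();

        for (Map.Entry<String, String> e : snapshot.entrySet()) {
            String old = store.get(e.getKey());
            if (old == null) {
                added.add(e.getKey());          // never seen before
            } else if (!old.equals(e.getValue())) {
                changed.add(e.getKey());        // token differs -> modified
            }
        }
        // Removed objects: ids from a previous crawl not present any more.
        for (String id : store.keySet()) {
            if (!snapshot.containsKey(id)) {
                removed.add(id);
            }
        }

        // Bring the stored state up to date for the next crawl.
        store.keySet().removeAll(removed);
        store.putAll(snapshot);

        Map<String, List<String>> report = new LinkedHashMap<>();
        report.put("added", added);
        report.put("changed", changed);
        report.put("removed", removed);
        return report;
    }
}
```

In this picture, "removed" can only be reported after a full pass over the data source, which is part of why we wonder how and when your crawlers emit those events.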