From: Mike P. <mi...@sy...> - 2010-04-30 19:42:55
Jurgen,

The answer to inference with quads really lies in full query-time inference, which we do not yet support. We have sketchy plans to work on query-time inference later this year, after attending to several other priorities, most notably the high-availability architecture for scale-out.

The problem with quad inference is that the supports for an inferred statement can exist in different contexts, and it is not known until query time which contexts may be considered. Our model right now is to compute most inferences at load time. So if we compute inferences using supports from different contexts, we would have to filter query results based on whether they are supported by the specific contexts listed in the query.

Thanks,
Mike

-----Original Message-----
From: Jürgen Jakobitsch [mailto:jak...@pu...]
Sent: Wednesday, April 28, 2010 1:55 AM
To: big...@li...
Subject: [Bigdata-developers] Inference and Quads

hi,

can you tell when there will be inference support with the quad store? we're currently starting development of the next version of our thesaurus management tool poolparty (poolparty.punkt.at) and are further evaluating triple stores, but we have a significant need for the combination of quads and inference.

wkr
turnguard.com/turnguard

--
punkt. netServices
______________________________
Jürgen Jakobitsch
Codeography

Lerchenfelder Gürtel 43 Top 5/2
A - 1160 Wien
Tel.: 01 / 897 41 22 - 29
Fax: 01 / 897 41 22 - 22

netServices
http://www.punkt.at

_______________________________________________
Bigdata-developers mailing list
Big...@li...
https://lists.sourceforge.net/lists/listinfo/bigdata-developers
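
A minimal sketch of the situation Mike describes, in plain Java with no Bigdata or Sesame API: the quads, graph names, and class names below are all hypothetical, chosen only to show why an inferred statement whose supports come from two different contexts is awkward to scope when a query names specific contexts.

// Self-contained illustration (Java 16+ for records). Not Bigdata code.
import java.util.*;

public class QuadInferenceSketch {

    // A quad: subject, predicate, object, plus the context (named graph).
    record Quad(String s, String p, String o, String ctx) {}

    public static void main(String[] args) {
        List<Quad> store = new ArrayList<>(List.of(
            // Explicit data, loaded into two different contexts.
            new Quad(":mike", "rdf:type", ":Developer", ":graphA"),
            new Quad(":Developer", "rdfs:subClassOf", ":Person", ":graphB")
        ));

        // Load-time (eager) inference: RDFS rule rdfs9 derives
        // (:mike rdf:type :Person), but its supports come from :graphA AND
        // :graphB. Which context does the inferred quad belong to? Here it
        // is parked in a synthetic context.
        store.add(new Quad(":mike", "rdf:type", ":Person", ":inferred"));

        // A query scoped to :graphA only (roughly "FROM :graphA" in SPARQL).
        Set<String> queryContexts = Set.of(":graphA");

        // Naive answer: the inferred quad is invisible, because its context
        // is not among the contexts named in the query...
        store.stream()
             .filter(q -> queryContexts.contains(q.ctx()))
             .forEach(q -> System.out.println("visible: " + q));

        // ...so a quad-aware engine would have to track, per inferred quad,
        // which contexts its supports came from, and filter results at query
        // time against the contexts listed in the query -- exactly the
        // query-time machinery that is not yet implemented.
    }
}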