
owlet + ELK integration

2014-02-28
2014-03-13
  • Bryan Thompson

    Bryan Thompson - 2014-02-28

    Here is a plausible path to a DL integration. There was an interesting talk that touched on this at CSHALS.

    See http://trac.bigdata.com/ticket/824 (Examine owlet/ELK integration)

    Bryan

     
  • Jim Balhoff

    Jim Balhoff - 2014-02-28

    Hi Bryan,

I'm glad you enjoyed Hilmar's talk. I had considered implementing owlet as a custom SERVICE as described here http://wiki.bigdata.com/wiki/index.php/FederatedQuery#Custom_Services. I imagine the user specifying a graph in the config, from which the ontology could be loaded into memory. It should also provide a choice of reasoner, such as ELK, HermiT, JFact, etc.
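A query against such a custom SERVICE might look something like the sketch below. This is only an illustration: the `owlet:` service URI, the `ex:` terms, and the exact pattern accepted inside the SERVICE block are invented here; owlet does use Manchester-syntax class expressions as string literals with a special datatype, but the details could differ.

```sparql
PREFIX rdfs:  <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ow:    <http://purl.org/phenoscape/owlet/syntax#>
# Hypothetical service URI; a real integration would register its own.
PREFIX owlet: <http://example.org/owlet#>
PREFIX ex:    <http://example.org/>

SELECT ?gene WHERE {
  # The embedded reasoner would expand the Manchester-syntax class
  # expression into concrete bindings for ?structure.
  SERVICE owlet:reasoner {
    ?structure rdfs:subClassOf "'muscle' and ('part of' some 'head')"^^ow:omn .
  }
  ?gene ex:expressed_in ?structure .
}
```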

I think it would be more efficient than the current version, since you wouldn't have to generate the filter, serialize the query, send it to another endpoint, and parse. But the nice thing about the current version is that it can work with any triplestore and any OWL API reasoner.

    Alternatively I thought about "inverting" the current version to run as a server, so that you could call it as a remote SERVICE from a query to Bigdata. I wrote a little more about that here: https://github.com/phenoscape/owlet/wiki/Further-development-of-owlet
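In the "inverted" setup, owlet would expose its own SPARQL endpoint and Bigdata would reach it through the standard SERVICE keyword; no custom code on the Bigdata side at all. A rough sketch (the endpoint URL is made up for illustration):

```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ow:   <http://purl.org/phenoscape/owlet/syntax#>

SELECT ?structure WHERE {
  # Hypothetical: owlet running as its own HTTP SPARQL endpoint.
  SERVICE <http://localhost:8080/owlet/sparql> {
    ?structure rdfs:subClassOf "'part of' some 'head'"^^ow:omn .
  }
}
```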

    We have found owlet useful because we have very large, complex ontologies which we don't really use to auto-classify instances. The instance data is linked to ontology terms, but queries may involve complex DL descriptions taking advantage of the knowledge in the ontology.

     

    Last edit: Jim Balhoff 2014-02-28
    • Bryan Thompson

      Bryan Thompson - 2014-03-01

      Jim, I found the approach quite exciting and I would very much like to see this feature available out of the box with bigdata. I spoke with Hilmar about visiting at his location and having a conversation about how best to proceed.

We do a lot of things through the SERVICE mechanism and it does offer quite a bit of flexibility, including the ability to hook and leverage updates. It could be used either to embed a reasoner or to reach out to a remote reasoner. The ASTOptimizer is another possible integration point.

I am curious how much RAM and CPU demand is imposed by the OWL reasoner. It could make sense either to embed the reasoner or to have it be an external service, depending on the resource demand. I am also curious how the deployment model might interact with those decisions, for example, if deployed with the HA cluster. It would be interesting to try some different configurations and observe the impact on query performance.

      Bryan


       
  • Ramona Walls

    Ramona Walls - 2017-02-23

Hi there. Has any progress been made on this? I would really like to be able to use Blazegraph with an ELK reasoner. I can't find any documentation on it, but maybe I'm missing something.

     
    • Jim Balhoff

      Jim Balhoff - 2017-02-23

Hi Ramona - I have never created a tight integration. Currently in Phenoscape we preprocess queries containing owlet expressions and then send the result to Blazegraph. However, I think developing an integration as a Blazegraph custom service would be pretty straightforward, where it would perhaps load the ontology from a graph at startup. If you get in touch directly I can describe a few different ways the two systems can interact. One of the limitations is that a single ELK instance can only process one query at a time.
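To give a feel for what that preprocessing does (the term IRIs below are invented for illustration, and owlet's actual rewriting may differ in detail): the reasoner is asked for all subclasses of the embedded class expression, and the pattern is replaced with concrete bindings before the query ever reaches Blazegraph.

```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ow:   <http://purl.org/phenoscape/owlet/syntax#>

# Before preprocessing: a Manchester-syntax class expression
# embedded as a typed string literal.
# ?structure rdfs:subClassOf "'part of' some 'head'"^^ow:omn .

# After preprocessing: the reasoner (e.g. ELK) has enumerated the
# matching classes, and the pattern becomes plain SPARQL.
VALUES ?structure { <http://example.org/skull> <http://example.org/jaw> }
```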

       

