
How to use rules once created?

Cyril
2015-07-27
2015-07-28
  • Cyril

    Cyril - 2015-07-27

    Dear everyone,

    I wish to use Bigdata and produce many inferences using rules. I created the rules by following:
    - https://wiki.blazegraph.com/wiki/index.php/InferenceAndTruthMaintenance
    - https://github.com/earldouglas/blazegraph-scratchpad/tree/master/inference-rules

    The problem is that once I have created a rule, I don't know how to give it to Bigdata. I'm a Python developer and not a Java developer, and there are many things I don't really understand. I have already cloned the git repository through Eclipse and created my own rule, but I don't know how to launch Blazegraph with my rule taken into account.

    To create my rule I adapted the samples given in the repository.
    My OWN_RULES.java (based on SAMPLE.java; in Java the public class name must match the file name):

    package com.bigdata.rdf.rules;

    import org.openrdf.model.URI;
    import org.openrdf.model.impl.URIImpl;

    /**
     * URI constants for the custom vocabulary used by the rule.
     */
    public class OWN_RULES {

        public static final String NAMESPACE = "http://www.semanticweb.org/mynamespace/";

        public static final URI PRODUCED_BY_ACTIVITY = new URIImpl(NAMESPACE + "producedByActivity");
        public static final URI HAS_EXCHANGE = new URIImpl(NAMESPACE + "hasExchange");
        public static final URI EXCHANGE_DEPENDS_OF_SUBEXCHANGE = new URIImpl(NAMESPACE + "exchangeDependsOfSubExchange");

    }
    

    My OwnRuleVocab.java (based on SampleVocab.java)

    package com.bigdata.rdf.rules;

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.Iterator;

    import org.openrdf.model.URI;
    import org.openrdf.model.impl.URIImpl;

    import com.bigdata.rdf.vocab.RDFSVocabulary;
    import com.bigdata.rdf.vocab.VocabularyDecl;

    public class OwnRuleVocab extends RDFSVocabulary {

        // Default (de-serialization) constructor.
        public OwnRuleVocab() {
            super();
        }

        public OwnRuleVocab(final String namespace) {
            super(namespace);
        }

        @Override
        protected void addValues() {

            super.addValues();

            addDecl(new OwnVocabularyDecl());

        }

        public static class OwnVocabularyDecl implements VocabularyDecl {

            private static final URI[] uris = new URI[] {
                    new URIImpl(OWN_RULES.NAMESPACE),
                    OWN_RULES.PRODUCED_BY_ACTIVITY,
                    OWN_RULES.HAS_EXCHANGE,
                    OWN_RULES.EXCHANGE_DEPENDS_OF_SUBEXCHANGE
            };

            public OwnVocabularyDecl() {
            }

            @Override
            public Iterator<URI> values() {
                return Collections.unmodifiableList(Arrays.asList(uris)).iterator();
            }

        }

    }
    

    OwnRule.java (based on SampleRule.java)

    package com.bigdata.rdf.rules;

    import com.bigdata.bop.IConstraint;
    import com.bigdata.bop.constraint.Constraint;
    import com.bigdata.bop.constraint.NE;
    import com.bigdata.rdf.spo.SPOPredicate;
    import com.bigdata.rdf.vocab.Vocabulary;
    import com.bigdata.relation.rule.Rule;

    /**
     * If E is producedByActivity A (?e rme:producedByActivity ?a),
     * and A hasExchange ES (?a rme:hasExchange ?es),
     * then E exchangeDependsOfSubExchange ES.
     *
     * [exchangeDependsOfSubExchange:
     *  (?e rme:producedByActivity ?a),
     *  (?a rme:hasExchange ?es)
     *  -> (?e rme:exchangeDependsOfSubExchange ?es)]
     */
    public class exchangeDependsOfSubExchange extends Rule {

        private static final long serialVersionUID = 7627609187312677342L;

        // The constructor name must match the class name.
        public exchangeDependsOfSubExchange(final String relationName, final Vocabulary vocab) {

            super("exchangeDependsOfSubExchange", // rule name
                    new SPOPredicate(relationName, var("e"), vocab.getConstant(OWN_RULES.EXCHANGE_DEPENDS_OF_SUBEXCHANGE), var("es")), // head
                    new SPOPredicate[] { // tail
                        new SPOPredicate(relationName, var("e"), vocab.getConstant(OWN_RULES.PRODUCED_BY_ACTIVITY), var("a")),
                        new SPOPredicate(relationName, var("a"), vocab.getConstant(OWN_RULES.HAS_EXCHANGE), var("es")),
                    },
                    new IConstraint[] { // constraints
                        Constraint.wrap(new NE(var("a"), var("es"))),
                    });

        }

    }
    

    RuleClosure.java (based on SampleClosure.java)

    package com.bigdata.rdf.rules;

    import java.util.ArrayList;
    import java.util.List;

    import com.bigdata.rdf.store.AbstractTripleStore;
    import com.bigdata.relation.rule.Rule;

    /**
     * The closure program must include the new custom inference rules by
     * overriding the parent method {@link BaseClosure#getCustomRules(String)}.
     */
    public class RuleClosure extends FullClosure {

        // The constructor name must match the class name.
        public RuleClosure(final AbstractTripleStore db) {
            super(db);
        }

        /**
         * Called once by the super class during construction of the inference
         * program.
         */
        @Override
        protected List<Rule> getCustomRules(final String database) {
            final List<Rule> customRules = new ArrayList<Rule>();
            customRules.addAll(super.getCustomRules(database));
            // Register the custom rule (the vocab field is inherited from BaseClosure).
            customRules.add(new exchangeDependsOfSubExchange(database, vocab));
            return customRules;
        }

    }
    

    OwnRule.properties

    com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers=true
    com.bigdata.journal.AbstractJournal.bufferMode=DiskRW
    # These must be the fully-qualified class names matching the package declarations above.
    com.bigdata.rdf.store.AbstractTripleStore.closureClass=com.bigdata.rdf.rules.RuleClosure
    com.bigdata.rdf.store.AbstractTripleStore.vocabularyClass=com.bigdata.rdf.rules.OwnRuleVocab
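
    For reference, one way to launch Blazegraph so that this configuration is picked up is to start the NanoSparqlServer from the command line. This is a sketch: the jar name, the classpath entry for the compiled rule classes, the port, and the namespace name "kb" are all placeholders to adapt to your install.

```shell
# Start the Bigdata/Blazegraph REST server on port 9999 with namespace "kb",
# reading the triple-store configuration from OwnRule.properties.
# "bigdata.jar" and "my-rules/" (compiled rule classes) are placeholder paths.
java -cp bigdata.jar:my-rules/ \
  com.bigdata.rdf.sail.webapp.NanoSparqlServer \
  9999 kb OwnRule.properties
```

    Note that the properties are applied when the namespace is first created; see the "sticky properties" remark in the reply below this post.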
    

    Thank you in advance.
    Cyril

     

    Last edit: Cyril 2015-07-27
    • Bryan Thompson

      Bryan Thompson - 2015-07-27

      Igor,

      Would you mind responding to this question and making sure that we have
      captured the necessary information in the user manual?

      Thanks,
      Bryan

      PS: The AbstractTripleStore configuration for inference and rules are
      sticky properties. They can not be changed for a given triple store once
      that triple store has been created. I would review the properties that are
      in force for the triple store (look at the properties listed for the
      namespace on NAMESPACES tab of the workbench) and make sure that truth
      maintenance is turned on, that your rules class appears, etc.
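
      A quick way to review those properties outside the workbench is the REST API; the endpoint below follows the standard NanoSparqlServer layout, and the host, port, and namespace name "kb" are placeholders for your install.

```shell
# Dump the effective (sticky) properties for a namespace and check that
# closureClass and vocabularyClass point at your custom classes.
curl -s http://localhost:9999/bigdata/namespace/kb/properties
```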


      Bryan Thompson
      Chief Scientist & Founder
      SYSTAP, LLC
      4501 Tower Road
      Greensboro, NC 27410
      bryan@systap.com
      http://blazegraph.com
      http://blog.bigdata.com http://bigdata.com
      http://mapgraph.io


  • Cyril

    Cyril - 2015-07-27

    Thanks for your "PS" reply.

    Based on your "PS", the inference engine in Bigdata doesn't seem to be exactly what I'm looking for.

    At the moment I create my inferred property relations with INSERT/UPDATE, but that's not really the kind of solution I want. What I'm looking for is an engine in which I can modify the inference rules, and those changes would then update the inferred property relations in the graph. For now, it's not a problem for me if I need to restart the server to apply these changes.

    I know Stardog provides this kind of feature, but I want to use an efficient open-source solution such as Bigdata.
    Do you know how I could do this kind of thing with Bigdata/Blazegraph?

    Thanks for your reply and your time.

     

    Last edit: Cyril 2015-07-27
    • Bryan Thompson

      Bryan Thompson - 2015-07-28

      If you drop all materialized entailments and rewrite the properties on the
      triple store, then you can achieve this result. From Java, you would do
      something along these lines:

      • BigdataSail.getUnisolatedConnection().removeAllEntailments() to drop all
        materialized inferences. Commit.
      • Modify triple store properties by writing on the global row store.
        Commit. There are some examples of how to update these properties in the
        BigdataSailHelper class (attached).
      • Obtain a new BigdataSail object using an appropriate constructor (the one
        that accepts an AbstractTripleStore). The AbstractTripleStore must be the
        unisolated view. This should avoid cached versions of the properties that
        you just modified.
      • BigdataSail.getUnisolatedConnection().computeClosure(). Commit. This
        will rewrite the entailments using the new rules.
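
      The steps above can be sketched in Java roughly as follows. This is an untested sketch: the journal file name is a placeholder, and everything except the methods named above (getUnisolatedConnection(), removeAllEntailments(), computeClosure()) should be checked against your Blazegraph version; the property-rewrite step is only indicated in a comment.

```java
import java.util.Properties;

import com.bigdata.rdf.sail.BigdataSail;
import com.bigdata.rdf.sail.BigdataSail.BigdataSailConnection;

public class RecomputeEntailments {

    public static void main(final String[] args) throws Exception {

        // "bigdata.jnl" is a placeholder for your journal file.
        final Properties props = new Properties();
        props.setProperty("com.bigdata.journal.AbstractJournal.file", "bigdata.jnl");

        final BigdataSail sail = new BigdataSail(props);
        sail.initialize();
        try {
            // 1. Drop all materialized inferences, then commit.
            final BigdataSailConnection conn1 = sail.getUnisolatedConnection();
            try {
                conn1.removeAllEntailments();
                conn1.commit();
            } finally {
                conn1.close();
            }

            // 2. Rewrite the triple store properties (closureClass, etc.) on the
            //    global row store (see the BigdataSailHelper examples), then
            //    obtain a new BigdataSail against the unisolated view so that no
            //    cached properties are used. Omitted here.

            // 3. Recompute the closure under the new rules, then commit.
            final BigdataSailConnection conn2 = sail.getUnisolatedConnection();
            try {
                conn2.computeClosure();
                conn2.commit();
            } finally {
                conn2.close();
            }
        } finally {
            sail.shutDown();
        }
    }
}
```

      After this runs, normal incremental truth maintenance can resume, as described above.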

      At this point you can resume normal incremental truth maintenance
      operations.

      Thanks,
      Bryan




       
