
Creating value expressions for FilterNode and FunctionNodes

2014-02-28
2014-03-01
  • Anton Kulaga

    Anton Kulaga - 2014-02-28

    I am trying to create a filter right now. As BigDataSearch does not seem to work for URIs (at least I did not find how to configure it for URIs),
    I am trying to use ordinary SPARQL filters to search in URIs (I need such search for type suggestions), something like FILTER(REGEX(STR(?myvar), "myregex")).
    But the main problem is that I do not know how to instantiate expressions inside filters.
    When I create a function node for STR (i.e. new FunctionNode(FunctionRegistry.STR, null, new VarNode(variable))),

    I get it with null value expressions, and I do not understand how to create com.bigdata.rdf.internal.constraints.StrBOp for it.
    I guess I should use public StrBOp(final IValueExpression<? extends IV> x, final GlobalAnnotations globals), but it is not clear to me what to provide for this constructor and where I should take those GlobalAnnotations from.
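At the SPARQL level (independent of bigdata's internal AST classes), FILTER takes a boolean expression, so STR(?myvar) on its own cannot filter anything; the usual pattern for matching URIs is FILTER(REGEX(STR(?myvar), "pattern")). A minimal, self-contained Java sketch of what that filter evaluates follows; the query string, URIs, and pattern are made-up examples for illustration, and the helper mirrors REGEX semantics with java.util.regex rather than calling any bigdata API:

```java
import java.util.regex.Pattern;

public class UriFilterSketch {

    // Hypothetical query; ?myvar and "ontology" are placeholder names.
    static final String QUERY =
        "SELECT ?myvar WHERE {\n" +
        "  ?s a ?myvar .\n" +
        "  FILTER(REGEX(STR(?myvar), \"ontology\"))\n" +
        "}";

    // Mirrors what REGEX(STR(?myvar), pattern) does for a bound URI:
    // the URI is taken as its lexical string and the pattern is applied
    // as a partial (substring) match, which is SPARQL REGEX semantics.
    static boolean uriMatches(String uri, String pattern) {
        return Pattern.compile(pattern).matcher(uri).find();
    }

    public static void main(String[] args) {
        System.out.println(uriMatches("http://example.org/ontology/Person", "ontology")); // true
        System.out.println(uriMatches("http://example.org/data/item1", "ontology"));      // false
    }
}
```

With a correct boolean expression inside the FILTER, the question about wiring value expressions by hand may become a question of letting the query parser build the FunctionNode tree from the query string instead.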

    Last edit: Anton Kulaga 2014-02-28
    • Bryan Thompson

      Bryan Thompson - 2014-03-01

      Let me suggest a different approach. Why don't you set up an embedded Scala environment for bigdata and provide some guidance on how to use it? This might even be something we could bundle as a module in the platform. Once we are set up with Scala, let's try to bootstrap some tests. That will give us a basic means to set up and invoke bigdata's Java code using Scala. We can then have a conversation about issues that emerge, and you should get a better sense of how to use the bigdata internal APIs as a result. This might be much easier than trying to comment on your code approach when I cannot run the code and you cannot provide unit tests for us.

      Another thing is to start from our internal test suites for the ASTOptimizers and the SPARQL-to-AST translation tests.

      The tests are often bootstrapped in ways that might make it difficult for you to generalize from the test to the running code. If you can help us get a Scala/Java integration bootstrapped that we can understand, this might be easier.

      What is the license for the code that you are developing? Are you able to sign a contributor license agreement? Can you get your employer to sign one, even if you work on this outside of your job? Whether it makes sense to bundle a Scala integration module could depend on these things.

      Bryan

  • Anton Kulaga

    Anton Kulaga - 2014-03-01

    Hi Bryan!

    The project that I am developing with bigdata is an open-source knowledge-management platform ( https://github.com/denigma/semantic-web ). There is only basic stuff there right now (like a dynamic SPARQL query form), but I plan to work hard on it so that I can search for funding and apply it to domains of my interest (project management, aging research, education).

    "That will give us a basic means to set up and invoke bigdata's Java code using Scala. This might be much easier than trying to comment on your code approach when I cannot run the code and you cannot provide unit tests for us."

    Yes, that would be much better for me than creating Java tests because it will take less time. I develop my project in my spare time, so time is the main constraint for me. In my project ( https://github.com/denigma/semantic-web ) I already have a bunch of tests ( https://github.com/denigma/semantic-web/tree/master/test/org/denigma/semantic ) with embedded bigdata, but they require comments to make them understandable by Java developers and to explain what approach I used to deal with bigdata.

    Last edit: Anton Kulaga 2014-03-01
