I am modifying user queries with the Bigdata AST. I see a lot of ConstantNodes there, and trees containing them, like ConstantNode(TermId(5054U)), but I have no clue how to create a ConstantNode :(
I managed to create a TermId with the TermId.fromString method, but I do not know how to pass it to the ConstantNode constructor! (It has several constructors with confusing parameters, and I do not know where and how to pass my TermId.) :((
I also wonder how equality is handled for most BOps. For instance, if I use the replaceWith(old: BOp, newVal: BOp) method of the AST to replace one ConstantNode with another, will it be sufficient to simply create a new ConstantNode for the same URI as the one in the original tree and pass it as the old value? Or does ConstantNode equality use ordinary reference comparison, so that for the old value I have to provide the original ConstantNode from the AST?
TermId is an IV. IVs (InternalValues) are created using the LexiconRelation, which is accessible from the AbstractTripleStore. You need to resolve RDF Value objects to existing IVs or create new IVs if the Values are not in the lexicon (and are not represented inline).
With the IV, what you want to use is:
@SuppressWarnings("rawtypes")
public ConstantNode(final IV val) {
    this(new Constant<IV>(val));
}

@SuppressWarnings("rawtypes")
public ConstantNode(final IConstant<IV> val) {
    super(new BOp[] { val }, null);
}
Thanks,
Bryan
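Putting the answer above together, the flow is roughly: get the LexiconRelation from the triple store, register or resolve the Value so it acquires an IV, then wrap that IV for the AST. The following is a sketch pieced together from this thread, not a verified recipe: the `store` variable and the exact accessor names (`getValueFactory`, `getLexiconRelation`) are assumptions, while `addTerms`, `getIV`, `Constant`, and `ConstantNode` are as quoted in the thread.

```java
// Sketch (unverified): resolve a URI to an IV via the lexicon,
// then wrap it as a ConstantNode for use in the AST.
BigdataValueFactory vf = store.getValueFactory();      // assumed accessor
BigdataURI uri = vf.createURI("http://example.org/thing");

// Register (or resolve) the value in the lexicon so it gets an IV.
store.getLexiconRelation()                             // assumed accessor
     .addTerms(new BigdataValue[] { uri }, 1, /* readOnly */ false);

// The IV is now attached to the BigdataValue; wrap it for the AST.
ConstantNode node = new ConstantNode(new Constant<IV>(uri.getIV()));
```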
>TermId is an IV. IVs (InternalValues) are created using the LexiconRelation
Thanks for the fast response! Yes, with the lexicon I can now create terms.
Overall, one of the things I want to achieve with ASTs is dynamic filtering of context. For instance, a user obtains a query result and then chooses the name of a variable to filter on. I am not sure what I should do to substitute a value (the URI the user chose in the filter box of the filter interface) for a variable using ASTs, or how I should filter on the value of that variable with ASTs. Another thing I want to accomplish by editing ASTs is programmatic generation of INSERT/DELETEs (for instance, I generated a table with a SPARQL SELECT, and I want to generate UPDATEs that change values when the user edits some of the table's values).
Why do you want to use the AST for this? It would be easy enough to do with string generation of the SPARQL requests. You do need to be careful about escaping the substituted parameters, but that will be simpler than working at the AST layer.
Bryan
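To illustrate the escaping caveat in the string-generation route: before splicing user-supplied values into query text, special characters in string literals must be escaped and URIs must be validated. The helper names below are illustrative, not part of any Bigdata API.

```java
// Illustrative helpers (not a Bigdata API): escape user-supplied
// strings before splicing them into generated SPARQL text.
public class Main {

    // Escape characters that are special inside a SPARQL string literal.
    static String escapeLiteral(String s) {
        return s.replace("\\", "\\\\")
                .replace("\"", "\\\"")
                .replace("\n", "\\n")
                .replace("\r", "\\r")
                .replace("\t", "\\t");
    }

    // Wrap a URI in angle brackets, rejecting characters that would
    // break out of the IRI token.
    static String asIri(String uri) {
        if (uri.matches(".*[<>\"{}|^`\\\\\\s].*"))
            throw new IllegalArgumentException("illegal character in IRI: " + uri);
        return "<" + uri + ">";
    }

    public static void main(String[] args) {
        String userInput = "say \"hi\"";
        String query = "SELECT ?s WHERE { ?s rdfs:label \""
                + escapeLiteral(userInput)
                + "\" . ?s a " + asIri("http://example.org/Person") + " }";
        System.out.println(query);
    }
}
```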
> It would be easy enough to do with string generation of the SPARQL requests?
I have to analyze them and change graph patterns (for instance, extract the context to see whether a user has the rights to run the query against the appropriate graph, or provide automatic sorting in the UI by adding Filters and OrderBys to the query). The AST has already analyzed the query for me, so it seems the only thing I have to do is substitute some TermNodes there and add some OrderBys, Slices (Slices were quite easy, by the way), Filters, and other relevant things.
> Why do you want to use the AST for this? It would be easy enough to do with string generation of the SPARQL requests.
> You do need to be careful about escaping the substituted parameters, but that will be simpler than working at the AST layer.
I have some experience working with ASTs from writing Scala macros, so ASTs do not look scary to me, although it is unpleasant to work with ASTs written in Java (a lot of verbose code full of side effects, due to the lack of expressiveness/flexibility of Java as a language). I would prefer to work with the SPARQL algebra rather than with ASTs, but I see no documentation for the Sesame/Bigdata SPARQL algebra (Bigdata ASTs are also undocumented, but at least I was lucky to find ASTContainer, and you helped me with advice about the LexiconRelation). So if I fail to get what I want with the Bigdata AST, I will try the ARQ SPARQL algebra (it is Jena-based, but it can generate the resulting query as a string), and if I fail with the SPARQL algebra, I will try parboiled2 to work with SPARQL strings.
P.S. By the way, I looked into the Bigdata internals, and some things there look strange to me, for instance having both getOriginalUpdateAST and getOriginalAST (what about having an abstract class ASTContainer with SelectASTContainer and UpdateASTContainer inheriting from it?). Another counterintuitive thing was using LexiconRelation.addTerms to generate IVs for BigdataValues…
By the way, how do I do the materialization of URIs? When I create a BigdataURI, it is created with an empty IV; when I call addTerms on it, it gets an IV, but that IV returns false on hasValue. As I understand it, in order to change something in a query I have to provide ConstantNodes with constants (generated by new Constant(myBigdataValue.getIV)) whose IVs have values. I tried to add values directly to the IVs, but iv.setValue requires some weird type (it does not accept ordinary BigdataValues) =(
You will need to dig into the LexiconRelation class.
See http://www.bigdata.com/docs/api/com/bigdata/rdf/lexicon/LexiconRelation.html#addTerms%28com.bigdata.rdf.model.BigdataValue,%20int,%20boolean%29 (adding/resolving Values)
See http://www.bigdata.com/docs/api/com/bigdata/rdf/lexicon/LexiconRelation.html#getTerms%28java.util.Collection%29 (resolving IVs for Values)
There are also classes that are used for streaming batch resolution. You can trace back from those methods to find them.
The IVs MUST be either in the dictionary indices (TERM2ID, ID2TERM, or BLOBS) -OR- valid inline IVs for the namespace of the specific triple store instance. BigdataValue objects have a hidden reference back to the BigdataValueFactory, and the BigdataValueFactory knows the namespace of the triple store instance that it is compatible with.
Bryan
>The IVs MUST be either in the dictionary indices (TERM2ID, ID2TERM, or BLOBS) -OR-
I figured out how to use IV.setValue. It turned out that the wildcard generic inside IVCache did not let Scala recognize the parameter type of setValue as BigdataValue (in 99.9% of cases Scala treats Java types correctly, but not this one). So eventually I did an asInstanceOf conversion and used iv.setValue in combination with lexiconRelation.addTerms, and it worked! (I did not do any manual insertions into the dictionary indices.)
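As an aside for other readers, the wildcard issue above can be reproduced with a toy generic class; the Cache/StringCache names below are made up for illustration and have nothing to do with the real IVCache API.

```java
// Toy model of the situation: a reference typed with a wildcard generic
// cannot have its setter called without an unchecked cast (the Java
// analogue of the asInstanceOf workaround described above).
public class Main {
    interface Cache<V> {
        void setValue(V v);
        V getValue();
    }

    static class StringCache implements Cache<String> {
        private String v;
        public void setValue(String v) { this.v = v; }
        public String getValue() { return v; }
    }

    public static void main(String[] args) {
        Cache<?> wild = new StringCache(); // wildcard type, like IVCache<?, ?>
        // wild.setValue("x");  // does not compile: capture of '?'

        @SuppressWarnings("unchecked")
        Cache<String> cast = (Cache<String>) wild; // unchecked, but works
        cast.setValue("x");
        System.out.println(cast.getValue());
    }
}
```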
Now I am investigating editing of SPARQL update.
I wonder why setBinding does not work there? I think it would be nice to be able to define variables in a DELETE/INSERT WHERE update query and substitute them with setBinding; is that planned?
ParsedUpdate throws an unsupported-operation error before, and even after, executing the update. I see that it is used in the SailUpdate class, so why is it unsupported? Should I do anything to get the ParsedUpdate values? When is it populated with values?
I see the same thing with a tuple SPARQL query: both before and after evaluation I get null from q.getParsedQuery. When is it not null, or is it a feature that is planned but has not been coded yet?
bigdata still uses the VALUES syntax rather than BINDINGS.
bigdata does NOT support the TupleExpr object model. Both query and update use the bigdata AST. It sounds like you are mixing things that do not work together?
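For reference, a minimal sketch of the VALUES form Bryan mentions, used to pin a variable inside a DELETE/INSERT WHERE update (the ex: namespace and the item URI are made up for illustration):

```sparql
PREFIX ex: <http://example.org/ns#>

DELETE { ?s ex:status ?old }
INSERT { ?s ex:status "updated" }
WHERE {
  VALUES ?s { <http://example.org/item/1> }
  ?s ex:status ?old .
}
```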
Well, I am exploring it in a bit of a "monkey way", with the Bigdata source code on one screen and a Scala REPL with an active Bigdata repository to experiment with on the other, so yes, sometimes I try crazy things. Overall I have the feeling that there are some flaws in your architecture, things that only confuse developers who use Bigdata, like properties that are always null. At the same time, I am very happy that you are very responsive and have already solved some of my issues with your answers.
I am going to programmatically create a Filter; where should I register string constants? I managed to insert a Constant[String] into a ConstantNode, but it is not evaluated.
Make sure that you are using the value factory associated with the AbstractTripleStore.
On Feb 27, 2014, at 10:57 PM, "Anton Kulaga" antonkulaga@users.sf.net wrote:
You need to use IVs here. LexiconRelation.addTerms() should be attaching the IV to the BigdataValue object using the IVCache API.