This list is closed, nobody may subscribe to it.
From: Jyoti L. <jy...@ii...> - 2014-11-01 11:53:46
Dear Bryan,

As I reported earlier, I am facing the same issue that has been reported again in the trailing mail by another user, and in the latest mails by another user, Alice. I even tried the latest version 1.3.3 of Bigdata, but the problem still persists. Can you please help me with this a bit? Perhaps a small example illustrating RDR usage would be really helpful.

Thanks & Regards,
Jyoti Leeka
PhD student
IIIT-Delhi, India

On Fri, Oct 17, 2014 at 9:26 PM, Maria Jackson <mar...@gm...> wrote:
> Dear Bryan,
>
> I have created a ticket with ticket number Ticket #1023 (new defect) and have
> assigned it to you. Sorry, I did not get how to create a unit test for this
> case, so I did not create one.
>
> It contains the same example as provided by another user. The example is a
> small one; you may use it to load data and then query it using both old SPARQL
> and SPARQL*.
>
> Cheers,
> Maria
>
> On Fri, Oct 17, 2014 at 6:26 PM, Bryan Thompson <br...@sy...> wrote:
>>
>> Can you create a ticket at Trac.bigdata.com and assign it to me? Include the
>> data, query, expected results, and the equivalent SPARQL query. The best
>> thing is to create a unit test. That will get the quickest turnaround.
>>
>> Thanks,
>> Bryan
>>
>> On Oct 17, 2014 12:43 PM, "Maria Jackson" <mar...@gm...> wrote:
>>>
>>> Sorry, I forgot to mention that I also feel SPARQL* is not giving the
>>> correct results, although the old SPARQL syntax gives the correct results.
>>>
>>> On Fri, Oct 17, 2014 at 4:12 PM, Maria Jackson <mar...@gm...> wrote:
>>>>
>>>> Dear Bryan,
>>>>
>>>> I tried with the exact data and the SPARQL and SPARQL* queries which
>>>> another user, Jyoti, mentions.
>>>>
>>>> The service description shows the following XML content, which I think
>>>> is in accordance with the wiki.
Please correct me if I am wrong: >>>> >>>> <rdf:RDF><rdf:Description rdf:nodeID="service"><rdf:type >>>> rdf:resource="http://www.w3.org/ns/sparql-service-description#Service"/><endpoint >>>> rdf:resource="http://12.18.1.5:9999/bigdata/namespace/test/sparql"/><endpoint >>>> rdf:resource="http://12.18.1.5:9999/bigdata/LBS/namespace/test/sparql"/><supportedLanguage >>>> rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL10Query"/><supportedLanguage >>>> rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL11Query"/><supportedLanguage >>>> rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL11Update"/><feature >>>> rdf:resource="http://www.w3.org/ns/sparql-service-description#BasicFederatedQuery"/><feature >>>> rdf:resource="http://www.bigdata.com/rdf#/features/KB/Mode/Sids"/><feature >>>> rdf:resource="http://www.bigdata.com/rdf#/features/KB/TruthMaintenance"/><inputFormat >>>> rdf:resource="http://www.w3.org/ns/formats/RDF_XML"/><inputFormat >>>> rdf:resource="http://www.w3.org/ns/formats/N-Triples"/><inputFormat >>>> rdf:resource="http://www.w3.org/ns/formats/Turtle"/><inputFormat >>>> rdf:resource="http://www.w3.org/ns/formats/N3"/><inputFormat >>>> rdf:resource="http://www.wiwiss.fu-berlin.de/suhl/bizer/TriG/Spec/"/><inputFormat >>>> rdf:resource="http://sw.deri.org/2008/07/n-quads/#n-quads"/><inputFormat >>>> rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_XML"/><inputFormat >>>> rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_JSON"/><inputFormat >>>> rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_CSV"/><inputFormat >>>> rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_TSV"/><resultFormat >>>> rdf:resource="http://www.w3.org/ns/formats/RDF_XML"/><resultFormat >>>> rdf:resource="http://www.w3.org/ns/formats/N-Triples"/><resultFormat >>>> rdf:resource="http://www.w3.org/ns/formats/Turtle"/><resultFormat >>>> rdf:resource="http://www.w3.org/ns/formats/N3"/><resultFormat >>>> 
rdf:resource="http://www.wiwiss.fu-berlin.de/suhl/bizer/TriG/Spec/"/><resultFormat >>>> rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_XML"/><resultFormat >>>> rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_JSON"/><resultFormat >>>> rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_CSV"/><resultFormat >>>> rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_TSV"/><defaultDataset >>>> rdf:nodeID="defaultDataset"/></rdf:Description><rdf:Description >>>> rdf:nodeID="defaultDataset"><rdf:type >>>> rdf:resource="http://www.w3.org/ns/sparql-service-description#Dataset"/><rdf:type >>>> rdf:resource="http://rdfs.org/ns/void#Dataset"/><title>test</title><Namespace>test</Namespace><sparqlEndpoint >>>> rdf:resource="http://12.18.1.5:9999/bigdata/namespace/test/sparql/test/sparql"/><sparqlEndpoint >>>> rdf:resource="http://12.18.1.5:9999/bigdata/LBS/namespace/test/sparql/test/sparql"/><uriRegexPattern>^.*</uriRegexPattern><vocabulary >>>> rdf:resource="http://12.18.1.5:9999/bigdata/namespace/test/"/><vocabulary >>>> rdf:resource="http://www.w3.org/1999/02/22-rdf-syntax-ns"/><defaultGraph >>>> rdf:nodeID="defaultGraph"/></rdf:Description><rdf:Description >>>> rdf:nodeID="defaultGraph"><rdf:type >>>> rdf:resource="http://www.w3.org/ns/sparql-service-description#Graph"/><triples >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#long">5</triples><entities >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#long">6</entities><properties >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#int">5</properties><classes >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#int">1</classes><propertyPartition >>>> rdf:nodeID="node194eq34ovx1"/></rdf:Description><rdf:Description >>>> rdf:nodeID="node194eq34ovx1"><property >>>> rdf:resource="http://192.168.1.50:9999/bigdata/namespace/test/said"/><triples >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#long">1</triples></rdf:Description><rdf:Description >>>> rdf:nodeID="defaultGraph"><propertyPartition 
>>>> rdf:nodeID="node194eq34ovx2"/></rdf:Description><rdf:Description >>>> rdf:nodeID="node194eq34ovx2"><property >>>> rdf:resource="http://www.w3.org/1999/02/22-rdf-syntax-ns#object"/><triples >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#long">1</triples></rdf:Description><rdf:Description >>>> rdf:nodeID="defaultGraph"><propertyPartition >>>> rdf:nodeID="node194eq34ovx3"/></rdf:Description><rdf:Description >>>> rdf:nodeID="node194eq34ovx3"><property >>>> rdf:resource="http://www.w3.org/1999/02/22-rdf-syntax-ns#predicate"/><triples >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#long">1</triples></rdf:Description><rdf:Description >>>> rdf:nodeID="defaultGraph"><propertyPartition >>>> rdf:nodeID="node194eq34ovx4"/></rdf:Description><rdf:Description >>>> rdf:nodeID="node194eq34ovx4"><property >>>> rdf:resource="http://www.w3.org/1999/02/22-rdf-syntax-ns#subject"/><triples >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#long">1</triples></rdf:Description><rdf:Description >>>> rdf:nodeID="defaultGraph"><propertyPartition >>>> rdf:nodeID="node194eq34ovx5"/></rdf:Description><rdf:Description >>>> rdf:nodeID="node194eq34ovx5"><property >>>> rdf:resource="http://www.w3.org/1999/02/22-rdf-syntax-ns#type"/><triples >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#long">1</triples></rdf:Description><rdf:Description >>>> rdf:nodeID="defaultGraph"><classPartition >>>> rdf:nodeID="node194eq34ovx6"/></rdf:Description><rdf:Description >>>> rdf:nodeID="node194eq34ovx6"><class >>>> rdf:resource="http://www.w3.org/1999/02/22-rdf-syntax-ns#Statement"/><triples >>>> rdf:datatype="http://www.w3.org/2001/XMLSchema#long">1</triples></rdf:Description></rdf:RDF> >>>> >>>> On Thu, Oct 16, 2014 at 5:56 AM, Maria Jackson >>>> <mar...@gm...> wrote: >>>>> >>>>> Dear All, >>>>> >>>>> I am trying to load yago2s 18.5GB >>>>> (http://www.mpi-inf.mpg.de/departments/databases-and-information-systems/research/yago-naga/yago/downloads/) >>>>> in Bigdata. 
>>>>> I downloaded bigdata from http://www.bigdata.com/download and I am using
>>>>> the Bigdata workbench via http://localhost:9999.
>>>>>
>>>>> I am loading yago2s into BigData's default namespace "kb". I am loading
>>>>> yago2s using UPDATE, specifying the file path there. While Bigdata is
>>>>> loading yago, I notice that it consumes a significant amount of CPU and
>>>>> RAM for 4-5 hours, but after that it stops using RAM. My dilemma is that
>>>>> the BigData workbench still keeps showing "Running update..", although
>>>>> BigData does not consume any RAM or CPU for the next 48 hours or so (in
>>>>> fact it keeps showing "Running update.." until I kill the process). Can
>>>>> you please suggest where I am going wrong? After killing the process,
>>>>> BigData is not able to retrieve any tuples (it shows 0 results even for
>>>>> the query select ?a ?b ?c where { ?a ?b ?c }).
>>>>>
>>>>> Also, I am using BigData on a server with 16 cores and 64 GB RAM.
>>>>>
>>>>> Any help in this regard will be deeply appreciated.
>>>>>
>>>>> Cheers,
>>>>> Maria
>>>>
>>>
>
> ------------------------------------------------------------------------------
> _______________________________________________
> Bigdata-developers mailing list
> Big...@li...
> https://lists.sourceforge.net/lists/listinfo/bigdata-developers
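The curl workflow discussed in this thread can be condensed into a short sketch. The host, namespace, and file path below are illustrative placeholders, not values from a live deployment; the endpoint layout follows the NanoSparqlServer multi-tenancy convention referenced later in the thread.

```sh
#!/bin/sh
# Illustrative values only -- substitute your own host and namespace.
HOST="192.168.145.1:9999"
NAMESPACE="reificationRDR"

# Each bigdata namespace gets its own SPARQL endpoint under
# /bigdata/namespace/<name>/sparql (multi-tenancy API).
ENDPOINT="http://${HOST}/bigdata/namespace/${NAMESPACE}/sparql"
echo "${ENDPOINT}"

# Load a Turtle file that is readable by the *server* (uri= form),
# then query it back in RDR form (note the Accept header):
#
#   curl -X POST --data-binary "uri=file:///home/bigdataAnt/SmallYagoFacts.ttl" "${ENDPOINT}"
#   curl -X POST "${ENDPOINT}" \
#        --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' \
#        -H 'Accept:application/rdr'
```

The RDR query only returns bindings if the target namespace was created with statement identifiers (RDR) enabled, which is the recurring pitfall in the messages below.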
From: Alice E. <ali...@ya...> - 2014-11-01 04:10:02
To clarify in a readable manner, here is the problem:

INSERT:

root$ curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql
<?xml version="1.0"?><data modified="4" milliseconds="511"/>

Query type: RDR form:

root$ curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr'
<?xml version='1.0' encoding='UTF-8'?>
<sparql xmlns='http://www.w3.org/2005/sparql-results#'>
<head>
<variable name='s'/>
<variable name='p'/>
<variable name='o'/>
<variable name='-sid-1'/>
<variable name='p1'/>
<variable name='o1'/>
</head>
<results>
</results>
</sparql>

Query type: Standard SPARQL form:

root:~/bigdataAnt/bigdata$ curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {?s ?p1 ?o1 }' -H 'Accept:application/rdf+xml'
<?xml version='1.0' encoding='UTF-8'?>
<sparql xmlns='http://www.w3.org/2005/sparql-results#'>
<head>
<variable name='s'/>
<variable name='p1'/>
<variable name='o1'/>
</head>
<results>
<result>
<binding name='s'>
<bnode>s52861833903943744104261879663595646787406030749160297564454518788</bnode>
</binding>
<binding name='p1'>
<uri>http://example/ns#saidBy</uri>
</binding>
<binding name='o1'>
<literal>l</literal>
</binding>
</result>
</results>
</sparql>

This means the data is indeed getting inserted, but I am not able to query it in RDR form, although the query works perfectly well as a normal SPARQL query. Probably I am missing something. A little help with this would help me big time.

Here, the file SmallYagoFacts.ttl is:

@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix dc: <http://purl.org/dc/elements/1.1/> .
@prefix : <http://example/ns#> .

_:c rdf:subject <http://example.org/book/book11> .
_:c rdf:predicate dc:title1 .
_:c rdf:object "j" .
_:c :saidBy "l" .
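Alice's file above expresses the annotated statement with standard RDF reification (rdf:subject / rdf:predicate / rdf:object on a blank node). For comparison, here is a hedged sketch of the same assertion in the inline RDR shorthand that the RDR query form targets (assuming bigdata's extended Turtle syntax for RDR data; prefixes as in the file above):

```turtle
@prefix dc: <http://purl.org/dc/elements/1.1/> .
@prefix : <http://example/ns#> .

# The reified statement and its annotation, written inline in RDR form:
<< <http://example.org/book/book11> dc:title1 "j" >> :saidBy "l" .
```

A `SELECT * { <<?s ?p ?o>> ?p1 ?o1 }` query matches statements stored in this inline form, which is why loading plain reification quads into a namespace without statement identifiers enabled yields an empty RDR result set.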
On Saturday, 1 November 2014 9:17 AM, Alice Everett <ali...@ya...> wrote:

That's ok. I need to give a presentation on Monday, so probably you can help on Sunday. Actually, I don't have an issue with the framework; I am just not getting how to use it to insert data in RDR mode using curl. Perhaps a little example from you can help me with this big time.

On Saturday, 1 November 2014 1:35 AM, Bryan Thompson <br...@sy...> wrote:

Alice,

I am in meetings with a customer today. I could look at this next week. FYI, from the project forum page:

If we can not easily recreate the issue then it will not receive any priority under open source support. It is up to you to make the issue as easy to recreate as possible. You can file a ticket and (preferably) create a unit test for the problem. You may use this forum to request help. If you have a bug or a feature request, please log an issue on the tracker [1] and include a unit test which demonstrates the bug. Please follow the instructions [2] when submitting a bug report. If you are interested in services for custom feature development, integration, architecture, or support, please contact the project leads directly.

[1] http://trac.bigdata.com/
[2] http://wiki.bigdata.com/wiki/index.php/Submitting_Bugs

Thanks,
Bryan

----
Bryan Thompson
Chief Scientist & Founder
SYSTAP, LLC
4501 Tower Road
Greensboro, NC 27410
br...@sy...
http://bigdata.com
http://mapgraph.io

CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments.
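Before loading, the target namespace has to exist and be configured for RDR. A hedged sketch of creating one through the multi-tenancy REST API follows (endpoint paths per the NanoSparqlServer wiki; the host is a placeholder, and the exact property keys and accepted Content-Type should be checked against your bigdata version):

```sh
#!/bin/sh
# Placeholder host -- adjust to your deployment.
BASE="http://192.168.145.1:9999/bigdata"

# Java-properties file for a new namespace with statement identifiers (RDR) on.
cat > rdr.properties <<'EOF'
com.bigdata.rdf.sail.namespace=reificationRDR
com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers=true
EOF
echo "wrote rdr.properties"

# Create the namespace, then load into its endpoint:
#
#   curl -X POST -H 'Content-Type: text/plain' --data-binary @rdr.properties "${BASE}/namespace"
#   curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' \
#        "${BASE}/namespace/reificationRDR/sparql"
```

Note also that in curl, `@file` is only read from disk when it follows a data option such as `--data-binary`; a bare `@tmp.xml` argument is treated as another URL to fetch, which is what produces the "Couldn't resolve host 'tmp.xml'" errors seen later in this thread.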
On Fri, Oct 31, 2014 at 3:40 PM, Alice Everett <ali...@ya...> wrote:

> Dear Bryan,
>
> I'll be thankful if you can help me with this a bit. Actually, I need to give
> a small presentation in my company regarding how frameworks like Bigdata can
> help us. It would be great if I could accompany the presentation with a small
> demo.
>
> Cheers,
> Alice
>
> On Friday, 31 October 2014 7:55 PM, Alice Everett <ali...@ya...> wrote:
>
> Thanks for the reply, Rose, but I already tried it. Although the loading
> works perfectly fine, the database does not contain any data:
>
> root:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml'
> <?xml version='1.0' encoding='UTF-8'?>
> <sparql xmlns='http://www.w3.org/2005/sparql-results#'>
> <head>
> <variable name='s'/>
> <variable name='p'/>
> <variable name='o'/>
> <variable name='-sid-1'/>
> <variable name='p1'/>
> <variable name='o1'/>
> </head>
> <results>
> </results>
> </sparql>
>
> I loaded the following file into the reificationRDR namespace:
>
> @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
> @prefix dc: <http://purl.org/dc/elements/1.1/> .
> @prefix : <http://example/ns#> .
>
> _:c rdf:subject <http://example.org/book/book11> .
> _:c rdf:predicate dc:title1 .
> _:c rdf:object "a" .
> _:c :saidBy "b" .
>
> But the output does not show any result. I don't know where I am going wrong;
> perhaps the BigData developers can help with this. I am waiting for their
> response.
>
> On Friday, 31 October 2014 7:51 PM, Rose Beck <ros...@gm...> wrote:
>
> I tried without tmp.xml and the loading worked perfectly fine for me:
>
> curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' http://194.668.5.1:9999/bigdata/namespace/reificationRDR/sparql
>
> On Fri, Oct 31, 2014 at 6:20 PM, Alice Everett <ali...@ya...> wrote:
>> Thanks Rose. But I don't think so,
as it works perfectly with google.com >> >> root:~/bigdataAnt$ curl -v google.com >> * About to connect() to google.com port 80 (#0) >> * Trying 74.125.236.68... connected >>> GET / HTTP/1.1 >>> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 >>> zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >>> Host: google.com >>> Accept: */* >>> >> < HTTP/1.1 302 Found >> < Cache-Control: private >> < Content-Type: text/html; charset=UTF-8 >> < Location: http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw >> < Content-Length: 261 >> < Date: Fri, 31 Oct 2014 12:49:49 GMT >> < Server: GFE/2.0 >> < Alternate-Protocol: 80:quic,p=0.01 >> < >> <HTML><HEAD><meta http-equiv="content-type" >> content="text/html;charset=utf-8"> >> <TITLE>302 Moved</TITLE></HEAD><BODY> >> <H1>302 Moved</H1> >> The document has moved >> <A >> HREF="http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw">here</A>. >> </BODY></HTML> >> * Connection #0 to host google.com left intact >> * Closing connection #0 >> >> >> >> On Friday, 31 October 2014 6:19 PM, Rose Beck <ros...@gm...> wrote: >> >> >> I think its a dns error..can you try doing; >> >> curl -v google.com >> >> >> On Fri, Oct 31, 2014 at 6:02 PM, Bryan Thompson <br...@sy...> wrote: >>> If you use POST with a URL of the resource to be loaded (see the NSS wiki >>> page) then the URL must be accessible by bigdata. If you are using the >>> form >>> of POST that sends the data in the http request body (which is the case >>> here), then it only needs to be visible to the client making the request. >>> >>> Thanks, >>> Bryan >>> >>> ---- >>> Bryan Thompson >>> Chief Scientist & Founder >>> SYSTAP, LLC >>> 4501 Tower Road >>> Greensboro, NC 27410 >>> br...@sy... >>> http://bigdata.com >>> http://mapgraph.io >>> >>> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >>> for >>> the sole use of the intended recipient(s) and are confidential or >>> proprietary to SYSTAP. 
Any unauthorized review, use, disclosure, >>> dissemination or copying of this email or its contents or attachments is >>> prohibited. If you have received this communication in error, please >>> notify >>> the sender by reply email and permanently delete all copies of the email >>> and >>> its contents and attachments. >>> >>> >>> On Fri, Oct 31, 2014 at 8:30 AM, Alice Everett <ali...@ya...> >>> wrote: >>>> >>>> Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is >>>> not >>>> helping. >>>> >>>> >>>> On Friday, 31 October 2014 5:57 PM, Jennifer >>>> <jen...@re...> wrote: >>>> >>>> >>>> I think she is missing as to where tmp.xml should be kept within her >>>> bigdata/Ant folder as I think bigdata is not able to find tmp.xml. >>>> >>>> Alice I think you should keep tmp.xml within the bigdata folder which you >>>> downloaded. >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> From: Alice Everett <ali...@ya...> >>>> Sent: Fri, 31 Oct 2014 17:47:26 >>>> To: Bryan Thompson <br...@sy...> >>>> Cc: "big...@li..." >>>> <big...@li...> >>>> Subject: Re: [Bigdata-developers] How to use RDR with Curl >>>> Ok. Thanks a ton. But still I am a little lost. I used two methods of >>>> inserting as explained below. My namespace's name is reificationRDR. >>>> I'll be very grateful if you can help me with this a bit. >>>> >>>> Insert Method1: >>>> root:~/bigdataAnt$ curl -v -X POST --data-binary >>>> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml >>>> http://192.168.145.1:9999/bigdata/sparql >>>> output: >>>> * getaddrinfo(3) failed for tmp.xml:80 >>>> * Couldn't resolve host 'tmp.xml' >>>> * Closing connection #0 >>>> curl: (6) Couldn't resolve host 'tmp.xml' >>>> * About to connect() to 192.168.145.1 port 9999 (#0) >>>> * Trying 192.168.145.1... 
connected >>>> > POST /bigdata/sparql HTTP/1.1 >>>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >>>> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >>>> > Host: 192.168.145.1:9999 >>>> > Accept: */* >>>> > Content-Length: 52 >>>> > Content-Type: application/x-www-form-urlencoded >>>> > >>>> * upload completely sent off: 52out of 52 bytes >>>> < HTTP/1.1 200 OK >>>> < Content-Type: application/xml; charset=ISO-8859-1 >>>> < Transfer-Encoding: chunked >>>> < Server: Jetty(9.1.4.v20140401) >>>> < >>>> * Connection #0 to host 192.168.145.1 left intact >>>> * Closing connection #0 >>>> >>>> >>>> Insert Method 2: >>>> root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary >>>> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' >>>> @/home/bigdataAnt/tmp.xml >>>> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >>>> * getaddrinfo(3) failed for :80 >>>> output >>>> * Couldn't resolve host '' >>>> * Closing connection #0 >>>> curl: (6) Couldn't resolve host '' >>>> * About to connect() to 192.168.145.1 port 9999 (#0) >>>> * Trying 192.168.145.1... 
connected >>>> > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 >>>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >>>> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >>>> > Host: 192.168.145.1:9999 >>>> > Accept: */* >>>> > Content-Length: 52 >>>> > Content-Type: application/x-www-form-urlencoded >>>> > >>>> * upload completely sent off: 52out of 52 bytes >>>> < HTTP/1.1 500 Server Error >>>> < Content-Type: text/plain >>>> < Transfer-Encoding: chunked >>>> < Server: Jetty(9.1.4.v20140401) >>>> < >>>> uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] >>>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not >>>> found: namespace=reificationRDR >>>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) >>>> at >>>> com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) >>>> at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) >>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) >>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) >>>> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) >>>> at >>>> >>>> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) >>>> at >>>> >>>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) >>>> at >>>> >>>> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) >>>> at >>>> >>>> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) >>>> at >>>> >>>> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) >>>> at >>>> 
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) >>>> at >>>> >>>> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) >>>> at >>>> >>>> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) >>>> at >>>> >>>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) >>>> at >>>> >>>> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) >>>> at >>>> >>>> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) >>>> at >>>> >>>> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) >>>> at org.eclipse.jetty.server.Server.handle(Server.java:462) >>>> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) >>>> at >>>> >>>> org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) >>>> at >>>> >>>> org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) >>>> at >>>> >>>> org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) >>>> at >>>> >>>> org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) >>>> at java.lang.Thread.run(Thread.java:745) >>>> Caused by: java.lang.RuntimeException: Not found: >>>> namespace=reificationRDR >>>> at >>>> >>>> com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) >>>> at >>>> >>>> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>>> at >>>> >>>> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>>> at >>>> >>>> 
com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) >>>> ... 26 more >>>> * Connection #0 to host 192.168.145.1 left intact >>>> * Closing connection #0 >>>> >>>> >>>> Query: >>>> curl -X POST >>>> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >>>> --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H >>>> 'Accept:application/rdf+xml' >>>> >>>> tmp.xml: >>>> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >>>> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >>>> <properties> >>>> <!-- --> >>>> <!-- NEW KB NAMESPACE (required). --> >>>> <!-- --> >>>> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> >>>> <!-- --> >>>> <!-- Specify any KB specific properties here to override defaults for the >>>> BigdataSail --> >>>> <!-- AbstractTripleStore, or indices in the namespace of the new KB >>>> instance. --> >>>> <!-- --> >>>> <entry >>>> >>>> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >>>> </properties> >>>> >>>> >>>> >>>> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> >>>> wrote: >>>> >>>> >>>> What is the namespace for the RDR graph? >>>> >>>> The URL you need to be using is >>>> >>>> http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql >>>> >>>> How to address a specific namespace is explicitly covered if you read the >>>> wiki section on the multitenant interface that I linked in my previous >>>> response. >>>> >>>> Thanks, >>>> Bryan >>>> >>>> On Friday, October 31, 2014, Alice Everett <ali...@ya...');" >>>> class="" style="" target=>ali...@ya...> wrote: >>>> >>>> Thanks a lot for the help. 
>>>> >>>> But I dont know where I am still going wrong: >>>> I inserted data using: curl -v -X POST --data-binary >>>> 'uri=file:///home/reifiedTriples.ttl' @tmp.xml >>>> http://192.168.145.1:9999/bigdata/sparql >>>> And then queried it using: curl -X POST >>>> http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml >>>> 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' >>>> curl: (6) Couldn't resolve host 'query=SELECT * <<' >>>> Content-Type not recognized as RDF: application/x-www-form-urlencoded >>>> >>>> >>>> On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> >>>> wrote: >>>> >>>> >>>> Alice, >>>> >>>> The workbench choice of the "in use" namespace is recorded in java script >>>> in your browser client. That choice does not effect other workbench >>>> clients >>>> and does not effect the behavior of the various endpoints when using >>>> command >>>> line tools to query or update data in the database. Thus your command >>>> line >>>> requests are being made against a namespace that is not configured for >>>> RDR >>>> support. >>>> >>>> If you want to address a non-default bigdata namespace using curl or >>>> wget, >>>> you must use the appropriate URL for that namespace. This is all >>>> described >>>> on wiki.bigdata.com on the page for the nanoSparqlServer in the section >>>> on >>>> multi-tenancy. >>>> >>>> See >>>> http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API >>>> >>>> Thanks, >>>> Bryan >>>> >>>> On Thursday, October 30, 2014, Alice Everett <ali...@ya...> >>>> wrote: >>>> >>>> I found out an awesome feature in Bigdata called RDR and I am trying to >>>> explore that too. Can you please let me know as to where am I going wrong >>>> while querying RDR data (http://trac.bigdata.com/ticket/815). 
(My sample >>>> RDF >>>> data, contains reification in its standard form: >>>> http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >>>> Loading: >>>> curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' >>>> http://192.168.145.1:9999/bigdata/sparql >>>> (Additionally I changed my current namespace within the workbench opened >>>> in my browser to RDR mode). >>>> >>>> After this I fired the following query and got the following error (Can >>>> you please correct me as to where am I going wrong. I'll be very grateful >>>> to >>>> you for the same): >>>> @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST >>>> http://192.168.145.1:9999/bigdata/sparql --header >>>> "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p >>>> ?o>> >>>> ?p1 ?o1 }' -H 'Accept:application/rdr' >>>> >>>> SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >>>> java.util.concurrent.ExecutionException: >>>> org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: >>>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >>>> java.util.concurrent.ExecutionException: java.lang.Exception: >>>> >>>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>>> cause=java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: >>>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>>> at >>>> >>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >>>> at >>>> >>>> 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >>>> at java.lang.Thread.run(Thread.java:745) >>>> Caused by: org.openrdf.query.QueryEvaluationException: >>>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>>> java.lang.Exception: >>>> >>>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>>> cause=java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: >>>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at >>>> >>>> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >>>> at >>>> >>>> org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >>>> at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >>>> at >>>> >>>> org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >>>> at >>>> >>>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >>>> at >>>> >>>> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>>> at >>>> >>>> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>>> ... 
6 more >>>> Caused by: java.lang.RuntimeException: >>>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >>>> java.util.concurrent.ExecutionException: java.lang.Exception: >>>> >>>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>>> cause=java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: >>>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at >>>> >>>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >>>> at >>>> >>>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >>>> at >>>> >>>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >>>> at >>>> >>>> com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >>>> at >>>> >>>> com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >>>> at >>>> >>>> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >>>> ... 15 more >>>> Caused by: java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>>> java.lang.Exception: >>>> >>>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>>> cause=java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: >>>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>>> at >>>> >>>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >>>> ... 
20 more >>>> Caused by: java.lang.RuntimeException: >>>> java.util.concurrent.ExecutionException: java.lang.Exception: >>>> >>>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>>> cause=java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: >>>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at >>>> >>>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >>>> at >>>> >>>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >>>> at >>>> >>>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >>>> at >>>> >>>> com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >>>> at >>>> >>>> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >>>> at >>>> >>>> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >>>> >>>> ... 4 more >>>> Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: >>>> >>>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>>> cause=java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: >>>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >>>> at >>>> >>>> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >>>> at >>>> >>>> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >>>> at >>>> >>>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >>>> ... 
9 more >>>> Caused by: java.lang.Exception: >>>> >>>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>>> cause=java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: >>>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at >>>> >>>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >>>> at >>>> >>>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >>>> at >>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>>> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>>> at >>>> >>>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >>>> ... 3 more >>>> Caused by: java.util.concurrent.ExecutionException: >>>> java.lang.RuntimeException: java.lang.RuntimeException: >>>> java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>>> at >>>> >>>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >>>> ... 8 more >>>> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: >>>> java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >>>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>>> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>>> at >>>> >>>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >>>> ... 
8 more >>>> Caused by: java.lang.RuntimeException: >>>> java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at >>>> >>>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >>>> at >>>> >>>> com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >>>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >>>> ... 12 more >>>> Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >>>> at >>>> >>>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >>>> at >>>> >>>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >>>> ... 14 more >>>> >>>> >>>> >>>> -- >>>> ---- >>>> Bryan Thompson >>>> Chief Scientist & Founder >>>> SYSTAP, LLC >>>> 4501 Tower Road >>>> Greensboro, NC 27410 >>>> br...@sy... >>>> http://bigdata.com >>>> http://mapgraph.io >>>> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >>>> for the sole use of the intended recipient(s) and are confidential or >>>> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >>>> dissemination or copying of this email or its contents or attachments is >>>> prohibited. If you have received this communication in error, please >>>> notify >>>> the sender by reply email and permanently delete all copies of the email >>>> and >>>> its contents and attachments. >>>> >>>> >>>> >>>> >>>> I dont know why am I getting an error when I am querying using RDR. Can >>>> you please help me with this one last time. >>>> >>>> My tmp.xml file is: >>>> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >>>> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >>>> <properties> >>>> <!-- --> >>>> <!-- NEW KB NAMESPACE (required). 
--> >>>> <!-- --> >>>> <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> >>>> <!-- --> >>>> <!-- Specify any KB specific properties here to override defaults for the >>>> BigdataSail --> >>>> <!-- AbstractTripleStore, or indices in the namespace of the new KB >>>> instance. --> >>>> <!-- --> >>>> <entry >>>> >>>> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >>>> </properties> >>>> >>>> >>>> >>>> -- >>>> ---- >>>> Bryan Thompson >>>> Chief Scientist & Founder >>>> SYSTAP, LLC >>>> 4501 Tower Road >>>> Greensboro, NC 27410 >>>> br...@sy... >>>> http://bigdata.com >>>> http://mapgraph.io >>>> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >>>> for the sole use of the intended recipient(s) and are confidential or >>>> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >>>> dissemination or copying of this email or its contents or attachments is >>>> prohibited. If you have received this communication in error, please >>>> notify >>>> the sender by reply email and permanently delete all copies of the email >>>> and >>>> its contents and attachments. >>>> >>>> >>>> >>>> >>>> >>>> ------------------------------------------------------------------------------ >>>> _______________________________________________ >>>> Bigdata-developers mailing list >>>> Big...@li... >>>> https://lists.sourceforge.net/lists/listinfo/bigdata-developers >>>> >>>> Get your own FREE website, FREE domain & FREE mobile app with Company >>>> email. >>>> Know More > >>>> >>>> >>> >>> >>> >>> ------------------------------------------------------------------------------ >>> >>> _______________________________________________ >>> Bigdata-developers mailing list >>> Big...@li... >>> https://lists.sourceforge.net/lists/listinfo/bigdata-developers >>> >> >> > > > > > |
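Two problems recur in the threads of this archive: queries are sent to the default endpoint (`/bigdata/sparql`) while the RDR-enabled KB lives under a separate namespace, and shell quoting mangles the `query=` form parameter. As a rough sketch (the host, port, and namespace names are the example values used in these messages, and a running Bigdata server is still required to actually issue the requests), the multi-tenancy endpoint URL and the form-encoded body that `curl --data-urlencode 'query=...'` should produce can be built explicitly with only the Python standard library:

```python
from urllib.parse import quote, urlencode

BASE = "http://192.168.145.1:9999/bigdata"  # example address used in this thread

def namespace_sparql_url(namespace):
    # Multi-tenancy endpoint for a named KB; the bare BASE + "/sparql"
    # endpoint addresses only the default namespace.
    return "%s/namespace/%s/sparql" % (BASE, quote(namespace))

def sparql_form_body(query):
    # Percent-encode the query the way curl's --data-urlencode 'query=...'
    # does, so RDR operators such as << and >> survive in the POST body.
    return urlencode({"query": query})

# The RDR namespace and SPARQL* query used in the thread above:
print(namespace_sparql_url("RDRMode"))
print(sparql_form_body("SELECT * {<<?s ?p ?o>> ?p1 ?o1 }"))
```

POSTing the body returned by `sparql_form_body(...)` with Content-Type `application/x-www-form-urlencoded` to the URL returned by `namespace_sparql_url(...)` mirrors the namespace-addressed curl invocation Bryan describes later in the archive.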
From: Alice E. <ali...@ya...> - 2014-10-31 14:25:38
|
Thanks for the reply Rose but I already tried it..although the loading works perfectly fine yet the database does not contain any data: root:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml' <?xml version='1.0' encoding='UTF-8'?> <sparql xmlns='http://www.w3.org/2005/sparql-results#'> <head> <variable name='s'/> <variable name='p'/> <variable name='o'/> <variable name='-sid-1'/> <variable name='p1'/> <variable name='o1'/> </head> <results> </results> </sparql> I loaded the following file using in reificationRDR namespace: @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> . @prefix dc: <http://purl.org/dc/elements/1.1/> . @prefix : <http://example/ns#> . _:c rdf:subject <http://example.org/book/book11> . _:c rdf:predicate dc:title1 . _:c rdf:object "a" . _:c :saidBy "b" . But in the output it does not show any result. I dont know where am I going wrong perhaps BigData developers can help with this. I am waiting for their response. On Friday, 31 October 2014 7:51 PM, Rose Beck <ros...@gm...> wrote: I tried without tmp.xml and the loading worked perfectly fine with me: curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' http://194.668.5.1:9999/bigdata/namespace/reificationRDR/sparql On Fri, Oct 31, 2014 at 6:20 PM, Alice Everett <ali...@ya...> wrote: > Thanks Rose. But I dont think so.. as it works perfectly with google.com > > root:~/bigdataAnt$ curl -v google.com > * About to connect() to google.com port 80 (#0) > * Trying 74.125.236.68... 
connected >> GET / HTTP/1.1 >> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 >> zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >> Host: google.com >> Accept: */* >> > < HTTP/1.1 302 Found > < Cache-Control: private > < Content-Type: text/html; charset=UTF-8 > < Location: http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw > < Content-Length: 261 > < Date: Fri, 31 Oct 2014 12:49:49 GMT > < Server: GFE/2.0 > < Alternate-Protocol: 80:quic,p=0.01 > < > <HTML><HEAD><meta http-equiv="content-type" > content="text/html;charset=utf-8"> > <TITLE>302 Moved</TITLE></HEAD><BODY> > <H1>302 Moved</H1> > The document has moved > <A > HREF="http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw">here</A>. > </BODY></HTML> > * Connection #0 to host google.com left intact > * Closing connection #0 > > > > On Friday, 31 October 2014 6:19 PM, Rose Beck <ros...@gm...> wrote: > > > I think its a dns error..can you try doing; > > curl -v google.com > > > On Fri, Oct 31, 2014 at 6:02 PM, Bryan Thompson <br...@sy...> wrote: >> If you use POST with a URL of the resource to be loaded (see the NSS wiki >> page) then the URL must be accessible by bigdata. If you are using the >> form >> of POST that sends the data in the http request body (which is the case >> here), then it only needs to be visible to the client making the request. >> >> Thanks, >> Bryan >> >> ---- >> Bryan Thompson >> Chief Scientist & Founder >> SYSTAP, LLC >> 4501 Tower Road >> Greensboro, NC 27410 >> br...@sy... >> http://bigdata.com >> http://mapgraph.io >> >> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >> for >> the sole use of the intended recipient(s) and are confidential or >> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >> dissemination or copying of this email or its contents or attachments is >> prohibited. 
If you have received this communication in error, please >> notify >> the sender by reply email and permanently delete all copies of the email >> and >> its contents and attachments. >> >> >> On Fri, Oct 31, 2014 at 8:30 AM, Alice Everett <ali...@ya...> >> wrote: >>> >>> Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is >>> not >>> helping. >>> >>> >>> On Friday, 31 October 2014 5:57 PM, Jennifer >>> <jen...@re...> wrote: >>> >>> >>> I think she is missing as to where tmp.xml should be kept within her >>> bigdata/Ant folder as I think bigdata is not able to find tmp.xml. >>> >>> Alice I think you should keep tmp.xml within the bigdata folder which you >>> downloaded. >>> >>> >>> >>> >>> >>> >>> >>> From: Alice Everett <ali...@ya...> >>> Sent: Fri, 31 Oct 2014 17:47:26 >>> To: Bryan Thompson <br...@sy...> >>> Cc: "big...@li..." >>> <big...@li...> >>> Subject: Re: [Bigdata-developers] How to use RDR with Curl >>> Ok. Thanks a ton. But still I am a little lost. I used two methods of >>> inserting as explained below. My namespace's name is reificationRDR. >>> I'll be very grateful if you can help me with this a bit. >>> >>> Insert Method1: >>> root:~/bigdataAnt$ curl -v -X POST --data-binary >>> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml >>> http://192.168.145.1:9999/bigdata/sparql >>> output: >>> * getaddrinfo(3) failed for tmp.xml:80 >>> * Couldn't resolve host 'tmp.xml' >>> * Closing connection #0 >>> curl: (6) Couldn't resolve host 'tmp.xml' >>> * About to connect() to 192.168.145.1 port 9999 (#0) >>> * Trying 192.168.145.1... 
connected >>> > POST /bigdata/sparql HTTP/1.1 >>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >>> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >>> > Host: 192.168.145.1:9999 >>> > Accept: */* >>> > Content-Length: 52 >>> > Content-Type: application/x-www-form-urlencoded >>> > >>> * upload completely sent off: 52out of 52 bytes >>> < HTTP/1.1 200 OK >>> < Content-Type: application/xml; charset=ISO-8859-1 >>> < Transfer-Encoding: chunked >>> < Server: Jetty(9.1.4.v20140401) >>> < >>> * Connection #0 to host 192.168.145.1 left intact >>> * Closing connection #0 >>> >>> >>> Insert Method 2: >>> root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary >>> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' >>> @/home/bigdataAnt/tmp.xml >>> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >>> * getaddrinfo(3) failed for :80 >>> output >>> * Couldn't resolve host '' >>> * Closing connection #0 >>> curl: (6) Couldn't resolve host '' >>> * About to connect() to 192.168.145.1 port 9999 (#0) >>> * Trying 192.168.145.1... 
connected >>> > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 >>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >>> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >>> > Host: 192.168.145.1:9999 >>> > Accept: */* >>> > Content-Length: 52 >>> > Content-Type: application/x-www-form-urlencoded >>> > >>> * upload completely sent off: 52out of 52 bytes >>> < HTTP/1.1 500 Server Error >>> < Content-Type: text/plain >>> < Transfer-Encoding: chunked >>> < Server: Jetty(9.1.4.v20140401) >>> < >>> uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] >>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not >>> found: namespace=reificationRDR >>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>> at >>> >>> com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) >>> at >>> com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) >>> at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) >>> at >>> >>> com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) >>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) >>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) >>> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) >>> at >>> >>> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) >>> at >>> >>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) >>> at >>> >>> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) >>> at >>> >>> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) >>> at >>> >>> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) >>> at >>> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) >>> at >>> >>> 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) >>> at >>> >>> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) >>> at >>> >>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) >>> at >>> >>> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) >>> at >>> >>> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) >>> at >>> >>> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) >>> at org.eclipse.jetty.server.Server.handle(Server.java:462) >>> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) >>> at >>> >>> org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) >>> at >>> >>> org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) >>> at >>> >>> org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) >>> at >>> >>> org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) >>> at java.lang.Thread.run(Thread.java:745) >>> Caused by: java.lang.RuntimeException: Not found: >>> namespace=reificationRDR >>> at >>> >>> com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) >>> at >>> >>> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) >>> at >>> >>> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) >>> at >>> >>> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at >>> >>> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) >>> ... 
26 more >>> * Connection #0 to host 192.168.145.1 left intact >>> * Closing connection #0 >>> >>> >>> Query: >>> curl -X POST >>> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >>> --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H >>> 'Accept:application/rdf+xml' >>> >>> tmp.xml: >>> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >>> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >>> <properties> >>> <!-- --> >>> <!-- NEW KB NAMESPACE (required). --> >>> <!-- --> >>> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> >>> <!-- --> >>> <!-- Specify any KB specific properties here to override defaults for the >>> BigdataSail --> >>> <!-- AbstractTripleStore, or indices in the namespace of the new KB >>> instance. --> >>> <!-- --> >>> <entry >>> >>> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >>> </properties> >>> >>> >>> >>> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> >>> wrote: >>> >>> >>> What is the namespace for the RDR graph? >>> >>> The URL you need to be using is >>> >>> http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql >>> >>> How to address a specific namespace is explicitly covered if you read the >>> wiki section on the multitenant interface that I linked in my previous >>> response. >>> >>> Thanks, >>> Bryan >>> >>> On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: >>> >>> Thanks a lot for the help. 
>>> >>> But I dont know where I am still going wrong: >>> I inserted data using: curl -v -X POST --data-binary >>> 'uri=file:///home/reifiedTriples.ttl' @tmp.xml >>> http://192.168.145.1:9999/bigdata/sparql >>> And then queried it using: curl -X POST >>> http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml >>> 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' >>> curl: (6) Couldn't resolve host 'query=SELECT * <<' >>> Content-Type not recognized as RDF: application/x-www-form-urlencoded >>> >>> >>> On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> >>> wrote: >>> >>> >>> Alice, >>> >>> The workbench choice of the "in use" namespace is recorded in java script >>> in your browser client. That choice does not effect other workbench >>> clients >>> and does not effect the behavior of the various endpoints when using >>> command >>> line tools to query or update data in the database. Thus your command >>> line >>> requests are being made against a namespace that is not configured for >>> RDR >>> support. >>> >>> If you want to address a non-default bigdata namespace using curl or >>> wget, >>> you must use the appropriate URL for that namespace. This is all >>> described >>> on wiki.bigdata.com on the page for the nanoSparqlServer in the section >>> on >>> multi-tenancy. >>> >>> See >>> http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API >>> >>> Thanks, >>> Bryan >>> >>> On Thursday, October 30, 2014, Alice Everett <ali...@ya...> >>> wrote: >>> >>> I found out an awesome feature in Bigdata called RDR and I am trying to >>> explore that too. Can you please let me know as to where am I going wrong >>> while querying RDR data (http://trac.bigdata.com/ticket/815). 
(My sample >>> RDF >>> data, contains reification in its standard form: >>> http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >>> Loading: >>> curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' >>> http://192.168.145.1:9999/bigdata/sparql >>> (Additionally I changed my current namespace within the workbench opened >>> in my browser to RDR mode). >>> >>> After this I fired the following query and got the following error (Can >>> you please correct me as to where am I going wrong. I'll be very grateful >>> to >>> you for the same): >>> @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST >>> http://192.168.145.1:9999/bigdata/sparql --header >>> "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p >>> ?o>> >>> ?p1 ?o1 }' -H 'Accept:application/rdr' >>> >>> SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >>> java.util.concurrent.ExecutionException: >>> org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at >>> >>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >>> at >>> >>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >>> at 
java.lang.Thread.run(Thread.java:745) >>> Caused by: org.openrdf.query.QueryEvaluationException: >>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>> java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >>> at >>> >>> org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >>> at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >>> at >>> >>> org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >>> at >>> >>> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at >>> >>> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>> ... 
6 more >>> Caused by: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >>> at >>> >>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >>> at >>> >>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >>> at >>> >>> com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >>> at >>> >>> com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >>> at >>> >>> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >>> ... 15 more >>> Caused by: java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>> java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>> at >>> >>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >>> ... 
20 more >>> Caused by: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >>> at >>> >>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >>> at >>> >>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >>> at >>> >>> com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >>> at >>> >>> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >>> at >>> >>> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >>> >>> ... 4 more >>> Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >>> at >>> >>> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >>> at >>> >>> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >>> at >>> >>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >>> ... 
9 more >>> Caused by: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >>> at >>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >>> ... 3 more >>> Caused by: java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: java.lang.RuntimeException: >>> java.lang.ArrayIndexOutOfBoundsException: 0 >>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >>> ... 8 more >>> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: >>> java.lang.ArrayIndexOutOfBoundsException: 0 >>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >>> ... 
8 more >>> Caused by: java.lang.RuntimeException: >>> java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >>> at >>> >>> com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >>> ... 12 more >>> Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >>> at >>> >>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >>> ... 14 more >>> >>> >>> >>> -- >>> ---- >>> Bryan Thompson >>> Chief Scientist & Founder >>> SYSTAP, LLC >>> 4501 Tower Road >>> Greensboro, NC 27410 >>> br...@sy... >>> http://bigdata.com >>> http://mapgraph.io >>> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >>> for the sole use of the intended recipient(s) and are confidential or >>> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >>> dissemination or copying of this email or its contents or attachments is >>> prohibited. If you have received this communication in error, please >>> notify >>> the sender by reply email and permanently delete all copies of the email >>> and >>> its contents and attachments. >>> >>> >>> >>> >>> I dont know why am I getting an error when I am querying using RDR. Can >>> you please help me with this one last time. >>> >>> My tmp.xml file is: >>> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >>> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >>> <properties> >>> <!-- --> >>> <!-- NEW KB NAMESPACE (required). 
--> >>> <!-- --> >>> <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> >>> <!-- --> >>> <!-- Specify any KB specific properties here to override defaults for the >>> BigdataSail --> >>> <!-- AbstractTripleStore, or indices in the namespace of the new KB >>> instance. --> >>> <!-- --> >>> <entry >>> >>> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >>> </properties> >>> >>> >>> >>> -- >>> ---- >>> Bryan Thompson >>> Chief Scientist & Founder >>> SYSTAP, LLC >>> 4501 Tower Road >>> Greensboro, NC 27410 >>> br...@sy... >>> http://bigdata.com >>> http://mapgraph.io >>> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >>> for the sole use of the intended recipient(s) and are confidential or >>> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >>> dissemination or copying of this email or its contents or attachments is >>> prohibited. If you have received this communication in error, please >>> notify >>> the sender by reply email and permanently delete all copies of the email >>> and >>> its contents and attachments. >>> >>> >>> >>> >>> >>> ------------------------------------------------------------------------------ >>> _______________________________________________ >>> Bigdata-developers mailing list >>> Big...@li... >>> https://lists.sourceforge.net/lists/listinfo/bigdata-developers >>> >>> >> >> >> >> ------------------------------------------------------------------------------ >> >> _______________________________________________ >> Bigdata-developers mailing list >> Big...@li... >> https://lists.sourceforge.net/lists/listinfo/bigdata-developers >> > > |
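[Archive editor's note] The thread above mixes two distinct REST calls: creating an RDR-enabled namespace, which is where the properties file belongs, and loading data into that namespace. A minimal sketch of both steps follows, assuming the host/port, namespace name, and file path used as examples in this thread, and following the NanoSparqlServer Multi-Tenancy API page that the thread cites:

```shell
# Sketch: create an RDR-enabled namespace via the NanoSparqlServer
# multi-tenancy API, then load Turtle data into it. The host, port,
# namespace name, and file path are placeholder values from this thread.
HOST="http://192.168.145.1:9999"
NS="RDRMode"

# Properties file enabling statement identifiers (RDR support).
cat > tmp.xml <<EOF
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
  <entry key="com.bigdata.rdf.sail.namespace">${NS}</entry>
  <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry>
</properties>
EOF

# Step 1: create the namespace. Note the Content-Type header and the
# @file syntax, which makes curl read the request body from tmp.xml.
CREATE_CMD="curl -X POST -H 'Content-Type: application/xml' --data-binary @tmp.xml ${HOST}/bigdata/namespace"

# Step 2: load data via the namespace's own SPARQL endpoint.
LOAD_CMD="curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' ${HOST}/bigdata/namespace/${NS}/sparql"

echo "$CREATE_CMD"
echo "$LOAD_CMD"
```

With the namespace created first, the `Not found: namespace=...` 500 error quoted later in this thread should not occur, since the load then targets an endpoint that exists.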
From: Rose B. <ros...@gm...> - 2014-10-31 14:21:25
|
I tried without tmp.xml and the loading worked perfectly fine with me: curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' http://194.668.5.1:9999/bigdata/namespace/reificationRDR/sparql On Fri, Oct 31, 2014 at 6:20 PM, Alice Everett <ali...@ya...> wrote: > Thanks Rose. But I dont think so.. as it works perfectly with google.com > > root:~/bigdataAnt$ curl -v google.com > * About to connect() to google.com port 80 (#0) > * Trying 74.125.236.68... connected >> GET / HTTP/1.1 >> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 >> zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >> Host: google.com >> Accept: */* >> > < HTTP/1.1 302 Found > < Cache-Control: private > < Content-Type: text/html; charset=UTF-8 > < Location: http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw > < Content-Length: 261 > < Date: Fri, 31 Oct 2014 12:49:49 GMT > < Server: GFE/2.0 > < Alternate-Protocol: 80:quic,p=0.01 > < > <HTML><HEAD><meta http-equiv="content-type" > content="text/html;charset=utf-8"> > <TITLE>302 Moved</TITLE></HEAD><BODY> > <H1>302 Moved</H1> > The document has moved > <A > HREF="http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw">here</A>. > </BODY></HTML> > * Connection #0 to host google.com left intact > * Closing connection #0 > > > > On Friday, 31 October 2014 6:19 PM, Rose Beck <ros...@gm...> wrote: > > > I think its a dns error..can you try doing; > > curl -v google.com > > > On Fri, Oct 31, 2014 at 6:02 PM, Bryan Thompson <br...@sy...> wrote: >> If you use POST with a URL of the resource to be loaded (see the NSS wiki >> page) then the URL must be accessible by bigdata. If you are using the >> form >> of POST that sends the data in the http request body (which is the case >> here), then it only needs to be visible to the client making the request. >> >> Thanks, >> Bryan >> >> ---- >> Bryan Thompson >> Chief Scientist & Founder >> SYSTAP, LLC >> 4501 Tower Road >> Greensboro, NC 27410 >> br...@sy... 
>> http://bigdata.com >> http://mapgraph.io >> >> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >> for >> the sole use of the intended recipient(s) and are confidential or >> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >> dissemination or copying of this email or its contents or attachments is >> prohibited. If you have received this communication in error, please >> notify >> the sender by reply email and permanently delete all copies of the email >> and >> its contents and attachments. >> >> >> On Fri, Oct 31, 2014 at 8:30 AM, Alice Everett <ali...@ya...> >> wrote: >>> >>> Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is >>> not >>> helping. >>> >>> >>> On Friday, 31 October 2014 5:57 PM, Jennifer >>> <jen...@re...> wrote: >>> >>> >>> I think she is missing as to where tmp.xml should be kept within her >>> bigdata/Ant folder as I think bigdata is not able to find tmp.xml. >>> >>> Alice I think you should keep tmp.xml within the bigdata folder which you >>> downloaded. >>> >>> >>> >>> >>> >>> >>> >>> From: Alice Everett <ali...@ya...> >>> Sent: Fri, 31 Oct 2014 17:47:26 >>> To: Bryan Thompson <br...@sy...> >>> Cc: "big...@li..." >>> <big...@li...> >>> Subject: Re: [Bigdata-developers] How to use RDR with Curl >>> Ok. Thanks a ton. But still I am a little lost. I used two methods of >>> inserting as explained below. My namespace's name is reificationRDR. >>> I'll be very grateful if you can help me with this a bit. >>> >>> Insert Method1: >>> root:~/bigdataAnt$ curl -v -X POST --data-binary >>> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml >>> http://192.168.145.1:9999/bigdata/sparql >>> output: >>> * getaddrinfo(3) failed for tmp.xml:80 >>> * Couldn't resolve host 'tmp.xml' >>> * Closing connection #0 >>> curl: (6) Couldn't resolve host 'tmp.xml' >>> * About to connect() to 192.168.145.1 port 9999 (#0) >>> * Trying 192.168.145.1... 
connected >>> > POST /bigdata/sparql HTTP/1.1 >>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >>> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >>> > Host: 192.168.145.1:9999 >>> > Accept: */* >>> > Content-Length: 52 >>> > Content-Type: application/x-www-form-urlencoded >>> > >>> * upload completely sent off: 52out of 52 bytes >>> < HTTP/1.1 200 OK >>> < Content-Type: application/xml; charset=ISO-8859-1 >>> < Transfer-Encoding: chunked >>> < Server: Jetty(9.1.4.v20140401) >>> < >>> * Connection #0 to host 192.168.145.1 left intact >>> * Closing connection #0 >>> >>> >>> Insert Method 2: >>> root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary >>> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' >>> @/home/bigdataAnt/tmp.xml >>> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >>> * getaddrinfo(3) failed for :80 >>> output >>> * Couldn't resolve host '' >>> * Closing connection #0 >>> curl: (6) Couldn't resolve host '' >>> * About to connect() to 192.168.145.1 port 9999 (#0) >>> * Trying 192.168.145.1... 
connected >>> > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 >>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >>> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >>> > Host: 192.168.145.1:9999 >>> > Accept: */* >>> > Content-Length: 52 >>> > Content-Type: application/x-www-form-urlencoded >>> > >>> * upload completely sent off: 52out of 52 bytes >>> < HTTP/1.1 500 Server Error >>> < Content-Type: text/plain >>> < Transfer-Encoding: chunked >>> < Server: Jetty(9.1.4.v20140401) >>> < >>> uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] >>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not >>> found: namespace=reificationRDR >>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>> at >>> >>> com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) >>> at >>> com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) >>> at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) >>> at >>> >>> com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) >>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) >>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) >>> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) >>> at >>> >>> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) >>> at >>> >>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) >>> at >>> >>> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) >>> at >>> >>> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) >>> at >>> >>> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) >>> at >>> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) >>> at >>> >>> 
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) >>> at >>> >>> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) >>> at >>> >>> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) >>> at >>> >>> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) >>> at >>> >>> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) >>> at >>> >>> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) >>> at org.eclipse.jetty.server.Server.handle(Server.java:462) >>> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) >>> at >>> >>> org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) >>> at >>> >>> org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) >>> at >>> >>> org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) >>> at >>> >>> org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) >>> at java.lang.Thread.run(Thread.java:745) >>> Caused by: java.lang.RuntimeException: Not found: >>> namespace=reificationRDR >>> at >>> >>> com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) >>> at >>> >>> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) >>> at >>> >>> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) >>> at >>> >>> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at >>> >>> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) >>> ... 
26 more >>> * Connection #0 to host 192.168.145.1 left intact >>> * Closing connection #0 >>> >>> >>> Query: >>> curl -X POST >>> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >>> --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H >>> 'Accept:application/rdf+xml' >>> >>> tmp.xml: >>> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >>> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >>> <properties> >>> <!-- --> >>> <!-- NEW KB NAMESPACE (required). --> >>> <!-- --> >>> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> >>> <!-- --> >>> <!-- Specify any KB specific properties here to override defaults for the >>> BigdataSail --> >>> <!-- AbstractTripleStore, or indices in the namespace of the new KB >>> instance. --> >>> <!-- --> >>> <entry >>> >>> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >>> </properties> >>> >>> >>> >>> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> >>> wrote: >>> >>> >>> What is the namespace for the RDR graph? >>> >>> The URL you need to be using is >>> >>> http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql >>> >>> How to address a specific namespace is explicitly covered if you read the >>> wiki section on the multitenant interface that I linked in my previous >>> response. >>> >>> Thanks, >>> Bryan >>> >>> On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: >>> >>> Thanks a lot for the help. 
>>> >>> But I dont know where I am still going wrong: >>> I inserted data using: curl -v -X POST --data-binary >>> 'uri=file:///home/reifiedTriples.ttl' @tmp.xml >>> http://192.168.145.1:9999/bigdata/sparql >>> And then queried it using: curl -X POST >>> http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml >>> 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' >>> curl: (6) Couldn't resolve host 'query=SELECT * <<' >>> Content-Type not recognized as RDF: application/x-www-form-urlencoded >>> >>> >>> On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> >>> wrote: >>> >>> >>> Alice, >>> >>> The workbench choice of the "in use" namespace is recorded in java script >>> in your browser client. That choice does not effect other workbench >>> clients >>> and does not effect the behavior of the various endpoints when using >>> command >>> line tools to query or update data in the database. Thus your command >>> line >>> requests are being made against a namespace that is not configured for >>> RDR >>> support. >>> >>> If you want to address a non-default bigdata namespace using curl or >>> wget, >>> you must use the appropriate URL for that namespace. This is all >>> described >>> on wiki.bigdata.com on the page for the nanoSparqlServer in the section >>> on >>> multi-tenancy. >>> >>> See >>> http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API >>> >>> Thanks, >>> Bryan >>> >>> On Thursday, October 30, 2014, Alice Everett <ali...@ya...> >>> wrote: >>> >>> I found out an awesome feature in Bigdata called RDR and I am trying to >>> explore that too. Can you please let me know as to where am I going wrong >>> while querying RDR data (http://trac.bigdata.com/ticket/815). 
(My sample >>> RDF >>> data, contains reification in its standard form: >>> http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >>> Loading: >>> curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' >>> http://192.168.145.1:9999/bigdata/sparql >>> (Additionally I changed my current namespace within the workbench opened >>> in my browser to RDR mode). >>> >>> After this I fired the following query and got the following error (Can >>> you please correct me as to where am I going wrong. I'll be very grateful >>> to >>> you for the same): >>> @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST >>> http://192.168.145.1:9999/bigdata/sparql --header >>> "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p >>> ?o>> >>> ?p1 ?o1 }' -H 'Accept:application/rdr' >>> >>> SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >>> java.util.concurrent.ExecutionException: >>> org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at >>> >>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >>> at >>> >>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >>> at 
java.lang.Thread.run(Thread.java:745) >>> Caused by: org.openrdf.query.QueryEvaluationException: >>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>> java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >>> at >>> >>> org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >>> at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >>> at >>> >>> org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >>> at >>> >>> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >>> at >>> >>> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at >>> >>> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>> ... 
6 more >>> Caused by: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >>> at >>> >>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >>> at >>> >>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >>> at >>> >>> com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >>> at >>> >>> com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >>> at >>> >>> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >>> ... 15 more >>> Caused by: java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >>> java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>> at >>> >>> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >>> ... 
20 more >>> Caused by: java.lang.RuntimeException: >>> java.util.concurrent.ExecutionException: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >>> at >>> >>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >>> at >>> >>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >>> at >>> >>> com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >>> at >>> >>> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >>> at >>> >>> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >>> >>> ... 4 more >>> Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >>> at >>> >>> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >>> at >>> >>> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >>> at >>> >>> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >>> ... 
9 more >>> Caused by: java.lang.Exception: >>> >>> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >>> cause=java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: >>> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >>> at >>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >>> ... 3 more >>> Caused by: java.util.concurrent.ExecutionException: >>> java.lang.RuntimeException: java.lang.RuntimeException: >>> java.lang.ArrayIndexOutOfBoundsException: 0 >>> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >>> ... 8 more >>> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: >>> java.lang.ArrayIndexOutOfBoundsException: 0 >>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >>> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>> at >>> >>> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >>> ... 
8 more >>> Caused by: java.lang.RuntimeException: >>> java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >>> at >>> >>> com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >>> ... 12 more >>> Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >>> at >>> >>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >>> at >>> >>> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >>> ... 14 more >>> >>> >>> >>> -- >>> ---- >>> Bryan Thompson >>> Chief Scientist & Founder >>> SYSTAP, LLC >>> 4501 Tower Road >>> Greensboro, NC 27410 >>> br...@sy... >>> http://bigdata.com >>> http://mapgraph.io >>> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >>> for the sole use of the intended recipient(s) and are confidential or >>> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >>> dissemination or copying of this email or its contents or attachments is >>> prohibited. If you have received this communication in error, please >>> notify >>> the sender by reply email and permanently delete all copies of the email >>> and >>> its contents and attachments. >>> >>> >>> >>> >>> I dont know why am I getting an error when I am querying using RDR. Can >>> you please help me with this one last time. >>> >>> My tmp.xml file is: >>> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >>> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >>> <properties> >>> <!-- --> >>> <!-- NEW KB NAMESPACE (required). 
--> >>> <!-- --> >>> <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> >>> <!-- --> >>> <!-- Specify any KB specific properties here to override defaults for the >>> BigdataSail --> >>> <!-- AbstractTripleStore, or indices in the namespace of the new KB >>> instance. --> >>> <!-- --> >>> <entry >>> >>> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >>> </properties> >>> >>> >>> >>> -- >>> ---- >>> Bryan Thompson >>> Chief Scientist & Founder >>> SYSTAP, LLC >>> 4501 Tower Road >>> Greensboro, NC 27410 >>> br...@sy... >>> http://bigdata.com >>> http://mapgraph.io >>> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >>> for the sole use of the intended recipient(s) and are confidential or >>> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >>> dissemination or copying of this email or its contents or attachments is >>> prohibited. If you have received this communication in error, please >>> notify >>> the sender by reply email and permanently delete all copies of the email >>> and >>> its contents and attachments. >>> >>> >>> >>> >>> >>> ------------------------------------------------------------------------------ >>> _______________________________________________ >>> Bigdata-developers mailing list >>> Big...@li... >>> https://lists.sourceforge.net/lists/listinfo/bigdata-developers >>> >>> >> >> >> >> ------------------------------------------------------------------------------ >> >> _______________________________________________ >> Bigdata-developers mailing list >> Big...@li... >> https://lists.sourceforge.net/lists/listinfo/bigdata-developers >> > > |
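[Archive editor's note] The repeated `Couldn't resolve host 'tmp.xml'` failures quoted above are a curl argument-parsing issue, not a DNS problem: `@file` is only special when it appears as the value of a data option such as `--data-binary`. A bare `@tmp.xml` token on the command line is parsed as an additional URL, so curl tries (and fails) to resolve a host named `tmp.xml`. A sketch contrasting the two forms, with placeholder host and file paths:

```shell
# 'Couldn't resolve host' here comes from curl seeing '@tmp.xml' as a
# second URL, since '@file' is only special as a data-option value.
# Host and file paths below are placeholders, not real endpoints.

# Wrong: two URLs; curl tries to contact a host literally named 'tmp.xml'.
BAD="curl -X POST --data-binary 'uri=file:///home/data.ttl' @tmp.xml http://host:9999/bigdata/sparql"

# Right: one URL per request; the request body names the file for the
# server to load. The properties file plays no part in a load request;
# it is only used when creating the namespace via the multi-tenancy API.
GOOD="curl -X POST --data-binary 'uri=file:///home/data.ttl' http://host:9999/bigdata/namespace/reificationRDR/sparql"

echo "$GOOD"
```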
From: Alice E. <ali...@ya...> - 2014-10-31 12:51:11
|
Thanks Rose. But I dont think so.. as it works perfectly with google.com root:~/bigdataAnt$ curl -v google.com * About to connect() to google.com port 80 (#0) * Trying 74.125.236.68... connected > GET / HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: google.com > Accept: */* > < HTTP/1.1 302 Found < Cache-Control: private < Content-Type: text/html; charset=UTF-8 < Location: http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw < Content-Length: 261 < Date: Fri, 31 Oct 2014 12:49:49 GMT < Server: GFE/2.0 < Alternate-Protocol: 80:quic,p=0.01 < <HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8"> <TITLE>302 Moved</TITLE></HEAD><BODY> <H1>302 Moved</H1> The document has moved <A HREF="http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw">here</A>. </BODY></HTML> * Connection #0 to host google.com left intact * Closing connection #0 On Friday, 31 October 2014 6:19 PM, Rose Beck <ros...@gm...> wrote: I think its a dns error..can you try doing; curl -v google.com On Fri, Oct 31, 2014 at 6:02 PM, Bryan Thompson <br...@sy...> wrote: > If you use POST with a URL of the resource to be loaded (see the NSS wiki > page) then the URL must be accessible by bigdata. If you are using the form > of POST that sends the data in the http request body (which is the case > here), then it only needs to be visible to the client making the request. > > Thanks, > Bryan > > ---- > Bryan Thompson > Chief Scientist & Founder > SYSTAP, LLC > 4501 Tower Road > Greensboro, NC 27410 > br...@sy... > http://bigdata.com > http://mapgraph.io > > CONFIDENTIALITY NOTICE: This email and its contents and attachments are for > the sole use of the intended recipient(s) and are confidential or > proprietary to SYSTAP. Any unauthorized review, use, disclosure, > dissemination or copying of this email or its contents or attachments is > prohibited. 
If you have received this communication in error, please notify > the sender by reply email and permanently delete all copies of the email and > its contents and attachments. > > > On Fri, Oct 31, 2014 at 8:30 AM, Alice Everett <ali...@ya...> > wrote: >> >> Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is not >> helping. >> >> >> On Friday, 31 October 2014 5:57 PM, Jennifer >> <jen...@re...> wrote: >> >> >> I think she is missing as to where tmp.xml should be kept within her >> bigdata/Ant folder as I think bigdata is not able to find tmp.xml. >> >> Alice I think you should keep tmp.xml within the bigdata folder which you >> downloaded. >> >> >> >> >> >> >> >> From: Alice Everett <ali...@ya...> >> Sent: Fri, 31 Oct 2014 17:47:26 >> To: Bryan Thompson <br...@sy...> >> Cc: "big...@li..." >> <big...@li...> >> Subject: Re: [Bigdata-developers] How to use RDR with Curl >> Ok. Thanks a ton. But still I am a little lost. I used two methods of >> inserting as explained below. My namespace's name is reificationRDR. >> I'll be very grateful if you can help me with this a bit. >> >> Insert Method1: >> root:~/bigdataAnt$ curl -v -X POST --data-binary >> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml >> http://192.168.145.1:9999/bigdata/sparql >> output: >> * getaddrinfo(3) failed for tmp.xml:80 >> * Couldn't resolve host 'tmp.xml' >> * Closing connection #0 >> curl: (6) Couldn't resolve host 'tmp.xml' >> * About to connect() to 192.168.145.1 port 9999 (#0) >> * Trying 192.168.145.1... 
connected >> > POST /bigdata/sparql HTTP/1.1 >> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >> > Host: 192.168.145.1:9999 >> > Accept: */* >> > Content-Length: 52 >> > Content-Type: application/x-www-form-urlencoded >> > >> * upload completely sent off: 52out of 52 bytes >> < HTTP/1.1 200 OK >> < Content-Type: application/xml; charset=ISO-8859-1 >> < Transfer-Encoding: chunked >> < Server: Jetty(9.1.4.v20140401) >> < >> * Connection #0 to host 192.168.145.1 left intact >> * Closing connection #0 >> >> >> Insert Method 2: >> root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary >> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml >> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >> * getaddrinfo(3) failed for :80 >> output >> * Couldn't resolve host '' >> * Closing connection #0 >> curl: (6) Couldn't resolve host '' >> * About to connect() to 192.168.145.1 port 9999 (#0) >> * Trying 192.168.145.1... 
connected >> > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 >> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >> > Host: 192.168.145.1:9999 >> > Accept: */* >> > Content-Length: 52 >> > Content-Type: application/x-www-form-urlencoded >> > >> * upload completely sent off: 52out of 52 bytes >> < HTTP/1.1 500 Server Error >> < Content-Type: text/plain >> < Transfer-Encoding: chunked >> < Server: Jetty(9.1.4.v20140401) >> < >> uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] >> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not >> found: namespace=reificationRDR >> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >> at >> com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) >> at >> com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) >> at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) >> at >> com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) >> at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) >> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) >> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) >> at >> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) >> at >> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) >> at >> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) >> at >> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) >> at >> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) >> at >> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) >> at >> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) >> at >> 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) >> at >> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) >> at >> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) >> at >> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) >> at >> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) >> at org.eclipse.jetty.server.Server.handle(Server.java:462) >> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) >> at >> org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) >> at >> org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) >> at >> org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) >> at >> org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) >> at java.lang.Thread.run(Thread.java:745) >> Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR >> at >> com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) >> at >> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) >> at >> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) >> at >> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at >> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >> at >> com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) >> ... 
26 more >> * Connection #0 to host 192.168.145.1 left intact >> * Closing connection #0 >> >> >> Query: >> curl -X POST >> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >> --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H >> 'Accept:application/rdf+xml' >> >> tmp.xml: >> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >> <properties> >> <!-- --> >> <!-- NEW KB NAMESPACE (required). --> >> <!-- --> >> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> >> <!-- --> >> <!-- Specify any KB specific properties here to override defaults for the >> BigdataSail --> >> <!-- AbstractTripleStore, or indices in the namespace of the new KB >> instance. --> >> <!-- --> >> <entry >> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >> </properties> >> >> >> >> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> >> wrote: >> >> >> What is the namespace for the RDR graph? >> >> The URL you need to be using is >> >> http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql >> >> How to address a specific namespace is explicitly covered if you read the >> wiki section on the multitenant interface that I linked in my previous >> response. >> >> Thanks, >> Bryan >> >> On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: >> >> Thanks a lot for the help. 
>> >> But I don't know where I am still going wrong: >> I inserted data using: curl -v -X POST --data-binary >> 'uri=file:///home/reifiedTriples.ttl' @tmp.xml >> http://192.168.145.1:9999/bigdata/sparql >> And then queried it using: curl -X POST >> http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml >> 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' >> curl: (6) Couldn't resolve host 'query=SELECT * <<' >> Content-Type not recognized as RDF: application/x-www-form-urlencoded >> >> >> On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> >> wrote: >> >> >> Alice, >> >> The workbench choice of the "in use" namespace is recorded in JavaScript >> in your browser client. That choice does not affect other workbench clients >> and does not affect the behavior of the various endpoints when using command >> line tools to query or update data in the database. Thus your command line >> requests are being made against a namespace that is not configured for RDR >> support. >> >> If you want to address a non-default bigdata namespace using curl or wget, >> you must use the appropriate URL for that namespace. This is all described >> on wiki.bigdata.com on the page for the nanoSparqlServer in the section on >> multi-tenancy. >> >> See >> http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API >> >> Thanks, >> Bryan >> >> On Thursday, October 30, 2014, Alice Everett <ali...@ya...> >> wrote: >> >> I found out an awesome feature in Bigdata called RDR and I am trying to >> explore it too. Can you please let me know where I am going wrong >> while querying RDR data (http://trac.bigdata.com/ticket/815). 
(My sample RDF >> data, contains reification in its standard form: >> http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >> Loading: >> curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' >> http://192.168.145.1:9999/bigdata/sparql >> (Additionally I changed my current namespace within the workbench opened >> in my browser to RDR mode). >> >> After this I fired the following query and got the following error (Can >> you please correct me as to where am I going wrong. I'll be very grateful to >> you for the same): >> @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST >> http://192.168.145.1:9999/bigdata/sparql --header >> "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> >> ?p1 ?o1 }' -H 'Accept:application/rdr' >> >> SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >> java.util.concurrent.ExecutionException: >> org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at >> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >> at >> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >> at java.lang.Thread.run(Thread.java:745) >> Caused by: 
org.openrdf.query.QueryEvaluationException: >> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >> java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >> at >> org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >> at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >> at >> org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >> at >> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at >> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >> ... 
6 more >> Caused by: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >> at >> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >> at >> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >> at >> com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >> at >> com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >> at >> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >> ... 15 more >> Caused by: java.util.concurrent.ExecutionException: >> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >> java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >> at >> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >> ... 
20 more >> Caused by: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >> at >> com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >> at >> com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >> at >> com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >> at >> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >> at >> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >> >> ... 4 more >> Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >> at >> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >> at >> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >> at >> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >> ... 
9 more >> Caused by: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >> ... 3 more >> Caused by: java.util.concurrent.ExecutionException: >> java.lang.RuntimeException: java.lang.RuntimeException: >> java.lang.ArrayIndexOutOfBoundsException: 0 >> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >> ... 8 more >> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: >> java.lang.ArrayIndexOutOfBoundsException: 0 >> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >> ... 
8 more >> Caused by: java.lang.RuntimeException: >> java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >> at >> com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >> ... 12 more >> Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >> at >> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >> ... 14 more >> >> >> >> -- >> ---- >> Bryan Thompson >> Chief Scientist & Founder >> SYSTAP, LLC >> 4501 Tower Road >> Greensboro, NC 27410 >> br...@sy... >> http://bigdata.com >> http://mapgraph.io >> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >> for the sole use of the intended recipient(s) and are confidential or >> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >> dissemination or copying of this email or its contents or attachments is >> prohibited. If you have received this communication in error, please notify >> the sender by reply email and permanently delete all copies of the email and >> its contents and attachments. >> >> >> >> >> I dont know why am I getting an error when I am querying using RDR. Can >> you please help me with this one last time. >> >> My tmp.xml file is: >> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >> <properties> >> <!-- --> >> <!-- NEW KB NAMESPACE (required). --> >> <!-- --> >> <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> >> <!-- --> >> <!-- Specify any KB specific properties here to override defaults for the >> BigdataSail --> >> <!-- AbstractTripleStore, or indices in the namespace of the new KB >> instance. 
--> >> <!-- --> >> <entry >> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >> </properties> >> >> >> >> -- >> ---- >> Bryan Thompson >> Chief Scientist & Founder >> SYSTAP, LLC >> 4501 Tower Road >> Greensboro, NC 27410 >> br...@sy... >> http://bigdata.com >> http://mapgraph.io >> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >> for the sole use of the intended recipient(s) and are confidential or >> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >> dissemination or copying of this email or its contents or attachments is >> prohibited. If you have received this communication in error, please notify >> the sender by reply email and permanently delete all copies of the email and >> its contents and attachments. >> >> >> >> >> ------------------------------------------------------------------------------ >> _______________________________________________ >> Bigdata-developers mailing list >> Big...@li... >> https://lists.sourceforge.net/lists/listinfo/bigdata-developers >> >> > > > ------------------------------------------------------------------------------ > > _______________________________________________ > Bigdata-developers mailing list > Big...@li... > https://lists.sourceforge.net/lists/listinfo/bigdata-developers >
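Taken together, the fixes discussed in this thread come down to three steps: create the RDR-enabled namespace through the Multi-Tenancy API, load data against that namespace's own SPARQL endpoint, and query it there. A minimal curl sketch, assuming the host, port, tmp.xml properties file, and data file quoted above (exact media types accepted may vary by bigdata release, so treat this as an illustration rather than a verified recipe):

```sh
# 1. Create the namespace: POST the properties file (the tmp.xml shown
#    above) to the multi-tenancy endpoint. This step was missing when
#    the server answered "Not found: namespace=reificationRDR".
curl -X POST -H 'Content-Type: application/xml' \
     --data-binary @tmp.xml \
     http://192.168.145.1:9999/bigdata/namespace

# 2. Load the file into that namespace's own endpoint. Note there is no
#    bare @tmp.xml argument here; curl parsed that as an extra URL,
#    which caused "curl: (6) Couldn't resolve host".
curl -X POST \
     --data-urlencode 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' \
     http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql

# 3. Query with SPARQL* against the same namespace endpoint.
curl -X POST \
     --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' \
     -H 'Accept: application/rdr' \
     http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql
```

These commands require a running NanoSparqlServer at the address shown, so they are not runnable standalone; the namespace-creation call follows the Multi-Tenancy API section of the wiki page linked in the thread.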
From: Rose B. <ros...@gm...> - 2014-10-31 12:49:38
I think it's a DNS error. Can you try running: curl -v google.com On Fri, Oct 31, 2014 at 6:02 PM, Bryan Thompson <br...@sy...> wrote: > If you use POST with a URL of the resource to be loaded (see the NSS wiki > page) then the URL must be accessible by bigdata. If you are using the form > of POST that sends the data in the http request body (which is the case > here), then it only needs to be visible to the client making the request. > > Thanks, > Bryan > > ---- > Bryan Thompson > Chief Scientist & Founder > SYSTAP, LLC > 4501 Tower Road > Greensboro, NC 27410 > br...@sy... > http://bigdata.com > http://mapgraph.io > > CONFIDENTIALITY NOTICE: This email and its contents and attachments are for > the sole use of the intended recipient(s) and are confidential or > proprietary to SYSTAP. Any unauthorized review, use, disclosure, > dissemination or copying of this email or its contents or attachments is > prohibited. If you have received this communication in error, please notify > the sender by reply email and permanently delete all copies of the email and > its contents and attachments. > > > On Fri, Oct 31, 2014 at 8:30 AM, Alice Everett <ali...@ya...> > wrote: >> >> Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is not >> helping. >> >> >> On Friday, 31 October 2014 5:57 PM, Jennifer >> <jen...@re...> wrote: >> >> >> I think she is unsure where tmp.xml should be kept within her >> bigdata/Ant folder; bigdata does not seem able to find tmp.xml. >> >> Alice, I think you should keep tmp.xml within the bigdata folder which you >> downloaded. >> >> >> >> >> >> >> >> From: Alice Everett <ali...@ya...> >> Sent: Fri, 31 Oct 2014 17:47:26 >> To: Bryan Thompson <br...@sy...> >> Cc: "big...@li..." >> <big...@li...> >> Subject: Re: [Bigdata-developers] How to use RDR with Curl >> Ok. Thanks a ton. But still I am a little lost. I used two methods of >> inserting as explained below. My namespace's name is reificationRDR. 
>> I'll be very grateful if you can help me with this a bit. >> >> Insert Method1: >> root:~/bigdataAnt$ curl -v -X POST --data-binary >> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml >> http://192.168.145.1:9999/bigdata/sparql >> output: >> * getaddrinfo(3) failed for tmp.xml:80 >> * Couldn't resolve host 'tmp.xml' >> * Closing connection #0 >> curl: (6) Couldn't resolve host 'tmp.xml' >> * About to connect() to 192.168.145.1 port 9999 (#0) >> * Trying 192.168.145.1... connected >> > POST /bigdata/sparql HTTP/1.1 >> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >> > Host: 192.168.145.1:9999 >> > Accept: */* >> > Content-Length: 52 >> > Content-Type: application/x-www-form-urlencoded >> > >> * upload completely sent off: 52out of 52 bytes >> < HTTP/1.1 200 OK >> < Content-Type: application/xml; charset=ISO-8859-1 >> < Transfer-Encoding: chunked >> < Server: Jetty(9.1.4.v20140401) >> < >> * Connection #0 to host 192.168.145.1 left intact >> * Closing connection #0 >> >> >> Insert Method 2: >> root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary >> 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml >> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >> * getaddrinfo(3) failed for :80 >> output >> * Couldn't resolve host '' >> * Closing connection #0 >> curl: (6) Couldn't resolve host '' >> * About to connect() to 192.168.145.1 port 9999 (#0) >> * Trying 192.168.145.1... 
connected >> > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 >> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 >> > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 >> > Host: 192.168.145.1:9999 >> > Accept: */* >> > Content-Length: 52 >> > Content-Type: application/x-www-form-urlencoded >> > >> * upload completely sent off: 52out of 52 bytes >> < HTTP/1.1 500 Server Error >> < Content-Type: text/plain >> < Transfer-Encoding: chunked >> < Server: Jetty(9.1.4.v20140401) >> < >> uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] >> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not >> found: namespace=reificationRDR >> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >> at >> com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) >> at >> com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) >> at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) >> at >> com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) >> at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) >> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) >> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) >> at >> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) >> at >> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) >> at >> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) >> at >> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) >> at >> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) >> at >> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) >> at >> org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) >> at >> 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) >> at >> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) >> at >> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) >> at >> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) >> at >> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) >> at org.eclipse.jetty.server.Server.handle(Server.java:462) >> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) >> at >> org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) >> at >> org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) >> at >> org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) >> at >> org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) >> at java.lang.Thread.run(Thread.java:745) >> Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR >> at >> com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) >> at >> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) >> at >> com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) >> at >> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at >> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >> at >> com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) >> ... 
26 more >> * Connection #0 to host 192.168.145.1 left intact >> * Closing connection #0 >> >> >> Query: >> curl -X POST >> http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql >> --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H >> 'Accept:application/rdf+xml' >> >> tmp.xml: >> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >> <properties> >> <!-- --> >> <!-- NEW KB NAMESPACE (required). --> >> <!-- --> >> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> >> <!-- --> >> <!-- Specify any KB specific properties here to override defaults for the >> BigdataSail --> >> <!-- AbstractTripleStore, or indices in the namespace of the new KB >> instance. --> >> <!-- --> >> <entry >> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >> </properties> >> >> >> >> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> >> wrote: >> >> >> What is the namespace for the RDR graph? >> >> The URL you need to be using is >> >> http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql >> >> How to address a specific namespace is explicitly covered if you read the >> wiki section on the multitenant interface that I linked in my previous >> response. >> >> Thanks, >> Bryan >> >> On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: >> >> Thanks a lot for the help. 
>> >> But I don't know where I am still going wrong: >> I inserted data using: curl -v -X POST --data-binary >> 'uri=file:///home/reifiedTriples.ttl' @tmp.xml >> http://192.168.145.1:9999/bigdata/sparql >> And then queried it using: curl -X POST >> http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml >> 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' >> curl: (6) Couldn't resolve host 'query=SELECT * <<' >> Content-Type not recognized as RDF: application/x-www-form-urlencoded >> >> >> On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> >> wrote: >> >> >> Alice, >> >> The workbench choice of the "in use" namespace is recorded in JavaScript >> in your browser client. That choice does not affect other workbench clients >> and does not affect the behavior of the various endpoints when using command >> line tools to query or update data in the database. Thus your command line >> requests are being made against a namespace that is not configured for RDR >> support. >> >> If you want to address a non-default bigdata namespace using curl or wget, >> you must use the appropriate URL for that namespace. This is all described >> on wiki.bigdata.com on the page for the nanoSparqlServer in the section on >> multi-tenancy. >> >> See >> http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API >> >> Thanks, >> Bryan >> >> On Thursday, October 30, 2014, Alice Everett <ali...@ya...> >> wrote: >> >> I found out an awesome feature in Bigdata called RDR and I am trying to >> explore it too. Can you please let me know where I am going wrong >> while querying RDR data (http://trac.bigdata.com/ticket/815). 
(My sample RDF >> data, contains reification in its standard form: >> http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >> Loading: >> curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' >> http://192.168.145.1:9999/bigdata/sparql >> (Additionally I changed my current namespace within the workbench opened >> in my browser to RDR mode). >> >> After this I fired the following query and got the following error (Can >> you please correct me as to where am I going wrong. I'll be very grateful to >> you for the same): >> @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST >> http://192.168.145.1:9999/bigdata/sparql --header >> "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> >> ?p1 ?o1 }' -H 'Accept:application/rdr' >> >> SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >> java.util.concurrent.ExecutionException: >> org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at >> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >> at >> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >> at java.lang.Thread.run(Thread.java:745) >> Caused by: 
org.openrdf.query.QueryEvaluationException: >> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >> java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >> at >> org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >> at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >> at >> org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >> at >> com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >> at >> com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at >> com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >> ... 
6 more >> Caused by: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >> at >> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >> at >> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >> at >> com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >> at >> com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >> at >> com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >> ... 15 more >> Caused by: java.util.concurrent.ExecutionException: >> java.lang.RuntimeException: java.util.concurrent.ExecutionException: >> java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >> at >> com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >> ... 
20 more >> Caused by: java.lang.RuntimeException: >> java.util.concurrent.ExecutionException: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >> at >> com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >> at >> com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >> at >> com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >> at >> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >> at >> com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >> >> ... 4 more >> Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >> at >> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >> at >> com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >> at >> com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >> ... 
9 more >> Caused by: java.lang.Exception: >> task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, >> cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: >> java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >> ... 3 more >> Caused by: java.util.concurrent.ExecutionException: >> java.lang.RuntimeException: java.lang.RuntimeException: >> java.lang.ArrayIndexOutOfBoundsException: 0 >> at java.util.concurrent.FutureTask.report(FutureTask.java:122) >> at java.util.concurrent.FutureTask.get(FutureTask.java:188) >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >> ... 8 more >> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: >> java.lang.ArrayIndexOutOfBoundsException: 0 >> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >> at java.util.concurrent.FutureTask.run(FutureTask.java:262) >> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >> at >> com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >> ... 
8 more >> Caused by: java.lang.RuntimeException: >> java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >> at >> com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >> ... 12 more >> Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >> at >> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >> at >> com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >> ... 14 more >> >> >> >> -- >> ---- >> Bryan Thompson >> Chief Scientist & Founder >> SYSTAP, LLC >> 4501 Tower Road >> Greensboro, NC 27410 >> br...@sy... >> http://bigdata.com >> http://mapgraph.io >> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >> for the sole use of the intended recipient(s) and are confidential or >> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >> dissemination or copying of this email or its contents or attachments is >> prohibited. If you have received this communication in error, please notify >> the sender by reply email and permanently delete all copies of the email and >> its contents and attachments. >> >> >> >> >> I don't know why I am getting an error when I am querying using RDR. Can >> you please help me with this one last time. >> >> My tmp.xml file is: >> <?xml version="1.0" encoding="UTF-8" standalone="no"?> >> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> >> <properties> >> <!-- --> >> <!-- NEW KB NAMESPACE (required). --> >> <!-- --> >> <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> >> <!-- --> >> <!-- Specify any KB specific properties here to override defaults for the >> BigdataSail --> >> <!-- AbstractTripleStore, or indices in the namespace of the new KB >> instance. 
--> >> <!-- --> >> <entry >> key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> >> </properties> >> >> >> >> -- >> ---- >> Bryan Thompson >> Chief Scientist & Founder >> SYSTAP, LLC >> 4501 Tower Road >> Greensboro, NC 27410 >> br...@sy... >> http://bigdata.com >> http://mapgraph.io >> CONFIDENTIALITY NOTICE: This email and its contents and attachments are >> for the sole use of the intended recipient(s) and are confidential or >> proprietary to SYSTAP. Any unauthorized review, use, disclosure, >> dissemination or copying of this email or its contents or attachments is >> prohibited. If you have received this communication in error, please notify >> the sender by reply email and permanently delete all copies of the email and >> its contents and attachments. >> >> >> >> >> ------------------------------------------------------------------------------ >> _______________________________________________ >> Bigdata-developers mailing list >> Big...@li... >> https://lists.sourceforge.net/lists/listinfo/bigdata-developers >> >> > > > ------------------------------------------------------------------------------ > > _______________________________________________ > Bigdata-developers mailing list > Big...@li... > https://lists.sourceforge.net/lists/listinfo/bigdata-developers > |
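The recurring root cause in this thread is twofold: the bare `@tmp.xml` token in the curl invocations is parsed by curl as an extra URL rather than as a request body (hence the "Couldn't resolve host" messages), so the properties file is never transmitted, and the target namespace is therefore never created ("Not found: namespace=reificationRDR"). A minimal sketch of the intended sequence, assuming a NanoSparqlServer at 192.168.145.1:9999 and the multi-tenancy REST API on the wiki; the filename `rdr.properties.xml` and the result format in the query step are illustrative:

```shell
# 1. Write the namespace properties. statementIdentifiers=true enables
#    RDR (statement identifiers / SIDs) mode for the new namespace.
cat > rdr.properties.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
  <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry>
  <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry>
</properties>
EOF

HOST='http://192.168.145.1:9999'   # adjust to your server

# 2. Create the namespace first. Note that '@rdr.properties.xml' must be
#    the argument OF --data-binary; a bare '@file' token on the command
#    line is treated by curl as another URL to fetch.
# curl -X POST -H 'Content-Type: application/xml' \
#      --data-binary @rdr.properties.xml "$HOST/bigdata/namespace"

# 3. Load data into the namespace-specific endpoint (not /bigdata/sparql,
#    which addresses the default namespace):
# curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' \
#      "$HOST/bigdata/namespace/RDRMode/sparql"

# 4. Query it with SPARQL*:
# curl -X POST "$HOST/bigdata/namespace/RDRMode/sparql" \
#      --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' \
#      -H 'Accept:application/sparql-results+json'
```

With the namespace created first, the load in step 3 should report the number of statements inserted rather than silently posting to a namespace that does not exist.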
From: Alice E. <ali...@ya...> - 2014-10-31 12:40:05
|
Also, now when I try to load I am getting the following error: root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql Error: * getaddrinfo(3) failed for :80 * Couldn't resolve host '' * Closing connection #0 curl: (6) Couldn't resolve host '' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... connected > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 200 OK < Content-Type: application/xml; charset=ISO-8859-1 < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 On Friday, 31 October 2014 6:07 PM, Alice Everett <ali...@ya...> wrote: Thanks again Bryan for all the help. 
My service description page shows the following (I think there is a namespace with the name reificationRDR, don't you think so?): <?xml version="1.0" encoding="UTF-8"?> <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"> <rdf:Description rdf:nodeID="service"> <rdf:type rdf:resource="http://www.w3.org/ns/sparql-service-description#Service"/> <endpoint xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql"/> <endpoint xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://192.168.145.1:9999/bigdata/LBS/namespace/reificationRDR/sparql"/> <supportedLanguage xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL10Query"/> <supportedLanguage xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL11Query"/> <supportedLanguage xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL11Update"/> <feature xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/sparql-service-description#BasicFederatedQuery"/> <feature xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.bigdata.com/rdf#/features/KB/Mode/Sids"/> <feature xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.bigdata.com/rdf#/features/KB/TruthMaintenance"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/RDF_XML"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/N-Triples"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/Turtle"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" 
rdf:resource="http://www.w3.org/ns/formats/N3"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.wiwiss.fu-berlin.de/suhl/bizer/TriG/Spec/"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://sw.deri.org/2008/07/n-quads/#n-quads"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_XML"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_JSON"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_CSV"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_TSV"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/RDF_XML"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/N-Triples"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/Turtle"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/N3"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.wiwiss.fu-berlin.de/suhl/bizer/TriG/Spec/"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_XML"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_JSON"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_CSV"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" 
rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_TSV"/> <defaultDataset xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:nodeID="defaultDataset"/> </rdf:Description> <rdf:Description rdf:nodeID="defaultDataset"> <rdf:type rdf:resource="http://www.w3.org/ns/sparql-service-description#Dataset"/> <rdf:type rdf:resource="http://rdfs.org/ns/void#Dataset"/> <title xmlns="http://purl.org/dc/terms/">reificationRDR</title> <Namespace xmlns="http://www.bigdata.com/rdf#/features/KB/">reificationRDR</Namespace> <sparqlEndpoint xmlns="http://rdfs.org/ns/void#" rdf:resource="http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql/reificationRDR/sparql"/> <sparqlEndpoint xmlns="http://rdfs.org/ns/void#" rdf:resource="http://192.168.145.1:9999/bigdata/LBS/namespace/reificationRDR/sparql/reificationRDR/sparql"/> <uriRegexPattern xmlns="http://rdfs.org/ns/void#">^.*</uriRegexPattern> <defaultGraph xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:nodeID="defaultGraph"/> </rdf:Description> <rdf:Description rdf:nodeID="defaultGraph"> <rdf:type rdf:resource="http://www.w3.org/ns/sparql-service-description#Graph"/> <triples xmlns="http://rdfs.org/ns/void#" rdf:datatype="http://www.w3.org/2001/XMLSchema#long">0</triples> <entities xmlns="http://rdfs.org/ns/void#" rdf:datatype="http://www.w3.org/2001/XMLSchema#long">0</entities> <properties xmlns="http://rdfs.org/ns/void#" rdf:datatype="http://www.w3.org/2001/XMLSchema#int">0</properties> <classes xmlns="http://rdfs.org/ns/void#" rdf:datatype="http://www.w3.org/2001/XMLSchema#int">0</classes> </rdf:Description> </rdf:RDF> On Friday, 31 October 2014 6:00 PM, Alice Everett <ali...@ya...> wrote: Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is not helping. On Friday, 31 October 2014 5:57 PM, Jennifer <jen...@re...> wrote: I think she is missing as to where tmp.xml should be kept within her bigdata/Ant folder as I think bigdata is not able to find tmp.xml. Alice I think you should keep tmp.xml within the bigdata folder which you downloaded. From: Alice Everett <ali...@ya...> Sent: Fri, 31 Oct 2014 17:47:26 To: Bryan Thompson <br...@sy...> Cc: "big...@li..." <big...@li...> Subject: Re: [Bigdata-developers] How to use RDR with Curl Ok. Thanks a ton. But still I am a little lost. I used two methods of inserting as explained below. My namespace's name is reificationRDR. 
I'll be very grateful if you can help me with this a bit. Insert Method1: root:~/bigdataAnt$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql output: * getaddrinfo(3) failed for tmp.xml:80 * Couldn't resolve host 'tmp.xml' * Closing connection #0 curl: (6) Couldn't resolve host 'tmp.xml' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... connected > POST /bigdata/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 200 OK < Content-Type: application/xml; charset=ISO-8859-1 < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 Insert Method 2: root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql * getaddrinfo(3) failed for :80 output * Couldn't resolve host '' * Closing connection #0 curl: (6) Couldn't resolve host '' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... 
connected > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 500 Server Error < Content-Type: text/plain < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not found: namespace=reificationRDR at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) at org.eclipse.jetty.server.Server.handle(Server.java:462) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) at java.lang.Thread.run(Thread.java:745) Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR at com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) ... 26 more * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 Query: curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml' tmp.xml: <?xml version="1.0" encoding="UTF-8" standalone="no"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <!-- --> <!-- NEW KB NAMESPACE (required). 
--> <!-- --> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> <!-- --> <!-- Specify any KB specific properties here to override defaults for the BigdataSail --> <!-- AbstractTripleStore, or indices in the namespace of the new KB instance. --> <!-- --> <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> </properties> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> wrote: What is the namespace for the RDR graph? The URL you need to be using is http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql How to address a specific namespace is explicitly covered if you read the wiki section on the multitenant interface that I linked in my previous response. Thanks, Bryan On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: Thanks a lot for the help. > > >But I don't know where I am still going wrong: >I inserted data using: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql >And then queried it using: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr'curl: (6) Couldn't resolve host 'query=SELECT * <<' >Content-Type not recognized as RDF: application/x-www-form-urlencoded > > > >On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote: > > > >Alice, > > >The workbench choice of the "in use" namespace is recorded in JavaScript in your browser client. That choice does not affect other workbench clients and does not affect the behavior of the various endpoints when using command line tools to query or update data in the database. Thus your command line requests are being made against a namespace that is not configured for RDR support. > > >If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace. 
This is all described on wiki.bigdata.com on the page for the nanoSparqlServer in the section on multi-tenancy. > > >See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API > > >Thanks, >Bryan > >On Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote: > >I found out an awesome feature in Bigdata called RDR and I am trying to explore that too. Can you please let me know as to where am I going wrong while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample RDF data, contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >>Loading: >>curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql >> >>(Additionally I changed my current namespace within the workbench opened in my browser to RDR mode). >> >> >>After this I fired the following query and got the following error (Can you please correct me as to where am I going wrong. I'll be very grateful to you for the same): >>@HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' >> >> >>SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >>java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at 
com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >>at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >>at java.lang.Thread.run(Thread.java:745) >>Caused by: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >>at org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >>at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >>at org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >>at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>... 
6 more >>Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >>at com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >>at com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >>at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >>... 15 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >>... 
20 more >>Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >>at com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >>at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >>at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >> >> >>... 4 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >>at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >>at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >>... 
9 more >>Caused by: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >>at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >>... 3 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >>... 8 more >>Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >>... 
8 more >>Caused by: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >>... 12 more >>Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >>... 14 more >> >> > >-- > >---- >Bryan Thompson > >Chief Scientist & Founder >SYSTAP, LLC > >4501 Tower Road >Greensboro, NC 27410 > >br...@sy... > >http://bigdata.com > >http://mapgraph.io > >CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments. > > > > > >I dont know why am I getting an error when I am querying using RDR. Can you please help me with this one last time. > > >My tmp.xml file is: ><?xml version="1.0" encoding="UTF-8" standalone="no"?> ><!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> ><properties> ><!-- --> ><!-- NEW KB NAMESPACE (required). --> ><!-- --> ><entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> ><!-- --> ><!-- Specify any KB specific properties here to override defaults for the BigdataSail --> ><!-- AbstractTripleStore, or indices in the namespace of the new KB instance. 
--> ><!-- --> ><entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> ></properties> > > -- ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 br...@sy... http://bigdata.com http://mapgraph.io ------------------------------------------------------------------------------ _______________________________________________ Bigdata-developers mailing list Big...@li... https://lists.sourceforge.net/lists/listinfo/bigdata-developers |
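The tmp.xml properties file quoted above only takes effect if it is used to create a namespace through the NanoSparqlServer multi-tenancy API; POSTing data to /bigdata/sparql never reads it. A minimal sketch of that creation step, assuming a NanoSparqlServer running locally on port 9999 (host, port, and the tmp.xml file name are illustrative, taken from the thread):

```shell
# Create an RDR-enabled namespace named "RDRMode" by POSTing the Java
# properties XML to the multi-tenancy endpoint (NOT to /bigdata/sparql).
# tmp.xml is assumed to be the properties file quoted in the thread above,
# with com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers=true.
curl -X POST \
     -H 'Content-Type: application/xml' \
     --data-binary @tmp.xml \
     http://localhost:9999/bigdata/namespace

# List the existing namespaces to confirm the new one was created:
curl http://localhost:9999/bigdata/namespace
```

If creation succeeds, the namespace should then be addressable at http://localhost:9999/bigdata/namespace/RDRMode/sparql; the "Not found: namespace=..." errors later in this thread indicate this step was skipped.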
From: Alice E. <ali...@ya...> - 2014-10-31 12:38:10
|
Thanks again Bryan for all the help. My service description page shows the following (I think there is a namespace with the name: reificationRDR..dont you think so): <?xml version="1.0" encoding="UTF-8"?> <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"> <rdf:Description rdf:nodeID="service"> <rdf:type rdf:resource="http://www.w3.org/ns/sparql-service-description#Service"/> <endpoint xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql"/> <endpoint xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://192.168.145.1:9999/bigdata/LBS/namespace/reificationRDR/sparql"/> <supportedLanguage xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL10Query"/> <supportedLanguage xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL11Query"/> <supportedLanguage xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/sparql-service-description#SPARQL11Update"/> <feature xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/sparql-service-description#BasicFederatedQuery"/> <feature xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.bigdata.com/rdf#/features/KB/Mode/Sids"/> <feature xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.bigdata.com/rdf#/features/KB/TruthMaintenance"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/RDF_XML"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/N-Triples"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/Turtle"/> <inputFormat 
xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/N3"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.wiwiss.fu-berlin.de/suhl/bizer/TriG/Spec/"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://sw.deri.org/2008/07/n-quads/#n-quads"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_XML"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_JSON"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_CSV"/> <inputFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_TSV"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/RDF_XML"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/N-Triples"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/Turtle"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/N3"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.wiwiss.fu-berlin.de/suhl/bizer/TriG/Spec/"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_XML"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_JSON"/> <resultFormat xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_CSV"/> <resultFormat 
xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:resource="http://www.w3.org/ns/formats/SPARQL_Results_TSV"/> <defaultDataset xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:nodeID="defaultDataset"/> </rdf:Description> <rdf:Description rdf:nodeID="defaultDataset"> <rdf:type rdf:resource="http://www.w3.org/ns/sparql-service-description#Dataset"/> <rdf:type rdf:resource="http://rdfs.org/ns/void#Dataset"/> <title xmlns="http://purl.org/dc/terms/">reificationRDR</title> <Namespace xmlns="http://www.bigdata.com/rdf#/features/KB/">reificationRDR</Namespace> <sparqlEndpoint xmlns="http://rdfs.org/ns/void#" rdf:resource="http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql/reificationRDR/sparql"/> <sparqlEndpoint xmlns="http://rdfs.org/ns/void#" rdf:resource="http://192.168.145.1:9999/bigdata/LBS/namespace/reificationRDR/sparql/reificationRDR/sparql"/> <uriRegexPattern xmlns="http://rdfs.org/ns/void#">^.*</uriRegexPattern> <defaultGraph xmlns="http://www.w3.org/ns/sparql-service-description#" rdf:nodeID="defaultGraph"/> </rdf:Description> <rdf:Description rdf:nodeID="defaultGraph"> <rdf:type rdf:resource="http://www.w3.org/ns/sparql-service-description#Graph"/> <triples xmlns="http://rdfs.org/ns/void#" rdf:datatype="http://www.w3.org/2001/XMLSchema#long">0</triples> <entities xmlns="http://rdfs.org/ns/void#" rdf:datatype="http://www.w3.org/2001/XMLSchema#long">0</entities> <properties xmlns="http://rdfs.org/ns/void#" rdf:datatype="http://www.w3.org/2001/XMLSchema#int">0</properties> <classes xmlns="http://rdfs.org/ns/void#" rdf:datatype="http://www.w3.org/2001/XMLSchema#int">0</classes> </rdf:Description> </rdf:RDF> On Friday, 31 October 2014 6:00 PM, Alice Everett <ali...@ya...> wrote: Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is not helping. On Friday, 31 October 2014 5:57 PM, Jennifer <jen...@re...> wrote: I think she is missing as to where tmp.xml should be kept within her bigdata/Ant folder as I think bigdata is not able to find tmp.xml. Alice I think you should keep tmp.xml within the bigdata folder which you downloaded. From: Alice Everett <ali...@ya...> Sent: Fri, 31 Oct 2014 17:47:26 To: Bryan Thompson <br...@sy...> Cc: "big...@li..." <big...@li...> Subject: Re: [Bigdata-developers] How to use RDR with Curl Ok. Thanks a ton. But still I am a little lost. I used two methods of inserting as explained below.
My namespace's name is reificationRDR. I'll be very grateful if you can help me with this a bit. Insert Method1: root:~/bigdataAnt$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql output: * getaddrinfo(3) failed for tmp.xml:80 * Couldn't resolve host 'tmp.xml' * Closing connection #0 curl: (6) Couldn't resolve host 'tmp.xml' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... connected > POST /bigdata/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 200 OK < Content-Type: application/xml; charset=ISO-8859-1 < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 Insert Method 2: root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql * getaddrinfo(3) failed for :80 output * Couldn't resolve host '' * Closing connection #0 curl: (6) Couldn't resolve host '' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... 
connected > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 500 Server Error < Content-Type: text/plain < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not found: namespace=reificationRDR at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) at org.eclipse.jetty.server.Server.handle(Server.java:462) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) at java.lang.Thread.run(Thread.java:745) Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR at com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) ... 26 more * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 Query: curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml' tmp.xml: <?xml version="1.0" encoding="UTF-8" standalone="no"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <!-- --> <!-- NEW KB NAMESPACE (required). 
--> <!-- --> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> <!-- --> <!-- Specify any KB specific properties here to override defaults for the BigdataSail --> <!-- AbstractTripleStore, or indices in the namespace of the new KB instance. --> <!-- --> <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> </properties> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> wrote: What is the namespace for the RDR graph? The URL you need to be using is http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql How to address a specific namespace is explicitly covered if you read the wiki section on the multitenant interface that I linked in my previous response. Thanks, Bryan On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: Thanks a lot for the help. > > >But I dont know where I am still going wrong: >I inserted data using: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql >And then queried it using: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr'curl: (6) Couldn't resolve host 'query=SELECT * <<' >Content-Type not recognized as RDF: application/x-www-form-urlencoded > > > >On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote: > > > >Alice, > > >The workbench choice of the "in use" namespace is recorded in java script in your browser client. That choice does not affect other workbench clients and does not affect the behavior of the various endpoints when using command line tools to query or update data in the database. Thus your command line requests are being made against a namespace that is not configured for RDR support. > > >If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace.
This is all described on wiki.bigdata.com on the page for the nanoSparqlServer in the section on multi-tenancy. > > >See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API > > >Thanks, >Bryan > >On Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote: > >I found out an awesome feature in Bigdata called RDR and I am trying to explore that too. Can you please let me know as to where am I going wrong while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample RDF data, contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >>Loading: >>curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql >> >>(Additionally I changed my current namespace within the workbench opened in my browser to RDR mode). >> >> >>After this I fired the following query and got the following error (Can you please correct me as to where am I going wrong. I'll be very grateful to you for the same): >>@HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' >> >> >>SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >>java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at 
com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >>at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >>at java.lang.Thread.run(Thread.java:745) >>Caused by: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >>at org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >>at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >>at org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >>at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>... 
6 more >>Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >>at com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >>at com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >>at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >>... 15 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >>... 
20 more >>Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >>at com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >>at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >>at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >> >> >>... 4 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >>at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >>at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >>... 
9 more >>Caused by: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >>at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >>... 3 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >>... 8 more >>Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >>... 
8 more >>Caused by: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >>... 12 more >>Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >>... 14 more >> >> > >-- > >---- >Bryan Thompson > >Chief Scientist & Founder >SYSTAP, LLC > >4501 Tower Road >Greensboro, NC 27410 > >br...@sy... > >http://bigdata.com > >http://mapgraph.io > >CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments. > > > > > >I dont know why am I getting an error when I am querying using RDR. Can you please help me with this one last time. > > >My tmp.xml file is: ><?xml version="1.0" encoding="UTF-8" standalone="no"?> ><!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> ><properties> ><!-- --> ><!-- NEW KB NAMESPACE (required). --> ><!-- --> ><entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> ><!-- --> ><!-- Specify any KB specific properties here to override defaults for the BigdataSail --> ><!-- AbstractTripleStore, or indices in the namespace of the new KB instance. 
--> ><!-- --> ><entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> ></properties> > > -- ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 br...@sy... http://bigdata.com http://mapgraph.io ------------------------------------------------------------------------------ _______________________________________________ Bigdata-developers mailing list Big...@li... https://lists.sourceforge.net/lists/listinfo/bigdata-developers |
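Editorial note: the tmp.xml shown in this thread is a namespace properties file for the NanoSparqlServer multi-tenancy API, not a payload for a /sparql endpoint. A minimal sketch of the intended sequence, assuming a server at the host and port used in the thread (the curl calls are illustrative comments and are not executed here):

```shell
# Recreate the properties file from the thread. RDR support is enabled by
# the statementIdentifiers=true entry.
cat > tmp.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
<entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry>
<entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry>
</properties>
EOF

# 1. Create the namespace by POSTing the properties file to the
#    multi-tenancy endpoint (not to /sparql):
# curl -X POST -H 'Content-Type: application/xml' --data-binary @tmp.xml \
#      http://192.168.145.1:9999/bigdata/namespace

# 2. Then load and query against that namespace's own endpoint:
# curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' \
#      http://192.168.145.1:9999/bigdata/namespace/RDRMode/sparql
```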
From: Bryan T. <br...@sy...> - 2014-10-31 12:32:40
|
If you use POST with a URL of the resource to be loaded (see the NSS wiki page) then the URL must be accessible by bigdata. If you are using the form of POST that sends the data in the http request body (which is the case here), then it only needs to be visible to the client making the request. Thanks, Bryan ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 br...@sy... http://bigdata.com http://mapgraph.io On Fri, Oct 31, 2014 at 8:30 AM, Alice Everett <ali...@ya...> wrote: > Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is not > helping. > > > On Friday, 31 October 2014 5:57 PM, Jennifer < > jen...@re...> wrote: > > > I think she is missing as to where tmp.xml should be kept within her > bigdata/Ant folder as I think bigdata is not able to find tmp.xml. > > Alice I think you should keep tmp.xml within the bigdata folder which you > downloaded. > > > > > > > > From: Alice Everett <ali...@ya...> > Sent: Fri, 31 Oct 2014 17:47:26 > To: Bryan Thompson <br...@sy...> > Cc: "big...@li..." < > big...@li...> > Subject: Re: [Bigdata-developers] How to use RDR with Curl > Ok. Thanks a ton. But still I am a little lost. I used two methods of > inserting as explained below. My namespace's name is reificationRDR. > I'll be very grateful if you can help me with this a bit. 
> > Insert Method1: > root:~/bigdataAnt$ curl -v -X POST --data-binary > 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml > http://192.168.145.1:9999/bigdata/sparql > output: > * getaddrinfo(3) failed for tmp.xml:80 > * Couldn't resolve host 'tmp.xml' > * Closing connection #0 > curl: (6) Couldn't resolve host 'tmp.xml' > * About to connect() to 192.168.145.1 port 9999 (#0) > * Trying 192.168.145.1... connected > > POST /bigdata/sparql HTTP/1.1 > > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > > Host: 192.168.145.1:9999 > > Accept: */* > > Content-Length: 52 > > Content-Type: application/x-www-form-urlencoded > > > * upload completely sent off: 52out of 52 bytes > < HTTP/1.1 200 OK > < Content-Type: application/xml; charset=ISO-8859-1 > < Transfer-Encoding: chunked > < Server: Jetty(9.1.4.v20140401) > < > * Connection #0 to host 192.168.145.1 left intact > * Closing connection #0 > > > Insert Method 2: > root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary > 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml > http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql > * getaddrinfo(3) failed for :80 > output > * Couldn't resolve host '' > * Closing connection #0 > curl: (6) Couldn't resolve host '' > * About to connect() to 192.168.145.1 port 9999 (#0) > * Trying 192.168.145.1... 
connected > > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 > > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > > Host: 192.168.145.1:9999 > > Accept: */* > > Content-Length: 52 > > Content-Type: application/x-www-form-urlencoded > > > * upload completely sent off: 52out of 52 bytes > < HTTP/1.1 500 Server Error > < Content-Type: text/plain > < Transfer-Encoding: chunked > < Server: Jetty(9.1.4.v20140401) > < > uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] > java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not > found: namespace=reificationRDR > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) > at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) > at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) > at > com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) > at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) > at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) > at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) > at > org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) > at > org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) > at > org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) > at > org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) > at > org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) > at > org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) > at > org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) > at > 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) > at > org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) > at > org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) > at > org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) > at > org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) > at org.eclipse.jetty.server.Server.handle(Server.java:462) > at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) > at > org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) > at > org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) > at > org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) > at > org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) > at java.lang.Thread.run(Thread.java:745) > Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR > at > com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) > at > com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) > at > com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) > at > com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at > com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) > at > com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) > ... 
26 more > * Connection #0 to host 192.168.145.1 left intact > * Closing connection #0 > > > Query: > curl -X POST > http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql > --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H > 'Accept:application/rdf+xml' > > tmp.xml: > <?xml version="1.0" encoding="UTF-8" standalone="no"?> > <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> > <properties> > <!-- --> > <!-- NEW KB NAMESPACE (required). --> > <!-- --> > <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> > <!-- --> > <!-- Specify any KB specific properties here to override defaults for the > BigdataSail --> > <!-- AbstractTripleStore, or indices in the namespace of the new KB > instance. --> > <!-- --> > <entry > key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> > </properties> > > > > On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> > wrote: > > > What is the namespace for the RDR graph? > > The URL you need to be using is > > http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql > > How to address a specific namespace is explicitly covered if you read the > wiki section on the multitenant interface that I linked in my previous > response. > > Thanks, > Bryan > > On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: > > Thanks a lot for the help. 
> > But I dont know where I am still going wrong: > I inserted data using: curl -v -X POST --data-binary > 'uri=file:///home/reifiedTriples.ttl' @tmp.xml > http://192.168.145.1:9999/bigdata/sparql > And then queried it using: curl -X POST > http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml > 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' > curl: (6) Couldn't resolve host 'query=SELECT * <<' > Content-Type not recognized as RDF: application/x-www-form-urlencoded > > > On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> > wrote: > > > Alice, > > The workbench choice of the "in use" namespace is recorded in JavaScript > in your browser client. That choice does not affect other workbench > clients and does not affect the behavior of the various endpoints when > using command line tools to query or update data in the database. Thus your > command line requests are being made against a namespace that is not > configured for RDR support. > > If you want to address a non-default bigdata namespace using curl or wget, > you must use the appropriate URL for that namespace. This is all described > on wiki.bigdata.com on the page for the NanoSparqlServer in the section > on multi-tenancy. > > See > http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API > > Thanks, > Bryan > > On Thursday, October 30, 2014, Alice Everett <ali...@ya...> > wrote: > > I found out an awesome feature in Bigdata called RDR and I am trying to > explore that too. Can you please let me know as to where am I going wrong > while querying RDR data (http://trac.bigdata.com/ticket/815). 
(My sample > RDF data contains reification in its standard form: > http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) > Loading: > curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' > http://192.168.145.1:9999/bigdata/sparql > (Additionally I changed my current namespace within the workbench opened > in my browser to RDR mode). > > After this I fired the following query and got the following error (Can > you please correct me as to where am I going wrong. I'll be very grateful > to you for the same): > @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST > http://192.168.145.1:9999/bigdata/sparql --header > "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> > ?p1 ?o1 }' -H 'Accept:application/rdr' > > SELECT * {<<?s ?p ?o>> ?p1 ?o1 } > java.util.concurrent.ExecutionException: > org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) > at > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) > at java.lang.Thread.run(Thread.java:745) > Caused by: 
org.openrdf.query.QueryEvaluationException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) > at > org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) > at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) > at > org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) > at > com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at > com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) > ... 
6 more > Caused by: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) > at > com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) > at > com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) > at > com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) > ... 15 more > Caused by: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) > ... 
20 more > Caused by: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) > at > com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) > at > com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) > at > com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) > > ... 4 more > Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) > at > com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) > at > com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) > ... 
9 more > Caused by: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) > ... 3 more > Caused by: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) > ... 8 more > Caused by: java.lang.RuntimeException: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) > ... 
8 more > Caused by: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) > at > com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) > ... 12 more > Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) > ... 14 more > > > > -- > ---- > Bryan Thompson > Chief Scientist & Founder > SYSTAP, LLC > 4501 Tower Road > Greensboro, NC 27410 > br...@sy... > http://bigdata.com > http://mapgraph.io > CONFIDENTIALITY NOTICE: This email and its contents and attachments are > for the sole use of the intended recipient(s) and are confidential or > proprietary to SYSTAP. Any unauthorized review, use, disclosure, > dissemination or copying of this email or its contents or attachments is > prohibited. If you have received this communication in error, please notify > the sender by reply email and permanently delete all copies of the email > and its contents and attachments. > > > > > I dont know why am I getting an error when I am querying using RDR. Can > you please help me with this one last time. > > My tmp.xml file is: > <?xml version="1.0" encoding="UTF-8" standalone="no"?> > <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> > <properties> > <!-- --> > <!-- NEW KB NAMESPACE (required). --> > <!-- --> > <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> > <!-- --> > <!-- Specify any KB specific properties here to override defaults for the > BigdataSail --> > <!-- AbstractTripleStore, or indices in the namespace of the new KB > instance. 
--> > <!-- --> > <entry > key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> > </properties> > > > > -- > ---- > Bryan Thompson > Chief Scientist & Founder > SYSTAP, LLC > 4501 Tower Road > Greensboro, NC 27410 > br...@sy... > http://bigdata.com > http://mapgraph.io > > > |
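Editorial note: the repeated "Couldn't resolve host" failures in the transcripts above come from curl's argument parsing, not from the server. A bare `@tmp.xml` (or `@/home/bigdataAnt/tmp.xml`) token is not bound to any `--data-*` option, so curl treats it as a second URL and tries to resolve it as a hostname; the `@file` form only has meaning as the value of `--data-binary` or `--data-urlencode`. A sketch of the corrected commands, using the host and namespace from this thread (the commands are composed and printed, not run against a server):

```shell
# Wrong (from the thread): '@tmp.xml' stands alone, so curl parses it as a
# second URL and reports: curl: (6) Couldn't resolve host 'tmp.xml'
#   curl -X POST --data-binary 'uri=file:///...' @tmp.xml http://.../sparql

# Right: one payload per request, sent to the namespace's own endpoint.
ENDPOINT='http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql'
LOAD_CMD="curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' $ENDPOINT"
QUERY_CMD="curl -X POST --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' $ENDPOINT"

printf '%s\n%s\n' "$LOAD_CMD" "$QUERY_CMD"
```

Note that the properties file never appears in either command: it belongs only to the one-time namespace-creation request on the multi-tenancy endpoint.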
From: Alice E. <ali...@ya...> - 2014-10-31 12:30:20
|
Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is not helping. On Friday, 31 October 2014 5:57 PM, Jennifer <jen...@re...> wrote: I think she is missing as to where tmp.xml should be kept within her bigdata/Ant folder as I think bigdata is not able to find tmp.xml. Alice I think you should keep tmp.xml within the bigdata folder which you downloaded. From: Alice Everett <ali...@ya...> Sent: Fri, 31 Oct 2014 17:47:26 To: Bryan Thompson <br...@sy...> Cc: "big...@li..." <big...@li...> Subject: Re: [Bigdata-developers] How to use RDR with Curl Ok. Thanks a ton. But still I am a little lost. I used two methods of inserting as explained below. My namespace's name is reificationRDR. I'll be very grateful if you can help me with this a bit. Insert Method1: root:~/bigdataAnt$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql output: * getaddrinfo(3) failed for tmp.xml:80 * Couldn't resolve host 'tmp.xml' * Closing connection #0 curl: (6) Couldn't resolve host 'tmp.xml' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... 
connected > POST /bigdata/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 200 OK < Content-Type: application/xml; charset=ISO-8859-1 < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 Insert Method 2: root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql * getaddrinfo(3) failed for :80 output * Couldn't resolve host '' * Closing connection #0 curl: (6) Couldn't resolve host '' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... connected > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 500 Server Error < Content-Type: text/plain < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not found: namespace=reificationRDR at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) at 
com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) at org.eclipse.jetty.server.Server.handle(Server.java:462) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) at java.lang.Thread.run(Thread.java:745) Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR at com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) at 
com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) ... 26 more * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 Query: curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml' tmp.xml: <?xml version="1.0" encoding="UTF-8" standalone="no"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <!-- --> <!-- NEW KB NAMESPACE (required). --> <!-- --> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> <!-- --> <!-- Specify any KB specific properties here to override defaults for the BigdataSail --> <!-- AbstractTripleStore, or indices in the namespace of the new KB instance. --> <!-- --> <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> </properties> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> wrote: What is the namespace for the RDR graph? The URL you need to be using is http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql How to address a specific namespace is explicitly covered if you read the wiki section on the multitenant interface that I linked in my previous response. Thanks, Bryan On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: Thanks a lot for the help. 
> > >But I don't know where I am still going wrong: >I inserted data using: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql >And then queried it using: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr'curl: (6) Couldn't resolve host 'query=SELECT * <<' >Content-Type not recognized as RDF: application/x-www-form-urlencoded > > > >On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote: > > > >Alice, > > >The workbench choice of the "in use" namespace is recorded in JavaScript in your browser client. That choice does not affect other workbench clients and does not affect the behavior of the various endpoints when using command line tools to query or update data in the database. Thus your command line requests are being made against a namespace that is not configured for RDR support. > > >If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace. This is all described on wiki.bigdata.com on the page for the nanoSparqlServer in the section on multi-tenancy. > > >See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API > > >Thanks, >Bryan > >On Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote: > >I found out an awesome feature in Bigdata called RDR and I am trying to explore that too. Can you please let me know as to where I am going wrong while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample RDF data contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >>Loading: >>curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql >> >>(Additionally I changed my current namespace within the workbench opened in my browser to RDR mode). 
>> >> >>After this I fired the following query and got the following error (Can you please correct me as to where am I going wrong. I'll be very grateful to you for the same): >>@HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' >> >> >>SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >>java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >>at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >>at java.lang.Thread.run(Thread.java:745) >>Caused by: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: 
java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >>at org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >>at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >>at org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >>at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>... 6 more >>Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >>at com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >>at com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >>at 
com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >>... 15 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >>... 20 more >>Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >>at com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >>at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >>at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >> >> >>... 
4 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >>at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >>at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >>... 9 more >>Caused by: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >>at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >>... 3 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >>... 
8 more >>Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >>... 8 more >>Caused by: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >>... 12 more >>Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >>... 14 more >> >> > >-- > >---- >Bryan Thompson > >Chief Scientist & Founder >SYSTAP, LLC > >4501 Tower Road >Greensboro, NC 27410 > >br...@sy... > >http://bigdata.com > >http://mapgraph.io > >CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments. > > > > > >I dont know why am I getting an error when I am querying using RDR. Can you please help me with this one last time. 
> > >My tmp.xml file is: ><?xml version="1.0" encoding="UTF-8" standalone="no"?> ><!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> ><properties> ><!-- --> ><!-- NEW KB NAMESPACE (required). --> ><!-- --> ><entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> ><!-- --> ><!-- Specify any KB specific properties here to override defaults for the BigdataSail --> ><!-- AbstractTripleStore, or indices in the namespace of the new KB instance. --> ><!-- --> ><entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> ></properties> > > -- ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 br...@sy... http://bigdata.com http://mapgraph.io CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments. ------------------------------------------------------------------------------ _______________________________________________ Bigdata-developers mailing list Big...@li... https://lists.sourceforge.net/lists/listinfo/bigdata-developers |
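[Editor's note] The workflow Bryan describes in the thread above comes down to three requests against the multi-tenancy API: create the namespace from the properties file, load into that namespace's endpoint, then query the same endpoint. A sketch, using the hypothetical host and the RDRMode properties file from these messages; the `echo` prefix makes each command a dry run (remove it to actually execute):

```shell
#!/bin/sh
# Dry-run sketch of the Bigdata multi-tenancy workflow.
# Host, namespace, and file paths are the hypothetical values from this thread.
BASE="http://192.168.145.1:9999/bigdata"
NS="RDRMode"
EP="$BASE/namespace/$NS/sparql"

# 1. Create the namespace from the properties file (tmp.xml above);
#    the statementIdentifiers=true entry is what enables RDR for this KB.
echo curl -X POST -H "Content-Type: application/xml" --data-binary @tmp.xml "$BASE/namespace"

# 2. Load the Turtle file into THAT namespace, not the default KB.
echo curl -X POST --data-urlencode "uri=file:///home/reifiedTriples.ttl" "$EP"

# 3. Query with SPARQL* against the same namespace-specific endpoint.
echo curl -X POST --data-urlencode "query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }" -H "Accept: application/rdr" "$EP"
```

The key point is that steps 2 and 3 use the namespace-specific URL; POSTing to the bare /bigdata/sparql endpoint targets the default KB, which in this thread was not configured for RDR.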
From: Bryan T. <br...@sy...> - 2014-10-31 12:30:05
|
What do you see if you issue wget http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql This should return the service description of the sparql endpoint. In the stack trace, it indicates that the reificationRDR namespace was not found. Are you certain that this is the correct name for that namespace? Thanks, Bryan ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 br...@sy... http://bigdata.com http://mapgraph.io CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments. On Fri, Oct 31, 2014 at 8:17 AM, Alice Everett <ali...@ya...> wrote: > Ok. Thanks a ton. But still I am a little lost. I used two methods of > inserting as explained below. My namespace's name is reificationRDR. > I'll be very grateful if you can help me with this a bit. > > Insert Method1: > root:~/bigdataAnt$ curl -v -X POST --data-binary > 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml > http://192.168.145.1:9999/bigdata/sparql > output: > * getaddrinfo(3) failed for tmp.xml:80 > * Couldn't resolve host 'tmp.xml' > * Closing connection #0 > curl: (6) Couldn't resolve host 'tmp.xml' > * About to connect() to 192.168.145.1 port 9999 (#0) > * Trying 192.168.145.1... 
connected > > POST /bigdata/sparql HTTP/1.1 > > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > > Host: 192.168.145.1:9999 > > Accept: */* > > Content-Length: 52 > > Content-Type: application/x-www-form-urlencoded > > > * upload completely sent off: 52out of 52 bytes > < HTTP/1.1 200 OK > < Content-Type: application/xml; charset=ISO-8859-1 > < Transfer-Encoding: chunked > < Server: Jetty(9.1.4.v20140401) > < > * Connection #0 to host 192.168.145.1 left intact > * Closing connection #0 > > > Insert Method 2: > root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary > 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml > http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql > * getaddrinfo(3) failed for :80 > output > * Couldn't resolve host '' > * Closing connection #0 > curl: (6) Couldn't resolve host '' > * About to connect() to 192.168.145.1 port 9999 (#0) > * Trying 192.168.145.1... 
connected > > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 > > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 > OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > > Host: 192.168.145.1:9999 > > Accept: */* > > Content-Length: 52 > > Content-Type: application/x-www-form-urlencoded > > > * upload completely sent off: 52out of 52 bytes > < HTTP/1.1 500 Server Error > < Content-Type: text/plain > < Transfer-Encoding: chunked > < Server: Jetty(9.1.4.v20140401) > < > uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] > java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not > found: namespace=reificationRDR > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) > at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) > at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) > at > com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) > at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) > at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) > at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) > at > org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) > at > org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) > at > org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) > at > org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) > at > org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) > at > org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) > at > org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) > at > 
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) > at > org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) > at > org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) > at > org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) > at > org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) > at org.eclipse.jetty.server.Server.handle(Server.java:462) > at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) > at > org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) > at > org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) > at > org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) > at > org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) > at java.lang.Thread.run(Thread.java:745) > Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR > at > com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) > at > com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) > at > com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) > at > com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at > com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) > at > com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) > ... 
26 more > * Connection #0 to host 192.168.145.1 left intact > * Closing connection #0 > > > Query: > curl -X POST > http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql > --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H > 'Accept:application/rdf+xml' > > tmp.xml: > <?xml version="1.0" encoding="UTF-8" standalone="no"?> > <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> > <properties> > <!-- --> > <!-- NEW KB NAMESPACE (required). --> > <!-- --> > <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> > <!-- --> > <!-- Specify any KB specific properties here to override defaults for the > BigdataSail --> > <!-- AbstractTripleStore, or indices in the namespace of the new KB > instance. --> > <!-- --> > <entry > key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> > </properties> > > > > On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> > wrote: > > > What is the namespace for the RDR graph? > > The URL you need to be using is > > http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql > <http://192.168.145.1:9999/bigdata/sparql> > > How to address a specific namespace is explicitly covered if you read the > wiki section on the multitenant interface that I linked in my previous > response. > > Thanks, > Bryan > > On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: > > Thanks a lot for the help. 
> > But I dont know where I am still going wrong: > I inserted data using: curl -v -X POST --data-binary > 'uri=file:///home/reifiedTriples.ttl' @tmp.xml > http://192.168.145.1:9999/bigdata/sparql > And then queried it using: curl -X POST > http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml > 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' > curl: (6) Couldn't resolve host 'query=SELECT * <<' > Content-Type not recognized as RDF: application/x-www-form-urlencoded > > > On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> > wrote: > > > Alice, > > The workbench choice of the "in use" namespace is recorded in java script > in your browser client. That choice does not effect other workbench > clients and does not effect the behavior of the various endpoints when > using command line tools to query or update data in the database. Thus your > command line requests are being made against a namespace that is not > configured for RDR support. > > If you want to address a non-default bigdata namespace using curl or wget, > you must use the appropriate URL for that namespace. This is all described > on wiki.bigdata.com on the page for the nanoSparqlServer in the section > on multi-tenancy. > > See > http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API > > Thanks, > Bryan > > On Thursday, October 30, 2014, Alice Everett <ali...@ya...> > wrote: > > I found out an awesome feature in Bigdata called RDR and I am trying to > explore that too. Can you please let me know as to where am I going wrong > while querying RDR data (http://trac.bigdata.com/ticket/815). 
(My sample > RDF data, contains reification in its standard form: > http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification > <http://www.w3.org/2001/sw/DataAccess/rq23/queryReification>) > Loading: > curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' > http://192.168.145.1:9999/bigdata/sparql > (Additionally I changed my current namespace within the workbench opened > in my browser to RDR mode). > > After this I fired the following query and got the following error (Can > you please correct me as to where am I going wrong. I'll be very grateful > to you for the same): > @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST > http://192.168.145.1:9999/bigdata/sparql --header > "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> > ?p1 ?o1 }' -H 'Accept:application/rdr' > > SELECT * {<<?s ?p ?o>> ?p1 ?o1 } > java.util.concurrent.ExecutionException: > org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) > at > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) > at java.lang.Thread.run(Thread.java:745) > Caused by: 
org.openrdf.query.QueryEvaluationException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) > at > org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) > at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) > at > org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) > at > com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at > com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) > ... 
6 more > Caused by: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) > at > com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) > at > com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) > at > com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) > ... 15 more > Caused by: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) > ... 
20 more > Caused by: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) > at > com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) > at > com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) > at > com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) > > ... 4 more > Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) > at > com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) > at > com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) > ... 
9 more > Caused by: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) > ... 3 more > Caused by: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) > ... 8 more > Caused by: java.lang.RuntimeException: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) > ... 
8 more > Caused by: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) > at > com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) > ... 12 more > Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) > ... 14 more > > > > -- > ---- > Bryan Thompson > Chief Scientist & Founder > SYSTAP, LLC > 4501 Tower Road > Greensboro, NC 27410 > br...@sy... > http://bigdata.com > http://mapgraph.io > CONFIDENTIALITY NOTICE: This email and its contents and attachments are > for the sole use of the intended recipient(s) and are confidential or > proprietary to SYSTAP. Any unauthorized review, use, disclosure, > dissemination or copying of this email or its contents or attachments is > prohibited. If you have received this communication in error, please notify > the sender by reply email and permanently delete all copies of the email > and its contents and attachments. > > > > > I dont know why am I getting an error when I am querying using RDR. Can > you please help me with this one last time. > > My tmp.xml file is: > <?xml version="1.0" encoding="UTF-8" standalone="no"?> > <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> > <properties> > <!-- --> > <!-- NEW KB NAMESPACE (required). --> > <!-- --> > <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> > <!-- --> > <!-- Specify any KB specific properties here to override defaults for the > BigdataSail --> > <!-- AbstractTripleStore, or indices in the namespace of the new KB > instance. 
--> > <!-- --> > <entry > key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> > </properties> > > > > -- > ---- > Bryan Thompson > Chief Scientist & Founder > SYSTAP, LLC > 4501 Tower Road > Greensboro, NC 27410 > br...@sy... > http://bigdata.com > http://mapgraph.io > CONFIDENTIALITY NOTICE: This email and its contents and attachments are > for the sole use of the intended recipient(s) and are confidential or > proprietary to SYSTAP. Any unauthorized review, use, disclosure, > dissemination or copying of this email or its contents or attachments is > prohibited. If you have received this communication in error, please notify > the sender by reply email and permanently delete all copies of the email > and its contents and attachments. > > > > |
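Since the thread above asks for a small example illustrating RDR usage, here is a minimal sketch. Everything in it is illustrative: the :alice/:knows data, the localhost endpoint, and the RDRMode namespace (matching the tmp.xml above) are assumptions, and the curl calls require a server already running with that namespace created in RDR mode.

```shell
# Minimal RDR data file using bigdata's "<< ... >>" statement syntax, which
# lets a triple itself be the subject of further (metadata) triples.
cat > sample.ttlx <<'EOF'
@prefix : <http://example.org/> .
:alice :knows :bob .
<< :alice :knows :bob >> :source :wikipedia .
EOF

# Load it into the RDR-enabled namespace (commented out: needs a live server;
# the RDR Turtle MIME type may vary by release):
# curl -X POST -H 'Content-Type: application/x-turtle-RDR' \
#      --data-binary @sample.ttlx \
#      http://localhost:9999/bigdata/namespace/RDRMode/sparql

# Query the statement-level metadata with SPARQL*:
# curl -X POST http://localhost:9999/bigdata/namespace/RDRMode/sparql \
#      --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' \
#      -H 'Accept: application/rdr'

grep -c ':knows' sample.ttlx   # the triple occurs once as data, once inside << >>
```

Run against a live endpoint, the SELECT should bind ?p1/?o1 to :source/:wikipedia for the quoted triple.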
From: Jennifer <jen...@re...> - 2014-10-31 12:27:39
|
I think she is unsure where tmp.xml should be kept within her bigdata/Ant folder; bigdata does not seem able to find tmp.xml. Alice, I think you should keep tmp.xml within the bigdata folder which you downloaded.

From: Alice Everett <ali...@ya...>
Sent: Fri, 31 Oct 2014 17:47:26
To: Bryan Thompson <br...@sy...>
Cc: "big...@li..." <big...@li...>
Subject: Re: [Bigdata-developers] How to use RDR with Curl

Ok. Thanks a ton. But still I am a little lost. I used two methods of inserting, as explained below. My namespace's name is reificationRDR. I'll be very grateful if you can help me with this a bit.

Insert Method 1:
root:~/bigdataAnt$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql
output:
* getaddrinfo(3) failed for tmp.xml:80
* Couldn't resolve host 'tmp.xml'
* Closing connection #0
curl: (6) Couldn't resolve host 'tmp.xml'
* About to connect() to 192.168.145.1 port 9999 (#0)
* Trying 192.168.145.1... connected
> POST /bigdata/sparql HTTP/1.1
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: 192.168.145.1:9999
> Accept: */*
> Content-Length: 52
> Content-Type: application/x-www-form-urlencoded
* upload completely sent off: 52 out of 52 bytes
< HTTP/1.1 200 OK
< Content-Type: application/xml; charset=ISO-8859-1
< Transfer-Encoding: chunked
< Server: Jetty(9.1.4.v20140401)
* Connection #0 to host 192.168.145.1 left intact
* Closing connection #0

Insert Method 2:
root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql
* getaddrinfo(3) failed for :80
output
* Couldn't resolve host ''
* Closing connection #0
curl: (6) Couldn't resolve host ''
* About to connect() to 192.168.145.1 port 9999 (#0)
* Trying 192.168.145.1...
connected> POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3> Host: 192.168.145.1:9999> Accept: */*> Content-Length: 52> Content-Type: application/x-www-form-urlencoded> * upload completely sent off: 52out of 52 bytes< HTTP/1.1 500 Server Error< Content-Type: text/plain< Transfer-Encoding: chunked< Server: Jetty(9.1.4.v20140401)< uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[]java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not found: namespace=reificationRDR at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) 
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) at org.eclipse.jetty.server.Server.handle(Server.java:462) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) at java.lang.Thread.run(Thread.java:745)Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR at com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) ... 26 more* Connection #0 to host 192.168.145.1 left intact* Closing connection #0Query:curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml'tmp.xml:<?xml version="1.0" encoding="UTF-8" standalone="no"?><!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"><properties><!-- --><!-- NEW KB NAMESPACE (required). 
-->
<!-- -->
<entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry>
<!-- -->
<!-- Specify any KB specific properties here to override defaults for the BigdataSail -->
<!-- AbstractTripleStore, or indices in the namespace of the new KB instance. -->
<!-- -->
<entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry>
</properties>

On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> wrote:

What is the namespace for the RDR graph? The URL you need to be using is http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql

How to address a specific namespace is explicitly covered if you read the wiki section on the multitenant interface that I linked in my previous response.

Thanks,
Bryan

On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote:

Thanks a lot for the help. But I don't know where I am still going wrong:
I inserted data using: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql
And then queried it using: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr'
curl: (6) Couldn't resolve host 'query=SELECT * <<'
Content-Type not recognized as RDF: application/x-www-form-urlencoded

On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote:

Alice,
The workbench choice of the "in use" namespace is recorded in JavaScript in your browser client. That choice does not affect other workbench clients and does not affect the behavior of the various endpoints when using command line tools to query or update data in the database. Thus your command line requests are being made against a namespace that is not configured for RDR support.
If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace.
This is all described on wiki.bigdata.com on the page for the nanoSparqlServer in the section on multi-tenancy. See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_APIThanks,BryanOn Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote:I found out an awesome feature in Bigdata called RDR and I am trying to explore that too. Can you please let me know as to where am I going wrong while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample RDF data, contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification)Loading:curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql(Additionally I changed my current namespace within the workbench opened in my browser to RDR mode).After this I fired the following query and got the following error (Can you please correct me as to where am I going wrong. I'll be very grateful to you for the same):@HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' SELECT * {<<?s ?p ?o>> ?p1 ?o1 }java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) at 
com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:745)Caused by: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) at org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) at org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) ... 
6 moreCaused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) at com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) at com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) ... 15 moreCaused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) ... 
20 moreCaused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) at com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) at com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) at com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) ... 4 moreCaused by: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) ... 
9 moreCaused by: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) ... 3 moreCaused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) ... 8 moreCaused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) ... 8 moreCaused by: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) at com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) ... 
12 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317)
at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971)
... 14 more

--
----
Bryan Thompson
Chief Scientist & Founder
SYSTAP, LLC
4501 Tower Road
Greensboro, NC 27410
br...@sy...
http://bigdata.com
http://mapgraph.io
CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments.

I don't know why I am getting an error when I am querying using RDR. Can you please help me with this one last time.

My tmp.xml file is:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
<!-- -->
<!-- NEW KB NAMESPACE (required). -->
<!-- -->
<entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry>
<!-- -->
<!-- Specify any KB specific properties here to override defaults for the BigdataSail -->
<!-- AbstractTripleStore, or indices in the namespace of the new KB instance. -->
<!-- -->
<entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry>
</properties>

--
----
Bryan Thompson
Chief Scientist & Founder
SYSTAP, LLC
4501 Tower Road
Greensboro, NC 27410
br...@sy...
http://bigdata.com
http://mapgraph.io
CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP.
Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments.

------------------------------------------------------------------------------
_______________________________________________
Bigdata-developers mailing list
Big...@li...
https://lists.sourceforge.net/lists/listinfo/bigdata-developers
|
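Pulling Bryan's multi-tenancy advice and Alice's curl failures together: the repeated "Couldn't resolve host" errors come from passing @tmp.xml as a free-standing argument, which curl parses as a second URL rather than a request body, and "Not found: namespace=reificationRDR" means that namespace was never created on the server. A sketch of the intended three-step workflow, reusing the host, file names, and namespace from the thread (the server must be running, so the curl calls are shown commented out):

```shell
# Hypothetical endpoint from the thread; adjust to your NanoSparqlServer.
ENDPOINT=http://192.168.145.1:9999/bigdata

# 1. Write the properties file describing the new namespace
#    (statementIdentifiers=true enables RDR support).
cat > tmp.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
<entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry>
<entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry>
</properties>
EOF

# 2. Create the namespace via the multi-tenancy API. The file must be the
#    request body ("--data-binary @tmp.xml"); a free-standing "@tmp.xml"
#    argument is treated by curl as another URL, hence
#    "Couldn't resolve host 'tmp.xml'".
# curl -X POST -H 'Content-Type: application/xml' --data-binary @tmp.xml \
#      "$ENDPOINT/namespace"

# 3. Load and query through the namespace-specific endpoint,
#    not the default /bigdata/sparql:
# curl -X POST "$ENDPOINT/namespace/reificationRDR/sparql" \
#      --data-urlencode 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl'
# curl -X POST "$ENDPOINT/namespace/reificationRDR/sparql" \
#      --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' \
#      -H 'Accept: application/rdr'

grep -q 'statementIdentifiers' tmp.xml && echo "properties file written"
```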
From: Alice E. <ali...@ya...> - 2014-10-31 12:17:15
|
Ok. Thanks a ton. But still I am a little lost. I used two methods of inserting as explained below. My namespace's name is reificationRDR. I'll be very grateful if you can help me with this a bit. Insert Method1: root:~/bigdataAnt$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql output: * getaddrinfo(3) failed for tmp.xml:80 * Couldn't resolve host 'tmp.xml' * Closing connection #0 curl: (6) Couldn't resolve host 'tmp.xml' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... connected > POST /bigdata/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 200 OK < Content-Type: application/xml; charset=ISO-8859-1 < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 Insert Method 2: root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql * getaddrinfo(3) failed for :80 output * Couldn't resolve host '' * Closing connection #0 curl: (6) Couldn't resolve host '' * About to connect() to 192.168.145.1 port 9999 (#0) * Trying 192.168.145.1... 
connected > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1 > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3 > Host: 192.168.145.1:9999 > Accept: */* > Content-Length: 52 > Content-Type: application/x-www-form-urlencoded > * upload completely sent off: 52out of 52 bytes < HTTP/1.1 500 Server Error < Content-Type: text/plain < Transfer-Encoding: chunked < Server: Jetty(9.1.4.v20140401) < uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[] java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not found: namespace=reificationRDR at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401) at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117) at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267) at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144) at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045) at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97) at org.eclipse.jetty.server.Server.handle(Server.java:462) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232) at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607) at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536) at java.lang.Thread.run(Thread.java:745) Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR at com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457) at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414) at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220) ... 26 more * Connection #0 to host 192.168.145.1 left intact * Closing connection #0 Query: curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml' tmp.xml: <?xml version="1.0" encoding="UTF-8" standalone="no"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <!-- --> <!-- NEW KB NAMESPACE (required). 
--> <!-- --> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry> <!-- --> <!-- Specify any KB specific properties here to override defaults for the BigdataSail --> <!-- AbstractTripleStore, or indices in the namespace of the new KB instance. --> <!-- --> <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> </properties> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> wrote: What is the namespace for the RDR graph? The URL you need to be using is http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql How to address a specific namespace is explicitly covered if you read the wiki section on the multitenant interface that I linked in my previous response. Thanks, Bryan On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: Thanks a lot for the help. > > >But I dont know where I am still going wrong: >I inserted data using: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql >And then queried it using: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr'curl: (6) Couldn't resolve host 'query=SELECT * <<' >Content-Type not recognized as RDF: application/x-www-form-urlencoded > > > >On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote: > > > >Alice, > > >The workbench choice of the "in use" namespace is recorded in java script in your browser client. That choice does not effect other workbench clients and does not effect the behavior of the various endpoints when using command line tools to query or update data in the database. Thus your command line requests are being made against a namespace that is not configured for RDR support. > > >If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace. 
This is all described on wiki.bigdata.com on the page for the nanoSparqlServer in the section on multi-tenancy. > > >See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API > > >Thanks, >Bryan > >On Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote: > >I found out an awesome feature in Bigdata called RDR and I am trying to explore that too. Can you please let me know as to where am I going wrong while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample RDF data, contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >>Loading: >>curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql >> >>(Additionally I changed my current namespace within the workbench opened in my browser to RDR mode). >> >> >>After this I fired the following query and got the following error (Can you please correct me as to where am I going wrong. I'll be very grateful to you for the same): >>@HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' >> >> >>SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >>java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at 
com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >>at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >>at java.lang.Thread.run(Thread.java:745) >>Caused by: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >>at org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >>at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >>at org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >>at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >>at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >>... 
6 more >>Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >>at com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >>at com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >>at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >>... 15 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >>... 
20 more >>Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >>at com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >>at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >>at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) >> >> >>... 4 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >>at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >>at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >>at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >>... 
9 more >>Caused by: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >>at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >>... 3 more >>Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at java.util.concurrent.FutureTask.report(FutureTask.java:122) >>at java.util.concurrent.FutureTask.get(FutureTask.java:188) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >>... 8 more >>Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >>at java.util.concurrent.FutureTask.run(FutureTask.java:262) >>at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >>at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >>... 
8 more >>Caused by: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >>at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >>... 12 more >>Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >>at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >>... 14 more >> >> > >-- > >---- >Bryan Thompson > >Chief Scientist & Founder >SYSTAP, LLC > >4501 Tower Road >Greensboro, NC 27410 > >br...@sy... > >http://bigdata.com > >http://mapgraph.io > >CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments. > > > > > >I dont know why am I getting an error when I am querying using RDR. Can you please help me with this one last time. > > >My tmp.xml file is: ><?xml version="1.0" encoding="UTF-8" standalone="no"?> ><!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> ><properties> ><!-- --> ><!-- NEW KB NAMESPACE (required). --> ><!-- --> ><entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> ><!-- --> ><!-- Specify any KB specific properties here to override defaults for the BigdataSail --> ><!-- AbstractTripleStore, or indices in the namespace of the new KB instance. 
--> ><!-- --> ><entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> ></properties> > > -- ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC br...@sy... http://bigdata.com http://mapgraph.io |
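The workflow being attempted throughout this thread can be sketched end to end. The following is a dry-run sketch: each step is echoed rather than executed, so no server is contacted. The host/port, the `RDRMode` namespace name, and the file paths are taken from the emails above; the namespace-create call follows the Multi-Tenancy API page linked later in the thread and should be treated as an assumption, not a verified transcript.

```shell
# Dry-run sketch of the RDR workflow discussed in this thread.
# Commands are echoed, not executed; adjust HOST/NS/paths as needed.
HOST="http://192.168.145.1:9999"
NS="RDRMode"
ENDPOINT="$HOST/bigdata/namespace/$NS/sparql"

# 1. Create the RDR-enabled namespace by POSTing the properties XML
#    (the tmp.xml shown above) to the multi-tenancy endpoint:
echo curl -X POST "$HOST/bigdata/namespace" \
     -H 'Content-Type: application/xml' --data-binary @tmp.xml

# 2. Load the reified data into THAT namespace, not the default one:
echo curl -X POST "$ENDPOINT" \
     --data-urlencode 'uri=file:///home/SmallFacts.ttl'

# 3. Query with SPARQL* syntax and request RDR-aware results:
echo curl -X POST "$ENDPOINT" \
     --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' \
     -H 'Accept: application/rdr'
```

The key point is that all three requests name the namespace explicitly in the URL; only step 1 ever sends the properties file.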
From: Bryan T. <br...@sy...> - 2014-10-31 12:00:25
|
What is the namespace for the RDR graph? The URL you need to be using is http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql How to address a specific namespace is explicitly covered in the wiki section on the multi-tenant interface that I linked in my previous response. Thanks, Bryan On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote: > Thanks a lot for the help. > > But I don't know where I am still going wrong: > I inserted data using: curl -v -X POST --data-binary > 'uri=file:///home/reifiedTriples.ttl' @tmp.xml > http://192.168.145.1:9999/bigdata/sparql > And then queried it using: curl -X POST > http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml > 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' > curl: (6) Couldn't resolve host 'query=SELECT * <<' > Content-Type not recognized as RDF: application/x-www-form-urlencoded > > > On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote: > > > Alice, > > The workbench choice of the "in use" namespace is recorded in JavaScript > in your browser client. That choice does not affect other workbench > clients and does not affect the behavior of the various endpoints when > using command line tools to query or update data in the database. Thus your > command line requests are being made against a namespace that is not > configured for RDR support. > > If you want to address a non-default bigdata namespace using curl or wget, > you must use the appropriate URL for that namespace. This is all described > on wiki.bigdata.com on the page for the NanoSparqlServer in the section > on multi-tenancy. > > See > http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API > > Thanks, > Bryan > > On Thursday, October 30, 2014, Alice Everett <ali...@ya...
> wrote: > > I found out an awesome feature in Bigdata called RDR and I am trying to > explore that too. Can you please let me know as to where am I going wrong > while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample > RDF data contains reification in its standard form: > http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) > Loading: > curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' > http://192.168.145.1:9999/bigdata/sparql > (Additionally I changed my current namespace within the workbench opened > in my browser to RDR mode). > > After this I fired the following query and got the following error (Can > you please correct me as to where am I going wrong. I'll be very grateful > to you for the same): > @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST > http://192.168.145.1:9999/bigdata/sparql --header > "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> > ?p1 ?o1 }' -H 'Accept:application/rdr' > > SELECT * {<<?s ?p ?o>> ?p1 ?o1 } > java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: ... > [identical ArrayIndexOutOfBoundsException: 0 stack trace as quoted above] > > I dont know why am I getting an error when I am querying using RDR. Can > you please help me with this one last time. > > My tmp.xml file is: > <?xml version="1.0" encoding="UTF-8" standalone="no"?> > <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> > <properties> > <!-- --> > <!-- NEW KB NAMESPACE (required). --> > <!-- --> > <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> > <!-- --> > <!-- Specify any KB specific properties here to override defaults for the > BigdataSail --> > <!-- AbstractTripleStore, or indices in the namespace of the new KB > instance.
--> > <!-- --> > <entry > key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> > </properties> > > -- ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC br...@sy... http://bigdata.com http://mapgraph.io |
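Bryan's point about the multi-tenancy URL can be checked before loading anything: the server exposes a namespace listing, and if `RDRMode` is absent the properties file was never applied to the server at all. A dry-run sketch (commands are echoed, not executed; the host/port and namespace name come from this thread, and the listing call is an assumption based on the Multi-Tenancy API page he links):

```shell
# Dry-run sanity check: which namespaces exist, and which URL to target.
HOST="http://192.168.145.1:9999"
NS="RDRMode"

# Listing the namespaces known to the server (the response describes
# each namespace and its SPARQL endpoint):
echo curl "$HOST/bigdata/namespace"

# The per-namespace endpoint that curl/wget requests must target --
# NOT the default /bigdata/sparql endpoint:
echo "$HOST/bigdata/namespace/$NS/sparql"
```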
From: Alice E. <ali...@ya...> - 2014-10-31 11:59:23
|
Just to make it more readable what I have done: INSERT: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql Query: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' Error when queried using RDR: curl: (6) Couldn't resolve host 'query=SELECT * <<' Content-Type not recognized as RDF: application/x-www-form-urlencoded tmp.xml: <?xml version="1.0" encoding="UTF-8" standalone="no"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <!-- --> <!-- NEW KB NAMESPACE (required). --> <!-- --> <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> <!-- --> <!-- Specify any KB specific properties here to override defaults for the BigdataSail --> <!-- AbstractTripleStore, or indices in the namespace of the new KB instance. --> <!-- --> <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> </properties> On Friday, 31 October 2014 5:20 PM, Alice Everett <ali...@ya...> wrote: Thanks a lot for the help. But I don't know where I am still going wrong: I inserted data using: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql And then queried it using: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr' curl: (6) Couldn't resolve host 'query=SELECT * <<' Content-Type not recognized as RDF: application/x-www-form-urlencoded On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote: Alice, The workbench choice of the "in use" namespace is recorded in JavaScript in your browser client. That choice does not affect other workbench clients and does not affect the behavior of the various endpoints when using command line tools to query or update data in the database.
Thus your command line requests are being made against a namespace that is not configured for RDR support. If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace. This is all described on wiki.bigdata.com on the page for the nanoSparqlServer in the section on multi-tenancy. See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API Thanks, Bryan On Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote: I found out an awesome feature in Bigdata called RDR and I am trying to explore that too. Can you please let me know as to where am I going wrong while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample RDF data, contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >Loading: >curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql > >(Additionally I changed my current namespace within the workbench opened in my browser to RDR mode). > > >After this I fired the following query and got the following error (Can you please correct me as to where am I going wrong. 
I'll be very grateful to you for the same): >@HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' > > >SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: ... >[identical ArrayIndexOutOfBoundsException: 0 stack trace as quoted above] I dont know why am I getting an error when I am querying using RDR. Can you please help me with this one last time.
My tmp.xml file is: <?xml version="1.0" encoding="UTF-8" standalone="no"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <!-- --> <!-- NEW KB NAMESPACE (required). --> <!-- --> <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> <!-- --> <!-- Specify any KB specific properties here to override defaults for the BigdataSail --> <!-- AbstractTripleStore, or indices in the namespace of the new KB instance. --> <!-- --> <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> </properties> |
From: Alice E. <ali...@ya...> - 2014-10-31 11:51:05
|
Thanks a lot for the help. But I don't know where I am still going wrong: I inserted data using: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql And then queried it using: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr'curl: (6) Couldn't resolve host 'query=SELECT * <<' Content-Type not recognized as RDF: application/x-www-form-urlencoded On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote: Alice, The workbench choice of the "in use" namespace is recorded in JavaScript in your browser client. That choice does not affect other workbench clients and does not affect the behavior of the various endpoints when using command-line tools to query or update data in the database. Thus your command-line requests are being made against a namespace that is not configured for RDR support. If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace. This is all described on wiki.bigdata.com on the page for the NanoSparqlServer in the section on multi-tenancy. See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API Thanks, Bryan On Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote: I found out an awesome feature in Bigdata called RDR and I am trying to explore it too. Can you please let me know where I am going wrong while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample RDF data contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) >Loading: >curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql > >(Additionally, I changed my current namespace within the workbench opened in my browser to RDR mode). 
> > >After this I fired the following query and got the following error (Can you please correct me as to where am I going wrong. I'll be very grateful to you for the same): >@HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' > > >SELECT * {<<?s ?p ?o>> ?p1 ?o1 } >java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at java.util.concurrent.FutureTask.report(FutureTask.java:122) >at java.util.concurrent.FutureTask.get(FutureTask.java:188) >at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) >at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) >at java.util.concurrent.FutureTask.run(FutureTask.java:262) >at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) >at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) >at java.lang.Thread.run(Thread.java:745) >Caused by: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at 
com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) >at org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) >at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) >at org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) >at com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) >at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) >at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) >at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) >at java.util.concurrent.FutureTask.run(FutureTask.java:262) >at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) >... 6 more >Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) >at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) >at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) >at com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) >at com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) >at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) >... 
15 more >Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at java.util.concurrent.FutureTask.report(FutureTask.java:122) >at java.util.concurrent.FutureTask.get(FutureTask.java:188) >at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) >... 20 more >Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) >at com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) >at com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) >at com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) >at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) >at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) > > >... 
4 more >Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) >at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) >at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) >at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) >... 9 more >Caused by: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) >at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) >at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) >at java.util.concurrent.FutureTask.run(FutureTask.java:262) >at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) >... 3 more >Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at java.util.concurrent.FutureTask.report(FutureTask.java:122) >at java.util.concurrent.FutureTask.get(FutureTask.java:188) >at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) >... 
8 more >Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) >at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) >at java.util.concurrent.FutureTask.run(FutureTask.java:262) >at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) >at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) >... 8 more >Caused by: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 >at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) >at com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) >at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) >... 12 more >Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 >at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) >at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) >... 14 more > > -- ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 br...@sy... http://bigdata.com http://mapgraph.io I don't know why I am getting an error when I am querying using RDR. Can you please help me with this one last time. 
My tmp.xml file is: <?xml version="1.0" encoding="UTF-8" standalone="no"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <!-- --> <!-- NEW KB NAMESPACE (required). --> <!-- --> <entry key="com.bigdata.rdf.sail.namespace">RDRMode</entry> <!-- --> <!-- Specify any KB specific properties here to override defaults for the BigdataSail --> <!-- AbstractTripleStore, or indices in the namespace of the new KB instance. --> <!-- --> <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry> </properties> |
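Bryan's point about addressing the non-default namespace can be sketched as follows. The `RDRMode` namespace name comes from the tmp.xml above and the host from this thread; the multi-tenancy URL pattern is the one described on the NanoSparqlServer wiki page, the `Accept` header is one plausible choice for a SELECT query, and the curl invocations are shown commented out because they need a live server:

```shell
# Sketch: target the non-default "RDRMode" namespace directly,
# instead of the default endpoint /bigdata/sparql.
HOST="http://192.168.145.1:9999"
NS="RDRMode"
ENDPOINT="$HOST/bigdata/namespace/$NS/sparql"
echo "$ENDPOINT"
# Load the Turtle file into that namespace:
# curl -X POST "$ENDPOINT" --data-urlencode 'uri=file:///home/reifiedTriples.ttl'
# Query it with SPARQL* (RDR syntax):
# curl -X POST "$ENDPOINT" \
#      --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' \
#      -H 'Accept: application/sparql-results+json'
```

The key design point is that the namespace chosen in the workbench UI is browser-local state; command-line clients see only the namespace encoded in the request URL.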
From: Bryan T. <br...@sy...> - 2014-10-31 10:25:49
|
Alice, The workbench choice of the "in use" namespace is recorded in JavaScript in your browser client. That choice does not affect other workbench clients and does not affect the behavior of the various endpoints when using command-line tools to query or update data in the database. Thus your command-line requests are being made against a namespace that is not configured for RDR support. If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace. This is all described on wiki.bigdata.com on the page for the NanoSparqlServer in the section on multi-tenancy. See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API Thanks, Bryan On Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote: > I found out an awesome feature in Bigdata called RDR and I am trying to > explore that too. Can you please let me know as to where am I going wrong > while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample > RDF data, contains reification in its standard form: > http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification > <http://www.w3.org/2001/sw/DataAccess/rq23/queryReification>) > Loading: > curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' > http://192.168.145.1:9999/bigdata/sparql > (Additionally I changed my current namespace within the workbench opened > in my browser to RDR mode). > > After this I fired the following query and got the following error (Can > you please correct me as to where am I going wrong. 
I'll be very grateful > to you for the same): > @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST > http://192.168.145.1:9999/bigdata/sparql --header > "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> > ?p1 ?o1 }' -H 'Accept:application/rdr' > > SELECT * {<<?s ?p ?o>> ?p1 ?o1 } > java.util.concurrent.ExecutionException: > org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) > at > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) > at java.lang.Thread.run(Thread.java:745) > Caused by: org.openrdf.query.QueryEvaluationException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > 
com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) > at > org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) > at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) > at > org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) > at > com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) > at > com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at > com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) > ... 6 more > Caused by: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) > at > com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) > at > com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) > at > 
com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) > ... 15 more > Caused by: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.util.concurrent.ExecutionException: > java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) > ... 20 more > Caused by: java.lang.RuntimeException: > java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) > at > com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) > at > com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) > at > com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) > > ... 
4 more > Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) > at > com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) > at > com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) > at > com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) > ... 9 more > Caused by: java.lang.Exception: > task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, > cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: > java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) > ... 3 more > Caused by: java.util.concurrent.ExecutionException: > java.lang.RuntimeException: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at java.util.concurrent.FutureTask.report(FutureTask.java:122) > at java.util.concurrent.FutureTask.get(FutureTask.java:188) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) > ... 
8 more > Caused by: java.lang.RuntimeException: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) > at java.util.concurrent.FutureTask.run(FutureTask.java:262) > at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) > at > com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) > ... 8 more > Caused by: java.lang.RuntimeException: > java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) > at > com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) > at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) > ... 12 more > Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) > at > com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) > ... 14 more > > -- ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 br...@sy... http://bigdata.com http://mapgraph.io |
From: Alice E. <ali...@ya...> - 2014-10-31 03:26:25
|
I found out an awesome feature in Bigdata called RDR and I am trying to explore that too. Can you please let me know as to where am I going wrong while querying RDR data (http://trac.bigdata.com/ticket/815). (My sample RDF data, contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification) Loading: curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql (Additionally I changed my current namespace within the workbench opened in my browser to RDR mode). After this I fired the following query and got the following error (Can you please correct me as to where am I going wrong. I'll be very grateful to you for the same): @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr' SELECT * {<<?s ?p ?o>> ?p1 ?o1 } java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:745) Caused by: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188) at org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90) at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52) at org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221) at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171) at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293) ... 
6 more Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523) at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710) at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563) at com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365) at com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341) at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134) ... 15 more Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454) ... 
20 more Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59) at com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73) at com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82) at com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197) at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222) at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197) ... 4 more Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273) at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476) at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103) at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46) ... 
9 more Caused by: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335) at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789) ... 3 more Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at java.util.concurrent.FutureTask.report(FutureTask.java:122) at java.util.concurrent.FutureTask.get(FutureTask.java:188) at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315) ... 8 more Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643) at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63) at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314) ... 8 more Caused by: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988) at com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700) at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584) ... 
12 more Caused by: java.lang.ArrayIndexOutOfBoundsException: 0 at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.reorderTasks(PipelineJoin.java:1317) at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:971) ... 14 more |
From: Alice E. <ali...@ya...> - 2014-10-29 17:46:55
|
Dear Bigdata developers, I downloaded Bigdata from http://www.systap.com/download. I am using the Bigdata workbench in my browser at http://localhost:9999. I need to load large RDF datasets of about 50 GB into Bigdata. Although I find the browser interface of Bigdata very friendly for users like me who are unfamiliar with the Blueprints and Sesame APIs, I am not able to find a way to keep the load running in the background: I cannot keep my browser window open for the 3-4 days it takes the data to load. Is there a way to run the load in the background (using something like screen on Linux)? An example illustrating how to load large Turtle (.ttl) files into Bigdata in the background would be really helpful for a novice like me. Cheers, Alice |
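One common way to keep such a load running after the browser or terminal is closed is to issue it from the command line with nohup (or inside a detached screen/tmux session). A sketch follows; the dataset path and the default /bigdata/sparql endpoint are assumptions based on this thread, not verified against Alice's setup:

```shell
# Sketch: start a long-running bulk load detached from the terminal.
# The endpoint and the file path are assumptions; substitute your own.
nohup curl -s -X POST \
  --data-urlencode 'uri=file:///data/large-dataset.ttl' \
  'http://localhost:9999/bigdata/sparql' \
  > load.log 2>&1 &
LOAD_PID=$!
echo "load started (pid $LOAD_PID); watch progress with: tail -f load.log"
```

Running the same curl inside a detached screen session (for example `screen -dmS load curl ...`) achieves the same effect, and lets you reattach later to check on it.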
From: Bryan T. <br...@sy...> - 2014-10-29 13:37:09
|
You are trying to hand off the BigdataSailConnection for an unisolated connection to another thread when executing the close(). You need to do this from the same thread that obtained the connection. This is why the thread does not own the lock. Bryan ---- Bryan Thompson Chief Scientist & Founder SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 br...@sy... http://bigdata.com http://mapgraph.io CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments. On Wed, Oct 8, 2014 at 8:36 AM, ryan <ry...@os...> wrote: > Here's a simpler version of the code: just compile and run. > > The setup is one thread starts the repository, and another thread closes > it. > An exception is thrown. > > ry > > On Tuesday, October 07, 2014 05:31:28 PM Bryan Thompson wrote: > > I still do not understand the setup you have going. It sounds simple > > enough, but what "app" are you clicking on? What list? > > > > Please reduce this to code so we can replicate the issue and then we can > > take a look at it. > > > > For example, a unit test would be great. That way we can get the test > into > > the regression suite as well. > > > > Thanks, > > Bryan > > > > ---- > > Bryan Thompson > > Chief Scientist & Founder > > SYSTAP, LLC > > 4501 Tower Road > > Greensboro, NC 27410 > > br...@sy... > > http://bigdata.com > > http://mapgraph.io > > > > CONFIDENTIALITY NOTICE: This email and its contents and attachments are > > for the sole use of the intended recipient(s) and are confidential or > > proprietary to SYSTAP. 
Any unauthorized review, use, disclosure, > > dissemination or copying of this email or its contents or attachments is > > prohibited. If you have received this communication in error, please > notify > > the sender by reply email and permanently delete all copies of the email > > and its contents and attachments. > > > > On Tue, Oct 7, 2014 at 5:13 PM, ryan <ry...@os...> wrote: > > > With that example, I'm just starting and stopping a local journal file > > > directly. No webserver involved. > > > > > > > > > Bryan Thompson <br...@sy...> wrote: > > > > > > Are you running the NanoSparqlServer or the sesame workbench? > > > > > > Bryan > > > > > > ---- > > > Bryan Thompson > > > Chief Scientist & Founder > > > SYSTAP, LLC > > > 4501 Tower Road > > > Greensboro, NC 27410 > > > br...@sy... > > > http://bigdata.com > > > http://mapgraph.io > > > > > > CONFIDENTIALITY NOTICE: This email and its contents and attachments > are > > > for the sole use of the intended recipient(s) and are confidential or > > > proprietary to SYSTAP. Any unauthorized review, use, disclosure, > > > dissemination or copying of this email or its contents or attachments > is > > > prohibited. If you have received this communication in error, please > > > notify > > > the sender by reply email and permanently delete all copies of the > email > > > and its contents and attachments. > > > > > > On Tue, Oct 7, 2014 at 4:29 PM, ryan <ry...@os...> > wrote: > > >> Here's a (non)working sample. I haven't included my RWStore.properties > > >> file, > > >> but I don't think it matters. > > >> > > >> Basically, start the app with a directory containing an > > >> RWStore.properties > > >> file, then click on it from the list. The repository gets created in > one > > >> thread, and closed in another, and it bombs. 
> > >> Thanks for your help,
> > >>
> > >> ry
> > >>
> > >> On Tuesday, October 07, 2014 10:00:29 AM Bryan Thompson wrote:
> > >> > Can you provide a simple Java class that illustrates the problem? The
> > >> > unisolated connection close logic is below. If the connection is already
> > >> > closed, it should return immediately. I suspect that the problem is with
> > >> > the invocation of
> > >> >
> > >> >     BigdataSail.this.connectionClosed(this);
> > >> >
> > >> > which notifies the base SAIL that the connection is closed.
> > >> >
> > >> > However, this problem does not show up in our tests, so we would need an
> > >> > example in order to replicate it.
> > >> >
> > >> > Thanks,
> > >> > Bryan
> > >> >
> > >> >     public synchronized void close() throws SailException {
> > >> >
> > >> >         // assertOpen();
> > >> >         if (!openConn) {
> > >> >             return;
> > >> >         }
> > >> >
> > >> >         if (txLog.isInfoEnabled())
> > >> >             txLog.info("SAIL-CLOSE-CONN: conn=" + this);
> > >> >
> > >> >         final IIndexManager im = getDatabase().getIndexManager();
> > >> >
> > >> >         if (isDirty()) {
> > >> >             /*
> > >> >              * Do implicit rollback() of a dirty connection.
> > >> >              */
> > >> >             rollback();
> > >> >         }
> > >> >
> > >> >         try {
> > >> >             // Notify the SailBase that the connection is no longer in use.
> > >> >             BigdataSail.this.connectionClosed(this);
> > >> >         } finally {
> > >> >             if (lock != null) {
> > >> >                 // Release the reentrant lock.
> > >> >                 lock.unlock();
> > >> >             }
> > >> >             if (unisolated && im instanceof Journal) {
> > >> >                 // Release the permit.
> > >> >                 ((Journal) im).releaseUnisolatedConnection();
> > >> >             }
> > >> >             openConn = false;
> > >> >         }
> > >> >     }
> > >> >
> > >> > ----
> > >> > Bryan Thompson
> > >> > Chief Scientist & Founder
> > >> > SYSTAP, LLC
> > >> > 4501 Tower Road
> > >> > Greensboro, NC 27410
> > >> > br...@sy...
> > >> > http://bigdata.com
> > >> > http://mapgraph.io
> > >> >
> > >> > CONFIDENTIALITY NOTICE: This email and its contents and attachments are
> > >> > for the sole use of the intended recipient(s) and are confidential or
> > >> > proprietary to SYSTAP. Any unauthorized review, use, disclosure,
> > >> > dissemination or copying of this email or its contents or attachments is
> > >> > prohibited. If you have received this communication in error, please notify
> > >> > the sender by reply email and permanently delete all copies of the email
> > >> > and its contents and attachments.
> > >> >
> > >> > On Tue, Oct 7, 2014 at 9:36 AM, ryan <ry...@os...> wrote:
> > >> > > Hi All,
> > >> > > I'm trying to add a feature to my project that will allow a user to
> > >> > > disconnect from a locally-connected BigData (1.3.1) engine, start up the
> > >> > > network server, and reconnect to the now-networked repository.
> > >> > >
> > >> > > The problem is that every time I try to close my local connection, I get
> > >> > > an IllegalMonitorStateException that appears to be preventing the
> > >> > > network server from working correctly.
> > >> > > The pertinent stack trace is:
> > >> > >
> > >> > > java.lang.IllegalMonitorStateException
> > >> > >     at java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryRelease(ReentrantReadWriteLock.java:374)
> > >> > >     at java.util.concurrent.locks.AbstractQueuedSynchronizer.release(AbstractQueuedSynchronizer.java:1260)
> > >> > >     at java.util.concurrent.locks.ReentrantReadWriteLock$WriteLock.unlock(ReentrantReadWriteLock.java:1131)
> > >> > >     at com.bigdata.rdf.sail.BigdataSail$BigdataSailConnection.close(BigdataSail.java:3349)
> > >> > >     at org.openrdf.repository.sail.SailRepositoryConnection.close(SailRepositoryConnection.java:109)
> > >> > >
> > >> > > With the default OpenRDF backing store, the system will wait a little
> > >> > > bit, and then close connections for you if you don't close them first.
> > >> > > I've tried both ways (closing the connection first, then the repository,
> > >> > > and just closing the repository) with no difference.
> > >> > >
> > >> > > I'm running on Java 1.7, bigdata 1.3.1, on a Linux host.
> > >> > >
> > >> > > I'm at a loss for troubleshooting. I'd be grateful for any advice you
> > >> > > might have.
> > >> > >
> > >> > > Thanks,
> > >> > > ry
> > >> > >
> > >> > > --
> > >> > > ...you can't have five wolves and one sheep voting on what to have for
> > >> > > supper.
> > >> > > --Larry Flint
> > >> _______________________________________________
> > >> Bigdata-developers mailing list
> > >> Big...@li...
> > >> https://lists.sourceforge.net/lists/listinfo/bigdata-developers
> > --
> > Four boxes to be used in defense of liberty: soap, ballot, jury, ammo - use in
> > that order.
> > --Ed Howdershelt
> --
> Psychos don't burst into flames when sunlight hits them. I don't [care] how
> crazy they are.
|
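Bryan's diagnosis in this thread, that the lock guarding the unisolated connection must be released by the same thread that acquired it, can be reproduced with nothing but the JDK. This sketch (class and variable names are mine, not from the thread) triggers the same IllegalMonitorStateException seen in ryan's stack trace by unlocking a ReentrantReadWriteLock write lock from a thread that does not own it:

```java
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class LockOwnershipDemo {
    public static void main(String[] args) throws Exception {
        final ReentrantReadWriteLock lock = new ReentrantReadWriteLock();
        lock.writeLock().lock(); // acquired on the main thread

        final boolean[] failed = {false};
        Thread other = new Thread(() -> {
            try {
                // Releasing a write lock from a thread that does not own it
                // throws IllegalMonitorStateException -- the same failure mode
                // as calling close() on the connection from another thread.
                lock.writeLock().unlock();
            } catch (IllegalMonitorStateException e) {
                failed[0] = true;
            }
        });
        other.start();
        other.join();
        System.out.println("unlock from other thread failed: " + failed[0]);

        lock.writeLock().unlock(); // the owning thread may release it
        System.out.println("unlock from owning thread succeeded");
    }
}
```

This is why handing a BigdataSailConnection to a different thread for close() fails: the fix is to obtain and close the connection on one thread, or to marshal the close request back to the owning thread.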
From: Bryan T. <br...@sy...> - 2014-10-29 11:14:13
|
Please see below for details on the 1.3.3 release. This was also posted to the blog and will be announced on the mailing list once the blog post is pushed to the subscribers. Thanks, Bryan

---------- Forwarded message ----------
From: Bryan Thompson <tho...@us...>
Date: Tue, Oct 28, 2014 at 6:30 PM
Subject: [bigdata:news] thompsonbry created post Bigdata Release 1.3.3 (HA Load Balancer, Blueprints, RDR, new Workbench)
To: "Release 1.3.2 @news.bigdata.p.re.sf.net Bigdata Release 1.3.3 (HA Load Balancer, Blueprints, RDR, new Workbench)" <Bigdata>

*This is a critical fix release of bigdata(R). All users are encouraged to upgrade immediately.*

Bigdata is a horizontally scaled, open-source architecture for indexed data, with an emphasis on RDF, capable of loading 1B triples in under one hour on a 15-node cluster. Bigdata operates in three modes: a single-machine mode (Journal), a highly available replication cluster mode (HAJournalServer), and a horizontally sharded cluster mode (BigdataFederation). The Journal provides fast, scalable ACID indexed storage for very large data sets, up to 50 billion triples/quads. The HAJournalServer adds replication, online backup, horizontal scaling of query, and high availability. The federation provides fast, scalable, shard-wise parallel indexed storage using dynamic sharding, with shard-wise ACID updates and incremental cluster size growth. All three platforms support fully concurrent readers with snapshot isolation.

Distributed processing offers greater throughput but does not reduce query or update latency. Choose the Journal when the anticipated scale and throughput requirements permit. Choose the HAJournalServer for high availability and linear scaling in query throughput. Choose the BigdataFederation when the administrative and machine overhead associated with operating a cluster is an acceptable tradeoff for essentially unlimited data scaling and throughput.
See [1,2,8] for instructions on installing bigdata(R), [4] for the javadoc, and [3,5,6] for news, questions, and the latest developments. For more information about SYSTAP, LLC and bigdata, see [7]. Starting with the 1.0.0 release, we offer a WAR artifact [8] for easy installation of the single machine RDF database. For custom development and cluster installations we recommend checking out the code from SVN using the tag for this release. The code will build automatically under eclipse. You can also build the code using the ant script. The cluster installer requires the use of the ant script. Starting with the 1.3.0 release, we offer a tarball artifact [10] for easy installation of the HA replication cluster. You can download the WAR (standalone) or HA artifacts from: http://sourceforge.net/projects/bigdata/ You can checkout this release from: https://svn.code.sf.net/p/bigdata/code/tags/BIGDATA_RELEASE_1_3_3 Critical or otherwise of note in this minor release: - 1021 Add critical section protection to AbstractJournal.abort() and BigdataSailConnection.rollback(). - 1026 SPARQL UPDATE with runtime errors causes problems with lexicon indices. New features in 1.3.x: - Java 7 is now required. - High availability [10]. - High availability load balancer. - New RDF/SPARQL workbench. - Blueprints API. - RDF Graph Mining Service (GASService) [12]. - Reification Done Right (RDR) support [11]. - Property Path performance enhancements. - Plus numerous other bug fixes and performance enhancements. 
Feature summary:

- Highly Available Replication Clusters (HAJournalServer [10]);
- Single machine data storage to ~50B triples/quads (RWStore);
- Clustered data storage is essentially unlimited (BigdataFederation);
- Simple embedded and/or webapp deployment (NanoSparqlServer);
- Triples, quads, or triples with provenance (SIDs);
- Fast RDFS+ inference and truth maintenance;
- Fast 100% native SPARQL 1.1 evaluation;
- Integrated "analytic" query package;
- 100% Java memory manager leverages the JVM native heap (no GC);

Road map [3]:

- Column-wise indexing;
- Runtime Query Optimizer for quads;
- Performance optimization for scale-out clusters; and
- Simplified deployment, configuration, and administration for scale-out clusters.

Change log:

Note: Versions with (*) MAY require data migration. For details, see [9].

1.3.3:

- http://trac.bigdata.com/ticket/980 (Object position of query hint is not a Literal (partial resolution - see #1028 as well))
- http://trac.bigdata.com/ticket/1018 (Add the ability to track and cancel all queries issued through a BigdataSailRemoteRepositoryConnection)
- http://trac.bigdata.com/ticket/1021 (Add critical section protection to AbstractJournal.abort() and BigdataSailConnection.rollback())
- http://trac.bigdata.com/ticket/1024 (GregorianCalendar?
does weird things before 1582) - http://trac.bigdata.com/ticket/1026 (SPARQL UPDATE with runtime errors causes problems with lexicon indices) - http://trac.bigdata.com/ticket/1028 (very rare NotMaterializedException: XSDBoolean(true)) - http://trac.bigdata.com/ticket/1029 (RWStore commit state not correctly rolled back if abort fails on empty journal) - http://trac.bigdata.com/ticket/1030 (RWStorage stats cleanup) 1.3.2: - http://trac.bigdata.com/ticket/1016 (Jetty/LBS issues when deployed as WAR under tomcat) - http://trac.bigdata.com/ticket/1010 (Upgrade apache http components to 1.3.1 (security)) - http://trac.bigdata.com/ticket/1005 (Invalidate BTree objects if error occurs during eviction) - http://trac.bigdata.com/ticket/1004 (Concurrent binding problem) - http://trac.bigdata.com/ticket/1002 (Concurrency issues in JVMHashJoinUtility caused by MAX_PARALLEL query hint override) - http://trac.bigdata.com/ticket/1000 (Add configuration option to turn off bottom-up evaluation) - http://trac.bigdata.com/ticket/999 (Extend BigdataSailFactory to take arbitrary properties) - http://trac.bigdata.com/ticket/998 (SPARQL Update through BigdataGraph) - http://trac.bigdata.com/ticket/996 (Add custom prefix support for query results) - http://trac.bigdata.com/ticket/995 (Allow general purpose SPARQL queries through BigdataGraph) - http://trac.bigdata.com/ticket/992 (Deadlock between AbstractRunningQuery.cancel(), QueryLog.log(), and ArbitraryLengthPathTask) - http://trac.bigdata.com/ticket/990 (Query hints not recognized in FILTERs) - http://trac.bigdata.com/ticket/989 (Stored query service) - http://trac.bigdata.com/ticket/988 (Bad performance for FILTER EXISTS) - http://trac.bigdata.com/ticket/987 (maven build is broken) - http://trac.bigdata.com/ticket/986 (Improve locality for small allocation slots) - http://trac.bigdata.com/ticket/985 (Deadlock in BigdataTriplePatternMaterializer) - http://trac.bigdata.com/ticket/975 (HA Health Status Page) - 
http://trac.bigdata.com/ticket/974 (Name2Addr.indexNameScan(prefix) uses scan + filter) - http://trac.bigdata.com/ticket/973 (RWStore.commit() should be more defensive) - http://trac.bigdata.com/ticket/971 (Clarify HTTP Status codes for CREATE NAMESPACE operation) - http://trac.bigdata.com/ticket/968 (no link to wiki from workbench) - http://trac.bigdata.com/ticket/966 (Failed to get namespace under concurrent update) - http://trac.bigdata.com/ticket/965 (Can not run LBS mode with HA1 setup) - http://trac.bigdata.com/ticket/961 (Clone/modify namespace to create a new one) - http://trac.bigdata.com/ticket/960 (Export namespace properties in XML/Java properties text format) - http://trac.bigdata.com/ticket/938 (HA Load Balancer) - http://trac.bigdata.com/ticket/936 (Support larger metabits allocations) - http://trac.bigdata.com/ticket/932 (Bigdata/Rexster integration) - http://trac.bigdata.com/ticket/919 (Formatted Layout for Status pages) - http://trac.bigdata.com/ticket/899 (REST API Query Cancellation) - http://trac.bigdata.com/ticket/885 (Panels do not appear on startup in Firefox) - http://trac.bigdata.com/ticket/884 (Executing a new query should clear the old query results from the console) - http://trac.bigdata.com/ticket/882 (Abbreviate URIs that can be namespaced with one of the defined common namespaces) - http://trac.bigdata.com/ticket/880 (Can't explore an absolute URI with < >) - http://trac.bigdata.com/ticket/878 (Explore page looks weird when empty) - http://trac.bigdata.com/ticket/873 (Allow user to go use browser back & forward buttons to view explore history) - http://trac.bigdata.com/ticket/865 (OutOfMemoryError instead of Timeout for SPARQL Property Paths) - http://trac.bigdata.com/ticket/858 (Change explore URLs to include URI being clicked so user can see what they've clicked on before) - http://trac.bigdata.com/ticket/855 (AssertionError: Child does not have persistent identity) - http://trac.bigdata.com/ticket/850 (Search functionality in 
workbench) - http://trac.bigdata.com/ticket/847 (Query results panel should recognize well known namespaces for easier reading) - http://trac.bigdata.com/ticket/845 (Display the properties for a namespace) - http://trac.bigdata.com/ticket/843 (Create new tabs for status & performance counters, and add per namespace service/VoID description links) - http://trac.bigdata.com/ticket/837 (Configurator for new namespaces) - http://trac.bigdata.com/ticket/836 (Allow user to create namespace in the workbench) - http://trac.bigdata.com/ticket/830 (Output RDF data from queries in table format) - http://trac.bigdata.com/ticket/829 (Export query results) - http://trac.bigdata.com/ticket/828 (Save selected namespace in browser) - http://trac.bigdata.com/ticket/827 (Explore tab in workbench) - http://trac.bigdata.com/ticket/826 (Create shortcut to execute load/query) - http://trac.bigdata.com/ticket/823 (Disable textarea when a large file is selected) - http://trac.bigdata.com/ticket/820 (Allow non-file:// URLs to be loaded) - http://trac.bigdata.com/ticket/819 (Retrieve default namespace on page load) - http://trac.bigdata.com/ticket/772 (Query timeout only checked at operator start/stop) - http://trac.bigdata.com/ticket/765 (order by expr skips invalid expressions) - http://trac.bigdata.com/ticket/587 (JSP page to configure KBs) - http://trac.bigdata.com/ticket/343 (Stochastic assert in AbstractBTree#writeNodeOrLeaf() in CI) 1.3.1: - http://trac.bigdata.com/ticket/242 (Deadlines do not play well with GROUP_BY, ORDER_BY, etc.) - http://trac.bigdata.com/ticket/256 (Amortize RTO cost) - http://trac.bigdata.com/ticket/257 (Support BOP fragments in the RTO.) - http://trac.bigdata.com/ticket/258 (Integrate RTO into SAIL) - http://trac.bigdata.com/ticket/259 (Dynamically increase RTO sampling limit.) 
- http://trac.bigdata.com/ticket/526 (Reification done right) - http://trac.bigdata.com/ticket/580 (Problem with the bigdata RDF/XML parser with sids) - http://trac.bigdata.com/ticket/622 (NSS using jetty+windows can lose connections (windows only; jdk 6/7 bug)) - http://trac.bigdata.com/ticket/624 (HA Load Balancer) - http://trac.bigdata.com/ticket/629 (Graph processing API) - http://trac.bigdata.com/ticket/721 (Support HA1 configurations) - http://trac.bigdata.com/ticket/730 (Allow configuration of embedded NSS jetty server using jetty-web.xml) - http://trac.bigdata.com/ticket/759 (multiple filters interfere) - http://trac.bigdata.com/ticket/763 (Stochastic results with Analytic Query Mode) - http://trac.bigdata.com/ticket/774 (Converge on Java 7.) - http://trac.bigdata.com/ticket/779 (Resynchronization of socket level write replication protocol (HA)) - http://trac.bigdata.com/ticket/780 (Incremental or asynchronous purge of HALog files) - http://trac.bigdata.com/ticket/782 (Wrong serialization version) - http://trac.bigdata.com/ticket/784 (Describe Limit/offset don't work as expected) - http://trac.bigdata.com/ticket/787 (Update documentations and samples, they are OUTDATED) - http://trac.bigdata.com/ticket/788 (Name2Addr does not report all root causes if the commit fails.) 
- http://trac.bigdata.com/ticket/789 (ant task to build sesame fails, docs for setting up bigdata for sesame are ancient) - http://trac.bigdata.com/ticket/790 (should not be pruning any children) - http://trac.bigdata.com/ticket/791 (Clean up query hints) - http://trac.bigdata.com/ticket/793 (Explain reports incorrect value for opCount) - http://trac.bigdata.com/ticket/796 (Filter assigned to sub-query by query generator is dropped from evaluation) - http://trac.bigdata.com/ticket/797 (add sbt setup to getting started wiki) - http://trac.bigdata.com/ticket/798 (Solution order not always preserved) - http://trac.bigdata.com/ticket/799 (mis-optimation of quad pattern vs triple pattern) - http://trac.bigdata.com/ticket/802 (Optimize DatatypeFactory instantiation in DateTimeExtension) - http://trac.bigdata.com/ticket/803 (prefixMatch does not work in full text search) - http://trac.bigdata.com/ticket/804 (update bug deleting quads) - http://trac.bigdata.com/ticket/806 (Incorrect AST generated for OPTIONAL { SELECT }) - http://trac.bigdata.com/ticket/808 (Wildcard search in bigdata for type suggessions) - http://trac.bigdata.com/ticket/810 (Expose GAS API as SPARQL SERVICE) - http://trac.bigdata.com/ticket/815 (RDR query does too much work) - http://trac.bigdata.com/ticket/816 (Wildcard projection ignores variables inside a SERVICE call.) 
- http://trac.bigdata.com/ticket/817 (Unexplained increase in journal size) - http://trac.bigdata.com/ticket/821 (Reject large files, rather then storing them in a hidden variable) - http://trac.bigdata.com/ticket/831 (UNION with filter issue) - http://trac.bigdata.com/ticket/841 (Using "VALUES" in a query returns lexical error) - http://trac.bigdata.com/ticket/848 (Fix SPARQL Results JSON writer to write the RDR syntax) - http://trac.bigdata.com/ticket/849 (Create writers that support the RDR syntax) - http://trac.bigdata.com/ticket/851 (RDR GAS interface) - http://trac.bigdata.com/ticket/852 (RemoteRepository.cancel() does not consume the HTTP response entity.) - http://trac.bigdata.com/ticket/853 (Follower does not accept POST of idempotent operations (HA)) - http://trac.bigdata.com/ticket/854 (Allow override of maximum length before converting an HTTP GET to an HTTP POST) - http://trac.bigdata.com/ticket/855 (AssertionError: Child does not have persistent identity) - http://trac.bigdata.com/ticket/862 (Create parser for JSON SPARQL Results) - http://trac.bigdata.com/ticket/863 (HA1 commit failure) - http://trac.bigdata.com/ticket/866 (Batch remove API for the SAIL) - http://trac.bigdata.com/ticket/867 (NSS concurrency problem with list namespaces and create namespace) - http://trac.bigdata.com/ticket/869 (HA5 test suite) - http://trac.bigdata.com/ticket/872 (Full text index range count optimization) - http://trac.bigdata.com/ticket/874 (FILTER not applied when there is UNION in the same join group) - http://trac.bigdata.com/ticket/876 (When I upload a file I want to see the filename.) - http://trac.bigdata.com/ticket/877 (RDF Format selector is invisible) - http://trac.bigdata.com/ticket/883 (CANCEL Query fails on non-default kb namespace on HA follower.) - http://trac.bigdata.com/ticket/886 (Provide workaround for bad reverse DNS setups.) 
- http://trac.bigdata.com/ticket/887 (BIND is leaving a variable unbound) - http://trac.bigdata.com/ticket/892 (HAJournalServer does not die if zookeeper is not running) - http://trac.bigdata.com/ticket/893 (large sparql insert optimization slow?) - http://trac.bigdata.com/ticket/894 (unnecessary synchronization) - http://trac.bigdata.com/ticket/895 (stack overflow in populateStatsMap) - http://trac.bigdata.com/ticket/902 (Update Basic Bigdata Chef Cookbook) - http://trac.bigdata.com/ticket/904 (AssertionError: PropertyPathNode got to ASTJoinOrderByType.optimizeJoinGroup) - http://trac.bigdata.com/ticket/905 (unsound combo query optimization: union + filter) - http://trac.bigdata.com/ticket/906 (DC Prefix Button Appends "") - http://trac.bigdata.com/ticket/907 (Add a quick-start ant task for the BD Server "ant start") - http://trac.bigdata.com/ticket/912 (Provide a configurable IAnalyzerFactory) - http://trac.bigdata.com/ticket/913 (Blueprints API Implementation) - http://trac.bigdata.com/ticket/914 (Settable timeout on SPARQL Query (REST API)) - http://trac.bigdata.com/ticket/915 (DefaultAnalyzerFactory issues) - http://trac.bigdata.com/ticket/920 (Content negotiation orders accept header scores in reverse) - http://trac.bigdata.com/ticket/939 (NSS does not start from command line: bigdata-war/src not found.) - http://trac.bigdata.com/ticket/940 (ProxyServlet in web.xml breaks tomcat WAR (HA LBS) 1.3.0: - http://trac.bigdata.com/ticket/530 (Journal HA) - http://trac.bigdata.com/ticket/621 (Coalesce write cache records and install reads in cache) - http://trac.bigdata.com/ticket/623 (HA TXS) - http://trac.bigdata.com/ticket/639 (Remove triple-buffering in RWStore) - http://trac.bigdata.com/ticket/645 (HA backup) - http://trac.bigdata.com/ticket/646 (River not compatible with newer 1.6.0 and 1.7.0 JVMs) - http://trac.bigdata.com/ticket/648 (Add a custom function to use full text index for filtering.) 
- http://trac.bigdata.com/ticket/651 (RWS test failure) - http://trac.bigdata.com/ticket/652 (Compress write cache blocks for replication and in HALogs) - http://trac.bigdata.com/ticket/662 (Latency on followers during commit on leader) - http://trac.bigdata.com/ticket/663 (Issue with OPTIONAL blocks) - http://trac.bigdata.com/ticket/664 (RWStore needs post-commit protocol) - http://trac.bigdata.com/ticket/665 (HA3 LOAD non-responsive with node failure) - http://trac.bigdata.com/ticket/666 (Occasional CI deadlock in HALogWriter testConcurrentRWWriterReader) - http://trac.bigdata.com/ticket/670 (Accumulating HALog files cause latency for HA commit) - http://trac.bigdata.com/ticket/671 (Query on follower fails during UPDATE on leader) - http://trac.bigdata.com/ticket/673 (DGC in release time consensus protocol causes native thread leak in HAJournalServer at each commit) - http://trac.bigdata.com/ticket/674 (WCS write cache compaction causes errors in RWS postHACommit()) - http://trac.bigdata.com/ticket/676 (Bad patterns for timeout computations) - http://trac.bigdata.com/ticket/677 (HA deadlock under UPDATE + QUERY) - http://trac.bigdata.com/ticket/678 (DGC Thread and Open File Leaks: sendHALogForWriteSet()) - http://trac.bigdata.com/ticket/679 (HAJournalServer can not restart due to logically empty log file) - http://trac.bigdata.com/ticket/681 (HAJournalServer deadlock: pipelineRemove() and getLeaderId()) - http://trac.bigdata.com/ticket/684 (Optimization with skos altLabel) - http://trac.bigdata.com/ticket/686 (Consensus protocol does not detect clock skew correctly) - http://trac.bigdata.com/ticket/687 (HAJournalServer Cache not populated) - http://trac.bigdata.com/ticket/689 (Missing URL encoding in RemoteRepositoryManager) - http://trac.bigdata.com/ticket/690 (Error when using the alias "a" instead of rdf:type for a multipart insert) - http://trac.bigdata.com/ticket/691 (Failed to re-interrupt thread in HAJournalServer) - http://trac.bigdata.com/ticket/692 
(Failed to re-interrupt thread)
- http://trac.bigdata.com/ticket/693 (OneOrMorePath SPARQL property path expression ignored)
- http://trac.bigdata.com/ticket/694 (Transparently cancel update/query in RemoteRepository)
- http://trac.bigdata.com/ticket/695 (HAJournalServer reports "follower" but is in SeekConsensus and is not participating in commits.)
- http://trac.bigdata.com/ticket/701 (Problems in BackgroundTupleResult)
- http://trac.bigdata.com/ticket/702 (InvocationTargetException on /namespace call)
- http://trac.bigdata.com/ticket/704 (ask does not return json)
- http://trac.bigdata.com/ticket/705 (Race between QueryEngine.putIfAbsent() and shutdownNow())
- http://trac.bigdata.com/ticket/706 (MultiSourceSequentialCloseableIterator.nextSource() can throw NPE)
- http://trac.bigdata.com/ticket/707 (BlockingBuffer.close() does not unblock threads)
- http://trac.bigdata.com/ticket/708 (BIND heisenbug - race condition on select query with BIND)
- http://trac.bigdata.com/ticket/711 (sparql protocol: mime type application/sparql-query)
- http://trac.bigdata.com/ticket/712 (SELECT ?x { OPTIONAL { ?x eg:doesNotExist eg:doesNotExist } } incorrect)
- http://trac.bigdata.com/ticket/715 (Interrupt of thread submitting a query for evaluation does not always terminate the AbstractRunningQuery)
- http://trac.bigdata.com/ticket/716 (Verify that IRunningQuery instances (and nested queries) are correctly cancelled when interrupted)
- http://trac.bigdata.com/ticket/718 (HAJournalServer needs to handle ZK client connection loss)
- http://trac.bigdata.com/ticket/720 (HA3 simultaneous service start failure)
- http://trac.bigdata.com/ticket/723 (HA asynchronous tasks must be canceled when invariants are changed)
- http://trac.bigdata.com/ticket/725 (FILTER EXISTS in subselect)
- http://trac.bigdata.com/ticket/726 (Logically empty HALog for committed transaction)
- http://trac.bigdata.com/ticket/727 (DELETE/INSERT fails with OPTIONAL non-matching WHERE)
- http://trac.bigdata.com/ticket/728 (Refactor to create HAClient)
- http://trac.bigdata.com/ticket/729 (ant bundleJar not working)
- http://trac.bigdata.com/ticket/731 (CBD and Update leads to 500 status code)
- http://trac.bigdata.com/ticket/732 (describe statement limit does not work)
- http://trac.bigdata.com/ticket/733 (Range optimizer not optimizing Slice service)
- http://trac.bigdata.com/ticket/734 (two property paths interfere)
- http://trac.bigdata.com/ticket/736 (MIN() malfunction)
- http://trac.bigdata.com/ticket/737 (class cast exception)
- http://trac.bigdata.com/ticket/739 (Inconsistent treatment of bind and optional property path)
- http://trac.bigdata.com/ticket/741 (ctc-striterators should build as independent top-level project (Apache2))
- http://trac.bigdata.com/ticket/743 (AbstractTripleStore.destroy() does not filter for correct prefix)
- http://trac.bigdata.com/ticket/746 (Assertion error)
- http://trac.bigdata.com/ticket/747 (BOUND bug)
- http://trac.bigdata.com/ticket/748 (incorrect join with subselect renaming vars)
- http://trac.bigdata.com/ticket/754 (Failure to setup SERVICE hook and changeLog for Unisolated and Read/Write connections)
- http://trac.bigdata.com/ticket/755 (Concurrent QuorumActors can interfere leading to failure to progress)
- http://trac.bigdata.com/ticket/756 (order by and group_concat)
- http://trac.bigdata.com/ticket/760 (Code review on 2-phase commit protocol)
- http://trac.bigdata.com/ticket/764 (RESYNC failure (HA))
- http://trac.bigdata.com/ticket/770 (alpp ordering)
- http://trac.bigdata.com/ticket/772 (Query timeout only checked at operator start/stop.)
- http://trac.bigdata.com/ticket/776 (Closed as duplicate of #490)
- http://trac.bigdata.com/ticket/778 (HA Leader fail results in transient problem with allocations on other services)
- http://trac.bigdata.com/ticket/783 (Operator Alerts (HA))

1.2.4:

- http://trac.bigdata.com/ticket/777 (ConcurrentModificationException in ASTComplexOptionalOptimizer)

1.2.3:

- http://trac.bigdata.com/ticket/168 (Maven Build)
- http://trac.bigdata.com/ticket/196 (Journal leaks memory).
- http://trac.bigdata.com/ticket/235 (Occasional deadlock in CI runs in com.bigdata.io.writecache.TestAll)
- http://trac.bigdata.com/ticket/312 (CI (mock) quorums deadlock)
- http://trac.bigdata.com/ticket/405 (Optimize hash join for subgroups with no incoming bound vars.)
- http://trac.bigdata.com/ticket/412 (StaticAnalysis#getDefinitelyBound() ignores exogenous variables.)
- http://trac.bigdata.com/ticket/485 (RDFS Plus Profile)
- http://trac.bigdata.com/ticket/495 (SPARQL 1.1 Property Paths)
- http://trac.bigdata.com/ticket/519 (Negative parser tests)
- http://trac.bigdata.com/ticket/531 (SPARQL UPDATE for SOLUTION SETS)
- http://trac.bigdata.com/ticket/535 (Optimize JOIN VARS for Sub-Selects)
- http://trac.bigdata.com/ticket/555 (Support PSOutputStream/InputStream at IRawStore)
- http://trac.bigdata.com/ticket/559 (Use RDFFormat.NQUADS as the format identifier for the NQuads parser)
- http://trac.bigdata.com/ticket/570 (MemoryManager Journal does not implement all methods).
- http://trac.bigdata.com/ticket/575 (NSS Admin API)
- http://trac.bigdata.com/ticket/577 (DESCRIBE with OFFSET/LIMIT needs to use sub-select)
- http://trac.bigdata.com/ticket/578 (Concise Bounded Description (CBD))
- http://trac.bigdata.com/ticket/579 (CONSTRUCT should use distinct SPO filter)
- http://trac.bigdata.com/ticket/583 (VoID in ServiceDescription)
- http://trac.bigdata.com/ticket/586 (RWStore immedateFree() not removing Checkpoint addresses from the historical index cache.)
- http://trac.bigdata.com/ticket/590 (nxparser fails with uppercase language tag)
- http://trac.bigdata.com/ticket/592 (Optimize RWStore allocator sizes)
- http://trac.bigdata.com/ticket/593 (Ugrade to Sesame 2.6.10)
- http://trac.bigdata.com/ticket/594 (WAR was deployed using TRIPLES rather than QUADS by default)
- http://trac.bigdata.com/ticket/596 (Change web.xml parameter names to be consistent with Jini/River)
- http://trac.bigdata.com/ticket/597 (SPARQL UPDATE LISTENER)
- http://trac.bigdata.com/ticket/598 (B+Tree branching factor and HTree addressBits are confused in their NodeSerializer implementations)
- http://trac.bigdata.com/ticket/599 (BlobIV for blank node : NotMaterializedException)
- http://trac.bigdata.com/ticket/600 (BlobIV collision counter hits false limit.)
- http://trac.bigdata.com/ticket/601 (Log uncaught exceptions)
- http://trac.bigdata.com/ticket/602 (RWStore does not discard logged deletes on reset())
- http://trac.bigdata.com/ticket/607 (History service / index)
- http://trac.bigdata.com/ticket/608 (LOG BlockingBuffer not progressing at INFO or lower level)
- http://trac.bigdata.com/ticket/609 (bigdata-ganglia is required dependency for Journal)
- http://trac.bigdata.com/ticket/611 (The code that processes SPARQL Update has a typo)
- http://trac.bigdata.com/ticket/612 (Bigdata scale-up depends on zookeper)
- http://trac.bigdata.com/ticket/613 (SPARQL UPDATE response inlines large DELETE or INSERT triple graphs)
- http://trac.bigdata.com/ticket/614 (static join optimizer does not get ordering right when multiple tails share vars with ancestry)
- http://trac.bigdata.com/ticket/615 (AST2BOpUtility wraps UNION with an unnecessary hash join)
- http://trac.bigdata.com/ticket/616 (Row store read/update not isolated on Journal)
- http://trac.bigdata.com/ticket/617 (Concurrent KB create fails with "No axioms defined?")
- http://trac.bigdata.com/ticket/618 (DirectBufferPool.poolCapacity maximum of 2GB)
- http://trac.bigdata.com/ticket/619 (RemoteRepository class should use application/x-www-form-urlencoded for large POST requests)
- http://trac.bigdata.com/ticket/620 (UpdateServlet fails to parse MIMEType when doing conneg.)
- http://trac.bigdata.com/ticket/626 (Expose performance counters for read-only indices)
- http://trac.bigdata.com/ticket/627 (Environment variable override for NSS properties file)
- http://trac.bigdata.com/ticket/628 (Create a bigdata-client jar for the NSS REST API)
- http://trac.bigdata.com/ticket/631 (ClassCastException in SIDs mode query)
- http://trac.bigdata.com/ticket/632 (NotMaterializedException when a SERVICE call needs variables that are provided as query input bindings)
- http://trac.bigdata.com/ticket/633 (ClassCastException when binding non-uri values to a variable that occurs in predicate position)
- http://trac.bigdata.com/ticket/638 (Change DEFAULT_MIN_RELEASE_AGE to 1ms)
- http://trac.bigdata.com/ticket/640 (Conditionally rollback() BigdataSailConnection if dirty)
- http://trac.bigdata.com/ticket/642 (Property paths do not work inside of exists/not exists filters)
- http://trac.bigdata.com/ticket/643 (Add web.xml parameters to lock down public NSS end points)
- http://trac.bigdata.com/ticket/644 (Bigdata2Sesame2BindingSetIterator can fail to notice asynchronous close())
- http://trac.bigdata.com/ticket/650 (Can not POST RDF to a graph using REST API)
- http://trac.bigdata.com/ticket/654 (Rare AssertionError in WriteCache.clearAddrMap())
- http://trac.bigdata.com/ticket/655 (SPARQL REGEX operator does not perform case-folding correctly for Unicode data)
- http://trac.bigdata.com/ticket/656 (InFactory bug when IN args consist of a single literal)
- http://trac.bigdata.com/ticket/647 (SIDs mode creates unnecessary hash join for GRAPH group patterns)
- http://trac.bigdata.com/ticket/667 (Provide NanoSparqlServer initialization hook)
- http://trac.bigdata.com/ticket/669 (Doubly nested subqueries yield no results with LIMIT)
- http://trac.bigdata.com/ticket/675 (Flush indices in parallel during checkpoint to reduce IO latency)
- http://trac.bigdata.com/ticket/682 (AtomicRowFilter UnsupportedOperationException)

1.2.2:

- http://trac.bigdata.com/ticket/586 (RWStore immedateFree() not removing Checkpoint addresses from the historical index cache.)
- http://trac.bigdata.com/ticket/602 (RWStore does not discard logged deletes on reset())
- http://trac.bigdata.com/ticket/603 (Prepare critical maintenance release as branch of 1.2.1)

1.2.1:

- http://trac.bigdata.com/ticket/533 (Review materialization for inline IVs)
- http://trac.bigdata.com/ticket/539 (NotMaterializedException with REGEX and Vocab)
- http://trac.bigdata.com/ticket/540 (SPARQL UPDATE using NSS via index.html)
- http://trac.bigdata.com/ticket/541 (MemoryManaged backed Journal mode)
- http://trac.bigdata.com/ticket/546 (Index cache for Journal)
- http://trac.bigdata.com/ticket/549 (BTree can not be cast to Name2Addr (MemStore recycler))
- http://trac.bigdata.com/ticket/550 (NPE in Leaf.getKey() : root cause was user error)
- http://trac.bigdata.com/ticket/558 (SPARQL INSERT not working in same request after INSERT DATA)
- http://trac.bigdata.com/ticket/562 (Sub-select in INSERT cause NPE in UpdateExprBuilder)
- http://trac.bigdata.com/ticket/563 (DISTINCT ORDER BY)
- http://trac.bigdata.com/ticket/567 (Failure to set cached value on IV results in incorrect behavior for complex UPDATE operation)
- http://trac.bigdata.com/ticket/568 (DELETE WHERE fails with Java AssertionError)
- http://trac.bigdata.com/ticket/569 (LOAD-CREATE-LOAD using virgin journal fails with "Graph exists" exception)
- http://trac.bigdata.com/ticket/571 (DELETE/INSERT WHERE handling of blank nodes)
- http://trac.bigdata.com/ticket/573 (NullPointerException when attempting to INSERT DATA containing a blank node)

1.2.0: (*)

- http://trac.bigdata.com/ticket/92 (Monitoring webapp)
- http://trac.bigdata.com/ticket/267 (Support evaluation of 3rd party operators)
- http://trac.bigdata.com/ticket/337 (Compact and efficient movement of binding sets between nodes.)
- http://trac.bigdata.com/ticket/433 (Cluster leaks threads under read-only index operations: DGC thread leak)
- http://trac.bigdata.com/ticket/437 (Thread-local cache combined with unbounded thread pools causes effective memory leak: termCache memory leak & thread-local buffers)
- http://trac.bigdata.com/ticket/438 (KeyBeforePartitionException on cluster)
- http://trac.bigdata.com/ticket/439 (Class loader problem)
- http://trac.bigdata.com/ticket/441 (Ganglia integration)
- http://trac.bigdata.com/ticket/443 (Logger for RWStore transaction service and recycler)
- http://trac.bigdata.com/ticket/444 (SPARQL query can fail to notice when IRunningQuery.isDone() on cluster)
- http://trac.bigdata.com/ticket/445 (RWStore does not track tx release correctly)
- http://trac.bigdata.com/ticket/446 (HTTP Repostory broken with bigdata 1.1.0)
- http://trac.bigdata.com/ticket/448 (SPARQL 1.1 UPDATE)
- http://trac.bigdata.com/ticket/449 (SPARQL 1.1 Federation extension)
- http://trac.bigdata.com/ticket/451 (Serialization error in SIDs mode on cluster)
- http://trac.bigdata.com/ticket/454 (Global Row Store Read on Cluster uses Tx)
- http://trac.bigdata.com/ticket/456 (IExtension implementations do point lookups on lexicon)
- http://trac.bigdata.com/ticket/457 ("No such index" on cluster under concurrent query workload)
- http://trac.bigdata.com/ticket/458 (Java level deadlock in DS)
- http://trac.bigdata.com/ticket/460 (Uncaught interrupt resolving RDF terms)
- http://trac.bigdata.com/ticket/461 (KeyAfterPartitionException / KeyBeforePartitionException on cluster)
- http://trac.bigdata.com/ticket/463 (NoSuchVocabularyItem with LUBMVocabulary for DerivedNumericsExtension)
- http://trac.bigdata.com/ticket/464 (Query statistics do not update correctly on cluster)
- http://trac.bigdata.com/ticket/465 (Too many GRS reads on cluster)
- http://trac.bigdata.com/ticket/469 (Sail does not flush assertion buffers before query)
- http://trac.bigdata.com/ticket/472 (acceptTaskService pool size on cluster)
- http://trac.bigdata.com/ticket/475 (Optimize serialization for query messages on cluster)
- http://trac.bigdata.com/ticket/476 (Test suite for writeCheckpoint() and recycling for BTree/HTree)
- http://trac.bigdata.com/ticket/478 (Cluster does not map input solution(s) across shards)
- http://trac.bigdata.com/ticket/480 (Error releasing deferred frees using 1.0.6 against a 1.0.4 journal)
- http://trac.bigdata.com/ticket/481 (PhysicalAddressResolutionException against 1.0.6)
- http://trac.bigdata.com/ticket/482 (RWStore reset() should be thread-safe for concurrent readers)
- http://trac.bigdata.com/ticket/484 (Java API for NanoSparqlServer REST API)
- http://trac.bigdata.com/ticket/491 (AbstractTripleStore.destroy() does not clear the locator cache)
- http://trac.bigdata.com/ticket/492 (Empty chunk in ThickChunkMessage (cluster))
- http://trac.bigdata.com/ticket/493 (Virtual Graphs)
- http://trac.bigdata.com/ticket/496 (Sesame 2.6.3)
- http://trac.bigdata.com/ticket/497 (Implement STRBEFORE, STRAFTER, and REPLACE)
- http://trac.bigdata.com/ticket/498 (Bring bigdata RDF/XML parser up to openrdf 2.6.3.)
- http://trac.bigdata.com/ticket/500 (SPARQL 1.1 Service Description)
- http://www.openrdf.org/issues/browse/SES-884 (Aggregation with an solution set as input should produce an empty solution as output)
- http://www.openrdf.org/issues/browse/SES-862 (Incorrect error handling for SPARQL aggregation; fix in 2.6.1)
- http://www.openrdf.org/issues/browse/SES-873 (Order the same Blank Nodes together in ORDER BY)
- http://trac.bigdata.com/ticket/501 (SPARQL 1.1 BINDINGS are ignored)
- http://trac.bigdata.com/ticket/503 (Bigdata2Sesame2BindingSetIterator throws QueryEvaluationException were it should throw NoSuchElementException)
- http://trac.bigdata.com/ticket/504 (UNION with Empty Group Pattern)
- http://trac.bigdata.com/ticket/505 (Exception when using SPARQL sort & statement identifiers)
- http://trac.bigdata.com/ticket/506 (Load, closure and query performance in 1.1.x versus 1.0.x)
- http://trac.bigdata.com/ticket/508 (LIMIT causes hash join utility to log errors)
- http://trac.bigdata.com/ticket/513 (Expose the LexiconConfiguration to Function BOPs)
- http://trac.bigdata.com/ticket/515 (Query with two "FILTER NOT EXISTS" expressions returns no results)
- http://trac.bigdata.com/ticket/516 (REGEXBOp should cache the Pattern when it is a constant)
- http://trac.bigdata.com/ticket/517 (Java 7 Compiler Compatibility)
- http://trac.bigdata.com/ticket/518 (Review function bop subclass hierarchy, optimize datatype bop, etc.)
- http://trac.bigdata.com/ticket/520 (CONSTRUCT WHERE shortcut)
- http://trac.bigdata.com/ticket/521 (Incremental materialization of Tuple and Graph query results)
- http://trac.bigdata.com/ticket/525 (Modify the IChangeLog interface to support multiple agents)
- http://trac.bigdata.com/ticket/527 (Expose timestamp of LexiconRelation to function bops)
- http://trac.bigdata.com/ticket/532 (ClassCastException during hash join (can not be cast to TermId))
- http://trac.bigdata.com/ticket/533 (Review materialization for inline IVs)
- http://trac.bigdata.com/ticket/534 (BSBM BI Q5 error using MERGE JOIN)

1.1.0 (*)

- http://trac.bigdata.com/ticket/23 (Lexicon joins)
- http://trac.bigdata.com/ticket/109 (Store large literals as "blobs")
- http://trac.bigdata.com/ticket/181 (Scale-out LUBM "how to" in wiki and build.xml are out of date.)
- http://trac.bigdata.com/ticket/203 (Implement an persistence capable hash table to support analytic query)
- http://trac.bigdata.com/ticket/209 (AccessPath should visit binding sets rather than elements for high level query.)
- http://trac.bigdata.com/ticket/227 (SliceOp appears to be necessary when operator plan should suffice without)
- http://trac.bigdata.com/ticket/232 (Bottom-up evaluation semantics).
- http://trac.bigdata.com/ticket/246 (Derived xsd numeric data types must be inlined as extension types.)
- http://trac.bigdata.com/ticket/254 (Revisit pruning of intermediate variable bindings during query execution)
- http://trac.bigdata.com/ticket/261 (Lift conditions out of subqueries.)
- http://trac.bigdata.com/ticket/300 (Native ORDER BY)
- http://trac.bigdata.com/ticket/324 (Inline predeclared URIs and namespaces in 2-3 bytes)
- http://trac.bigdata.com/ticket/330 (NanoSparqlServer does not locate "html" resources when run from jar)
- http://trac.bigdata.com/ticket/334 (Support inlining of unicode data in the statement indices.)
- http://trac.bigdata.com/ticket/364 (Scalable default graph evaluation)
- http://trac.bigdata.com/ticket/368 (Prune variable bindings during query evaluation)
- http://trac.bigdata.com/ticket/370 (Direct translation of openrdf AST to bigdata AST)
- http://trac.bigdata.com/ticket/373 (Fix StrBOp and other IValueExpressions)
- http://trac.bigdata.com/ticket/377 (Optimize OPTIONALs with multiple statement patterns.)
- http://trac.bigdata.com/ticket/380 (Native SPARQL evaluation on cluster)
- http://trac.bigdata.com/ticket/387 (Cluster does not compute closure)
- http://trac.bigdata.com/ticket/395 (HTree hash join performance)
- http://trac.bigdata.com/ticket/401 (inline xsd:unsigned datatypes)
- http://trac.bigdata.com/ticket/408 (xsd:string cast fails for non-numeric data)
- http://trac.bigdata.com/ticket/421 (New query hints model.)
- http://trac.bigdata.com/ticket/431 (Use of read-only tx per query defeats cache on cluster)

1.0.3

- http://trac.bigdata.com/ticket/217 (BTreeCounters does not track bytes released)
- http://trac.bigdata.com/ticket/269 (Refactor performance counters using accessor interface)
- http://trac.bigdata.com/ticket/329 (B+Tree should delete bloom filter when it is disabled.)
- http://trac.bigdata.com/ticket/372 (RWStore does not prune the CommitRecordIndex)
- http://trac.bigdata.com/ticket/375 (Persistent memory leaks (RWStore/DISK))
- http://trac.bigdata.com/ticket/385 (FastRDFValueCoder2: ArrayIndexOutOfBoundsException)
- http://trac.bigdata.com/ticket/391 (Release age advanced on WORM mode journal)
- http://trac.bigdata.com/ticket/392 (Add a DELETE by access path method to the NanoSparqlServer)
- http://trac.bigdata.com/ticket/393 (Add "context-uri" request parameter to specify the default context for INSERT in the REST API)
- http://trac.bigdata.com/ticket/394 (log4j configuration error message in WAR deployment)
- http://trac.bigdata.com/ticket/399 (Add a fast range count method to the REST API)
- http://trac.bigdata.com/ticket/422 (Support temp triple store wrapped by a BigdataSail)
- http://trac.bigdata.com/ticket/424 (NQuads support for NanoSparqlServer)
- http://trac.bigdata.com/ticket/425 (Bug fix to DEFAULT_RDF_FORMAT for bulk data loader in scale-out)
- http://trac.bigdata.com/ticket/426 (Support either lockfile (procmail) and dotlockfile (liblockfile1) in scale-out)
- http://trac.bigdata.com/ticket/427 (BigdataSail#getReadOnlyConnection() race condition with concurrent commit)
- http://trac.bigdata.com/ticket/435 (Address is 0L)
- http://trac.bigdata.com/ticket/436 (TestMROWTransactions failure in CI)

1.0.2

- http://trac.bigdata.com/ticket/32 (Query time expansion of (foo rdf:type rdfs:Resource) drags in SPORelation for scale-out.)
- http://trac.bigdata.com/ticket/181 (Scale-out LUBM "how to" in wiki and build.xml are out of date.)
- http://trac.bigdata.com/ticket/356 (Query not terminated by error.)
- http://trac.bigdata.com/ticket/359 (NamedGraph pattern fails to bind graph variable if only one binding exists.)
- http://trac.bigdata.com/ticket/361 (IRunningQuery not closed promptly.)
- http://trac.bigdata.com/ticket/371 (DataLoader fails to load resources available from the classpath.)
- http://trac.bigdata.com/ticket/376 (Support for the streaming of bigdata IBindingSets into a sparql query.)
- http://trac.bigdata.com/ticket/378 (ClosedByInterruptException during heavy query mix.)
- http://trac.bigdata.com/ticket/379 (NotSerializableException for SPOAccessPath.)
- http://trac.bigdata.com/ticket/382 (Change dependencies to Apache River 2.2.0)

1.0.1 (*)

- http://trac.bigdata.com/ticket/107 (Unicode clean schema names in the sparse row store).
- http://trac.bigdata.com/ticket/124 (TermIdEncoder should use more bits for scale-out).
- http://trac.bigdata.com/ticket/225 (OSX requires specialized performance counter collection classes).
- http://trac.bigdata.com/ticket/348 (BigdataValueFactory.asValue() must return new instance when DummyIV is used).
- http://trac.bigdata.com/ticket/349 (TermIdEncoder limits Journal to 2B distinct RDF Values per triple/quad store instance).
- http://trac.bigdata.com/ticket/351 (SPO not Serializable exception in SIDS mode (scale-out)).
- http://trac.bigdata.com/ticket/352 (ClassCastException when querying with binding-values that are not known to the database).
- http://trac.bigdata.com/ticket/353 (UnsupportedOperatorException for some SPARQL queries).
- http://trac.bigdata.com/ticket/355 (Query failure when comparing with non materialized value).
- http://trac.bigdata.com/ticket/357 (RWStore reports "FixedAllocator returning null address, with freeBits".)
- http://trac.bigdata.com/ticket/359 (NamedGraph pattern fails to bind graph variable if only one binding exists.)
- http://trac.bigdata.com/ticket/362 (log4j - slf4j bridge.)
For more information about bigdata(R), please see the following links:

[1] http://wiki.bigdata.com/wiki/index.php/Main_Page
[2] http://wiki.bigdata.com/wiki/index.php/GettingStarted
[3] http://wiki.bigdata.com/wiki/index.php/Roadmap
[4] http://www.bigdata.com/bigdata/docs/api/
[5] http://sourceforge.net/projects/bigdata/
[6] http://www.bigdata.com/blog
[7] http://www.systap.com/bigdata.htm
[8] http://sourceforge.net/projects/bigdata/files/bigdata/
[9] http://wiki.bigdata.com/wiki/index.php/DataMigration
[10] http://wiki.bigdata.com/wiki/index.php/HAJournalServer
[11] http://www.bigdata.com/whitepapers/reifSPARQL.pdf
[12] http://wiki.bigdata.com/wiki/index.php/RDF_GAS_API

About bigdata:

Bigdata(R) is a horizontally scaled, general-purpose storage and computing fabric for ordered data (B+Trees), designed to operate on either a single server or a cluster of commodity hardware. Bigdata(R) uses dynamically partitioned key-range shards in order to remove any realistic scaling limits - in principle, bigdata(R) may be deployed on tens, hundreds, or even thousands of machines, and new capacity may be added incrementally without requiring a full reload of all data. The bigdata(R) RDF database supports RDFS and OWL Lite reasoning, high-level query (SPARQL), and datum-level provenance.

------------------------------ Sent from sourceforge.net because you indicated interest in https://sourceforge.net/p/bigdata/news/2014/10/bigdata-release-132-ha-load-balancer-blueprints-rdr-new-workbench/ To unsubscribe from further messages, please visit https://sourceforge.net/auth/subscriptions/ |
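Several of the release-note tickets above concern how SPARQL queries are POSTed to the NanoSparqlServer REST API (#484 Java API for the REST API, #619 use of application/x-www-form-urlencoded for large POST requests, #711 the application/sparql-query MIME type). As a hedged illustration of those two request encodings, here is a small Python sketch; the endpoint URL and helper function names are hypothetical, not part of bigdata's API.

```python
# Sketch: two ways a client might encode a SPARQL query POST for a
# NanoSparqlServer-style SPARQL endpoint. Illustrative only.
from urllib.parse import urlencode

ENDPOINT = "http://localhost:9999/bigdata/sparql"  # hypothetical local endpoint

def form_encoded_request(query: str) -> tuple[str, str]:
    """Encode the query as application/x-www-form-urlencoded
    (the encoding ticket #619 recommends for large POST requests).
    Returns (content_type, body)."""
    return ("application/x-www-form-urlencoded", urlencode({"query": query}))

def direct_request(query: str) -> tuple[str, str]:
    """Send the query text verbatim with the SPARQL protocol MIME type
    discussed in ticket #711. Returns (content_type, body)."""
    return ("application/sparql-query", query)

q = "SELECT * WHERE { ?s ?p ?o } LIMIT 10"
ctype, body = form_encoded_request(q)
print(ctype)                            # application/x-www-form-urlencoded
print(body.startswith("query=SELECT"))  # True
```

Either body would then be POSTed to the endpoint with the matching Content-Type header; the form-encoded variant keeps the query out of the URL, which matters once queries grow beyond typical URL length limits.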
From: Bigdata by S. L. <bi...@sy...> - 2014-10-27 15:16:42
|
Check out the latest Blog post on Bigdata by SYSTAP, LLC. View this email in your browser (http://us8.campaign-archive2.com/?u=807fe42a6d19f97994387207d&id=bcc95d0248&e=085d8ae40c) Updates from ** bigdata ------------------------------------------------------------ bigdata(R) is a scale-out storage and computing fabric supporting optional transactions, very high concurrency, and very high aggregate IO rates. In the 10/27/2014 edition: * MapGraph @ IEEE Bigdata 2014 — Seeking Beta Customers for GPU Graph Analytics ** MapGraph @ IEEE Bigdata 2014 — Seeking Beta Customers for GPU Graph Analytics (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=0fec62185e&e=085d8ae40c) ------------------------------------------------------------ By Brad Bebee on Oct 27, 2014 09:34 am SYSTAP Chief Scientist Bryan Thompson and CUDA Researcher Zhisong Fu will be presenting “Parallel Breadth First Search on GPU Clusters” in the distributed systems track of this year’s IEEE Bigdata conference (http://bigdata.us8.list-manage1.com/track/click?u=807fe42a6d19f97994387207d&id=3d81195e41&e=085d8ae40c) . This research, in conjunction with the University of Utah SCI Institute, presents results for MapGraph(tm) running on NVIDIA’s Research Cluster with 64 K40 GPUs. It achieves a traversal rate of 32,000,000,000 edges per second (32 GTEPS) on a graph with 4,300,000,000 directed edges! Our MapGraph(tm) GPU technology has now been demonstrated and validated for very high performance graph analytics. We are entering a phase in which we are actively seeking Beta customers: missions or customers with existing graph analytics that could be dramatically accelerated with our GPU graph analytics software. This could be a single GPU if the graph size is in the hundreds of millions of edges, or multiple GPUs for larger data sizes. If you would like to become a Beta customer for MapGraph, we’d love to hear from you. 
Please contact us at sa...@sy... (mailto:sa...@sy...) . “Parallel Breadth First Search on GPU Clusters”, Zhisong Fu, Harish Dasari, Martin Berzins, and Bryan Thompson IEEE Bigdata Program (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=e8f582a855&e=085d8ae40c) Read in browser » (http://bigdata.us8.list-manage1.com/track/click?u=807fe42a6d19f97994387207d&id=4226866ea8&e=085d8ae40c) ** Recent Articles: ------------------------------------------------------------ ** SYSTAP To Present MapGraph GPU Graph Acceleration at ISWC 2014 (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=608e6607e7&e=085d8ae40c) ** Bigdata Release 1.3.2 (HA Load Balancer, Blueprints, RDR, new Workbench) (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=2f335433eb&e=085d8ae40c) ** Inline URIs (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=29def059ce&e=085d8ae40c) ** MapGraph processes nearly 30 billion edges per second on GPU cluster (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=aa46c09b32&e=085d8ae40c) ** Significant increases in transaction throughput for RWStore (http://bigdata.us8.list-manage2.com/track/click?u=807fe42a6d19f97994387207d&id=cad348ce9f&e=085d8ae40c) ============================================================ Copyright © 2014 SYSTAP, LLC, All rights reserved. You are receiving this email as you've subscribed to receive information about Bigdata, a fully open-source high-performance graph database supporting the RDF data model and RDR. Bigdata operates as an embedded database or over a client/server REST API. Bigdata supports high-availability and dynamic sharding. Bigdata supports both the Blueprints and Sesame APIs. 
Our mailing address is: SYSTAP, LLC 4501 Tower Road Greensboro, NC 27410 USA ** unsubscribe from this list (http://bigdata.us8.list-manage.com/unsubscribe?u=807fe42a6d19f97994387207d&id=400be6035d&e=085d8ae40c&c=bcc95d0248) ** update subscription preferences (http://bigdata.us8.list-manage1.com/profile?u=807fe42a6d19f97994387207d&id=400be6035d&e=085d8ae40c) Email Marketing Powered by MailChimp http://www.mailchimp.com/monkey-rewards/?utm_source=freemium_newsletter&utm_medium=email&utm_campaign=monkey_rewards&aid=807fe42a6d19f97994387207d&afl=1 |
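The benchmark announced in the newsletter above is a parallel breadth-first search. For readers unfamiliar with the traversal being measured, here is a minimal single-threaded, level-synchronous BFS sketch in Python; expanding each frontier level is the step that GPU implementations such as MapGraph execute in parallel. The code is illustrative only and is not related to the MapGraph API.

```python
# Level-synchronous BFS over a directed edge list. Each round expands
# the entire current frontier before moving to the next depth, which is
# the structure that maps naturally onto data-parallel (GPU) execution.
from collections import defaultdict

def bfs_levels(edges, source):
    """Return {vertex: depth} for all vertices reachable from source."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
    depth = {source: 0}
    frontier = [source]
    level = 0
    while frontier:
        level += 1
        next_frontier = []
        for u in frontier:           # sequential here; parallel on a GPU
            for v in adj[u]:
                if v not in depth:   # first visit fixes the depth
                    depth[v] = level
                    next_frontier.append(v)
        frontier = next_frontier
    return depth

edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
print(bfs_levels(edges, 0))  # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```

The TEPS figure quoted in the newsletter (traversed edges per second) counts how many of the inner-loop edge inspections a system performs per second across all processors.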
From: Jeremy J C. <jj...@sy...> - 2014-10-24 23:34:39
|
Hi Bryan et al, concerning the underlying bug that triggered the more significant 1026, this is the best I can come up with as an error report. This is a long way short of a repro, although I can, given some hours, get a report on my system. Jeremy > On Oct 24, 2014, at 2:57 PM, TRAC Bigdata <no...@sy...> wrote: > > #1028: very rare NotMaterializedException: XSDBoolean(true) > ---------------------------+----------------------------------- > Reporter: jeremycarroll | Owner: thompsonbry > Type: defect | Status: new > Priority: major | Milestone: > Component: Query Engine | Version: BIGDATA_RELEASE_1_3_1 > Keywords: | > ---------------------------+----------------------------------- > When running a soak test with characteristics to follow, I get a very hard > to understand error, very rarely, and in conditions I have failed to > replicate other than in my own test harness (that is testing my own code, > not bigdata). > > My code interacts with bigdata only through the http interface to the NSS. > Enabling logging on bigdata makes the problem vanish, but I can reproduce > the problem with logging in my code. In particular enabling the > ASTEvalHelper log makes the problem disappear. > > Attached are 6 logs: > - a log of the NSS being basically the stdout - showing two stack traces, > one at approx 14:00:29, which is the one for which I had the other logging > enabled > - five logs from my code, one for each of five different namespaces being > used in the period 21:00:28 to 21:00:29 (note the 7hour time zone > difference) > The update that failed is in sparqlu2iDMc.log.part > > I note that the error concerns a boolean(true) but there are no such > values in any of the logs. There were some boolean(true)s being used in > earlier completed queries and updates; and I would expect some > boolean(true) values to be in the triple store. 
> > The version of bigdata I was running is 1.3.2 + five additional commits > and patches as agreed with Systap, in particular a patch fixing 1026. > > The test itself has the following characteristics. > Every forty minutes there is a new round of tests. > There are five concurrent parts to the test, each of which is identical. > Each part creates a namespaces, does some operations, maybe taking 15 > minutes over a 35 minute period, and then deletes the namespace. > The namespace names are reused, not on every round, but ... I have 15 > namespace names, and at any time 5 are in use. Each part logs all the > queries and updates it is sending in several separate log files. > > Typically each query involves resources that start with a URI > http://localsyapsehost:NNNNN/ where the number NNNNN is assigned by > jenkins differently to each of the five parts, hence it is easy to tell > the queries from each namespace apart. > > I have not seen the problem in the first round of testing (i.e. the first > 40 minutes), and I believe it requires the reuse of namespace names to be > seen; on the other hand, reusing a namespace name does not guarantee an > issue. It typically takes 3 or 4 hours to get a single fault. Staggering > the parts by one minute also seems to make the problem go away. > I have seen the error report occur in SELECT queries as well as UPDATEs > (this particular instance is an update). > The error always seem to occur in only two parts of my test suite, this is > one of them. > > I have longer logs but not complete logs of all the operations since the > beginning of the journal file. > > -- > Ticket URL: <http://trac.bigdata.com/ticket/1028> > Bigdata <trac.bigdata.com> > Bigdata Triple Store |
From: Bigdata by S. L. <bi...@sy...> - 2014-10-22 08:59:55
|
** SYSTAP and metaphacts Collaborate to Deliver Innovative Solutions for Knowledge Graphs
------------------------------------------------------------
** For immediate release 10/22/2014
------------------------------------------------------------

Trentino, Italy - October 22, 2014 - SYSTAP (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=396c7b5051&e=085d8ae40c) , LLC (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=d1c8e9b819&e=085d8ae40c) and metaphacts (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=b5c85dee03&e=085d8ae40c) announced at this year’s International Semantic Web Conference (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=f10dd89e54&e=085d8ae40c) (ISWC) the start of a new partnership to deliver innovative solutions for building and managing knowledge graphs.

Knowledge graphs are large networks of entities and their semantic relationships. They are a powerful tool that can change the way we do search, analytics, recommendations, and data integration. See the full press release here (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=e5390c1f10&e=085d8ae40c) .

** Partnership at a glance
------------------------------------------------------------

The two companies have teamed up to deliver joint solutions for big graphs in Europe and Asia. The customer offering will be based on SYSTAP technology - bigdata® and MapGraph™ - and metaphacts technology for creating knowledge graphs and rapid application development, along with consulting and customization expertise.

“metaphacts CEO Peter Haase has deep experience building knowledge graphs and knowledge graph applications. As a reseller, metaphacts will bring SYSTAP's scalable bigdata® RDF database and next-generation MapGraph™ technology into the European market. We are excited about the opportunity to work with metaphacts to deliver SYSTAP technologies as part of their knowledge graph applications”, said Bryan Thompson, CEO at SYSTAP, LLC.

“SYSTAP delivers innovative and highly scalable platforms for big graphs. We are looking forward to combining this with our expertise in knowledge graph management to create innovative and compelling solutions for our customers”, added Dr. Peter Haase, CEO at metaphacts.

** Meet at ISWC 2014
------------------------------------------------------------

SYSTAP and metaphacts are attending this year’s ISWC in Trentino, Italy and presenting their joint solutions. ISWC 2014 is the premier international forum for the Semantic Web / Linked Data community. From October 21 through 23, Riva del Garda will host over 600 researchers and entrepreneurs from all over the world to discuss the future of practical, scalable, user-friendly, and game-changing solutions. ISWC will feature talks by experts from Google, the University of Southern California, Fondazione Bruno Kessler (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=001340701b&e=085d8ae40c) , and the Open Data Institute.

To meet up with the partners at ISWC, please contact SYSTAP (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=67fc3504a0&e=085d8ae40c) or metaphacts (http://bigdata.us8.list-manage2.com/track/click?u=807fe42a6d19f97994387207d&id=c0c64fc1e9&e=085d8ae40c) directly. To learn more about SYSTAP, please visit http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=0d3021fa7c&e=085d8ae40c. For more details on metaphacts and their solutions, please visit http://bigdata.us8.list-manage1.com/track/click?u=807fe42a6d19f97994387207d&id=42ed0102e0&e=085d8ae40c (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=f0c2fc987d&e=085d8ae40c) .

============================================================
Copyright © 2014 SYSTAP, LLC. All rights reserved.

You are receiving this email because you've subscribed to receive information about Bigdata, a fully open-source, high-performance graph database supporting the RDF data model and RDR. Bigdata operates as an embedded database or over a client/server REST API, supports high availability and dynamic sharding, and supports both the Blueprints and Sesame APIs.

Our mailing address is:
SYSTAP, LLC
4501 Tower Road
Greensboro, NC 27410
USA

** unsubscribe from this list (http://bigdata.us8.list-manage1.com/unsubscribe?u=807fe42a6d19f97994387207d&id=400be6035d&e=085d8ae40c&c=417098c63e)
** update subscription preferences (http://bigdata.us8.list-manage.com/profile?u=807fe42a6d19f97994387207d&id=400be6035d&e=085d8ae40c)
|
From: Bigdata by S. L. <bi...@sy...> - 2014-10-20 15:16:43
|
** Updates from bigdata
------------------------------------------------------------

bigdata(R) is a scale-out storage and computing fabric supporting optional transactions, very high concurrency, and very high aggregate IO rates.

In the 10/20/2014 edition:
* SYSTAP To Present MapGraph GPU Graph Acceleration at ISWC 2014

** SYSTAP To Present MapGraph GPU Graph Acceleration at ISWC 2014 (http://bigdata.us8.list-manage1.com/track/click?u=807fe42a6d19f97994387207d&id=9d85346b43&e=085d8ae40c)
------------------------------------------------------------

By Brad Bebee on Oct 17, 2014 02:14 pm

Trentino, Italy: 19-23 October 2014 - SYSTAP’s Chief Scientist and Co-Founder, Bryan Thompson, is presenting SYSTAP’s MapGraph technology during the industry track (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=16a7db69eb&e=085d8ae40c) at the 13th Annual International Semantic Web Conference (ISWC 2014). SYSTAP is a Gold Sponsor for ISWC 2014 (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=8bbf44d8f6&e=085d8ae40c) , the premier international forum for the Semantic Web / Linked Data community, where scientists, industry specialists, and practitioners meet to discuss the future of practical, scalable, user-friendly, and game-changing solutions.

SYSTAP’s MapGraph is a new and disruptive technology for organizations that need to process large graphs in near-real time. It cost-effectively brings the capabilities of High Performance Computing (HPC) to your organization’s biggest and most time-critical graph challenges. MapGraph provides a familiar vertex-centric graph programming model, but its GPU acceleration is hundreds of times faster than competing CPU-only technologies and up to 100,000 times faster than graph technologies based on key-value stores such as HBase, Titan, and Accumulo. MapGraph runs on one GPU or on a cluster of GPUs. With MapGraph on 64 NVIDIA K40 GPUs, you can traverse a scale-free graph of 4.3 billion directed edges in 0.13 seconds, for a throughput of 32 billion Traversed Edges Per Second (32 GTEPS).

Learn more at http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=5b219a619d&e=085d8ae40c. Read in browser » (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=0e3b80e337&e=085d8ae40c)

** Recent Articles:
------------------------------------------------------------
** Bigdata Release 1.3.2 (HA Load Balancer, Blueprints, RDR, new Workbench) (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=014e6bf8c6&e=085d8ae40c)
** Inline URIs (http://bigdata.us8.list-manage1.com/track/click?u=807fe42a6d19f97994387207d&id=8910d9b8d4&e=085d8ae40c)
** MapGraph processes nearly 30 billion edges per second on GPU cluster (http://bigdata.us8.list-manage2.com/track/click?u=807fe42a6d19f97994387207d&id=a382f6dc50&e=085d8ae40c)
** Significant increases in transaction throughput for RWStore (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=12dd59d116&e=085d8ae40c)
** Bigdata powers Linked Open Data portal for the Bavarian State Library (German Library) (http://bigdata.us8.list-manage.com/track/click?u=807fe42a6d19f97994387207d&id=bc9a1ab8df&e=085d8ae40c)
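The "vertex-centric" programming model and the TEPS (Traversed Edges Per Second) metric mentioned above can be illustrated with a small sketch. This is a plain-Python, CPU-only illustration under stated assumptions, not MapGraph's actual CUDA API: the function name and the tiny example graph are hypothetical. It expresses a level-synchronous BFS as per-vertex scatter over a frontier, the pattern that vertex-centric frameworks accelerate. As a rough sanity check, the quoted benchmark figures are self-consistent: 4.3 billion edges / 0.13 s ≈ 33 GTEPS, in line with the ~32 GTEPS reported.

```python
# Hypothetical plain-Python sketch of the vertex-centric ("think like a
# vertex") model; NOT MapGraph's actual API.
from collections import defaultdict

def vertex_centric_bfs(edges, source):
    """Level-synchronous BFS: each frontier vertex scatters along its out-edges."""
    adj = defaultdict(list)
    for u, v in edges:                  # build directed adjacency lists
        adj[u].append(v)
    depth = {source: 0}
    frontier = [source]
    traversed = 0                       # count edge traversals for a TEPS figure
    while frontier:
        next_frontier = []
        for u in frontier:              # "scatter" phase, one vertex at a time
            for v in adj[u]:
                traversed += 1
                if v not in depth:      # first visit sets the BFS depth
                    depth[v] = depth[u] + 1
                    next_frontier.append(v)
        frontier = next_frontier        # level-synchronous barrier between rounds
    return depth, traversed

# Tiny hypothetical graph: 5 vertices, 5 directed edges.
depth, traversed = vertex_centric_bfs([(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)], 0)
print(depth)      # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
print(traversed)  # 5

# Sanity check on the quoted benchmark: GTEPS = edges traversed / seconds / 1e9.
gteps = 4.3e9 / 0.13 / 1e9
print(round(gteps, 1))  # 33.1 -- consistent with the ~32 GTEPS reported
```

In a GPU framework the per-vertex scatter loop runs in parallel across the whole frontier each round; the sequential Python loop above only shows the programming model, not the performance.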
|