From: Alice E. <ali...@ya...> - 2014-11-01 15:14:35
Ok. Thanks a lot. I'll ask my manager to contact you soon then :)

On Saturday, 1 November 2014 8:12 PM, Bryan Thompson <br...@sy...> wrote:

Try using SPARQL. Performance depends greatly on how the platform is configured. We can help you maximize performance for your application under our accelerator program. This program goes beyond our basic developer support program and is designed to give you access to the core development team to help you develop your application and get it ready for market, including our support for your internal performance tuning of your application. We do not provide these as free services. Proper benchmarking and performance tuning are complex and time-consuming activities. On the other hand, we can provide references to existing customers that have been deeply satisfied by their engagement with Systap around the bigdata platform. If you do your own performance testing, you should realize that misconfigured deployments can result in substantial bias in the test results.

Thanks,
Bryan

On Saturday, November 1, 2014, Alice Everett <ali...@ya...> wrote:

Sorry for the chain of mails. If SPARQL* does not make a difference to the performance, then it's ok -- I can work with SPARQL too, as I have shown in my example.
>
>On Saturday, 1 November 2014 5:01 PM, Alice Everett <ali...@ya...> wrote:
>
>ant start-bigdata is giving me the following output. Thank you again for the help thus far. God bless you. Some details about my environment are given below. Can you please share which environment bigdata is expected to work in -- I'll happily change my environment.
>
>Buildfile: /home/bigdataAnt/bigdata/build.xml
>
>prepare:
> [echo] version=bigdata-1.3.2-20141101
> [echo] svn.checkout=true
>
>buildinfo:
> [echo]
> [echo] package com.bigdata;
> [echo] public class BuildInfo {
> [echo] public static final String buildVersion="1.3.2";
> [echo] public static final String buildVersionOSGI="1.0";
> [echo] public static final String svnRevision="8685";
> [echo] public static final String svnURL="svn://svn.code.sf.net/p/bigdata/code/branches/BIGDATA_RELEASE_1_3_0";
> [echo] public static final String buildTimestamp="2014/11/01 16:12:54 IST";
> [echo] public static final String buildUser="";
> [echo] public static final String buildHost="${env.COMPUTERNAME}";
> [echo] public static final String osArch="amd64";
> [echo] public static final String osName="Linux";
> [echo] public static final String osVersion="3.8.0-44-generic";
> [echo] }
>
>compile:
> [echo] javac
> [echo] destdir="ant-build"
> [echo] fork="yes"
> [echo] memorymaximumsize="1g"
> [echo] debug="yes"
> [echo] debuglevel="lines,vars,source"
> [echo] verbose="off"
> [echo] encoding="Cp1252"
> [echo] source="1.7"
> [echo] target="1.7"
> [javac] Compiling 1 source file to /home/bigdataAnt/bigdata/ant-build/classes
> [javac] javac 1.7.0_65
>
>start-bigdata:
> [java]
> [java] BIGDATA(R)
> [java]
> [java] Flexible
> [java] Reliable
> [java] Affordable
> [java] Web-Scale Computing for the Enterprise
> [java]
> [java] Copyright SYSTAP, LLC 2006-2013. All rights reserved.
> [java]
> [java] -HP-ProBook-4430s
> [java] Sat Nov 01 16:12:58 IST 2014
> [java] Linux/3.8.0-44-generic amd64
> [java] Intel(R) Core(TM) i5-3210M CPU @ 2.50GHz Family 6 Model 58 Stepping 9, GenuineIntel #CPU=4
> [java] Oracle Corporation 1.7.0_65
> [java] freeMemory=113623880
> [java] buildVersion=1.3.2
> [java]
> [java] Dependency License
> [java] ICU http://source.icu-project.org/repos/icu/icu/trunk/license.html
> [java] bigdata-ganglia http://www.apache.org/licenses/LICENSE-2.0.html
> [java] blueprints-core https://github.com/tinkerpop/blueprints/blob/master/LICENSE.txt
> [java] colt http://acs.lbl.gov/software/colt/license.html
> [java] commons-codec http://www.apache.org/licenses/LICENSE-2.0.html
> [java] commons-fileupload http://www.apache.org/licenses/LICENSE-2.0.html
> [java] commons-io http://www.apache.org/licenses/LICENSE-2.0.html
> [java] commons-logging http://www.apache.org/licenses/LICENSE-2.0.html
> [java] dsiutils http://www.gnu.org/licenses/lgpl-2.1.html
> [java] fastutil http://www.apache.org/licenses/LICENSE-2.0.html
> [java] flot http://www.opensource.org/licenses/mit-license.php
> [java] high-scale-lib http://creativecommons.org/licenses/publicdomain
> [java] httpclient http://www.apache.org/licenses/LICENSE-2.0.html
> [java] httpclient-cache http://www.apache.org/licenses/LICENSE-2.0.html
> [java] httpcore http://www.apache.org/licenses/LICENSE-2.0.html
> [java] httpmime http://www.apache.org/licenses/LICENSE-2.0.html
> [java] jackson-core http://www.apache.org/licenses/LICENSE-2.0.html
> [java] jetty http://www.apache.org/licenses/LICENSE-2.0.html
> [java] jquery https://github.com/jquery/jquery/blob/master/MIT-LICENSE.txt
> [java] log4j http://www.apache.org/licenses/LICENSE-2.0.html
> [java] lucene http://www.apache.org/licenses/LICENSE-2.0.html
> [java] nanohttp http://elonen.iki.fi/code/nanohttpd/#license
> [java] nxparser http://sw.deri.org/2006/08/nxparser/license.txt
> [java] rexster-core https://github.com/tinkerpop/rexster/blob/master/LICENSE.txt
> [java] river http://www.apache.org/licenses/LICENSE-2.0.html
> [java] servlet-api http://www.apache.org/licenses/LICENSE-2.0.html
> [java] sesame http://www.openrdf.org/download.jsp
> [java] slf4j http://www.slf4j.org/license.html
> [java] zookeeper http://www.apache.org/licenses/LICENSE-2.0.html
> [java]
> [java] INFO: com.bigdata.util.config.LogUtil: Configure and watch: bigdata-war/src/WEB-INF/classes/log4j.properties
> [java] WARN : NanoSparqlServer.java:476: Starting NSS
> [java] WARN : ServiceRegistry.java:47: New service class org.openrdf.rio.ntriples.NTriplesParserFactory replaces existing service class com.bigdata.rdf.rio.ntriples.BigdataNTriplesParserFactory
> [java] WARN : ServiceRegistry.java:47: New service class org.openrdf.rio.turtle.TurtleParserFactory replaces existing service class com.bigdata.rdf.rio.turtle.BigdataTurtleParserFactory
> [java] WARN : ServiceRegistry.java:47: New service class org.openrdf.query.resultio.sparqljson.SPARQLResultsJSONWriterFactory replaces existing service class com.bigdata.rdf.rio.json.BigdataSPARQLResultsJSONWriterFactoryForSelect
> [java] WARN : ServiceRegistry.java:47: New service class org.openrdf.rio.turtle.TurtleWriterFactory replaces existing service class com.bigdata.rdf.rio.turtle.BigdataTurtleWriterFactory
> [java] serviceURL: http://192.168.145.1:9999

On Saturday, 1 November 2014 4:56 PM, Alice Everett <ali...@ya...> wrote:

Ok, I am using Ubuntu 12.04. So do you suggest I should try a Windows environment instead?
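For reference: once the "serviceURL:" line above is printed, the NanoSparqlServer is up. A quick way to confirm the endpoint is actually answering SPARQL is a trivial SELECT against it. A minimal sketch, not from the original thread, assuming the serviceURL shown above is reachable from the client:

# liveness check against the default namespace; returns one binding ?n with
# the number of statements visible at this endpoint
curl -X POST http://192.168.145.1:9999/bigdata/sparql \
  -H 'Accept: application/sparql-results+json' \
  --data-urlencode 'query=SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }'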
I am trying first to get the product working on my Lenovo T430 laptop.
>
>Actually, we are testing a number of products like Virtuoso, BigData, Jena, etc., and my manager says the company will buy whichever product performs best as open source, since our company would not like to buy support for all of the products. Therefore I am asking for a little help with the initial set-up (as otherwise we will not be able to test it at all). I'll be very grateful if you can help me with this a bit.
>
>I'll speak to my manager to see if we can have a call with you -- but my manager says I should first show some performance on the open source version, and then we'll definitely buy product support. I hope you understand my stance.
>
>On Saturday, 1 November 2014 4:47 PM, Bryan Thompson <br...@sy...> wrote:
>
>Alice,
>
>The issue may be that the openrdf parsers are not being correctly overridden by the bigdata RDR parsers in your deployment environment. If you want to attempt to diagnose this yourself, it might be a classpath ordering issue or a jar metadata ordering issue. We have noticed this in some environments, but have not yet reduced it to a root cause.
>
>Thanks,
>Bryan
>
>On Saturday, November 1, 2014, Bryan Thompson <br...@sy...> wrote:
>
>Alice,
>>
>>We do have paid developer support subscriptions for small projects at $500/month. Paid developer support allows us to prioritize your requests. We also have production support subscriptions that provide direct access to the core bigdata team for support of production deployments.
>>
>>The open source support channel is provided as a kindness to the community. It is not an appropriate forum if you have an internal project deadline. Instead, I suggest that you start a developer support subscription. If necessary, we can even do this as a PayPal transaction. This will allow us to prioritize your issues along with those of other paying customers.
>>
>>For the moment, it sounds like you have a workaround for this specific issue, since you can query the data using the reified triple patterns (sketched in plain SPARQL below).
>>
>>If you would like to move forward, I suggest that we also schedule a meeting for next week so we can understand a little more about your use case and applications, and help you understand more about the features and offerings for bigdata.
>>
>>Thanks,
>>Bryan
>>
>>On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote:
>>
>>That's ok. I need to give a presentation on Monday, so perhaps you can help on Sunday.
>>>
>>>Actually, I don't have an issue with the framework; I am just not getting how to use it to insert data in RDR mode using curl. Perhaps a little example from you would help me a great deal.
>>>
>>>On Saturday, 1 November 2014 1:35 AM, Bryan Thompson <br...@sy...> wrote:
>>>
>>>Alice,
>>>
>>>I am in meetings with a customer today. I could look at this next week.
>>>
>>>FYI, from the project forum page: if we cannot easily recreate the issue, then it will not receive any priority under open source support. It is up to you to make the issue as easy to recreate as possible. You can file a ticket and (preferably) create a unit test for the problem.
>>>
>>>You may use this forum to request help. If you have a bug or a feature request, please log an issue on the tracker [1] and include a unit test which demonstrates the bug. Please follow the instructions [2] when submitting a bug report.
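For reference, the "reified triple patterns" workaround Bryan mentions above can be written in plain SPARQL, with no SPARQL* syntax at all. A minimal sketch, not verbatim from the thread, assuming the reificationRDR namespace and endpoint used later in this thread, and data expressed with the standard rdf:subject / rdf:predicate / rdf:object vocabulary:

curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql \
  -H 'Accept: application/sparql-results+xml' \
  --data-urlencode 'query=
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
# ?c is the statement node; ?p1/?o1 pick up statement metadata (e.g. :saidBy "b")
SELECT * WHERE {
  ?c rdf:subject   ?s ;
     rdf:predicate ?p ;
     rdf:object    ?o ;
     ?p1 ?o1 .
}'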
>>>If you are interested in services for custom feature development, integration, architecture, or support, please contact the project leads directly.
>>>
>>>[1] http://trac.bigdata.com/
>>>[2] http://wiki.bigdata.com/wiki/index.php/Submitting_Bugs
>>>
>>>Thanks,
>>>Bryan
>>>
>>>----
>>>Bryan Thompson
>>>Chief Scientist & Founder
>>>SYSTAP, LLC
>>>4501 Tower Road
>>>Greensboro, NC 27410
>>>br...@sy...
>>>http://bigdata.com
>>>http://mapgraph.io
>>>
>>>CONFIDENTIALITY NOTICE: This email and its contents and attachments are for the sole use of the intended recipient(s) and are confidential or proprietary to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying of this email or its contents or attachments is prohibited. If you have received this communication in error, please notify the sender by reply email and permanently delete all copies of the email and its contents and attachments.
>>>
>>>On Fri, Oct 31, 2014 at 3:40 PM, Alice Everett <ali...@ya...> wrote:
>>>
>>>>Dear Bryan,
>>>>
>>>>I'll be thankful if you can help me with this a bit. Actually, I need to give a small presentation in my company regarding how frameworks like Bigdata can help us. It would be great if I could accompany the presentation with a small demo.
>>>>
>>>>Cheers,
>>>>Alice
>>>>
>>>>On Friday, 31 October 2014 7:55 PM, Alice Everett <ali...@ya...> wrote:
>>>>
>>>>Thanks for the reply, Rose, but I already tried that... although the loading works perfectly fine, the database does not contain any data:
>>>>
>>>>root:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml'
>>>><?xml version='1.0' encoding='UTF-8'?>
>>>><sparql xmlns='http://www.w3.org/2005/sparql-results#'>
>>>><head>
>>>><variable name='s'/>
>>>><variable name='p'/>
>>>><variable name='o'/>
>>>><variable name='-sid-1'/>
>>>><variable name='p1'/>
>>>><variable name='o1'/>
>>>></head>
>>>><results>
>>>></results>
>>>></sparql>
>>>>
>>>>I loaded the following file into the reificationRDR namespace:
>>>>
>>>>@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
>>>>@prefix dc: <http://purl.org/dc/elements/1.1/> .
>>>>@prefix : <http://example/ns#> .
>>>>
>>>>_:c rdf:subject <http://example.org/book/book11> .
>>>>_:c rdf:predicate dc:title1 .
>>>>_:c rdf:object "a" .
>>>>_:c :saidBy "b" .
>>>>
>>>>But the output does not show any result. I don't know where I am going wrong; perhaps the BigData developers can help with this. I am waiting for their response.
>>>>
>>>>On Friday, 31 October 2014 7:51 PM, Rose Beck <ros...@gm...> wrote:
>>>>
>>>>I tried without tmp.xml and the loading worked perfectly fine for me:
>>>>
>>>>curl -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' http://194.668.5.1:9999/bigdata/namespace/reificationRDR/sparql
>>>>
>>>>On Fri, Oct 31, 2014 at 6:20 PM, Alice Everett <ali...@ya...> wrote:
>>>>> Thanks Rose, but I don't think so, as it works perfectly with google.com:
>>>>>
>>>>> root:~/bigdataAnt$ curl -v google.com
>>>>> * About to connect() to google.com port 80 (#0)
>>>>> * Trying 74.125.236.68... connected
>>>>> > GET / HTTP/1.1
>>>>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
>>>>> > Host: google.com
>>>>> > Accept: */*
>>>>> >
>>>>> < HTTP/1.1 302 Found
>>>>> < Cache-Control: private
>>>>> < Content-Type: text/html; charset=UTF-8
>>>>> < Location: http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw
>>>>> < Content-Length: 261
>>>>> < Date: Fri, 31 Oct 2014 12:49:49 GMT
>>>>> < Server: GFE/2.0
>>>>> < Alternate-Protocol: 80:quic,p=0.01
>>>>> <
>>>>> <HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
>>>>> <TITLE>302 Moved</TITLE></HEAD><BODY>
>>>>> <H1>302 Moved</H1>
>>>>> The document has moved
>>>>> <A HREF="http://www.google.co.in/?gfe_rd=cr&ei=bYVTVL-gG8jM8gfBzoDgCw">here</A>.
>>>>> </BODY></HTML>
>>>>> * Connection #0 to host google.com left intact
>>>>> * Closing connection #0
>>>>>
>>>>> On Friday, 31 October 2014 6:19 PM, Rose Beck <ros...@gm...> wrote:
>>>>>
>>>>> I think it's a DNS error. Can you try running:
>>>>>
>>>>> curl -v google.com
>>>>>
>>>>> On Fri, Oct 31, 2014 at 6:02 PM, Bryan Thompson <br...@sy...> wrote:
>>>>>> If you use POST with a URL of the resource to be loaded (see the NSS wiki page), then the URL must be accessible by bigdata. If you are using the form of POST that sends the data in the http request body (which is the case here), then it only needs to be visible to the client making the request. (Both forms are sketched below.)
>>>>>>
>>>>>> Thanks,
>>>>>> Bryan
>>>>>>
>>>>>> ----
>>>>>> Bryan Thompson
>>>>>> Chief Scientist & Founder
>>>>>> SYSTAP, LLC
>>>>>> 4501 Tower Road
>>>>>> Greensboro, NC 27410
>>>>>> br...@sy...
>>>>>> http://bigdata.com
>>>>>> http://mapgraph.io
>>>>>>
>>>>>> On Fri, Oct 31, 2014 at 8:30 AM, Alice Everett <ali...@ya...> wrote:
>>>>>>>
>>>>>>> Thanks Jennifer. But even keeping tmp.xml within the bigdata folder is not helping.
>>>>>>>
>>>>>>> On Friday, 31 October 2014 5:57 PM, Jennifer <jen...@re...> wrote:
>>>>>>>
>>>>>>> I think she is unsure where tmp.xml should be kept within her bigdataAnt folder, and I think bigdata is not able to find tmp.xml.
>>>>>>>
>>>>>>> Alice, I think you should keep tmp.xml within the bigdata folder which you downloaded.
>>>>>>>
>>>>>>> From: Alice Everett <ali...@ya...>
>>>>>>> Sent: Fri, 31 Oct 2014 17:47:26
>>>>>>> To: Bryan Thompson <br...@sy...>
>>>>>>> Cc: "big...@li..." <big...@li...>
>>>>>>> Subject: Re: [Bigdata-developers] How to use RDR with Curl
>>>>>>>
>>>>>>> Ok. Thanks a ton. But I am still a little lost. I used two methods of inserting, as explained below. My namespace's name is reificationRDR. I'll be very grateful if you can help me with this a bit.
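For reference, the two POST forms Bryan distinguishes above look like this with curl. A minimal sketch, not verbatim from the thread, assuming the reificationRDR namespace and file paths used elsewhere in this thread; note that neither form involves a properties file, which is where the two methods below go wrong:

# Form 1 -- load by reference: the server dereferences the URI, so the file
# must be readable by the bigdata server process.
curl -X POST \
  --data-urlencode 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' \
  http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql

# Form 2 -- load by value: the file contents travel in the request body, so
# the file only needs to exist on the client.
curl -X POST -H 'Content-Type: text/turtle' \
  --data-binary @/home/bigdataAnt/SmallYagoFacts.ttl \
  http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql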
>>>>>>> Insert Method 1:
>>>>>>> root:~/bigdataAnt$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql
>>>>>>> Output:
>>>>>>> * getaddrinfo(3) failed for tmp.xml:80
>>>>>>> * Couldn't resolve host 'tmp.xml'
>>>>>>> * Closing connection #0
>>>>>>> curl: (6) Couldn't resolve host 'tmp.xml'
>>>>>>> * About to connect() to 192.168.145.1 port 9999 (#0)
>>>>>>> * Trying 192.168.145.1... connected
>>>>>>> > POST /bigdata/sparql HTTP/1.1
>>>>>>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
>>>>>>> > Host: 192.168.145.1:9999
>>>>>>> > Accept: */*
>>>>>>> > Content-Length: 52
>>>>>>> > Content-Type: application/x-www-form-urlencoded
>>>>>>> >
>>>>>>> * upload completely sent off: 52 out of 52 bytes
>>>>>>> < HTTP/1.1 200 OK
>>>>>>> < Content-Type: application/xml; charset=ISO-8859-1
>>>>>>> < Transfer-Encoding: chunked
>>>>>>> < Server: Jetty(9.1.4.v20140401)
>>>>>>> <
>>>>>>> * Connection #0 to host 192.168.145.1 left intact
>>>>>>> * Closing connection #0
>>>>>>>
>>>>>>> Insert Method 2:
>>>>>>> root:~/bigdataAnt/bigdata$ curl -v -X POST --data-binary 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' @/home/bigdataAnt/tmp.xml http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql
>>>>>>> Output:
>>>>>>> * getaddrinfo(3) failed for :80
>>>>>>> * Couldn't resolve host ''
>>>>>>> * Closing connection #0
>>>>>>> curl: (6) Couldn't resolve host ''
>>>>>>> * About to connect() to 192.168.145.1 port 9999 (#0)
>>>>>>> * Trying 192.168.145.1... connected
>>>>>>> > POST /bigdata/namespace/reificationRDR/sparql HTTP/1.1
>>>>>>> > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
>>>>>>> > Host: 192.168.145.1:9999
>>>>>>> > Accept: */*
>>>>>>> > Content-Length: 52
>>>>>>> > Content-Type: application/x-www-form-urlencoded
>>>>>>> >
>>>>>>> * upload completely sent off: 52 out of 52 bytes
>>>>>>> < HTTP/1.1 500 Server Error
>>>>>>> < Content-Type: text/plain
>>>>>>> < Transfer-Encoding: chunked
>>>>>>> < Server: Jetty(9.1.4.v20140401)
>>>>>>> <
>>>>>>> uri=[file:/home/bigdataAnt/SmallYagoFacts.ttl], context-uri=[]
>>>>>>> java.util.concurrent.ExecutionException: java.lang.RuntimeException: Not found: namespace=reificationRDR
>>>>>>> at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>>>>>>> at java.util.concurrent.FutureTask.get(FutureTask.java:188)
>>>>>>> at com.bigdata.rdf.sail.webapp.InsertServlet.doPostWithURIs(InsertServlet.java:401)
>>>>>>> at com.bigdata.rdf.sail.webapp.InsertServlet.doPost(InsertServlet.java:117)
>>>>>>> at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:267)
>>>>>>> at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:144)
>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
>>>>>>> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738)
>>>>>>> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551)
>>>>>>> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
>>>>>>> at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:568)
>>>>>>> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221)
>>>>>>> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111)
>>>>>>> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478)
>>>>>>> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:183)
>>>>>>> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045)
>>>>>>> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>>>>>>> at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199)
>>>>>>> at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109)
>>>>>>> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
>>>>>>> at org.eclipse.jetty.server.Server.handle(Server.java:462)
>>>>>>> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279)
>>>>>>> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232)
>>>>>>> at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534)
>>>>>>> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607)
>>>>>>> at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>> Caused by: java.lang.RuntimeException: Not found: namespace=reificationRDR
>>>>>>> at com.bigdata.rdf.task.AbstractApiTask.getUnisolatedConnection(AbstractApiTask.java:217)
>>>>>>> at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:457)
>>>>>>> at com.bigdata.rdf.sail.webapp.InsertServlet$InsertWithURLsTask.call(InsertServlet.java:414)
>>>>>>> at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293)
>>>>>>> at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:220)
>>>>>>> ... 26 more
>>>>>>> * Connection #0 to host 192.168.145.1 left intact
>>>>>>> * Closing connection #0
>>>>>>>
>>>>>>> Query:
>>>>>>> curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdf+xml'
>>>>>>>
>>>>>>> tmp.xml:
>>>>>>> <?xml version="1.0" encoding="UTF-8" standalone="no"?>
>>>>>>> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
>>>>>>> <properties>
>>>>>>> <!-- -->
>>>>>>> <!-- NEW KB NAMESPACE (required). -->
>>>>>>> <!-- -->
>>>>>>> <entry key="com.bigdata.rdf.sail.namespace">reificationRDR</entry>
>>>>>>> <!-- -->
>>>>>>> <!-- Specify any KB specific properties here to override defaults for the BigdataSail, -->
>>>>>>> <!-- AbstractTripleStore, or indices in the namespace of the new KB instance. -->
>>>>>>> <!-- -->
>>>>>>> <entry key="com.bigdata.rdf.store.AbstractTripleStore.statementIdentifiers">true</entry>
>>>>>>> </properties>
>>>>>>>
>>>>>>> On Friday, 31 October 2014 5:30 PM, Bryan Thompson <br...@sy...> wrote:
>>>>>>>
>>>>>>> What is the namespace for the RDR graph?
>>>>>>>
>>>>>>> The URL you need to be using is
>>>>>>>
>>>>>>> http://192.168.145.1:9999/bigdata/namespace/MY-GRAPH-NAMESPACE/sparql
>>>>>>>
>>>>>>> How to address a specific namespace is explicitly covered in the wiki section on the multi-tenant interface that I linked in my previous response.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Bryan
>>>>>>>
>>>>>>> On Friday, October 31, 2014, Alice Everett <ali...@ya...> wrote:
>>>>>>>
>>>>>>> Thanks a lot for the help.
>>>>>>>
>>>>>>> But I don't know where I am still going wrong:
>>>>>>> I inserted data using: curl -v -X POST --data-binary 'uri=file:///home/reifiedTriples.ttl' @tmp.xml http://192.168.145.1:9999/bigdata/sparql
>>>>>>> And then queried it using: curl -X POST http://192.168.145.1:9999/bigdata/sparql --data-urlencode @tmp.xml 'query=SELECT * { <<?s ?p ?o>> ?p ?o }' -H 'Accept:application/rdr'
>>>>>>> curl: (6) Couldn't resolve host 'query=SELECT * <<'
>>>>>>> Content-Type not recognized as RDF: application/x-www-form-urlencoded
>>>>>>>
>>>>>>> On Friday, 31 October 2014 3:55 PM, Bryan Thompson <br...@sy...> wrote:
>>>>>>>
>>>>>>> Alice,
>>>>>>>
>>>>>>> The workbench choice of the "in use" namespace is recorded in JavaScript in your browser client. That choice does not affect other workbench clients, and does not affect the behavior of the various endpoints when using command-line tools to query or update data in the database. Thus your command-line requests are being made against a namespace that is not configured for RDR support.
>>>>>>>
>>>>>>> If you want to address a non-default bigdata namespace using curl or wget, you must use the appropriate URL for that namespace. This is all described on wiki.bigdata.com on the page for the NanoSparqlServer, in the section on multi-tenancy.
>>>>>>>
>>>>>>> See http://wiki.bigdata.com/wiki/index.php/NanoSparqlServer#Multi-Tenancy_API
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Bryan
>>>>>>>
>>>>>>> On Thursday, October 30, 2014, Alice Everett <ali...@ya...> wrote:
>>>>>>>
>>>>>>> I found an awesome feature in Bigdata called RDR and I am trying to explore that too. Can you please let me know where I am going wrong while querying RDR data (http://trac.bigdata.com/ticket/815)? (My sample RDF data contains reification in its standard form: http://www.w3.org/2001/sw/DataAccess/rq23/#queryReification)
>>>>>>>
>>>>>>> Loading:
>>>>>>> curl -X POST --data-binary 'uri=file:///home/SmallFacts.ttl' http://192.168.145.1:9999/bigdata/sparql
>>>>>>> (Additionally, I changed my current namespace within the workbench opened in my browser to RDR mode.)
>>>>>>>
>>>>>>> After this I fired the following query and got the following error. (Can you please correct me as to where I am going wrong?
>>>>>>> I'll be very grateful to you for the same.)
>>>>>>>
>>>>>>> @HP-ProBook-4430s:~/bigdataAnt$ curl -X POST http://192.168.145.1:9999/bigdata/sparql --header "X-BIGDATA-MAX-QUERY-MILLIS" --data-urlencode 'query=SELECT * {<<?s ?p ?o>> ?p1 ?o1 }' -H 'Accept:application/rdr'
>>>>>>>
>>>>>>> SELECT * {<<?s ?p ?o>> ?p1 ?o1 }
>>>>>>> java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>>>>>>> at java.util.concurrent.FutureTask.get(FutureTask.java:188)
>>>>>>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1277)
>>>>>>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:503)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>> Caused by: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:188)
>>>>>>> at org.openrdf.query.impl.TupleQueryResultImpl.hasNext(TupleQueryResultImpl.java:90)
>>>>>>> at org.openrdf.query.QueryResultUtil.report(QueryResultUtil.java:52)
>>>>>>> at org.openrdf.repository.sail.SailTupleQuery.evaluate(SailTupleQuery.java:63)
>>>>>>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$TupleQueryTask.doQuery(BigdataRDFContext.java:1386)
>>>>>>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1221)
>>>>>>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask$SparqlRestApiTask.call(BigdataRDFContext.java:1171)
>>>>>>> at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:67)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:293)
>>>>>>> ... 6 more
>>>>>>> Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1523)
>>>>>>> at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator._hasNext(BlockingBuffer.java:1710)
>>>>>>> at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.hasNext(BlockingBuffer.java:1563)
>>>>>>> at com.bigdata.striterator.AbstractChunkedResolverator._hasNext(AbstractChunkedResolverator.java:365)
>>>>>>> at com.bigdata.striterator.AbstractChunkedResolverator.hasNext(AbstractChunkedResolverator.java:341)
>>>>>>> at com.bigdata.rdf.sail.Bigdata2Sesame2BindingSetIterator.hasNext(Bigdata2Sesame2BindingSetIterator.java:134)
>>>>>>> ... 15 more
>>>>>>> Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>>>>>>> at java.util.concurrent.FutureTask.get(FutureTask.java:188)
>>>>>>> at com.bigdata.relation.accesspath.BlockingBuffer$BlockingIterator.checkFuture(BlockingBuffer.java:1454)
>>>>>>> ... 20 more
>>>>>>> Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:59)
>>>>>>> at com.bigdata.rdf.sail.RunningQueryCloseableIterator.close(RunningQueryCloseableIterator.java:73)
>>>>>>> at com.bigdata.rdf.sail.RunningQueryCloseableIterator.hasNext(RunningQueryCloseableIterator.java:82)
>>>>>>> at com.bigdata.striterator.ChunkedWrappedIterator.hasNext(ChunkedWrappedIterator.java:197)
>>>>>>> at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:222)
>>>>>>> at com.bigdata.striterator.AbstractChunkedResolverator$ChunkConsumerTask.call(AbstractChunkedResolverator.java:197)
>>>>>>> ... 4 more
>>>>>>> Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at com.bigdata.util.concurrent.Haltable.get(Haltable.java:273)
>>>>>>> at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:1476)
>>>>>>> at com.bigdata.bop.engine.AbstractRunningQuery.get(AbstractRunningQuery.java:103)
>>>>>>> at com.bigdata.rdf.sail.RunningQueryCloseableIterator.checkFuture(RunningQueryCloseableIterator.java:46)
>>>>>>> ... 9 more
>>>>>>> Caused by: java.lang.Exception: task=ChunkTask{query=eeb24f0d-29b7-49d1-bddf-14869c463e76,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1335)
>>>>>>> at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTaskWrapper.run(ChunkedRunningQuery.java:894)
>>>>>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63)
>>>>>>> at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkFutureTask.run(ChunkedRunningQuery.java:789)
>>>>>>> ... 3 more
>>>>>>> Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>>>>>>> at java.util.concurrent.FutureTask.get(FutureTask.java:188)
>>>>>>> at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1315)
>>>>>>> ... 8 more
>>>>>>> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:643)
>>>>>>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:343)
>>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>>>>> at com.bigdata.concurrent.FutureTaskMon.run(FutureTaskMon.java:63)
>>>>>>> at com.bigdata.bop.engine.ChunkedRunningQuery$ChunkTask.call(ChunkedRunningQuery.java:1314)
>>>>>>> ... 8 more
>>>>>>> Caused by: java.lang.RuntimeException: java.lang.ArrayIndexOutOfBoundsException: 0
>>>>>>> at com.bigdata.bop.join.PipelineJoin$JoinTask$BindingSetConsumerTask.call(PipelineJoin.java:988)
>>>>>>> at com.bigdata.bop.join.PipelineJoin$JoinTask.consumeSource(PipelineJoin.java:700)
>>>>>>> at com.bigdata.bop.join.PipelineJoin$JoinTask.call(PipelineJoin.java:584)
>>>>
>>>> --
>>>> ----
>>>> Bryan Thompson
>>>> Chief Scientist & Founder
>>>> SYSTAP, LLC
>>>> 4501 Tower Road
>>>> Greensboro, NC 27410
>>>> br...@sy...
>>>> http://bigdata.com
>>>> http://mapgraph.io
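To tie the thread together: the failures above come down to (a) the tmp.xml properties file being passed to curl as if it were part of a load request (curl treats the stray @tmp.xml argument as a second URL, hence "Couldn't resolve host 'tmp.xml'"), and (b) requests going to the default /bigdata/sparql endpoint, or to a reificationRDR namespace that was never created (hence "Not found: namespace=reificationRDR"). An end-to-end sketch of the intended sequence, not verbatim from the thread, under the same assumptions as the earlier sketches (host, file paths, and the tmp.xml shown above; the namespace-creation call follows the Multi-Tenancy API described on the NanoSparqlServer wiki page linked above):

# 1. Create the RDR-enabled namespace by POSTing tmp.xml to the multi-tenancy
#    endpoint. This is a separate request; tmp.xml plays no role in loading.
curl -X POST -H 'Content-Type: application/xml' \
  --data-binary @/home/bigdataAnt/tmp.xml \
  http://192.168.145.1:9999/bigdata/namespace

# 2. Load the data into that namespace (either load form sketched earlier
#    works; load-by-reference is shown here, so the file must be readable
#    by the server process).
curl -X POST \
  --data-urlencode 'uri=file:///home/bigdataAnt/SmallYagoFacts.ttl' \
  http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql

# 3. Query the same namespace with the RDR (SPARQL*) syntax.
curl -X POST http://192.168.145.1:9999/bigdata/namespace/reificationRDR/sparql \
  --data-urlencode 'query=SELECT * { <<?s ?p ?o>> ?p1 ?o1 }' \
  -H 'Accept:application/rdr'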