From: Bryan T. <br...@sy...> - 2015-04-30 16:44:55
Here is the relevant documentation from the code. To turn this off, include the following in your property file:

com.bigdata.rdf.store.AbstractTripleStore.inlineDateTimes=false

/**
 * Set up database to inline date/times directly into the statement
 * indices rather than using the lexicon to map them to term identifiers
 * and back (default {@value #DEFAULT_INLINE_DATE_TIMES}). Date times
 * will be converted to UTC, then stored as milliseconds since the
 * epoch. Thus if you inline date/times you will lose the canonical
 * representation of the date/time. This has two consequences: (1) you
 * will not be able to recover the original time zone of the date/time;
 * and (2) greater than millisecond precision will be lost.
 *
 * @see #INLINE_DATE_TIMES_TIMEZONE
 */
String INLINE_DATE_TIMES = AbstractTripleStore.class.getName()
        + ".inlineDateTimes";

String DEFAULT_INLINE_DATE_TIMES = "true";

----
Bryan Thompson
Chief Scientist & Founder
SYSTAP, LLC
4501 Tower Road
Greensboro, NC 27410
br...@sy...
http://blazegraph.com
http://blog.bigdata.com
http://mapgraph.io

Blazegraph™ is our ultra high-performance graph database that supports both
RDF/SPARQL and Tinkerpop/Blueprints APIs. MapGraph™ is our disruptive new
technology to use GPUs to accelerate data-parallel graph analytics.

CONFIDENTIALITY NOTICE: This email and its contents and attachments are for
the sole use of the intended recipient(s) and are confidential or proprietary
to SYSTAP. Any unauthorized review, use, disclosure, dissemination or copying
of this email or its contents or attachments is prohibited. If you have
received this communication in error, please notify the sender by reply email
and permanently delete all copies of the email and its contents and
attachments.
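A minimal sketch of what such a property file could look like. The inlineDateTimes key is taken verbatim from the javadoc above; the journal-file entry and the overall layout are assumptions modelled on the stock RWStore.properties under WEB-INF that the quoted reply below refers to:

  # Journal file location (illustrative value only).
  com.bigdata.journal.AbstractJournal.file=bigdata.jnl

  # Keep xsd:dateTime values in the lexicon instead of inlining them into the
  # statement indices as UTC milliseconds since the epoch (default is "true").
  com.bigdata.rdf.store.AbstractTripleStore.inlineDateTimes=false

As the quoted reply below points out, this needs to be in effect when the namespace is created, either via the properties used to create the namespace or via the RWStore file in WEB-INF.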
On Thu, Apr 30, 2015 at 12:34 PM, Bryan Thompson <br...@sy...> wrote:
> I am not at a computer now.. see AbstractTripleStore INLINE_DATE_TIMES. You
> need to turn off that property when creating the namespace or in the RWStore
> file in WEB-INF.
>
> Thanks,
> Bryan
>
> On Apr 30, 2015 7:30 AM, "James HK" <jam...@gm...> wrote:
>>
>> Hi,
>>
>> When trying to run our unit test suite [0] against a preliminary/local
>> Blazegraph 1.5.1 instance, the following error appeared during the test
>> run (using a vanilla Blazegraph with the standard kb namespace).
>>
>> Our test suite (and the test mentioned) is run/tested against Virtuoso
>> 6.1/Fuseki 1.1.1/Sesame 2.7.14 [0] on Travis-CI, therefore it is
>> unlikely to be an issue on our side.
>>
>> 1)
>> SMW\Tests\Integration\MediaWiki\Import\TimeDataTypeTest::testImportOfDifferentDateWithAssortmentOfOutputConversion
>> SMW\SPARQLStore\Exception\BadHttpDatabaseResponseException: A SPARQL
>> query error has occurred
>> Query:
>> PREFIX wiki: <http://example.org/id/>
>> PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
>> PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
>> PREFIX owl: <http://www.w3.org/2002/07/owl#>
>> PREFIX swivt: <http://semantic-mediawiki.org/swivt/1.0#>
>> PREFIX property: <http://example.org/id/Property-3A>
>> PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
>> DELETE { wiki:TimeDataTypeRegressionTest ?p ?o } WHERE {
>> wiki:TimeDataTypeRegressionTest ?p ?o }
>> Error: Query refused
>> Endpoint: http://192.168.1.104:9999/bigdata/namespace/kb/sparql
>> HTTP response code: 500
>>
>> This translates into an error on the Blazegraph side (output from
>> http://localhost:9999/bigdata/#update):
>>
>> ERROR: SPARQL-UPDATE: updateStr=PREFIX wiki:
>> PREFIX rdf:
>> PREFIX rdfs:
>> PREFIX owl:
>> PREFIX swivt:
>> PREFIX property:
>> PREFIX xsd:
>> DELETE { wiki:TimeDataTypeRegressionTest ?p ?o } WHERE {
>> wiki:TimeDataTypeRegressionTest ?p ?o }
>> java.util.concurrent.ExecutionException:
>> java.util.concurrent.ExecutionException:
>> org.openrdf.query.UpdateExecutionException:
>> java.lang.IllegalStateException: Already assigned:
>> old=LiteralExtensionIV [delegate=XSDLong(6017484188943806464),
>> datatype=Vocab(-42)], new=LiteralExtensionIV
>> [delegate=XSDLong(204552172800000), datatype=Vocab(-42)], this:
>> "8452"^^
>> at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>> at java.util.concurrent.FutureTask.get(FutureTask.java:188)
>> at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:261)
>> at com.bigdata.rdf.sail.webapp.QueryServlet.doSparqlUpdate(QueryServlet.java:359)
>> at com.bigdata.rdf.sail.webapp.QueryServlet.doPost(QueryServlet.java:165)
>> at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:237)
>> at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:137)
>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
>> at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:769)
>> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
>> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
>> at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
>> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
>> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1125)
>> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
>> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
>> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1059)
>> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>> at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
>> at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
>> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
>> at org.eclipse.jetty.server.Server.handle(Server.java:497)
>> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
>> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:248)
>> at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
>> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:610)
>> at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:539)
>> at java.lang.Thread.run(Thread.java:744)
>> Caused by: java.util.concurrent.ExecutionException:
>> org.openrdf.query.UpdateExecutionException:
>> java.lang.IllegalStateException: Already assigned:
>> old=LiteralExtensionIV [delegate=XSDLong(6017484188943806464),
>> datatype=Vocab(-42)], new=LiteralExtensionIV
>> [delegate=XSDLong(204552172800000), datatype=Vocab(-42)], this:
>> "8452"^^
>> at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>> at java.util.concurrent.FutureTask.get(FutureTask.java:188)
>> at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlUpdateTask.call(QueryServlet.java:460)
>> at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlUpdateTask.call(QueryServlet.java:371)
>> at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at com.bigdata.rdf.task.AbstractApiTask.submitApiTask(AbstractApiTask.java:365)
>> at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:258)
>> ... 26 more
>> Caused by: org.openrdf.query.UpdateExecutionException:
>> java.lang.IllegalStateException: Already assigned:
>> old=LiteralExtensionIV [delegate=XSDLong(6017484188943806464),
>> datatype=Vocab(-42)], new=LiteralExtensionIV
>> [delegate=XSDLong(204552172800000), datatype=Vocab(-42)], this:
>> "8452"^^
>> at com.bigdata.rdf.sparql.ast.eval.ASTEvalHelper.executeUpdate(ASTEvalHelper.java:1303)
>> at com.bigdata.rdf.sail.BigdataSailUpdate.execute2(BigdataSailUpdate.java:152)
>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$UpdateTask.doQuery(BigdataRDFContext.java:1683)
>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.innerCall(BigdataRDFContext.java:1310)
>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:1275)
>> at com.bigdata.rdf.sail.webapp.BigdataRDFContext$AbstractQueryTask.call(BigdataRDFContext.java:517)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> ... 1 more
>> Caused by: java.lang.IllegalStateException: Already assigned:
>> old=LiteralExtensionIV [delegate=XSDLong(6017484188943806464),
>> datatype=Vocab(-42)], new=LiteralExtensionIV
>> [delegate=XSDLong(204552172800000), datatype=Vocab(-42)], this:
>> "8452"^^
>> at com.bigdata.rdf.model.BigdataValueImpl.setIV(BigdataValueImpl.java:139)
>> at com.bigdata.rdf.internal.LexiconConfiguration.createInlineIV(LexiconConfiguration.java:430)
>> at com.bigdata.rdf.lexicon.LexiconRelation.getInlineIV(LexiconRelation.java:3150)
>> at com.bigdata.rdf.lexicon.LexiconRelation.addTerms(LexiconRelation.java:1719)
>> at com.bigdata.rdf.store.AbstractTripleStore.getAccessPath(AbstractTripleStore.java:2928)
>> at com.bigdata.rdf.store.AbstractTripleStore.getAccessPath(AbstractTripleStore.java:2874)
>> at com.bigdata.rdf.sail.BigdataSail$BigdataSailConnection.removeStatements(BigdataSail.java:2962)
>> at com.bigdata.rdf.sail.BigdataSail$BigdataSailConnection.removeStatements(BigdataSail.java:2865)
>> at com.bigdata.rdf.sparql.ast.eval.AST2BOpUpdate.addOrRemoveStatement(AST2BOpUpdate.java:2054)
>> at com.bigdata.rdf.sparql.ast.eval.AST2BOpUpdate.convertDeleteInsert(AST2BOpUpdate.java:989)
>> at com.bigdata.rdf.sparql.ast.eval.AST2BOpUpdate.convertUpdateSwitch(AST2BOpUpdate.java:417)
>> at com.bigdata.rdf.sparql.ast.eval.AST2BOpUpdate.convertUpdate(AST2BOpUpdate.java:279)
>> at com.bigdata.rdf.sparql.ast.eval.ASTEvalHelper.executeUpdate(ASTEvalHelper.java:1295)
>> ... 9 more
>>
>> [0] https://travis-ci.org/SemanticMediaWiki/SemanticMediaWiki
>>
>> Cheers
>>
>> _______________________________________________
>> Bigdata-developers mailing list
>> Big...@li...
>> https://lists.sourceforge.net/lists/listinfo/bigdata-developers
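To make the javadoc quoted at the top of the thread concrete, here is a small stand-alone sketch of what "converted to UTC, then stored as milliseconds since the epoch" implies for a value that carries a zone offset and sub-millisecond digits. It uses plain JDK java.time classes, not Blazegraph's actual code path, and the sample value and class name are made up for illustration:

  import java.time.Instant;
  import java.time.OffsetDateTime;
  import java.time.ZoneOffset;

  public class InlineDateTimeLossDemo {
      public static void main(String[] args) {
          // A date/time with an explicit zone offset and microsecond digits
          // (made-up sample value).
          OffsetDateTime original =
                  OffsetDateTime.parse("2015-04-30T12:34:56.789123+02:00");

          // What inlining keeps, per the javadoc: UTC milliseconds since the epoch.
          long epochMillis = original.toInstant().toEpochMilli();

          // What can be reconstructed from that: a UTC instant with millisecond
          // precision.
          OffsetDateTime roundTripped =
                  Instant.ofEpochMilli(epochMillis).atOffset(ZoneOffset.UTC);

          System.out.println(original);      // 2015-04-30T12:34:56.789123+02:00
          System.out.println(epochMillis);   // 1430390096789
          System.out.println(roundTripped);  // 2015-04-30T10:34:56.789Z
      }
  }

The round-tripped value comes back as UTC with millisecond precision, which corresponds to the two losses the javadoc lists: the original time zone and anything finer than a millisecond.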