From: Bryan T. <br...@sy...> - 2014-10-16 10:25:22
There is a distinction between the workbench (JavaScript in the browser) and the database process (Java running inside a servlet container in this case). Under anomalous conditions the workbench might not correctly track what is happening on the database side. I suggest that you check the database log output and see what messages were generated during that time. I suspect that you might have something like a "GC overhead limit exceeded", which is a type of out-of-memory error in Java thrown when too much of the total time is being spent in garbage collection (it shows up in the log as "java.lang.OutOfMemoryError: GC overhead limit exceeded", and the usual remedy is a larger heap via the JVM's -Xmx option). Or perhaps some other root cause abnormally terminated the update request in a manner that the workbench was unable to identify.

If the update failed, then the database will not contain any triples. If you are trying to load a very large dataset, it may make sense to upload the data in a series of smaller chunks.

There is a "monitor" option that will show you the status of update requests as they are being processed. When loading large files it echoes back on the HTTP connection a summary of the number of statements loaded over time during the load. This will give you much better feedback. But I think that you have an error condition on the server that has halted the load.

Thanks,
Bryan
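P.S. If you do split the file, one way to run the chunks is a single SPARQL UPDATE built from a sequence of LOAD operations, pasted into the workbench UPDATE tab. This is only a sketch: the chunk paths below are made up, it assumes you have already split the dump with an external tool, and it assumes the server process can dereference file: URIs on its own filesystem.

    LOAD <file:///data/yago2s/chunk-001.nt> ;
    LOAD <file:///data/yago2s/chunk-002.nt> ;
    LOAD <file:///data/yago2s/chunk-003.nt>

Issuing each LOAD as its own update request instead keeps the unit of work small, so a failure near the end only costs you one chunk rather than the whole load.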
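Once a load has finished, a quick sanity check that the triples actually made it in (rather than the empty result you saw) is a count query, e.g.:

    SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }

If that returns 0 after the update request has completed, the load itself failed and the log should say why.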
On Wednesday, October 15, 2014, Maria Jackson <mar...@gm...> wrote:

> Dear All,
>
> I am trying to load the 18.5 GB yago2s dataset
> (http://www.mpi-inf.mpg.de/departments/databases-and-information-systems/research/yago-naga/yago/downloads/)
> in Bigdata. I downloaded Bigdata from http://www.bigdata.com/download and I
> am using the Bigdata workbench via http://localhost:9999.
>
> I am loading yago2s into Bigdata's default namespace "kb", using UPDATE and
> specifying the file path there. While Bigdata is loading yago I notice that
> it consumes a significant amount of CPU and RAM for 4-5 hours, but after
> that it stops using RAM. My dilemma is that the Bigdata workbench keeps
> showing "Running update.." even though Bigdata does not consume any RAM or
> CPU for the next 48 hours or so (in fact it keeps showing "Running
> update.." until I kill the process). Can you please suggest where I am
> going wrong? After killing the process Bigdata is not able to retrieve any
> tuples (it shows 0 results even for the query select ?a ?b ?c where { ?a ?b ?c }).
>
> I am using Bigdata on a server with 16 cores and 64 GB RAM.
>
> Any help in this regard will be deeply appreciated.
>
> Cheers,
> Maria

--
----
Bryan Thompson
Chief Scientist & Founder
SYSTAP, LLC
4501 Tower Road
Greensboro, NC 27410
br...@sy...
http://bigdata.com
http://mapgraph.io