From: Rob V. <rv...@do...> - 2012-04-11 06:13:00
Hi Sandhya,

I am slightly unclear about what you are asking here. Do you have a static RDF dump that you wish to expose as a SPARQL endpoint, or do you have a SQL database that you want to dynamically map to RDF and query with SPARQL?

The former can be done, but it may not be very performant if your dataset is large (a million triples plus). For anything more than a few hundred thousand triples I'd strongly suggest putting the RDF into a native triple store anyway and using its SPARQL endpoint. You can then build apps that use dotNetRDF to talk to that store (a short sketch follows at the end of this message).

If it is the latter, then dotNetRDF does not currently have the ability to do this unless your SQL store was created using our Data.Sql library, in which case the data is simply whatever RDF you put in there, stored in our own well-defined schema for encoding RDF. There is the capability to plug arbitrary backends into the SPARQL engine, but this is fairly complex and I'm not sure it necessarily fits your use case, as from your email I'm unclear on exactly what you are trying to achieve. Some more details would be helpful.

Regards,

Rob

On 4/10/12 3:28 AM, sandhya wrote:
> Hi Rob,
>
> We are in the process of creating our own SPARQL endpoint where users can query. Towards this, we have converted the data to RDF format in the SQL Server store. rdf:type has not been specified. We are not clear how the data schema can be defined in dotNetRDF. Please guide us in this direction.
>
> Regards,
> Sandhya
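As a minimal sketch of the first approach (put your RDF into a native triple store, then talk to its SPARQL endpoint from a dotNetRDF application), something along these lines should work. The endpoint URI and the query below are placeholders for illustration, not values from your setup:

    using System;
    using VDS.RDF.Query;

    class EndpointExample
    {
        static void Main()
        {
            // Placeholder URI - point this at whatever SPARQL endpoint
            // your triple store actually exposes (Fuseki, Sesame, etc.)
            SparqlRemoteEndpoint endpoint =
                new SparqlRemoteEndpoint(new Uri("http://localhost:3030/dataset/sparql"));

            // Issue a simple SELECT query and print each result binding
            SparqlResultSet results =
                endpoint.QueryWithResultSet("SELECT * WHERE { ?s ?p ?o } LIMIT 10");

            foreach (SparqlResult result in results)
            {
                Console.WriteLine(result.ToString());
            }
        }
    }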