From: Marco B. <bra...@eb...> - 2006-03-08 17:01:51
|
Hi all, I've just discovered the Semantic MediaWiki project and it looks very interesting (cool and fun! ;-) ). I'd like to use it in a project of my own, but I would probably need to write some changes/extensions. Is there any documentation about it? Even something minimal, such as a guideline on where to start looking at the code, would be helpful.

First things I have in mind:

- Support for reading RDF from external sources (via URI or the like). SPARQL queries like "find the statements having this page as subject" would also be interesting.
- Support for triple stores / RDF frameworks like Jena.
- Support for adapting rich ontologies that could be derived (or similarly adapted) as extensions of the current (frankly) basic specification of Categories/Typed Links/Attributes. This would probably require additions to the markup syntax, and hence a browsing interface for selecting a category or a link type would be useful too.

Thanks in advance for the help.

--
===============================================================================
Marco Brandizi <bra...@eb...> |
From: Markus <ma...@ai...> - 2006-03-09 09:34:43
|
On Wednesday 08 March 2006 17:58, Marco Brandizi wrote:
> Hi all,
>
> I've just discovered the Semantic MediaWiki project and it looks very
> interesting (cool and fun! ;-) ).

Sure! -- that's why we do it ;-)

> I'd like to use it in a project of my own, but I would probably need to
> write some changes/extensions. Is there any documentation about it?
> Even something minimal, such as a guideline on where to start looking
> at the code, would be helpful.

There is no such documentation yet, but I am happy to help out.

> First things I have in mind:
>
> - Support for reading RDF from external sources (via URI or the like).

What does it mean to "read RDF" with the wiki? Do you want to somehow "import" data? Denny and I are currently about to implement an extension that imports external OWL/RDF specifications and that allows the wiki to produce data for these ontologies (i.e. using the URIs of the imported ontology instead of the wiki URIs). Is this similar to what you had in mind?

> SPARQL queries like "find the statements having this page as subject"
> would also be interesting.

This can already be done with Special:SearchTriple. Just enter the subject page and leave the other fields empty. This does of course not use SPARQL.

SPARQL support is currently being implemented as an external service. This does not mean that SPARQL might not run on the same server -- we just architecturally separate the triplestore+SPARQL from the core wiki.

The reason is that we want the ability to run various SPARQL engines in parallel, in order to find out whether one or more of them shall eventually be integrated into MediaWiki. There is no best triplestore or SPARQL engine at the moment, so this will create some direct competition. Also, the Wikimedia Foundation does not run Java, for lack of free implementations. So we offer the possibility of still applying Java triplestores to Wikipedia data by running them on external servers.

Anyway, a public demo of a first working SPARQL endpoint will be enabled soon.

> - Support for triple stores / RDF frameworks like Jena.

This relates to the previous answer. We view SPARQL support and triplestore support as mostly synonymous (we do not plan to implement our own SPARQL engine). We are also about to provide the necessary infrastructure for external triplestores to stay in sync with the RDF that is represented by the wiki.

> - Support for adapting rich ontologies that could be derived (or
> similarly adapted) as extensions of the current (frankly) basic
> specification of Categories/Typed Links/Attributes.

We will soon support in-wiki markup for role hierarchies, transitive and symmetric roles, and maybe for basic domain and range restrictions, but there is no intention so far to provide more complex DL-style statements (i.e. no disjunctions, no arbitrary existential or universal quantifiers, no negations).

It is still an open problem whether one can edit a complex ontology in a distributed, collaborative sense (a related problem: how does one edit an ontology in a modular fashion?). The problem is that rich languages such as OWL are inherently non-local: you make a statement in one article, and, combined with a couple of thousand other articles, it makes the whole ontology logically inconsistent. But you cannot easily restrict your attention to a reasonably small part of the ontology. So you would always have to keep the whole ontological content of your wiki in mind, and thus the distributed environment might be problematic for ontology engineering.

Denny and I are currently working on a partial solution for this, which allows you to work against a fixed background ontology that cannot be changed in the wiki.

> This would probably require additions to the markup syntax,

Yes. But one should first decide which language one actually wants to support in the wiki.

> and hence a browsing interface for selecting a category or a link type
> would be useful too.

Yes, this would be nice. We have some ideas, but nothing concrete yet.

OK, now let's sum up which features are currently being implemented by us, where we could use some help, and what nobody is actually working on:

== Stuff that we currently work on ==

* External triplestore and SPARQL service: will be done by us; expected by the end of March.
* A smart synchronization API for keeping external triplestores in sync with little network traffic: currently there is only the RDF export (which still needs completion), but for small wikis no further optimization is needed. In the long run, we have some plans on how to make the synchronization more efficient.
* Importing parts of external ontologies, creating new articles for their vocabulary, and exporting RDF that uses the external URIs: done by Denny and me by the end of March.
* Inline SPARQL queries for the wiki: done by a new developer who will probably join our team soon.

== Stuff that we would be interested in ==

* User interface support for simplifying semantic annotation. It would be nice if we could suggest relations or categories for articles. Any ideas?
* SPARQL UIs. We will build online services for executing SPARQL queries. But how can an average user enter such a query? We will provide some solutions, but the more (easy-to-use) search interfaces, the better.

== Stuff that we are not focusing on ==

* Rich ontology language editing inside the wiki. Many unsolved problems here. Not suitable for Wikipedia, for reasons of scale (reasoning in OWL Lite is already ExpTime ...) and since most people don't understand it.

Just tell us what you would like to do, and I will send you some hints on how to approach the code (and also which parts of the code will be fundamentally changing in the next week ;-).

Regards,
Markus

> Thanks in advance for the help.

--
Markus Krötzsch
Institute AIFB, University of Karlsruhe, D-76128 Karlsruhe
ma...@ai...                     phone +49 (0)721 608 7362
www.aifb.uni-karlsruhe.de/WBS/   fax   +49 (0)721 693 717 |
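The Special:SearchTriple lookup described above ("enter the subject page and leave the other fields empty") is, in spirit, a pattern match over (subject, predicate, object) triples, with empty fields acting like SPARQL variables. A minimal, dependency-free sketch of that idea; the `wiki:` data below is invented for illustration and stands in for a wiki's RDF export:

```python
# Sketch of the triple lookup behind Special:SearchTriple.
# Triples are (subject, predicate, object); None acts as a wildcard,
# playing the role of an empty search field (or a SPARQL variable).

TRIPLES = [
    # hypothetical data, standing in for a wiki's RDF export
    ("wiki:Berlin", "wiki:population", "3400000"),
    ("wiki:Berlin", "wiki:capital_of", "wiki:Germany"),
    ("wiki:Germany", "wiki:member_of", "wiki:EU"),
]

def search_triple(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None = any)."""
    return [
        (s, p, o) for (s, p, o) in TRIPLES
        if subject in (None, s) and predicate in (None, p) and obj in (None, o)
    ]

# "Find the statements having this page as subject":
print(search_triple(subject="wiki:Berlin"))
```

A real deployment would run the same pattern as a SPARQL query (roughly `SELECT ?p ?o WHERE { wiki:Berlin ?p ?o }`) against the external triplestore service described in the mail.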
From: Marco B. <bra...@eb...> - 2006-03-09 10:35:08
|
Markus Krötzsch wrote:
> There is no such documentation yet, but I am happy to help out.

Ok, I'll have a look at the code in the next few days. Meanwhile, thank you for the thorough answer!

> What does it mean to "read RDF" with the wiki? Do you want to somehow
> "import" data? Denny and I are currently about to implement an extension
> that imports

Yes, I am thinking of an import feature that would make it possible to relate MediaWiki pages to external objects.

> SPARQL support is currently being implemented as an external service.
> This does not mean that SPARQL might not run on the same server -- we
> just architecturally separate the triplestore+SPARQL from the core wiki.

This seems to be the right way.

> We will soon support in-wiki markup for role hierarchies, transitive and
> symmetric roles, and maybe for basic domain and range restrictions, but
> there is no intention so far to provide more complex DL-style statements
> (i.e. no disjunctions, no arbitrary existential or universal quantifiers,
> no negations).

Yes, I am of the school of "the Semantic Web cannot be too much logics and AI"... I am using OWL with only its very basic features, such as the ones you mention here. What I would like to do in Semantic Wikipedia is to extend, in a plug-in-like style, the ontology that may be used to define categories and typed links. I think this would be useful for supporting specific domains of knowledge (for instance, I need to handle concepts related to microarray technology, used in biology). However, nothing more would be needed than: 1) basic relationships like is-a, transitivity, symmetry, and equivalence; 2) the possibility for the site administrator to load an OWL file that specifies the TBox part of the ontology one wants to support within one's instance of Semantic Wikipedia.

> It is still an open problem whether one can edit a complex ontology in a
> distributed, collaborative sense (a related problem: how does one edit an
> ontology in a modular fashion?).

That doesn't seem to be within the scope of Semantic Wikipedia. Other projects (pOWL, Protégé) are focused on that, aren't they?

> The problem is that rich languages such as OWL are inherently non-local:
> you make a statement in one article, and, combined with a couple of
> thousand other articles, it makes the whole ontology logically
> inconsistent.

Yes, support for ontologies should be minimal, and done in a way that still allows working with partial knowledge and resolving possible inconsistencies. Although this is still left as an easy exercise for the reader... :-)

> OK, now let's sum up which features are currently being implemented by us,
> where we could use some help, and what nobody is actually working on:

Thank you very much, your message was very useful.

Cheers.

--
===============================================================================
Marco Brandizi <bra...@eb...> |
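The first half of Marco's wish list (is-a plus transitivity from an admin-loaded TBox) boils down to computing a transitive closure over subclass axioms. A sketch under stated assumptions: the axiom list below is a hypothetical, hand-inlined stand-in for the OWL file he mentions (real code would parse OWL/RDF), and the class names are invented, microarray-flavoured examples:

```python
# Sketch of the TBox-loading idea: the site administrator supplies a set
# of subclass axioms, and the wiki precomputes the is-a hierarchy so
# category membership can be propagated transitively.

SUBCLASS_AXIOMS = [
    # (subclass, superclass) -- a toy, hypothetical TBox
    ("TwoColorArray", "Microarray"),
    ("Microarray", "Assay"),
    ("Assay", "Experiment"),
]

def superclasses(cls, axioms=SUBCLASS_AXIOMS):
    """All (transitive) superclasses of cls, via a simple closure."""
    result, frontier = set(), {cls}
    while frontier:
        # one step up the hierarchy, minus classes already found
        frontier = {sup for (sub, sup) in axioms if sub in frontier} - result
        result |= frontier
    return result

# Every article in Category:TwoColorArray is then also in these categories:
print(sorted(superclasses("TwoColorArray")))
# -> ['Assay', 'Experiment', 'Microarray']
```

Subtracting `result` from the frontier at each step also makes the closure terminate on cyclic axiom sets, which an admin-supplied file cannot be assumed to avoid.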
From: Markus <ma...@ai...> - 2006-03-09 11:26:59
|
On Thursday 09 March 2006 11:35, Marco Brandizi wrote:
> Markus Krötzsch wrote:
> > There is no such documentation yet, but I am happy to help out.
>
> Ok, I'll have a look at the code in the next few days. Meanwhile, thank
> you for the thorough answer!

Try to use the most recent CVS. The internal storage architecture will change again soon. The SMW_Hooks file is the core of the metadata processing.

> > What does it mean to "read RDF" with the wiki? Do you want to somehow
> > "import" data? Denny and I are currently about to implement an extension
> > that imports
>
> Yes, I am thinking of an import feature that would make it possible to
> relate MediaWiki pages to external objects.

This will be available in a basic form by the beginning of next week.

> > SPARQL support is currently being implemented as an external service.
> > This does not mean that SPARQL might not run on the same server -- we
> > just architecturally separate the triplestore+SPARQL from the core wiki.
>
> This seems to be the right way.

> > We will soon support in-wiki markup for role hierarchies, transitive and
> > symmetric roles, and maybe for basic domain and range restrictions, but
> > there is no intention so far to provide more complex DL-style statements
> > (i.e. no disjunctions, no arbitrary existential or universal quantifiers,
> > no negations).
>
> Yes, I am of the school of "the Semantic Web cannot be too much logics and
> AI"... I am using OWL with only its very basic features, such as the ones
> you mention here. What I would like to do in Semantic Wikipedia is to
> extend, in a plug-in-like style, the ontology that may be used to define
> categories and typed links. I think this would be useful for supporting
> specific domains of knowledge (for instance, I need to handle concepts
> related to microarray technology, used in biology). However, nothing more
> would be needed than: 1) basic relationships like is-a, transitivity,
> symmetry, and equivalence; 2) the possibility for the site administrator
> to load an OWL file that specifies the TBox part of the ontology one
> wants to support within one's instance of Semantic Wikipedia.

Great. I think we are working in a similar direction. The hard part is to allow the wiki to do anything useful with the loaded ontology (since processing OWL is computationally very complex).

> > It is still an open problem whether one can edit a complex ontology in a
> > distributed, collaborative sense (a related problem: how does one edit
> > an ontology in a modular fashion?).
>
> That doesn't seem to be within the scope of Semantic Wikipedia. Other
> projects (pOWL, Protégé) are focused on that, aren't they?

I am sure other projects are focused on the whole issue of cooperative ontology engineering. The question is whether this research is sufficiently mature to be included in a production wiki.

> > The problem is that rich languages such as OWL are inherently non-local:
> > you make a statement in one article, and, combined with a couple of
> > thousand other articles, it makes the whole ontology logically
> > inconsistent.
>
> Yes, support for ontologies should be minimal, and done in a way that
> still allows working with partial knowledge and resolving possible
> inconsistencies. Although this is still left as an easy exercise for the
> reader... :-)

:-)

> > OK, now let's sum up which features are currently being implemented by
> > us, where we could use some help, and what nobody is actually working on:
>
> Thank you very much, your message was very useful.

I would be glad to stay up-to-date with your developments, and of course we would also be happy to integrate contributed features into the main system ...

--
Markus Krötzsch
Institute AIFB, University of Karlsruhe, D-76128 Karlsruhe
ma...@ai...                     phone +49 (0)721 608 7362
www.aifb.uni-karlsruhe.de/WBS/   fax   +49 (0)721 693 717 |