I've read a handful of NLP books from my local university, but none of them made it quite clear how one should transform a syntax tree into a more semantic-centric representation. I liked the idea of transformational grammar (read about it in an 80's era book), but this has apparently fallen out of favor. Is there some dominant theory which has some notion similar to that of surface and deep structures? I would like it if I could transform sentences into a more neutral form for the construction of a semantic rep (is that called a concept dependency graph?). I suppose the various surface realizations have their purpose and one cannot successfully analyze semantics if some of that information is disregarded? I would deeply appreciate it if someone could point me to a theory/paper/book/website. Actually, if anyone knows of a good freely available piece of software that constructs a semantic representation, that would be awesome!
Recently, a task called Semantic Role Labeling (SRL) has emerged as a shallow semantic parsing technique. The task tries to capture the relationships between the participants in the event or situation a sentence describes. For example, given the sentence "The stock increased from $10 to $15 this week", an SRL system should return "the stock" as the SUBJECT, "$10" as the START POINT (provisionally named) and "$15" as the DEST POINT in the context of an INCREMENT event. The task is based on the Frame Semantics theory developed by Charles J. Fillmore, and much NLP research uses Frame Semantics to capture the semantics of natural language text. For further information, you can refer to:
FrameNet project of UCBerkeley: http://framenet.icsi.berkeley.edu/
Proposition Bank (PropBank, originally from UPenn): http://verbs.colorado.edu/mpalmer/palmer/projects/ace.html
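To make the output shape concrete, here is a minimal sketch in plain Python of what an SRL system might return for the example sentence above. The frame name INCREMENT and the role labels (SUBJECT, START_POINT, DEST_POINT) are the provisional names from this post, not FrameNet's actual inventory, and the lookup is hand-coded for this one sentence rather than a real labeler:

```python
def label_roles(sentence):
    """Toy stand-in for an SRL system: returns a hand-coded frame
    annotation for the one sentence in the example above."""
    if sentence == "The stock increased from $10 to $15 this week":
        return {
            "frame": "INCREMENT",       # provisional frame name
            "target": "increased",      # the frame-evoking predicate
            "roles": {
                "SUBJECT": "The stock",
                "START_POINT": "$10",
                "DEST_POINT": "$15",
                "TIME": "this week",
            },
        }
    return None

frame = label_roles("The stock increased from $10 to $15 this week")
print(frame["roles"]["DEST_POINT"])  # $15
```

The point is just that SRL output is a predicate plus a set of labeled argument spans, i.e. a small graph, rather than a constituency tree.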
In dependency structures, some semantic relations are also defined. You can derive the dependency structure of an English sentence with the Minipar parser, which is free (http://www.cs.ualberta.ca/~lindek/minipar.htm).
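A dependency parse is essentially a set of (head, relation, dependent) triples, which already reads like a shallow semantic representation. Here is a hedged sketch for the same example sentence; the relation names (subj, det, and the prepositions used as labels) are simplified stand-ins, not Minipar's actual label inventory:

```python
# Hand-written dependency triples for:
# "The stock increased from $10 to $15 this week"
deps = [
    ("increased", "subj", "stock"),
    ("increased", "from", "$10"),
    ("increased", "to", "$15"),
    ("increased", "tmod", "week"),
    ("stock", "det", "The"),
    ("week", "det", "this"),
]

def find_dependents(head, relation, deps):
    """Return all dependents of `head` attached via `relation`."""
    return [d for h, r, d in deps if h == head and r == relation]

print(find_dependents("increased", "subj", deps))  # ['stock']
print(find_dependents("increased", "to", deps))    # ['$15']
```

Walking such triples out from the main verb is one simple way to approximate the role assignments that SRL produces.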
There is a language called Universal Networking Language (UNL), originally developed by the United Nations University, with the aim of representing the semantics of natural language. In my opinion, the idea of UNL is similar to that of an interlingua. You can refer to http://www.undl.org/
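To give a rough flavor: a UNL encoding turns a sentence into a set of binary semantic relations between "Universal Words", which we can model as triples. The relation names below (agt for agent, plf/plt for from/to values, tim for time) are simplified from memory and may not match the official UNL specification exactly:

```python
# Approximate UNL-style relation graph for:
# "The stock increased from $10 to $15 this week"
unl_graph = [
    ("agt", "increase", "stock"),  # agent of the event
    ("plf", "increase", "$10"),    # starting value (approximate label)
    ("plt", "increase", "$15"),    # ending value (approximate label)
    ("tim", "increase", "week"),   # time of the event
]

# The main predicate is the hub every relation attaches to,
# which is what makes this an interlingua-style hypergraph
# rather than a language-specific parse tree.
hub = {head for _, head, _ in unl_graph}
print(hub)  # {'increase'}
```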
Hope this answers your question. Looking forward to further discussion.
Dat
Is the theory you mentioned Montague semantics?
Proposition Bank moved:
http://verbs.colorado.edu/~mpalmer/projects/ace.html
http://www.ldc.upenn.edu/Catalog/CatalogEntry.jsp?catalogId=LDC2004T14