From: Donnie R. <don...@is...> - 2013-05-02 23:05:19
To Whom It May Concern:

My name is Donnie Rivera, and I work at Intelligent Software Solutions. I took over Chris Shellenbarger's project over the last year and have made several changes to the software. One of those changes was moving from Java 1.6 to Java 1.7, which is what caused the DL Learner bug to appear. Our software still uses DL Learner for what we call the "Hypothesis Refinement" process, during which DL Learner used to learn specific types. Unfortunately, with all the changes, DL Learner stopped learning these concepts. After looking into the problem, I discovered a bug within DL Learner, but I also wanted input on the side effects of the current workaround.

The issue starts with an assumption made in the OWLAPIReasoner. getSuperClassesImpl() assumes that an ordered NodeSet<OWLClass> will be returned from the PelletReasoner. This holds under Java 1.6, but not under Java 1.7. That assumption is what leads to the bug in DL Learner's ClassHierarchy.thinOutSubsumptionHierarchy() process. During the cloneAndRestrict() step, DL Learner's OWLAPIReasoner.getFirstClasses() grabs the first node from the NodeSet<OWLClass> returned by the PelletReasoner. This is the cause of the problem: under Java 1.7 classes get mapped to both owl:Thing and rdfs:Resource, whereas under Java 1.6 they were only mapped to owl:Thing. Then, during thinOutSubsumptionHierarchy(), the upward mapping gets corrupted, which is why DL Learner stopped learning specific types.

This is easily worked around by setting the improveSubsumptionHierarchy attribute to false, but what are the side effects of doing so?

Regards,

Donnie Rivera
Software Engineer
Intelligent Software Solutions (ISS)
Work: (719) 457-0228
Mobile: (719) 242-8522
Email: don...@is...
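
P.S. Here is a minimal, self-contained sketch of the order dependence I'm describing. It uses only the plain OWL API (no Pellet or DL Learner classes), and the class name NodeSetOrderDemo plus the hand-built NodeSet are made up purely for illustration; this is not DL Learner's actual code.

import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.IRI;
import org.semanticweb.owlapi.model.OWLClass;
import org.semanticweb.owlapi.model.OWLDataFactory;
import org.semanticweb.owlapi.reasoner.impl.OWLClassNode;
import org.semanticweb.owlapi.reasoner.impl.OWLClassNodeSet;

public class NodeSetOrderDemo {
    public static void main(String[] args) {
        OWLDataFactory df = OWLManager.createOWLOntologyManager().getOWLDataFactory();
        OWLClass thing = df.getOWLThing();
        OWLClass resource = df.getOWLClass(
                IRI.create("http://www.w3.org/2000/01/rdf-schema#Resource"));

        // Stand-in for what the reasoner might hand back as the superclasses of
        // some class C: two nodes, owl:Thing and rdfs:Resource, with no
        // guaranteed iteration order.
        OWLClassNodeSet supers = new OWLClassNodeSet();
        supers.addNode(new OWLClassNode(thing));
        supers.addNode(new OWLClassNode(resource));

        // Fragile: "first" depends on the set's iteration order, which is what
        // changed between Java 1.6 and Java 1.7, so this can print either class.
        OWLClass first = supers.iterator().next().getRepresentativeElement();
        System.out.println("first superclass node: " + first);

        // Order-independent alternative: decide by identity, not by position.
        boolean onlyThing = true;
        for (OWLClass c : supers.getFlattened()) {
            if (!c.isOWLThing()) {
                onlyThing = false;
                break;
            }
        }
        System.out.println("only owl:Thing among the supers? " + onlyThing);
    }
}

The final loop is only meant to illustrate the kind of order-independent check I think getFirstClasses() would need; I haven't tried patching DL Learner itself yet.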