Does Cyc have any knowledge that lets it reason about the degree of likelihood of statements? (Degrees such as "likely" or "unlikely", not necessarily numeric values.) For example, can it infer that a human eating a cat is possible, but quite improbable?
No, not to my knowledge.
Cyc only deals with discrete rules (e.g. a cat is an animal). It doesn't include any usage examples that you could use to model the occurrences in which "cat" refers to an animal, versus a person's name, versus a piece of construction equipment.
Personally, I find this to be Cyc's fatal shortcoming, and it'll likely remain that way.
<Insert rant about ontologies ignoring the importance of modeling certainty here>.
Yes, you can assert microtheories about probability, and any other mathematical concept, so long as you can express it as a collection of other concepts and microtheories. It's actually a relatively easy idea to thrash out, and by following the tutorials you could have a working model within a day or two. Chris, unfortunately, not only fails at spelling, but also fails to have any sort of clue what he's talking about.
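To make the idea concrete: here is a minimal Python sketch (not CycL; the microtheory names, statements, and likelihood vocabulary below are all hypothetical, invented purely for illustration) of what "qualitative likelihood assertions scoped to microtheories, with fallback through a parent-microtheory chain" could look like:

```python
# Hypothetical sketch: qualitative likelihood assertions scoped to
# microtheories, with lookup falling back through a parent-Mt chain.
# None of these names come from Cyc; they only illustrate the idea.

# Each microtheory maps a statement (a tuple) to a qualitative degree.
MTS = {
    "BiologyMt": {
        ("eats", "Human", "Cat"): "unlikely",
        ("eats", "Human", "Chicken"): "likely",
    },
    "FamineMt": {
        ("eats", "Human", "Cat"): "possible",
    },
}

# Child Mt -> parent Mt (a stand-in for Cyc's microtheory hierarchy).
PARENT = {"FamineMt": "BiologyMt"}

def likelihood(mt, statement):
    """Return the most specific qualitative degree visible from `mt`."""
    while mt is not None:
        if statement in MTS.get(mt, {}):
            return MTS[mt][statement]
        mt = PARENT.get(mt)
    return "unknown"

print(likelihood("FamineMt", ("eats", "Human", "Cat")))   # -> possible
print(likelihood("BiologyMt", ("eats", "Human", "Cat")))  # -> unlikely
```

The point is only that the bookkeeping is simple: scoping an assertion to a context and inheriting from more general contexts is the part Cyc's microtheory machinery already gives you.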
CycL (the representation language of the Cyc/OpenCyc system) is Turing complete.
OpenCyc's major shortcoming is communication with and development of a community.
It may be more accurate to say that, while the concept can be built, it isn't a "default" premise in Cyc's method of handling new data. I have been fiddling with the "teaching game" thing, and have *never* seen a question come up along the lines of "a melon is sometimes green", or the like. Instead, all you ever see is stuff like "generally yellow", "generally green", etc. Basically, even if that eventually derives, on the other end, some result along the lines of "it can be, sometimes", it's hardly clear that this happens in the game *at all*. Worse, in some cases it results in some seriously odd assertions, where you badly want to explain why the assertion is "partly" true, but not true as stated. lol
Hmm. I suppose, in a way, you are right: it has serious issues with communication. Example: it might help if the game the full-Cyc people set up included a "Could you state this in a different way?" button. Some of the statements, if you look at them one way, *could* imply a correct assertion, but looked at in a more common way they seem wrong, and the "teachers" are almost universally going to confuse the hell out of the poor AI by going with the common reading rather than the obscure interpretation.
A good example of one where, imho, it may be failing: "The act of shaking one's head expresses distress." Certain things can be subsets of an expression, and thus express more than one thing in a different context, but you flat out cannot capture that with yes, no, don't know, or doesn't make sense, which are the only answer options.
So I reiterate my original point: no, Cyc doesn't do this. Saying "yeah, it can do this because CycL is a programming language, so it can do anything" is disingenuous. If I have to implement all the "mathematical modeling" myself, I'd be better off using Markov logic networks or some other existing statistical framework. The fact is, Cyc wasn't designed to handle uncertainty, and its developers seem unlikely to change this anytime soon.
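To show what "implementing the mathematical modeling myself" actually entails, here is a toy Markov-logic-style calculation in pure Python (the atoms, formulas, and weights are invented for illustration): each possible world is scored by exp(sum of weights of the formulas it satisfies), and a marginal probability is read off the normalized distribution. This is the machinery a statistical framework gives you out of the box.

```python
import itertools
import math

# Ground atoms of a toy world (names are hypothetical).
ATOMS = ["HumanEatsCat", "HumanIsStarving"]

# Weighted formulas over truth assignments (Markov-logic style).
# A positive weight makes worlds satisfying the formula more probable.
FORMULAS = [
    # Eating cats is generally improbable.
    (2.0, lambda w: not w["HumanEatsCat"]),
    # If starving, eating a cat becomes more plausible (starving -> eats).
    (1.5, lambda w: (not w["HumanIsStarving"]) or w["HumanEatsCat"]),
]

def marginal(query_atom):
    """P(query_atom = True) under the log-linear world distribution."""
    z = 0.0    # partition function (total score over all worlds)
    num = 0.0  # total score over worlds where the query atom holds
    for values in itertools.product([False, True], repeat=len(ATOMS)):
        world = dict(zip(ATOMS, values))
        score = math.exp(sum(wt for wt, f in FORMULAS if f(world)))
        z += score
        if world[query_atom]:
            num += score
    return num / z

p = marginal("HumanEatsCat")
print(f"P(HumanEatsCat) = {p:.3f}")  # small but nonzero: possible, improbable
```

The query comes out "possible but quite improbable", which is exactly the qualitative answer the original question asked for; the complaint stands that none of this comes built into Cyc.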
I'd love to be proven wrong by someone, anyone, citing documentation, a tutorial, or anything that illustrates how to use Cyc to reason about uncertainty. Until then, I'll have to be content with smug know-it-alls deflecting these problems by complaining about my spelling mistakes.
Well, it's a case of yes and no. It uses uncertainty to determine *if* an assertion is likely true or not. What it doesn't do, from my admittedly limited experience with it, is handle cases where uncertainty is *inherent* in the assertion/construct/concept itself. This, if true, is imho a fatal flaw. But I can't be certain I am correct.