From: Smith, B. <phi...@bu...> - 2008-01-25 14:10:29
At 08:37 AM 1/25/2008, Matthew Pocock wrote:
>On Friday 25 January 2008, Ryan Brinkman wrote:
> > > Just to start you off: In my case there are intentions to perform
> > > actions fulfilling the content of the phrase and to accept the
> > > obligations consequent on uttering the phrase.
> > > (There are similar sorts of intentions involved, e.g., in protocol
> > > applications. If the protocol requires that I feed the mouse with
> > > rice, and my brother, for a joke, leaves rice in the same room with
> > > the mouse, my brother is not performing a protocol application.)
> > > BS
> >
> > Assuming the manner in which the rice was left followed the
> > requirements laid out in the protocol and your brother did not
> > introduce any potentially confounding conditions (e.g., the protocol
> > did not specify that it was BS specifically who had to feed the mice,
> > the amount and condition the rice was left in were correct, he didn't
> > jump around the room), from an experimental standpoint were not the
> > requirements of the protocol realized, even if that was not your
> > brother's intent? I believe this is what matters to those who would
> > use OBI, since operationally I would not handle the results of the
> > experiment any differently than if you had fed the mice (again, since
> > there was a proper application of the protocol, even if that was not
> > the intent).

I am assuming that the protocol will have a good ontology built in (the
ontology people use when they speak English and say things like: feed the
mice at regular intervals with ...).

So let's twist the wheel one bit further. You are a lazy, but lucky,
experimenter. You need to feed the mice twice a day, but you have learned
that the mice are good at opening bags of rice with their teeth, and that
it takes them about 12 hours to get them open. You leave two bags of rice
there every morning. All the mice get fed, exactly according to the
protocol. (You are lucky.)
Will you still say: 'from an experimental standpoint ... the requirements
of the protocol were realized'?

>Or to say essentially the same thing differently, in the one case there
>was a plan to feed the mice according to experimental requirements in the
>agent that facilitated the action. In the other there was not. In the
>first case, the mouse-feeding activity is a realization of the plan, in
>the second case it is not. In both cases, the physical activity of
>mouse-feeding is exactly the kind required by the experiment. We should
>not conflate these two features.

Physical events and activities are entities of two different kinds. The
protocol relates to the latter.

>If you conflate the intent (that an agent realized a plan through an
>activity) with the activity, you naturally get promiscuous multiple
>inheritance on the activity side.

I hope that I am not doing that.

>Any given physical activity could be the outcome of no plan being
>followed, or any one of a large number of plans, possibly but not
>necessarily with the plans falling into some sort of hierarchy.
>
>Plan: make a cup of tea
>Activity 1: pour boiling water into cup ; add teabag ; stew ; remove
>teabag ; add milk ; add 2 sugars
>Activity 2: pour milk into cup ; add sugar ; add tea brewed in teapot
>Agent 1: me
>Agent 2: tea-making robot, non-sentient
>
>So, there are two physical activities that result in a cup of tea. Either
>I or my tea robot can participate in these activities. If I make the tea,
>I am realizing the plan to make tea. If the robot does, it is not (by
>analogy to the earlier parrot). Hence, we have 4 combinations here, which
>by the BS logic would give rise to 4 universals of activity, and clearly
>have 2 super-types of these, one co-ordinated by realization, the other
>by base-activity.

The robot needs to be programmed and switched on by you, with the same
intention (to make tea).
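Matthew's cross-product worry can be made concrete with a small sketch.
This is a hypothetical toy model, not OBI's actual class structure; all
names are invented for illustration. Classifying activities by both their
physical step-sequence and whether they realize a plan multiplies the leaf
classes, exactly as the 4-combinations argument says:

```python
from dataclasses import dataclass
from itertools import product
from typing import Optional

@dataclass(frozen=True)
class Plan:
    goal: str

@dataclass(frozen=True)
class Agent:
    name: str
    can_intend: bool  # stands in for "capable of realizing a plan"

@dataclass(frozen=True)
class Activity:
    steps: tuple            # the ordered physical steps
    agent: Agent
    realizes: Optional[Plan]  # None = no plan realized

tea = Plan("make a cup of tea")
me = Agent("me", can_intend=True)
robot = Agent("tea-making robot", can_intend=False)

steps1 = ("pour boiling water", "add teabag", "stew",
          "remove teabag", "add milk", "add 2 sugars")
steps2 = ("pour milk", "add sugar", "add tea brewed in teapot")

# Each (step-sequence, agent) pair; realization holds only when the
# agent can intend, as in the robot/parrot analogy.
activities = [
    Activity(s, a, tea if a.can_intend else None)
    for s, a in product((steps1, steps2), (me, robot))
]

# Classifying by (steps, realized-or-not) yields 4 leaf universals,
# grouped under 2 super-types (realization vs base-activity).
leaves = {(a.steps, a.realizes is not None) for a in activities}
supers = {a.realizes is not None for a in activities}
print(len(leaves), len(supers))  # 4 2
```

The point of the sketch is only combinatorial: every independent basis of
classification pushed into the taxonomy doubles (or worse) the number of
activity classes, which is the "promiscuous multiple inheritance" being
objected to.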
If you, a careful and non-lazy experimenter, wake up one morning and find
that the mouse has nearly pulled open one of your bags of rice, you may
decide (intentionally) to use that half-open bag of rice. You convert
mouse activity into human intentional activity according to the protocol.

>I take from this example that whether a physical activity is the
>realization of a plan or not is not a good basis for categorising it,
>much as categorising balloons by whether they are red or not is also not
>a good basis for categorising them.

If OBI were interested exclusively in physical activities we might need a
different basis of classification. But OBI is interested in two kinds of
activities: the intentional (performed by experimenters; conceivably also
by subjects), and the non-intentional, e.g. cells dying.

>Whether something is a realization of a plan is, IMHO, a very clearly and
>naturally defined class. For deciding what the consequences are in the
>real world, this information is eclipsed by what actually happened.

A scientific investigation is not only a matter of consequences, but also
a matter of how those consequences were reached (deliberately, carefully,
on the basis of the application of these and those protocols, honestly,
etc.)

BS

> > Ryan
>
>Matthew
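The two-kinds view can be sketched in the same toy style (again
hypothetical names, not OBI terms): intentionality is the top-level
division, and protocol realization subdivides only the intentional branch,
so physically similar happenings can land in different classes, as with
the half-open bag of rice:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Activity:
    description: str
    intentional: bool          # the proposed top-level division
    realizes_protocol: bool = False

def classify(a: Activity) -> str:
    # Only intentional activities can count as protocol applications;
    # the physical outcome alone does not decide the class.
    if a.intentional:
        return "protocol application" if a.realizes_protocol \
               else "intentional activity"
    return "non-intentional activity"

feeding = Activity("experimenter feeds mice with rice", True, True)
# Same bag, different classes: the mice gnawing at it vs. the
# experimenter deciding to use the half-open bag.
mice_open_bag = Activity("mice pull open a bag of rice", False)
adopt_bag = Activity("use the half-open bag deliberately", True, True)
cells_dying = Activity("cells dying", False)

for a in (feeding, mice_open_bag, adopt_bag, cells_dying):
    print(a.description, "->", classify(a))
```

The design choice this illustrates: realization is recorded as a feature
of an already-classified intentional activity, rather than crossed against
every physical activity type, which avoids the 4-universals blow-up from
the tea example.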