[Embedlets-dev] Re: Bridging JAPL and Embedlets
From: Andrzej J. T. <an...@ch...> - 2003-04-01 02:49:22
Ted sed:
> Well, we are getting there but I still have some more questions so I hope
> you will bear with me! ;-)
No problem, Ted. That's what this list is for.
> The understanding of JAPL peripherals that I have been working under is
> that they are generic, standardized interfaces for any kind of peripheral
> I/O circuits from simple I/O ports through medium complexity devices like
> UARTs and up to highly sophisticated I/O systems like a RobotScanner.
That is how I perceive JAPL peripherals as well.
> My working definition of what a device driver means in the context of
> Embedlets is that it is the piece of software that is responsible for
> doing the impedance matching between the I/O level semantics of the JAPL
> peripheral and the application level semantics of the Embedlet based
> application.
I think of device drivers more like Windows drivers. They are typically not application specific; they just expose a higher-level device API. Hence my use of the terms JAPL peripheral and device driver synonymously.
> For example, if a cannon were attached to bit 3 of a JAPL I/O port, then
> an event like "fire forward cannon" from the FireControl Embedlet would
> eventually get translated into something like JaplPort.setBit(3, true);
I would call JaplPort.setBit(3, true) the device driver level. The fireForwardCannon() method, implemented in a more application-specific class, is something I would put in a wrapper above JAPL but below Embedlets.
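To make that layering concrete, here is a minimal sketch of such a wrapper. The class name CannonController and the way the JaplPort reference is obtained are assumptions for illustration; only JaplPort.setBit(3, true) comes from the example above.

public class CannonController   // hypothetical app-specific wrapper: above JAPL, below Embedlets
{
    private final JaplPort port;    // generic JAPL I/O port peripheral

    public CannonController( JaplPort port )
    {
        this.port = port;
    }

    public void fireForwardCannon()     // application-level semantics over raw I/O-level semantics
    {
        port.setBit( 3, true );         // bit 3 drives the forward cannon
    }
}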
> Well, I am trying to understand how the Embedlet implementing a JAPL
> interface and registering itself as a listener for Events coming from a
> JAPL peripheral (that it already has a reference to) using this interface
> makes the JAPL peripheral at all dependent on the Embedlet code base?
It doesn't. It makes the Embedlet application code too dependent on JAPL code and, more specifically, dependent on JAPL interrupts/events. The reasons why this is a "bad thing" come later in this email.
> and then the Embedlet would implement this interface and then register
> itself as a JaplEventListener just like any other JAPL client would.
You're close.....I am suggesting that the best-practices approach is to use three distinct classes:
1) A JAPL Peripheral (possibly with a more app-specific layer over the top to provide things like the fireForwardCannon() method).
2) A FireControl Embedlet that responds to EnemyInRange Embedlet Events (and which would possibly call the fireForwardCannon() method if it determined that it should/could fire).
3) A class that acts as a bridge between these two. Call it a wrapper/proxy/translator/mediator. It implements JAPLEventListener and org.embedlets.event.EventProducer. When it gets called with a JAPL Event (if we detected an enemy in range), all it does is create a new Embedlet Event, attach any relevant info (maybe the JAPLEvent set as the UserData in the Embedlet Event) and send the Embedlet Event.
In our example, it would look something like this:

public class EnemyDetectorMediator implements JAPLEventListener,
                                              org.embedlets.event.EventProducer
{
    public EnemyDetectorMediator()    // constructor: get the peripheral and hook up the listener
    {
        EnemyRadar radar = getRadarJAPLPeripheral(...);
        radar.registerJAPLEventListener( this );
    }

    public void receiveJAPLEvent( JAPLEvent japlEvent )    // called by the JAPL layer
    {
        // Translate the JAPL event into an Embedlet Event and hand it to the Container.
        EventService eventService = Container.getEventService();
        EmbedletEvent event = eventService.getEventInstance( "enemy.detected", ... );
        event.setUserData( japlEvent.getRangeAndDirection() );
        eventService.send( event );
    }
}
This code is very simple. Easy to understand. Not much code required. And it isolates the "bridging" code away from both the JAPL device stuff and the Embedlet application code (fire control details).
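For completeness, here is a sketch of what the FireControl Embedlet (class 2 above) might look like on the receiving side. The EventConsumer interface, the receiveEvent() callback, event.getName() and the CannonController wrapper are all assumptions for illustration; only the "enemy.detected" event and the fireForwardCannon() idea come from the discussion above.

public class FireControlEmbedlet implements org.embedlets.event.EventConsumer   // hypothetical Embedlet API
{
    private CannonController cannon;    // app-specific wrapper over the JAPL port (assumed)

    public void receiveEvent( EmbedletEvent event )    // delivered by the Container, not by JAPL
    {
        if ( "enemy.detected".equals( event.getName() ) )    // getName() is assumed
        {
            cannon.fireForwardCannon();    // battle strategy: always fire immediately on sighting
        }
    }
}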
> the Embedlet already has a dependency on the JAPL peripheral, what is the
> harm of allowing it to have a further dependency on a JAPL interface
> (again, there are no Embedlet dependencies in the JAPL code)?
There are a number of good reasons for using the "mediator/proxy" class to bridge the two worlds of JAPL and Embedlets....
First, it keeps the two event models distinct and decoupled, which is a good thing. This means that your Embedlet code (FireControl) is not dependent on a particular JAPL implementation or even specific devices (so you could switch between Radar ranging and Satellite imagery/GPS detection of enemies with no change to your FireControl Embedlet).
Also, Embedlets are controlled by the Container (ref: the Inversion of Control architectural design principle). If you let the JAPL layer do a callback directly into the Embedlet, you have bypassed the Container control mechanisms, which could be a "very bad thing". The container implementation may be enforcing prioritization through the event queues, and by bypassing this with a direct callback you may compromise realtime deterministic response guarantees. Also, a container implementation may choose to use a dedicated thread per Embedlet....and calling into the Embedlet using the JAPL thread could easily cause problems.
These are just two examples of why the middle "proxy" class that is NOT an Embedlet is a good idea, since it avoids all of these potential issues with various container implementations (and there will be many varied Embedlet container implementations that could easily make such implementation decisions...if we are successful!).
> My thought was that for any given JAPL peripheral, one Embedlet in the
> application would be used to wrap around it and then all the code needed
> to perform the semantic impedance matching with this JAPL peripheral would
> be isolated inside of this single Embedlet (including messaging code and
> JAPL event handling code).
Close....but for the reasons above, it's better practice to use
a "mediation" class as per my example above.
> I am still trying to understand what the benefit is of allowing an
> Embedlet to send sub-application-semantic-level messages directly to a
> JAPL peripheral but to have sub-application-semantic-level asynchronous
> events coming from the JAPL peripheral to be broadcast to an arbitrary
> number of Embedlets via the Embedlets event mechanism?
OK...let's follow on with the battle example. If you use the "mediator class" approach, when you first write your battlebot code you might implement just the FireControl Embedlet. This is a simple case, so you are right to question why you should go to the bother of creating the separate "mediation" class when only one Embedlet will receive the events.
However, as your battlebot evolves, you decide that you want to add analysis and potential avoidance behaviour (run away from the enemy), but it should be lower priority than firing the cannon (since you have decided to always fire the cannon immediately upon sighting an enemy as part of your battle strategy).
So you write an Avoidance Embedlet and register it to receive the EnemyDetected Embedlet Event. Note...you didn't change a line of code in your FireControl Embedlet. Nor did you change a line of code in your JAPL classes, nor in your "mediator" class that translates from the JAPL event to the Embedlet Event. And you can tell the container to first send the event (using XML config stuff) to the FireControl Embedlet; then you can analyze the situation and decide whether to flee or not in the Avoidance Embedlet.
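As a rough sketch (again assuming a hypothetical EventConsumer-style Embedlet API, plus a getUserData() accessor matching the mediator's setUserData() call; the delivery order would live in the container's XML config, not in code), the new Embedlet might look like this:

public class AvoidanceEmbedlet implements org.embedlets.event.EventConsumer   // hypothetical API
{
    public void receiveEvent( EmbedletEvent event )
    {
        Object rangeAndDirection = event.getUserData();    // attached earlier by the mediator (assumed accessor)

        if ( shouldFlee( rangeAndDirection ) )
        {
            runAway();    // application-specific evasive behaviour (assumed)
        }
    }

    private boolean shouldFlee( Object rangeAndDirection ) { /* analysis... */ return false; }

    private void runAway() { /* drive the motors via another JAPL wrapper... */ }
}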
Look at the modularity and flexibility that you have available at your fingertips, Ted! You added an advanced enemy-avoidance strategy without touching a line of your other code. Contrast this with what you would have had to do if you had made the FireControl Embedlet receive the JAPL callback directly....you would have had to refactor a lot of code (and would have ended up with a mediator class anyway), or you would have mixed FireControl with Avoidance code, which is not a good way to split up application requirements that are basically independent of each other.
Now you could say you could implement the "translation" of JAPL interrupts to Embedlet Events with a 3rd Embedlet. But there are issues with that and the Container inversion of control principle that I have already explained earlier.
> I think you are going to say that the JAPL wrapper will convert between
> the I/O level semantics of the JAPL peripheral and the application-level
> semantics of the application but I still can't see why all of this
> impedance matching is not happening in one place (either all in one
> Embedlet or all in the JAPL wrapper)?
See above. The impedance matching has one foot in each camp....and so it makes sense to keep it separate and in the middle, rather than try to arbitrarily force it to live on either side of the fence.
It has the nice side benefit of separating the "bridging" code away from the device and/or application code. This makes the final app easier to maintain, and easier to understand, since you are not mixing roles/functions.
> From my perspective, this is the kind of situation that is confusing me:
Do the above descriptions help lessen the confusion at all?
> So, why does all of the outgoing Embedlet/JAPL impedance matching happen
> in an Embedlet and all of the incoming JAPL/Embedlet impedance matching
> happen in a completely separate and unrelated piece of software (the JAPL
> wrapper)?
I should not have called the piece of code that bridges the two worlds the "JAPL Wrapper", since it's not part of JAPL...nor is it part of Embedlets....it sits on the fence and acts as "glue" between the two. That might be what is confusing you. And you only need to fix the impedance-mismatch problem between the two event models (see the many reasons why above) when the JAPL device is using an interrupt-driven approach.
If you are just issuing a method call against the JAPL layer from the Embedlet, that does not compromise the Container Inversion of Control principles, and is akin to calling a lower-level method to do your work. So that is OK.
> Why not let one Embedlet handle impedance matching in both
> directions since having an Embedlet implement something like the
> JaplEventListener interface completely prevents JAPL peripherals from
> having dependencies on Embedlets?
See above for the reasons why. The only point I'm trying to make is that the "thing" that handles the impedance mismatch should be neither a JAPL peripheral nor an Embedlet. It's a thing unto itself...with one foot in each camp, per my example.
In fact, you could create a class that handled the translation between the two event models AND also provided application-level semantics to the Embedlets (e.g. add a setDetectorSensitivity() method to the example EnemyDetectorMediator class I outlined above).
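A minimal sketch of that idea, assuming the EnemyRadar peripheral exposes something like a setSensitivity() call (that method, and keeping the radar reference in a field, are assumptions):

public class EnemyDetectorMediator implements JAPLEventListener,
                                              org.embedlets.event.EventProducer
{
    private EnemyRadar radar;    // kept from the constructor (assumed field)

    // ... constructor and receiveJAPLEvent() as in the earlier example ...

    public void setDetectorSensitivity( int level )    // application-level knob for the Embedlets
    {
        radar.setSensitivity( level );    // delegate down to the JAPL peripheral (assumed API)
    }
}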
> Again, my main goal here is to completely understand the reasoning behind
> the decisions being made in this part of the architecture so please bear
> with me because I am having a tough time with it! ;-)
I hope this long description and examples help. You're pretty close to getting it! ;-)
...Andrzej
Chaeron Corporation
http://www.chaeron.com