Thought I would add my two cents. I think there is some confusion
between interface functionality and client proxy functionality. The
interface cannot implement a blocking wait, but the client proxy can. I
would suggest that the data packet simply include a "moving" byte, which
the client can read to find out whether the target has been reached.
Possibly a "blocked" byte would be handy as well.
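As a rough sketch of the split being suggested (all names here are hypothetical, not part of any existing interface): the interface just publishes the state bytes, and the client proxy builds the blocking behaviour on top by consuming data packets.

```cpp
// Hedged sketch: the interface only publishes state; the client proxy
// implements the blocking wait. ArmData and BlockingWait are
// hypothetical names.
#include <queue>

struct ArmData {
  unsigned char moving;   // non-zero while the arm is still in motion
  unsigned char blocked;  // non-zero if the arm could not reach the target
};

// Client-proxy side: consume incoming data packets until the arm stops
// moving; report success unless the blocked byte is set.
bool BlockingWait(std::queue<ArmData> &packets) {
  while (!packets.empty()) {
    ArmData d = packets.front();
    packets.pop();
    if (!d.moving)
      return d.blocked == 0;  // motion finished: success unless blocked
  }
  return false;  // ran out of packets without the arm stopping
}
```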
Also, I would go for a separate interface for individual motor/joint
control, as this is fairly abstract: something like a motor array
interface. Each motor can report its position/velocity and be commanded
with position/velocity.
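A minimal sketch of what such a motor array interface could look like (all names here are illustrative, nothing existing is assumed): each motor exposes its state and takes position and velocity commands.

```cpp
// Hedged sketch of a separate "motor array" interface: each motor
// reports position/velocity and accepts position/velocity commands.
#include <cstddef>
#include <vector>

struct MotorState {
  double position = 0.0;  // rad for rotary motors, metres for linear
  double velocity = 0.0;  // rad/s or m/s
};

class MotorArray {
 public:
  explicit MotorArray(std::size_t n) : states_(n) {}
  std::size_t NumMotors() const { return states_.size(); }
  MotorState GetState(std::size_t i) const { return states_.at(i); }
  // Commands; a real driver would forward these to the hardware.
  void CommandPosition(std::size_t i, double pos) { states_.at(i).position = pos; }
  void CommandVelocity(std::size_t i, double vel) { states_.at(i).velocity = vel; }
 private:
  std::vector<MotorState> states_;
};
```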
If the arm interface is doing its job then the likelihood of needing to
control individual joints is very slim, so let's not clutter the arm
interface. However, that's only my humble opinion (I've never used a
robot arm at this point, so I'm no expert), but from a software point of
view it's much cleaner and more flexible.
Geoffrey Biggs wrote:
> Daniel Aarno wrote:
>> This looks pretty good, I will give some feedback based on the two
>> interfaces I developed for a PUMA and power-cube arm here at CAS.
> Good to know we're thinking along similar lines.
>>> Do-stuff Functionality
>>> Goto (x, y, z, roll, pitch, yaw)
>>> Move the end effector to given position using IK. If no IK available
>>> in the driver, this functionality is not available (like the
>>> position command in the position interfaces not being supported by
>>> many drivers). Position is probably in robot coordinates.
>> In my experience the use of roll,pitch and yaw is inferior to using
>> direction vectors. For example, we use the normal vector n that
>> points in the direction the end-effector is pointing and a vector o
>> that is orthogonal to n and points in a predefined direction of the
>> gripper (the on hand camera on the power-cube arm, or a mounting
>> bracket on the PUMA).
> I think you're right, a vector for effector orientation does seem more
> natural. It would allow, for example, quickly calculating the vector
> between the desired position of the end effector and the target object
> using simple vector subtraction. I'm guessing the use of two vectors
> is to ensure absolute orientation of the end effector, eg a flat
> gripper pointing along the x axis while lying in the xy plane?
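That guess can be checked with a little vector arithmetic (a sketch only; the vector names follow the discussion above, nothing else is assumed): given orthonormal n and o, the third axis is their cross product, so the three together form a complete rotation and leave no orientation ambiguity.

```cpp
// Hedged sketch: two orthogonal vectors n (approach) and o (gripper
// direction) pin down the orientation completely, because the third
// axis is just n x o, giving a full rotation matrix (n, o, n x o).
#include <array>

using Vec3 = std::array<double, 3>;

Vec3 Cross(const Vec3 &a, const Vec3 &b) {
  return {a[1] * b[2] - a[2] * b[1],
          a[2] * b[0] - a[0] * b[2],
          a[0] * b[1] - a[1] * b[0]};
}

double Dot(const Vec3 &a, const Vec3 &b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Columns of the end-effector rotation: approach n, gripper direction o,
// and the induced third axis.
std::array<Vec3, 3> EffectorFrame(const Vec3 &n, const Vec3 &o) {
  return {n, o, Cross(n, o)};
}
```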
>>> VectorMove (x, y, z, length)
>>> Move the end effector along a given vector using IK with its initial
>>> position as the origin. Move until a joint reaches end of range of
>>> motion and the vector cannot be maintained or until the end
>>> effector has moved the given distance.
>> Perhaps vectorMove should also support orientation of the end-effector?
> What we were aiming for here is that the end effector is already in
> the desired orientation following a Goto command. So if you want to
> orient the end effector differently to its current orientation before
> performing a
> VectorMove then you would perform a Goto command using the current
> position and the new desired orientation, then perform your vector move.
> If you're thinking more along the lines of having the end effector
> reorient itself during the VectorMove... I don't have much experience
> with arms so I don't know how useful that would be, but I guess it
> could be used.
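For reference, the target of a VectorMove as described above reduces to a short computation (a sketch under the stated semantics; the joint-limit clamping is left out and the function name is hypothetical):

```cpp
// Hedged sketch of the VectorMove target: from the current position,
// move along a direction vector for a given distance.
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

Vec3 VectorMoveTarget(const Vec3 &start, Vec3 dir, double length) {
  double norm = std::sqrt(dir[0] * dir[0] + dir[1] * dir[1] + dir[2] * dir[2]);
  for (double &c : dir) c /= norm;  // use the direction only, not its magnitude
  return {start[0] + length * dir[0],
          start[1] + length * dir[1],
          start[2] + length * dir[2]};
}
```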
>>> Home [joint#]
>>> Move the entire arm or optionally a provided joint number to a safe
>>> home position.
>>> Stop the arm immediately and completely (ie hold position).
>>> Turn the arm on/off.
>> Many arms also have brakes; it would be good to be able to turn
>> brakes on/off and query whether they are active (e.g. the
>> power-cubes turn brakes on after a short time of no motion).
> That's a good idea. Are brakes commonly entire-arm or are there
> separate brakes for each joint? I guess there may be an arm produced
> that has one or the other, so we should support both just as with
> Home. At the same time, I think the Stop command should also support
> stopping a single joint.
>> I don't think there is a problem, just have to think about the units
>> for example if there is non-rotary joints.
> That could be an interesting thing to deal with. Non-rotary joints
> would move a distance rather than an angle, correct? I'm assuming
> they'd be, for example, a hydraulic piston type system for extending
> the length of an arm segment. This would not change the Goto,
> VectorMove, etc commands but the SetJointPosition and SetJointSpeed
> commands may need to perform checks for the type of joint
> (linear/rotary) to ensure unit correctness. The only shortcoming with
> this is the failure would be server-side and harder for the client to
> determine the cause of. The alternative is have
> SetRotaryJointPosition/Speed and SetLinearJointPosition/Speed
> commands, but this seems to be cluttering the interface a bit.
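The single-command alternative described above could look something like this (a sketch only; the names, the type enum, and the range check are illustrative assumptions, not an existing API):

```cpp
// Hedged sketch: one SetJointPosition command whose units are implied by
// the joint's type, with the validation happening server-side, instead
// of separate rotary/linear commands.
#include <cstddef>
#include <stdexcept>
#include <vector>

enum class JointType { Rotary, Linear };

struct Joint {
  JointType type;
  double min, max;        // rad for rotary joints, metres for linear ones
  double position = 0.0;
};

// Server-side check: out-of-range targets fail on the server, which is
// the shortcoming noted above (the client only sees a generic failure).
void SetJointPosition(std::vector<Joint> &joints, std::size_t i, double pos) {
  Joint &j = joints.at(i);
  if (pos < j.min || pos > j.max)
    throw std::out_of_range("target outside joint's range of motion");
  j.position = pos;
}
```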
>>> One other issue is knowing when the arm has reached the position you
>>> told it to, or if it couldn't get to it for some reason (collision
>>> prevention, out of range, etc). Perhaps a "Reached" message of some
>>> kind that returns if the arm has reached, is moving towards or
>>> failed to reach the last commanded Goto/VectorMove position?
>> Even a wait() function would be useful, one that blocks the caller
>> until the arm reaches the target or an error occurs.
> It would be good to have both, allowing clients either to call the
> Wait() function and avoid a busy loop that constantly checks and uses
> up network bandwidth, or to go off and do something else for a while,
> regularly checking whether the arm has finished yet using the
> Reached() function.
> So taking these changes into account, here's an updated version:
> Do-stuff Functionality
> Goto (x, y, z, nX, nY, nZ, oX, oY, oZ)
> Move the end effector to given position using IK. If no IK available
> in the driver, this functionality is not available (like the position
> command in the position interfaces not being supported by many
> drivers). Position is probably in robot coordinates. The (nX, nY, nZ)
> and (oX, oY, oZ) vectors determine the orientation of the end effector
> at the end of the move and should be orthogonal to each other.
> VectorMove (x, y, z, length)
> Move the end effector along a given vector using IK with its initial
> position as the origin. Move until a joint reaches end of range of
> motion and the vector cannot be maintained or until the end effector
> has moved the given distance.
> Home [joint#]
> Move the entire arm or optionally a provided joint number to a safe
> home position.
> Stop [joint#]
> Stop the arm, or optionally a provided joint number, immediately and
> completely (ie hold position).
> Brake [joint#]
> Turn brakes on/off for whole arm or optionally a single joint.
> Turn the arm on/off.
> SetJointPosition (joint#, angle)
> Move the specified joint to the specified position, or as close as
> possible.
> SetJointSpeed (joint#, speed)
> Set the specified joint to the specified speed.
> Tell-me-stuff Functionality
> Position -> x, y, z, roll, pitch, yaw
> Gives the position of the end effector. Probably in robot coordinates,
> as in Goto().
> JointPosition (joint#) -> rad
> Gives the position of joint#.
> JointSpeed (joint#) -> rad/s
> Gives the speed of joint#.
> JointHome (joint#) -> rad
> Gives the home position of joint#.
> JointMin (joint#) -> rad
> Gives the minimum position of joint#.
> JointCentre (joint#) -> rad
> Gives the centre position of joint#.
> JointMax (joint#) -> rad
> Gives the maximum position of joint#.
> NumJoints -> int
> Gives the number of joints in the arm. If we treat each DoF as a
> joint, even if they are physically in the same joint, this will be the
> total DoF in the arm. Treating each DoF as a joint is probably easier
> in terms of making a nice interface. I can't think of any reason why
> it shouldn't be possible to do this, but someone with more experience
> with arms might.
> Reached -> status code
> Returns if the arm is responding to, has finished responding to or
> could not complete the most recent movement command (Goto, VectorMove,
> Home, SetJointX).
> Wait -> status code
> Blocking version of Reached.