VipNav
Introduction:
Due to the limited market for this invention, and the amount of development
effort required to reduce it to practice relative to the cost per unit that
would be required to break even, it would not be simultaneously lucrative
and humane to patent it.
Therefore, to further its development and prevent anyone else from
blockading it via patents, I am hereby publicly presenting this idea
within the public marketplace, with the intent to create a SourceForge
project dedicated thereto. This will become an open source and open
schematic project for the good of those who need it.
--Robison Bryan, Digital Missionary.
---(builder of LastCallForJesus.com)
Personal Navigation Assistance System for Visually Impaired Walking:
The purpose of this SourceForge project is to create a complete system,
including hardware and software, that would enable a visually impaired person
to walk around with at least a minimal real-time awareness of surrounding
objects, remotely sensed and disclosed quickly via sound. This is not a
replacement for the white cane, but is intended to augment its use.
Physical Form with Inputs and Outputs:
A hard clam-shell style pair of headphones is provisioned. Each hard clam-shell
ear muff is modified by cutting away all but a small center strip that spreads
to a small cup in the center; the inside of the cup houses a small walkman-style
headphone transducer, and the outside firmly holds a forward-facing webcam. The
areas of hard headphone shell cut away allow the wearer to directly hear,
without impediment, all sounds: left, right, front, rear, up, and down. The
headband going over the top of the head that holds the two clam-shell earphones
holds at its top center a forward-facing omnidirectional ultrasonic sonar unit.
The entire system is powered by a belt-mounted rechargeable lithium polymer
battery of the type presently sold that stores in a purse but can jump-start
a truck. Thus the unit has enough power for all day, and a freshly charged
power unit can be swapped for the belt-worn one. For night vision, there is a
bank of fast-pulsed LEDs surrounding the sonar unit, which provides flash for
each snapshot of surrounding objects. The LEDs are just bright enough for the
webcams to see by, which is about the same brightness that bicycle riders use
on their forward-flashing headlights to warn drivers of their presence at night.
Inputs Detected:
Two forward-facing webcams, located just outside each headphone, thus about
a foot or so apart. One top-of-head mounted ultrasonic sonar unit to detect
the distance of the closest object ahead.
Outputs Given and their Derivation:
Output is delivered through the two small walkman-style headphone speakers
sealed in their tiny cups at the center of the open-air, cut-away headphone
clam shells. Each report begins with the closest object ahead: a pair of tone
clusters, followed by an echo of the second tone cluster, then a very brief
synthesized speech report of that object's distance in feet. A similar
tone-cluster report follows for each other major detected object, but only
the closest object gets the spoken distance. Each object's shape is simplified
to a parallelogram (a rectangle that may be leaning over). Whenever the
distance or location of the closest object changes significantly, the list of
objects being reported is interrupted and a fresh report begins, again starting
with the closest object ahead; the interruption never cuts off the speech
distance report, however. The wearer would quickly learn the dancer's
technique called "spotting": turning the head as quickly as possible, then
holding it in a fixed position.
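The report-and-interrupt behavior just described can be sketched roughly as
follows (Python; the object fields, change thresholds, and the play_tone_pair
and speak callbacks are hypothetical placeholders for illustration, not part
of any existing design):

```python
# Sketch: report detected objects closest-first. Only the closest object gets
# a spoken distance, and that speech is never interrupted; afterwards, a
# significant change in the closest object aborts the report so the caller
# can start a fresh one.

def significant_change(prev, cur, dist_thresh_ft=1.0, pos_thresh=0.1):
    """True when the closest object moved enough to warrant a fresh report."""
    return (abs(prev["dist_ft"] - cur["dist_ft"]) >= dist_thresh_ft
            or abs(prev["center"] - cur["center"]) >= pos_thresh)

def run_report(get_objects, play_tone_pair, speak):
    """Play one full report; return False if interrupted mid-way.

    get_objects() returns the current detections as dicts with 'dist_ft'
    (distance in feet) and 'center' (horizontal position, -1 .. +1).
    """
    ordered = sorted(get_objects(), key=lambda o: o["dist_ft"])
    if not ordered:
        return True
    closest = ordered[0]
    play_tone_pair(closest)                       # tone cluster pair + echo
    speak("%d feet" % round(closest["dist_ft"]))  # never interrupted
    for obj in ordered[1:]:
        fresh = sorted(get_objects(), key=lambda o: o["dist_ft"])
        if fresh and significant_change(closest, fresh[0]):
            return False                          # caller restarts the report
        play_tone_pair(obj)
    return True
```

A caller would simply loop, starting a fresh report whenever run_report
returns False.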
Each tone cluster pair (one pair per object) consists of two tone clusters,
each lasting about a tenth to a fifth of a second, whichever is heard best.
The first tone cluster reports the left edge of the object via the stereo
location of the cluster, with the top note and bottom note in that cluster
reporting the top edge and bottom edge of the left portion of the object,
respectively. The second tone cluster reports the right edge of the object
the same way: its stereo location gives the horizontal position, and its top
and bottom notes give the top and bottom edges of the right portion of the
object. Thus the general shape of the object is reported as the closest
approximate parallelogram (a leaning or straight rectangle). Since the two
cameras see each detected object at slightly different positions (or greatly
different, if the object is close), and since there is only one person
walking around rather than two, the left and right edges sounded for each
object are the average of the edge locations reported by the two cameras,
scaled to the size of the cameras' field of vision.
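The edge-to-sound mapping could be sketched like this (Python); the 220 to
1760 Hz pitch range, the logarithmic mapping, and the normalized 0.0 to 1.0
coordinates are illustrative assumptions, not fixed design choices:

```python
# Sketch: encode one object's fitted parallelogram as two tone clusters.
# Horizontal position becomes stereo pan (-1 = far left, +1 = far right);
# the vertical positions of the top and bottom edges become note pitches.
# Coordinates are normalized to the cameras' field of vision, 0.0 .. 1.0.

LOW_HZ, HIGH_HZ = 220.0, 1760.0   # assumed three-octave pitch range

def pitch(y):
    """Map a normalized vertical position to a frequency on a log scale."""
    return LOW_HZ * (HIGH_HZ / LOW_HZ) ** y

def pan(x):
    """Map a normalized horizontal position to a stereo pan in -1 .. +1."""
    return 2.0 * x - 1.0

def average_edges(left_cam, right_cam):
    """Average matching edge coordinates reported by the two cameras."""
    return [(a + b) / 2.0 for a, b in zip(left_cam, right_cam)]

def tone_clusters(obj):
    """Return the two tone clusters (left cluster first) for one object."""
    first = {"pan": pan(obj["x_left"]),
             "notes_hz": (pitch(obj["left_bottom"]), pitch(obj["left_top"]))}
    second = {"pan": pan(obj["x_right"]),
              "notes_hz": (pitch(obj["right_bottom"]), pitch(obj["right_top"]))}
    return first, second
```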
Thus an object whose left edge sits at the left edge of the cameras' field
of vision is heard as far-left sound. Last but not least, the distance of the
object, as calculated by trig, is depicted by the delay period of the echo of
the second tone cluster; the first tone cluster has no echo, to keep it from
blending into the second. Finally, the distance of the closest object is also
measured as the soonest echo of a brief ultrasonic pulse train carrying a
unique pulse code. That sonar distance replaces the trig-derived distance of
the closest object, and serves as an aid to maintaining the accuracy of all
trig computations: when minor angle variations occur in how the headphones
hold the two cameras out from their clam shells, the momentary trig
calculations are calibrated against the ultrasonically derived distance of
the closest object in the field of vision. The soonest-echo sonar range thus
provides a correction factor for the trig-calculated distances, keeping
distance detection accurate despite initial major variations, and temporary
minor changes, in exact headphone positioning.
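A minimal sketch of the distance pipeline, under an assumed pinhole
stereo-camera model (the roughly one-foot baseline comes from the design
above; the focal length and the echo-delay scale are made-up placeholders):

```python
# Sketch: "trig" distance from stereo disparity, the sonar-derived correction
# factor that recalibrates every trig distance, and the echo delay that
# depicts distance to the ear.

BASELINE_FT = 1.0    # cameras roughly a foot apart (from the design above)
FOCAL_PX = 800.0     # assumed focal length in pixels (placeholder)

def trig_distance(x_left_px, x_right_px):
    """Pinhole-model depth from horizontal pixel disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("object must appear further right in the left image")
    return FOCAL_PX * BASELINE_FT / disparity

def calibrated_distances(trig_dists, sonar_closest_ft):
    """Scale all trig distances so the closest one matches the sonar range."""
    scale = sonar_closest_ft / min(trig_dists)
    return [d * scale for d in trig_dists]

def echo_delay_s(dist_ft, s_per_ft=0.02):
    """Delay of the second tone cluster's echo, proportional to distance."""
    return dist_ft * s_per_ft
```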
Optional Additional Sensing and Warning:
It would also be possible to provide a rear-facing sonar to warn of any
object rapidly approaching from the rear, such that the wearer could snap
their head around and get an idea of where it is headed, so as to dodge it.
Nothing's perfect, but it provides better sensing than was previously available.
Better Shape reporting:
If the shape is taller or shorter in the middle than at both ends, a simple
parallelogram may not be suitable to report the shape. But by changing the
pair of tone clusters to three in a row from left to right, the general
shape may be intuitively understood, at a glance, so to speak. The same
following echo would occur, with speech for the closest object, of course.
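One possible rule for deciding when a third, middle cluster is needed
(Python; the tolerance is an arbitrary placeholder, and only the top edge is
checked here for brevity):

```python
def cluster_count(left_top, mid_top, right_top, tol=0.05):
    """Use three tone clusters when the object's middle height deviates
    from a straight top edge between the two ends by more than tol
    (heights normalized to the field of vision, 0.0 .. 1.0)."""
    expected_mid = (left_top + right_top) / 2.0
    return 3 if abs(mid_top - expected_mid) > tol else 2
```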
GPS Functionality:
The only thing this project needs to do about GPS is to be able to run on the
same smart phone as a GPS app, and to mix its audio output onto the same audio
output channel (headphones) as the GPS audio output. It would, however, be of
use to coordinate sound output with the GPS app to take turns using the speakers
so that the user can get GPS directions without VipNav interfering, then mute
the GPS app and resume VipNav assistance. If the user needs to depend upon the
GPS app to let him know he has travelled far enough on foot in any direction,
the GPS app should at least be coordinated to refrain from speech until that
time, allowing VipNav to do its work. If the phone allows the VipNav app to
sense speech output other than its own, then VipNav can pause long enough for
the GPS to do its job.
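The turn-taking idea could be sketched as a tiny audio arbiter (Python); how
VipNav would actually detect or mute the GPS app's audio is platform-specific
and not modeled here:

```python
class AudioArbiter:
    """Grant the single headphone channel to one source at a time, so
    VipNav and the GPS app take turns rather than talking over each other."""

    def __init__(self):
        self.holder = None

    def request(self, source):
        """Grant the channel if it is free or already held by this source."""
        if self.holder in (None, source):
            self.holder = source
            return True
        return False

    def release(self, source):
        """Free the channel, but only for the source that holds it."""
        if self.holder == source:
            self.holder = None
```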
Operational Modes:
Because the ears are the central clearing house of urgent navigation information,
care must be taken to avoid clogging this access way. Therefore a small
variety of user-selectable navigation modes would be desirable:
1. GPS and VipNav muted, so the user can react to urgent open air sound cues.
2. GPS only.
3. VipNav only.
4. Set VipNav to full report (interruptible report of major objects, closest first).
5. Set VipNav to Closest Report (report of just the closest object).
6. Set VipNav to Range Report (report of objects within a certain distance).
7. Set VipNav to Proximity Alarm only (report only upon rapid approach or close
proximity of objects).
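The seven modes above might be modeled as an enumeration plus a filter over
the current detections (Python; the range and proximity thresholds are
placeholder values):

```python
from enum import Enum, auto

class Mode(Enum):
    ALL_MUTED = auto()        # 1. GPS and VipNav muted
    GPS_ONLY = auto()         # 2.
    VIPNAV_ONLY = auto()      # 3.
    FULL_REPORT = auto()      # 4. interruptible report, closest first
    CLOSEST_REPORT = auto()   # 5. just the closest object
    RANGE_REPORT = auto()     # 6. objects within a certain distance
    PROXIMITY_ALARM = auto()  # 7. rapid approach or close proximity only

def objects_to_report(mode, objects, range_ft=15.0, alarm_ft=3.0):
    """Filter detections according to the selected mode, closest first."""
    ordered = sorted(objects, key=lambda o: o["dist_ft"])
    if mode in (Mode.ALL_MUTED, Mode.GPS_ONLY):
        return []
    if mode == Mode.CLOSEST_REPORT:
        return ordered[:1]
    if mode == Mode.RANGE_REPORT:
        return [o for o in ordered if o["dist_ft"] <= range_ft]
    if mode == Mode.PROXIMITY_ALARM:
        return [o for o in ordered
                if o.get("approaching") or o["dist_ft"] <= alarm_ft]
    return ordered    # VIPNAV_ONLY and FULL_REPORT report everything
```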
It may also be possible to pre-program a set of selectable automation modes,
wherein certain features of the unit are triggered by certain events. For
example, upon rapid approach of any object, the unit could report only the
closest object and the approaching object, approaching object first, with
(as the exception that proves the rule) a speech distance report for the
approaching object as well as the closest object, but with the closest object
using a male voice and the approaching object using a female voice. In this
mode, a very brief emergency alert chime could sound prior to the
approaching-object report, so that the user knows what it is. Other automation
responses could automatically control the operation mode as needed, so that
all the user needs to do is trigger a certain automation, such as, for
instance, location and walking navigation to a certain shop or other
destination requested via the GPS system.