Just finished wrapping up some final user interface details, and here it is: a fully gesture-based application (actually more of a proof of concept ;)). Hit the dumb face as often as you can within 20 seconds, using both hands. Enjoy...
Added a MotionSupplier that delivers movements on a per-user basis; the basic foundation logic is now nearly done. Also added the first built-in gesture detector: HitWallGesture.
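Conceptually, a wall-hit detector like HitWallGesture boils down to watching a joint coordinate cross a virtual plane. Here is a minimal sketch of that idea in plain Java; the class and method names are illustrative only, not josceleton's actual API:

```java
// Illustrative sketch only: fires when a tracked coordinate (e.g. a hand's x
// position) crosses a virtual "wall" plane. Names are hypothetical and do not
// reflect the real josceleton API.
public final class HitWallDetector {

    private final float wallX;       // position of the virtual wall
    private float lastX = Float.NaN; // previous coordinate, NaN until first update

    public HitWallDetector(float wallX) {
        this.wallX = wallX;
    }

    /** Returns true exactly when the coordinate crosses the wall from the near side. */
    public boolean update(float x) {
        boolean hit = !Float.isNaN(lastX) && lastX < wallX && x >= wallX;
        lastX = x;
        return hit;
    }
}
```

Feeding each user's joint positions (as a MotionSupplier-style per-user stream would deliver them) into a separate detector instance keeps gestures from different users independent of each other.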
... having fun playing "air piano" ;)
Added proper handling of users, as they can appear and disappear completely unexpectedly. For example, when replaying a recorded session (osceleton -i some_file.oni), the same osceleton user id gets reused, which used to crash josceleton. Additionally added an integration test confirming the new robustness.
Experimenting with "Human Sound Generation" by chaining Kinect to Josceleton to MIDI to the Ableton Live sequencer :)
Introducing the Josceleton facade, which provides simplified access to the API. The fact that Guice is used internally is now completely hidden; still, a custom Guice module is provided for those who want to use it directly.
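The facade idea can be sketched in plain Java: a static factory is the single entry point, and whatever wiring happens behind it (in josceleton's case, a Guice injector) stays an implementation detail. All names below are hypothetical, not the library's real API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical facade sketch: client code only sees the static factory and the
// public methods; the internal object graph (built via Guice in the real
// library) is never exposed.
public final class SkeletonFacade {

    private final List<Consumer<String>> listeners = new ArrayList<>();

    private SkeletonFacade() { }

    public static SkeletonFacade newInstance() {
        // A real implementation would assemble its dependencies here,
        // e.g. via Guice.createInjector(...), without leaking them.
        return new SkeletonFacade();
    }

    public void addListener(Consumer<String> listener) {
        listeners.add(listener);
    }

    public void dispatch(String event) {
        listeners.forEach(l -> l.accept(event));
    }
}
```

The design choice is the usual facade trade-off: callers lose fine-grained control over construction but can never couple themselves to the DI container, which is exactly what hiding Guice achieves.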
Finally, I feel honored to announce that the website is online; check it out:
Phew... finally, it's done: the build is successful, tests are green, deployment is working, the basic site is online, and off we go!
Right now the library can only be accessed via Apache Maven and the Google Guice dependency injection framework.
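For Maven users, pulling in the library is a single dependency declaration. Note that the coordinates below are placeholders, not the project's verified groupId/artifactId/version; check the project site for the real ones:

```xml
<!-- Illustrative only: replace these placeholder coordinates with the
     actual ones published on the project site. -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>josceleton</artifactId>
    <version>0.1</version>
</dependency>
```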
New version coming soon...