We'd like to experiment with the Leap Motion device for controlling the motion of the avatar:
Build the LeapSDK into the existing interface repo, minimally so that the project builds correctly on OS X with Xcode.
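Assuming the stock LeapSDK download layout (`include/Leap.h`, `lib/libLeap.dylib` — these paths come from the standard SDK package, not from this repo), the Xcode target would need roughly the following header search path and link flags, shown here as an equivalent command-line sketch rather than a definitive build config:

```shell
# Config sketch: LEAP_SDK is assumed to point at the unpacked SDK directory.
# -I adds the Leap headers, -lLeap links the dylib, and the rpath entry
# lets the binary find libLeap.dylib at run time.
clang++ -std=c++11 \
        -I"$LEAP_SDK/include" \
        -L"$LEAP_SDK/lib" -lLeap \
        -Wl,-rpath,"$LEAP_SDK/lib" \
        main.cpp -o interface
```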
Read the hand/finger position values from the Leap at the lowest possible latency and highest sampling rate, and render small spheres in front of your own avatar (add the necessary methods to the Head class in head.cpp, and hook them into its render() call).
That's step 1. Step 2 will be to use the acquired data to drive our own avatar's hand position.
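For the low-latency read in step 1, the Leap service delivers frames on its own thread (via a `Leap::Listener::onFrame` callback in the classic C++ SDK), while the spheres get drawn on the render thread. A minimal sketch of the shared state between the two, with a latest-frame-wins buffer so stale frames never queue up — all names here (`Fingertip`, `FingertipBuffer`) are ours for illustration, not from the repo:

```cpp
#include <mutex>
#include <utility>
#include <vector>

// Hypothetical fingertip sample; the real code would fill this from
// the Leap frame data inside the SDK's onFrame callback.
struct Fingertip {
    float x, y, z;
};

// Latest-frame-wins buffer: the Leap thread overwrites the current
// frame, and the render thread copies it out once per draw call.
// Keeping only the newest frame (no queue) minimizes added latency.
class FingertipBuffer {
public:
    void push(std::vector<Fingertip> tips) {
        std::lock_guard<std::mutex> lock(mutex_);
        tips_ = std::move(tips);
        ++frameId_;
    }

    // Returns the newest frame plus its id, so the renderer can skip
    // re-uploading sphere positions when nothing has changed.
    std::pair<std::vector<Fingertip>, unsigned> latest() const {
        std::lock_guard<std::mutex> lock(mutex_);
        return {tips_, frameId_};
    }

private:
    mutable std::mutex mutex_;
    std::vector<Fingertip> tips_;
    unsigned frameId_ = 0;
};
```

In head.cpp, render() would then call latest() and draw one small sphere per fingertip, offset in front of the avatar's head position; step 2 can reuse the same buffer to drive the hand instead.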