hifi: #19734 Integrate the Leap Motion SDK into Interface.

Integrate the latest Leap Motion SDK https://developer.leapmotion.com/ into Interface and allow the API to be accessed through Interface JavaScript.

Comments & Activity

  • 4 yrs, 9 mnths ago

    #19734 created by chrisc Status set to Suggested.

  • Add the SDK as an external that is optional and fail gracefully if it is not present (we do this for LibOVR + sixense too if you want an example).
  • We should include a refactoring of spatial controls as part of this work item. Specifically, we SHOULD NOT integrate this in the manner that Hydra and PrioVR currently integrate (using PalmData to store the state); instead, we should use a model similar to how the face tracker works.
  • To make sure I understand, you would like:
    * Create a HandTracker class
    * getHandPosition(LEFT/RIGHT) (presumably relative to some fixed point on the body?)
    * getHandOrientation(LEFT/RIGHT)
    * In MyAvatar::update (or somewhere else?) query the active hand tracker (if there is one) and update the location + orientation of each hand.

    Will finger data also be used?
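
A minimal sketch of the interface being proposed in the comment above - all class and method names here are assumptions for illustration, not from the actual codebase:

    #include <glm/glm.hpp>
    #include <glm/gtc/quaternion.hpp>

    // Hypothetical base class for hand-tracking devices (Leap Motion, Hydra, ...).
    class HandTracker {
    public:
        enum Hand { LEFT = 0, RIGHT = 1 };

        virtual ~HandTracker() {}

        // Poll the device and cache the latest hand state.
        virtual void update(float deltaTime) = 0;

        // Whether the device currently sees this hand.
        virtual bool isHandTracked(Hand hand) const = 0;

        // Position relative to some fixed point on the body (e.g., the torso).
        virtual glm::vec3 getHandPosition(Hand hand) const = 0;
        virtual glm::quat getHandOrientation(Hand hand) const = 0;
    };

    // MyAvatar::update (or wherever the query ends up) would then do roughly:
    //     HandTracker* tracker = getActiveHandTracker();
    //     if (tracker && tracker->isHandTracked(HandTracker::LEFT)) {
    //         updateLeftHand(tracker->getHandPosition(HandTracker::LEFT),
    //                        tracker->getHandOrientation(HandTracker::LEFT));
    //     }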
  • The Leap Motion SDK provides access to data on hands, fingers, tools (e.g., if you're holding a pencil), gestures (some specific Leap Motion ones), and motions (of hands overall relative to one another and the sensor).

    Also probably coming up "soon" is the API providing access to the raw camera and / or depth video. They have a private beta on this; I'm sure we could get into it if we'd like.

    All of these should be exposed to Interface JavaScript?

    And some should be included in the (HandTracker? spatial controls) refactoring?
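
For reference, pulling this data out of the Leap C++ SDK looks roughly like the sketch below (polling style; the SDK also offers a callback-based Listener):

    #include "Leap.h"

    void pollLeap(const Leap::Controller& controller) {
        // Each frame bundles hands, fingers, tools, and gestures together.
        const Leap::Frame frame = controller.frame();

        const Leap::HandList hands = frame.hands();
        for (int i = 0; i < hands.count(); ++i) {
            const Leap::Hand hand = hands[i];
            const Leap::Vector palm = hand.palmPosition();  // millimeters, device-relative

            const Leap::FingerList fingers = hand.fingers();
            for (int j = 0; j < fingers.count(); ++j) {
                const Leap::Vector tip = fingers[j].tipPosition();
                // ... feed palm/tip data into the spatial controller layer
            }
        }

        // Tools (e.g., a held pencil) come from the same frame; the built-in
        // gestures must first be enabled via controller.enableGesture(...).
        const Leap::ToolList tools = frame.tools();
        const Leap::GestureList gestures = frame.gestures();
    }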
  • @Zappoman can you comment on this question from @ctrlaltdavid?
  • 4 yrs, 9 mnths ago

    ZappoMan uploaded an attachment to #19734

  • Anyone interested in bidding on this project, please review the attached "Software Development Proposed improvements to User Input aka Controller Scripting" ...

    My comment above relates to this design.

    So, we want you to implement the interface described in that document:
    * Event based (like mouse, keyboard are now)
    * LEAP, PrioVR, and Hydra should all be migrated to this model, where a "Spatial" type controller is exposed.
    * "spatial" entities should be exposed through a common set of named controllers. So in the case of leap "Left Hand", "Right Hand" etc.
    * all controllers, including spatial controllers may expose additional named entities outside of the standard ones, so for example, if LEAP exposes fingers, than these can be exposed in addition to the standard hand items by given them additional (hopefully meaningful and possibly future standard) names.
    * The new controller class/implementation SHOULD NOT be added in a manner currently supported by PrioVR/Sixsense... namely DON'T add it to the PalmData structure currently in the code base.
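
A sketch of what the event-based model described in this list might look like - the attached document is authoritative, and the names below are illustrative assumptions:

    #include <string>
    #include <glm/glm.hpp>
    #include <glm/gtc/quaternion.hpp>

    // Hypothetical event payload delivered to script handlers, analogous to
    // the existing mouse and keyboard events.
    struct SpatialEvent {
        std::string controllerName;  // standard names ("Left Hand", "Right Hand", ...)
                                     // plus device extras ("Left Index Finger", ...)
        glm::vec3 absTranslation;    // position of the tracked entity
        glm::quat absRotation;       // orientation of the tracked entity
    };

    // A script would subscribe much as it does for mouse events today,
    // e.g. (JavaScript) Controller.spatialEvent.connect(function(event) { ... });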
  • @huffman - see the attached document. It should answer your questions.
  • @ctrlaltdavid - please see the attached document... also my responses to your questions below:

    > The Leap Motion SDK provides access to data on hands, fingers,
    > tools (e.g., if you're holding a pencil),

    See the attached document, and my comment above. Hands, Fingers, and tools should nicely fit into the "spatial" type, with appropriate names for the controllers: e.g., "Left Hand", "Right Hand", "Left Index Finger", "Left Thumb", etc.

    > gestures (some specific Leap Motion ones),
    > and motions (of hands overall relative to one another and the sensor).

    I'd like to hear more about this. It may make sense to think of "gestures" and "motions" as new "controller types" as described in the attached design document, or it may make sense to consider these "event properties" for the spatial controller. In the case of TouchEvents, for example, they have the properties of isPinching, isRotating, etc. Whatever we do for 3D gestures, we will want to consider the implications for touch gestures as well. The model should be similar. (A sketch of the event-properties option follows this comment.)

    > Also probably coming up "soon" is the API providing access
    > to the raw camera and / or depth video. They have a private
    > beta on this; I'm sure we could get into it if we'd like.

    I would not recommend going down this road at this point. The goal of this integration is to make LEAP, PrioVR, Hydra and any other spatial controllers "standard" and common in their usage and API.

    > All of these should be exposed to Interface JavaScript?

    Yes. This is required. See attached document.

    > And some should be included in the (HandTracker? spatial controls)
    > refactoring?

    Yes. This is required. See attached document.
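
Regarding the gestures question quoted above: if gestures became "event properties" of the spatial controller (the second option mentioned), one hypothetical shape for them - names invented for illustration - would be:

    // Gesture flags carried on a spatial controller event, mirroring how
    // TouchEvent exposes isPinching, isRotating, etc.
    struct SpatialGestureState {
        bool isPinching;        // e.g., a thumb-to-fingertip pinch
        bool isSwiping;         // e.g., Leap's TYPE_SWIPE gesture
        bool isCircling;        // e.g., Leap's TYPE_CIRCLE gesture
        float gestureProgress;  // 0..1 for gestures that report progress
    };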
  • @ZappoMan Thanks very much for your comments and document. Very helpful.

    An overview of Leap Motion's gestures and motions can be found towards the bottom of the following page: https://developer.leapmotion.com/documentation/skeletal/cpp/devguide/Leap_Overview.html
  • @ZappoMan Thank you, the document is a big help.

    More questions:

    1. What happens if a device is no longer available (e.g., I disconnect my Hydra)? Will it automatically resume emitting events to the correct handlers if/when the device is available again? Could this cause potential problems, i.e., multiple devices of a certain type getting swapped from primary to secondary?
    2. As far as I can tell the Leap SDK doesn't categorize finger types, so you can't get the "Thumb", "Index Finger", etc. Each hand has a set of fingers with unique ids, determined when a finger is first seen. If the finger leaves but later enters the scene again it will have a new id. How would you want this implemented? Extra events on the SpatialController for fingerIn, fingerOut, fingerMoved,... ? Or include them as extra properties in the Event object? This is how the Leap SDK does it - it operates on frames, and each frame has all of the hand/finger/tool/gesture/... data.
    3. Tools in the Leap SDK are implemented in a similar way, and can move from one hand to another.
  • @huffman - good questions. I will think about it and respond later tonight or tomorrow.
  • Regarding fingers, in the new SDK, every hand has 5 fingers all the time (with estimated positions if they're not directly visible). Though fingers' tracking IDs may change (e.g., tracking gets confused when fingers overlap), you can get them in order (thumb through pinky, via the Hand::fingers() method) or otherwise determine which finger any particular one is (the Finger::type() method returns Finger::TYPE_THUMB etc.).

    Tools are more problematic, though I suggest a script would typically only want to use one, or perhaps two with one per hand. We could certainly simplify things by supporting a maximum of 1 tool per hand, at least to start with, returning for each hand the tool closest to the Leap Motion (ToolList::frontmost()).
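
In Leap SDK terms, the per-finger naming and the "frontmost tool" simplification described above would look roughly like this sketch (the controller names in the comments are only suggestions):

    #include "Leap.h"

    void nameHandParts(const Leap::Frame& frame) {
        const Leap::HandList hands = frame.hands();
        for (int i = 0; i < hands.count(); ++i) {
            // Every hand always reports 5 fingers; Finger::type() identifies
            // which is which even when tracking IDs change.
            const Leap::FingerList fingers = hands[i].fingers();
            for (int j = 0; j < fingers.count(); ++j) {
                switch (fingers[j].type()) {
                case Leap::Finger::TYPE_THUMB:
                    break;  // expose as "Left Thumb" / "Right Thumb"
                case Leap::Finger::TYPE_INDEX:
                    break;  // expose as "Left Index Finger" / "Right Index Finger"
                default:
                    break;  // TYPE_MIDDLE, TYPE_RING, TYPE_PINKY likewise
                }
            }
        }

        // The suggested simplification: track only the tool closest to the
        // device rather than every tool ID.
        const Leap::Tool frontTool = frame.tools().frontmost();
        if (frontTool.isValid()) {
            // expose as a single named "Tool" spatial controller
        }
    }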
  • @ctrlaltdavid Thank you for the correction!
  • Say you have a pair of Hydras and a Leap Motion connected to your PC. Both of these are spatial controllers.
    In JavaScript ...
    var leftHand = new Controller("spatial", "left hand");
    ... which controller will you get? Should there be some mechanism to enumerate all the controllers available to provide the ability to choose which to use?
  • excellent question @ctrlaltdavid - let me get back with you on that.
  • Hi all,
    I've started implementing the class to grab data from the Leap Motion.
    And since I just discovered this thread tonight, I indeed went the Sixense way, trying to modify the avatar hand from LeapMotionManager::update().
    I understand it's not what's expected, but it was my first contact with the HF source :)
    Will look into the other approach now...
    Cheers
  • @ZappoMan
    I have a simple first implementation of the HandTracker class in place, and I'm looking at where to do the update from the Application's "ActiveHandTracker" into the avatar model.

    It seems that the right place to pass the HandTracker position and orientation is into the Hand's palm data, somewhere in Application::update(float deltaTime) after the trackers have been polled.

    But it seems that in the current implementation there is not necessarily a Palm object; for example, the Sixense class adds it manually in its update() call (the first time).

    I'm not clear what we want to do here exactly.
    Do you expect the Hand object to always have 2 palms, with the current active HandTracker feeding the 2 palms with the latest values?
    If not, do you want several sets of palms, each corresponding to a different HandTracker, and to pick the right one to update the skeleton depending on which is active at a given time?

    My guess is that the first behavior is the one expected: a single pair of palms updated by the current active HandTracker.
    I'd need a bit of guidance here.

    Thanks
    S
  • @ZappoMan In addition to the problem of there possibly being more than one spatial, left-hand controller, some devices such as the upcoming STEM modules have no specific assignment to hand, arm, leg, or head.

    One possible solution to these problems might be to handle device discovery, enabling/disabling, and configuring applicable devices' locations in the global Controller object. A script could access the global Controller object to list all the devices connected and let the user enable/disable devices and specify their locations. Then scripts could just do var leftHand = new Controller("spatial", "left hand") etc. without having to worry about multiple devices. And the script that configures the global Controller object could be a standard HiFi-supplied one that installs itself in the Tools menu, say.
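
A sketch of how that global Controller object might be organized - this is only the proposal above restated as code, with invented names; hifi's Qt-based scripting would expose the slots to JavaScript:

    #include <QObject>
    #include <QString>
    #include <QStringList>

    // Hypothetical global registry exposed to scripts as "Controller",
    // handling device discovery, enabling/disabling, and location assignment.
    class ControllerRegistry : public QObject {
        Q_OBJECT
    public slots:
        // Every connected device, e.g. ["Hydra", "Leap Motion", "STEM"].
        QStringList listDevices() const;

        void setDeviceEnabled(const QString& device, bool enabled);

        // Assign a device (or an unassigned module such as a STEM) to a
        // body location such as "left hand".
        void assignDeviceLocation(const QString& device, const QString& location);
    };

    // A standard HiFi-supplied configuration script (installed in the Tools
    // menu, say) would drive this, so ordinary scripts can simply do:
    //     var leftHand = new Controller("spatial", "left hand");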
  • @samcake - let me ask some of the other folks on the team about this... that code is a bit of a mess and we'd like to take this opportunity to clean it up.
  • @ZappoMan
    After spending some time in the code, I have to correct my earlier sentence:
    "But it seems that in the current implementation there is not necessarily a Palm object; for example, the Sixense class adds it manually in its update() call (the first time)."

    In fact there are 2 palms in the Hand object from its creation, but their "SixenseID" is -1.

    So later, in the general SkeletonModel::update() (called from MyAvatar::update()), when it comes time to apply the Hand data to the corresponding bones of the avatar representation, there is a call to
    Hand::getLeftRightPalmIndices(),
    which relies on the returned indices not being -1 to use them this frame. So in fact the SixenseID of the PalmData is used as both an active flag AND the bone identifier.

    I guess this is one of the ways it was first implemented with the Sixense, and it would need to be refactored to be more HandTracker / sensor agnostic, isn't it?
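
Reconstructed as a sketch, the pattern being described is roughly the following (the real code differs in detail):

    // PalmData's sixenseID doubles as both an "is active" flag and a
    // device-assigned identifier: -1 means "no tracker has claimed this palm".
    class PalmData {
    public:
        PalmData() : _sixenseID(-1) {}
        int getSixenseID() const { return _sixenseID; }
        void setSixenseID(int id) { _sixenseID = id; }
    private:
        int _sixenseID;
    };

    // Hand::getLeftRightPalmIndices() only reports palms whose sixenseID is
    // not -1, so SkeletonModel::update() (called from MyAvatar::update())
    // silently skips any palm no tracker has claimed this frame. A
    // device-agnostic HandTracker refactor would separate "active" from the
    // Sixense-specific ID.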
  • Yes - that's a perfect example of the cruft we want cleaned up.
  • 4 yrs, 9 mnths ago

    #19734 updated by cozza13 Status set to Bidding.

  • 4 yrs, 9 mnths ago

    A bid was placed on #19734

  • 4 yrs, 9 mnths ago

    cozza13 accepted 1000.00 from samcake on #19734 Status set to Working

  • 4 yrs, 9 mnths ago

    #19734 updated by problem Status set to Bidding.

  • 4 yrs, 9 mnths ago

    #19734 updated by problem Status set to Working.

  • Sorry about that! Accidentally changed the status on this one.
  • Work in progress... :)
  • The following error was returned when making your pull request:
    A pull request already exists for samcake:19734.
  • 4 yrs, 8 mnths ago

    #19734 updated by cozza13 Status set to Code Review.

  • @samcake I set the status to code review.
  • I am going to mark this as done. @samcake
  • 4 yrs, 8 mnths ago

    #19734 updated by cozza13 Status set to Done.

