The Valve Index controllers track your fingers, and while that's fancy, there's not much use for it.
The Oculus Quest has hand tracking without requiring any additional modules, but it provides no haptic feedback and can't be used in most games because they don't support it. So it's mainly for navigation and making gestures.
What makes the Pimax hand tracking module different?
I honestly don’t see much excitement about that here?
I pre-ordered it when I pre-ordered my 5K+ but don’t think much about it any longer since there’s really not that much content out which can use it (yet).
It’s more of a gimmick than anything I guess. Like WalkOVR and KAT Loco (I have both).
I'm still dreaming of a VR application where one can play musical instruments, projected onto any haptic surface - a table, a piece of cardboard, or a wooden stick worn on a strap. Finger tracking would have to be highly sensitive, though, so that the slightest change in finger pressure is recognized. And a solution is needed to track the position of the haptic surface (e.g. with "Vive trackers"), particularly for surfaces that can be moved around - so the virtual strings or keys stay in position when you move your cardboard guitar or violin.
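The core of that idea boils down to one transform per frame: express the tracked fingertip in the surface's local frame, then decide which key it is over and whether it has "pressed through" the plane. A minimal sketch of that logic, where all names, the frame convention, and the thresholds are illustrative assumptions rather than any real tracker API:

```python
# Hypothetical sketch: map a tracked fingertip to virtual piano keys
# projected onto a moveable surface (e.g. a Vive-tracker-tagged board).
# Frame convention and thresholds are assumptions, not a real API.

from dataclasses import dataclass

KEY_WIDTH = 0.023    # metres, roughly a real piano key
NUM_KEYS = 24
PRESS_DEPTH = 0.002  # sinking this far "into" the surface counts as a press

@dataclass
class Pose:
    """Surface pose: origin plus unit right/up axes, all in world space."""
    origin: tuple
    right: tuple
    up: tuple

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def key_under_finger(fingertip, surface):
    """Return (key_index, pressed) for a world-space fingertip position.

    key_index is None when the finger is outside the keyboard strip.
    pressed is True once the fingertip sinks past PRESS_DEPTH below the
    surface plane - a crude stand-in for finger-pressure sensing.
    """
    local = _sub(fingertip, surface.origin)
    x = _dot(local, surface.right)   # position along the key row
    h = _dot(local, surface.up)      # height above the surface plane
    if not (0.0 <= x < NUM_KEYS * KEY_WIDTH):
        return None, False
    return int(x // KEY_WIDTH), h < -PRESS_DEPTH
```

Because the surface pose is re-read from the tracker every frame, the virtual keys automatically follow the board when you move it around - which is exactly the "keys stay in position on the cardboard" behaviour described above.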
I think it's a chicken-and-egg problem - nobody writes applications for non-existent (or too imprecise) hand trackers, and nobody builds precise hand trackers with no applications. We'll see how good the Pimax hand tracker turns out to be - if it's good (and better than what the other controllers offer as a gimmick), this could be a unique selling point. One or two really good applications that make use of it might be enough to get this going.
That is a very cool idea. You know, UltraLeap also has an ultrasound-based mid-air haptic feedback device. I got the chance to try it recently, and it really does give your fingers the "feel" of touching something.
The demo I tried did indeed combine the ultrasonic haptics with the new module in a VR environment. As long as my hands were facing the haptics array, it could simulate the touch of VR objects. By the way, it was a VR escape-room demo - quite amazing and realistic. I don't know when, or if, it will come to market though.
I had the Leap Motion v1 for some time to emulate controllers (I had a Pimax 4K, 3DoF, with a Kinect), so I tried all the hand-tracking demo apps too. I love it! Hand tracking is an incredibly cool VR immersion feature.
But there are problems with occlusion, and I think those problems are too big to really design games or apps around.
We need to combine it with a wrist-worn reader that can sense what your fingers are doing, to fix the occlusion issue.
But I will likely still buy the Pimax hand tracker, just to play around with the demos again and dream of the future where this becomes the norm - because it will become the norm, I'm sure of it.
Because VR is more than gaming: many of us are making awesome stuff with Quest hand tracking for enterprise and training, and would love to take those ideas to wide-FOV on PC without taping an old Leap Motion v1 to a Rift CV1.
I for one am extremely excited about hand tracking for flight sims. I'm hoping we can eventually pair the hand tracking module with a decent yoke setup. The hand tracking will be vital for actually reaching out and physically moving the knobs and buttons on the flight deck, while the yoke does the actual flying. It's about immersion. I work in the aviation industry, and pressing buttons to "simulate" me reaching out and holding a button in is not doing it for me on that front.
I was at the VR Days in Amsterdam, and there was a demo setup of a helicopter using the Varjo headset, with a hand-tracker setup that put video of your actual hands into the image.
It had a mock dashboard with real buttons but no real screen. In VR everything was showing telemetry of course and in a Varjo headset this looked 100% like reality. The fact you used your hands to press the buttons together with touching actual buttons was mind blowing. It really became virtual reality in its most pure form. I wanted one but my wallet just laughed and walked away, still laughing.
The Index controllers could already be one step closer. But yeah, free hand-tracking support would be even better. The next level would probably be something like this, where you can actually feel the controls (but - the price…):
A step closer is still just a gimmick. I just got my Index controllers, and having seen Leap Motion hand tracking, this Index finger-bending business really looks like a joke in comparison.
Ok, I'd hoped it would be good enough to press a button or flip a switch.
I've ordered Index controllers (not here yet) in the hope of getting a little closer to a "hands-free" approach. Your review doesn't sound too promising. But let's see - still some hope here that it's fun…
I also have the gen 1 Leap Motion controller (used it e.g. for MIDI control with the very nice Geco software). It was highly accurate, but range was really limited back then. I heard the new version is a lot better in that regard though! I'll probably also get the Pimax hand trackers once they're available. (I think I have enough Pimax stuff on "preorder" from the Kickstarter days by now; I'll wait until it's actually out this time.)
Oh, it's still fun, for sure. I really like them being attached to my hands so I can really let go of things. It has its merits, but I just really love visual hand tracking and the near-perfect virtualization of your own hands with that tech. It's not yet time for it, but I think Facebook is going to open the doors to amazing fidelity on all fronts. Until then, we're going to have a lot of fun with the Index controllers too.
A bit after I got my Index controllers, I decided it was about bloody time I finally went through The Talos Principle - and I only got to the first terminal in the game before I figured I'd resume the postponement, to see how well this relatively input-light game works with the hand-tracking module driving a virtual controller, and, if the answer is "acceptably well", play through using that instead. Those terminals are rare, and only have… what was it, two or three buttons, and already that felt too clunky and too abstracted, interacting through a remote-control-for-the-player-character lump of matter in my hand.
If I can move around in this game, carrying and directing those disruptor devices with bare hands, I will. :7
By the way… Why don’t I just take this as a new opportunity to fish for information about software for the thing…
Never got much out of skimming the Ultraleap home page. If a Pimax rep can’t offer insight, maybe some existing LeapMotion user can? :7
So… other than integrated, complex support in applications, which is going to be a rare treat… how about a few more general things, either first-party or by independent developers:
Do we have virtual SteamVR Input controllers that map Ultraleap's skeletal tracking model to that of SteamVR Input, with a comprehensive SteamVR Input mapping profile? Preferably with the ability to spoof several other input devices, so one can make it work with older software.
Do we have a global virtual keyboard overlay that works just like any keyboard plugged into a USB port, with any application, and does its own low-level, low-overhead rendering (not a Unity or Unreal application, in other words)?
Any global gesture- and/or floating/attached-menu service, under the same conditions as the keyboard, that lets you map arbitrary functions to configurable hand inputs? (Much like VoiceAttack, but for various modes of hand input.)
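Whatever form such a bridge or gesture service takes, the part it cannot skip is turning raw skeletal frames into clean, debounced input events. A minimal sketch of that mapping - thumb/index pinch to a latched "button", with all thresholds and the frame format being my own illustrative assumptions, not any Ultraleap or SteamVR API:

```python
# Hypothetical sketch of the gesture-to-input mapping such a service
# would need: raw fingertip positions in, a stable button state out.
# Thresholds and input format are assumptions for illustration only.

import math

PINCH_ON = 0.020   # metres: thumb-index distance that starts a pinch
PINCH_OFF = 0.035  # larger release threshold = hysteresis, no flicker

def _dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class PinchButton:
    """Latches a virtual button from per-frame thumb/index tip positions."""

    def __init__(self):
        self.pressed = False

    def update(self, thumb_tip, index_tip):
        d = _dist(thumb_tip, index_tip)
        if not self.pressed and d < PINCH_ON:
            self.pressed = True       # fingers closed: press
        elif self.pressed and d > PINCH_OFF:
            self.pressed = False      # fingers clearly apart: release
        return self.pressed
```

The two-threshold hysteresis is the important design choice: with a single threshold, tracking jitter right at the boundary would make the virtual button chatter on and off every few frames, which is exactly what you don't want when it's spoofing a trigger for older software.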
Is there any software that does 3D scanning with the device? What sort of resolution and imprecision could one expect from such use of the hardware, for different object surface properties?
While I'd love such scanning for helping with modelling 3D-printed parts that fit together with existing parts in general, it would be really nice if one could turn the HMD on oneself, take a 3D capture of one's face, and import that into Blender, for grafting onto an HMD shroud template, so one could print one's own personally tailored facial interface.
On top of the above: any such scanning software that works in conjunction with lighthouse tracking, for scanning larger objects - including letting you walk around your room to pick up its geometry, for things like an advanced chaperone-style feature?
Is the device strictly depth-sensing, or does it have colour camera(s) too, like the Dragonfly prototype? Again, for the scanning.
Would any existing LeapMotion software work out of the box with the new wide field of view one, or need to be adapted?
That's what I can immediately wish for, off the top of my head. Any other desirables, anybody? :7