tl;dr: Oculus uses a conveniently placed ‘grab’ button. Every other game uses it… why not Interface?
I am starting this thread during my second weekend of owning the Oculus Touch controllers. I have two sensors placed in opposite corners (per the Rift documentation), making 360° tracking possible. It is a lot of fun, and it has driven me to purchase a third sensor to grow the space to ‘room scale’.
I realize this topic is similar to ‘oculus rift touch controller api json’, a post I created back in June. HOWEVER, the focus of this thread is my assumption that point-and-grab (à la Bullet Train and First Contact) should just work. One of my big Vive turn-offs when I demoed its hand controllers (though its image clarity is imho better than the Rift’s… ymmv) was that the grab button was on the side of the wand, making it difficult to hold objects. That has since been changed somewhere along the line and is no longer required. Alphas remember what I’m talking about…
Anyway, my point is: I realize HTC has thrown a lot of money into the ring here, and I’m concerned full native integration for Touch will fall by the wayside.
I know a dev will chime in with: JSON mappings ftw…
I created a pet for the contest that was button-driven, so I already know how those work. If, however, the distance grab is NOT located in the root C++ code (wasn’t it pointers.js at some point, @ctrlaltdavid?), then I would recommend updating the JSON to reflect the expectations of Touch users. The grab button is there to create the effect in your brain that you are actually “holding” something: let the grab button go, and you drop the object. This is how it worked previously; I’m asking for it to be re-introduced. I respect Vive users and would expect the current control scheme works well for them. However, this is a prime opportunity for me to voice the Oculus side of the argument in favor of ‘grab versus trigger’.
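For illustration, something along these lines is what I have in mind. This is only a sketch in the general style of Interface’s controller-mapping JSON — the file name, route names, and action names here are my assumptions, not the actual shipped mapping, so a dev would need to swap in the real channel identifiers:

```json
{
    "name": "Oculus Touch grab-on-grip (sketch)",
    "channels": [
        { "from": "OculusTouch.LeftGrip",  "to": "Standard.LeftGrip" },
        { "from": "OculusTouch.RightGrip", "to": "Standard.RightGrip" }
    ]
}
```

The idea being: route the grip buttons to whatever standard action the grab scripts listen to, so squeezing grip holds the object and releasing grip drops it.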
I don’t like its current integration.
Will it be modified in the immediate future?