
Alternative user input methods



For input control, if your App requires two-finger touch, you must pair the glasses with a smartphone running the ReticleRemote App, which turns the phone's touch screen into a remote touch screen for the glasses. It works in all ReticleOS input modes: selector, cursor, and a new third mode that supports two-finger input.

ReticleRemote (https://play.google.com/store/apps/details?id=com.osterhoutgroup.reticleremote) is an Android App for a phone or tablet that provides input to the glasses over Bluetooth. It behaves a little differently from a standard phone App: the App's UI does not appear on the phone, so the relationship between where you touch on the phone and the App's UI is a little fuzzy, and you can see where you are touching only on the glasses display. It works fine, however, for general pinch, zoom, and rotation gestures.
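To make the pinch, zoom, and rotation handling above concrete, here is a minimal sketch of the math an App might use to turn two relayed touch points into a scale factor and a rotation angle. The class and method names are illustrative only, not part of ReticleOS or the ReticleRemote App.

```java
// Sketch: deriving pinch-zoom and rotation from two-pointer positions,
// such as those relayed by a remote touchpad. Each snapshot is
// {x1, y1, x2, y2} for the two fingers. All names here are assumptions.
public class TwoFingerGesture {

    // Distance between the two touch points ("span" of the fingers).
    static double span(double x1, double y1, double x2, double y2) {
        return Math.hypot(x2 - x1, y2 - y1);
    }

    // Scale factor: ratio of the current finger span to the span
    // measured when the gesture started. > 1 means zoom in.
    public static double scaleFactor(double[] start, double[] now) {
        return span(now[0], now[1], now[2], now[3])
             / span(start[0], start[1], start[2], start[3]);
    }

    // Rotation in radians: change in the angle of the line joining
    // the two fingers since the gesture started.
    public static double rotation(double[] start, double[] now) {
        double a0 = Math.atan2(start[3] - start[1], start[2] - start[0]);
        double a1 = Math.atan2(now[3] - now[1], now[2] - now[0]);
        return a1 - a0;
    }
}
```

On-device Android Apps would normally get these values from the framework's gesture detectors rather than computing them by hand; the point here is only how two touch points map to zoom and rotate.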

If you don't want to use a smartphone paired with the glasses, you may need to make some changes to your App. There are a few things we've done, though, to address this:

  1. ReticleOS has two input modes. The first is selector mode, in which Apps receive trackpad/DPAD events and use them to move through objects in the UI for selection and entry. The second is cursor mode, in which a mouse cursor appears on the screen; you can position it and click anywhere on your App's UI to generate mouse and touch input events, which most touch-based Apps will accept. Cursor mode, however, does not support two-finger input.
  2. We have a Bluetooth "Reticle Speed Mouse" with two modes of operation. The first is a standard 2D mode, in which a thumb optical sensor (much like the one on the glasses' temple) generates mouse X/Y events to move the cursor around the screen. The second is a 3D mode, in which an internal Inertial Measurement Unit (IMU) tracks gestures and generates events to position the cursor or control the App; the user can, in effect, click and drag using gestures to manipulate objects in the UI, much as two-finger pinch and zoom gestures are used. This is what most people use to control things like 3D maps and 3D design models.
  3. Some Apps use gesture and hand recognition, and a range of software exists for detecting hands. One of our software partners, Augumenta, makes a gesture SDK that uses the visible-light camera; it recognizes hands held in certain positions and making certain signs, and they have demo apps on AppCenter you can download and try. Another vendor, Crunchfish (http://crunchfish.com/touchless/), has a somewhat more generic gesture library that detects swipes for moving through lists of things. Their algorithms ship in phone and VR headset Apps, but there is no SDK for ReticleOS yet.
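To illustrate point 1 above, here is a minimal sketch of one way an App could serve both ReticleOS input modes with the same UI: DPAD events move a highlight in selector mode, while clicks are hit-tested in cursor mode. The `Dpad` enum and `DualModeMenu` class are illustrative stand-ins for Android's KeyEvent/View machinery, not ReticleOS APIs.

```java
import java.util.List;

// Sketch: one menu that accepts both selector-mode (DPAD) and
// cursor-mode (click) input. All names here are assumptions.
public class DualModeMenu {
    public enum Dpad { UP, DOWN, CENTER }

    private final List<String> items;
    private final int rowHeightPx;      // height of one menu row, in pixels
    private int focused = 0;            // highlighted row in selector mode
    private String lastSelected = null;

    public DualModeMenu(List<String> items, int rowHeightPx) {
        this.items = items;
        this.rowHeightPx = rowHeightPx;
    }

    // Selector mode: trackpad swipes arrive as DPAD events that move
    // the highlight; CENTER activates the highlighted item.
    public void onDpad(Dpad key) {
        switch (key) {
            case UP:     focused = Math.max(0, focused - 1); break;
            case DOWN:   focused = Math.min(items.size() - 1, focused + 1); break;
            case CENTER: lastSelected = items.get(focused); break;
        }
    }

    // Cursor mode: a click lands at pixel coordinates; hit-test the row.
    public void onClick(int x, int y) {
        int row = y / rowHeightPx;
        if (row >= 0 && row < items.size()) {
            focused = row;
            lastSelected = items.get(row);
        }
    }

    public String lastSelected() { return lastSelected; }
    public int focusedRow() { return focused; }
}
```

Routing both event types to the same selection state is what lets a single App work unchanged whether the user is in selector mode, cursor mode, or driving the cursor from the Speed Mouse.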

Finally, another partner, Leap Motion, makes a depth sensor and software for detecting hands, with an SDK that allows development of an Android App that could be used for input. Check out the video here: https://developer.leapmotion.com/gallery/blocks

The hands in that video are the user's real hands: the sensor detects them and their position in space, and they are rendered in the display, allowing the user to grab blocks and throw them around.
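The grab interaction described above reduces to a simple test: is the tracked hand close enough to an object, and is the hand pinching? Here is a minimal sketch of that check. The coordinate units, thresholds, and all names are assumptions for illustration, not the Leap Motion API.

```java
// Sketch: deciding whether a tracked hand is grabbing a virtual object.
// Positions are 3D points in the sensor's coordinate space (assumed mm);
// pinch strength is assumed to be normalized to 0..1 by the tracker.
public class GrabDetector {
    static final double GRAB_RADIUS_MM = 60.0;   // hand must be this close
    static final double PINCH_THRESHOLD = 0.8;   // how firmly "closed" counts

    static double distance(double[] a, double[] b) {
        double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    // A grab begins when a sufficiently pinched hand is within reach
    // of the object; releasing the pinch would then "throw" the object
    // with the hand's last tracked velocity.
    public static boolean isGrabbing(double[] handPos, double pinchStrength,
                                     double[] objectPos) {
        return pinchStrength >= PINCH_THRESHOLD
            && distance(handPos, objectPos) <= GRAB_RADIUS_MM;
    }
}
```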
