
Alternative User Input Methods

UI/UX is an ever-evolving topic in the AR space, for sure. This article summarizes the options for user input. At a high level, there are four:
  1. Legacy UI method - Selector or Cursor mode (see below), supporting standard Android phone/tablet input models
  2. Voice command - a simple command set usually works best
  3. Gesture input - Augumenta, Crunchfish, etc. Also, the R-8 & R-9 will have additional options thanks to their stereo cameras for depth.
  4. Head position (some call this "gaze") - somewhat similar to Cardboard.
Legacy: Yes, the Reticle Speed Mouse (AKA RSM) is the same as the Bluetooth Finger Controller mentioned in the UI Overview. The R-7 & R-9 also have an optical sensor on the temple, so both it and the RSM provide the same input events, depending on input mode. Basically, there are two modes - Selector and Cursor - and the default is Selector. In Selector mode, swiping on the optical track pad (on either the temple or the RSM) generates DPAD events, and the standard Android UI can interpret these as swipe or motion control. The UI needs to implement object selection, so once the user has highlighted an object, they can click the track pad to select/execute it. In Cursor mode, a mouse cursor appears, and swiping on the optical track pad generates mouse events that position the cursor over clickable UI objects; clicking the track pad then selects/executes the object via enter or synthesized touch events. More on this below.
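
If your App handles Selector-mode input itself rather than relying on standard Android focus navigation, the swipes and clicks arrive as ordinary DPAD/enter key events. Here is a minimal sketch, assuming a plain Android Activity; the class name and helper methods are illustrative placeholders for your own selection logic:

    import android.app.Activity;
    import android.view.KeyEvent;

    public class SelectorModeActivity extends Activity {
        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
            switch (keyCode) {
                case KeyEvent.KEYCODE_DPAD_LEFT:
                case KeyEvent.KEYCODE_DPAD_UP:
                    highlightPrevious();   // swipe back on the track pad / RSM
                    return true;
                case KeyEvent.KEYCODE_DPAD_RIGHT:
                case KeyEvent.KEYCODE_DPAD_DOWN:
                    highlightNext();       // swipe forward on the track pad / RSM
                    return true;
                case KeyEvent.KEYCODE_DPAD_CENTER:
                case KeyEvent.KEYCODE_ENTER:
                    activateSelected();    // click on the track pad / RSM
                    return true;
                default:
                    return super.onKeyDown(keyCode, event);
            }
        }

        // Hypothetical helpers standing in for your own selection logic.
        private void highlightPrevious() { /* move the highlight to the previous item */ }
        private void highlightNext()     { /* move the highlight to the next item */ }
        private void activateSelected()  { /* execute the highlighted item */ }
    }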
 
We also developed an App for Android and iPhone smartphones (ReticleRemote, described below) that can be used as a remote touch interface; it is capable of a third mode that supports all the standard 2-finger touch events for pinch, zoom, and rotate. In addition, our BT keyboard/game controller generates all the standard key events you would need, plus joystick events for games.
 
Voice Command: A number of our partners have used a range of voice input fairly successfully; the use case needs to allow for voice input, though (the environment can't be very noisy or have a lot of different voices in the background). There's an article that covers this pretty well: https://developer.osterhoutgroup.com/hc/en-us/articles/206553544-Voice-Command-Recognition
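
As one illustration of keeping the command set simple, here is a minimal sketch using the stock Android SpeechRecognizer API. This is not the voice SDK covered in the article above; the availability of the recognizer service on a given device and the example command words are assumptions:

    import android.content.Context;
    import android.content.Intent;
    import android.os.Bundle;
    import android.speech.RecognitionListener;
    import android.speech.RecognizerIntent;
    import android.speech.SpeechRecognizer;
    import java.util.ArrayList;

    // Requires the RECORD_AUDIO permission and an available recognizer service.
    public class SimpleVoiceCommands implements RecognitionListener {
        private final SpeechRecognizer recognizer;

        public SimpleVoiceCommands(Context context) {
            recognizer = SpeechRecognizer.createSpeechRecognizer(context);
            recognizer.setRecognitionListener(this);
        }

        public void listen() {
            Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                    RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
            recognizer.startListening(intent);
        }

        @Override
        public void onResults(Bundle results) {
            ArrayList<String> matches =
                    results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
            if (matches == null) return;
            for (String phrase : matches) {
                // Match against a small, fixed command set (example words only).
                if (phrase.equalsIgnoreCase("select")) { /* app-specific action */ return; }
                if (phrase.equalsIgnoreCase("back"))   { /* app-specific action */ return; }
            }
        }

        // Remaining RecognitionListener callbacks left empty for brevity.
        @Override public void onReadyForSpeech(Bundle params) {}
        @Override public void onBeginningOfSpeech() {}
        @Override public void onRmsChanged(float rmsdB) {}
        @Override public void onBufferReceived(byte[] buffer) {}
        @Override public void onEndOfSpeech() {}
        @Override public void onError(int error) {}
        @Override public void onPartialResults(Bundle partialResults) {}
        @Override public void onEvent(int eventType, Bundle params) {}
    }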
 
Gesture: There are SDKs that can recognize hand gestures using the single camera, and these will work even better and more efficiently on the R-8 & R-9 with their dual stereo cameras. The only potential issue is when an App's use case includes streaming video all the time (as in most telepresence Apps), which may conflict with using the camera for gesture input. The R-9 will also have an interface for external sensor modules. The limitation is that the 835 can have only three cameras on simultaneously, and with the 6DoF sensor being used for tracking most of the time, there may be a conflict between streaming and detecting gesture input. More on this below.
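
If your App needs to share the camera between streaming and gesture input, one approach (a sketch using the standard Camera2 availability callback, not an ODG-specific API) is to watch whether the camera is currently in use and fall back to another input method when it is:

    import android.content.Context;
    import android.hardware.camera2.CameraManager;
    import android.os.Handler;

    public class CameraAvailabilityWatcher {
        public void watch(Context context, Handler handler) {
            CameraManager manager =
                    (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
            manager.registerAvailabilityCallback(new CameraManager.AvailabilityCallback() {
                @Override
                public void onCameraAvailable(String cameraId) {
                    // Camera is free: gesture recognition could use it.
                }

                @Override
                public void onCameraUnavailable(String cameraId) {
                    // Camera is busy (e.g. telepresence streaming): fall back to
                    // selector/cursor, voice, or gaze input instead.
                }
            }, handler);
        }
    }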
 
Head/Gaze: This is the direction we are moving in. We are converting a number of the platform Apps to use gaze as the input method, and we'll have an SDK within a month or two that Apps can use for input selection. A number of our software partners have already implemented UIs that use this approach, and they work very well. If you are visiting San Francisco, come by the office - we can show you a demo of one of these Apps, and hopefully soon we'll have some of our own gaze-driven Apps to show as well.
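
Until the gaze SDK is released, a rough illustration of the idea is possible with the standard Android rotation-vector sensor. The sketch below is not the forthcoming SDK; it is only an assumption about how head orientation could drive a selection point:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class HeadGazeTracker implements SensorEventListener {
        private final float[] rotationMatrix = new float[9];
        private final float[] orientation = new float[3]; // azimuth, pitch, roll (radians)

        public void start(SensorManager sensorManager) {
            Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
            sensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            SensorManager.getOrientation(rotationMatrix, orientation);
            float yaw = orientation[0];   // head turn left/right
            float pitch = orientation[1]; // head tilt up/down
            // Map yaw/pitch to a cursor or highlight position; e.g. dwell on a
            // UI element for a short time to "click" it (app-specific).
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {}
    }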

 

ReticleRemote: Also, for legacy App support, when your App requires 2-finger touch you must pair the glasses with a smartphone running the ReticleRemote App. That App turns the phone's touch screen into a remote touch screen for the glasses. It can function in all ReticleOS input modes (Selector, Cursor, and a new third mode that supports 2-finger input).

The Android App for a phone or tablet called ReticleRemote (https://play.google.com/store/apps/details?id=com.osterhoutgroup.reticleremote) can be used to provide input to the glasses over Bluetooth. Obviously, an App running on the glasses and paired with a phone behaves a little differently than it would if it were running on the phone: the relationship between where you touch on the phone and the App's UI is a little fuzzy, because the App's UI does not appear on the phone, so you can only see on the glasses display where you are touching. It does, however, work fine for general pinch, zoom, and rotate gestures.
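
Assuming the remote's 2-finger input reaches your App as standard Android multi-touch MotionEvents (as the description above suggests), the usual gesture detectors apply. A minimal pinch/zoom sketch with an illustrative custom View:

    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.ScaleGestureDetector;
    import android.view.View;

    public class PinchZoomView extends View {
        private float scale = 1.0f;
        private final ScaleGestureDetector scaleDetector;

        public PinchZoomView(Context context) {
            super(context);
            scaleDetector = new ScaleGestureDetector(context,
                    new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                        @Override
                        public boolean onScale(ScaleGestureDetector detector) {
                            scale *= detector.getScaleFactor(); // two-finger pinch/zoom
                            invalidate();
                            return true;
                        }
                    });
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            // Feed every touch event to the detector; it recognizes pinch gestures.
            return scaleDetector.onTouchEvent(event) || super.onTouchEvent(event);
        }
    }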

If you don't want to pair a smartphone with the glasses, you may need to make some changes to your App. There are a few things an App developer can do to address user input.

  1. ReticleOS has two input modes. One is Selector mode, in which Apps receive trackpad/DPAD events and use them to select and activate objects in the UI. The other is Cursor mode, where a mouse cursor appears on the screen; you can position it and click anywhere on your App's UI to generate mouse and synthesized touch events, which most touch-based Apps will accept as input (see the hover-handling sketch after this list). This mode, though, does not support 2-finger input.
  2. We have a Bluetooth "Reticle Speed Mouse" (RSM) which has two modes of operation. One is a more standard 2D mode, where a thumb optical sensor (much like the one on the glasses' temple) generates mouse X/Y events to move the cursor around the screen. The other is 3D mode, where an internal Inertial Measurement Unit (IMU) tracks gestures and generates events to position the cursor or control the App; the user can in effect click and drag using gestures, allowing manipulation of objects in the UI, much like what 2-finger pinch and zoom gestures are used for. This is what most people use to control things like 3D maps and 3D design models.
  3. Some Apps use gesture and hand recognition; there is a range of software that detects hands. One of our software partners, Augumenta, makes a gesture SDK that uses the visible-light camera. They have demo Apps on AppCenter you can download and try. The SDK recognizes hands in certain positions making certain signs. Another vendor, Crunchfish, has a gesture library that is a little more generic and detects swipes for moving through lists of things. Their algorithms are in phone and VR headset Apps, but there is no SDK for ReticleOS yet. http://crunchfish.com/touchless/
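
For Cursor mode specifically (item 1 above), the cursor delivers standard Android hover and click events, so a view can react when the cursor passes over it. A minimal sketch; the view wiring and visual cue are illustrative:

    import android.view.MotionEvent;
    import android.view.View;

    public class CursorModeHelpers {
        public static void makeHoverAware(final View button) {
            button.setOnHoverListener(new View.OnHoverListener() {
                @Override
                public boolean onHover(View v, MotionEvent event) {
                    switch (event.getActionMasked()) {
                        case MotionEvent.ACTION_HOVER_ENTER:
                            v.setAlpha(0.7f); // simple cue that the cursor is over the view
                            return true;
                        case MotionEvent.ACTION_HOVER_EXIT:
                            v.setAlpha(1.0f);
                            return true;
                    }
                    return false;
                }
            });
            // A normal click listener still fires when the user clicks the track pad,
            // since Cursor mode selects via enter or synthesized touch events.
            button.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) { /* app-specific action */ }
            });
        }
    }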

Finally, another partner, Leap Motion, makes a depth sensor and software for detecting hands, and has an SDK that allows developing an Android App that could be used for input. Check out the video here: https://developer.leapmotion.com/gallery/blocks

The hands you see are the user's real hands, detected by Leap Motion's sensor, tracked in space, and rendered in the display, allowing the user to grab blocks and throw them around.
