Capabilities and Details - Reticle Speed Mouse
The RSM, formerly known as the Wireless Finger Controller (WFC) and the Soldier Air Mouse (SAM), has an IMU that allows it to be used in a "3D" mode, generating input events for applications through motion rather than the thumb manipulations detected by the optical sensor of previous versions.
Accessing the Gyro and Magnetometer on the RSM / Distinguishing between the Sensors on the smartglasses & RSM
The raw sensor data from the RSM is not (yet) natively accessible to the App running on the glasses. The RSM must be put into "3D" mode; it then uses its sensors to control the cursor position and to send events (trackball/DPAD events) to the App based on gestures. The App can also turn off the cursor if it wants.
To put the RSM into 3D mode, press and hold the volume < > keys for about 3 seconds, until the LED turns green. To return to 2D mode, do the same until the green LED turns off. The RSM can only be in one mode at a time. Currently the RSM does not communicate to the glasses which mode it is in, although we are considering adding this feature. (The RSM sensor data remains inaccessible pending completion of some in-progress Android/Linux driver changes and integration testing.)
Currently the RSM natively receives a quaternion from the IMU and translates it into the three Tait-Bryan angles, which are then massaged into a HID Mouse report. This allows the RSM to function as a standard mouse device and work on any system. The Linux HID subsystem handles the device as if it were a standard desktop mouse, and Android exposes it through InputDevice/CursorInputMapper classes.
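The quaternion-to-angles step described above can be sketched as follows. This is a minimal illustration of the standard conversion, not the RSM firmware; the function names and the mouse-report gain are our own assumptions.

```python
import math

def quaternion_to_tait_bryan(w, x, y, z):
    """Convert a unit quaternion to (yaw, pitch, roll) Tait-Bryan angles, in radians."""
    # Roll: rotation about the X axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch: rotation about the Y axis, clamped to avoid domain errors near the poles
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # Yaw: rotation about the Z axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return yaw, pitch, roll

def angles_to_mouse_deltas(prev, curr, gain=600):
    """Turn the change in (yaw, pitch) into the dx/dy fields of a HID mouse report.

    `gain` (counts per radian) is a made-up illustrative constant.
    """
    dx = int(gain * (curr[0] - prev[0]))
    dy = int(gain * (prev[1] - curr[1]))  # screen Y grows downward
    return dx, dy
```

A wrist turn about the vertical axis changes yaw, which this mapping reports as horizontal cursor motion, which is why the device looks like an ordinary mouse to the host.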
There is a way to skip that translation step and expose just the raw sensor data through a HID Sensor report. We are working on enabling this capability, but some driver work needs to be done in the Linux kernel to support it, so developers should not count on getting access to the raw sensor data in the near term. Until then, the best way to allow for rotation is a click-and-drag approach: in 3D mode, if the user clicks and holds the selection and then moves their wrist, the object rotates. If you want to detect gestures, you will need to look at the HID event stream and deduce that the user is making one.
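The click-and-drag rotation idea above can be sketched as a small state machine over the mouse event stream. The event shape (a button-down flag plus cursor deltas) and the motion threshold are illustrative assumptions, not the RSM protocol.

```python
class DragRotationTracker:
    """Deduce a click-and-drag 'rotate' gesture from a stream of mouse events.

    Each event is fed as (button_down, dx, dy), where dx/dy are cursor deltas.
    While the button is held, accumulated motion is treated as rotation.
    """

    def __init__(self, threshold=5):
        self.threshold = threshold  # ignore tiny jitter while the button is held
        self.dragging = False
        self.rot_x = 0
        self.rot_y = 0

    def feed(self, button_down, dx, dy):
        """Consume one event; return (rot_x, rot_y) once a deliberate drag is seen."""
        if button_down and not self.dragging:
            self.dragging = True            # click-and-hold starts a drag
            self.rot_x = self.rot_y = 0
        elif not button_down and self.dragging:
            self.dragging = False           # release ends the gesture
        if self.dragging:
            self.rot_x += dx
            self.rot_y += dy
            # Only report rotation once the wrist motion is clearly intentional
            if max(abs(self.rot_x), abs(self.rot_y)) >= self.threshold:
                return (self.rot_x, self.rot_y)
        return None
```

In an Android App the deltas would come from the motion events the system delivers for the device; here the tracker is kept input-agnostic so the logic is easy to test.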