R-9 Camera Sensor Details


The R-9 has four cameras plus an external MIPI interface for add-on modules. The four cameras show up as three Android cameras (IDs 0, 1, and 2):

  • 13 MP camera with auto focus (default)
  • 170 degree mono fisheye (used by VR SDK 6DoF tracking system)
  • 1080p stereo camera (left) + 1080p stereo camera (right)
You should be able to get data from the cameras through the Android camera manager by iterating through the list of camera IDs; they should be in the order listed above. Attached are three files containing each camera's characteristics as reported by the APIs as of ReticleOS build 3.17.
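For example, here is a minimal Camera2 sketch that iterates the camera IDs and logs a couple of characteristics for each (the class and method names are illustrative, not part of any ODG SDK):

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.util.Log;
import android.util.SizeF;

public final class CameraEnumerator {
    private static final String TAG = "R9Cameras";

    public static void listCameras(Context context) {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
            // IDs "0", "1", "2" should correspond to the 13 MP camera,
            // the fisheye, and the stereo pair, in that order.
            for (String id : manager.getCameraIdList()) {
                CameraCharacteristics chars = manager.getCameraCharacteristics(id);
                SizeF sensorSize =
                        chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
                float[] focalLengths =
                        chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
                Log.i(TAG, "Camera " + id + " sensor=" + sensorSize
                        + " focalLengths=" + java.util.Arrays.toString(focalLengths));
            }
        } catch (CameraAccessException e) {
            Log.e(TAG, "Could not query cameras", e);
        }
    }
}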

The external interface on the top of the R-9 will support a fifth MIPI sensor. Note that only three cameras can be active at once, so there will be trade-offs in what your use case can do. Also, adding a MIPI device requires a MIPI driver, which ODG must then integrate into the OS kernel, so expect some added time and cost there.

Camera Mounting:

The following diagram shows the cameras' relative positions and angles with respect to the user's view through the displays. Note that when worn by a typical user, the temples are horizontal and the display optical axis is 8 degrees down from the horizontal axis.


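As a worked example, that display tilt can be expressed as a direction vector. The head-frame axis conventions below (x right, y up, z forward along the temples) are an assumption for illustration, not from ODG documentation:

public final class DisplayAxis {
    public static void main(String[] args) {
        // Pitch the forward axis down by 8 degrees (assumed frame: x right,
        // y up, z forward along the temples).
        double pitchDown = Math.toRadians(8.0);
        double[] displayAxis = { 0.0, -Math.sin(pitchDown), Math.cos(pitchDown) };
        // Prints roughly (0.000, -0.139, 0.990): 8 degrees below horizontal.
        System.out.printf("display optical axis = (%.3f, %.3f, %.3f)%n",
                displayAxis[0], displayAxis[1], displayAxis[2]);
    }
}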
This diagram shows camera position with respect to the IMU sensors (Gyro/Accel & Mag):



Camera Sensors:

The 13 MP camera is the primary camera, with auto focus, the highest quality, and the most features. It is a good fit for telepresence streaming and for object recognition. The camera module uses the Sony IMX258 sensor. The diagonal full-frame field of view of the camera is approximately 78 degrees.
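As a sketch, once a CameraDevice and a capture session have been opened for this camera (assumed here to be ID "0", per the ordering above), a continuous-autofocus preview request would look like this:

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.view.Surface;

public final class PrimaryCameraPreview {
    /** Assumes the device and session for camera "0" were opened elsewhere. */
    public static void startAutofocusPreview(CameraDevice device,
                                             CameraCaptureSession session,
                                             Surface previewSurface)
            throws CameraAccessException {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        // Let the driver run its continuous autofocus loop.
        builder.set(CaptureRequest.CONTROL_AF_MODE,
                CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
        session.setRepeatingRequest(builder.build(), null, null);
    }
}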


The stereo cameras can be used to capture S3D video for recording or streaming, and for depth / point-cloud data collection, mainly for detecting surfaces and planes for AR purposes. Each camera uses the Omnivision OV5675 sensor. The diagonal full-frame field of view of each camera is 90-100 degrees.

One of the advertised resolutions below should give the full frames side by side (see the sketch after this list):

  • [w:5184, h:1944, format:RAW_SENSOR(32), min_duration:33333333, stall:100000000]
  • [w:2560, h:800, format:JPEG(256), min_duration:33333333, stall:49000000]
  • [w:1280, h:480, format:JPEG(256), min_duration:33333333, stall:45000000]
  • [w:1280, h:400, format:JPEG(256), min_duration:33333333, stall:45000000]
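As a sketch, you could confirm which sizes the stereo camera actually advertises and split a side-by-side frame into left and right halves like this; the stereo camera's ID ("2") and the 2560x800 choice are assumptions based on the list above:

import android.content.Context;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.util.Log;
import android.util.Size;

public final class StereoSizes {
    public static void logStereoSizes(Context context) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        // Camera "2" is assumed to be the stereo pair, per the ordering above.
        CameraCharacteristics chars = manager.getCameraCharacteristics("2");
        StreamConfigurationMap map =
                chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        for (Size s : map.getOutputSizes(ImageFormat.JPEG)) {
            Log.i("R9Stereo", "JPEG size: " + s);
        }
        // For a 2560x800 side-by-side frame, each eye is 1280x800.
        Rect left = new Rect(0, 0, 1280, 800);
        Rect right = new Rect(1280, 0, 2560, 800);
        Log.i("R9Stereo", "left=" + left + " right=" + right);
    }
}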

We are also hoping to work with gesture developers who could use these sensors with their gesture SDKs for better gesture detection. No specific timeline has been defined for when this might be available.

Initially, the depth data will be made available via a Unity plug-in.

The fisheye camera is used by the 6DoF tracking system, along with the IMU, to provide markerless inside-out tracking via the VR SDK. It uses the Omnivision OV7251 sensor. The diagonal full-frame field of view of the camera is approximately 170 degrees.

For camera access, you can refer to the sample code; if you are using the NDK, you can refer to the NDK camera sample.

Any Android Application can open any of the cameras, with the following limitations:

  • if you intend to use our inside-out tracking (VR SDK), that platform service opens and controls the fisheye camera, so it won't be available while tracking is enabled. Also, the data it collects is not sharable with other Apps (see the availability sketch after this list).
  • if you use our platform depth system (under development right now), that system will have the stereo camera pair open. It is unknown right now whether this camera data will be accessible to other Apps while it is running; if it is, it will be via an API other than the usual Android camera API.
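Because platform services can hold a camera open, one defensive pattern is to watch camera availability before trying to open one. A minimal sketch using the standard Camera2 availability callback (treating ID "1" as the fisheye is an assumption based on the ordering above):

import android.content.Context;
import android.hardware.camera2.CameraManager;
import android.os.Handler;
import android.os.Looper;
import android.util.Log;

public final class CameraAvailabilityWatcher {
    public static void watch(Context context) {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        manager.registerAvailabilityCallback(new CameraManager.AvailabilityCallback() {
            @Override public void onCameraAvailable(String cameraId) {
                Log.i("R9Avail", "Camera " + cameraId + " is free to open");
            }
            @Override public void onCameraUnavailable(String cameraId) {
                // Camera "1" (assumed fisheye) going unavailable likely means
                // the tracking service has taken it.
                Log.i("R9Avail", "Camera " + cameraId + " is in use");
            }
        }, new Handler(Looper.getMainLooper()));
    }
}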

We'll have APIs available for you to access the depth data, but they won't be available in Alpha for another month or two.
