What is NaviScribe?

3D Motion Capture For Future User Interfaces

NaviScribe is a miniature opto-electronic navigation unit that fits in small hand-held user devices. The unit deploys efficient optical pose estimation algorithms to recover the 3D position and orientation of the hand-held device with respect to its environment. Electronic pens, styluses, TV remotes, and gaming wands, as well as hand-held or wearable devices such as augmented reality (AR) glasses, virtual reality (VR) goggles, and head-tracking gear, can use NaviScribe units for real-time 3D localization and motion recovery. Smartphones and tablets can take advantage of NaviScribe's algorithms without any additional on-board hardware, since they are already equipped with cameras and auxiliary sensors. The output of a NaviScribe unit is a stream of high-resolution digitized data describing the 3D position and inclination angles of the hand-held or wearable device with respect to stationary objects, as well as the 3D motion that device executes.

NaviScribe-equipped devices support true 3D user interfaces. The digitized 3D data provided by NaviScribe contains full gesture information computed in true 3D: absolute position and orientation expressed in the user's environment and indexed to selected features of that environment. Such absolute pose data can be fed into more complex localization approaches, including Simultaneous Localization and Mapping (SLAM) and more modern techniques relying on sensor fusion (e.g., object-level SLAM++), to obtain superior results. It is this addition of absolute optical pose data that puts within reach augmented reality applications in which virtual objects are placed in the user's field of view at intermediate distances.

VR applications requiring freedom from jitter and sub-millimeter resolution for "inside-out" localization should also consider NaviScribe. Its reliance on the camera to provide frequent ground truth is an advantage when working with fast but noisy auxiliary sensors. Optical hardware, coupled with appropriate computational vision techniques and efficient homography estimation, provides tools for rapidly differentiating linear and rotational movements. With delays under 20 ms, NaviScribe finally promises that elusive "real-time" feel of VR while keeping to a minimum the noise and drift effects that give VR goggle users headaches.

In weighing compound sensors and alternatives (sonar, magnetometer, inclinometer, accelerometer, time-of-flight optical depth sensor, etc.), the reasons that led the evolution of life on this planet to select vision as the primary sensing modality should not be underestimated. After millions of years of trial and error, most animals use sight to ascertain location and to guide fine motor movements executed in 2D and 3D spaces.
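The value of frequent optical ground truth when fusing fast but noisy auxiliary sensors can be pictured with a minimal complementary filter: a biased, noisy gyroscope is integrated every cycle, and each absolute camera measurement pulls the estimate back toward truth. This is only an illustrative sketch with made-up noise figures and a single rotation axis; it is not NaviScribe's actual fusion algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.005                              # 200 Hz sensor loop (assumed rate)
t = np.arange(0.0, 10.0, dt)
true_angle = np.sin(t)                  # ground-truth orientation (rad)
true_rate = np.cos(t)                   # its time derivative (rad/s)

# Simulated sensors: gyro is fast but biased; camera is absolute but noisy.
gyro = true_rate + 0.05 + 0.02 * rng.standard_normal(t.size)
camera = true_angle + 0.01 * rng.standard_normal(t.size)

alpha = 0.98                            # trust gyro over short intervals
fused = np.zeros(t.size)
integrated = np.zeros(t.size)
for k in range(1, t.size):
    integrated[k] = integrated[k - 1] + gyro[k] * dt      # gyro only: drifts
    predicted = fused[k - 1] + gyro[k] * dt               # fast prediction
    fused[k] = alpha * predicted + (1 - alpha) * camera[k]  # camera anchors it

drift_err = abs(integrated[-1] - true_angle[-1])   # grows with the gyro bias
fused_err = abs(fused[-1] - true_angle[-1])        # stays bounded
```

After ten seconds the gyro-only estimate has drifted by roughly the bias times the elapsed time, while the camera-anchored estimate stays within a small bounded error; this is the sense in which absolute optical pose tames drift.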
Meanwhile, when mounted in an electronic stylus operating on a display screen or in a digital pen writing on paper, NaviScribe can recover digital ink for writing and drawing applications, as well as gestures performed in hover mode. Note that these tasks require absolute resolution of sub-millimeter linear movements and fractions of a degree in angular changes with respect to the writing surface -- all easily captured and reported by NaviScribe.
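To see how absolute pose turns into digital ink, the sketch below projects a pen tip through the reported pose into the writing-surface frame and flags contact versus hover. The tip offset, contact threshold, and frame conventions here are illustrative assumptions, not NaviScribe specifics.

```python
import numpy as np

TIP_OFFSET = np.array([0.0, 0.0, -0.14])  # tip 14 cm from unit along pen axis (assumed)
CONTACT_MM = 0.5                          # below this height the tip is "writing" (assumed)

def ink_point(unit_pos_m, R):
    """Map the tracked unit's absolute pose to a tip position on the page.

    unit_pos_m : (3,) unit position in the surface frame, z up, metres
    R          : (3,3) rotation of the pen body in the same frame
    Returns (x_mm, y_mm, writing): tip coordinates on the page and contact flag.
    """
    tip = unit_pos_m + R @ TIP_OFFSET     # rigid-body transform of the fixed tip offset
    writing = tip[2] * 1000.0 < CONTACT_MM
    return tip[0] * 1000.0, tip[1] * 1000.0, writing
```

For example, a vertically held pen (identity rotation) with its unit 140.1 mm above the page reports a tip 0.1 mm above the surface, i.e., an ink sample; raise the unit 2 cm and the same call reports a hover point instead. Because both the position and the rotation enter the transform, sub-millimeter ink accuracy depends on resolving angular changes to fractions of a degree, as the text notes.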

NaviScribe was developed by Electronic Scripting Products, Inc. in Palo Alto, California.