
Dedicated Hardware
with High-Quality Components

The VI-Sensor features a high-quality global shutter HDR stereoscopic camera, up to two additional camera modules, and an industrial-grade inertial measurement system.
The embedded FPGA and dual-core ARM Cortex-A9 platform provides enough processing power for real-time localization and navigation tasks, indoors and outdoors.


Embedded Real-Time
Image Processing

The VI-Sensor uses an FPGA for the computationally expensive real-time detection and extraction of features in stereo images.
As a result, the embedded ARM Cortex-A9 processor is capable of running the backend of a vision pipeline, making the VI-Sensor a unique mobile pose-estimation and 3D mapping device.
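As an illustration of this split, the CPU-side backend only needs to match features that the FPGA frontend has already detected and described, which is far cheaper than computing them itself. The sketch below uses OpenCV and a hypothetical FpgaFeatureFrame container; it is not the actual SDK interface, just a minimal example of the idea.

// Illustrative sketch only: FpgaFeatureFrame is a hypothetical container for
// FPGA-extracted features, not part of the VI-Sensor SDK.
#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <vector>

struct FpgaFeatureFrame {
  std::vector<cv::KeyPoint> keypoints;  // detected on the FPGA frontend
  cv::Mat descriptors;                  // one binary descriptor per keypoint
};

// The CPU-side backend only matches the precomputed features across the
// stereo pair, leaving headroom for pose estimation and mapping.
std::vector<cv::DMatch> matchStereoFeatures(const FpgaFeatureFrame& left,
                                            const FpgaFeatureFrame& right) {
  cv::BFMatcher matcher(cv::NORM_HAMMING, /*crossCheck=*/true);
  std::vector<cv::DMatch> matches;
  matcher.match(left.descriptors, right.descriptors, matches);
  return matches;
}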


Factory Calibrated
& Time Synchronized

The VI-Sensor is shipped factory-calibrated (intrinsics and extrinsics). Combined with embedded synchronization and timestamping, this ensures accurate processing of camera and IMU data.
A trigger line allows external modules such as additional cameras, GPS receivers, or flashing lights to be synchronized with the sensor.
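Because all measurements carry timestamps from a common clock, associating camera and IMU data reduces to simple timestamp comparisons. The minimal sketch below (plain C++ with an illustrative ImuSample struct, not SDK code) gathers the IMU samples that fall between two consecutive image timestamps, as a visual-inertial pipeline would before fusing them.

// Minimal sketch, not SDK code: images and IMU samples share one hardware
// clock, so pairing them is a timestamp comparison.
#include <cstdint>
#include <vector>

struct ImuSample { int64_t stamp_ns; double acc[3]; double gyro[3]; };  // illustrative

// Collect the IMU samples between two consecutive image timestamps,
// e.g. as input to the IMU preintegration step of a visual-inertial pipeline.
std::vector<ImuSample> imuBetweenFrames(const std::vector<ImuSample>& imu,
                                        int64_t prev_image_ns,
                                        int64_t curr_image_ns) {
  std::vector<ImuSample> out;
  for (const ImuSample& s : imu)
    if (s.stamp_ns > prev_image_ns && s.stamp_ns <= curr_image_ns)
      out.push_back(s);
  return out;
}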


User-Friendly SDK
& Easy Integration

The VI-Sensor SDK comes with easy-to-understand example code and a detailed Wiki, allowing you to get started in no time.
An open source driver allows easy integration into ROS, OpenCV, and your own projects under all major operating systems.


Compatible with Open Source
Visual Odometry Frameworks

The VI-Sensor provides high-quality input data to state-of-the-art open source visual odometry frameworks.
A stereo visual-inertial odometry framework is currently under development. With it, the VI-Sensor can be used as an out-of-the-box pose sensor.


Small, Lightweight & Robust Hardware

The VI-Sensor is designed for mobile platforms such as ground robots and unmanned aerial vehicles.
With a size comparable to existing stereo cameras and weighing less than 150 grams, it fits seamlessly on your robot or desk.

Reference Projects

Real-time dense stereo 3D Mapping and Navigation in unstructured indoor and outdoor Environments

Embedded Visual-Inertial Odometry

The VI-Sensor has been used for dense stereo reconstruction and mapping tasks. Since the VI-Sensor does not depend on active infrared emitters, 3D maps and depth images can be generated indoors as well as outdoors. The resulting 3D maps are used for navigation tasks in numerous application scenarios. Embedded image processing on the FPGA significantly reduced CPU load and integration effort.

Embedded visual-inertial Navigation
for mobile robotic Systems

The VI-Sensor technology is designed for easy integration into existing mobile systems. Thanks to the fusion of inertial and visual data, the sensor provides reliable state estimation even under highly dynamic conditions. At ADRL and ASL, ETH Zurich, it is already successfully used on UAVs, walking robots, and handheld devices, enabling navigation indoors and outdoors.

Position Feedback Control for UAVs in indoor Environments using visual-inertial Odometry

Embedded Visual-Inertial Odometry

The VI-Sensor allows robust vision-based position feedback control in GPS-denied environments. Integrated on a UAV, the sensor successfully helped to inspect boiler systems of thermal power plants and mine shafts despite rough industrial operating conditions in terms of temperature and lighting. For more information refer to ASL, ETH Zurich.

Burri, M.; Nikolic, J.; Hürzeler, C.; Caprari, G.; Siegwart, R., “Aerial service robots for visual inspection of thermal power plant boiler systems“, Applied Robotics for the Power Industry (CARPI), 2012 2nd International Conference on, pp. 70-75, 11-13 Sept. 2012.

Robust embedded Egomotion Estimation for mobile Devices

Embedded Visual-Inertial Odometry

The VI-Sensor has been successfully integrated in a real-time pose estimation framework. Using only synchronized and timestamped stereo sensor outputs, the framework provides information on relative orientation and 3D odometry in real time. In handheld experiments the relative error remained below 1%. For more information refer to ASL, ETH Zurich.

R. Voigt, J. Nikolic, C. Hürzeler, S. Weiss, L. Kneip, R. Siegwart, “Robust Embedded Egomotion Estimation“, Proc. of The IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), San Francisco, September 2011.

Use Cases

The VI-Sensor technology enables a wide range of localization, navigation, and mapping applications in:

  • Mobile Robotics
  • Photogrammetric Surveying
  • Automation
  • Industrial Inspection
  • Automotive
  • Augmented Reality
  • Mining
  • Construction

Technical Specifications

  
Synchronized Output
  • Stereo images at 30 Hz
  • 3-axis acceleration and angular rate at 200 Hz
  • Air pressure at 100 Hz
  • 3-axis magnetic field at 25 Hz

Calibration
  • Stereo camera: distortion, spatial inter-camera
  • Inertial measurement unit: axis misalignment, sensitivity, bias, temperature compensation
  • Camera-to-IMU: inter-sensor spatial and temporal

Components
  • Stereo camera: 2x Aptina MT9V034 global shutter chips, 752 x 480 pixel monochrome images, Lensagon BM2820 lenses (2.8 mm focal length, max. 122 deg field of view), low-light sensitive, 11 cm stereo baseline
  • IMU: industrial-grade Analog Devices ADIS 16448

Physical
  • Dimensions: 133 x 40 x 57 mm
  • Weight: 130 grams

View Factsheet

(the factsheet does not yet include all final features)

Driver Software and SDK


The open source VI-Sensor driver consists of standard Linux C++ libraries that are OpenCV compatible. It allows the user to register custom callback functions, so that sensor data can be accessed as shared pointers with zero data copying. It also provides additional information such as shutter time and image gain. Furthermore, sensor parameters are adjustable at runtime through the open source driver.
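A hedged sketch of what using the driver can look like is shown below; the header, class, method, and field names (visensor/visensor.hpp, ViSensorDriver, setCameraCallback, startAllCameras, ViFrame) are assumptions based on this description, so consult the SDK Wiki and sample projects for the exact API.

// Hedged sketch: the names below are assumptions about the open source
// driver's interface, not verified signatures.
#include <cstdio>
#include <visensor/visensor.hpp>  // assumed SDK header

// Frames arrive in a user callback as shared pointers, so no image data is
// copied between the driver thread and user code.
void onFrame(visensor::ViFrame::Ptr frame, visensor::ViErrorCode /*error*/) {
  std::printf("cam %u  t=%lld  exposure=%u\n",
              frame->camera_id,
              static_cast<long long>(frame->timestamp),
              frame->exposure);  // field names are assumptions as well
}

int main() {
  visensor::ViSensorDriver driver;
  driver.init();                      // discover and connect to the sensor
  driver.setCameraCallback(onFrame);  // register the custom callback
  driver.startAllCameras(30);         // stream stereo images at 30 Hz
  std::getchar();                     // wait; callbacks run in driver threads
}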

The included ROS bridge outputs IMU and camera data as standard ROS messages. Moreover, custom messages provide access to sensor parameters, while Dynamic Reconfigure allows easy online adjustment of settings.
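Since the bridge publishes standard sensor_msgs types, any ROS node can consume the data directly. The minimal roscpp listener below assumes topic names such as /cam0/image_raw and /imu0; the actual names may differ, so check rostopic list on your system.

// Minimal ROS listener; topic names are assumptions about the bridge's defaults.
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/Imu.h>

void imageCallback(const sensor_msgs::ImageConstPtr& img) {
  ROS_INFO("image %ux%u at t=%.6f", img->width, img->height,
           img->header.stamp.toSec());
}

void imuCallback(const sensor_msgs::ImuConstPtr& imu) {
  ROS_INFO("gyro z: %.3f rad/s", imu->angular_velocity.z);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "vi_sensor_listener");
  ros::NodeHandle nh;
  ros::Subscriber img_sub = nh.subscribe("/cam0/image_raw", 10, imageCallback);
  ros::Subscriber imu_sub = nh.subscribe("/imu0", 200, imuCallback);
  ros::spin();  // process incoming messages until shutdown
  return 0;
}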

The SDK features sample projects that demonstrate integration with OpenCV as well as ROS. In addition, a sophisticated sensor calibration tool makes it easy to recalibrate all camera and camera-to-IMU parameters.
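For example, the factory intrinsics and extrinsics can be fed straight into OpenCV's stereo rectification. In the sketch below every numeric value is a placeholder, not the sensor's actual calibration.

// Sketch of stereo rectification from (placeholder) calibration values.
#include <opencv2/calib3d.hpp>
#include <opencv2/imgproc.hpp>

int main() {
  cv::Size size(752, 480);                                     // sensor resolution
  cv::Mat K0 = (cv::Mat_<double>(3, 3) << 470, 0, 376,
                                          0, 470, 240,
                                          0,   0,   1);        // placeholder intrinsics
  cv::Mat K1 = K0.clone();
  cv::Mat D0 = (cv::Mat_<double>(1, 4) << -0.29, 0.08, 0, 0);  // placeholder distortion
  cv::Mat D1 = D0.clone();
  cv::Mat R = cv::Mat::eye(3, 3, CV_64F);                      // inter-camera rotation
  cv::Mat T = (cv::Mat_<double>(3, 1) << -0.11, 0, 0);         // ~11 cm baseline

  cv::Mat R0, R1, P0, P1, Q;
  cv::stereoRectify(K0, D0, K1, D1, size, R, T, R0, R1, P0, P1, Q);

  cv::Mat map0x, map0y;
  cv::initUndistortRectifyMap(K0, D0, R0, P0, size, CV_32FC1, map0x, map0y);
  // cv::remap(raw_left, rect_left, map0x, map0y, cv::INTER_LINEAR);
  return 0;
}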

Get the Sensor Early

Apply for the Early Adopter Program

Skybotix offers an Early Adopter Program for the VI-Sensor. The program addresses research groups around the world that are eager to get their hands on the VI-Sensor technology early and are willing to help us, with their feedback, turn it into a truly great, market-ready, and easy-to-use device.

Keep me up to Date

Sign Up for the VI-Sensor Newsletter

Skybotix regularly reports on the latest progress of the VI-Sensor and its applications. Don’t miss any news: sign up for the VI-Sensor Newsletter.