FlyDVS: An Event-Driven Wireless Ultra-Low Power Visual Sensor Node

Alfio Di Mauro, Moritz Scherer, Jordi Fornt Mas, Basile Bougenot, Michele Magno and Luca Benini
Integrated Systems Laboratory, ETH Zurich, Switzerland
adimauro@ethz.ch
scheremo@ethz.ch
jormas@ethz.ch
basileb@ethz.ch
michele.magno@ethz.ch
lbenini@ethz.ch

ABSTRACT


Event-based cameras, also called dynamic vision sensors (DVS), are inspired by the human visual system and are gaining popularity due to their potential for energy saving, since they generate asynchronous events only from pixel changes in the field of view. Unfortunately, in most current uses, data acquisition, processing, and streaming of data from event-based cameras are performed by power-hungry hardware, mainly high-power FPGAs. For this reason, the overall power consumption of an event-based system that includes digital capture and streaming of events is in the order of hundreds of milliwatts or even watts, significantly reducing its usability in real-life low-power applications such as wearable devices. This work presents FlyDVS, the first event-driven wireless ultra-low-power visual sensor node, which includes a low-power Lattice FPGA and a Bluetooth wireless system-on-chip, and hosts a commercial ultra-low-power DVS camera module. Experimental results show that the low-power FPGA can reach up to 874 efps (event-frames per second) with only 17.6 mW of power, and that the sensor node consumes an overall power of 35.5 mW (including wireless streaming) at 200 efps. We demonstrate FlyDVS in a real-life scenario, namely, acquiring event frames for a gesture recognition data set.
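To make the efps (event-frames per second) metric used above concrete, the following is a minimal C sketch of how asynchronous DVS events can be aggregated into a fixed-size binary event frame. It is an illustration only, not the FlyDVS implementation: the event layout, sensor resolution, and accumulation window size are assumptions chosen for the example.

```c
/* Illustrative sketch (not the FlyDVS design): accumulating asynchronous
 * DVS events into a 1-bit-per-pixel event frame, the unit behind the
 * "event-frames per second" (efps) figure quoted in the abstract.
 * Resolution and batch size below are assumed values. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

#define FRAME_W 128          /* assumed sensor width in pixels  */
#define FRAME_H 128          /* assumed sensor height in pixels */
#define EVENTS_PER_FRAME 512 /* assumed accumulation window     */

typedef struct {
    uint8_t x;        /* pixel column of the event             */
    uint8_t y;        /* pixel row of the event                */
    uint8_t polarity; /* 1 = brightness increase, 0 = decrease */
} dvs_event_t;

/* Clear the frame buffer and mark every pixel that produced an event. */
static void accumulate_frame(const dvs_event_t *events, int n,
                             uint8_t frame[FRAME_H][FRAME_W])
{
    memset(frame, 0, FRAME_H * FRAME_W);
    for (int i = 0; i < n; i++) {
        frame[events[i].y][events[i].x] = 1; /* active pixel */
    }
}

int main(void)
{
    static uint8_t frame[FRAME_H][FRAME_W];
    dvs_event_t batch[EVENTS_PER_FRAME] = {
        { .x = 10, .y = 20, .polarity = 1 },
        { .x = 11, .y = 20, .polarity = 0 },
    };
    accumulate_frame(batch, 2, frame);
    printf("pixel (20,10) active: %d\n", frame[20][10]);
    return 0;
}
```

In a node like the one described, each accumulated frame would then be handed to the wireless system-on-chip for streaming; the rate at which such frames are produced is what the efps numbers measure.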

Keywords: Brain-Inspired Sensor, Event-Based Camera, Bluetooth Low Energy, Low Power Design, FPGA, ULP, Edge Device.
