Zhongnan Qu
Researcher at ETH Zurich
Publications - 16
Citations - 139
Zhongnan Qu is an academic researcher at ETH Zurich. His research focuses on neuromorphic engineering and computer science. He has an h-index of 5 and has co-authored 13 publications receiving 59 citations. Previous affiliations of Zhongnan Qu include École Polytechnique Fédérale de Lausanne and Technische Universität München.
Papers
Journal ArticleDOI
Multi-Cue Event Information Fusion for Pedestrian Detection With Neuromorphic Vision Sensors
Guang Chen, Hu Cao, Canbo Ye, Zhenyan Zhang, Xingbo Liu, Xuhui Mo, Zhongnan Qu, Jörg Conradt, Florian Röhrbein, Alois Knoll +9 more
TL;DR: This work develops pedestrian detectors that unlock the potential of event data by leveraging multi-cue information and different fusion strategies, and introduces three event-stream encoding methods based on Frequency, Surface of Active Events (SAE), and Leaky Integrate-and-Fire (LIF).
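The LIF encoding named above turns a sparse event stream into a spike-count image: each event charges a per-pixel membrane potential that leaks over time, and pixels crossing a threshold fire and reset. A minimal sketch of this idea, with illustrative `tau` and `threshold` values not taken from the paper:

```python
import math

def lif_encode(events, tau=0.3, threshold=1.5):
    """Hypothetical sketch of Leaky Integrate-and-Fire event encoding.

    events: list of (t, x, y) tuples, t in seconds.
    Returns a dict mapping (x, y) -> spike count (the encoded frame).
    """
    potential = {}   # (x, y) -> membrane potential
    last_t = {}      # (x, y) -> time of this pixel's last event
    spikes = {}      # (x, y) -> spike count
    for t, x, y in events:
        p = potential.get((x, y), 0.0)
        # exponential leak since this pixel's last event
        p *= math.exp(-(t - last_t.get((x, y), t)) / tau)
        p += 1.0     # integrate the incoming event
        last_t[(x, y)] = t
        if p >= threshold:
            spikes[(x, y)] = spikes.get((x, y), 0) + 1
            p = 0.0  # fire and reset
        potential[(x, y)] = p
    return spikes
```

Two events arriving close together at the same pixel push its potential past the threshold and produce a spike; the same two events spaced far apart decay away and produce none, which is how the encoding captures temporal density.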
Book ChapterDOI
Online Multi-object Tracking-by-Clustering for Intelligent Transportation System with Neuromorphic Vision Sensor
Gereon Hinz, Guang Chen, Muhammad Aafaque, Florian Röhrbein, Jörg Conradt, Zhenshan Bing, Zhongnan Qu, Walter Stechele, Alois Knoll +8 more
TL;DR: This contribution proposes an online multi-target tracking system for neuromorphic vision sensors, the first neuromorphic vision system in intelligent transportation systems. It integrates an online tracking-by-clustering pipeline running at a frame rate that far exceeds the real-time capabilities of traditional frame-based industry cameras.
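The per-slice step of tracking-by-clustering groups nearby events into object candidates. A toy sketch of that step under an assumed greedy nearest-centroid rule (the paper's full pipeline also associates clusters across time slices to form tracks; the `radius` value is illustrative):

```python
def cluster_events(events, radius=5.0):
    """Greedy spatial clustering of events in one time slice.

    Each event joins the first cluster whose centroid lies within
    `radius` pixels, otherwise it starts a new cluster.
    events: list of (x, y) pixel coordinates.
    Returns (labels, centroids).
    """
    clusters = []   # each entry: [sum_x, sum_y, count]
    labels = []
    for x, y in events:
        for i, (sx, sy, n) in enumerate(clusters):
            cx, cy = sx / n, sy / n
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                clusters[i] = [sx + x, sy + y, n + 1]  # update centroid sums
                labels.append(i)
                break
        else:
            clusters.append([x, y, 1])                 # start a new cluster
            labels.append(len(clusters) - 1)
    return labels, [(sx / n, sy / n) for sx, sy, n in clusters]
```

Because events arrive as a stream rather than full frames, this kind of incremental clustering can run per event slice, which is what makes the high effective frame rate possible.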
Proceedings ArticleDOI
Adaptive Loss-Aware Quantization for Multi-Bit Networks
TL;DR: Adaptive Loss-aware Quantization (ALQ) is proposed: a new multi-bit network (MBN) quantization pipeline that achieves an average bitwidth below one bit without notable loss in inference accuracy.
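Multi-bit networks represent each weight group as a sum of scaled binary bases, w ≈ Σᵢ αᵢ·bᵢ with bᵢ ∈ {−1, +1}ⁿ. A minimal sketch of the generic greedy residual binarization this representation builds on; ALQ itself additionally adapts the number of bases per group with a loss-aware criterion, which is omitted here:

```python
def multibit_quantize(weights, num_bits):
    """Greedy multi-bit binary decomposition of a weight vector.

    At each step, the sign pattern of the residual is the binary base,
    and the mean absolute residual is the least-squares optimal scale.
    Returns (alphas, bases).
    """
    residual = list(weights)
    alphas, bases = [], []
    n = len(weights)
    for _ in range(num_bits):
        b = [1.0 if r >= 0 else -1.0 for r in residual]
        # alpha minimizing ||residual - alpha * b||^2 for fixed b
        alpha = sum(abs(r) for r in residual) / n
        alphas.append(alpha)
        bases.append(b)
        residual = [r - alpha * bi for r, bi in zip(residual, b)]
    return alphas, bases

def reconstruct(alphas, bases):
    """Rebuild the approximate weights from the binary expansion."""
    out = [0.0] * len(bases[0])
    for a, b in zip(alphas, bases):
        for i, bi in enumerate(b):
            out[i] += a * bi
    return out
```

Each added base fits the remaining residual, so reconstruction error decreases monotonically with the number of bits; an average bitwidth below one bit then comes from pruning bases where the loss tolerates it.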
Posted ContentDOI
Measuring what Really Matters: Optimizing Neural Networks for TinyML
TL;DR: This work addresses the challenges of bringing machine learning to MCUs, focusing on the ubiquitous ARM Cortex-M architecture, and proposes implementation-aware design as a cost-effective method for verification and benchmarking.
Journal ArticleDOI
Event-Based Robotic Grasping Detection With Neuromorphic Vision Sensor and Event-Grasping Dataset
TL;DR: A deep neural network for grasping detection is developed that treats angle learning as classification rather than regression. It achieves high detection accuracy on the Event-Grasping dataset, with 93% precision at an object-wise level split.
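Treating the grasp angle as classification means discretizing the orientation into bins, which sidesteps the wrap-around discontinuity that hurts direct regression. A minimal sketch of the binning, assuming 180° grasp-rectangle symmetry; the bin count of 18 is an illustrative choice, not the paper's exact setting:

```python
import math

NUM_BINS = 18  # illustrative: 10-degree orientation bins

def angle_to_class(theta, num_bins=NUM_BINS):
    """Map a grasp angle (radians) to a discrete orientation class.

    Grasp rectangles are unchanged by a 180-degree rotation, so
    angles are folded into [0, pi) before binning.
    """
    theta = theta % math.pi                       # fold by symmetry
    return int(theta / (math.pi / num_bins)) % num_bins

def class_to_angle(cls, num_bins=NUM_BINS):
    """Recover a representative angle: the centre of the bin."""
    return (cls + 0.5) * math.pi / num_bins
```

A network then predicts a softmax over the bins, and the quantization error is bounded by half a bin width, here π/36.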