
Article Information

  • Title: A Spiking Neural Network Model of Depth from Defocus for Event-based Neuromorphic Vision
  • Authors: Germain Haessig; Xavier Berthelon; Sio-Hoi Ieng
  • Journal: Scientific Reports
  • Electronic ISSN: 2045-2322
  • Year: 2019
  • Volume: 9
  • Issue: 1
  • Pages: 1-11
  • DOI: 10.1038/s41598-019-40064-0
  • Publisher: Springer Nature
  • Abstract: Depth from defocus is an important mechanism that enables vision systems to perceive depth. While machine vision has developed several algorithms to estimate depth from the amount of defocus present at the focal plane, existing techniques are slow, energy demanding, and mainly rely on numerous acquisitions and massive amounts of filtering operations on the pixels' absolute luminance values. Recent advances in neuromorphic engineering offer an alternative, using event-based silicon retinas and neural processing devices inspired by the organizing principles of the brain. In this paper, we present a low-power, compact and computationally inexpensive setup to estimate depth in a 3D scene in real time at high rates, one that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. Exploiting the high temporal resolution of the event-based silicon retina, we are able to extract depth at 100 Hz for a power budget below 200 mW (10 mW for the camera, 90 mW for the liquid lens and ~100 mW for the computation). We validate the model with experimental results, highlighting features that are consistent with both computational neuroscience and recent findings in retinal physiology. We demonstrate its efficiency with a prototype of a neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological depth-from-defocus experiments reported in the literature.
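The geometric principle behind the setup the abstract describes can be sketched in a few lines. This is an illustrative assumption, not the paper's spiking implementation: a liquid lens sweeps its focal length, events from the silicon retina are assumed to peak when a point passes through best focus, and the thin-lens equation then yields that point's depth. The function name `depth_from_focal`, the `sensor_dist` parameter, and the sweep values are hypothetical.

```python
def depth_from_focal(f, sensor_dist):
    """Solve the thin-lens equation 1/f = 1/d + 1/sensor_dist for the
    object depth d, given the lens focal length f (metres) at the
    instant of sharpest focus and a fixed lens-to-sensor distance."""
    return 1.0 / (1.0 / f - 1.0 / sensor_dist)

# Hypothetical focal sweep over one 100 Hz cycle: the liquid lens
# steps through focal lengths, and the event burst is assumed to
# peak at index `peak` (in the real system, where event activity
# is maximal as the scene point crosses the focal plane).
sweep = [1.0 / (50.0 + k) for k in range(10)]  # focal lengths in metres
peak = 1                                        # index of maximal event activity
depth = depth_from_focal(sweep[peak], sensor_dist=0.02)
```

With a sensor 2 cm behind the lens and `sweep[1] = 1/51` m, the recovered depth is 1 m; in the paper's system this per-point readout happens once per sweep, which is what allows depth at 100 Hz.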
Copyright © National Center for Philosophy and Social Sciences Documentation