Time-Interval-Guided Event Representation for Scene Understanding

The recovery of scenes under extreme lighting conditions is pivotal for effective image analysis and feature detection. Traditional cameras struggle with low dynamic range and limited spectral response in such scenarios. In this paper, we advocate the adoption of event cameras to reconstruct static scenes, particularly those under low illumination. We introduce a new method to explain the phenomenon whereby event cameras continue to generate events even in the absence of brightness changes, highlighting the crucial role that noise plays in this process. Furthermore, we show that events predominantly occur in pairs and establish a correlation between the time interval of an event pair and the relative light intensity of the scene. A key contribution of our work is a method that converts sparse event streams into dense intensity frames without depending on any active light source or motion, achieving static imaging with event cameras. This method expands the application of event cameras to static vision tasks such as HDR imaging and leads to a practical application. The feasibility of our method was demonstrated through multiple experiments.
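
The abstract's central idea (inferring relative intensity from the time interval of paired events recorded on a static scene, then assembling a dense frame) can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the authors' algorithm: it assumes events arrive as (t, x, y, polarity) tuples, pairs each ON event with the next OFF event at the same pixel, and, purely for illustration, maps the reciprocal of the median pair interval to a relative intensity; the event format, the pairing rule, and the interval-to-intensity mapping are all assumptions.

import numpy as np

def intensity_frame_from_intervals(events, height, width, eps=1e-9):
    """Build a dense relative-intensity frame from a static-scene event stream.

    events: iterable of (t, x, y, p) with timestamp t in seconds, pixel (x, y),
            and polarity p in {+1, -1}.
    Illustrative assumption: events at a pixel arrive predominantly as ON/OFF
    pairs, and relative intensity varies monotonically with the pair interval
    (here modeled as the reciprocal of the median interval).
    """
    # Collect per-pixel time intervals of candidate ON/OFF event pairs.
    last_on = {}                      # pixel -> timestamp of last unmatched ON event
    intervals = [[[] for _ in range(width)] for _ in range(height)]

    for t, x, y, p in sorted(events, key=lambda e: e[0]):
        if p > 0:
            last_on[(x, y)] = t       # remember ON event, wait for its OFF partner
        elif (x, y) in last_on:
            intervals[y][x].append(t - last_on.pop((x, y)))

    # Map the median pair interval at each pixel to a relative intensity.
    frame = np.zeros((height, width), dtype=np.float64)
    for y in range(height):
        for x in range(width):
            if intervals[y][x]:
                frame[y, x] = 1.0 / (np.median(intervals[y][x]) + eps)

    # Normalize to [0, 1] for display; pixels with no pairs stay at 0.
    if frame.max() > 0:
        frame /= frame.max()
    return frame

For a 640 by 480 sensor one would call intensity_frame_from_intervals(events, 480, 640); any monotonic interval-to-intensity mapping could be substituted for the reciprocal used in this sketch.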

Bibliographic Details
Main Authors: Boxuan Wang, Wenjun Yang, Kunqi Wu, Rui Yang, Jiayue Xie, Huixiang Liu
Format: Article
Language: English
Published: MDPI AG, 2025-05-01
Series: Sensors
Subjects: event camera; static imaging; time interval; intensity frames
Online Access: https://www.mdpi.com/1424-8220/25/10/3186
Collection: DOAJ
Record ID: doaj-art-aa7341da1e8049d89e6d6d233cfe9a8a
Institution: Kabale University
ISSN: 1424-8220
Author Affiliation: School of Automation, Beijing Information Science and Technology University, Beijing 102206, China (all authors)
Citation: Sensors, Vol. 25, No. 10, Article 3186 (2025-05-01)
DOI: 10.3390/s25103186