Adaptive enhancement and noise reduction in very low light-level video

Author

Summary, in English

A general methodology for noise reduction and contrast enhancement in very noisy image data with low dynamic range is presented. Video footage recorded in very dim light is especially targeted. Smoothing kernels that automatically adapt to the local spatio-temporal intensity structure in the image sequences are constructed in order to preserve and enhance fine spatial detail and prevent motion blur. In color image data, the chromaticity is restored and demosaicing of raw RGB input data is performed simultaneously with the noise reduction. The method is very general, contains few user-defined parameters and has been developed for efficient parallel computation using a GPU. The technique has been applied to image sequences with various degrees of darkness and noise levels, and results from some of these tests, and comparisons to other methods, are presented. The present work has been inspired by research on vision in nocturnal animals, particularly the spatial and temporal visual summation that allows these animals to see in dim light.
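The core idea of structure-adaptive spatio-temporal smoothing can be illustrated with a simplified sketch. The code below is not the authors' method (their kernels adapt to local intensity structure in a more sophisticated way and run on a GPU); it is a minimal 3-D bilateral-style filter in the same spirit, where neighbours that are close in space-time and similar in intensity receive high weight, so flat noisy regions are averaged strongly while edges and moving structures are largely preserved. The function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def adaptive_st_filter(video, radius=1, sigma_s=1.0, sigma_r=0.1):
    """Simplified structure-adaptive spatio-temporal smoothing.

    video: float array of shape (T, H, W), values roughly in [0, 1].
    radius: half-width of the cubic space-time neighbourhood.
    sigma_s: spatio-temporal closeness scale.
    sigma_r: intensity-similarity scale (smaller -> more edge-preserving).
    """
    T, H, W = video.shape
    pad = np.pad(video, radius, mode="edge")
    acc = np.zeros_like(video)
    norm = np.zeros_like(video)
    for dt in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                shifted = pad[radius + dt:radius + dt + T,
                              radius + dy:radius + dy + H,
                              radius + dx:radius + dx + W]
                # Gaussian weight for closeness in space and time.
                g = np.exp(-(dt * dt + dy * dy + dx * dx)
                           / (2.0 * sigma_s ** 2))
                # Intensity-similarity weight: this is what makes the
                # smoothing adapt to local structure and avoid blurring
                # edges or moving objects.
                r = np.exp(-((shifted - video) ** 2)
                           / (2.0 * sigma_r ** 2))
                w = g * r
                acc += w * shifted
                norm += w
    return acc / norm
```

On a noisy step-edge sequence, the filter suppresses noise in the flat regions while the weight term `r` keeps pixels from the other side of the edge from mixing in, which is the qualitative behaviour the abstract describes.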

Publishing year

2007

Language

English

Pages

1395-1402

Publication/Series

11th International Conference on Computer Vision, 2007

Document type

Conference paper

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.

Topic

  • Computer Science
  • Computer Vision and Robotics (Autonomous Systems)

Conference name

IEEE 11th International Conference on Computer Vision (ICCV 2007)

Conference date

2007-10-14 - 2007-10-21

Conference place

Rio de Janeiro, Brazil

Status

Published

ISBN/ISSN/Other

  • ISSN: 1550-5499
  • ISBN: 978-1-4244-1631-8