Repository hosted by TU Delft Library


Augmenting full colour-fused multi-band night vision imagery with synthetic imagery in real-time

Publication files not available online.

Author: Toet, A. · Hogervorst, M.A. · Son, R. van · Dijk, J.
Type: article
Date: 2011
Source:International Journal of Image and Data Fusion, 4, 2, 287-308
Identifier: 441618
DOI: 10.1080/19479832.2011.598135
Keywords: Vision · augmented reality · image fusion · false colour · natural colour mapping · real-time fusion · night vision · color · Human · Organisation · Physics & Electronics · PCS - Perceptual and Cognitive Systems · MSG - Modelling Simulation & Gaming · II - Intelligent Imaging · BSS - Behavioural and Societal Sciences · TS - Technical Sciences

Abstract

We present the design and first field trial results of an all-day, all-weather enhanced and synthetic-fused multi-band colour night vision surveillance and observation system. The system augments a fused, dynamic three-band natural-colour night vision image with synthetic 3D imagery in real time. The night vision sensor suite consists of three cameras, sensitive in, respectively, the visual (400–700 nm), the near-infrared (NIR, 700–1000 nm) and the long-wave infrared (LWIR, 8–14 μm) bands of the electromagnetic spectrum. The optical axes of the three cameras are aligned. Image quality of the fused sensor signals is enhanced in real time through dynamic noise reduction, super resolution and local adaptive contrast enhancement. The quality of the LWIR image is enhanced through scene-based non-uniformity correction. The visual and NIR signals are used to represent the fused multi-band night vision image in natural daytime colours, using the Colour-the-Night colour remapping technique. Colour remapping can also be deployed to enhance the visibility of thermal targets that are camouflaged in the visual and NIR range of the spectrum. The dynamic false-colour night-time images are augmented with corresponding synthetic 3D scene views, generated in real time using a geometric 3D scene model in combination with position and orientation information supplied by the Global Positioning System and inertial sensors of the system. Initial field trials show that this system provides enhanced situational information in various low-visibility conditions.
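The lookup-based natural colour mapping the abstract refers to can be illustrated with a minimal sketch. This is not the authors' Colour-the-Night implementation; it is a simplified, hypothetical NumPy version of the general idea: a 2D lookup table, indexed by quantised (visual, NIR) intensity pairs, stores the average daytime RGB colour observed for each pair in a co-registered daytime reference image, and night-time frames are then recoloured per pixel through that table. Function names, the bin count, and the averaging scheme are all illustrative assumptions.

```python
import numpy as np

def build_colour_lut(vis_ref, nir_ref, rgb_ref, bins=32):
    """Build a (bins, bins, 3) lookup table mapping a quantised
    (visual, NIR) intensity pair to the mean daytime RGB colour
    observed for that pair in a co-registered reference image set.
    All inputs are float arrays with intensities in [0, 1];
    rgb_ref has shape (H, W, 3)."""
    vi = np.clip((vis_ref * bins).astype(int), 0, bins - 1)
    ni = np.clip((nir_ref * bins).astype(int), 0, bins - 1)
    lut_sum = np.zeros((bins, bins, 3))
    lut_cnt = np.zeros((bins, bins, 1))
    # Accumulate colour sums and counts per (visual, NIR) bin.
    np.add.at(lut_sum, (vi, ni), rgb_ref)
    np.add.at(lut_cnt, (vi, ni), 1.0)
    return lut_sum / np.maximum(lut_cnt, 1.0)

def remap(vis, nir, lut):
    """Recolour a night-time (visual, NIR) frame pair by
    per-pixel lookup into the daytime colour table."""
    bins = lut.shape[0]
    vi = np.clip((vis * bins).astype(int), 0, bins - 1)
    ni = np.clip((nir * bins).astype(int), 0, bins - 1)
    return lut[vi, ni]
```

Because the remapping step is a single fancy-indexing operation, it runs in constant time per pixel, which is consistent with the real-time requirement the abstract states; empty bins simply return black here, whereas a practical system would interpolate or fall back to a default palette.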