Rapid Fog-Removal Strategies for Traffic Environments

In a foggy traffic environment, the vision-sensor signal of an intelligent vehicle is distorted, obstacle outlines become blurred, and color information on the traffic road is lost. To address this problem, four ultra-fast defogging strategies for traffic environments are proposed for the first time. Experiments show that Fast Defogging Strategy 3 is the most suitable for fast defogging in a traffic environment. This strategy shrinks the original foggy image by a factor of 256 via bilinear interpolation and performs defogging with the dark channel prior algorithm. The defogged image is then processed by 4x upsampling and a Gaussian transform. Compared with the original dark channel prior algorithm, image edges are clearer, color information is enhanced, and the defogging time is reduced by 83.93-84.92%. The defogged images are then fed into the YOLOv4, YOLOv5, YOLOv6, and YOLOv7 object detection algorithms for verification, which shows that vehicles and pedestrians in a complex traffic environment can be detected effectively after fog removal. The experimental results demonstrate that the fast defogging strategy is suitable for fast defogging in a traffic environment.
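The pipeline described in the abstract (bilinear downsampling, dark channel prior defogging, upsampling, Gaussian transform) can be sketched in Python with OpenCV. This is a minimal illustration under stated assumptions, not the authors' implementation: the per-side downsampling factor of 16 (roughly 256 times fewer pixels), the 4x upsampling, the Gaussian blur standing in for the "Gaussian transform", and the file names are all assumptions; the defogging step follows the standard dark channel prior formulation.

```python
import cv2
import numpy as np

def dark_channel(img, patch=15):
    """Per-pixel minimum over B, G, R, followed by a min-filter over a patch."""
    min_rgb = np.min(img, axis=2)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    return cv2.erode(min_rgb, kernel)

def atmospheric_light(img, dark):
    """Average the brightest 0.1% of pixels, ranked by the dark channel."""
    top = max(dark.size // 1000, 1)
    idx = np.argsort(dark.ravel())[-top:]
    return img.reshape(-1, 3)[idx].mean(axis=0)

def dehaze_dcp(img, omega=0.95, t0=0.1, patch=15):
    """Basic dark channel prior dehazing on a float image in [0, 1]."""
    dark = dark_channel(img, patch)
    A = atmospheric_light(img, dark)
    # Transmission estimate from the dark channel of the normalized image.
    t = 1.0 - omega * dark_channel(img / A, patch)
    t = np.clip(t, t0, 1.0)[..., None]
    # Scene radiance recovery.
    return np.clip((img - A) / t + A, 0.0, 1.0)

def fast_defog_strategy3(bgr, down_factor=16, up_factor=4, sigma=1.0):
    """Sketch of Strategy 3: bilinear downsample -> DCP dehaze ->
    4x upsample -> Gaussian smoothing. All factors are assumptions."""
    img = bgr.astype(np.float32) / 255.0
    h, w = img.shape[:2]
    # Bilinear downsampling (16 per side, about 256x fewer pixels).
    small = cv2.resize(img, (w // down_factor, h // down_factor),
                       interpolation=cv2.INTER_LINEAR)
    dehazed = dehaze_dcp(small)
    # 4x upsampling of the dehazed result.
    up = cv2.resize(dehazed, None, fx=up_factor, fy=up_factor,
                    interpolation=cv2.INTER_LINEAR)
    # "Gaussian transform" interpreted here as a Gaussian blur.
    out = cv2.GaussianBlur(up, (0, 0), sigmaX=sigma)
    return (out * 255).astype(np.uint8)

if __name__ == "__main__":
    foggy = cv2.imread("foggy_road.jpg")  # hypothetical input path
    cv2.imwrite("defogged_road.jpg", fast_defog_strategy3(foggy))
```

The resulting image could then be passed to a YOLO detector for the vehicle and pedestrian verification step mentioned in the abstract.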

Media type:

Electronic article

Year of publication:

2023

Published:

2023

Contained in:

See overall record - volume: 23

Contained in:

Sensors (Basel, Switzerland) - 23(2023), issue 17, 29 Aug.

Language:

English

Contributors:

Liu, Xinchao [Author]
Hong, Liang [Author]
Lin, Yier [Author]

Links:

Full text

Subjects:

Bilinear interpolation
Dark channel prior
Fast defogging strategy
Gaussian transform
Journal Article
Upsampling

Notes:

Date Revised 12.09.2023

published: Electronic

Citation Status PubMed-not-MEDLINE

DOI:

10.3390/s23177506

Funding:

Funding institution / project title:

PPN (catalogue ID):

NLM361854552