Camera-based advanced driver assistance with integrated YOLOv4 for real-time detection

International Journal of Artificial Intelligence

Abstract

Testing object detection in adverse weather conditions poses significant challenges. This paper presents a framework for a camera-based advanced driver assistance system (ADAS) using the YOLOv4 model, supported by an electronic control unit (ECU). The ADAS-based ECU identifies object classes from real-time video, with detection efficiency validated against the YOLOv4 model. Performance is analysed using three testing methods: projection, video injection, and real vehicle testing. Each method is evaluated for accuracy in object detection, synchronization rate, correlated outcomes, and computational complexity. Results show that the projection method achieves the highest accuracy with minimal frame deviation (1-2 frames) and up to 90% correlated outcomes, at approximately 30% computational complexity. The video injection method shows moderate accuracy and complexity, with a frame deviation of 3-4 frames and 75% correlated outcomes. The real vehicle testing method, though demanding higher computational resources and showing a lower synchronization rate (> 5 frames deviation), provides critical insights under realistic weather conditions despite higher misclassification rates. The study highlights the importance of choosing the appropriate method based on testing conditions and objectives, balancing computational efficiency, synchronization accuracy, and robustness in various weather scenarios. This research significantly advances autonomous vehicle technology, particularly in enhancing ADAS object detection capabilities in diverse environmental conditions.
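Since the abstract reports frame deviation and correlated-outcome rates only at a summary level, the following minimal Python sketch illustrates, under assumed data structures that are not the authors' implementation, how such metrics could be computed when comparing ECU detections against reference YOLOv4 detections. The `Detection` class, the matching rule, and the `max_dev` threshold are illustrative assumptions.

```python
# Hypothetical sketch: comparing ECU detections against reference YOLOv4
# detections to estimate frame deviation (synchronization) and the share of
# correlated outcomes. Data structures and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Detection:
    frame: int   # frame index in the video stream
    label: str   # detected object class, e.g. "car", "pedestrian"

def frame_deviation(ecu: Detection, ref: Detection) -> int:
    """Absolute frame offset between an ECU detection and its reference."""
    return abs(ecu.frame - ref.frame)

def correlated_outcome_rate(ecu_dets, ref_dets, max_dev=2):
    """Fraction of reference detections matched by an ECU detection of the
    same class within max_dev frames (illustrative matching rule)."""
    matched = 0
    for ref in ref_dets:
        if any(d.label == ref.label and frame_deviation(d, ref) <= max_dev
               for d in ecu_dets):
            matched += 1
    return matched / len(ref_dets) if ref_dets else 0.0

# Example: projection-style test data with small frame offsets
ref = [Detection(10, "car"), Detection(25, "pedestrian"), Detection(40, "truck")]
ecu = [Detection(11, "car"), Detection(27, "pedestrian"), Detection(46, "car")]
print(correlated_outcome_rate(ecu, ref, max_dev=2))  # ~0.67: truck missed by the ECU
```

In this reading, a tighter `max_dev` corresponds to the stricter synchronization reported for the projection method, while larger deviations (as in real vehicle testing) reduce the correlated-outcome rate.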
