Real-Time 360-Degree Surround View System Using Multi-Camera Image Fusion for Autonomous Driving

Conference proceedings article


Authors/Editors


Strategic Research Themes


Publication Details

Author list: Benjamas Panomruttanarug, Chanwut Sanpetvessakul, Muttreeyaporn Munkong

Publication year: 2025


Abstract

This paper presents a 360-degree surround-view generation method using projective transformation and homography estimation from multiple wide-angle cameras. Each camera was individually calibrated, and homography matrices were computed by matching detected chessboard corners to predefined templates in a unified bird's-eye view (BEV) layout. To achieve seamless integration, the BEV canvas was divided into eight logical regions, and a distance-based spatial blending technique was applied in overlapping areas. This blending strategy computed pixel-wise weights based on the Euclidean distance to the outer boundaries of each camera's valid projection zone, effectively minimizing visual seams, ghosting, and illumination artifacts. The system demonstrated consistent performance under both daytime and nighttime conditions. With an average processing time of approximately 0.28 seconds per frame, the proposed framework enables near real-time operation on embedded platforms and provides a robust foundation for downstream perception tasks such as road extraction and autonomous navigation. Code to reproduce our results is available at: https://github.com/souldeathz/360-degree-Surround-View-application.
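The distance-based blending described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: each camera contributes a pixel weight proportional to that pixel's Euclidean distance from the edge of the camera's valid projection zone, so weights taper to zero at seams and sum to one in overlaps. The function names (`distance_to_invalid`, `blend`) are hypothetical, and the brute-force distance computation stands in for a fast distance transform (e.g. `scipy.ndimage.distance_transform_edt`) that a real system would use.

```python
import numpy as np

def distance_to_invalid(mask):
    """Euclidean distance from each valid pixel to the nearest pixel
    outside the mask (0 for pixels outside the mask). Brute force for
    clarity; a production system would use a fast distance transform."""
    h, w = mask.shape
    invalid = np.argwhere(~mask)
    if invalid.size == 0:
        # Whole canvas valid: uniform (arbitrary positive) distance.
        return np.ones((h, w))
    dist = np.zeros((h, w))
    for y, x in np.argwhere(mask):
        dist[y, x] = np.sqrt(((invalid - (y, x)) ** 2).sum(axis=1)).min()
    return dist

def blend(images, masks):
    """Fuse per-camera BEV projections with distance-based weights.

    images: list of HxWx3 float arrays (each camera's warped BEV image)
    masks:  list of HxW boolean arrays (each camera's valid projection zone)
    Returns the fused image and the per-camera weight maps.
    """
    dists = np.stack([distance_to_invalid(m) for m in masks])
    total = dists.sum(axis=0)
    total[total == 0] = 1.0        # avoid divide-by-zero outside all zones
    weights = dists / total        # pixel-wise weights sum to 1 in overlaps
    fused = np.zeros_like(images[0], dtype=float)
    for w_i, img in zip(weights, images):
        fused += w_i[..., None] * img
    return fused, weights
```

In an overlap region the weight ratio between two cameras equals the ratio of their boundary distances, which is what suppresses hard seams and ghosting at zone edges.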


Keywords



Last updated on 2026-03-18 at 12:00