
Vaghi, M., Ballardini, A., Fontana, S., Sorrenti, D. G. (2024). Uncertainty-Aware DNN for Multi-Modal Camera Localization. In Proceedings of the 21st International Conference on Informatics in Control, Automation and Robotics - (Volume 2) (pp. 80-90). Science and Technology Publications, Lda [10.5220/0013064600003822].

Uncertainty-Aware DNN for Multi-Modal Camera Localization

Vaghi, M. (first author); Ballardini, A.; Fontana, S.; Sorrenti, D. G. (last author)
2024

Abstract

Camera localization, i.e., camera pose regression, is an important task in computer vision with many practical applications, such as the localization of intelligent vehicles. Having reliable estimates of the regression uncertainties is also important, as it would allow us to catch dangerous localization failures. In the literature, uncertainty estimation in Deep Neural Networks (DNNs) is often performed through sampling methods, such as Monte Carlo Dropout (MCD) and Deep Ensemble (DE), at the expense of undesirable execution time or an increase in hardware resources. In this work, we considered an uncertainty estimation approach named Deep Evidential Regression (DER) that avoids any sampling technique, providing direct uncertainty estimates. Our goal is to provide a systematic approach to intercept localization failures of camera localization systems based on DNN architectures, by analyzing the generated uncertainties. We propose to exploit CMRNet, a DNN approach for multi-modal image-to-LiDAR-map registration, by modifying its internal configuration to allow for extensive experimental activity on two different datasets. The experimental section highlights CMRNet's major flaws and shows that our proposal not only preserves the original localization performance, but also provides the introspection measures that would allow end-users to act accordingly.
Type: paper
Keywords: Camera Localization; Deep Learning; Uncertainty Estimation
Language: English
Conference: 21st International Conference on Informatics in Control, Automation and Robotics, ICINCO 2024, November 18-20, 2024
Conference year: 2024
Editors: Gini, G.; Precup, R. E.; Filev, D.
Proceedings: Proceedings of the 21st International Conference on Informatics in Control, Automation and Robotics - (Volume 2)
ISBN: 9789897587177
Publication year: 2024
Volume: 2
Pages: 80-90
URL: https://www.scitepress.org/ProceedingsDetails.aspx?ID=haumXcKckHM=&t=1
none
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/552572
Citations
  • Scopus 0
  • Web of Science ND