EconPapers

Deep Unsupervised Learning for Simultaneous Visual Odometry and Depth Estimation: A Novel Approach

Aymen Said

No 56ngs, OSF Preprints from Center for Open Science

Abstract: This article presents a novel approach to simultaneous visual odometry and depth estimation using deep unsupervised learning. The proposed method uses deep neural networks to learn representations of visual data and to estimate both camera motion and scene depth without ground-truth annotations. By formulating the problem as a self-supervised learning task, the network learns to extract meaningful features and infer depth from monocular images. Experimental results on several datasets demonstrate the effectiveness and accuracy of the proposed approach in real-world scenarios.
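The abstract does not give implementation details, but the self-supervision it describes is typically built around a photometric reprojection loss: predicted depth and predicted relative camera pose are used to warp one monocular frame onto another, and the pixel-wise difference supervises both networks. Below is a minimal NumPy sketch of that loss under assumed pinhole intrinsics K and a 4x4 relative pose T; the function names are illustrative and not taken from the paper, and nearest-neighbour sampling stands in for the differentiable bilinear sampling used in practice.

```python
import numpy as np

def backproject(depth, K_inv):
    """Lift every pixel to a 3D point in the target camera frame."""
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=0).reshape(3, -1)  # (3, H*W) homogeneous pixels
    return (K_inv @ pix) * depth.reshape(1, -1)                     # scale rays by predicted depth

def project(points, K, T):
    """Transform 3D points by relative pose T, then project with intrinsics K."""
    pts_h = np.vstack([points, np.ones((1, points.shape[1]))])      # homogeneous 3D points
    cam = (T @ pts_h)[:3]
    pix = K @ cam
    return pix[:2] / np.clip(pix[2:], 1e-6, None)                   # perspective divide

def photometric_loss(target, source, depth, K, T):
    """Mean absolute photometric error after warping `source` into the target view."""
    H, W = target.shape
    uv = project(backproject(depth, np.linalg.inv(K)), K, T)
    u = np.clip(np.round(uv[0]).astype(int), 0, W - 1).reshape(H, W)
    v = np.clip(np.round(uv[1]).astype(int), 0, H - 1).reshape(H, W)
    warped = source[v, u]   # nearest-neighbour sampling; bilinear in a real pipeline
    return np.mean(np.abs(target - warped))
```

Because the loss needs no ground-truth depth or pose, gradients through it can train both the depth network and the pose network jointly; a sanity check is that an identity pose with the source frame equal to the target frame yields zero loss.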

Date: 2023-05-20
New Economics Papers: this item is included in nep-cmp and nep-mfd

Downloads: (external link)
https://osf.io/download/6468aebda97a7209eb0b77a4/


Persistent link: https://EconPapers.repec.org/RePEc:osf:osfxxx:56ngs

DOI: 10.31219/osf.io/56ngs

More papers in OSF Preprints from Center for Open Science
Bibliographic data for series maintained by OSF.

Page updated 2025-03-19
Handle: RePEc:osf:osfxxx:56ngs