 

Product 3D virtual display scene modelling based on augmented reality technology

Heng Luo

International Journal of Product Development, 2022, vol. 26, issue 1/2/3/4, 131-142

Abstract: To overcome the low precision and long runtimes of traditional methods, a 3D virtual display scene modelling method based on augmented reality technology is proposed. A camera captures the product's 3D virtual display scene image, and the scene's 3D coordinates are obtained using the 3D registration technology of augmented reality. The real product scene image and the virtual position are synthesised in the virtual environment according to the acquired 3D spatial coordinates, and the geometric distortion introduced during synthesis is corrected. Using the corrected 3D virtual display scene coordinates, the product scene image is binarised, the product image contour is extracted, and the 3D virtual display scene of the product is generated. Simulation results show that the method's modelling accuracy remains at 95% and the modelling time is only 4 s.
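The abstract's final step, binarising the product scene image and extracting the image contour, can be sketched in plain NumPy. The paper does not specify its algorithms, so the Otsu threshold and 4-neighbour boundary trace below are illustrative assumptions, not the author's method:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return an Otsu threshold for an 8-bit grayscale image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    probs = hist / hist.sum()
    omega = np.cumsum(probs)                # cumulative class probability
    mu = np.cumsum(probs * np.arange(256))  # cumulative class mean
    mu_t = mu[-1]                           # global mean
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan              # ignore degenerate splits
    sigma_b = (mu_t * omega - mu) ** 2 / denom  # between-class variance
    return int(np.nanargmax(sigma_b))

def binarise_and_outline(gray: np.ndarray):
    """Binarise an image and keep only the foreground boundary pixels."""
    t = otsu_threshold(gray)
    binary = gray > t
    # A contour pixel is a foreground pixel with at least one
    # background 4-neighbour (erosion residue).
    padded = np.pad(binary, 1)
    eroded = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
              padded[1:-1, :-2] & padded[1:-1, 2:]) & binary
    contour = binary & ~eroded
    return binary.astype(np.uint8), contour.astype(np.uint8)
```

On a synthetic frame containing a bright product region against a dark background, `binarise_and_outline` returns a binary mask and the one-pixel-wide outline of that region, which is the kind of contour the abstract describes feeding into scene generation.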

Keywords: augmented reality technology; 3D virtual; display scene; 3D registration technology; anti-perspective transformation method.
Date: 2022

Downloads: (external link)
http://www.inderscience.com/link.php?id=125345 (text/html)
Access to full text is restricted to subscribers.



Persistent link: https://EconPapers.repec.org/RePEc:ids:ijpdev:v:26:y:2022:i:1/2/3/4:p:131-142


More articles in International Journal of Product Development from Inderscience Enterprises Ltd
Bibliographic data for series maintained by Sarah Parker.

 
Handle: RePEc:ids:ijpdev:v:26:y:2022:i:1/2/3/4:p:131-142