EconPapers

Towards real-time photorealistic 3D holography with deep neural networks

Liang Shi, Beichen Li, Changil Kim, Petr Kellnhofer and Wojciech Matusik
Additional contact information
Liang Shi: Massachusetts Institute of Technology
Beichen Li: Massachusetts Institute of Technology
Changil Kim: Massachusetts Institute of Technology
Petr Kellnhofer: Massachusetts Institute of Technology
Wojciech Matusik: Massachusetts Institute of Technology

Nature, 2021, vol. 591, issue 7849, 234-239

Abstract: The ability to present three-dimensional (3D) scenes with continuous depth sensation has a profound impact on virtual and augmented reality, human–computer interaction, education and training. Computer-generated holography (CGH) enables high-spatio-angular-resolution 3D projection via numerical simulation of diffraction and interference[1]. Yet, existing physically based methods fail to produce holograms with both per-pixel focal control and accurate occlusion[2,3]. The computationally taxing Fresnel diffraction simulation further places an explicit trade-off between image quality and runtime, making dynamic holography impractical[4]. Here we demonstrate a deep-learning-based CGH pipeline capable of synthesizing a photorealistic colour 3D hologram from a single RGB-depth image in real time. Our convolutional neural network (CNN) is extremely memory efficient (below 620 kilobytes) and runs at 60 hertz for a resolution of 1,920 × 1,080 pixels on a single consumer-grade graphics processing unit. Leveraging low-power on-device artificial intelligence acceleration chips, our CNN also runs interactively on mobile (iPhone 11 Pro at 1.1 hertz) and edge (Google Edge TPU at 2.0 hertz) devices, promising real-time performance in future-generation virtual and augmented-reality mobile headsets. We enable this pipeline by introducing a large-scale CGH dataset (MIT-CGH-4K) with 4,000 pairs of RGB-depth images and corresponding 3D holograms. Our CNN is trained with differentiable wave-based loss functions[5] and physically approximates Fresnel diffraction. With an anti-aliasing phase-only encoding method, we experimentally demonstrate speckle-free, natural-looking, high-resolution 3D holograms.
Our learning-based approach and the Fresnel hologram dataset will help to unlock the full potential of holography and enable applications in metasurface design[6,7], optical and acoustic tweezer-based microscopic manipulation[8–10], holographic microscopy[11] and single-exposure volumetric 3D printing[12,13].
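The abstract identifies direct Fresnel diffraction simulation as the runtime bottleneck that the CNN learns to approximate. As background, a minimal angular-spectrum propagator (one standard way to simulate Fresnel diffraction numerically) can be sketched as follows; this is illustrative only, not the authors' implementation, and the function and parameter names are assumptions:

```python
import numpy as np

def fresnel_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex wave field over `distance` metres using the
    angular-spectrum method, a standard numerical model of Fresnel
    diffraction. All units are metres; names are illustrative."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)  # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    fxx, fyy = np.meshgrid(fx, fy)
    # Squared z-component of the unit wave vector; negative values
    # correspond to evanescent waves, which do not reach the target plane.
    kz2 = 1.0 - (wavelength * fxx) ** 2 - (wavelength * fyy) ** 2
    transfer = np.where(
        kz2 > 0,
        np.exp(2j * np.pi * distance / wavelength * np.sqrt(np.maximum(kz2, 0.0))),
        0.0,
    )
    return np.fft.ifft2(np.fft.fft2(field) * transfer)
```

Evaluating such a propagator per colour channel and per depth layer at 1,920 × 1,080 pixels is what makes conventional CGH computationally taxing; the paper's CNN replaces this simulation with a single learned forward pass.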

Date: 2021
Citations: View citations in EconPapers (9)

Downloads: (external link)
https://www.nature.com/articles/s41586-020-03152-0 Abstract (text/html)
Access to the full text of the articles in this series is restricted.



Persistent link: https://EconPapers.repec.org/RePEc:nat:nature:v:591:y:2021:i:7849:d:10.1038_s41586-020-03152-0

Ordering information: This journal article can be ordered from
https://www.nature.com/

DOI: 10.1038/s41586-020-03152-0


Nature is currently edited by Magdalena Skipper

Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-03-19
Handle: RePEc:nat:nature:v:591:y:2021:i:7849:d:10.1038_s41586-020-03152-0