ACM SIGGRAPH 2022 (Transactions on Graphics)

 
Egocentric Scene Reconstruction from an Omnidirectional Video
 
             
Hyeonjoong Jang, Andréas Meuleman, Dahyun Kang, Donggun Kim, Christian Richardt, Min H. Kim

KAIST    University of Bath
 
   
We introduce a practical reconstruction method for 3D scene geometry from short handheld omnidirectional videos. (a) Example video frame captured by a 360° camera (inset). (b) An inverse depth frame estimated by our spherical disparity estimation. (c) To reconstruct egocentric scene geometry effectively from a short omnidirectional video, we devise a scene reconstruction method using a novel spherical binoctree data structure. (d) The reconstructed 3D scene geometry. (e) 3D rendering of the reconstructed scene with our texture mapping. Please see our supplemental video for additional results and comparisons.

Fast forward video at ACM SIGGRAPH 2022

Presentation video at ACM SIGGRAPH 2022

Supplemental video results
  Abstract
   
 

Omnidirectional videos capture environmental scenes effectively, but they have rarely been used for geometry reconstruction. In this work, we propose an egocentric 3D reconstruction method that can acquire scene geometry with high accuracy from a short egocentric omnidirectional video. To this end, we first estimate per-frame depth using a spherical disparity network. We then fuse per-frame depth estimates into a novel spherical binoctree data structure that is specifically designed to tolerate spherical depth estimation errors. By subdividing the spherical space into binary tree and octree nodes that represent spherical frustums adaptively, the spherical binoctree effectively enables egocentric surface geometry reconstruction for environmental scenes while simultaneously assigning high-resolution nodes to closely observed surfaces. This allows us to reconstruct an entire scene from a short video captured with a small camera trajectory. Experimental results validate the effectiveness and accuracy of our approach for reconstructing the 3D geometry of environmental scenes from short egocentric omnidirectional video inputs. We further demonstrate various applications using a conventional omnidirectional camera, including novel-view synthesis, object insertion, and relighting of scenes using reconstructed 3D models with texture.
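
The spherical binoctree is the central data structure here. As a rough sketch of the idea (not the authors' implementation), the Python snippet below models a node as a spherical frustum bounded by azimuth, inclination, and radius ranges, and chooses between a binary split, assumed here to run along the radial direction, and an octree-style split into eight children based on the frustum's aspect ratio. The class and field names, the TSDF placeholders, and the 2:1 threshold are illustrative assumptions.

# Illustrative sketch only: a minimal spherical "binoctree" node, assuming a
# node covers a spherical frustum given by (azimuth, inclination, radius)
# ranges. Names, TSDF placeholders, and the 2:1 threshold are assumptions.
from dataclasses import dataclass, field

@dataclass
class SphericalFrustumNode:
    theta: tuple                 # (min, max) azimuth in radians
    phi: tuple                   # (min, max) inclination in radians
    r: tuple                     # (min, max) radius in meters
    children: list = field(default_factory=list)
    tsdf: float = 0.0            # fused signed-distance value (placeholder)
    weight: float = 0.0          # fusion weight (placeholder)

    def radial_extent(self) -> float:
        return self.r[1] - self.r[0]

    def angular_extent(self) -> float:
        # Approximate metric width of the angular footprint at the mean radius.
        r_mid = 0.5 * (self.r[0] + self.r[1])
        return r_mid * max(self.theta[1] - self.theta[0],
                           self.phi[1] - self.phi[0])

    def subdivide(self) -> None:
        # Binary split along the radius while the frustum is much deeper than
        # it is wide; otherwise an octree-style split into eight children.
        t0, t1 = self.theta
        p0, p1 = self.phi
        r0, r1 = self.r
        rm = 0.5 * (r0 + r1)
        if self.radial_extent() > 2.0 * self.angular_extent():
            self.children = [
                SphericalFrustumNode((t0, t1), (p0, p1), (r0, rm)),
                SphericalFrustumNode((t0, t1), (p0, p1), (rm, r1)),
            ]
        else:
            tm, pm = 0.5 * (t0 + t1), 0.5 * (p0 + p1)
            self.children = [
                SphericalFrustumNode((ta, tb), (pa, pb), (ra, rb))
                for (ta, tb) in ((t0, tm), (tm, t1))
                for (pa, pb) in ((p0, pm), (pm, p1))
                for (ra, rb) in ((r0, rm), (rm, r1))
            ]

# Example: a narrow angular cell reaching far from the camera is much deeper
# than it is wide, so it is split radially first; nearby, roughly cubic cells
# switch to octree splits as refinement proceeds.
node = SphericalFrustumNode((0.0, 0.05), (0.0, 0.05), (0.5, 50.0))
node.subdivide()                 # produces two radial children

Under this kind of adaptive subdivision, far-away space costs only cheap binary splits, while closely observed surfaces receive the dense, octree-refined nodes described in the abstract.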

   
  BibTeX
 
@Article{Egocentric:SIG:2022,
  author  = {Hyeonjoong Jang and Andréas Meuleman and Dahyun Kang and
             Donggun Kim and Christian Richardt and Min H. Kim},
  title   = {Egocentric Scene Reconstruction from an Omnidirectional Video},
  journal = {ACM Transactions on Graphics (Proc. SIGGRAPH 2022)},
  year    = {2022},
  volume  = {41},
  number  = {4},
}
   
   
Preprint paper: PDF (8.3 MB)
Supplemental document: PDF (4.6 MB)
SIGGRAPH 2022 slides: PDF (14.9 MB)
GitHub code & dataset
ACM Digital Library
 

Hosted by Visual Computing Laboratory, School of Computing, KAIST.
