EZCam: WYSWYG Camera Manipulator for Path Design
  Cheng-Chi Li     Yu-Chi Lai     Nai-Sheng Hsu     Hong-Nian Kuo     Meng-Ting Tsai     Chih-Yuan Yao  
Li, C.-C., Lai, Y.-C., Syu, N.-S., Kuo, H.-N., Todorov, D., and Yao, C.-Y., "EZCam: WYSWYG Camera Manipulator for Path Design," IEEE Transactions on Circuits and Systems for Video Technology, vol. 27, no. 8, pp. 1632-1646, 2017. DOI: 10.1109/TCSVT.2016.2543018.
Submitted material web page
Abstract

With advances in the movie industry, composite interactions and complex visual effects require shooting the designed parts of a scene to preserve immersion. Traditionally, the director of photography (DP) plans a camera path by iteratively reviewing and commenting on rendered path-planning results. Because this adjust-render-review process is neither immediate nor interactive, miscommunications arise, making the process ineffective and time-consuming. Therefore, this work proposes a What-You-See-What-You-Get camera path reviewing system that lets the director interactively instruct and design camera paths. Our system consists of a camera handle, a parameter control board, and a camera tracking box with mutually perpendicular marker planes. When the handle is manipulated, the attached camera captures the markers on the visible planes and, together with the selected parameters, adjusts the rendered view of the virtual world. The director can directly examine the results and give immediate comments and feedback on transformations and parameter adjustments, achieving effective communication and reducing review time. Finally, we conduct a set of qualitative and quantitative evaluations showing that our system is robust and efficient and provides a means of giving interactive, immediate instructions for effective communication and improved efficiency during path design.
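For readers who want to prototype a similar setup, the sketch below illustrates one plausible way, not the paper's implementation, to turn marker observations from the handle-mounted camera into a virtual-camera view: detected marker corners on the tracking-box planes are passed to OpenCV's solvePnP to recover the physical camera pose, which is then scaled into the virtual scene. The intrinsics, marker layout, and scene_scale value are hypothetical placeholders.

# Minimal sketch (not the authors' implementation): estimate the handheld
# camera's pose from markers on the tracking-box planes and map it to a
# virtual-camera view matrix. Intrinsics and scaling are placeholder values.
import numpy as np
import cv2

# Assumed camera intrinsics (focal lengths fx, fy and principal point cx, cy).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

def camera_pose_from_markers(object_pts, image_pts):
    """object_pts: Nx3 marker corner positions on the tracking box (world frame).
       image_pts:  Nx2 corresponding pixel coordinates detected in the frame."""
    ok, rvec, tvec = cv2.solvePnP(object_pts.astype(np.float64),
                                  image_pts.astype(np.float64),
                                  K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)      # rotation: world frame -> camera frame
    C = (-R.T @ tvec).ravel()       # camera center expressed in the world frame
    return R, C

def view_matrix(R, C, scene_scale=10.0):
    """Build a 4x4 view matrix for the virtual camera; scene_scale maps the
       physical tracking-box frame to virtual-scene units (placeholder)."""
    V = np.eye(4)
    V[:3, :3] = R
    V[:3, 3] = -R @ (C * scene_scale)
    return V

In this sketch the estimated pose would be recomputed per frame and combined with the parameters chosen on the control board (e.g., field of view) before rendering, which is the general WYSWYG loop the abstract describes.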

Bibtex

@ARTICLE{Li2017, 
author={Li, C.-C. and Lai, Y.-C. and Hsu, N.-S. and Kuo, H.-N. and Todorov, D. and Yao, C.-Y.}, 
journal={IEEE Transactions on Circuits and Systems for Video Technology}, 
title={EZCam: WYSWYG Camera Manipulator for Path Design}, 
year={2017}, 
volume={27}, 
number={8}, 
pages={1632-1646}, 
}

Acknowledgement

We thank the participants of all user studies. This work was supported by grants NSC-104-2221-E-011-029-MY3, NSC-103-2221-E-011-114-MY2, NSC-104-2218-E-011-006, NSC-103-2218-E-011-014, NSC-104-2221-E-011-092, and NSC-103-2221-E-011-076 from the National Science Council, Taiwan.