@inproceedings{Gupta2009Enhancingexperiencingspacetime,
  abstract = {We present solutions for enhancing the spatial and/or temporal resolution of videos. Our algorithm targets the emerging consumer-level hybrid cameras that can simultaneously capture video and high-resolution stills. Our technique produces a high spacetime resolution video using the high-resolution stills for rendering and the low-resolution video to guide the reconstruction and the rendering process. Our framework integrates and extends two existing algorithms, namely a high-quality optical flow algorithm and a high-quality image-based-rendering algorithm. The framework enables a variety of applications that were previously unavailable to the amateur user, such as the ability to (1) automatically create videos with high spatiotemporal resolution, and (2) shift a high-resolution still to nearby points in time to better capture a missed event.},
  author = {Gupta, A. and Bhat, P. and Dontcheva, M. and Deussen, O. and Curless, B. and Cohen, M.},
  booktitle = {2009 IEEE International Conference on Computational Photography (ICCP)},
  doi = {10.1109/ICCPHOT.2009.5559006},
  keywords = {cameras;image reconstruction;image resolution;image sequences;rendering (computer graphics);consumer-level hybrid cameras;high-quality image-based-rendering algorithm;high-quality optical flow algorithm;high-resolution stills;low-resolution video;spacetime resolution;video temporal resolution;Cameras;Image reconstruction;Optical imaging;Pixel;Spatial resolution;Videos},
  month = {apr},
  pages = {1--9},
  title = {Enhancing and experiencing spacetime resolution with videos and stills},
  url = {http://grail.cs.washington.edu/projects/enhancing-spacetime/},
  year = {2009}
}