Robust optical flow estimation for continuous blurred scenes using RGB-motion imaging and directional filtering

Wenbin Li, Yang Chen, Jee Hang Lee, Gang Ren, Darren Cosker

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Citations (Scopus)

Abstract

Optical flow estimation is a difficult task given real-world video footage with camera and object blur. In this paper, we combine a 3D pose and position tracker with an RGB sensor, allowing us to capture video footage together with 3D camera motion. We show that the additional camera motion information can be embedded into a hybrid optical flow framework by interleaving an iterative blind deconvolution and warping-based minimization scheme. Such a hybrid framework significantly improves the accuracy of optical flow estimation in scenes with strong blur. Our approach yields improved overall performance against three state-of-the-art baseline methods applied to our proposed ground truth sequences, as well as in several other real-world sequences captured by our novel imaging system.
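
Below is a minimal Python sketch of the interleaving idea described in the abstract; it is an illustrative reconstruction, not the authors' implementation. It assumes the camera translation between frames is available from the external tracker and projected into the image plane, approximates the blur with a linear motion kernel along that translation, and substitutes scikit-image's Richardson-Lucy deconvolution and OpenCV's Farneback flow for the paper's blind deconvolution and warping-based minimization. The helpers motion_psf and interleaved_flow are hypothetical names introduced only for this sketch.

import numpy as np
import cv2
from skimage.restoration import richardson_lucy

def motion_psf(dx, dy, size=15):
    # Linear motion-blur kernel oriented along the tracked camera translation (dx, dy).
    psf = np.zeros((size, size), dtype=np.float64)
    centre = size // 2
    length = max(1.0, float(np.hypot(dx, dy)))
    ux, uy = dx / length, dy / length
    for t in np.linspace(-length / 2.0, length / 2.0, num=4 * size):
        x, y = int(round(centre + t * ux)), int(round(centre + t * uy))
        if 0 <= x < size and 0 <= y < size:
            psf[y, x] += 1.0
    return psf / psf.sum()

def interleaved_flow(frame0, frame1, cam_motion, n_outer=3):
    # Alternate motion-guided deblurring and dense flow re-estimation
    # (a stand-in for the paper's hybrid scheme, not its actual energy).
    psf = motion_psf(*cam_motion)
    f0 = frame0.astype(np.float64) / 255.0
    f1 = frame1.astype(np.float64) / 255.0
    flow = None
    for _ in range(n_outer):
        # 1) Deblur both frames with the kernel derived from camera motion.
        d0 = np.clip(richardson_lucy(f0, psf, 20), 0.0, 1.0)
        d1 = np.clip(richardson_lucy(f1, psf, 20), 0.0, 1.0)
        # 2) Re-estimate dense flow on the sharpened frames, warm-started
        #    with the previous estimate.
        flags = cv2.OPTFLOW_USE_INITIAL_FLOW if flow is not None else 0
        flow = cv2.calcOpticalFlowFarneback(
            (d0 * 255).astype(np.uint8), (d1 * 255).astype(np.uint8),
            flow, 0.5, 3, 15, 3, 5, 1.2, flags)
    return flow

# Usage (hypothetical files): grayscale frames plus the tracked camera
# translation, expressed in pixels.
# g0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
# g1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
# flow = interleaved_flow(g0, g1, cam_motion=(6.0, 2.0))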

Original language: English
Title of host publication: IEEE Winter Conference on Applications of Computer Vision, 2014
Publisher: IEEE
Pages: 792-799
Number of pages: 8
ISBN (Print): 9781479949854
DOIs: 10.1109/WACV.2014.6836022
Publication status: Published - 23 Jun 2014
Event: 2014 IEEE Winter Conference on Applications of Computer Vision, WACV 2014 - Steamboat Springs, CO, United States
Duration: 24 Mar 2014 - 26 Mar 2014

Conference

Conference: 2014 IEEE Winter Conference on Applications of Computer Vision, WACV 2014
Country: United States
City: Steamboat Springs, CO
Period: 24/03/14 - 26/03/14

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Vision and Pattern Recognition

Cite this

Li, W., Chen, Y., Lee, J. H., Ren, G., & Cosker, D. (2014). Robust optical flow estimation for continuous blurred scenes using RGB-motion imaging and directional filtering. In IEEE Winter Conference on Applications of Computer Vision, 2014 (pp. 792-799). [6836022] IEEE. https://doi.org/10.1109/WACV.2014.6836022
