Robust Optical Flow Estimation for Continuous Blurred Scenes using RGB-Motion Imaging and Directional Filtering

Research output: Contribution to conference › Paper


Abstract

Optical flow estimation is a difficult task given real-world video footage with camera and object blur. In this paper, we combine a 3D pose and position tracker with an RGB sensor, allowing us to capture video footage together with 3D camera motion. We show that the additional camera motion information can be embedded into a hybrid optical flow framework by interleaving an iterative blind deconvolution and a warping-based minimization scheme. Such a hybrid framework significantly improves the accuracy of optical flow estimation in scenes with strong blur. Our approach yields improved overall performance against three state-of-the-art baseline methods on our proposed ground-truth sequences as well as in several other real-world cases.
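The core idea of the framework is an alternation between deblurring and flow estimation: the tracked camera motion seeds a blur kernel, both frames are deconvolved with it, flow is re-estimated on the sharpened frames, and the process repeats. The sketch below is only a minimal illustration of that interleaving under simplifying assumptions, not the authors' implementation: the motion_kernel and interleaved_flow helpers are hypothetical, the iterative blind deconvolution is approximated by non-blind Richardson-Lucy deconvolution with a kernel fixed from the camera motion, and OpenCV's Farneback method stands in for the paper's warping-based minimization.

import numpy as np
import cv2
from skimage.restoration import richardson_lucy


def motion_kernel(dx, dy, size=15):
    """Linear motion-blur kernel from a 2D camera translation (hypothetical helper)."""
    k = np.zeros((size, size), dtype=np.float32)
    cv2.line(k, (size // 2, size // 2),
             (size // 2 + int(round(dx)), size // 2 + int(round(dy))), 1.0, 1)
    return k / max(k.sum(), 1e-8)


def interleaved_flow(frame0, frame1, cam_motion, n_outer=3):
    """Alternate deblurring (camera-motion prior) and flow estimation.

    frame0, frame1 : grayscale uint8 frames
    cam_motion     : (dx, dy) image-plane translation reported by the tracker
    """
    kernel = motion_kernel(*cam_motion)              # blur prior from the tracker
    f0 = frame0.astype(np.float64) / 255.0
    f1 = frame1.astype(np.float64) / 255.0
    flow = None
    for _ in range(n_outer):
        # (1) Deconvolution step: remove the blur suggested by the camera motion.
        d0 = richardson_lucy(f0, kernel, 10)
        d1 = richardson_lucy(f1, kernel, 10)
        g0 = np.clip(d0 * 255, 0, 255).astype(np.uint8)
        g1 = np.clip(d1 * 255, 0, 255).astype(np.uint8)
        # (2) Flow step on the sharpened frames, warm-started from the previous pass.
        flags = cv2.OPTFLOW_USE_INITIAL_FLOW if flow is not None else 0
        flow = cv2.calcOpticalFlowFarneback(g0, g1, flow,
                                            0.5, 3, 15, 3, 5, 1.2, flags)
    return flow

Each pass sharpens the frames with the current blur estimate, which in turn improves the flow; in the paper this alternation is carried out jointly with blind kernel refinement rather than with a fixed motion-derived kernel as assumed here.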
Original language: English
Publication status: Published - 30 Jan 2013
Event: IEEE Winter Conference on Applications of Computer Vision - United Kingdom
Duration: 30 Jul 2013 → …

Conference

Conference: IEEE Winter Conference on Applications of Computer Vision
Country: United Kingdom
Period: 30/07/13 → …

Fingerprint

Optical flows
Cameras
Imaging techniques
Deconvolution
Sensors

Cite this

Cosker, D., & Li, W. (2013). Robust Optical Flow Estimation for Continuous Blurred Scenes using RGB-Motion Imaging and Directional Filtering. Paper presented at the IEEE Winter Conference on Applications of Computer Vision, United Kingdom.

