Computación y Sistemas

On-line version ISSN 2007-9737
Print version ISSN 1405-5546

Comp. y Sist. vol. 18, no. 4, Ciudad de México, Oct./Dec. 2014

https://doi.org/10.13053/CyS-18-4-1557 

Regular articles

 

Periodicity-Based Computation of Optical Flow

 

Georgii Khachaturov, Silvia Beatriz González Brambila, and Jesús Isidro González Trejo

 

Departamento de Sistemas, Universidad Autónoma Metropolitana (Azcapotzalco), Mexico. xgeorge@correo.azc.uam.mx

 

Article received on 27/09/2013.
Accepted on 27/06/2014.

 

Abstract

The standard Brightness Constancy Equation expresses spatiotemporal shift invariance of the input data along the local velocity of the optical flow. This shift invariance, in turn, gives rise to a periodic function of a real argument, which allows a known test for periodicity to be applied to the computation of optical flow at random locations. The approach also remains valid in higher dimensions; for example, it applies to a sequence of 3D tomography images. The proposed method attains reasonably high accuracy for continuous flow and is noise tolerant. Special attention is paid to weak-signal input: it is shown that a drastic reduction in signal strength degrades the accuracy of the estimates only insignificantly. For a possible application to tomography, this would allow an unprecedented reduction of harmful radiation exposure.
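To make the idea concrete, the sketch below shows how the Brightness Constancy Equation, I(x + u·δt, y + v·δt, t + δt) = I(x, y, t), turns flow estimation into a test of shift invariance along candidate velocities. It is a minimal, hypothetical illustration and not the authors' method: the function names, the bilinear interpolation, and the variance-based constancy score are assumptions introduced here, whereas the paper applies a dedicated periodicity test to a derived function of a real argument.

import numpy as np

def sample_along_velocity(frames, x, y, u, v):
    """Sample brightness at (x + u*k, y + v*k) in frame k of a (T, H, W) stack.

    Bilinear interpolation is used only to keep this illustrative sketch short.
    """
    T, H, W = frames.shape
    values = []
    for k in range(T):
        xs, ys = x + u * k, y + v * k
        if not (0 <= xs < W - 1 and 0 <= ys < H - 1):
            break                      # candidate path leaves the image
        x0, y0 = int(xs), int(ys)
        ax, ay = xs - x0, ys - y0
        f = frames[k]
        values.append((1 - ax) * (1 - ay) * f[y0, x0]
                      + ax * (1 - ay) * f[y0, x0 + 1]
                      + (1 - ax) * ay * f[y0 + 1, x0]
                      + ax * ay * f[y0 + 1, x0 + 1])
    return np.asarray(values)

def estimate_flow_at(frames, x, y, candidates):
    """Return the candidate (u, v) whose brightness profile is most nearly
    shift invariant; profile variance is a crude stand-in for the paper's
    periodicity test (an assumption of this sketch)."""
    best, best_score = None, np.inf
    for u, v in candidates:
        profile = sample_along_velocity(frames, x, y, u, v)
        if profile.size < 3:
            continue
        score = profile.var()          # low variance ~ brightness constancy
        if score < best_score:
            best, best_score = (u, v), score
    return best

For instance, estimate_flow_at(frames, 64, 64, [(u, v) for u in range(-3, 4) for v in range(-3, 4)]) scores integer candidates at a single, possibly randomly chosen, pixel; the abstract's claim is that a periodicity test applied to a suitably constructed function performs this selection far more robustly, especially for weak signals.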

Keywords. Optical flow, periodicity-based processing, preventive tomography, night vision.

 


 

