Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/122088
Title: Bounding box matching: a sparse object-centric correspondence method for stereo vision
Authors: Sindel, Tomas
Naraharisetti, Prabhu R.
Saliba, Michael A.
Fabri, Simon G.
Keywords: Semantic integration (Computer systems)
LabVIEW
Computer graphics
Three-dimensional display systems
Computer vision equipment industry
Issue Date: 2022
Publisher: Institute of Electrical and Electronics Engineers
Citation: Sindel, T., Naraharisetti, P. R., Saliba, M. A., & Fabri, S. G. (2022, February). Bounding box matching: A sparse object-centric correspondence method for stereo vision. In 2022 8th International Conference on Automation, Robotics and Applications (ICARA), Czech Republic (pp. 223-227). IEEE.
Abstract: In this work, a simplified method for sparse, object-centric disparity estimation is proposed. It combines the state-of-the-art object detector YOLOv4 with image rectification to produce a disparity map at high speed, making it suitable for real-time applications. Like other methods based on convolutional neural networks, this approach exploits contextual and semantic image information and is robust to ill-posed image regions such as reflective, textureless and occluded areas, but it requires fewer computational resources at the expense of detail and estimation accuracy. The method has been implemented on the TensorFlow platform and subsequently deployed with the LabVIEW graphical programming language. It has been shown that the method works best at large distances, small object depth-to-distance ratios and moderate eccentricities. The source code is available at: https://github.com/tsindel/bbox-matching.
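
The pipeline described in the abstract (detect objects in both rectified views, match bounding boxes across the stereo pair, and read a per-object disparity from the horizontal offset of the matched boxes) can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the box format, the matching criterion (same class plus small vertical offset, which is valid once rectification aligns the epipolar lines), and all function names are hypothetical. The released TensorFlow/LabVIEW code at the linked repository is the authoritative reference.

# Hypothetical sketch of object-centric disparity from matched bounding boxes.
# Assumed box format: (class_id, x_center, y_center, width, height) in pixels,
# produced by a YOLOv4 detector run separately on the rectified left/right images.

def match_boxes(left_boxes, right_boxes, max_row_offset=10.0):
    """Greedily pair left/right detections of the same class whose vertical
    positions agree (reasonable after rectification aligns epipolar lines)."""
    matches = []
    used = set()
    for lb in left_boxes:
        best, best_dy = None, max_row_offset
        for j, rb in enumerate(right_boxes):
            if j in used or rb[0] != lb[0]:
                continue
            dy = abs(lb[2] - rb[2])          # vertical (row) disagreement
            if dy < best_dy:
                best, best_dy = j, dy
        if best is not None:
            used.add(best)
            matches.append((lb, right_boxes[best]))
    return matches

def sparse_disparity(matches):
    """One disparity value per matched object: horizontal shift of box centres."""
    return [lb[1] - rb[1] for lb, rb in matches]

def depth_from_disparity(disparities, focal_px, baseline_m):
    """Standard pinhole stereo relation: Z = f * B / d."""
    return [focal_px * baseline_m / d if d > 0 else float("inf") for d in disparities]

The greedy class-and-row matching above is only one plausible choice; the paper's reported sensitivity to distance, object depth-to-distance ratio and eccentricity suggests the accuracy of any such scheme is governed mainly by how well a single box-centre disparity represents the whole object.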
URI: https://www.um.edu.mt/library/oar/handle/123456789/122088
Appears in Collections:Scholarly Works - FacEngME

Files in This Item:
File: Bounding_box_matching_a_sparse_object_centric_correspondence_method_for_stereo_vision_2022.pdf (Restricted Access)
Size: 2.81 MB
Format: Adobe PDF
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.