A Novel Information Fusion Method for Vision Perception and Location of Intelligent Industrial Robots

Authors

  • Shoufeng Jin Department of Mechanical Engineering, Xi'an Polytechnic University, No. 95, Jinhua North Road, Xincheng District, Xi'an, Shaanxi 710048, China
  • Qiangqiang Lin
  • Jian Yang Department of Mechanical, Materials and Manufacturing Engineering, University of Nottingham Ningbo China, 199 Taikang East Road, Ningbo 315100, China
  • Yu Bie School of Chemical Engineering, Kunming University of Science and Technology, 727 Jingming South Road, Chenggong District, Kunming, Yunnan 650504, China
  • Mingrui Tian Shaanxi Provincial Key Laboratory of Highway Construction Machinery, Chang'an University, No. 95, Jinhua North Road, Xincheng District, Xi'an, Shaanxi 710064, China
  • Zhixiong Li School of Mechanical, Materials, Mechatronic and Biomedical Engineering, University of Wollongong, Wollongong, NSW 2522, Australia

DOI:

https://doi.org/10.5755/j01.eie.25.5.20587

Keywords:

Industrial robot, SURF algorithm, BRIEF descriptor, Affine transformation, Centroid coordinates

Abstract

An improved SURF (Speeded-Up Robust Features) algorithm is proposed to address the time-consuming and low-precision positioning of industrial robots. The determinant of the Hessian matrix is used to extract feature points from the target image, and a multi-scale spatial pyramid is constructed. The location and scale of each feature point are determined by a neighbourhood non-maximum suppression method. The orientation of each feature point is encoded into a directional feature descriptor using the binary robust independent elementary features (BRIEF) scheme. Progressive sample consensus (PROSAC) is then applied to perform a second, precise matching stage and remove mismatched points based on the Hamming distance. Next, an affine transformation model is established to describe the relationship between the template and target images, and the centroid coordinates of the target are obtained from this affine transformation. Comparative tests demonstrate that the proposed method effectively improves the recognition rate and positioning accuracy of industrial robots: the average processing time is less than 0.2 s, the matching accuracy is 96 %, and the positioning error of the robot is less than 1.5 mm. Therefore, the proposed method has practical application value.
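The pipeline in the abstract ends with two concrete computations that can be sketched compactly: comparing binary BRIEF descriptors by Hamming distance, and fitting the affine model that maps the template image onto the target so the template centroid can be carried over. The following is a minimal illustrative sketch, not the authors' implementation; the function names and the plain least-squares fit (the paper uses PROSAC-filtered matches before this step) are assumptions.

```python
import numpy as np

def hamming(d1, d2):
    """Hamming distance between two binary BRIEF descriptors (bit arrays)."""
    return int(np.count_nonzero(np.asarray(d1) != np.asarray(d2)))

def estimate_affine(src, dst):
    """Least-squares 2x3 affine matrix M such that M @ [x, y, 1] ~ dst.

    src, dst: (N, 2) arrays of matched feature coordinates
    (template and target image, respectively), N >= 3.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    n = len(src)
    # Each match contributes two rows: one for x', one for y'.
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src
    A[1::2, 5] = 1.0
    b = dst.reshape(-1)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)

def locate_centroid(M, template_centroid):
    """Map the template centroid through the affine model into the target image."""
    x, y = template_centroid
    return M @ np.array([x, y, 1.0])
```

With four or more well-distributed inlier matches, `estimate_affine` recovers rotation, scale, shear, and translation in one linear solve; `locate_centroid` then gives the target's centroid coordinates for robot positioning.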

Published

2019-10-06

How to Cite

Jin, S., Lin, Q., Yang, J., Bie, Y., Tian, M., & Li, Z. (2019). A Novel Information Fusion Method for Vision Perception and Location of Intelligent Industrial Robots. Elektronika Ir Elektrotechnika, 25(5), 4-10. https://doi.org/10.5755/j01.eie.25.5.20587

Section

AUTOMATION, ROBOTICS