CN113305851A - Online detection device for robot micro-assembly - Google Patents

Online detection device for robot micro-assembly

Info

Publication number
CN113305851A
Authority
CN
China
Prior art keywords
robot
unit
rod piece
assembly
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110671842.XA
Other languages
Chinese (zh)
Inventor
王福杰
李超凡
秦毅
任斌
郭芳
胡耀华
姚智伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan University of Technology
Original Assignee
Dongguan University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan University of Technology filed Critical Dongguan University of Technology
Priority to CN202110671842.XA priority Critical patent/CN113305851A/en
Publication of CN113305851A publication Critical patent/CN113305851A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1687Assembly, peg and hole, palletising, straight line, weaving pattern movement

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an online detection device for robot micro-assembly, which comprises a setting unit, used to set the position and posture of each rod piece of the robot in space, relative to an absolute coordinate system or the robot base coordinate system, within the assembly time range; a control unit, used to control the robot's mechanical arm to perform joint motion according to the set basic parameters during the assembly time; and an acquisition unit, used to determine the coordinates of the robot's feature points and fit all received data information into a unified coordinate system. Through the cooperation of the setting unit, the control unit, the acquisition unit, the parameter calibration unit, the comparison unit, and the adjustment unit, the basic parameters of each rod piece of the robot are controlled against a preset group of coordinate positions while the rod pieces move continuously, so that the motion stays consistent with the plan, errors in micro-assembly are avoided, and the quality of the assembled product is improved.

Description

Online detection device for robot micro-assembly
Technical Field
The invention relates to the technical field of automatic systems, in particular to an online detection device for robot micro-assembly.
Background
The assembly robot is the core equipment of a flexible automatic assembly system and consists of a robot manipulator, a controller, an end effector, and a sensing system. Manipulator structures include the horizontal-joint type, rectangular-coordinate type, multi-joint type, cylindrical-coordinate type, and others; the controller generally adopts a multi-CPU or multi-level computer system to realize motion control and motion programming; the end effector is designed with various grippers, wrists, and so on to adapt to different assembly objects; and the sensing system obtains information about the interaction between the assembly robot, the environment, and the assembly object.
Compared with traditional machining methods and numerical control machine tools, robot-assisted machining offers good flexibility, a large operating space, high efficiency, low cost, and easy integration with other advanced technologies, and in recent years it has been researched and applied to casting grinding, deburring, chamfering, rapid prototyping, engraving, and other directions. Robot off-line programming has attracted great interest and has become a very active research direction in robotics, covering the improvement of robot working efficiency, the planning of complex motion trajectories, collision and interference checking, visual observation of programming results, and programming optimization. However, when the robot takes part in machining processes, the robot system is less rigid than a CNC system and involves more uncertain factors, so a trajectory programmed off-line directly from CAD/CAM shows larger errors in practice and usually has to be corrected by on-line teaching. And when the part itself carries a large error, obtaining a good machining result becomes very difficult. With the development of science and technology, many advanced control and optimal control theories have been applied to robot control systems. Yet whether during the application of advanced control strategies or during direct control of product quality, one of the most troublesome problems remains the difficulty of measuring product quality or process parameters on line in real time. Research and development of an intelligent industrial robot that can sense changes in the external environment with a certain level of intelligence is therefore a difficult problem facing robotics researchers.
In the field of industrial robot inspection, efficient online detection not only improves the quality and reliability of products but also safeguards the production efficiency of the industrial robot and raises the flexibility and degree of automation of the workpiece production line. As a measurement method, vision measurement is performed under the assumption that the camera parameters and the illumination environment remain unchanged. The camera can capture images of the target object without a special light source, and image processing methods can then recover the 3D pose information of the target object.
Micro-assembly is applied to the assembly of small parts. At present, to guarantee the assembly precision of robotic micro-assembly, the accuracy of the robot must be checked constantly. The traditional detection method measures the axis trajectory with an installed displacement sensor and infers the assembly error from it; such a method is an indirect measurement and cannot capture the assembly error directly.
To this end, we propose an online detection device for robotic micro-assembly to solve the above problems.
Disclosure of Invention
The invention aims to provide an on-line detection device for robot micro-assembly.
In order to achieve the purpose, the invention adopts the following technical scheme:
an on-line detection device for robot micro-assembly, comprising a setting unit: the system is used for setting the position and the posture of each rod piece of the robot in the assembly time range in space relative to an absolute coordinate system or a robot base coordinate system; a control unit: the joint motion control system is used for controlling the mechanical arm of the robot to perform joint motion according to set basic parameters in the assembly time; a collecting unit: determining coordinates of the characteristic points of the robot, and fitting all received data information in a unified coordinate system; parameter calibration: constructing an image model of various internal and external parameters of the acquisition unit so as to determine the corresponding relation of each rod piece of the robot between a world coordinate system and an image plane coordinate system; a comparison unit: obtaining polar constraint relation among the acquisition units to obtain a basic matrix, obtaining more matching corresponding points through the basic matrix, calculating to obtain optimal polar constraint, and comparing coordinates obtained by the acquisition units with space coordinates of each rod piece of the robot in the setting unit; an adjusting unit: and determining whether to adjust the motion parameters of the mechanical arm of the robot according to the data obtained by the comparison file, so as to ensure the assembly accuracy.
Preferably, the set basic parameters include working space, maximum speed, moment, and inertia.
Preferably, the acquisition unit includes one of a laser probe, a camera, and a video capture card; the laser probe irradiates each rod piece of the robot to collect its data.
Preferably, the parameter calibration includes calibration of a single camera and calibration of relative positions of binocular stereo cameras.
Preferably, the acquisition unit irradiates the surface of the measured object with a planar laser to obtain two-dimensional plane data, irradiates each rod piece of the robot with laser rays to acquire three-dimensional data, and obtains the displacement characteristics of the measured object by analyzing the collected data.
Preferably, the adjusting unit calculates the distance of each rod piece of the robot from the acquisition unit according to the internal and external parameters obtained by calibration and the matching result of the stereo images, obtains three-dimensional information, and reconstructs the spatial information and posture of each rod piece of the robot.
Compared with the prior art, the invention has the beneficial effects that:
according to the invention, through the mutual cooperation of the setting unit, the control unit, the acquisition unit, the parameter calibration and comparison unit and the adjustment unit, the basic parameters of each rod piece of the robot are controlled according to the coordinate position group under the condition of continuous motion of each rod piece of the robot through the preset coordinate position group, the consistent motion is realized, the occurrence of errors in micro-assembly is avoided, and the quality of an assembled product is improved.
Drawings
Fig. 1 is a block diagram of an on-line detection device for robot micro-assembly according to the present invention.
In the figure: 10, setting unit; 20, control unit; 30, acquisition unit; 40, parameter calibration unit; 50, comparison unit; 60, adjustment unit.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
Referring to fig. 1, an on-line inspection apparatus for robot micro-assembly includes:
the setting unit 10: the system is used for setting the position and the posture of each rod piece of the robot in the assembly time range in space relative to an absolute coordinate system or a robot base coordinate system;
the robot of the invention can adopt a multi-joint type servo-driven six-degree-of-freedom robot, has the characteristics of high transmission precision, compact structure, light weight, large working range and the like, has six operating degrees of freedom, has the maximum arm load of 6kg and the repetition precision of 0.08mm, and has the operating range of 1373mm in maximum operating radius and 285mm in minimum with the center of a machine base as the center of a circle.
The set basic parameters comprise the working space, the maximum speed, the moment and the inertia of the robot.
The coordinate points of each rod piece of the robot are arranged in the sequence required by the actuator's continuous action, from start point to end point, to form the robot's continuous work.
The control unit 20: the joint motion control system is used for controlling the mechanical arm of the robot to perform joint motion according to set basic parameters in the assembly time;
according to the coordinate point of each rod piece of the robot in the assembly time, the speed, the acceleration and the motion range of the mechanical arm of the robot are set, so that the robot can compare the coordinate point of each rod piece of the robot with the set coordinate point at the moment so as to adjust the parameters of the robot, errors are avoided, and therefore the coordinate point of each rod piece of the robot needs to be collected constantly for comparison and adjustment.
The acquisition unit 30: determines the coordinates of the robot's feature points and fits the received data information into a unified coordinate system. The acquisition unit 30 includes one of a laser probe, a camera, and a video capture card; the laser probe irradiates each rod piece of the robot to collect its data. A video capture card (also known as a video grabber card) takes the analog video signals output by devices such as a camera, video recorder, LD video disc player, or television, whether pure video or mixed video and audio, converts them into digital data the computer can recognize, stores the data in the computer, and turns it into video data files that can be edited and processed; it is an essential hardware device for video processing. The acquisition unit 30 irradiates the surface of the measured object with a planar laser to obtain two-dimensional plane data, irradiates each rod piece of the robot with laser rays to acquire three-dimensional data, and derives the displacement characteristics of the measured object by analyzing the collected data; in this way the three-dimensional data of each rod piece can be acquired.
The non-contact measurement mode improves response speed, reduces the impact on the production line, and works stably and reliably over long periods, making it suitable for the assembly-line operation of robot micro-assembly.
Parameter calibration 40: an image model of the internal and external parameters of the acquisition unit 30 is constructed to determine the correspondence of each rod piece of the robot between the world coordinate system and the image plane coordinate system. The parameter calibration 40 includes calibration of a single camera and calibration of the relative positions of the binocular stereo cameras. In image acquisition, the 3D scene of the objective world must be projected onto the 2D image plane of the camera, and this projection can be described by coordinate transformations. Imaging transformations involve conversions between different coordinate systems, whose specification is important in stereo vision; they include the world, camera, image, and imaging plane coordinate systems, among others. Transforming the spatial coordinates of an object into the computer's digital image coordinates takes four steps: first, from the world coordinate system to the camera coordinate system; second, from the camera coordinate system to undistorted image plane coordinates satisfying the perspective triangular relation; third, from undistorted image plane coordinates to actual image plane coordinates carrying radial and tangential distortion; finally, from the camera's actual image coordinates to computer image coordinates. The coordinate systems commonly used in computer vision are defined with the right-hand rule.
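The four transforms above can be sketched with a minimal pinhole model. All numeric parameters below (focal length, principal point, pixel pitch, distortion coefficient, pose) are invented for illustration, not taken from the patent; only first-order radial distortion is shown, with tangential terms omitted for brevity.

```python
import math

def world_to_camera(p_world, R, t):
    """Step 1: rigid transform p_cam = R @ p_world + t (R as 3 row tuples)."""
    return tuple(sum(R[i][j] * p_world[j] for j in range(3)) + t[i]
                 for i in range(3))

def camera_to_ideal_plane(p_cam, f):
    """Step 2: perspective projection onto the undistorted image plane."""
    x, y, z = p_cam
    return (f * x / z, f * y / z)

def apply_radial_distortion(xy, k1):
    """Step 3: first-order radial distortion of the ideal plane coordinates."""
    x, y = xy
    scale = 1.0 + k1 * (x * x + y * y)
    return (x * scale, y * scale)

def plane_to_pixel(xy, cx, cy, sx, sy):
    """Step 4: scale by pixel pitch and shift to the principal point."""
    return (cx + xy[0] / sx, cy + xy[1] / sy)

# Identity rotation, 2 m standoff, unit focal length, zero distortion:
R = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
p_cam = world_to_camera((0.2, 0.1, 0.0), R, (0.0, 0.0, 2.0))
uv = plane_to_pixel(apply_radial_distortion(camera_to_ideal_plane(p_cam, 1.0),
                                            0.0),
                    320, 240, 0.001, 0.001)
print(uv)  # pixel coordinates near (420, 290)
```

Calibration, in this picture, is the inverse problem: estimating R, t, f, k1, and the principal point from marker points whose world and pixel coordinates are both known.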
Camera calibration here is the process of computing the camera's internal and external parameters from calibration objects whose coordinates are known in both the reference coordinate system and the image coordinate system; the camera's basic parameters serve as the internal parameters. The calibration objects are generally marker points with known world coordinates, so they can be located with high accuracy in the computer image coordinate system. Two cameras at different positions, or one camera that is moved or rotated, photograph the same scene; the parallax of a space point between the views is computed, and from it the coordinates of the point are obtained.
The comparison unit 50: obtaining polar constraint relation between the acquisition units 30 to obtain a basic matrix, obtaining more matching corresponding points through the basic matrix, calculating to obtain optimal polar constraint, and comparing coordinates obtained by the acquisition units 30 with the space coordinates of each rod piece of the robot in the setting unit 10. Two digital images of a scene are acquired from different angles at different moments by one camera or two digital images of the scene are acquired simultaneously by the two cameras from different angles, three-dimensional geometric information of an object is recovered based on a parallax principle, and the spatial state and the position of a surrounding real object are reconstructed.
To improve matching accuracy, the following constraints can be adopted: the epipolar constraint, uniqueness, similarity, ordering consistency, left-right consistency, and error reduction.
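The epipolar constraint in the list above can be checked per candidate match: corresponding homogeneous points must satisfy x'ᵀ F x ≈ 0 for the fundamental matrix F. The sketch below uses the fundamental matrix of an ideally rectified pair, an invented example rather than a calibrated result.

```python
def epipolar_residual(F, x_left, x_right):
    """Evaluate x_right^T @ F @ x_left for homogeneous points (u, v, 1)."""
    Fx = [sum(F[i][j] * x_left[j] for j in range(3)) for i in range(3)]
    return sum(x_right[i] * Fx[i] for i in range(3))

def is_consistent_match(F, x_left, x_right, tol=1e-6):
    """A candidate match is kept only if its epipolar residual is near zero."""
    return abs(epipolar_residual(F, x_left, x_right)) < tol

# Fundamental matrix of a perfectly rectified pair: epipolar lines are the rows.
F = ((0, 0, 0), (0, 0, -1), (0, 1, 0))
print(is_consistent_match(F, (460.0, 310.0, 1.0), (390.0, 310.0, 1.0)))  # True
print(is_consistent_match(F, (460.0, 310.0, 1.0), (390.0, 312.0, 1.0)))  # False
```

Screening candidate matches this way is what lets the fundamental matrix "obtain more matching corresponding points": points violating the constraint are discarded before triangulation.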
The adjusting unit 60: decides, from the data obtained by the comparison unit 50, whether to adjust the motion parameters of the robot's mechanical arm, thereby ensuring assembly accuracy. The adjusting unit 60 calculates the distance of each rod piece of the robot from the acquisition unit 30 according to the internal and external parameters calibrated by the acquisition unit 30 and the matching result of the stereo images, obtains three-dimensional information, and reconstructs the spatial information and posture of each rod piece.
Even if the rod pieces of the robot develop motion errors, the parameters can be adjusted again by the adjusting unit 60.
The above description covers only preferred embodiments of the present invention, but the scope of the invention is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed herein, based on the technical solutions and inventive concept of the invention, shall fall within the scope of the invention.

Claims (6)

1. An on-line detection device for robot micro-assembly, comprising:
setting unit (10): the system is used for setting the position and the posture of each rod piece of the robot in the assembly time range in space relative to an absolute coordinate system or a robot base coordinate system;
control unit (20): the joint motion control system is used for controlling the mechanical arm of the robot to perform joint motion according to set basic parameters in the assembly time;
acquisition unit (30): determining coordinates of the characteristic points of the robot, and fitting all received data information in a unified coordinate system;
parameter calibration (40): constructing an image model of various internal and external parameters of the acquisition unit (30) so as to determine the corresponding relation between each rod piece of the robot in a world coordinate system and an image plane coordinate system;
comparison unit (50): obtaining polar constraint relation among the acquisition units (30) to obtain a basic matrix, obtaining more matching corresponding points through the basic matrix, calculating to obtain optimal polar constraint, and comparing coordinates obtained by the acquisition units (30) with space coordinates of each rod piece of the robot in the setting unit (10);
adjusting unit (60): determining whether to adjust the motion parameters of the mechanical arm of the robot according to the data obtained by the comparison unit (50), so as to ensure assembly accuracy.
2. The on-line detection device for robot microassembly of claim 1, wherein: the set basic parameters comprise working space, maximum speed, moment and inertia.
3. The on-line detection device for robot microassembly of claim 1, wherein: the acquisition unit (30) comprises one of a laser measuring head, a camera and a video acquisition card, wherein the laser measuring head is used for collecting data of each rod piece of the laser irradiation robot.
4. The on-line detection device for robot microassembly of claim 3, wherein: the parameter calibration (40) comprises calibration of a single camera and calibration of the relative position of a binocular stereo camera.
5. The on-line detection device for robot microassembly of claim 3, wherein: the acquisition unit (30) uses plane laser to irradiate the surface of the measured object so as to obtain two-dimensional plane data, uses laser rays to irradiate each rod piece of the robot to acquire three-dimensional data, and can obtain the displacement characteristics of the measured object through the analysis of the acquired data.
6. The on-line detection device for robot microassembly of claim 1, wherein: the adjusting unit (60) calculates the distance between each rod piece of the robot relative to the acquisition unit (30) according to the matching result of the internal and external parameters and the stereo image obtained by calibration of the acquisition unit (30), so as to obtain three-dimensional information and reconstruct the space information and the posture of each rod piece of the robot.
CN202110671842.XA 2021-06-17 2021-06-17 Online detection device for robot micro-assembly Pending CN113305851A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110671842.XA CN113305851A (en) 2021-06-17 2021-06-17 Online detection device for robot micro-assembly

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110671842.XA CN113305851A (en) 2021-06-17 2021-06-17 Online detection device for robot micro-assembly

Publications (1)

Publication Number Publication Date
CN113305851A true CN113305851A (en) 2021-08-27

Family

ID=77379232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110671842.XA Pending CN113305851A (en) 2021-06-17 2021-06-17 Online detection device for robot micro-assembly

Country Status (1)

Country Link
CN (1) CN113305851A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114952843A (en) * 2022-05-30 2022-08-30 江南大学 Micro-assembly operating system based on master-slave cooperation of double robots
CN114952873A (en) * 2022-08-02 2022-08-30 季华实验室 Mechanical arm three-dimensional reconstruction method and device, electronic equipment and storage medium
CN117516485A (en) * 2024-01-04 2024-02-06 东北大学 Pose vision measurement method for automatic guiding and mounting of aircraft engine

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103707300A (en) * 2013-12-20 2014-04-09 上海理工大学 Manipulator device
CN104325268A (en) * 2014-11-04 2015-02-04 南京赫曼机器人自动化有限公司 Industrial robot three-dimensional space independent assembly method based on intelligent learning
US20160349730A1 (en) * 2015-05-29 2016-12-01 Cessna Aircraft Company Robotic system and method for processing aircraft component
JP2017219365A (en) * 2016-06-06 2017-12-14 セイコーエプソン株式会社 Position and posture detection device, robot controller, and robot
CN109434839A (en) * 2018-12-25 2019-03-08 江南大学 A kind of robot self-calibrating method based on monocular vision auxiliary positioning
CN112132894A (en) * 2020-09-08 2020-12-25 大连理工大学 Mechanical arm real-time tracking method based on binocular vision guidance

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103707300A (en) * 2013-12-20 2014-04-09 上海理工大学 Manipulator device
CN104325268A (en) * 2014-11-04 2015-02-04 南京赫曼机器人自动化有限公司 Industrial robot three-dimensional space independent assembly method based on intelligent learning
US20160349730A1 (en) * 2015-05-29 2016-12-01 Cessna Aircraft Company Robotic system and method for processing aircraft component
JP2017219365A (en) * 2016-06-06 2017-12-14 セイコーエプソン株式会社 Position and posture detection device, robot controller, and robot
CN109434839A (en) * 2018-12-25 2019-03-08 江南大学 A kind of robot self-calibrating method based on monocular vision auxiliary positioning
CN112132894A (en) * 2020-09-08 2020-12-25 大连理工大学 Mechanical arm real-time tracking method based on binocular vision guidance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘诗钊 (Liu Shizhao): "Research on Motion Control of a Mechanical Arm Based on Visual Perception", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114952843A (en) * 2022-05-30 2022-08-30 江南大学 Micro-assembly operating system based on master-slave cooperation of double robots
CN114952843B (en) * 2022-05-30 2023-02-28 江南大学 Micro-assembly operating system based on master-slave cooperation of double robots
CN114952873A (en) * 2022-08-02 2022-08-30 季华实验室 Mechanical arm three-dimensional reconstruction method and device, electronic equipment and storage medium
CN114952873B (en) * 2022-08-02 2022-10-18 季华实验室 Mechanical arm three-dimensional reconstruction method and device, electronic equipment and storage medium
CN117516485A (en) * 2024-01-04 2024-02-06 东北大学 Pose vision measurement method for automatic guiding and mounting of aircraft engine
CN117516485B (en) * 2024-01-04 2024-03-22 东北大学 Pose vision measurement method for automatic guiding and mounting of aircraft engine

Similar Documents

Publication Publication Date Title
CN113305851A (en) Online detection device for robot micro-assembly
EP1711317B1 (en) Machine vision controlled robot tool system
US8244402B2 (en) Visual perception system and method for a humanoid robot
CN111745267A (en) System and method for tracking groove weld in real time based on laser displacement sensor
CN105082161A (en) Robot vision servo control device of binocular three-dimensional video camera and application method of robot vision servo control device
Oh et al. Stereo vision based automation for a bin-picking solution
Stavnitzky et al. Multiple camera model-based 3-D visual servo
CN110450163A (en) The general hand and eye calibrating method based on 3D vision without scaling board
CN114536346B (en) Mechanical arm accurate path planning method based on man-machine cooperation and visual detection
Borangiu et al. Robot arms with 3D vision capabilities
Yang et al. Visual servoing control of baxter robot arms with obstacle avoidance using kinematic redundancy
Xu et al. Seam tracking and visual control for robotic arc welding based on structured light stereovision
Behera et al. A hybrid neural control scheme for visual-motor coordination
Đurović et al. Low cost robot arm with visual guided positioning
Deshmukh et al. Kinematic modeling of an automated laser line point cloud scanning system
Seçil et al. 3-d visualization system for geometric parts using a laser profile sensor and an industrial robot
Mohamed et al. Automating active stereo vision calibration process with cobots
Pajor et al. Stereovision system for motion tracking and position error compensation of loading crane
CN112123329A (en) Robot 3D vision hand-eye calibration method
CN117464692B (en) Lining plate grabbing mechanical arm control method based on structured light vision system
Yong RESEARCH ON PATH RECOGNITION OF WELDING MANIPULATOR BASED ON AUTOMATIC CONTROL ALGORITHM.
CN113400300B (en) Servo system for robot tail end and control method thereof
Qin et al. Sensor calibration and trajectory planning in 3D vision-guided robots
Sa et al. Research on Hand-eye Calibration Method Based on Binocular Camera
Chavitranuruk et al. Vision System for Detecting and Locating Micro-Scale Objects with Guided Cartesian Robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210827
