CN117464686A - 3D vision high-precision positioning and guiding method - Google Patents

3D vision high-precision positioning and guiding method

Info

Publication number
CN117464686A
Authority
CN
China
Prior art keywords
tool
robot
coordinate
grabbing
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311633829.0A
Other languages
Chinese (zh)
Inventor
王伟
汪良红
候金良
黄炎浩
石伟华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fuwei Intelligent Technology Co ltd
Original Assignee
Guangzhou Fuwei Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fuwei Intelligent Technology Co ltd filed Critical Guangzhou Fuwei Intelligent Technology Co ltd
Priority application: CN202311633829.0A
Publication: CN117464686A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Abstract

The invention discloses a 3D vision high-precision positioning and guiding method, relating to the technical field of visual positioning. The method comprises the following steps. Step one: a tool coordinate frame (tool) is established near the grabbing position of the clamping jaw, and the transformation from the camera coordinate system to the tool frame is obtained through hand-eye calibration and recorded as $T^{tool}_{cam}$. Step two: the object to be grasped and placed is put at a fixed position, and the robot is moved so that the camera photographs the object, yielding the object-to-camera transformation $T^{cam}_{obj}$; at the same time, the pose transformation $T^{base}_{tool}$ from the current robot tool frame to the robot base frame is read from the teach pendant. By adopting 3D visual positioning to guide the robot's pick-and-place process with high precision, the invention eliminates part of the systematic errors and improves the positioning and guiding precision.

Description

3D vision high-precision positioning and guiding method
Technical Field
The invention relates to the technical field of visual positioning, and in particular to a 3D vision high-precision positioning and guiding method.
Background
In the pick-and-place process of a 3D-vision-guided robot, the traditional method suffers from systematic errors such as hand-eye calibration errors, workpiece pick-and-place position setting errors, and robot tool coordinate errors, so that sub-millimetre high-precision grasping is difficult to achieve, which limits the application scenarios of 3D vision and robots.
Disclosure of Invention
Aiming at the defects existing in the prior art, the invention aims to provide a 3D vision high-precision positioning and guiding method.
In order to achieve the above purpose, the present invention provides the following technical solutions:
A 3D vision high-precision positioning and guiding method comprises the following steps:
Step one: a tool coordinate frame (tool) is established near the grabbing position of the clamping jaw, and the transformation from the camera coordinate system to the tool frame is obtained through hand-eye calibration and recorded as $T^{tool}_{cam}$.
Step two: the object to be grasped and placed is put at a fixed position, and the robot is moved so that the camera photographs the object, yielding the object-to-camera transformation $T^{cam}_{obj}$; at the same time, the pose transformation $T^{base}_{tool,1}$ from the current robot tool frame to the robot base frame is read from the teach pendant. The transformation from the object to the robot base frame is obtained by pose composition:
$$T^{base}_{obj} = T^{base}_{tool,1} \cdot T^{tool}_{cam} \cdot T^{cam}_{obj} \quad (1)$$
Step three: the robot clamping jaw is moved to the desired pick-and-place pose, and the transformation $T^{base}_{tool,2}$ from the current robot tool frame to the robot base frame is read from the teach pendant. The pose of the object in the tool frame, i.e. the grabbing point, is calculated as:
$$T^{tool}_{obj} = \left(T^{base}_{tool,2}\right)^{-1} \cdot T^{base}_{obj} \quad (2)$$
Step four: during 3D-vision-guided pick-and-place, the object-to-base transformation $T^{base}_{obj}$ is obtained according to the flow of formula (1), and the actual grabbing pose of the robot is computed from it:
$$T^{base}_{tool,grab} = T^{base}_{obj} \cdot \left(T^{tool}_{obj}\right)^{-1} \quad (3)$$
Step five: the robot moves its tool frame to the pose $T^{base}_{tool,grab}$ and thereby reaches the pick-and-place position.
Further, during 3D-vision-guided pick-and-place, when the angular deviation between the current object-camera relative pose and the relative pose used at teaching is large, the systematic error is amplified; the relative pose of the object and the camera is therefore corrected to match the taught pose, specifically as follows. The robot photographs at the taught position and records the pose transformation from the currently fixed object to the tool frame:
$$T^{tool}_{obj,fix} = T^{tool}_{cam} \cdot T^{cam}_{obj} \quad (4)$$
When correcting the photographing position, the object-to-base transformation $T^{base}_{obj}$ is obtained according to the flow of formula (1), and the corrected photographing pose is calculated with formula (4):
$$T^{base}_{tool,photo} = T^{base}_{obj} \cdot \left(T^{tool}_{obj,fix}\right)^{-1} \quad (5)$$
The robot moves its tool frame to the corrected photographing pose $T^{base}_{tool,photo}$; after correction it photographs, grabs and places. If the corrected photographing position still shows deviation, the correction of formula (5) is applied again.
Compared with the prior art, the invention has the following beneficial effects:
According to the invention, 3D visual positioning is adopted to realize high-precision guiding of the robot's pick-and-place process, so that part of the systematic errors are eliminated and the positioning and guiding precision is improved.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
Example 1
Referring to FIG. 1
The invention adopts 3D visual positioning to realize high-precision guiding of the robot's pick-and-place process. First, a 3D camera is mounted at the end of the robot flange, and a clamping jaw is also mounted at the flange end. A tool coordinate frame (tool) is established near the grabbing position of the clamping jaw, and the transformation from the camera coordinate system to the tool frame is obtained through hand-eye calibration and recorded as $T^{tool}_{cam}$.
Pick-and-place point optimization algorithm
The object to be grasped and placed is put at a fixed position, and the robot is moved so that the camera photographs the object, yielding the object-to-camera transformation $T^{cam}_{obj}$; at the same time, the pose transformation $T^{base}_{tool,1}$ from the current robot tool frame to the robot base frame is read from the teach pendant. The transformation from the object to the robot base frame is obtained by pose composition:
$$T^{base}_{obj} = T^{base}_{tool,1} \cdot T^{tool}_{cam} \cdot T^{cam}_{obj} \quad (1)$$
The robot clamping jaw is then moved to the desired pick-and-place pose, and the transformation $T^{base}_{tool,2}$ from the current robot tool frame to the robot base frame is read from the teach pendant. The pose of the object in the tool frame, i.e. the grabbing point, is calculated as:
$$T^{tool}_{obj} = \left(T^{base}_{tool,2}\right)^{-1} \cdot T^{base}_{obj} \quad (2)$$
The grabbing point obtained in this way eliminates the errors introduced by manually setting the robot tool coordinates and the workpiece pick-and-place positions.
During 3D-vision-guided pick-and-place, the object-to-base transformation $T^{base}_{obj}$ is obtained according to the flow of formula (1), and the actual grabbing pose of the robot is computed from it:
$$T^{base}_{tool,grab} = T^{base}_{obj} \cdot \left(T^{tool}_{obj}\right)^{-1} \quad (3)$$
The robot moves its tool frame to the pose $T^{base}_{tool,grab}$ and thereby reaches the pick-and-place position.
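The pick-and-place point optimization above can be sketched with 4×4 homogeneous transforms. The following Python/NumPy fragment is an illustrative sketch, not code from the patent; all numeric poses and variable names (`T_tool_cam`, `T_base_tool1`, and so on) are invented for the example, with `T_x_y` denoting the transform that expresses frame `y` in frame `x`.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    """Rotation about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# --- Teaching phase ---
T_tool_cam   = make_T(rot_z(0.1), [0.02, 0.00, 0.10])  # hand-eye calibration result
T_cam_obj    = make_T(rot_z(0.3), [0.05, 0.01, 0.40])  # object seen by the camera
T_base_tool1 = make_T(rot_z(0.5), [0.60, 0.10, 0.50])  # tool pose when photographing

# Formula (1): object pose in the robot base frame.
T_base_obj = T_base_tool1 @ T_tool_cam @ T_cam_obj

# The jaw is jogged to the grasp pose; read the tool pose from the teach pendant.
T_base_tool2 = make_T(rot_z(0.8), [0.70, 0.05, 0.30])

# Formula (2): grabbing point = object pose expressed in the tool frame.
T_tool_obj = np.linalg.inv(T_base_tool2) @ T_base_obj

# --- Run phase: the object has moved and the camera sees it again ---
T_cam_obj_new    = make_T(rot_z(0.2), [0.03, -0.02, 0.38])
T_base_tool1_new = make_T(rot_z(0.4), [0.55, 0.12, 0.52])
T_base_obj_new = T_base_tool1_new @ T_tool_cam @ T_cam_obj_new  # formula (1) again

# Formula (3): commanded tool pose that reproduces the taught grasp.
T_base_grab = T_base_obj_new @ np.linalg.inv(T_tool_obj)

# Sanity check: at the commanded pose, the object again sits at the taught
# grabbing point in the tool frame.
assert np.allclose(np.linalg.inv(T_base_grab) @ T_base_obj_new, T_tool_obj)
```

Because the taught grabbing point $T^{tool}_{obj}$ and the run-time command are built from the same teaching data, a constant bias in the manually set tool frame largely cancels, which matches the error-elimination argument above.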
Correction algorithm for photographing point
Further, during 3D-vision-guided pick-and-place, when the angular deviation between the current object-camera relative pose and the relative pose used at teaching is large, the systematic error is amplified. The relative pose of the object and the camera is therefore corrected to match the taught pose, specifically as follows:
robot is atPosition photographing, and obtaining pose transformation from a current fixed object to a tool coordinate tool:
when correcting the photographing position, obtaining the transformation from the object to the robot base coordinate according to the flow of the formula (1)And (3) calculating the corrected photographing position with the formula (4):
The robot moves its tool frame to the corrected photographing pose $T^{base}_{tool,photo}$. After the correction, photographing, grabbing and placing are carried out.
If the corrected photographing position still shows deviation, the correction according to formula (5) can be performed again.
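The photographing-point correction of formulas (4) and (5) can likewise be sketched as a small iteration. This Python/NumPy fragment is an illustrative simulation under assumed poses, not the patent's code; the `observe()` helper, which stands in for the 3D camera measurement, and all numeric values are hypothetical.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    """Rotation about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

T_tool_cam = make_T(rot_z(0.1), [0.02, 0.0, 0.10])  # hand-eye calibration result

# Formula (4), recorded once at teaching: object pose in the tool frame
# at the reference photographing position.
T_cam_obj_ref = make_T(rot_z(0.0), [0.0, 0.0, 0.40])
T_tool_obj_fix = T_tool_cam @ T_cam_obj_ref

# At run time the object has moved; simulate what the camera would see from
# a given tool pose (the ground truth is used only inside this simulation).
T_base_obj_true = make_T(rot_z(0.6), [0.65, 0.08, 0.05])
def observe(T_base_tool):
    """Hypothetical camera: return the object pose in the camera frame."""
    return np.linalg.inv(T_base_tool @ T_tool_cam) @ T_base_obj_true

T_base_tool = make_T(rot_z(0.2), [0.50, 0.20, 0.55])  # initial photographing pose
for _ in range(3):  # repeat formula (5) until the deviation is small
    T_base_obj = T_base_tool @ T_tool_cam @ observe(T_base_tool)  # formula (1)
    T_base_tool = T_base_obj @ np.linalg.inv(T_tool_obj_fix)      # formula (5)

# After correction the object is seen at the taught relative pose again.
assert np.allclose(T_tool_cam @ observe(T_base_tool), T_tool_obj_fix)
```

In this noise-free sketch a single pass of formula (5) already restores the taught relative pose; with real measurement noise the correction is repeated, as the text above describes, until the deviation is acceptable.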
The above description is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above examples; all technical solutions falling under the concept of the present invention belong to its protection scope. It should be noted that modifications and adaptations made by those skilled in the art without departing from the principles of the present invention are also considered to fall within the protection scope of the present invention.
The foregoing describes one embodiment of the present invention in detail, but the description is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention. All equivalent changes and modifications within the scope of the present invention are intended to be covered by the present invention.

Claims (2)

1. A 3D vision high-precision positioning and guiding method, characterized by comprising the following steps:
step one: tool coordinates tool are made near the grabbing position of the clamping jaw, and conversion from a camera coordinate system to the tool coordinates tool is obtained through hand-eye calibration and is recorded as
Step two: placing an object to be grasped and placed at a fixed position, and photographing the object by aiming at the object by a mobile robot to obtain the coordinates of the object to the cameraSimultaneously acquiring pose conversion from a current robot tool coordinate tool to a robot base coordinate from a demonstrator +.>The transformation from the object to the robot base coordinate is obtained through pose transformation:
step three: moving the robot jaw toObtaining the transformation from the current robot tool coordinate tool to the robot base coordinate from the demonstrator at the position to be grasped and placedCalculating to obtain the position of the object to the coordinate of the tool, namely the position of the grabbing point:
step four: in the 3D vision guiding grabbing and placing process, the transformation from an object to a robot base coordinate is obtained according to the flow of a formula (1)It is +.>Performing operation to obtain the actual grabbing position of the robot:
step five: tool coordinate tool for robot to walkThe pose can reach the grabbing and placing position.
2. The 3D vision high-precision positioning and guiding method according to claim 1, characterized in that, during the 3D-vision-guided pick-and-place process, when the angular deviation between the current object-camera relative pose and the relative pose used at teaching is large, the systematic error is amplified, and the relative pose of the object and the camera is therefore corrected to match the taught pose, specifically as follows: the robot photographs at the taught position and records the pose transformation from the currently fixed object to the tool frame: $T^{tool}_{obj,fix} = T^{tool}_{cam} \cdot T^{cam}_{obj}$ (4); when correcting the photographing position, the object-to-base transformation $T^{base}_{obj}$ is obtained according to the flow of formula (1), and the corrected photographing pose is calculated with formula (4): $T^{base}_{tool,photo} = T^{base}_{obj} \cdot (T^{tool}_{obj,fix})^{-1}$ (5); the robot moves its tool frame to the corrected photographing pose, then photographs, grabs and places, and if the corrected photographing position still shows deviation, the correction according to formula (5) is performed again.
CN202311633829.0A 2023-11-30 2023-11-30 3D vision high-precision positioning and guiding method Pending CN117464686A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311633829.0A CN117464686A (en) 2023-11-30 2023-11-30 3D vision high-precision positioning and guiding method


Publications (1)

Publication Number Publication Date
CN117464686A true CN117464686A (en) 2024-01-30

Family

ID=89625740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311633829.0A Pending CN117464686A (en) 2023-11-30 2023-11-30 3D vision high-precision positioning and guiding method

Country Status (1)

Country Link
CN (1) CN117464686A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106182004A * 2016-08-01 2016-12-07 Shanghai Jiao Tong University Vision-guided automatic peg-in-hole assembly method for industrial robots
CN107300100A * 2017-05-22 2017-10-27 Zhejiang University Online CAD-model-driven vision-guided approach method for a serial robot arm
CN107756398A * 2017-09-30 2018-03-06 Shenzhen Kungfu Robot Co., Ltd. Robot vision guiding method, device and equipment
CN108177143A * 2017-12-05 2018-06-19 Shanghai University of Engineering Science Robot positioning and grasping method and system based on laser vision guidance
CN109848994A * 2019-02-22 2019-06-07 Zhejiang Qicheng Intelligent Technology Co., Ltd. Robot vision guidance positioning algorithm



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination