CN108098762A - Robot positioning device and method based on novel visual guidance - Google Patents
Robot positioning device and method based on novel visual guidance
- Publication number
- CN108098762A (application CN201611043697.6A)
- Authority
- CN
- China
- Prior art keywords
- robot
- camera
- pose
- calibration
- line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Abstract
The invention discloses a robot positioning device and method based on novel visual guidance. The device comprises a line-structured light self-scanning device, a robot module, and a host computer. The line-structured light self-scanning device comprises an industrial camera, a laser, and a galvanometer; the robot module comprises a robot body and a robot controller. The method jointly calibrates the robot hand-eye relation and the tool coordinate system. With this method, the system achieves three-dimensional positioning of a target object in an arbitrary pose by an industrial robot, and can be applied to robot grasping and transport tasks.
Description
Technical field
The present invention relates to an industrial robot positioning system device and method, and more particularly to a robot positioning device and method based on novel visual guidance.
Background art
In most robot application scenarios, the pose of the target object must be known in advance before the corresponding task can be completed. At present, in industrial applications of robots, the robot's operating path is mostly planned by manual teaching or offline programming. However, this approach strictly constrains the starting and ending poses of the target object: the robot can only mechanically repeat a pre-planned path, and each different target requires separate teaching to obtain its own teaching file. Such a highly structured working environment severely limits the robot's efficiency, flexibility, and intelligence, and cannot meet the requirements of flexible manufacturing systems.
Robot vision, with its fast, stable, and non-contact characteristics, is widely used in industrial production. Guiding the robot by vision to position the target object can effectively improve production-line efficiency and the level of automation.
Current robot positioning systems mostly adopt one of the following two approaches:
(1) Robot positioning based on binocular stereo vision: this approach achieves fast, non-contact three-dimensional positioning of the target object, but matching corresponding points between the two views is difficult, so it is applicable only to target objects with salient features;
(2) Robot positioning based on a fixed camera and two-dimensional vision: such guidance systems are applicable only to planar conveying scenarios and cannot provide three-dimensional visual guidance.
The patent application with publication number CN102794763A discloses a "calibration method for a welding robot system guided by a line-structured light vision sensor". The method acquires circular target images with a camera and processes them using matrix transformations, the Hough transform, and similar methods to finally obtain the displacement of the workpiece under different poses. The application achieves fast calibration of the robot's hand-eye relation matrix and the sensor parameters for a line-structured light vision sensor, but it requires many image-processing algorithms and its calibration algorithm is relatively complex.
The patent application with publication number CN106003036A discloses an "object grasping and placing system based on binocular vision guidance". The system identifies and positions the target object with binocular vision and guides a manipulator to grasp and place it. The application achieves fast, non-contact three-dimensional positioning of the target object, but matching corresponding points between the two views is difficult, so it is applicable only to target objects with salient features and is not widely applicable.
The patent application with publication number CN103558850A discloses a "fully automatic motion self-calibration method for a laser-vision-guided welding robot". The method draws on the camera imaging principle, the laser structured-light measurement principle, and the working principle of hand-eye systems to achieve fully automatic calibration of a laser-structured-light-guided welding robot system. However, the system must perform two groups of translational motions of the camera in three-dimensional space to calibrate the camera parameters, and must judge after each motion whether it satisfies the calibration requirements, so the procedure is somewhat cumbersome.
Summary of the invention
An object of the present invention is to overcome the deficiencies of the prior art and provide a robot positioning device based on novel visual guidance. The device jointly calibrates the robot hand-eye relation and the tool coordinate system, converts the three-dimensional pose of the target object from the camera coordinate system to the robot tool coordinate system, and finally achieves three-dimensional positioning of a target object in an arbitrary pose by an industrial robot.
Another object of the present invention is to provide a method realized with the above robot positioning device based on novel visual guidance. The proposed joint calibration of the robot hand-eye relation and the tool coordinate system avoids the error propagation that arises when the two are calibrated independently, and improves the positioning accuracy of the robot. Furthermore, the calibration of the system parameters is simple and practical: the parameters of every module of the system can be calibrated with a planar chessboard target, which satisfies the demands of industrial sites.
The technical solution of the present invention for solving the above technical problems is:
A robot positioning device and method based on novel visual guidance, characterized in that the device comprises a line-structured light self-scanning device, a robot module, and a host computer. The line-structured light self-scanning device consists of an industrial camera, a laser, and a galvanometer and serves as the visual sensor of the system: rotating the galvanometer scans the laser plane across the target object, and processing the images captured by the camera yields the three-dimensional pose of the target object in the camera coordinate system. The robot module comprises a robot body and a robot controller; the robot controller receives the control instructions sent by the host computer and controls the robot body, and the robot body performs the specified operation on a target object in an arbitrary pose. Once the pose of the target object in the robot tool coordinate system is obtained, the robot body moves the end-effector tool with the manipulator to the target pose according to the positioning result and performs the corresponding operation. The host computer precisely controls the galvanometer rotation angle, processes the captured images, and sends instructions to the robot controller.
The method jointly calibrates the robot hand-eye relation and the tool coordinate system. Its realization requires three calibration tasks: calibration of the line-structured light self-scanning device parameters, calibration of the camera-robot hand-eye relation, and calibration of the robot tool coordinate system, wherein:
the line-structured light self-scanning device parameter calibration establishes the pose transformation between the galvanometer and the camera;
the camera-robot hand-eye relation calibration establishes the pose transformation between the line-structured light self-scanning device and the robot;
the robot tool coordinate system calibration establishes the pose transformation between the tool and the robot flange.
In the robot positioning method based on novel visual guidance of the present invention, the line-structured light self-scanning device parameters are calibrated in the following steps:
(1) calibrate the camera intrinsic parameters;
(2) calibrate the homogeneous transformation matrix between the camera coordinate system and the line-structured light coordinate system.
In the robot positioning method based on novel visual guidance of the present invention, the camera-robot hand-eye relation is calibrated jointly with the tool coordinate system: a chessboard target is fixed to one side of the tool so that it follows the tool's motion, the camera photographs the chessboard target at each pose while the corresponding robot poses are recorded, and matrix-solving methods finally yield the homogeneous transformation matrix of the target relative to the robot end flange.
In the robot positioning method based on novel visual guidance of the present invention, the robot tool coordinate system is calibrated as follows: according to the homogeneous transformation relations, the homogeneous transformation matrix of the target relative to the robot end flange obtained in the above step is combined with the homogeneous transformation matrix from the target to the tool, finally determining the homogeneous transformation matrix from the tool coordinate system to the robot flange coordinate system.
Compared with the prior art, the present invention has the following advantageous effects:
1. The line-structured light self-scanning device and the industrial robot form an eye-to-hand vision system. Line-structured light self-scanning yields the three-dimensional pose of the target object in the camera coordinate system, and the robot hand-eye relation then lets the robot position the target object, realizing three-dimensional visual guidance.
2. A joint calibration method for the robot hand-eye relation and the tool coordinate system is proposed. Compared with calibrating the two independently, it has no error propagation problem and improves the robot positioning accuracy to a certain degree.
3. The system parameter calibration method is simple: the parameters of every module of the system can be calibrated with a planar chessboard target, satisfying industrial calibration requirements.
Description of the drawings
Fig. 1 is a structural diagram of a specific embodiment of the robot positioning device based on novel visual guidance of the present invention.
Fig. 2 is the flow chart of the joint calibration method for the robot hand-eye relation and the tool coordinate system.
Fig. 3 is a schematic diagram of the calibration of the line-structured light self-scanning device parameters.
Fig. 4 is a schematic diagram of the joint calibration scheme for the robot hand-eye relation and the tool coordinate system.
Specific embodiments
The present invention is described in further detail below with reference to the embodiments and the accompanying drawings, but the embodiments of the present invention are not limited thereto.
The robot positioning device based on novel visual guidance of the present invention consists of a line-structured light self-scanning device, a robot module, and a host computer, wherein:
Referring to Fig. 1, the line-structured light self-scanning device comprises an industrial camera 1, a galvanometer 2, and a laser 3. The industrial camera 1 photographs, within its working range, several two-dimensional chessboard target images at different poses. The rotation angle of the galvanometer 2 is precisely controlled by a voltage signal sent by the host computer 7. The line-structured light plane emitted by the laser 3 is reflected by the galvanometer 2 and projected onto the surface of the target object 6, forming a laser stripe. After the laser stripe formed by the laser 3 is calibrated, processing on the host computer 7 yields the pose of the target object 6 in the camera coordinate system. According to the pose of the target object 6 in the robot tool coordinate system, the robot 4 moves the end-effector tool with the manipulator 5 to the target pose and performs the corresponding operation.
Referring to Fig. 2, the joint calibration of the robot hand-eye relation and the tool coordinate system proceeds in the following steps:
S1: Calibrate the line-structured light self-scanning device parameters, i.e. establish the pose transformation between the galvanometer and the camera; see the calibration schematic in Fig. 3. The specific method is as follows:
S11: Calibrate the camera intrinsic parameters. Within the camera's working range, photograph several two-dimensional chessboard target images at different poses and calibrate the camera intrinsic parameters with Zhang Zhengyou's camera calibration algorithm (from the paper "A flexible new technique for camera calibration", Zhang Zhengyou, IEEE Transactions on Pattern Analysis & Machine Intelligence, 2000, 22(11):1330-1334), including the camera focal length f, the principal point coordinates (u0, v0), and the distortion coefficients k1, k2, p1, p2.
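As a sketch of how these intrinsic parameters enter the imaging model, the following projects a camera-frame 3-D point to pixel coordinates under the pinhole model with radial (k1, k2) and tangential (p1, p2) distortion; all numeric values are illustrative, not taken from the patent.

```python
def project_point(Xc, f, u0, v0, Nx, Ny, k1, k2, p1, p2):
    """Project a 3-D point in camera coordinates to pixel coordinates
    using the pinhole model with radial/tangential distortion."""
    xc, yc, zc = Xc
    # Normalized image coordinates
    x, y = xc / zc, yc / zc
    # Radial + tangential distortion of the normalized coordinates
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    # Pixel coordinates: focal length f (mm) divided by pixel size Nx, Ny (mm)
    u = (f / Nx) * xd + u0
    v = (f / Ny) * yd + v0
    return u, v

# A point on the optical axis maps to the principal point regardless of distortion
u, v = project_point((0.0, 0.0, 500.0), f=8.0, u0=640.0, v0=480.0,
                     Nx=0.005, Ny=0.005, k1=-0.1, k2=0.01, p1=0.0, p2=0.0)
print(u, v)  # 640.0 480.0
```

With a negative k1 (barrel distortion), off-axis points are pulled slightly toward the principal point relative to the ideal pinhole projection.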
S12: Calibrate the homogeneous transformation matrix between the camera coordinate system and the line-structured light coordinate system. Set the galvanometer input voltage to U0; the exit angle of the laser plane after reflection by the galvanometer is then φ0. Within the working range of the line-structured light self-scanning device, place the chessboard target successively at different poses and acquire m (m ≥ 2) chessboard target images and the corresponding stripe images. Compute the extrinsic parameter matrices R, T between the camera coordinate system and the target coordinate system with Zhang Zhengyou's camera calibration algorithm, and transform the extracted laser stripe centers into the camera coordinate system (stripe-center extraction follows the papers "A multi-scale analysis based method for extracting coordinates of laser light stripe centers", Li Fengjiao, Li Xiaojing, Liu Zhen, Acta Optica Sinica, 2014, 34(11):1110002, and "Extraction method for sub-pixel center of linear structured light stripe", Jiang Yongfu, Jiang Kaiyong, Lin Junyi, Laser & Optoelectronics Progress, 2015, 52(7):071502). From these m non-coincident laser stripes, fit the laser plane by least squares; its normal vector is n0 = (a0, b0, c0). Successively change the galvanometer control voltage to U1, U2, …, Un-1 (n ≥ 2) and repeat the above steps to obtain the light-plane equation in the camera coordinate system at each voltage, with normal vectors ni = (ai, bi, ci).
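The per-voltage plane fit can be sketched as follows, assuming the stripe centers are already in camera coordinates; the z = a·x + b·y + c parameterization (normal (a, b, -1)) and the plain Gaussian-elimination solver are illustrative simplifications — a production version would fit the general normal via SVD to handle planes nearly parallel to the optical axis.

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to stripe-center points
    via the 3x3 normal equations; returns (a, b, c)."""
    sxx = sxy = syy = sx = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y; n += 1.0
        sxz += x * z; syz += y * z; sz += z
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]
    return gauss_solve(A, rhs)

# Noise-free points on the plane z = 2x - y + 3 are recovered exactly
pts = [(0, 0, 3), (1, 0, 5), (0, 1, 2), (1, 1, 4), (2, 1, 6)]
a, b, c = fit_plane(pts)
print(round(a, 6), round(b, 6), round(c, 6))  # 2.0 -1.0 3.0
```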
The line in which the exit light planes at all voltages intersect is the galvanometer rotation axis; it is perpendicular to the normal vector of every light plane. Taking the dot products of the light-plane normal vectors with the direction vector of the rotation axis as the optimization objective yields the direction vector of the x_w axis of the galvanometer coordinate system. Define the normal vector of the light plane at control voltage U0 as the direction vector of the y_w axis; the direction vector of the z_w axis is then x_w × y_w. Take the intersection of the galvanometer x_w axis with the oyz plane of the camera coordinate system as the origin of the galvanometer coordinate system; the point (0, t_y, t_z) of minimal distance to all the light planes is then the coordinate of the galvanometer coordinate-system origin in the camera coordinate system.
The above calibration procedure yields the homogeneous transformation matrix between the camera coordinate system and the galvanometer coordinate system, establishing the following transformation between the two coordinate systems:

[x_w, y_w, z_w, 1]^T = [ R_wc  T_wc ; 0  1 ] · [x_c, y_c, z_c, 1]^T    (1)

where (x_c, y_c, z_c) is the coordinate of a target point in the camera coordinate system, (x_w, y_w, z_w) is its coordinate in the galvanometer coordinate system, and R_wc, T_wc are the calibrated rotation and translation from the camera coordinate system to the galvanometer coordinate system.
The equation of the exit light plane in the galvanometer coordinate system is:

y_w · cos φ + z_w · sin φ = 0    (2)

where φ is the exit angle of the laser plane after reflection by the galvanometer (the plane contains the galvanometer axis x_w).
From the pinhole camera model:

u = (f / N_x) · (x_c / z_c) + u_0,   v = (f / N_y) · (y_c / z_c) + v_0    (3)

where (u, v) is the pixel coordinate taking the top-left vertex of the CCD image plane as origin, and N_x, N_y are the physical sizes of one pixel in the x and y directions of the image plane.
Combining equations (1), (2), and (3) yields equation system (4). Solving equation system (4) gives the target point coordinate (x_w, y_w, z_w) in the galvanometer coordinate system; the homogeneous transformation matrix between the camera coordinate system and the galvanometer coordinate system then gives the target point coordinate (x_c, y_c, z_c) in the camera coordinate system.
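The effect of solving equations (1)-(3) can be illustrated in the simplest setting, with the fitted laser plane already expressed in the camera frame: a stripe pixel is back-projected along its viewing ray and intersected with the plane. Lens distortion and the galvanometer transform of equation (1) are omitted, and all numbers are illustrative.

```python
def stripe_point_3d(u, v, f, u0, v0, Nx, Ny, plane):
    """Back-project pixel (u, v) onto a laser plane a*x + b*y + c*z + d = 0
    given in the camera frame (pinhole model, no distortion)."""
    a, b, c, d = plane
    # Direction of the viewing ray through the pixel, scaled so that z = 1
    x = (u - u0) * Nx / f
    y = (v - v0) * Ny / f
    # The ray is (x*t, y*t, t); substitute into the plane equation, solve for t
    t = -d / (a * x + b * y + c)
    return (x * t, y * t, t)

# Laser plane z = 500 mm in front of the camera (normal along the optical axis)
plane = (0.0, 0.0, 1.0, -500.0)
print(stripe_point_3d(640.0, 480.0, 8.0, 640.0, 480.0, 0.005, 0.005, plane))
# (0.0, 0.0, 500.0)
print(stripe_point_3d(800.0, 480.0, 8.0, 640.0, 480.0, 0.005, 0.005, plane))
# approximately (50.0, 0.0, 500.0)
```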
S2: Calibrate the camera-robot hand-eye relation, i.e. establish the pose transformation between the line-structured light self-scanning device and the robot. The specific method is as follows:
Referring to Fig. 4, a chessboard target 6 is fixed to one side of the tool so that it follows the tool's motion. The camera 1 photographs the chessboard target at each pose, while the pose of the industrial robot 5 at each position is recorded.
Denote by C_W the robot base coordinate system, defined at the robot base; by C_C the camera coordinate system, with origin at the optical center of the camera; by C_TCP the robot end-flange coordinate system; by C_Obj the planar target coordinate system; and by C_Tool the robot tool coordinate system.
The homogeneous transformation matrix between any two coordinate systems in space can be expressed as:

T = [ R  T ; 0  1 ]    (5)

where R is a 3 × 3 orthonormal rotation matrix representing the rotation between the two coordinate systems, T is a 3 × 1 translation vector representing the position offset between them, and 0 is a 1 × 3 zero vector.
During calibration, each robot move to a new position yields a homogeneous transformation matrix T_C_Obj and a corresponding T_W_TCP (writing T_A_B for the homogeneous transformation from coordinate system C_B to coordinate system C_A). Within the system, the homogeneous transformations between the coordinate systems satisfy:

T_W_C · T_C_Obj = T_W_TCP · T_TCP_Obj    (6)

Here T_C_Obj, the transformation from the planar target coordinate system to the camera coordinate system, is the camera extrinsic parameters; it is obtained with Zhang Zhengyou's calibration method from the target image captured at that robot pose and the calibrated camera intrinsic parameters. T_W_C is the homogeneous transformation from the camera coordinate system to the robot base coordinate system; since both coordinate systems are fixed, T_W_C is a unique constant. T_TCP_Obj is the homogeneous transformation from the chessboard target coordinate system to the robot flange coordinate system; the positions of these two coordinate systems change as the robot moves, but their relative pose is fixed, so T_TCP_Obj is also unique. T_W_TCP, the homogeneous transformation from the robot flange coordinate system to the robot base coordinate system, is obtained from the robot's forward kinematics.
Because the numerical magnitude of the offset T is much larger than that of the rotation matrix R, directly solving for the homogeneous transformation matrices hardly meets practical engineering needs. According to the mathematical features and geometric meaning of homogeneous transformation matrices, equation (6) is split into a rotation part and a translation part, and a two-step method (from the paper "Research on calibrating the robot hand-eye relation by a two-step method", Xie Fazhong, Wu Nianxiang, Journal of Changchun Institute of Technology, 2012, 13(03):112-116) solves for the R and T of the homogeneous transformation matrices in turn. Expanding equation (6) gives the rotation equation (7) and the translation equation (8):

R_W_C · R_C_Obj = R_W_TCP · R_TCP_Obj    (7)
R_W_C · T_C_Obj + T_W_C = R_W_TCP · T_TCP_Obj + T_W_TCP    (8)

First the rotation matrices R_W_C and R_TCP_Obj are solved from (7); substituting them into (8) turns the translation part into equation (9), a set of 3 linear non-homogeneous equations in the offsets for each robot pose. Solving the resulting overdetermined system by linear least squares yields the position offsets T_W_C and T_TCP_Obj.
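The chain relation (6) behind the two-step solve can be checked numerically with plain 4x4 arithmetic. The sketch below fabricates the two fixed unknowns of the calibration — the camera pose in the base frame and the target pose on the flange — simulates what the camera would observe at one robot pose, and verifies that both sides of the chain agree; all transforms are made-up examples, not values from the patent.

```python
import math

def mat_mul(A, B):
    """4x4 homogeneous matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inv(T):
    """Inverse of a rigid transform: [R^T, -R^T t; 0, 1]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # transposed rotation
    t = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0, 0, 0, 1]]

def rot_z(a, t):
    """Rigid transform: rotation about z by angle a, translation t."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0, t[0]], [s, c, 0, t[1]], [0, 0, 1, t[2]], [0, 0, 0, 1]]

# Synthetic fixed transforms: camera in base frame, target on flange
T_W_C   = rot_z(0.3, (1.0, 0.2, 1.5))   # unknown T_W_C of the calibration
T_TCP_O = rot_z(-0.5, (0.0, 0.1, 0.05)) # unknown T_TCP_Obj of the calibration
T_W_TCP = rot_z(1.1, (0.4, 0.3, 0.6))   # one robot pose (forward kinematics)

# What the camera would observe at this pose, from the chain (6):
# T_W_C * T_C_O = T_W_TCP * T_TCP_O  =>  T_C_O = inv(T_W_C) * T_W_TCP * T_TCP_O
T_C_O = mat_mul(rigid_inv(T_W_C), mat_mul(T_W_TCP, T_TCP_O))

# Both sides of (6) agree up to floating-point error
lhs = mat_mul(T_W_C, T_C_O)
rhs = mat_mul(T_W_TCP, T_TCP_O)
err = max(abs(lhs[i][j] - rhs[i][j]) for i in range(4) for j in range(4))
print(err < 1e-12)  # True
```

In the real calibration, T_C_O is measured (camera extrinsics) and T_W_TCP is read from the controller at many poses, and the two-step method recovers T_W_C and T_TCP_O from the resulting system.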
S3: Calibrate the robot tool coordinate system, i.e. establish the pose transformation between the tool and the robot flange. The specific method is as follows:
On the basis of the hand-eye relation calibration completed in step S2, the tool coordinate system is calibrated next, i.e. the homogeneous transformation matrix from the tool coordinate system to the robot end flange is determined. The hand-eye calibration above has already produced the homogeneous transformation matrix T_TCP_Obj of the target relative to the robot end flange. Meanwhile, from the machining dimensions of the gripper and the target, and neglecting the target mounting error, the homogeneous transformation matrix T_Tool_Obj from the target to the tool is obtained.
According to the homogeneous transformation relations, the homogeneous transformation from the flange coordinate system to the tool coordinate system satisfies:

T_TCP_Tool = T_TCP_Obj · (T_Tool_Obj)^(-1)

Because the coordinate axes of the planar target and of the tool are easy to align, and the dimensions of the target and the tool are known, the error introduced by specifying T_Tool_Obj from the mechanical dimensions stays within the allowed range and is smaller than the error brought in by calibrating the tool coordinate system independently, which meets engineering application demands. This tool-coordinate-system calibration scheme simplifies the calibration steps and improves efficiency, while avoiding the accumulated error of an independent calibration; it is suitable for end-effector tools that follow the manipulator's motion and have no degrees of freedom of their own.
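The composition used in this step — the flange-to-tool matrix as the calibrated flange-to-target matrix times the inverse of the target-to-tool matrix — can be sketched with plain 4x4 helpers; both input transforms below are made-up stand-ins (the second playing the role of the matrix read off the mechanical drawings).

```python
import math

def mat_mul(A, B):
    """4x4 homogeneous matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inv(T):
    """Inverse of a rigid transform: [R^T, -R^T t; 0, 1]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0, 0, 0, 1]]

def rot_x(a, t):
    """Rigid transform: rotation about x by angle a, translation t."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0, t[0]], [0, c, -s, t[1]], [0, s, c, t[2]], [0, 0, 0, 1]]

T_TCP_Obj  = rot_x(0.2, (0.00, 0.08, 0.03))  # target rel. flange (calibrated)
T_Tool_Obj = rot_x(0.0, (0.00, 0.05, -0.02)) # target rel. tool (from drawings)

# Flange-to-tool transform: T_TCP_Tool = T_TCP_Obj * inv(T_Tool_Obj)
T_TCP_Tool = mat_mul(T_TCP_Obj, rigid_inv(T_Tool_Obj))

# Consistency check: composing back with T_Tool_Obj recovers T_TCP_Obj
back = mat_mul(T_TCP_Tool, T_Tool_Obj)
err = max(abs(back[i][j] - T_TCP_Obj[i][j]) for i in range(4) for j in range(4))
print(err < 1e-12)  # True
```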
The above is a preferred embodiment of the present invention, but the embodiments of the present invention are not limited thereto. Any change, modification, substitution, combination, or simplification made without departing from the spirit and principle of the present invention shall be an equivalent substitution and is included within the protection scope of the present invention.
Claims (3)
1. A robot positioning device based on novel visual guidance, characterized in that the device comprises a line-structured light self-scanning device, a robot module, and a host computer, wherein:
the line-structured light self-scanning device is used to obtain the three-dimensional pose of the target object in the camera coordinate system; the line-structured light self-scanning device comprises an industrial camera, a galvanometer, and a laser; the industrial camera is used to capture the images to be processed; the laser emits a line-structured light plane that is reflected by the galvanometer and projected onto the surface of the measured object to form a laser stripe; the galvanometer reflects the light, and its rotation angle is precisely controlled by the voltage signal sent by the host computer;
the robot module comprises a robot body and a robot controller, wherein the robot controller is used to receive the control instructions sent by the host computer and control the robot body, and the robot body is used to perform the specified operation on a target object in an arbitrary pose;
the host computer is used to precisely control the galvanometer rotation angle, process the captured images, and send instructions to the robot controller.
2. A method realized with the robot positioning device based on novel visual guidance of claim 1, characterized in that the method, through joint calibration of the robot hand-eye relation and the tool coordinate system, converts the three-dimensional pose of the target object from the camera coordinate system to the robot coordinate system, so as to finally realize three-dimensional positioning of a target object in an arbitrary pose by the robot.
3. The method realized with the robot positioning device based on novel visual guidance according to claim 2, characterized in that its realization requires three calibration tasks: calibration of the line-structured light self-scanning device parameters, calibration of the camera-robot hand-eye relation, and calibration of the robot tool coordinate system, wherein:
the line-structured light self-scanning device parameter calibration establishes the pose transformation between the galvanometer and the camera;
the camera-robot hand-eye relation calibration establishes the pose transformation between the line-structured light self-scanning device and the robot;
the robot tool coordinate system calibration establishes the pose transformation between the tool and the robot flange.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611043697.6A CN108098762A (en) | 2016-11-24 | 2016-11-24 | A kind of robotic positioning device and method based on novel visual guiding |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108098762A true CN108098762A (en) | 2018-06-01 |
Family
ID=62204682
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611043697.6A Pending CN108098762A (en) | 2016-11-24 | 2016-11-24 | A kind of robotic positioning device and method based on novel visual guiding |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108098762A (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108827154A (en) * | 2018-07-09 | 2018-11-16 | 深圳辰视智能科技有限公司 | A kind of robot is without teaching grasping means, device and computer readable storage medium |
CN108995874A (en) * | 2018-07-27 | 2018-12-14 | 同济大学 | A kind of auxiliary device supplied for blank and finished parts storage |
CN109605381A (en) * | 2019-01-29 | 2019-04-12 | 欧米瑞(广东)智能制造有限公司 | A kind of three-dimensional localization reclaimer system and method for fetching |
CN109794963A (en) * | 2019-01-07 | 2019-05-24 | 南京航空航天大学 | A kind of robot method for rapidly positioning towards curved surface member |
CN109814124A (en) * | 2019-01-28 | 2019-05-28 | 河北省科学院应用数学研究所 | A kind of robot positioning system and method based on structure light 3 D sensor |
CN109848994A (en) * | 2019-02-22 | 2019-06-07 | 浙江启成智能科技有限公司 | A kind of robot vision guidance location algorithm |
CN110000790A (en) * | 2019-04-19 | 2019-07-12 | 深圳科瑞技术股份有限公司 | A kind of scaling method of SCARA robot eye-to-hand hand-eye system |
CN110136208A (en) * | 2019-05-20 | 2019-08-16 | 北京无远弗届科技有限公司 | A kind of the joint automatic calibration method and device of Visual Servoing System |
CN110276806A (en) * | 2019-05-27 | 2019-09-24 | 江苏大学 | Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system |
CN110285831A (en) * | 2019-07-05 | 2019-09-27 | 浙江大学城市学院 | A kind of network light projector scaling method |
CN110355754A (en) * | 2018-12-15 | 2019-10-22 | 深圳铭杰医疗科技有限公司 | Robot eye system, control method, equipment and storage medium |
CN110421565A (en) * | 2019-08-07 | 2019-11-08 | 江苏汇博机器人技术股份有限公司 | Robot global positioning and measuring system and method for practical training |
CN111097662A (en) * | 2020-01-06 | 2020-05-05 | 广东博智林机器人有限公司 | Gluing method and device of gluing robot, storage medium and gluing robot |
CN111347410A (en) * | 2018-12-20 | 2020-06-30 | 沈阳新松机器人自动化股份有限公司 | Multi-vision fusion target guiding robot and method |
CN111409075A (en) * | 2020-04-22 | 2020-07-14 | 无锡中车时代智能装备有限公司 | Simple and convenient robot hand-eye calibration system and calibration method |
CN112129809A (en) * | 2020-08-13 | 2020-12-25 | 苏州赛米维尔智能装备有限公司 | Copper sheet thermal resistivity detection device based on visual guidance and detection method thereof |
CN112620989A (en) * | 2020-11-11 | 2021-04-09 | 郑智宏 | Automatic welding method based on three-dimensional visual guidance |
CN112935650A (en) * | 2021-01-29 | 2021-06-11 | 华南理工大学 | Calibration optimization method for laser vision system of welding robot |
CN113787522A (en) * | 2021-10-12 | 2021-12-14 | 华侨大学 | Hand-eye calibration method for eliminating accumulated errors of mechanical arm |
CN114067000A (en) * | 2022-01-05 | 2022-02-18 | 中国科学院自动化研究所 | Multi-target monitoring method and system based on panoramic camera and galvanometer camera |
CN114289934A (en) * | 2021-09-27 | 2022-04-08 | 西安知象光电科技有限公司 | Three-dimensional vision-based automatic welding system and method for large structural part |
CN114670199A (en) * | 2022-03-29 | 2022-06-28 | 深圳市智流形机器人技术有限公司 | Identification positioning device, system and real-time tracking system |
CN114734480A (en) * | 2021-01-07 | 2022-07-12 | 中国科学院沈阳自动化研究所 | Industrial robot space position appearance precision test system |
CN114851188A (en) * | 2022-03-29 | 2022-08-05 | 深圳市智流形机器人技术有限公司 | Identification positioning method and device, and real-time tracking method and device |
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108827154A (en) * | 2018-07-09 | 2018-11-16 | 深圳辰视智能科技有限公司 | A kind of robot is without teaching grasping means, device and computer readable storage medium |
CN108995874A (en) * | 2018-07-27 | 2018-12-14 | 同济大学 | A kind of auxiliary device supplied for blank and finished parts storage |
CN110355754A (en) * | 2018-12-15 | 2019-10-22 | 深圳铭杰医疗科技有限公司 | Robot hand-eye system, control method, device and storage medium |
CN110355754B (en) * | 2018-12-15 | 2023-09-22 | 深圳铭杰医疗科技有限公司 | Robot hand-eye system, control method, device and storage medium |
CN111347410A (en) * | 2018-12-20 | 2020-06-30 | 沈阳新松机器人自动化股份有限公司 | Multi-vision fusion target guiding robot and method |
CN109794963A (en) * | 2019-01-07 | 2019-05-24 | 南京航空航天大学 | Rapid robot positioning method for curved-surface components |
CN109794963B (en) * | 2019-01-07 | 2021-06-01 | 南京航空航天大学 | Rapid robot positioning method for curved-surface components |
CN109814124A (en) * | 2019-01-28 | 2019-05-28 | 河北省科学院应用数学研究所 | Robot positioning system and method based on a structured-light 3D sensor |
CN109605381A (en) * | 2019-01-29 | 2019-04-12 | 欧米瑞(广东)智能制造有限公司 | Three-dimensional positioning material-picking system and picking method |
CN109848994A (en) * | 2019-02-22 | 2019-06-07 | 浙江启成智能科技有限公司 | Robot vision-guided positioning algorithm |
CN110000790A (en) * | 2019-04-19 | 2019-07-12 | 深圳科瑞技术股份有限公司 | Calibration method for the eye-to-hand system of a SCARA robot |
CN110000790B (en) * | 2019-04-19 | 2021-11-16 | 深圳市科瑞软件技术有限公司 | Calibration method for the eye-to-hand system of a SCARA robot |
CN110136208A (en) * | 2019-05-20 | 2019-08-16 | 北京无远弗届科技有限公司 | Joint automatic calibration method and device for a visual servoing system |
CN110276806A (en) * | 2019-05-27 | 2019-09-24 | 江苏大学 | Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system |
CN110285831A (en) * | 2019-07-05 | 2019-09-27 | 浙江大学城市学院 | Calibration method for a network light projector |
CN110421565A (en) * | 2019-08-07 | 2019-11-08 | 江苏汇博机器人技术股份有限公司 | Robot global positioning and measuring system and method for practical training |
CN110421565B (en) * | 2019-08-07 | 2022-05-13 | 江苏汇博机器人技术股份有限公司 | Robot global positioning and measuring system and method for practical training |
CN111097662A (en) * | 2020-01-06 | 2020-05-05 | 广东博智林机器人有限公司 | Gluing method and device of gluing robot, storage medium and gluing robot |
CN111409075A (en) * | 2020-04-22 | 2020-07-14 | 无锡中车时代智能装备有限公司 | Simple and convenient robot hand-eye calibration system and calibration method |
CN112129809A (en) * | 2020-08-13 | 2020-12-25 | 苏州赛米维尔智能装备有限公司 | Copper sheet thermal resistivity detection device based on visual guidance and detection method thereof |
CN112129809B (en) * | 2020-08-13 | 2023-12-29 | 苏州赛米维尔智能装备有限公司 | Copper sheet thermal resistivity detection device based on visual guidance and detection method thereof |
CN112620989A (en) * | 2020-11-11 | 2021-04-09 | 郑智宏 | Automatic welding method based on three-dimensional visual guidance |
CN114734480A (en) * | 2021-01-07 | 2022-07-12 | 中国科学院沈阳自动化研究所 | Industrial robot spatial pose accuracy testing system |
CN112935650A (en) * | 2021-01-29 | 2021-06-11 | 华南理工大学 | Calibration optimization method for laser vision system of welding robot |
CN114289934A (en) * | 2021-09-27 | 2022-04-08 | 西安知象光电科技有限公司 | Three-dimensional vision-based automatic welding system and method for large structural part |
CN113787522A (en) * | 2021-10-12 | 2021-12-14 | 华侨大学 | Hand-eye calibration method for eliminating accumulated errors of mechanical arm |
CN114067000A (en) * | 2022-01-05 | 2022-02-18 | 中国科学院自动化研究所 | Multi-target monitoring method and system based on panoramic camera and galvanometer camera |
CN114067000B (en) * | 2022-01-05 | 2022-03-25 | 中国科学院自动化研究所 | Multi-target monitoring method and system based on panoramic camera and galvanometer camera |
CN114670199A (en) * | 2022-03-29 | 2022-06-28 | 深圳市智流形机器人技术有限公司 | Identification positioning device, system and real-time tracking system |
CN114851188A (en) * | 2022-03-29 | 2022-08-05 | 深圳市智流形机器人技术有限公司 | Identification positioning method and device, and real-time tracking method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108098762A (en) | A kind of robotic positioning device and method based on novel visual guiding | |
CN110238849B (en) | Robot hand-eye calibration method and device | |
CN108399639A (en) | Fast automatic grasping and placing method based on deep learning | |
CN111775146A (en) | Visual alignment method under industrial mechanical arm multi-station operation | |
CN109658460A (en) | Hand-eye calibration method and system for a camera at the end of a mechanical arm | |
CN107883929B (en) | Monocular vision positioning device and method based on multi-joint mechanical arm | |
CN111801198B (en) | Hand-eye calibration method, system and computer storage medium | |
CN110421562A (en) | Mechanical arm calibration system and calibration method based on four-camera stereo vision | |
CN106524945B (en) | Online measuring method for plane included angles based on a mechanical arm and structured-light vision | |
CN110276806A (en) | Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system | |
CN109794963B (en) | Robot rapid positioning method facing curved surface component | |
CN109900251A (en) | Robot positioning device and method based on vision technology | |
JP2013036988A (en) | Information processing apparatus and information processing method | |
CN113379849B (en) | Robot autonomous recognition intelligent grabbing method and system based on depth camera | |
WO2022061673A1 (en) | Calibration method and device for robot | |
JPWO2018043525A1 (en) | Robot system, robot system control apparatus, and robot system control method | |
CN114643578B (en) | Calibration device and method for improving robot vision guiding precision | |
CN110202560A (en) | Hand-eye calibration method based on a single feature point | |
CN112109072B (en) | Accurate 6D pose measurement and grabbing method for large sparse feature tray | |
CN114519738A (en) | Hand-eye calibration error correction method based on ICP algorithm | |
CN109059755B (en) | High-precision hand-eye calibration method for robot | |
CN112958960B (en) | Robot hand-eye calibration device based on optical target | |
CN110533727B (en) | Robot self-positioning method based on single industrial camera | |
Fan et al. | An automatic robot unstacking system based on binocular stereo vision | |
CN115619877A (en) | Method for calibrating position relation between monocular laser sensor and two-axis machine tool system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
WD01 | Invention patent application deemed withdrawn after publication | | Application publication date: 20180601 |