CN110533727A - Robot self-positioning method based on a single industrial camera - Google Patents

Robot self-positioning method based on a single industrial camera

Info

Publication number
CN110533727A
CN110533727A (application CN201910666397.0A)
Authority
CN
China
Prior art keywords
robot
coordinate system
industrial
industrial camera
anchor point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910666397.0A
Other languages
Chinese (zh)
Other versions
CN110533727B (en)
Inventor
韦溟
张丽艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201910666397.0A priority Critical patent/CN110533727B/en
Publication of CN110533727A publication Critical patent/CN110533727A/en
Application granted granted Critical
Publication of CN110533727B publication Critical patent/CN110533727B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/11Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Operations Research (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The invention discloses a robot self-positioning method based on a single industrial camera, belonging to the field of industrial robots. The method only requires driving the robot, in teaching mode, so that the industrial camera carried at the robot end images six or more anchor points of known coordinates one at a time; the transformation between the world coordinate system and the robot base coordinate system can then be determined quickly and conveniently, realizing robot self-positioning. The invention does not require multiple target points distributed over a large spatial range to be imaged in a single image; it suffices to image each anchor point individually in turn, so the method is not limited by the camera field of view or by the distribution of the anchor points. The solution model and solving method established for imaging multiple anchor points one at a time can be widely applied to the self-positioning of all kinds of "hand-eye" robot systems.

Description

Robot self-positioning method based on a single industrial camera
Technical field
The present invention relates to the field of industrial robots, and in particular to a robot self-positioning method based on a single industrial camera.
Background technique
With the rapid development of intelligent manufacturing technology, industrial robots are used more and more widely in industry. In particular, fixing a drilling-and-riveting end effector on an articulated-arm industrial robot for automated drilling and riveting is currently a research hotspot in the field of aircraft assembly, and is gradually entering the practical stage, replacing traditional manual work. Robotic automatic drilling and riveting not only greatly improves drilling and riveting quality, but also effectively improves efficiency.
Before a robotic drilling-and-riveting system starts work, the position and orientation of the robot in the world coordinate system must be determined. In the common case where the world coordinate system coincides with the workpiece coordinate system, this means determining the transformation between the coordinate system of the workpiece to be drilled and riveted and the robot base coordinate system. This step is known as robot localization (also called robot alignment, workpiece localization, or workpiece alignment).
In current aircraft-assembly robotic drilling-and-riveting operations, robot localization is mainly accomplished with large-scale measuring devices such as laser trackers. On the one hand, multiple feature points on the workpiece to be drilled and riveted must be measured with the external measuring device, so as to establish the transformation between the workpiece coordinate system and the measurement coordinate system. On the other hand, a target (target ball) is mounted on the robot end effector and the robot is rotated about each of its axes in turn; at each rotation position the external measuring device measures the three-dimensional coordinates of the target, and circles are then fitted to the series of target coordinates obtained for each axis, thereby establishing the transformation between the robot base coordinate system and the measurement coordinate system. Finally, from the workpiece-to-measurement and base-to-measurement transformations, the transformation between the workpiece coordinate system and the robot base coordinate system is computed, completing robot localization. This approach to robot localization with external measuring equipment not only depends on additional measuring devices; its steps are complicated and time-consuming, and the measurement and localization computations are difficult to integrate and automate.
In aircraft-assembly robotic drilling-and-riveting systems, an industrial camera can be rigidly attached to the drilling-and-riveting end effector at the robot end for online compensation of drilling position errors, so as to guarantee the accuracy of the drilled-hole positions. According to existing machine-vision theory, by photographing n target points with known relative positions with a single industrial camera, the pose of the camera relative to those points can be solved; this is known as the perspective-n-point (PnP) problem. Solving a PnP problem requires the camera to image all n target points simultaneously in one image; to guarantee the uniqueness and accuracy of the solution, industrial applications generally require n ≥ 6, with the n points distributed over as large a spatial range as possible. However, when performing online drilling-error compensation, the camera in the drilling-and-riveting end effector only ever needs to image a single reference hole (or reference rivet) about 5 mm in diameter. To guarantee the imaging quality of the reference hole and prevent it occupying too small a region of the image, the field of view of the camera rigidly mounted on the robot end effector is usually small, so multiple target points distributed over a large spatial range cannot be imaged simultaneously in one image. For this reason, prior to the present invention, the industrial camera already carried by automatic drilling-and-riveting robots was not used for robot self-positioning, and the industry has continued to rely on cumbersome robot localization with additional measuring devices such as laser trackers.
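To illustrate the PnP constraint described above, the generic direct linear transform (DLT) below estimates a full 3 × 4 projection matrix from point correspondences. This is a textbook sketch in NumPy, not the patent's own solver, and the function name is ours; it also makes concrete why n ≥ 6 well-spread correspondences are generally required: the matrix has 12 entries (11 degrees of freedom up to scale), and each point contributes two equations.

```python
import numpy as np

def dlt_projection_matrix(world_pts, image_pts):
    """Direct linear transform: estimate the 3x4 projection matrix P with
    lambda * (u, v, 1)^T = P * (X, Y, Z, 1)^T from point correspondences.

    Needs at least 6 non-degenerate correspondences, which is why industrial
    PnP setups typically use n >= 6 well-spread targets. Generic textbook
    sketch, not the patent's method.
    """
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        Pw = [X, Y, Z, 1.0]
        # Two rows per correspondence, linear in the 12 entries of P.
        A.append([*Pw, 0.0, 0.0, 0.0, 0.0, *(-u * np.array(Pw))])
        A.append([0.0, 0.0, 0.0, 0.0, *Pw, *(-v * np.array(Pw))])
    # The right singular vector of the smallest singular value minimizes ||A p||.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)
```

The recovered matrix is determined only up to scale; reprojecting the input points and comparing pixel coordinates is the natural sanity check.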
Summary of the invention
In view of the problems in the prior art, the present invention discloses a robot self-positioning method based on a single industrial camera. A single industrial camera fixed at the robot end photographs six or more anchor points in the world coordinate system one at a time, and the transformation between the robot base coordinate system and the world coordinate system is determined from this group of images. The method enables quick and convenient self-positioning of an industrial robot on the work site.
The present invention is implemented as follows:
A robot self-positioning method based on a single industrial camera comprises an industrial robot with an end flange, on the front of which an industrial camera is fixed. In the world coordinate system there are n anchor points that are distinguishable in a visual image, with n ≥ 6 and the n anchor points not all collinear. The coordinates of the n anchor points in the world coordinate system O_W are known and denoted P_i = (X_i, Y_i, Z_i)^T, i = 1, 2, …, n. The self-positioning method for the industrial robot using the single industrial camera comprises the following steps:
Step 1: move the industrial robot so that the camera carried at the robot end successively reaches a position above the i-th anchor point from which that point can be clearly imaged, i = 1, 2, …, n; at each position, have the camera capture an image of the anchor point, and record the translation vector from the origin of the industrial robot base coordinate system O_R to the origin of the flange coordinate system O_F at that position, as well as the flange Euler angles α_i, β_i, γ_i.
Step 2: use formula (1) to compute the rotation matrix of the flange coordinate system O_F relative to the robot base coordinate system O_R when the i-th anchor point is photographed, and use formula (2) to express the corresponding transformation matrix of O_F relative to O_R.
Step 3: in the i-th image, extract the image coordinates of the i-th anchor point, denoted (u_i, v_i).
Step 4: from the transformation relations between the coordinate systems and the camera imaging model, use formula (3) to solve for the transformation of the world coordinate system O_W relative to the robot base coordinate system O_R, completing the localization of the robot in the world coordinate system.
In formula (3), λ_i is a scale factor and K is the intrinsic matrix of the industrial camera, whose entries α_x, α_y, s, u_0, v_0 are the calibrated camera imaging parameters.
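Steps 1 and 2 can be sketched in NumPy. The patent's formula (1) is not reproduced in this text, so the Z-Y-X Euler composition below (the convention used, for example, for KUKA A-B-C angles) is an assumption to be checked against the actual controller; the homogeneous assembly corresponding to formula (2) is standard.

```python
import numpy as np

def rot_from_euler(alpha, beta, gamma):
    """Rotation matrix of the flange frame O_F in the base frame O_R.

    ASSUMPTION: Z-Y-X (yaw-pitch-roll) composition, as used by many
    industrial controllers for A-B-C angles; the patent's formula (1)
    may use a different convention.
    """
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    return Rz @ Ry @ Rx

def homogeneous(R, t):
    """4x4 homogeneous transform built from a rotation matrix and a
    translation vector (the structure expressed by formula (2))."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T
```

For each shot i, `homogeneous(rot_from_euler(alpha_i, beta_i, gamma_i), t_i)` then gives the flange-to-base transform recorded in Step 1.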
Further, a robot motion guide rail is arranged under the industrial robot; the anchor points are placed on a processing tool rack, on which the workpiece to be processed is also fixed.
Further, the industrial camera is a perspective-imaging camera; its imaging parameters and the 4 × 4 transformation matrix between its coordinate system O_C and the flange coordinate system O_F have been calibrated. The coordinates of the n anchor points in the world coordinate system O_W are known and denoted P_i = (X_i, Y_i, Z_i)^T, i = 1, 2, …, n.
Further, the camera carried at the robot end successively reaches positions above the n anchor points to capture the images; the image coordinates (u_i, v_i), i = 1, 2, …, n, of the n anchor points in the n captured images are discretely distributed at different locations of the camera image plane.
Further, the method of solving with formula (3) in Step 4 comprises the following steps:
4.1: for the known quantities, write the rows of the resulting 3 × 4 matrices in terms of m_1^i, m_2^i, m_3^i (three-dimensional row vectors) and last-column entries m_14^i, m_24^i, m_34^i (scalar values).
Equation (4) follows from formula (3); from (4), the linear equation system (5) is formed. Solving system (5) by least squares gives preliminary estimates of the rotation matrix and translation vector in the transformation of the world coordinate system O_W relative to the industrial robot base coordinate system O_R.
4.2: solve for the unit orthogonal matrix closest, in Frobenius norm, to the preliminary rotation estimate.
4.3: solve the linear equation system (6) for the translation vector.
4.4: taking the results of 4.2 and 4.3 as initial values of the rotation matrix and translation vector in the transformation of the world coordinate system O_W relative to the industrial robot base coordinate system O_R, solve formula (7) by nonlinear optimization to obtain the final result of the transformation.
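A sketch of the linear solution of steps 4.1 to 4.3 follows, with the final nonlinear refinement of formula (7) omitted (it would typically be a Levenberg-Marquardt minimization of reprojection error). The 3 × 4 matrices M^i that combine the intrinsics, the hand-eye transform, and the flange pose of the i-th shot are taken as precomputed inputs; all function and variable names are ours.

```python
import numpy as np

def estimate_world_to_base(Ms, world_pts, image_pts):
    """Linear initialization of the world-to-base transform (steps 4.1-4.3).

    Ms[i] is the known 3x4 matrix M^i mapping base-frame homogeneous points
    into the i-th image; world_pts[i] = (X_i, Y_i, Z_i); image_pts[i] =
    (u_i, v_i). Hypothetical sketch of the patent's linear solve.
    """
    A, b = [], []
    for M, P, (u, v) in zip(Ms, world_pts, image_pts):
        m1, m2, m3 = M[0, :3], M[1, :3], M[2, :3]
        m14, m24, m34 = M[:, 3]
        # lambda*(u,v,1)^T = M [R t; 0 1] (P,1)^T; eliminating lambda gives
        # (m_k - w*m3) . (R P + t) = w*m34 - m_k4 for (k,w) in {(1,u),(2,v)}.
        for mk, mk4, w in ((m1, m14, u), (m2, m24, v)):
            c = mk - w * m3
            # Unknown vector x = [row1(R), row2(R), row3(R), t] (12 entries).
            A.append(np.concatenate([c[0] * P, c[1] * P, c[2] * P, c]))
            b.append(w * m34 - mk4)
    x, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    R0, _t0 = x[:9].reshape(3, 3), x[9:]
    # Step 4.2: nearest rotation in Frobenius norm, via SVD.
    U, _, Vt = np.linalg.svd(R0)
    R = U @ Vt
    if np.linalg.det(R) < 0:
        R = U @ np.diag([1.0, 1.0, -1.0]) @ Vt
    # Step 4.3: with R fixed, re-solve the translation linearly.
    At, bt = [], []
    for M, P, (u, v) in zip(Ms, world_pts, image_pts):
        m3, m34 = M[2, :3], M[2, 3]
        for k, w in ((0, u), (1, v)):
            c = M[k, :3] - w * m3
            At.append(c)
            bt.append(w * m34 - M[k, 3] - c @ (R @ P))
    t, *_ = np.linalg.lstsq(np.asarray(At), np.asarray(bt), rcond=None)
    return R, t
```

Note that if all M^i came from a single camera pose, the homogeneous part of this system would be rank-deficient (the classic up-to-scale ambiguity of a projection matrix); photographing each anchor point from its own robot pose is what makes the 12-unknown system well determined.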
Compared with the prior art, the beneficial effects of the present invention are:
1. Existing robot localization methods require additional measuring devices, the process is complicated, and the measurement and localization computations are difficult to integrate and automate. The method of the invention needs no additional measuring device: robot self-positioning is completed directly with the single industrial camera already carried on the industrial robot, which is simpler, faster, and more practical.
2. Traditional single-camera localization by solving a PnP problem requires the camera to image, in one shot, multiple target points distributed over a large spatial range, which the limited field of view of the robot-mounted camera usually makes impossible. The present method only requires the robot's own camera to image six or more anchor points individually in turn, and is not limited by the camera field of view.
3. The newly established solution model and solving method for imaging multiple anchor points one at a time can be widely applied to the self-positioning of all kinds of "hand-eye" robot systems.
4. The method only requires the camera at the robot end to image six or more anchor points of known coordinates one at a time to determine, quickly and conveniently, the transformation between the world coordinate system and the robot base coordinate system, realizing robot self-positioning. Beyond aircraft-assembly automatic drilling and riveting, it can be applied in any other field where there is a clear need for self-positioning of a robot carrying an industrial camera at its end.
Detailed description of the invention
Fig. 1 is a schematic diagram of the system composition of the robot self-positioning method based on a single industrial camera in a particularly preferred embodiment of the invention;
Fig. 2 shows the six anchor-point images captured by the robot in the particularly preferred embodiment, together with the extracted anchor-point image coordinates;
wherein: 1 - processing tool rack; 2 - workpiece to be processed; 3 - industrial robot; 3-1 - robot flange; 3-2 - industrial camera; 4 - robot motion guide rail; 5 - world coordinate system; 6 - anchor point.
Specific embodiment
To make the purpose, technical solution, and effects of the present invention clearer, the invention is further described in detail below by way of example. It should be understood that the specific implementations described here only explain the invention and are not intended to limit it.
As shown in Fig. 1, the self-positioning setup of the invention includes an industrial robot 3 with an end flange 3-1; an industrial camera 3-2 is attached to the front of the flange 3-1.
In this embodiment a robot motion guide rail 4 is arranged under the industrial robot 3, and in the world coordinate system 5 there are six anchor points, distinguishable in a visual image and not all collinear. The six anchor points are located on a processing tool rack 1, on which the workpiece to be processed 2 is also fixed. The single industrial camera 3-2, a perspective-imaging camera, is fixed on the robot flange; its imaging parameters and the 4 × 4 transformation matrix between the camera coordinate system O_C and the flange coordinate system O_F have been calibrated.
In this embodiment the industrial camera is a GC2450M camera produced by AVT (Germany) fitted with a KREUZNACH APO-XENOPLAN 1.4/23-0903 lens produced by Schneider; the image resolution is 2448 × 2050 pixels and the effective field of view is 71 mm × 59 mm. The images have undergone distortion correction, and the camera intrinsic matrix and the hand-eye relation matrix have been calibrated.
In this embodiment the robot is a KUKA KR30 articulated-arm industrial robot. Six concentric black-and-white circular visual targets are arranged on the processing tool rack; the center of the white circle on each target is an anchor point of this embodiment. The coordinates of the six anchor points in the world coordinate system O_W are known and denoted P_i = (X_i, Y_i, Z_i)^T, i = 1, 2, …, 6; the specific values are listed in Table 1. In practical engineering applications, the anchor points are generally the centers of locating points laid out on the workpiece and its tooling; their three-dimensional coordinates are set in the design phase, and the positional accuracy of the locating-point centers is guaranteed by the manufacturing process.
Table 1
In this embodiment, the self-positioning method using the industrial robot 3 comprises the following steps:
Step 1: move the robot end via the robot's interactive teaching function so that the camera 3-2 carried at the robot end successively reaches a position above each of the six anchor points from which the corresponding point can be clearly and individually imaged; at each position, have the camera capture an image of the corresponding anchor point, and record the translation vector from the origin of the robot base coordinate system O_R to the origin of the flange coordinate system O_F at that position, as well as the flange Euler angles α_i, β_i, γ_i, i = 1, 2, …, 6. The six sets of robot motion parameters actually recorded in this embodiment are listed in Table 2.
Table 2
Step 2: use formula (1) to compute the rotation matrix of the flange coordinate system O_F relative to the robot base coordinate system O_R when the i-th anchor point is photographed, and use formula (2) to express the corresponding transformation matrix of O_F relative to O_R.
Step 3: in the i-th image, extract the image coordinates of the i-th anchor point, denoted (u_i, v_i), i = 1, 2, …, 6.
Step 4: from the transformation relations between the coordinate systems and the camera imaging model, use formula (3) to solve for the transformation of the world coordinate system O_W relative to the robot base coordinate system O_R, completing the localization of the robot in the world coordinate system. In formula (3), λ_i is a scale factor and K is the intrinsic matrix of the calibrated industrial camera.
As shown in Fig. 2, in Step 1 of this embodiment the camera carried at the robot end successively reaches positions above the six anchor points to capture the images; the image coordinates (u_i, v_i), i = 1, 2, …, 6, of the six anchor points in the six captured images are distributed as discretely as possible at different locations of the camera image plane. The extracted image-plane coordinates of the six anchor points are listed in Table 3.
Table 3
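The patent does not specify how the white-circle centers (u_i, v_i) are extracted from the images; as a minimal stand-in, the sketch below takes the intensity-weighted centroid of thresholded pixels. A production system would normally fit an ellipse to the circle edge for sub-pixel accuracy.

```python
import numpy as np

def circle_center(gray, threshold=200):
    """Estimate the image coordinates (u, v) of a bright circular target as
    the intensity-weighted centroid of pixels at or above `threshold`.

    Simplified stand-in for the patent's (unspecified) extraction step;
    the threshold value is an arbitrary assumption.
    """
    mask = gray >= threshold
    vs, us = np.nonzero(mask)          # image rows -> v, columns -> u
    w = gray[mask].astype(float)
    return (us * w).sum() / w.sum(), (vs * w).sum() / w.sum()
```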
Further, the method of solving with formula (3) in Step 4 comprises, in this embodiment, the following steps:
4.1: for the known quantities, write the rows of the resulting 3 × 4 matrices in terms of m_1^i, m_2^i, m_3^i (three-dimensional row vectors) and last-column entries m_14^i, m_24^i, m_34^i (scalar values).
Equation (4) follows from formula (3); from (4), the linear equation system (5) is formed. Solving system (5) by least squares gives preliminary estimates of the rotation matrix and translation vector in the transformation of the world coordinate system O_W relative to the robot base coordinate system O_R.
4.2: by singular value decomposition, solve for the unit orthogonal matrix closest, in Frobenius norm, to the preliminary rotation estimate.
4.3: solve the linear equation system (6) for the translation vector.
4.4: taking the results of 4.2 and 4.3 as initial values of the rotation matrix and translation vector in the transformation of the world coordinate system O_W relative to the robot base coordinate system O_R, solve formula (7) by nonlinear optimization, finally obtaining the transformation sought.
To verify the accuracy of the results of this method, the transformation between the robot base coordinate system and the world coordinate system in this embodiment was also measured with an additional independent commercial measuring system.
Compared with the commercial measuring system's localization result, the result obtained by the method of the invention differs by a Frobenius norm of 0.008 for the rotation matrix and 4.952 for the translation vector. The deviation between the self-positioning result and the commercial measuring system's result is thus small, showing that the method is an effective means of robot self-positioning.
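The deviation metric used in this comparison reduces to two norms:

```python
import numpy as np

def pose_deviation(R_a, t_a, R_b, t_b):
    """Compare two localization results as in the embodiment: the Frobenius
    norm of the rotation-matrix difference and the norm of the translation
    difference (the latter in the translation's own units)."""
    return (np.linalg.norm(np.asarray(R_a) - np.asarray(R_b), ord='fro'),
            float(np.linalg.norm(np.asarray(t_a, float) - np.asarray(t_b, float))))
```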
Although embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that a variety of changes, modifications, replacements, and variants can be made to these embodiments without departing from the principles and spirit of the invention; the scope of the invention is defined by the appended claims.

Claims (4)

1. A robot self-positioning method based on a single industrial camera, comprising an industrial robot (3) and a world coordinate system (5), characterized in that the industrial robot (3) includes a robot flange (3-1), on the front of which an industrial camera (3-2) is fixed; in the world coordinate system (5) there are n anchor points (6) distinguishable in a visual image, with n ≥ 6 and the n anchor points not all collinear; the self-positioning method for the industrial robot (3) using the industrial camera (3-2) comprises the following steps:
Step 1: move the industrial robot (3) so that the camera (3-2) carried at the robot end successively reaches a position above the i-th anchor point from which that point can be clearly imaged, i = 1, 2, …, n; at each position, capture an image of the anchor point, and record the translation vector from the origin of the base coordinate system O_R of the industrial robot (3) to the origin of the coordinate system O_F of the robot flange (3-1) at that position, as well as the flange Euler angles α_i, β_i, γ_i.
Step 2: use formula (1) to compute the rotation matrix of the flange coordinate system O_F relative to the robot base coordinate system O_R when the i-th anchor point is photographed, i = 1, 2, …, n, and use formula (2) to express the corresponding transformation matrix of O_F relative to O_R.
Step 3: in the i-th image, extract the image coordinates of the i-th anchor point, denoted (u_i, v_i).
Step 4: from the transformation relations between the coordinate systems and the camera imaging model, use formula (3) to solve for the transformation of the world coordinate system O_W relative to the robot base coordinate system O_R, completing the localization of the robot in the world coordinate system.
In formula (3), λ_i is a scale factor and K is the intrinsic matrix of the industrial camera, whose entries α_x, α_y, s, u_0, v_0 are the calibrated camera imaging parameters.
2. The robot self-positioning method based on a single industrial camera according to claim 1, characterized in that the industrial camera (3-2) is a perspective-imaging camera; the imaging parameters of the industrial camera (3-2) and the 4 × 4 transformation matrix between its coordinate system O_C and the flange (3-1) coordinate system O_F have been calibrated; and the coordinates of the n anchor points in the world coordinate system O_W are known and denoted P_i = (X_i, Y_i, Z_i)^T, i = 1, 2, …, n.
3. The robot self-positioning method based on a single industrial camera according to claim 1, characterized in that in Step 1 the camera (3-2) carried at the end of the industrial robot (3) successively reaches positions above the n anchor points to capture the images, and the image coordinates (u_i, v_i), i = 1, 2, …, n, of the n anchor points in the n captured images are discretely distributed at different locations of the camera image plane.
4. The robot self-positioning method based on a single industrial camera according to claim 1, characterized in that the method of solving with formula (3) in Step 4 comprises the following steps:
4.1: for the known quantities, i = 1, 2, …, n, write the rows of the resulting 3 × 4 matrices in terms of m_1^i, m_2^i, m_3^i (three-dimensional row vectors) and last-column entries m_14^i, m_24^i, m_34^i (scalar values);
Equation (4) follows from formula (3); from (4), the linear equation system (5) is formed. Solving system (5) by least squares gives preliminary estimates of the rotation matrix and translation vector in the transformation of the world coordinate system O_W relative to the base coordinate system O_R of the industrial robot (3);
4.2: solve for the unit orthogonal matrix closest, in Frobenius norm, to the preliminary rotation estimate;
4.3: solve the linear equation system (6) for the translation vector;
4.4: taking the results of 4.2 and 4.3 as initial values of the rotation matrix and translation vector in the transformation of the world coordinate system O_W relative to the base coordinate system O_R of the industrial robot (3), solve formula (7) by nonlinear optimization to obtain the final result of the transformation.
CN201910666397.0A 2019-07-23 2019-07-23 Robot self-positioning method based on single industrial camera Active CN110533727B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910666397.0A CN110533727B (en) 2019-07-23 2019-07-23 Robot self-positioning method based on single industrial camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910666397.0A CN110533727B (en) 2019-07-23 2019-07-23 Robot self-positioning method based on single industrial camera

Publications (2)

Publication Number Publication Date
CN110533727A true CN110533727A (en) 2019-12-03
CN110533727B CN110533727B (en) 2023-07-11

Family

ID=68661843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910666397.0A Active CN110533727B (en) 2019-07-23 2019-07-23 Robot self-positioning method based on single industrial camera

Country Status (1)

Country Link
CN (1) CN110533727B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112157284A (en) * 2020-09-29 2021-01-01 蒙美兰 Industrial robot automatic drilling system and use method
CN112833883A (en) * 2020-12-31 2021-05-25 杭州普锐视科技有限公司 Indoor mobile robot positioning method based on multiple cameras

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108827154A (en) * 2018-07-09 2018-11-16 深圳辰视智能科技有限公司 A kind of robot is without teaching grasping means, device and computer readable storage medium
CN109059755A (en) * 2018-06-11 2018-12-21 天津科技大学 A kind of robot high-precision hand and eye calibrating method
CN109389642A (en) * 2017-08-04 2019-02-26 惠州市阿图达机电有限公司 Vision system is to the scaling method of robot, system and has store function device
CN109807937A (en) * 2018-12-28 2019-05-28 北京信息科技大学 A kind of Robotic Hand-Eye Calibration method based on natural scene

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109389642A (en) * 2017-08-04 2019-02-26 惠州市阿图达机电有限公司 Vision system is to the scaling method of robot, system and has store function device
CN109059755A (en) * 2018-06-11 2018-12-21 天津科技大学 A kind of robot high-precision hand and eye calibrating method
CN108827154A (en) * 2018-07-09 2018-11-16 深圳辰视智能科技有限公司 A kind of robot is without teaching grasping means, device and computer readable storage medium
CN109807937A (en) * 2018-12-28 2019-05-28 北京信息科技大学 A kind of Robotic Hand-Eye Calibration method based on natural scene

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112157284A (en) * 2020-09-29 2021-01-01 蒙美兰 Industrial robot automatic drilling system and use method
CN112833883A (en) * 2020-12-31 2021-05-25 杭州普锐视科技有限公司 Indoor mobile robot positioning method based on multiple cameras
CN112833883B (en) * 2020-12-31 2023-03-10 杭州普锐视科技有限公司 Indoor mobile robot positioning method based on multiple cameras

Also Published As

Publication number Publication date
CN110533727B (en) 2023-07-11

Similar Documents

Publication Publication Date Title
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
KR102280663B1 (en) Calibration method for robot using vision technology
CN108399639A (en) Fast automatic crawl based on deep learning and arrangement method
CN110450163A (en) The general hand and eye calibrating method based on 3D vision without scaling board
CN110276799B (en) Coordinate calibration method, calibration system and mechanical arm
CN107590835A (en) Mechanical arm tool quick change vision positioning system and localization method under a kind of nuclear environment
CN109848951A (en) Automatic processing equipment and method for large workpiece
CN109877840A (en) A kind of double mechanical arms scaling method based on camera optical axis constraint
CN112833792B (en) Precision calibration and verification method for six-degree-of-freedom mechanical arm
TWI699264B (en) Correction method of vision guided robotic arm
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN114643578B (en) Calibration device and method for improving robot vision guiding precision
CN110533727A (en) A kind of robot self-localization method based on single industrial camera
CN110202560A (en) A kind of hand and eye calibrating method based on single feature point
CN109059755B (en) High-precision hand-eye calibration method for robot
CN111145272A (en) Manipulator and camera hand-eye calibration device and method
CN111482964A (en) Novel robot hand-eye calibration method
CN114654465A (en) Welding seam tracking and extracting method based on line laser structure optical vision sensing
CN110962127B (en) Auxiliary calibration device for tail end pose of mechanical arm and calibration method thereof
CN206633018U (en) Integrated apparatus is assembled in flexible on-line measurement for shaft hole matching
CN112238453B (en) Vision-guided robot arm correction method
CN215037637U (en) Camera external parameter calibration device for visual guidance of industrial robot
CN112598752A (en) Calibration method based on visual identification and operation method
CN116619350A (en) Robot error calibration method based on binocular vision measurement
CN211699034U (en) Hand-eye calibration device for manipulator and camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant