CN106041937B - Control method of a manipulator grasping control system based on binocular stereo vision - Google Patents


Info

Publication number
CN106041937B
CN106041937B CN201610676653.0A
Authority
CN
China
Prior art keywords
workpiece
manipulator
point cloud
camera
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610676653.0A
Other languages
Chinese (zh)
Other versions
CN106041937A (en)
Inventor
崔庆
王志飞
张卓辉
向宇
李晓亮
夏璐璐
高梓翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan Alsontech Intelligent Technology Co Ltd
Original Assignee
Henan Alsontech Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan Alsontech Intelligent Technology Co Ltd filed Critical Henan Alsontech Intelligent Technology Co Ltd
Priority to CN201610676653.0A priority Critical patent/CN106041937B/en
Publication of CN106041937A publication Critical patent/CN106041937A/en
Application granted granted Critical
Publication of CN106041937B publication Critical patent/CN106041937B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Abstract

A control method for a manipulator grasping control system based on binocular stereo vision comprises the following steps: calibrate the camera parameters, then establish the conversion between the camera coordinate system and the manipulator coordinate system by teaching the manipulator; sweep a line laser across the scene in an arc while the cameras capture images; extract the laser line from each captured image; locate the laser line precisely by sub-pixel recognition; match homologous points between the laser lines extracted from the two images; compute the three-dimensional coordinates of every point on the laser line by triangulation, thereby completing the scan of the workpieces in the scene and obtaining the point cloud within the camera field of view; build a workpiece point-cloud template and select manipulator grasp points, then match the acquired point cloud against the template and compute the transformation between the two; convert the grasp point of the template into current coordinates with this transformation, transform it into the manipulator coordinate system, and send it to the manipulator, which then executes the grasping action.

Description

Control method of a manipulator grasping control system based on binocular stereo vision
Technical field
The present invention relates to the field of manipulator grasping control, and in particular to a control method of a manipulator grasping control system based on binocular stereo vision that acquires 3D point cloud data of workpieces in a hopper in real time and, through template matching, guides a manipulator to grasp and place the workpieces.
Background technology
At present, industrial automated assembly is receiving increasing attention, and guiding a manipulator by vision is an important route to realizing industrial automation. During the execution of a manipulator task, steering the manipulator to the target position is a key problem. CN103963058A (a mechanical-arm grasping control method based on multi-directional visual positioning), CN104331894A (a robot de-stacking method based on binocular stereo vision) and CN104677330A (a small-baseline binocular stereo vision ranging system) each disclose a vision-guided manipulator grasping method. However, all of these methods recognize products of different models poorly, and their positioning accuracy is insufficient for grasping and placement requirements.
Summary of the invention
The present invention proposes a control method of a manipulator grasping control system based on binocular stereo vision. A binocular stereo vision system computes the three-dimensional coordinates of an object from its two-dimensional images, and these coordinates are then used to drive the manipulator automatically to the target position, so that workpieces placed arbitrarily in a hopper can be grasped and subsequently placed or assembled. This solves the automation problem of industrial assembly and of loading and unloading, reducing labor cost, improving production efficiency, and eliminating the safety risks of manual operation.
The technical problem addressed by the invention is solved by the following technical scheme. A control method of a manipulator grasping control system based on binocular stereo vision is proposed, comprising the following steps: 1) calibrate the intrinsic and extrinsic parameters of the cameras, then establish the conversion between the camera coordinate system and the manipulator coordinate system by teaching the manipulator; 2) the line laser sweeps the scene in an arc while the cameras located on either side of the laser capture images from the left and right viewing angles; 3) extract the laser line from each of the two camera images; 4) locate the laser line precisely by sub-pixel recognition; 5) match homologous points between the laser lines extracted from the two images; 6) compute the three-dimensional coordinates of every point on the laser line by triangulation, completing the scan of the workpieces in the scene and obtaining the point cloud within the camera field of view; 7) build a workpiece point-cloud template and select manipulator grasp points, then match the acquired point cloud against the template to find the target workpiece and compute the transformation between the two; 8) convert the grasp point set in the template into current coordinates with the transformation computed in step 7), transform that coordinate into the manipulator coordinate system and send it to the manipulator; if the coordinate lies within the manipulator's working range, the manipulator grasps and places the workpiece, after which the cameras start shooting again, entering the next shooting-and-grasping cycle.
The technical problem addressed by the invention may be further solved by the following technical measures.
In the aforementioned control method, in step 1), the cameras are calibrated with coded circular marker points; during teaching, the manipulator is taught against the positions of the coded circular markers, thereby establishing the conversion between the camera coordinate system and the manipulator coordinate system.
In the aforementioned control method, in step 3), the laser line is extracted with an illumination-independent method that combines a gray threshold with the per-row pixel maximum of the image to achieve a preliminary, illumination-independent extraction.
In the aforementioned control method, in step 3), scattered laser-line points caused by specular reflection from the workpiece surface are rejected during extraction by limiting the column-position deviation of laser-line pixels.
In the aforementioned control method, in step 4), the center of the laser line is located precisely by the gray-centroid method, achieving sub-pixel positioning of the laser line.
In the aforementioned control method, in step 6), the three-dimensional coordinates of the points on the laser line are computed by triangulation as follows:
After a camera shoots an image, an image point together with the lens center determines a straight line; with the coordinates of both points expressed in the camera coordinate system, the equation of the space line through them is

(X - Xs)/(a1x + a2y - a3f) = (Y - Ys)/(b1x + b2y - b3f) = (Z - Zs)/(c1x + c2y - c3f)

where X, Y, Z are the three-dimensional coordinates of the target point; x, y, f are the image-point coordinates; Xs, Ys, Zs are the coordinates of the lens center; and ai, bi, ci (i = 1, 2, 3) are the coordinate-system transformation parameters.
Choosing a pair of homologous points in the two images and substituting them into the line equation above yields a system of four equations, from which the values of X, Y and Z, the three-dimensional coordinates of the point on the laser line, are obtained.
In the aforementioned control method, step 7) specifically comprises the following steps: 1) start the dual-camera system and scan the workpiece point cloud, then delete erroneous points in the interface to obtain the completed workpiece point-cloud model; 2) match the point cloud obtained in step 6) against the workpiece point-cloud model according to the curvature information of the workpiece surface data, obtaining the target workpiece; 3) compute the transformation between the point cloud and the workpiece point-cloud model.
Compared with the prior art, the present invention has clear advantages and beneficial effects. With the above technical scheme, the control method of the manipulator grasping control system based on binocular stereo vision achieves considerable technical progress and practicality and has broad industrial value, with at least the following advantages: 1) the system obtains workpiece point-cloud data by combining binocular vision with line-laser scanning and, through point-cloud modeling and matching, determines the workpiece grasping position and guides the manipulator to grasp and place the workpiece; this automates grasping and placement, improving enterprise production efficiency and competitiveness.
2) Because the system is combined with the manipulator, the operator only needs to set up the workpiece grasping and placement scene on a computer; the site itself can run unmanned, guaranteeing operator safety.
3) The system displays the point-cloud scanning process and the workpiece scanning result intuitively, so operators can follow the system's running state in real time, which improves the maintainability of the system.
Description of the drawings
Fig. 1 is the hardware structure diagram of the system of the present invention;
Fig. 2 is the marker-point code pattern of the present invention;
Fig. 3 is the point-cloud template drawing of the present invention;
Fig. 4 is the workflow diagram of the system of the present invention;
Fig. 5 is the triangulation principle diagram of the present invention.
【Description of main element symbols】
O1-xyz: coordinate system of the first camera
O2-xyz: coordinate system of the second camera
P1: image point in the first camera's coordinate system
P2: image point in the second camera's coordinate system
S1: lens center of the first camera
S2: lens center of the second camera
W: a point in real space
Detailed description of the embodiments
To further explain the technical means taken by the present invention to achieve its intended purpose and their effects, the specific method, steps and effects of the proposed control method of a manipulator grasping control system based on binocular stereo vision are described in detail below with reference to the attached drawings and preferred embodiments.
Referring to Figs. 1 to 5, a control method of a manipulator grasping control system based on binocular stereo vision according to the present invention comprises the following steps:
1) Calibrate the intrinsic and extrinsic parameters of the cameras, then establish the conversion between the camera coordinate system and the manipulator coordinate system by teaching the manipulator. In this embodiment, the cameras are calibrated with coded circular marker points, and the teaching of the manipulator is completed against the positions of the coded circular markers, thereby establishing the conversion between the camera coordinate system and the manipulator coordinate system.
In the embodiment of the present invention, eight circular dots form one coded point group; the positional relationships within each group are unique, and each group is assigned a unique number. Fig. 2 shows the coded point layout; in use, the coded points are cut apart by group and pasted on the calibration board. The coded points are used to calibrate the intrinsic and extrinsic camera parameters and to establish the conversion between the camera coordinate system and the manipulator coordinate system. This method has high calibration accuracy and adapts to various image deformations: it adapts to deformation because both the shape and the brightness information of the circular dots are considered during extraction, and its accuracy is high because the circular outer boundary serves as the support region for extracting each dot's center, which is more accurate than corner extraction on a checkerboard-style calibration board. The calibration method also corrects both tangential and radial lens distortion, further guaranteeing measurement accuracy.
2) The line laser sweeps the scene in an arc while the cameras on either side of it work from the left and right viewing angles. The hardware of the manipulator grasping control system of the present invention comprises two cameras and a line laser between them; see Fig. 1 for the layout.
3) Extract the laser line from each of the two camera images. When extracting the laser line projected on the workpiece, an illumination-independent method is used: a gray threshold is combined with the per-row pixel maximum of the image to achieve a preliminary, illumination-independent extraction. This guarantees stable extraction of the laser line under ambient lighting and makes the extraction insensitive to illumination changes.
During extraction, scattered outliers must be removed from the laser line. The present invention rejects scatter caused by specular reflection from the workpiece surface by limiting the column-position deviation of laser-line pixels; this effectively eliminates outliers that do not lie on the laser line and provides accurate seed points for the subsequent matching of homologous points.
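The extraction and scatter-rejection steps above can be sketched in a few lines. This is a minimal illustration rather than the patent's implementation: the function name `extract_laser_line`, the intensity threshold of 60, the five-row median window and the 5-pixel deviation limit are all assumed parameters, and the image is assumed to be an 8-bit grayscale frame containing one roughly vertical laser stripe.

```python
import numpy as np

def extract_laser_line(img, min_intensity=60, max_jump=5.0):
    """Preliminary laser-line extraction: gray threshold combined with the
    per-row brightest pixel, followed by rejection of scattered points
    whose column deviates too far from their neighbors (reflection noise)."""
    candidates = {}
    for r in range(img.shape[0]):
        c = int(np.argmax(img[r]))            # brightest pixel of this row
        if img[r, c] >= min_intensity:        # gray threshold: skip dark rows
            candidates[r] = c
    accepted = {}
    keys = sorted(candidates)
    for i, r in enumerate(keys):
        window = [candidates[k] for k in keys[max(0, i - 2):i + 3]]
        # limit the column-position deviation against the local median
        if abs(candidates[r] - float(np.median(window))) <= max_jump:
            accepted[r] = candidates[r]
    return accepted
```

On a synthetic image with a straight stripe at one column plus a single specular outlier, the outlier row is dropped while the rest of the line survives.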
4) Locate the laser line precisely by sub-pixel recognition. For the sub-pixel positioning of points on the laser line, the present invention locates the center of the line precisely with the gray-centroid method, achieving sub-pixel positioning of the laser line and improving the positioning accuracy of the algorithm.
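A minimal sketch of the gray-centroid refinement named above; the function name and the window half-width of 3 pixels are illustrative assumptions, not values from the patent.

```python
import numpy as np

def gray_centroid(row, peak_col, half_width=3):
    """Refine an integer laser-line column to sub-pixel precision as the
    intensity-weighted center of gravity of a small window around the peak."""
    lo = max(0, peak_col - half_width)
    hi = min(len(row), peak_col + half_width + 1)
    cols = np.arange(lo, hi, dtype=float)
    weights = np.asarray(row[lo:hi], dtype=float)
    return float((cols * weights).sum() / weights.sum())
```

For a symmetric intensity profile the centroid coincides with the integer peak; a skewed profile pulls the sub-pixel position toward the brighter side, which is exactly the refinement the integer per-row maximum cannot provide.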
5) Match homologous points between the laser lines extracted from the two camera images.
6) Compute the three-dimensional coordinates of every point on the laser line by triangulation, completing the scan of the workpieces in the scene and obtaining the point cloud within the camera field of view.
Referring to Fig. 5, O1-xyz and O2-xyz are the coordinate systems of the two cameras; P1 and P2 are a pair of homologous points; S1 and S2 are the lens centers; W is a point in real space. P1 and S1 determine one straight line in space, P2 and S2 determine another, and the two lines intersect in space at W.
Space line: after a camera shoots an image, an image point on the camera CCD together with the lens center determines a straight line. With the coordinates of both points expressed in the camera coordinate system, the equation of the space line through them is

(X - Xs)/(a1x + a2y - a3f) = (Y - Ys)/(b1x + b2y - b3f) = (Z - Zs)/(c1x + c2y - c3f)

where X, Y, Z are the three-dimensional coordinates of the target point and are the unknowns;
x, y, f are the image-point coordinates, known quantities obtained by analyzing the image;
Xs, Ys, Zs are the coordinates of the lens center, known quantities obtained during camera calibration;
ai, bi, ci (i = 1, 2, 3) are the coordinate-system transformation parameters, known quantities obtained during camera calibration.
Each image yields one space-line equation (two independent scalar equations), so the two images give four equations in total, while the formula contains only three unknowns (the three-dimensional point coordinates X, Y, Z); the over-determined system can therefore be solved for the three unknowns.
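The four-equations, three-unknowns solve above can be sketched as a least-squares intersection of the two image rays. The function below takes each ray as an optical center plus a direction (in practice the direction would be derived from x, y, f and the ai, bi, ci parameters); the projector form used here is an equivalent reformulation for numerical convenience, not the patent's exact notation.

```python
import numpy as np

def triangulate(origin1, dir1, origin2, dir2):
    """Least-squares 3-D point closest to the two space lines (one per
    camera): the over-determined linear system solved for X, Y, Z."""
    rows_A, rows_b = [], []
    for o, d in ((origin1, dir1), (origin2, dir2)):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        # (I - d d^T) projects onto the plane normal to the ray, giving two
        # independent linear constraints per camera: (I - d d^T) X = (I - d d^T) o
        P = np.eye(3) - np.outer(d, d)
        rows_A.append(P)
        rows_b.append(P @ np.asarray(o, float))
    A = np.vstack(rows_A)
    b = np.concatenate(rows_b)
    X, *_ = np.linalg.lstsq(A, b, rcond=None)
    return X
```

When the two rays genuinely intersect, the least-squares solution is the intersection point; with noisy homologous points it returns the point of closest approach between the rays.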
7) Build the workpiece point-cloud template and select manipulator grasp points, then match the acquired point cloud against the workpiece point-cloud template to find the target workpiece and compute the transformation between the two.
Unlike existing systems, which import CAD data, the present invention sets the manipulator's grasping position by self-modeling the workpiece point cloud. The modeling process is as follows: start the dual-camera system, scan the workpiece point cloud, delete erroneous points in the interface to obtain the completed workpiece point-cloud model, and then select the manipulator grasp points on it. Fig. 3 shows the point-cloud template and the set grasping position: one view of the 3D point-cloud data, which can be inspected in 3D in the developed system interface.
Once the point-cloud template of the workpiece is built, subsequent workpiece scanning and manipulator control can proceed. The binocular-camera laser system scans the workpiece to obtain its 3D data, which is then matched automatically against the point-cloud template. The matching method used by the present invention performs point-cloud matching according to the curvature information of the workpiece surface data, and then transforms the grasping position set in the template cloud into the current workpiece coordinate system, guiding the manipulator to grasp and place the workpiece. Surface curvature is used as the matching feature because it is an invariant of the workpiece: as long as the surface does not deform, its curvature does not change. Unlike existing point, line, or surface matching methods, matching on surface-curvature information does not depend on specific shape features; even if part of the workpiece's features change, the grasping position can still be recovered accurately as long as the remaining curvature matches the template. Finally, the transformation between the scanned point cloud and the workpiece point-cloud model is computed from the point-cloud matching.
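Once the curvature-based matching has produced corresponding point pairs, the transformation between the scanned cloud and the template can be computed in closed form. The SVD-based (Kabsch) estimator below is one standard way to obtain that rigid transform; it is offered as an illustrative sketch under the assumption that correspondences are already established, and the patent itself does not name this particular algorithm.

```python
import numpy as np

def rigid_transform(template_pts, scene_pts):
    """Least-squares rotation R and translation t mapping matched template
    points onto scene points (Kabsch/SVD); the correspondences are assumed
    already found, e.g. via surface-curvature features."""
    P = np.asarray(template_pts, float)
    Q = np.asarray(scene_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)             # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, sign])         # guard against reflection solutions
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Applying a known rotation and translation to a set of template points and feeding both sets back in recovers exactly that rotation and translation, which is how such an estimator is typically sanity-checked.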
8) Convert the grasp point set in the template cloud into current coordinates with the transformation computed in step 7), transform that coordinate into the manipulator coordinate system, and send it to the manipulator. If the coordinate lies within the manipulator's working range, the manipulator grasps and places the workpiece; the cameras then start shooting again, entering the next shooting-and-grasping cycle.
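The final coordinate hand-off in the step above can be sketched as follows. All names are illustrative, the 4x4 camera-to-robot matrix stands in for the hand-eye conversion established by teaching, and the axis-aligned box is a deliberately simplified stand-in for a real manipulator's working range.

```python
import numpy as np

def grasp_in_robot_frame(grasp_template, R_match, t_match,
                         T_cam_to_robot, limits_lo, limits_hi):
    """Move the taught grasp point into the current workpiece pose using the
    matching result (R_match, t_match), convert it to the manipulator
    coordinate system via the calibrated 4x4 transform, and check reach."""
    p_cam = (np.asarray(R_match, float) @ np.asarray(grasp_template, float)
             + np.asarray(t_match, float))
    p_robot = (np.asarray(T_cam_to_robot, float) @ np.append(p_cam, 1.0))[:3]
    reachable = bool(np.all(p_robot >= limits_lo) and np.all(p_robot <= limits_hi))
    return p_robot, reachable
```

Only when the reachability check passes would the coordinate be sent to the manipulator; otherwise the cycle restarts with a new shot, mirroring the flow described in the step.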
The above is only a preferred embodiment of the present invention and does not limit the invention in any form. Although the invention has been disclosed through a preferred embodiment, any person skilled in the art may, without departing from the scope of the technical scheme of the invention, make minor changes or modifications using the technical content disclosed above; any simple modification, equivalent change or refinement of the above embodiment that does not depart from the technical scheme of the invention still falls within its scope.

Claims (6)

1. A control method of a manipulator grasping control system based on binocular stereo vision, characterized by comprising the following steps:
Step 1: calibrate the intrinsic and extrinsic parameters of the cameras, then establish the conversion between the camera coordinate system and the manipulator coordinate system by teaching the manipulator;
Step 2: the line laser sweeps the scene in an arc while the cameras located on either side of the laser capture images from the left and right viewing angles;
Step 3: extract the laser line from each of the two camera images;
Step 4: locate the laser line precisely by sub-pixel recognition;
Step 5: match homologous points between the laser lines extracted from the two camera images;
Step 6: compute the three-dimensional coordinates of every point on the laser line by triangulation, completing the scan of the workpieces in the scene and obtaining the point cloud within the camera field of view;
Step 7: build the workpiece point-cloud template, select manipulator grasp points, then match the acquired point cloud against the workpiece point-cloud template to find the target workpiece and compute the transformation between the two, specifically comprising the following steps:
(1) start the dual-camera system and scan the workpiece point cloud, then delete erroneous points in the interface to obtain the completed workpiece point-cloud model;
(2) match the point cloud obtained in Step 6 against the workpiece point-cloud model according to the curvature information of the workpiece surface data, obtaining the target workpiece;
(3) compute the transformation between the point cloud and the workpiece point-cloud model;
Step 8: convert the grasp point set in the template cloud into current coordinates with the transformation computed in Step 7, transform that coordinate into the manipulator coordinate system and send it to the manipulator; if the coordinate lies within the manipulator's working range, the manipulator grasps and places the workpiece, and the cameras then start shooting again, entering the next shooting-and-grasping cycle.
2. The control method according to claim 1, characterized in that in Step 1 the cameras are calibrated with coded circular marker points, and during teaching the manipulator is taught against the positions of the coded circular markers, thereby establishing the conversion between the camera coordinate system and the manipulator coordinate system.
3. The control method according to claim 1, characterized in that in Step 3, when extracting the laser line, an illumination-independent extraction method is used, namely combining a gray threshold with the per-row pixel maximum of the image to achieve a preliminary, illumination-independent extraction.
4. The control method according to claim 1, characterized in that in Step 3, during laser-line extraction, scattered laser-line points caused by specular reflection from the workpiece surface are rejected by limiting the column-position deviation of laser-line pixels.
5. The control method according to claim 1, characterized in that in Step 4 the center of the laser line is located precisely by the gray-centroid method, achieving sub-pixel positioning of the laser line.
6. The control method according to claim 1, characterized in that in Step 6 the three-dimensional coordinates of points on the laser line are computed by triangulation as follows: after a camera shoots an image, an image point together with the lens center determines a straight line; with the coordinates of both points expressed in the camera coordinate system, the equation of the space line through them is

(X - Xs)/(a1x + a2y - a3f) = (Y - Ys)/(b1x + b2y - b3f) = (Z - Zs)/(c1x + c2y - c3f)

where X, Y, Z are the three-dimensional coordinates of the target point; x, y, f are the image-point coordinates; Xs, Ys, Zs are the coordinates of the lens center; and ai, bi, ci (i = 1, 2, 3) are the coordinate-system transformation parameters;
choosing a pair of homologous points in the two images and substituting them into the line equation above yields a system of four equations, from which the values of X, Y and Z, the three-dimensional coordinates of the point on the laser line, are obtained.
CN201610676653.0A 2016-08-16 2016-08-16 Control method of a manipulator grasping control system based on binocular stereo vision Active CN106041937B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610676653.0A CN106041937B (en) 2016-08-16 2016-08-16 Control method of a manipulator grasping control system based on binocular stereo vision


Publications (2)

Publication Number Publication Date
CN106041937A CN106041937A (en) 2016-10-26
CN106041937B true CN106041937B (en) 2018-09-14

Family

ID=57195003

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610676653.0A Active CN106041937B (en) 2016-08-16 2016-08-16 Control method of a manipulator grasping control system based on binocular stereo vision

Country Status (1)

Country Link
CN (1) CN106041937B (en)

Families Citing this family (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106530297B (en) * 2016-11-11 2019-06-07 北京睿思奥图智能科技有限公司 Grasping body area positioning method based on point cloud registering
CN106767527B (en) * 2016-12-07 2019-06-04 西安知象光电科技有限公司 A kind of optics mixing detection method of three-D profile
CN107030690A (en) * 2016-12-22 2017-08-11 中国科学院沈阳自动化研究所 A kind of mechanical arm barrier-avoiding method of view-based access control model
US11366450B2 (en) * 2017-03-23 2022-06-21 Abb Schweiz Ag Robot localization in a workspace via detection of a datum
CN107186708B (en) * 2017-04-25 2020-05-12 珠海智卓投资管理有限公司 Hand-eye servo robot grabbing system and method based on deep learning image segmentation technology
CN107192331A (en) * 2017-06-20 2017-09-22 佛山市南海区广工大数控装备协同创新研究院 A kind of workpiece grabbing method based on binocular vision
CN107301666B (en) * 2017-06-28 2021-04-13 电子科技大学 Automatic robot shooting method
CN107292930B (en) * 2017-07-19 2020-07-14 云南电网有限责任公司电力科学研究院 Mechanical arm guiding method and device
CN107932502A (en) * 2017-11-07 2018-04-20 陕西科技大学 A kind of SCARA method for planning track of robot based on binocular stereo vision
JP6693981B2 (en) * 2018-02-19 2020-05-13 ファナック株式会社 Simulation device for simulating robot movement
CN108834576B (en) * 2018-06-07 2021-03-26 华南农业大学 Citrus picking robot based on binocular vision and implementation method thereof
CN109129480A (en) * 2018-08-24 2019-01-04 上海理工大学 A kind of mechanical arm system of processing based on binocular 3D positioning
CN109146939A (en) * 2018-09-07 2019-01-04 中科新松有限公司 A kind of generation method and system of workpiece grabbing template
CN109335649B (en) * 2018-09-26 2020-05-01 重庆巴奥科技有限公司 Irregular distribution grabbing and refining method for intelligent goods placement
CN109318232A (en) * 2018-10-22 2019-02-12 佛山智能装备技术研究院 A kind of polynary sensory perceptual system of industrial robot
CN109513629B (en) * 2018-11-14 2021-06-11 深圳蓝胖子机器智能有限公司 Method, device and computer readable storage medium for sorting packages
CN109513630B (en) * 2018-11-14 2021-06-11 深圳蓝胖子机器智能有限公司 Package sorting system, control method thereof and storage medium
CN109297433A (en) * 2018-11-15 2019-02-01 青岛星晖昌达智能自动化装备有限公司 3D vision guide de-stacking measuring system and its control method
CN109202912B (en) * 2018-11-15 2020-09-11 太原理工大学 Method for registering target contour point cloud based on monocular depth sensor and mechanical arm
CN109227205B (en) * 2018-11-27 2021-01-01 上海交通大学 Control method and system for feeding and discharging of double-station robot machine tool
CN109658413B (en) * 2018-12-12 2022-08-09 达闼机器人股份有限公司 Method for detecting grabbing position of robot target object
CN110355754B (en) * 2018-12-15 2023-09-22 深圳铭杰医疗科技有限公司 Robot hand-eye system, control method, device and storage medium
CN109588320A (en) * 2019-01-21 2019-04-09 河南埃尔森智能科技有限公司 A kind of unmanned milk cow milking system based on 3D vision guide
CN109532522A (en) * 2019-01-21 2019-03-29 河南埃尔森智能科技有限公司 A kind of unmanned charging system of automobile based on 3D vision technique and its application method
CN109498914A (en) * 2019-01-21 2019-03-22 河南埃尔森智能科技有限公司 A kind of automatic intravenous injection system and its injecting method based on 3D vision guide
CN109814124A (en) * 2019-01-28 2019-05-28 河北省科学院应用数学研究所 A kind of robot positioning system and method based on structure light 3 D sensor
CN109671123B (en) * 2019-01-31 2023-10-17 泉州华中科技大学智能制造研究院 Sole glue spraying equipment and method based on monocular vision
CN111738971B (en) * 2019-03-19 2024-02-27 北京伟景智能科技有限公司 Circuit board stereoscopic scanning detection method based on line laser binocular stereoscopic vision
CN111452036B (en) * 2019-03-19 2023-08-04 北京伟景智能科技有限公司 Workpiece grabbing method based on line laser binocular stereoscopic vision
CN109940616B (en) * 2019-03-21 2022-06-03 佛山智能装备技术研究院 Intelligent grabbing system and method based on brain-cerebellum mode
CN110091330A (en) * 2019-04-30 2019-08-06 深圳市视觉龙科技有限公司 A kind of MiniOCV black charging tray vision guide deviation-rectifying system
CN110340891B (en) * 2019-07-11 2022-05-24 河海大学常州校区 Mechanical arm positioning and grabbing system and method based on point cloud template matching technology
CN112207857A (en) * 2019-07-12 2021-01-12 上海智殷自动化科技有限公司 Offline teaching system based on binocular vision system
CN110424754B (en) * 2019-07-20 2022-01-11 深圳中物智建科技有限公司 Butt joint method and system for wall building robot
CN110517208B (en) * 2019-08-19 2023-06-16 广东弓叶科技有限公司 Coordinate system association method and system
CN110587616B (en) * 2019-08-20 2023-01-24 辽宁成大生物股份有限公司 Cell factory operation process method
CN110539297B (en) * 2019-08-21 2020-12-22 长春工程学院 3D vision-guided wheel set matching manipulator positioning method and device
CN110988586B (en) * 2019-10-28 2021-11-16 国电南瑞科技股份有限公司 Accurate positioning method and system for live-working distribution-network robot operation
CN110883774B (en) * 2019-11-20 2023-04-11 珠海格力智能装备有限公司 Robot joint angle zero calibration system, method and storage medium
CN111067197A (en) * 2019-12-16 2020-04-28 河南埃尔森智能科技有限公司 Robot sole dynamic gluing system and method based on 3D scanning
CN110881748A (en) * 2019-12-16 2020-03-17 埃视森智能科技(上海)有限公司 Robot sole automatic gluing system and method based on 3D scanning
CN111136656B (en) * 2019-12-24 2020-12-08 上海智殷自动化科技有限公司 Method for automatically identifying and grabbing three-dimensional irregular object of robot
CN111872922B (en) * 2020-07-29 2021-09-03 贵州电网有限责任公司 Three-degree-of-freedom parallel robot hand-eye calibration method based on 3D vision sensor
CN111941425B (en) * 2020-08-17 2023-02-10 沈阳飞机工业(集团)有限公司 Rapid workpiece positioning method of robot milling system based on laser tracker and binocular camera
CN112150439B (en) * 2020-09-23 2023-10-20 创新奇智(青岛)科技有限公司 Automatic sorting equipment and sorting method for injection molding parts
CN112244078B (en) * 2020-10-15 2022-03-29 佛山松瀚智能设备有限公司 Feeding positioning method of automatic shrimp peeling machine
CN112476434B (en) * 2020-11-24 2021-12-28 新拓三维技术(深圳)有限公司 Visual 3D pick-and-place method and system based on cooperative robot
CN112935562A (en) * 2021-01-19 2021-06-11 中国科学院云南天文台 Laser precision machining method based on paraxial offline measurement
CN113378626A (en) * 2021-04-22 2021-09-10 北京铁科首钢轨道技术股份有限公司 Visual grabbing method for elastic strips
CN113610741A (en) * 2021-08-18 2021-11-05 梅卡曼德(北京)机器人科技有限公司 Point cloud processing method and device based on laser line scanning
CN115903105A (en) * 2021-09-30 2023-04-04 梅卡曼德(北京)机器人科技有限公司 Bawell prism, in-line laser, laser projection module and laser 3D camera
CN114018214A (en) * 2021-10-18 2022-02-08 武汉理工大学 Marker binocular sub-pixel distance measurement method based on hardware acceleration system
CN114102593B (en) * 2021-11-24 2023-03-28 航天晨光股份有限公司 Method for grabbing regular materials by robot based on two-dimensional low-definition image
CN113989278A (en) * 2021-12-24 2022-01-28 湖南视比特机器人有限公司 Positioning method and system of gripping device
CN114677429B (en) * 2022-05-27 2022-08-30 深圳广成创新技术有限公司 Positioning method and device of manipulator, computer equipment and storage medium
CN115106312A (en) * 2022-06-13 2022-09-27 中实洛阳重型机械有限公司 Intelligent ore sorting device based on binocular camera laser guide
CN117104831A (en) * 2023-09-01 2023-11-24 中信戴卡股份有限公司 Robot 3D recognition and processing method and system for knuckle workpiece

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100345150B1 (en) * 2000-09-29 2002-07-24 현대자동차주식회사 Method for compensating robot position using laser analyzer
JP5744587B2 (en) * 2011-03-24 2015-07-08 キヤノン株式会社 Robot control apparatus, robot control method, program, and recording medium
CN102567989A (en) * 2011-11-30 2012-07-11 重庆大学 Space positioning method based on binocular stereo vision
CN103247053B (en) * 2013-05-16 2015-10-14 大连理工大学 Accurate part positioning method based on binocular microscopic stereo vision
CN103271784B (en) * 2013-06-06 2015-06-10 山东科技大学 Man-machine interactive manipulator control system and method based on binocular vision
CN103499297B (en) * 2013-10-25 2016-01-13 爱科维申科技(天津)有限公司 High-precision measurement method based on CCD
CN203557388U (en) * 2013-10-29 2014-04-23 中国人民解放军总装备部军械技术研究所 Target pose acquisition mechanism and target grabbing system of an explosive-ordnance-disposal robot
CN105563481B (en) * 2014-11-11 2018-06-29 沈阳新松机器人自动化股份有限公司 Robot vision guidance method for peg-in-hole assembly
CN104331894A (en) * 2014-11-19 2015-02-04 山东省科学院自动化研究所 Robot unstacking method based on binocular stereoscopic vision
CN105058387A (en) * 2015-07-17 2015-11-18 北京航空航天大学 Industrial robot base coordinate system calibration method based on laser tracker

Also Published As

Publication number Publication date
CN106041937A (en) 2016-10-26

Similar Documents

Publication Publication Date Title
CN106041937B (en) Control method of a manipulator grasping control system based on binocular stereo vision
CN112476434B (en) Visual 3D pick-and-place method and system based on cooperative robot
CN108274092B (en) Automatic groove cutting system and method based on three-dimensional vision and model matching
CN109272523B (en) Random stacking piston pose estimation method based on improved CVFH (continuously variable frequency) and CRH (Crh) characteristics
CN104842361B (en) Robotic system with 3d box location functionality
CN110751691B (en) Automatic pipe fitting grabbing method based on binocular vision
CN104626169A (en) Robot part grabbing method based on vision and mechanical comprehensive positioning
Bone et al. Automated modeling and robotic grasping of unknown three-dimensional objects
CN110084243B (en) File identification and positioning method based on two-dimensional code and monocular camera
CN107590835A (en) Vision positioning system and positioning method for quick tool change of a mechanical arm in a nuclear environment
CN111121655A (en) Visual detection method for the pose and aperture of equal-diameter hole patterns on a coplanar workpiece
CN108460552B (en) Steel storage control system based on machine vision and PLC
CN106446888B (en) Multi-identifier identification method and equipment for camera module
CN112372641B (en) Household service robot character grabbing method based on visual feedforward and visual feedback
CN114952809A (en) Workpiece identification and pose detection method and system and grabbing control method of mechanical arm
US20210283782A1 (en) Measurement parameter optimization method and device, and computer control program stored on computer-readable storage medium
CN112109072B (en) Accurate 6D pose measurement and grabbing method for large sparse feature tray
CN115213896A (en) Object grabbing method, system and equipment based on mechanical arm and storage medium
Pan et al. Manipulator package sorting and placing system based on computer vision
CN110992416A (en) High-reflection-surface metal part pose measurement method based on binocular vision and CAD model
CN112633187B (en) Automatic robot carrying method, system and storage medium based on image analysis
Lin et al. Vision based object grasping of industrial manipulator
CN114187312A (en) Target object grabbing method, device, system, storage medium and equipment
CN215970736U (en) Steel rail marking device based on three-dimensional visual guidance
CN111080685A (en) Airplane sheet metal part three-dimensional reconstruction method and system based on multi-view stereoscopic vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant