CN104786226A - Posture and moving track positioning system and method of robot grabbing online workpiece - Google Patents

Posture and moving track positioning system and method of robot grabbing online workpiece

Info

Publication number
CN104786226A
CN104786226A (application CN201510136679.1A)
Authority
CN
China
Prior art keywords
workpiece
coordinate system
point
camera
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510136679.1A
Other languages
Chinese (zh)
Inventor
全燕鸣
朱正伟
郭清达
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201510136679.1A priority Critical patent/CN104786226A/en
Publication of CN104786226A publication Critical patent/CN104786226A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a pose and motion-trajectory positioning system and method for a robot grabbing an online workpiece. The device comprises a camera support, an industrial camera, the workpiece under test, a workpiece conveyor belt, an end effector, a robot system, a worktable, an image processor and the like. In the method, a depth positioning model of a planar point in the vertical direction of an absolute coordinate system, a depth positioning model of a spatial point in the vertical direction of the absolute coordinate system, an X-axis positioning model of a planar point in the horizontal direction of the absolute coordinate system and an X-axis positioning model of a spatial point in the horizontal direction of the absolute coordinate system are established; combined with prior knowledge of the workpiece, the three-dimensional coordinates of all points imaged in the camera can be solved, so that the robot can be guided to adopt the correct end-effector pose and motion trajectory to grab the workpiece accurately and intelligently. The system and method have the advantages of high grabbing precision, simple structure and low cost.

Description

Robot pose and motion-trajectory positioning system and method for grabbing an online workpiece
Technical field
The present invention relates to the fields of robot motion planning and control and of vision-based positioning, and in particular to a robot pose and motion-trajectory positioning system and method for grabbing an online (conveyor-borne) workpiece.
Background technology
At present, industrial robots are widely applied in many industrial fields, where they replace manual labour in performing accurate, repeatable motions. To complete the intelligent grabbing of a three-dimensional workpiece, a robot must move along a particular trajectory with the corresponding pose. For grabbing static three-dimensional workpieces, the grabbing pose and motion trajectory of an industrial robot are currently located by on-line teaching or off-line programming; for grabbing moving workpieces on line in real time, existing guidance systems include monocular-vision systems, binocular-vision systems, and systems combining structured light with vision. However, existing three-dimensional pose and motion-trajectory positioning techniques for industrial robots have shortcomings in several respects:
1) On-line teaching does not require calibrating the tool coordinate system of the robot; the robot pose, motion trajectory and clamping parameters are obtained from the experience and visual estimation of the teaching engineer. System accuracy is low, and on-line programming is tedious and consumes production time, so it is difficult to meet today's demand for flexible production with rapid changeover between product varieties; in particular, it cannot meet the demand for automatically grabbing on-line conveyed workpieces whose pose is uncertain.
2) The robot must be calibrated to obtain a true spatial model of the robot position. Conventional calibration equipment includes coordinate measuring machines, articulated-arm measuring machines and laser trackers, but their use is restricted in working scenarios, and the positioning calibration error remains relatively large.
3) A single-camera vision system is generally used to extract information in a two-dimensional plane; it cannot obtain information about the workpiece along the Z axis (the depth direction), and only the in-plane position and angle are passed to the robot. The camera optical axis must be kept exactly perpendicular to the workpiece plane, and the depth can only be preset manually as a fixed empirical value.
4) When structured light is combined with a single camera to obtain workpiece information, graphic decoding computation is required. To guarantee calculation accuracy, the relative position of camera and workpiece must be kept fixed, which makes this approach difficult to apply to grabbing moving workpieces; moreover, three-dimensional point-cloud calculation is involved, which is complicated and time-consuming.
5) A binocular stereo vision system uses two cameras fixed at an angle to capture workpiece images from different viewing angles and computes the three-dimensional coordinates of the workpiece. It requires binocular calibration, depth-information extraction and other processing; the computation is relatively complex and time-consuming, the cost is higher, and the robustness is poor.
Therefore, for industrial robots that automatically grab three-dimensional workpieces moving on line, there is a need to develop new methods that quickly identify workpiece pose and motion trajectory on line, so as to overcome the above shortcomings and deficiencies.
Summary of the invention
The main purpose of the present invention is to overcome the shortcomings and deficiencies of the prior art and to provide a robot pose and motion-trajectory positioning system for grabbing online workpieces. The system is based on a single-camera vision system and is suitable for using an industrial robot to intelligently and rapidly grab small and medium-sized workpieces of known shape but uncertain pose on the production lines most common in manufacturing workshops (including mixed conveying of workpieces of different shapes, specifications and materials).
Another object of the present invention is to provide a positioning method based on the above robot pose and motion-trajectory positioning system for grabbing online workpieces. The method is based on a single-camera vision system, yet it can obtain three-dimensional position information of the workpiece, and its calculation is simple and fast.
The object of the present invention is achieved by the following technical solution: a robot pose and motion-trajectory positioning system for grabbing online workpieces comprises a camera for capturing workpiece images in real time, a robot system for grabbing the workpiece, an image processor for obtaining, from the images captured by the camera, the coordinates at which the robot system grabs the workpiece, and a worktable. The camera is fixed on a camera support, and the camera support and the robot system are arranged on the two sides of the worktable respectively; the image processor is connected to the camera and to the robot controller in the robot system; the worktable is provided with a workpiece conveyor belt, on which the workpiece moves; the end of the robot system is provided with an end effector for clamping and rotating the workpiece; the optical axis of the camera is perpendicular to the direction of workpiece motion, and the camera is used to obtain an axonometric image of the workpiece. The present invention uses a single camera to obtain information about a workpiece of uncertain pose moving on the conveyor, calculates its three-dimensional coordinates, and guides the robot to adopt the correct end-effector pose and motion trajectory to grab the workpiece accurately and intelligently.
Preferably, photoelectric sensors for detecting whether a workpiece has entered the camera field of view are provided at the two ends of the workpiece conveyor belt, and the photoelectric sensors are connected to the external trigger input of the camera, so that the camera can be triggered to take a picture by the sensor signal.
A positioning method based on the above robot pose and motion-trajectory positioning system for grabbing online workpieces comprises the steps of:
S1, adjusting the installation position and installation angle of the camera so that, after the workpiece enters the camera field of view, the camera can obtain an axonometric image of the workpiece;
S2, placing a calibration board on the workpiece conveyor belt at the centre of the camera field of view, and performing geometric calibration of the camera, the workpiece conveyor belt plane and the end effector;
S3, according to the spatial installation position of the camera and the known shape and size of the workpiece, establishing a mathematical conversion model between the workpiece feature image coordinates in the pixel coordinate system uo″v and the workpiece three-dimensional pose coordinates in the absolute coordinate system XYZ;
S4, according to the spatial position of the robot in the absolute coordinate system, establishing a mathematical conversion model between the three-dimensional pose coordinates of the workpiece in the absolute coordinate system XYZ and its coordinates in the robot coordinate system X1Y1Z1;
S5, the workpiece conveyor belt carrying the workpiece in translational motion at speed V; when a workpiece is detected entering the camera field of view, the camera obtaining an axonometric image of the workpiece, and the image feature points of the workpiece being extracted from this image;
S6, according to the coordinates of the image feature points of the workpiece, and combining the calibration values of step S2 with the mathematical models established in steps S3 and S4, obtaining the three-dimensional pose and spatial coordinates of the workpiece in the robot coordinate system X1Y1Z1 at the moment of grabbing; these data being transmitted to the robot controller, which then controls the end effector to grab the online workpiece.
Further, the positioning method also comprises the step of:
S7, the robot controller collecting the current end-effector position in real time, calculating the deviation between this actual position and the target position, and feeding back a motion control signal according to this deviation to control the end-effector motion, forming a closed-loop control circuit, until the end effector grabs the workpiece accurately.
Specifically, in step S3, the method of establishing the mathematical conversion model between the workpiece feature image coordinates in the pixel coordinate system uo″v and the workpiece three-dimensional pose coordinates in the absolute coordinate system XYZ is as follows:
Establish the depth positioning model of a planar point in the vertical direction of the absolute coordinate system, the depth positioning model of a spatial point in the vertical direction of the absolute coordinate system, the X-axis positioning model of a planar point in the horizontal direction of the absolute coordinate system, and the X-axis positioning model of a spatial point in the horizontal direction of the absolute coordinate system; then, combining the prior knowledge of the workpiece, the three-dimensional coordinates of all points imaged in the camera are obtained. The formulas of the above models are as follows:
(3-1) Establish the coordinate systems: the absolute coordinate system XYZ takes the vertical projection M of the optical centre of the camera lens onto the worktable plane as its origin; the vertical plane through the camera lens optical axis serves as reference, the direction perpendicular to this plane is the X axis, and the direction perpendicular to the X axis is the Y direction; coordinates are expressed as (X, Y), in units of length. The image physical coordinate system xo'y has its origin at the centre of the CCD imaging plane; coordinates are expressed as (x, y). The pixel coordinate system uo″v has its origin at the upper-left corner of the image plane; coordinates are expressed as (u, v), in units of pixels, and the coordinates of o' in the coordinate system uo″v are (u0, v0). The planar points mentioned below are points in the XMY plane of the absolute coordinate system XYZ.
(3-2) Establish the depth positioning model of a planar point in the vertical direction of the absolute coordinate system: for a planar point P(0, Y, 0) in the absolute coordinate system, the distance from this point to the origin M is
$$MP = Y = h\tan\!\left(\alpha + \arctan\frac{v_0 - v}{f_y}\right)$$
where α and h are extrinsic camera parameters obtained by calibration, fy = f/dy, dy is the pixel pitch, and f is the intrinsic camera parameter obtained by calibration; v is the ordinate of the planar point P in the pixel coordinate system uo″v.
(3-3) Establish the depth positioning model of a spatial point in the vertical direction of the absolute coordinate system: suppose a spatial point P1 and the planar point P both correspond to the same point P' on the imaging plane, the actual physical height of the spatial point P1 above the XMY plane of the absolute coordinate system is P1T = h1, which is known, and the actual physical height of the optical centre O of the industrial camera above the XMY plane of the absolute coordinate system is OM = h (this height being one of the extrinsic camera parameters obtained by calibration above); then the distance between the projection point T of P1 onto the XMY plane and the origin M is
$$MT = h\tan\!\left(\alpha + \arctan\frac{v_0 - v}{f_y}\right)\times\left(1 - \frac{h_1}{h}\right);$$
where v is the ordinate of the spatial point P1 in the pixel coordinate system uo″v.
(3-4) Establish the X-axis positioning model of a planar point in the horizontal direction of the absolute coordinate system: for a planar point Q, its X-axis coordinate in the XMY plane is
$$PQ = \frac{(u - u_0)\times h}{\sqrt{f_y^2 + (v - v_0)^2}\;\sin\!\left[\alpha + \arctan\frac{v_0 - v}{f_y}\right]}$$
where u and v are the coordinates of the planar point Q in the pixel coordinate system uo″v.
(3-5) Establish the X-axis positioning model of a spatial point in the horizontal direction of the absolute coordinate system: suppose a spatial point Q1 and the planar point Q both correspond to the same point Q' on the imaging plane, the actual physical height of the spatial point Q1 above the XMY plane of the absolute coordinate system is Q1N = h1, which is known, and the actual physical height of the optical centre O of the camera lens above the XMY plane of the absolute coordinate system is OM = h; then the coordinates of the vertical projection point N(Xe, Ye) of the spatial point Q1 onto the XMY plane are calculated as follows:
$$X_e = h\tan\!\left(\alpha + \arctan\frac{u_0 - u}{f_x}\right)\times\left(1 - \frac{h_1}{h}\right);$$
$$Y_e = h\tan\!\left(\alpha + \arctan\frac{v_0 - v}{f_y}\right)\times\left(1 - \frac{h_1}{h}\right);$$
where u and v are the coordinates of the spatial point Q1 in the pixel coordinate system uo″v, fx = f/dx, dx is the pixel pitch, and f is the intrinsic camera parameter obtained by calibration.
Specifically, in step S4, the method of establishing the mathematical conversion model between the three-dimensional pose coordinates of the workpiece in the absolute coordinate system XYZ and its coordinates in the robot coordinate system X1Y1Z1 is as follows:
The Z directions of the absolute coordinate system XYZ and the robot coordinate system X1Y1Z1 are consistent, so the spatial transformation between the two coordinate systems is realised by a translation and a rotation about the Z axis. Denoting the absolute coordinate system XYZ as coordinate system A and the robot coordinate system X1Y1Z1 as coordinate system B, the conversion model is:
B = Trans(ΔX, ΔY, ΔZ) Rot(Z, θ) A
where Trans(ΔX, ΔY, ΔZ) is the homogeneous translation transformation, whose mathematical model is:
$$\mathrm{Trans}(\Delta X, \Delta Y, \Delta Z) = \begin{bmatrix} 1 & 0 & 0 & \Delta X \\ 0 & 1 & 0 & \Delta Y \\ 0 & 0 & 1 & \Delta Z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
in which the elements ΔX, ΔY, ΔZ represent the amounts of movement along the respective coordinate axes X, Y, Z;
and Rot(Z, θ) is the rotation operator for a rotation about the Z axis, whose mathematical model is:
$$\mathrm{Rot}(Z, \theta) = \begin{bmatrix} \cos\theta & -\sin\theta & 0 & 0 \\ \sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
in which the element θ is the angle of rotation about the Z axis; the actual values of the above parameters ΔX, ΔY, ΔZ, θ are obtained by calibration.
Compared with the prior art, the present invention has the following advantages and beneficial effects:
(1) The robot operates intelligently and automatically, with high grabbing precision. The present invention uses a single camera to obtain information about a workpiece of uncertain pose moving on the conveyor, calculates its three-dimensional coordinates, and guides the robot to adopt the correct end-effector pose and motion trajectory to grab the workpiece accurately and intelligently.
(2) Mixed targets of several kinds can be identified, meeting the demand of multi-variety flexible production. The present invention uses monocular vision to obtain an axonometric image of the workpiece and extracts its features according to the different known object shapes, so it can be adapted to workpiece identification and robot-guided grabbing in situations where workpieces of different shapes, specifications and materials are conveyed in a mixed stream.
(3) Wide adaptability and broad application conditions. The present invention imposes no specific restriction on the material, shape or size of the target workpiece to be identified (although small and medium-sized objects are preferred), and it also tolerates considerable variation in conveyor-belt speed and in the position deviation of the workpiece on the belt.
(4) Simple structure and low cost. The present invention uses a single general-purpose camera together with an ordinary PC and a regular industrial robot to form the intelligent identification and grabbing system, without any additional devices or software platforms.
(5) Simple operation, high degree of automation, no pollution. The hardware and software of the present invention are quite simple, produce no pollution, and are easy for users to master; the system can be used to guide the robot in fully automatic continuous operation, can automatically generate an operation database and save operation results, and can raise an alarm automatically when a fault is encountered.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware architecture of the system of the present invention.
Fig. 2 is a workflow diagram of the industrial robot grabbing system of the present invention.
Fig. 3-1 is a schematic diagram of the depth positioning model of a planar point in the vertical direction according to the present invention.
Fig. 3-2 is a schematic diagram of the depth positioning model of a spatial point in the vertical direction according to the present invention.
Fig. 3-3 is a schematic diagram of the X-axis positioning model of a planar point in the horizontal direction according to the present invention.
Fig. 3-4 is a schematic diagram of the X-axis positioning model of a spatial point in the horizontal direction according to the present invention.
Fig. 4(a) is a schematic diagram of a workpiece under test that is a cube in this embodiment.
Fig. 4(b) is a schematic diagram of a workpiece under test that is another regular body in this embodiment.
Detailed description of the invention
The present invention is described in further detail below in conjunction with an embodiment and the accompanying drawings, but the embodiments of the present invention are not limited thereto.
Embodiment 1
Referring to Fig. 1, the robot pose and motion-trajectory positioning system for grabbing an online workpiece of this embodiment comprises a camera support 1, an industrial camera 2, a workpiece under test 3, a workpiece conveyor belt 4, an end effector 5, a robot system 6, a worktable 7, an image processor 8, a worktable controller 9, a displacement sensor 10 and a robot controller 11. The industrial camera 2 is mounted on the camera support 1 with its optical axis tilted downward at a suitable angle to the vertical, so that it can capture an axonometric image of the workpiece on the conveyor belt and the initial position of the robot end effector. The camera and the industrial robot are mounted on the two sides of the motion platform respectively, which ensures that the robot arm does not block the camera field of view. A photoelectric sensor is mounted beside the conveyor belt to detect whether a workpiece has entered the camera field of view. The image processor is connected to the camera and to the robot controller in the robot system, and obtains from the images captured by the camera the coordinates at which the robot system grabs the workpiece. The end effector is used to clamp, rotate and place the workpiece.
Referring to Fig. 2, the working method of the above system is summarised as follows:
(1) Vision monitoring: during monitoring, the system detects in real time whether an image is currently to be acquired; if an image is acquired, the object is first classified into one of two classes, cubes (e.g. Fig. 4(a)) and other regular bodies (e.g. Fig. 4(b)); the acquired image is then pre-processed and its feature points, lines and surfaces are extracted;
(2) Information conversion and transmission: the extracted feature points, lines and surfaces are converted, using the functional relations, into target coordinates and angles; the data are then formatted according to the communication protocol and finally sent to the industrial robot;
(3) Industrial robot control: after the robot receives the data, it first parses them, then adjusts its attitude and orientation according to the target coordinates and angles; the operating process can also be fed back through a position-compensation algorithm, so that accurate positioning and grabbing are finally achieved. A schematic code sketch of this flow is given below.
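The three stages above can be read as one acquisition-conversion-control loop. The skeleton below is only an illustrative sketch of that flow, not part of the disclosure; every callable it receives (image acquisition, feature extraction, pixel-to-pose conversion, robot link) is a hypothetical placeholder, since the patent does not fix any software interface.

```python
def grab_cycle(acquire_image, extract_features, pixels_to_robot_pose, send_to_robot):
    """One pass of the workflow: vision monitoring -> information conversion -> robot control."""
    image = acquire_image()                # returns None if no workpiece has triggered the camera
    if image is None:
        return False                       # nothing in the field of view yet
    features = extract_features(image)     # corner points of the cube / regular body
    pose = pixels_to_robot_pose(features)  # models of steps S3 and S4 described below
    send_to_robot(pose)                    # formatted according to the communication protocol
    return True
```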
The positioning method of the system is described in detail below.
S1, adjust the installation position and installation angle of the camera so that, after the workpiece enters the camera field of view, the camera can obtain an axonometric image of the workpiece.
S2, place a calibration board on the workpiece conveyor belt at the centre of the camera field of view, and perform geometric calibration of the camera, the workpiece conveyor belt plane and the end effector; the calibration yields the intrinsic camera parameter f and the extrinsic parameters α, h, h1.
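The patent does not prescribe a particular calibration toolchain. As one possible realisation (an assumption, not part of the disclosure), the intrinsic values f/dx, f/dy and (u0, v0), together with the extrinsics of a board lying flat on the conveyor plane (from which the tilt angle α and height h follow), can be estimated with OpenCV's chessboard routines; the board geometry below is assumed.

```python
# Hedged sketch only: chessboard calibration with OpenCV (board size and square size are assumed).
import cv2
import numpy as np

def calibrate_from_board_images(images, pattern=(9, 6), square_mm=25.0):
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        size = gray.shape[::-1]                       # (width, height)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    # mtx contains fx = f/dx, fy = f/dy and the principal point (u0, v0);
    # rvecs/tvecs of a board lying on the conveyor plane give the tilt angle alpha and height h.
    rms, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return mtx, dist, rvecs, tvecs
```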
" the conversion Mathematical Modeling in v in workpiece features image coordinate and absolute coordinate system XYZ between workpiece three-dimensional pose coordinate of S3, according to the installing space position of camera and known workpiece shapes size, setting up frame coordinate system uo.
This Mathematical Modeling specifically comprises four models, is the spatial point X axis location model in the horizontal direction in planar point in the planar point depth localization model in vertical direction in absolute coordinate system, the spatial point depth localization model in vertical direction in absolute coordinate system, absolute coordinate system X axis location model in the horizontal direction and absolute coordinate system respectively.According to above-mentioned model, then in conjunction with the priori of workpiece, thus the three-dimensional coordinate of the point of imaging in all cameras is obtained.
The reasoning process of above-mentioned each model is as follows:
(3-1) set up coordinate system: absolute coordinate system XYZ with the optical axis center of camera lens at the upright projection M of table plane for initial point, to cross the vertical plane of camera lens optical axis for reference, perpendicular to the direction X-axis of this plane, direction perpendicular to X-axis is Y-direction, coordinate is expressed as (X, Y), in units of length; Photo coordinate system xo'y initial point is at the center of CCD imaging plane, and coordinate is expressed as (x, y); " v initial point is in the upper left corner of the plane of delineation, and coordinate is expressed as (u, v), and in units of pixel, o' is at coordinate system uo, and " coordinate under v is (u for frame coordinate system uo 0, v 0); Described planar point is the point in absolute coordinate system XYZ in XMY plane below.
(3-2) the planar point depth localization model in vertical direction in absolute coordinate system is set up.
See Fig. 3-1, if the planar point P in absolute coordinate system (0, Y, 0), then:
β = α - γ ; γ = arctan y f ; Y = h tan β = h tan ( α - arctan y f ) - - - ( 1.1 )
Because what obtain from image is not y value, but the number v of pixel, required reduction formula is: y=(v-v 0) × d y(wherein d ythe spacing of pixel), f y=f/d y.Then as shown in figure 3-1, the distance of the planar point P in absolute coordinate system to initial point M is:
MP = Y = h tan ( α + arctan v 0 - v f y ) - - - ( 1.2 )
Wherein, f y=f/d y, d ybe the spacing of pixel, f is the intrinsic parameter by demarcating the camera obtained; V is that planar point P is at the frame coordinate system uo " ordinate in v.
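As an illustrative numeric sketch (not part of the patent text; the function name is hypothetical), formula (1.2) can be evaluated directly once α, h and fy are known from calibration:

```python
import math

def planar_point_depth(v, v0, fy, alpha, h):
    """Formula (1.2): distance MP = Y from the origin M to a point in the XMY plane,
    given its image row v, the principal-point row v0, fy = f/dy, the camera tilt
    angle alpha (in radians) and the camera height h above the conveyor plane."""
    return h * math.tan(alpha + math.atan((v0 - v) / fy))
```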
(3-3) Establish the depth positioning model of a spatial point in the vertical direction of the absolute coordinate system.
Referring to Fig. 3-2, a point P' on the two-dimensional image of a monocular vision system generally corresponds to many points in three-dimensional space. Suppose a spatial point P1 and the planar point P both correspond to the same point P' on the imaging plane, the actual physical height of the spatial point P1 above the XMY plane of the absolute coordinate system is P1T = h1, which is known, and the actual physical height of the optical centre O of the industrial camera above the XMY plane of the absolute coordinate system is OM = h. Then:
$$\tan\beta = \frac{P_1T}{TP} = \frac{OM}{MP} \qquad (1.3)$$
$$TP = \frac{P_1T}{OM}\times MP = \frac{h_1}{h}\times Y \qquad (1.4)$$
Therefore the distance between the projection point T of P1 onto the XMY plane and the origin M is:
$$MT = MP - TP = Y - \frac{h_1}{h}\times Y = h\tan\!\left(\alpha + \arctan\frac{v_0 - v}{f_y}\right)\times\left(1 - \frac{h_1}{h}\right) \qquad (1.5)$$
where v is the ordinate of the spatial point P1 in the pixel coordinate system uo″v.
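A corresponding sketch of formula (1.5) (the name is illustrative): the same ray is intersected with a horizontal plane at the known feature height h1, which scales the ground distance by the factor (1 − h1/h).

```python
import math

def spatial_point_depth(v, v0, fy, alpha, h, h1):
    """Formula (1.5): distance MT from the origin M to the vertical projection T of a
    point lying h1 above the conveyor plane and imaging at row v."""
    return h * math.tan(alpha + math.atan((v0 - v) / fy)) * (1.0 - h1 / h)
```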
(3-4) Establish the X-axis positioning model of a planar point in the horizontal direction of the absolute coordinate system.
Referring to Fig. 3-3, for a planar point Q(X, Y):
$$\beta = \alpha - \gamma; \qquad \gamma = \arctan\frac{y}{f} \qquad (1.6)$$
$$OP = \frac{OM}{\sin\beta} = \frac{h}{\sin(\alpha - \gamma)} = \frac{h}{\sin\!\left(\alpha - \arctan\frac{y}{f}\right)} \qquad (1.7)$$
Converting to pixel coordinates:
$$OP = \frac{h}{\sin\!\left[\alpha - \arctan\frac{(v - v_0)d_y}{f}\right]} = \frac{h}{\sin\!\left[\alpha + \arctan\frac{v_0 - v}{f_y}\right]} \qquad (1.8)$$
$$P'O = \sqrt{P'O'^2 + O'O^2} = \sqrt{f^2 + y^2} = \sqrt{f^2 + (v - v_0)^2 d_y^2} \qquad (1.9)$$
By similar triangles:
$$\frac{P'Q'}{P'O} = \frac{PQ}{PO} \qquad (1.10)$$
Then
$$PQ = \frac{P'Q'}{P'O}\times PO \qquad (1.11)$$
What is obtained from the image is not the value of x but the pixel index u; the required conversion is x = (u − u0) × dx (where dx is the pixel pitch). dx and dy are intrinsic camera parameters representing the distance between two adjacent photosites of the CCD in the x direction and in the y direction respectively; since the CCD used has uniformly spaced (square) pixels, dx = dy.
Therefore, from formula (1.11), the X-axis coordinate of the planar point Q in the XMY plane is obtained.
The transverse (X-axis) coordinate is:
$$PQ = \frac{P'Q'}{P'O}\times PO = \frac{(u - u_0)\times h}{\sqrt{f_y^2 + (v - v_0)^2}\;\sin\!\left[\alpha + \arctan\frac{v_0 - v}{f_y}\right]} \qquad (1.12)$$
where u and v are the coordinates of the planar point Q in the pixel coordinate system uo″v.
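Formula (1.12) can likewise be evaluated directly; the sketch below (illustrative name, not part of the disclosure) assumes square pixels (dx = dy) as stated above:

```python
import math

def planar_point_x(u, v, u0, v0, fy, alpha, h):
    """Formula (1.12): X-axis coordinate PQ of a planar point from its pixel (u, v)."""
    beta = alpha + math.atan((v0 - v) / fy)
    return (u - u0) * h / (math.sqrt(fy ** 2 + (v - v0) ** 2) * math.sin(beta))
```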
(3-5) Establish the X-axis positioning model of a spatial point in the horizontal direction of the absolute coordinate system.
Since a point Q' on the two-dimensional image of a monocular vision system generally corresponds to many points in three-dimensional space, as shown in Fig. 3-4, suppose a spatial point Q1 and the planar point Q both correspond to the same point Q' on the imaging plane, the actual physical height of the spatial point Q1 above the XMY plane of the absolute coordinate system is Q1N = h1, which is known, and the actual physical height of the optical centre O of the camera lens above the XMY plane of the absolute coordinate system is OM = h.
In the OMQ plane, by similar triangles:
$$\frac{Q_1N}{NQ} = \frac{OM}{MQ} \qquad (1.13)$$
$$QM = \sqrt{PM^2 + PQ^2} = \sqrt{X^2 + Y^2} \qquad (1.14)$$
$$NQ = \frac{Q_1N\times QM}{OM} = \frac{h_1}{h}\times\sqrt{X^2 + Y^2} \qquad (1.15)$$
$$NM = MQ - NQ = \sqrt{X^2 + Y^2}\times\left(1 - \frac{h_1}{h}\right) \qquad (1.16)$$
Then, for the vertical projection point N(Xe, Ye) of the spatial point Q1 onto the XMY plane, again by similar triangles:
$$\frac{EN}{NM} = \frac{PQ}{MQ} \qquad (1.17)$$
$$X_e = EN = \frac{PQ\times NM}{MQ} = X\times\left(1 - \frac{h_1}{h}\right) = h\tan\!\left(\alpha + \arctan\frac{u_0 - u}{f_x}\right)\times\left(1 - \frac{h_1}{h}\right) \qquad (1.18)$$
Similarly:
$$Y_e = \frac{PM\times NM}{MQ} = Y\times\left(1 - \frac{h_1}{h}\right) = h\tan\!\left(\alpha + \arctan\frac{v_0 - v}{f_y}\right)\times\left(1 - \frac{h_1}{h}\right) \qquad (1.19)$$
where u and v are the coordinates of the spatial point Q1 in the pixel coordinate system uo″v, fx = f/dx, dx is the pixel pitch, and f is the intrinsic camera parameter obtained by calibration.
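Putting formulas (1.18) and (1.19) together gives the ground projection N(Xe, Ye) of a feature point of known height h1. The helper below is an illustrative sketch (hypothetical name), with fx = f/dx and fy = f/dy taken from the calibration of step S2:

```python
import math

def spatial_point_ground_projection(u, v, u0, v0, fx, fy, alpha, h, h1):
    """Formulas (1.18)-(1.19): coordinates (Xe, Ye) of the vertical projection N of a
    spatial point of known height h1 onto the XMY plane of the absolute coordinate system."""
    scale = 1.0 - h1 / h
    xe = h * math.tan(alpha + math.atan((u0 - u) / fx)) * scale
    ye = h * math.tan(alpha + math.atan((v0 - v) / fy)) * scale
    return xe, ye
```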
S4, according to the spatial position of the robot in the absolute coordinate system, establish the mathematical conversion model between the three-dimensional pose coordinates of the workpiece in the absolute coordinate system XYZ and its coordinates in the robot coordinate system X1Y1Z1.
The specific method is as follows:
The Z directions of the absolute coordinate system XYZ and the robot coordinate system X1Y1Z1 are consistent, so the spatial transformation between the two coordinate systems is realised by a translation and a rotation about the Z axis. Denoting the absolute coordinate system XYZ as coordinate system A and the robot coordinate system X1Y1Z1 as coordinate system B, the conversion model is:
B = Trans(ΔX, ΔY, ΔZ) Rot(Z, θ) A
where Trans(ΔX, ΔY, ΔZ) is the homogeneous translation transformation, whose mathematical model is:
$$\mathrm{Trans}(\Delta X, \Delta Y, \Delta Z) = \begin{bmatrix} 1 & 0 & 0 & \Delta X \\ 0 & 1 & 0 & \Delta Y \\ 0 & 0 & 1 & \Delta Z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
in which the elements ΔX, ΔY, ΔZ represent the amounts of movement along the respective coordinate axes X, Y, Z;
and Rot(Z, θ) is the rotation operator for a rotation about the Z axis, whose mathematical model is:
$$\mathrm{Rot}(Z, \theta) = \begin{bmatrix} \cos\theta & -\sin\theta & 0 & 0 \\ \sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
in which the element θ is the angle of rotation about the Z axis; the actual values of the above parameters ΔX, ΔY, ΔZ, θ are obtained by calibration.
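The transformation B = Trans(ΔX, ΔY, ΔZ)·Rot(Z, θ)·A maps homogeneous workpiece coordinates from the absolute coordinate system into the robot coordinate system. A minimal numeric sketch of that product is given below (the function name is illustrative; the parameter values would come from the calibration of step S4):

```python
import numpy as np

def absolute_to_robot(point_xyz, dx, dy, dz, theta):
    """Apply B = Trans(dx, dy, dz) @ Rot(Z, theta) @ A to a 3-D point A (homogeneous form)."""
    trans = np.array([[1, 0, 0, dx],
                      [0, 1, 0, dy],
                      [0, 0, 1, dz],
                      [0, 0, 0, 1]], dtype=float)
    c, s = np.cos(theta), np.sin(theta)
    rot_z = np.array([[c, -s, 0, 0],
                      [s,  c, 0, 0],
                      [0,  0, 1, 0],
                      [0,  0, 0, 1]], dtype=float)
    a = np.append(np.asarray(point_xyz, dtype=float), 1.0)  # homogeneous coordinates
    return (trans @ rot_z @ a)[:3]
```

For example, absolute_to_robot((100.0, 50.0, 0.0), 300.0, -120.0, 0.0, np.pi / 2) returns the same physical point expressed in X1Y1Z1 (the numbers here are purely illustrative).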
S5, the workpiece conveyor belt carries the workpiece in translational motion at speed V; when a workpiece is detected entering the camera field of view, the camera obtains an axonometric image of the workpiece, and the image feature points of the workpiece are extracted from this image.
For example, the points A, B, C, D, A', B', C', D' in Fig. 4(a) and Fig. 4(b).
S6, substitute the image coordinates (ui, vi) of the target points, together with the intrinsic and extrinsic parameters, into formulas (1.12) and (1.2) to obtain the coordinate positions, in the world coordinate system, of the planar points corresponding to the pixel coordinates. For example, the three-dimensional coordinates of the cube feature points C, D, C', D' relative to the coordinate system established at the vertical projection of the camera can be obtained in this way.
Substitute the image coordinates (ui, vi) of the target points, together with the intrinsic and extrinsic parameters, into formulas (1.18) and (1.19) to obtain the coordinate positions, relative to the camera, of object points at a certain height. For example, the three-dimensional coordinates of the feature points A, B, A', B' in space can be obtained in this way.
According to the three-dimensional coordinates of each feature point in the absolute coordinate system and the mathematical model established in step S4, the three-dimensional pose and spatial coordinates of the workpiece in the robot coordinate system X1Y1Z1 at the moment of grabbing are obtained; these data are transmitted to the robot controller, which then controls the end effector to grab the online workpiece.
S7, to further improve the grabbing accuracy, the robot controller also collects the current end-effector position in real time, calculates the deviation between this actual position and the target position, and feeds back a motion control signal according to this deviation to control the end-effector motion, forming a closed-loop control circuit, until the end effector grabs the workpiece accurately.
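The patent does not specify the feedback law used in step S7. As a loudly-labelled assumption, a simple proportional correction on the position error illustrates one way the closed loop could be realised (kp and tol_mm are assumed values, not part of the disclosure):

```python
import numpy as np

def correction_step(actual_xyz, target_xyz, kp=0.5, tol_mm=0.5):
    """One iteration of the closed-loop correction: returns the motion-control increment
    and whether the end effector is within tolerance of the target position."""
    error = np.asarray(target_xyz, dtype=float) - np.asarray(actual_xyz, dtype=float)
    return kp * error, float(np.linalg.norm(error)) <= tol_mm
```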
The above embodiment is a preferred implementation of the present invention, but the implementation of the present invention is not limited to the above embodiment; any change, modification, substitution, combination or simplification made without departing from the spirit and principle of the present invention shall be an equivalent replacement and is included within the protection scope of the present invention.

Claims (6)

1. A robot pose and motion-trajectory positioning system for grabbing an online workpiece, characterised in that it comprises a camera for capturing workpiece images in real time, a robot system for grabbing the workpiece, an image processor for obtaining, from the images captured by the camera, the coordinates at which the robot system grabs the workpiece, and a worktable; the camera is fixed on a camera support, and the camera support and the robot system are arranged on the two sides of the worktable respectively; the image processor is connected to the camera and to the robot controller in the robot system; the worktable is provided with a workpiece conveyor belt, on which the workpiece moves; the end of the robot system is provided with an end effector for clamping and rotating the workpiece; the optical axis of the camera is perpendicular to the direction of workpiece motion, and the camera is used to obtain an axonometric image of the workpiece.
2. The robot pose and motion-trajectory positioning system for grabbing an online workpiece according to claim 1, characterised in that photoelectric sensors for detecting whether a workpiece has entered the camera field of view are provided at the two ends of the workpiece conveyor belt, and the photoelectric sensors are connected to the external trigger input of the camera.
3. A positioning method based on the robot pose and motion-trajectory positioning system for grabbing an online workpiece according to any one of claims 1-2, characterised in that it comprises the steps of:
S1, adjusting the installation position and installation angle of the camera so that, after the workpiece enters the camera field of view, the camera can obtain an axonometric image of the workpiece;
S2, placing a calibration board on the workpiece conveyor belt at the centre of the camera field of view, and performing geometric calibration of the camera, the workpiece conveyor belt plane and the end effector;
S3, according to the spatial installation position of the camera and the known shape and size of the workpiece, establishing a mathematical conversion model between the workpiece feature image coordinates in the pixel coordinate system uo″v and the workpiece three-dimensional pose coordinates in the absolute coordinate system XYZ;
S4, according to the spatial position of the robot in the absolute coordinate system, establishing a mathematical conversion model between the three-dimensional pose coordinates of the workpiece in the absolute coordinate system XYZ and its coordinates in the robot coordinate system X1Y1Z1;
S5, the workpiece conveyor belt carrying the workpiece in translational motion at speed V; when a workpiece is detected entering the camera field of view, the camera obtaining an axonometric image of the workpiece, and the image feature points of the workpiece being extracted from this image;
S6, according to the coordinates of the image feature points of the workpiece, and combining the calibration values of step S2 with the mathematical models established in steps S3 and S4, obtaining the three-dimensional pose and spatial coordinates of the workpiece in the robot coordinate system X1Y1Z1 at the moment of grabbing; these data being transmitted to the robot controller, which then controls the end effector to grab the online workpiece.
4. The positioning method according to claim 3, characterised in that it further comprises the step of:
S7, the robot controller collecting the current end-effector position in real time, calculating the deviation between this actual position and the target position, and feeding back a motion control signal according to this deviation to control the end-effector motion, forming a closed-loop control circuit, until the end effector grabs the workpiece accurately.
5. The positioning method according to claim 3, characterised in that, in step S3, the method of establishing the mathematical conversion model between the workpiece feature image coordinates in the pixel coordinate system uo″v and the workpiece three-dimensional pose coordinates in the absolute coordinate system XYZ is:
establishing the depth positioning model of a planar point in the vertical direction of the absolute coordinate system, the depth positioning model of a spatial point in the vertical direction of the absolute coordinate system, the X-axis positioning model of a planar point in the horizontal direction of the absolute coordinate system, and the X-axis positioning model of a spatial point in the horizontal direction of the absolute coordinate system; then, combining the prior knowledge of the workpiece, the three-dimensional coordinates of all points imaged in the camera are obtained; the formulas of the above models are as follows:
(3-1) establish the coordinate systems: the absolute coordinate system XYZ takes the vertical projection M of the optical centre of the camera lens onto the worktable plane as its origin; the vertical plane through the camera lens optical axis serves as reference, the direction perpendicular to this plane is the X axis, and the direction perpendicular to the X axis is the Y direction; coordinates are expressed as (X, Y), in units of length; the image physical coordinate system xo'y has its origin at the centre of the CCD imaging plane, with coordinates expressed as (x, y); the pixel coordinate system uo″v has its origin at the upper-left corner of the image plane, with coordinates expressed as (u, v), in units of pixels, and the coordinates of o' in the coordinate system uo″v are (u0, v0); the planar points mentioned are points in the XMY plane of the absolute coordinate system XYZ;
(3-2) establish the depth positioning model of a planar point in the vertical direction of the absolute coordinate system: for a planar point P(0, Y, 0) in the absolute coordinate system, the distance from this point to the origin M is
$$MP = Y = h\tan\!\left(\alpha + \arctan\frac{v_0 - v}{f_y}\right)$$
where α and h are extrinsic camera parameters obtained by calibration, fy = f/dy, dy is the pixel pitch, and f is the intrinsic camera parameter obtained by calibration; v is the ordinate of the planar point P in the pixel coordinate system uo″v;
(3-3) establish the depth positioning model of a spatial point in the vertical direction of the absolute coordinate system: suppose a spatial point P1 and the planar point P both correspond to the same point P' on the imaging plane, the actual physical height of the spatial point P1 above the XMY plane of the absolute coordinate system is P1T = h1, which is known, and the actual physical height of the optical centre O of the industrial camera above the XMY plane of the absolute coordinate system is OM = h; then the distance between the projection point T of P1 onto the XMY plane and the origin M is
$$MT = h\tan\!\left(\alpha + \arctan\frac{v_0 - v}{f_y}\right)\times\left(1 - \frac{h_1}{h}\right);$$
where v is the ordinate of the spatial point P1 in the pixel coordinate system uo″v;
(3-4) establish the X-axis positioning model of a planar point in the horizontal direction of the absolute coordinate system: for a planar point Q, its X-axis coordinate in the XMY plane is
$$PQ = \frac{(u - u_0)\times h}{\sqrt{f_y^2 + (v - v_0)^2}\;\sin\!\left[\alpha + \arctan\frac{v_0 - v}{f_y}\right]}$$
where u and v are the coordinates of the planar point Q in the pixel coordinate system uo″v;
(3-5) establish the X-axis positioning model of a spatial point in the horizontal direction of the absolute coordinate system: suppose a spatial point Q1 and the planar point Q both correspond to the same point Q' on the imaging plane, the actual physical height of the spatial point Q1 above the XMY plane of the absolute coordinate system is Q1N = h1, which is known, and the actual physical height of the optical centre O of the camera lens above the XMY plane of the absolute coordinate system is OM = h; then the coordinates of the vertical projection point N(Xe, Ye) of the spatial point Q1 onto the XMY plane are calculated as follows:
$$X_e = h\tan\!\left(\alpha + \arctan\frac{u_0 - u}{f_x}\right)\times\left(1 - \frac{h_1}{h}\right);$$
$$Y_e = h\tan\!\left(\alpha + \arctan\frac{v_0 - v}{f_y}\right)\times\left(1 - \frac{h_1}{h}\right);$$
where u and v are the coordinates of the spatial point Q1 in the pixel coordinate system uo″v, fx = f/dx, dx is the pixel pitch, and f is the intrinsic camera parameter obtained by calibration.
6. The positioning method according to claim 3, characterised in that, in step S4, the method of establishing the mathematical conversion model between the three-dimensional pose coordinates of the workpiece in the absolute coordinate system XYZ and its coordinates in the robot coordinate system X1Y1Z1 is:
the Z directions of the absolute coordinate system XYZ and the robot coordinate system X1Y1Z1 are consistent, so the spatial transformation between the two coordinate systems is realised by a translation and a rotation about the Z axis; denoting the absolute coordinate system XYZ as coordinate system A and the robot coordinate system X1Y1Z1 as coordinate system B, the conversion model is:
B = Trans(ΔX, ΔY, ΔZ) Rot(Z, θ) A;
where Trans(ΔX, ΔY, ΔZ) is the homogeneous translation transformation, whose mathematical model is:
$$\mathrm{Trans}(\Delta X, \Delta Y, \Delta Z) = \begin{bmatrix} 1 & 0 & 0 & \Delta X \\ 0 & 1 & 0 & \Delta Y \\ 0 & 0 & 1 & \Delta Z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
in which the elements ΔX, ΔY, ΔZ represent the amounts of movement along the respective coordinate axes X, Y, Z;
and Rot(Z, θ) is the rotation operator for a rotation about the Z axis, whose mathematical model is:
$$\mathrm{Rot}(Z, \theta) = \begin{bmatrix} \cos\theta & -\sin\theta & 0 & 0 \\ \sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
in which the element θ is the angle of rotation about the Z axis; the actual values of the above parameters ΔX, ΔY, ΔZ, θ are obtained by calibration.
CN201510136679.1A 2015-03-26 2015-03-26 Posture and moving track positioning system and method of robot grabbing online workpiece Pending CN104786226A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510136679.1A CN104786226A (en) 2015-03-26 2015-03-26 Posture and moving track positioning system and method of robot grabbing online workpiece

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510136679.1A CN104786226A (en) 2015-03-26 2015-03-26 Posture and moving track positioning system and method of robot grabbing online workpiece

Publications (1)

Publication Number Publication Date
CN104786226A true CN104786226A (en) 2015-07-22

Family

ID=53551632

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510136679.1A Pending CN104786226A (en) 2015-03-26 2015-03-26 Posture and moving track positioning system and method of robot grabbing online workpiece

Country Status (1)

Country Link
CN (1) CN104786226A (en)

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105291104A (en) * 2015-12-09 2016-02-03 哈尔滨云控机器人科技有限公司 Robot remote control method based on cloud platform
CN105345813A (en) * 2015-11-13 2016-02-24 张碧陶 High-precision mechanical arm positioning method based on generalized coordinates
CN105773591A (en) * 2016-01-26 2016-07-20 深圳市鼎泰智能装备股份有限公司 Novel grabbing robot device
CN106228563A (en) * 2016-07-29 2016-12-14 杭州鹰睿科技有限公司 Automatic setup system based on 3D vision
CN106272424A (en) * 2016-09-07 2017-01-04 华中科技大学 A kind of industrial robot grasping means based on monocular camera and three-dimensional force sensor
CN106331620A (en) * 2016-08-25 2017-01-11 中国大冢制药有限公司 Medicine bottle positioning analysis method of filling production line
CN106346486A (en) * 2016-11-04 2017-01-25 武汉海默自控股份有限公司 Six-axis cooperated robot multi-loop control system and control method thereof
CN106441094A (en) * 2016-09-10 2017-02-22 上海大学 Adaptive calibration vision online detection device and method
CN106483984A (en) * 2016-12-13 2017-03-08 广州智能装备研究院有限公司 The method and apparatus that a kind of control robot follows conveyer belt
CN106483963A (en) * 2015-08-26 2017-03-08 泰科电子(上海)有限公司 The automatic calibration method of robot system
CN106802656A (en) * 2016-12-01 2017-06-06 台山核电合营有限公司 A kind of tunnel clears up the cambered surface attitude adjusting method of robot
CN106845354A (en) * 2016-12-23 2017-06-13 中国科学院自动化研究所 Partial view base construction method, part positioning grasping means and device
CN106903720A (en) * 2017-04-28 2017-06-30 安徽捷迅光电技术有限公司 A kind of auto-correction method of the coordinate vision of delivery platform and robot
CN106956292A (en) * 2017-04-28 2017-07-18 安徽捷迅光电技术有限公司 A kind of coordinate visual physical bearing calibration based on delivery platform and robot
CN107148639A (en) * 2015-09-15 2017-09-08 深圳市大疆创新科技有限公司 It is determined that method and device, tracks of device and the system of the positional information of tracking target
CN107139003A (en) * 2017-06-27 2017-09-08 巨轮(广州)机器人与智能制造有限公司 Modularization vision system preparation method
CN107150343A (en) * 2017-04-05 2017-09-12 武汉科技大学 A kind of system that object is captured based on NAO robots
CN107192331A (en) * 2017-06-20 2017-09-22 佛山市南海区广工大数控装备协同创新研究院 A kind of workpiece grabbing method based on binocular vision
CN107256567A (en) * 2017-01-22 2017-10-17 梅卡曼德(北京)机器人科技有限公司 A kind of automatic calibration device and scaling method for industrial robot trick camera
CN107336234A (en) * 2017-06-13 2017-11-10 赛赫智能设备(上海)股份有限公司 A kind of reaction type self study industrial robot and method of work
TWI608320B (en) * 2016-12-19 2017-12-11 四零四科技股份有限公司 Three dimensional trace verification apparatus and method thereof
CN107618030A (en) * 2016-07-16 2018-01-23 深圳市得意自动化科技有限公司 The Robotic Dynamic tracking grasping means of view-based access control model and system
CN107633534A (en) * 2017-08-11 2018-01-26 宁夏巨能机器人股份有限公司 A kind of robot hand posture correction system and method based on 3D visual identitys
CN107756423A (en) * 2016-08-17 2018-03-06 发那科株式会社 Robot controller
CN108038861A (en) * 2017-11-30 2018-05-15 深圳市智能机器人研究院 A kind of multi-robot Cooperation method for sorting, system and device
CN108356422A (en) * 2017-01-23 2018-08-03 宝山钢铁股份有限公司 The on-line measurement of band volume continuous laser blanking, waste material are fallen and finished product separation recognition methods
CN108526976A (en) * 2018-06-07 2018-09-14 芜湖隆深机器人有限公司 A kind of cylinder end piece feeding positioning transferring device
CN108655026A (en) * 2018-05-07 2018-10-16 上海交通大学 A kind of quick teaching sorting system of robot and method
CN108789414A (en) * 2018-07-17 2018-11-13 五邑大学 Intelligent machine arm system based on three-dimensional machine vision and its control method
CN108827154A (en) * 2018-07-09 2018-11-16 深圳辰视智能科技有限公司 A kind of robot is without teaching grasping means, device and computer readable storage medium
CN108845572A (en) * 2018-05-29 2018-11-20 盐城工学院 A kind of industrial carrying machine people's localization method
WO2018214147A1 (en) * 2017-05-26 2018-11-29 深圳配天智能技术研究院有限公司 Robot calibration method and system, robot and storage medium
CN108994830A (en) * 2018-07-12 2018-12-14 上海航天设备制造总厂有限公司 System calibrating method for milling robot off-line programing
WO2018228249A1 (en) * 2017-06-16 2018-12-20 东莞晶苑毛织制衣有限公司 Automatic detecting, clothes grabbing and encasing system and use method therefor
CN109311599A (en) * 2016-07-21 2019-02-05 克朗斯股份公司 For handling the device and method of the member cargo of movement, there is the conveying, processing and/or packaging unit of the equipment of the member cargo for handling movement
CN109454501A (en) * 2018-10-19 2019-03-12 江苏智测计量技术有限公司 A kind of lathe on-line monitoring system
CN109579766A (en) * 2018-12-24 2019-04-05 苏州瀚华智造智能技术有限公司 A kind of product shape automatic testing method and system
CN109604466A (en) * 2018-12-21 2019-04-12 岚士智能科技(上海)有限公司 A kind of the stamping parts feeding robot and charging method of view-based access control model identification
WO2019100400A1 (en) * 2017-11-27 2019-05-31 Abb Schweiz Ag Apparatus and method for use with robot
CN110243376A (en) * 2019-06-28 2019-09-17 湖南三一快而居住宅工业有限公司 A kind of indoor orientation method and indoor locating system
CN110298885A (en) * 2019-06-18 2019-10-01 仲恺农业工程学院 A kind of stereoscopic vision recognition methods of Non-smooth surface globoid target and positioning clamping detection device and its application
CN110342252A (en) * 2019-07-01 2019-10-18 芜湖启迪睿视信息技术有限公司 A kind of article automatically grabs method and automatic grabbing device
CN110648361A (en) * 2019-09-06 2020-01-03 深圳市华汉伟业科技有限公司 Real-time pose estimation method and positioning and grabbing system of three-dimensional target object
CN111232664A (en) * 2020-03-18 2020-06-05 上海载科智能科技有限公司 Industrial robot applied soft package unstacking, unloading and stacking device and method for unstacking, unloading and stacking
CN111300481A (en) * 2019-12-11 2020-06-19 苏州大学 Robot grabbing pose correction method based on vision and laser sensor
CN111699078A (en) * 2017-12-08 2020-09-22 库卡德国有限公司 Operation of the robot
US10860040B2 (en) 2015-10-30 2020-12-08 SZ DJI Technology Co., Ltd. Systems and methods for UAV path planning and control
CN112571410A (en) * 2019-09-27 2021-03-30 杭州萤石软件有限公司 Region determination method and device, mobile robot and system
CN113015604A (en) * 2018-12-11 2021-06-22 株式会社富士 Robot control system and robot control method
CN113211450A (en) * 2021-06-17 2021-08-06 宝武集团马钢轨交材料科技有限公司 Train wheel visual identification positioning calibration device and method
CN113510697A (en) * 2021-04-23 2021-10-19 知守科技(杭州)有限公司 Manipulator positioning method, device, system, electronic device and storage medium
CN113639641A (en) * 2021-09-18 2021-11-12 中国计量大学 Workpiece reference surface positioning detection device and method
CN114322752A (en) * 2020-09-30 2022-04-12 合肥欣奕华智能机器有限公司 Method, device and equipment for automatically conveying glass
CN115072357A (en) * 2021-03-15 2022-09-20 中国人民解放军96901部队24分队 Robot reprint automatic positioning method based on binocular vision
CN115629043A (en) * 2022-11-28 2023-01-20 中国科学院过程工程研究所 Zinc concentrate material recovery sampling detection method and detection system
CN117400256A (en) * 2023-11-21 2024-01-16 扬州鹏顺智能制造有限公司 Industrial robot continuous track control method based on visual images

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070076946A1 (en) * 2005-09-30 2007-04-05 Nachi-Fujikoshi Corp. Object search apparatus, robot system equipped with object search apparatus, and object search method
JP2011083882A (en) * 2009-10-19 2011-04-28 Yaskawa Electric Corp Robot system
CN103786153A (en) * 2012-10-31 2014-05-14 发那科株式会社 Object pickup device and method for picking up object
CN104227723A (en) * 2013-06-07 2014-12-24 株式会社安川电机 Workpiece detector, robot system, method for producing to-be-processed material, method for detecting workpiece
CN203317430U (en) * 2013-06-25 2013-12-04 南通职业大学 Industrial robot vision picking control system
CN104217441A (en) * 2013-08-28 2014-12-17 北京嘉恒中自图像技术有限公司 Mechanical arm positioning fetching method based on machine vision
CN103817699A (en) * 2013-09-25 2014-05-28 浙江树人大学 Quick hand-eye coordination method for industrial robot
CN103706568A (en) * 2013-11-26 2014-04-09 中国船舶重工集团公司第七一六研究所 System and method for machine vision-based robot sorting
CN204585232U (en) * 2015-03-26 2015-08-26 华南理工大学 Capture robot pose and the movement locus navigation system of online workpiece

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
贾耀辉 (Jia Yaohui): "Research on Target Positioning Method for Mobile Robots Based on Monocular Vision" (基于单目视觉的移动机器人目标定位方法研究), China Master's Theses Full-text Database, Information Science and Technology *

Cited By (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106483963A (en) * 2015-08-26 2017-03-08 泰科电子(上海)有限公司 The automatic calibration method of robot system
CN106483963B (en) * 2015-08-26 2020-02-11 泰科电子(上海)有限公司 Automatic calibration method of robot system
CN107148639A (en) * 2015-09-15 2017-09-08 深圳市大疆创新科技有限公司 It is determined that method and device, tracks of device and the system of the positional information of tracking target
US10976753B2 (en) 2015-09-15 2021-04-13 SZ DJI Technology Co., Ltd. System and method for supporting smooth target following
US10928838B2 (en) 2015-09-15 2021-02-23 SZ DJI Technology Co., Ltd. Method and device of determining position of target, tracking device and tracking system
CN110276786B (en) * 2015-09-15 2021-08-20 深圳市大疆创新科技有限公司 Method and device for determining position information of tracking target, tracking device and system
CN110276786A (en) * 2015-09-15 2019-09-24 深圳市大疆创新科技有限公司 Determine method and device, tracking device and the system of the location information of tracking target
CN107148639B (en) * 2015-09-15 2019-07-16 深圳市大疆创新科技有限公司 Determine method and device, tracking device and the system of the location information of tracking target
US11635775B2 (en) 2015-09-15 2023-04-25 SZ DJI Technology Co., Ltd. Systems and methods for UAV interactive instructions and control
US10860040B2 (en) 2015-10-30 2020-12-08 SZ DJI Technology Co., Ltd. Systems and methods for UAV path planning and control
CN105345813B (en) * 2015-11-13 2017-03-22 张碧陶 High-precision mechanical arm positioning method based on generalized coordinates
CN105345813A (en) * 2015-11-13 2016-02-24 张碧陶 High-precision mechanical arm positioning method based on generalized coordinates
CN105291104B (en) * 2015-12-09 2017-05-17 哈尔滨云控机器人科技有限公司 Robot remote control method based on cloud platform
CN105291104A (en) * 2015-12-09 2016-02-03 哈尔滨云控机器人科技有限公司 Robot remote control method based on cloud platform
CN105773591A (en) * 2016-01-26 2016-07-20 深圳市鼎泰智能装备股份有限公司 Novel grabbing robot device
CN107618030A (en) * 2016-07-16 2018-01-23 深圳市得意自动化科技有限公司 The Robotic Dynamic tracking grasping means of view-based access control model and system
CN109311599A (en) * 2016-07-21 2019-02-05 克朗斯股份公司 For handling the device and method of the member cargo of movement, there is the conveying, processing and/or packaging unit of the equipment of the member cargo for handling movement
US11034534B2 (en) 2016-07-21 2021-06-15 Krones Aktiengesellschaft Apparatus and method for handling moving piece goods, and a conveying, processing and/or packaging plant with an apparatus for handling moving piece goods
CN109311599B (en) * 2016-07-21 2020-11-10 克朗斯股份公司 Device and method for handling moving piece goods, conveying, handling and/or packaging unit having a device for handling moving piece goods
CN106228563B (en) * 2016-07-29 2019-02-26 杭州鹰睿科技有限公司 Automatic setup system based on 3D vision
CN106228563A (en) * 2016-07-29 2016-12-14 杭州鹰睿科技有限公司 Automatic setup system based on 3D vision
US10507583B2 (en) 2016-08-17 2019-12-17 Fanuc Corporation Robot control device
CN107756423B (en) * 2016-08-17 2020-02-28 发那科株式会社 Robot control device
CN107756423A (en) * 2016-08-17 2018-03-06 发那科株式会社 Robot controller
CN106331620B (en) * 2016-08-25 2019-03-29 中国大冢制药有限公司 Filling production lines medicine bottle method for positioning analyzing
CN106331620A (en) * 2016-08-25 2017-01-11 中国大冢制药有限公司 Medicine bottle positioning analysis method of filling production line
CN106272424A (en) * 2016-09-07 2017-01-04 华中科技大学 A kind of industrial robot grasping means based on monocular camera and three-dimensional force sensor
CN106441094A (en) * 2016-09-10 2017-02-22 上海大学 Adaptive calibration vision online detection device and method
CN106346486B (en) * 2016-11-04 2018-07-27 武汉海默机器人有限公司 Six axis of one kind cooperation robot multiloop control system and its control method
CN106346486A (en) * 2016-11-04 2017-01-25 武汉海默自控股份有限公司 Six-axis cooperated robot multi-loop control system and control method thereof
CN106802656B (en) * 2016-12-01 2020-01-21 台山核电合营有限公司 Cambered surface posture adjusting method of tunnel cleaning robot
CN106802656A (en) * 2016-12-01 2017-06-06 台山核电合营有限公司 A kind of tunnel clears up the cambered surface attitude adjusting method of robot
CN106483984B (en) * 2016-12-13 2019-06-25 广州智能装备研究院有限公司 A kind of method and apparatus that control robot follows conveyer belt to move
CN106483984A (en) * 2016-12-13 2017-03-08 广州智能装备研究院有限公司 The method and apparatus that a kind of control robot follows conveyer belt
TWI608320B (en) * 2016-12-19 2017-12-11 四零四科技股份有限公司 Three dimensional trace verification apparatus and method thereof
CN106845354A (en) * 2016-12-23 2017-06-13 中国科学院自动化研究所 Part view library construction method, part positioning and grasping method and device
CN106845354B (en) * 2016-12-23 2020-01-03 中国科学院自动化研究所 Part view library construction method, part positioning and grabbing method and device
CN107256567A (en) * 2017-01-22 2017-10-17 梅卡曼德(北京)机器人科技有限公司 Automatic calibration device and calibration method for an industrial robot hand-eye camera
CN108356422A (en) * 2017-01-23 2018-08-03 宝山钢铁股份有限公司 On-line measurement, scrap dropping and finished product separation recognition method for continuous laser blanking of coiled strip
CN107150343A (en) * 2017-04-05 2017-09-12 武汉科技大学 System for grasping objects based on an NAO robot
CN107150343B (en) * 2017-04-05 2019-07-23 武汉科技大学 System for grasping objects based on an NAO robot
CN106903720A (en) * 2017-04-28 2017-06-30 安徽捷迅光电技术有限公司 Automatic correction method for coordinate vision of a conveying platform and a robot
CN106956292A (en) * 2017-04-28 2017-07-18 安徽捷迅光电技术有限公司 Coordinate vision physical correction method based on a conveying platform and a robot
WO2018214147A1 (en) * 2017-05-26 2018-11-29 深圳配天智能技术研究院有限公司 Robot calibration method and system, robot and storage medium
CN107336234A (en) * 2017-06-13 2017-11-10 赛赫智能设备(上海)股份有限公司 Reaction-type self-learning industrial robot and working method
WO2018228249A1 (en) * 2017-06-16 2018-12-20 东莞晶苑毛织制衣有限公司 Automatic detecting, clothes grabbing and encasing system and use method therefor
CN107192331A (en) * 2017-06-20 2017-09-22 佛山市南海区广工大数控装备协同创新研究院 Workpiece grabbing method based on binocular vision
CN107139003A (en) * 2017-06-27 2017-09-08 巨轮(广州)机器人与智能制造有限公司 Modular vision system preparation method
CN107633534A (en) * 2017-08-11 2018-01-26 宁夏巨能机器人股份有限公司 Robot hand posture correction system and method based on 3D visual recognition
WO2019100400A1 (en) * 2017-11-27 2019-05-31 Abb Schweiz Ag Apparatus and method for use with robot
CN111344119A (en) * 2017-11-27 2020-06-26 Abb瑞士股份有限公司 Apparatus and method for robot
US11642800B2 (en) 2017-11-27 2023-05-09 Abb Schweiz Ag Apparatus and method for use with robot
CN108038861A (en) * 2017-11-30 2018-05-15 深圳市智能机器人研究院 Multi-robot cooperative sorting method, system and device
CN111699078A (en) * 2017-12-08 2020-09-22 库卡德国有限公司 Operation of the robot
CN108655026B (en) * 2018-05-07 2020-08-14 上海交通大学 Robot rapid teaching sorting system and method
CN108655026A (en) * 2018-05-07 2018-10-16 上海交通大学 Robot rapid teaching sorting system and method
CN108845572A (en) * 2018-05-29 2018-11-20 盐城工学院 Industrial handling robot localization method
CN108526976A (en) * 2018-06-07 2018-09-14 芜湖隆深机器人有限公司 Cylinder end piece feeding, positioning and transfer device
CN108827154A (en) * 2018-07-09 2018-11-16 深圳辰视智能科技有限公司 Teaching-free robot grasping method, device and computer-readable storage medium
CN108994830A (en) * 2018-07-12 2018-12-14 上海航天设备制造总厂有限公司 System calibration method for milling robot off-line programming
CN108789414A (en) * 2018-07-17 2018-11-13 五邑大学 Intelligent robotic arm system based on three-dimensional machine vision and its control method
CN109454501A (en) * 2018-10-19 2019-03-12 江苏智测计量技术有限公司 Lathe on-line monitoring system
EP3895855A4 (en) * 2018-12-11 2022-05-11 Fuji Corporation Robot control system and robot control method
CN113015604A (en) * 2018-12-11 2021-06-22 株式会社富士 Robot control system and robot control method
CN113015604B (en) * 2018-12-11 2024-03-08 株式会社富士 Robot control system and robot control method
CN109604466A (en) * 2018-12-21 2019-04-12 岚士智能科技(上海)有限公司 Stamping part feeding robot and feeding method based on visual recognition
CN109579766A (en) * 2018-12-24 2019-04-05 苏州瀚华智造智能技术有限公司 Automatic product appearance detection method and system
CN109579766B (en) * 2018-12-24 2020-08-11 苏州瀚华智造智能技术有限公司 Automatic product appearance detection method and system
CN110298885A (en) * 2019-06-18 2019-10-01 仲恺农业工程学院 Stereoscopic vision recognition method for non-smooth spheroid targets, positioning and clamping detection device, and application thereof
CN110243376A (en) * 2019-06-28 2019-09-17 湖南三一快而居住宅工业有限公司 Indoor positioning method and indoor positioning system
CN110342252A (en) * 2019-07-01 2019-10-18 芜湖启迪睿视信息技术有限公司 Automatic article grabbing method and automatic grabbing device
CN110648361A (en) * 2019-09-06 2020-01-03 深圳市华汉伟业科技有限公司 Real-time pose estimation method and positioning and grabbing system of three-dimensional target object
CN110648361B (en) * 2019-09-06 2022-01-11 深圳市华汉伟业科技有限公司 Real-time pose estimation method and positioning and grabbing system of three-dimensional target object
CN112571410B (en) * 2019-09-27 2022-04-29 杭州萤石软件有限公司 Region determination method and device, mobile robot and system
CN112571410A (en) * 2019-09-27 2021-03-30 杭州萤石软件有限公司 Region determination method and device, mobile robot and system
CN111300481A (en) * 2019-12-11 2020-06-19 苏州大学 Robot grabbing pose correction method based on vision and laser sensor
CN111300481B (en) * 2019-12-11 2021-06-29 苏州大学 Robot grabbing pose correction method based on vision and laser sensor
CN111232664B (en) * 2020-03-18 2021-10-26 上海载科智能科技有限公司 Soft-package unstacking, unloading and stacking device using an industrial robot, and unstacking, unloading and stacking method
CN111232664A (en) * 2020-03-18 2020-06-05 上海载科智能科技有限公司 Soft-package unstacking, unloading and stacking device using an industrial robot, and unstacking, unloading and stacking method
CN114322752A (en) * 2020-09-30 2022-04-12 合肥欣奕华智能机器有限公司 Method, device and equipment for automatically conveying glass
CN114322752B (en) * 2020-09-30 2024-03-12 合肥欣奕华智能机器股份有限公司 Method, device and equipment for automatically conveying glass
CN115072357A (en) * 2021-03-15 2022-09-20 中国人民解放军96901部队24分队 Robot reloading automatic positioning method based on binocular vision
CN115072357B (en) * 2021-03-15 2023-07-07 中国人民解放军96901部队24分队 Robot reloading automatic positioning method based on binocular vision
CN113510697A (en) * 2021-04-23 2021-10-19 知守科技(杭州)有限公司 Manipulator positioning method, device, system, electronic device and storage medium
CN113510697B (en) * 2021-04-23 2023-02-14 知守科技(杭州)有限公司 Manipulator positioning method, device, system, electronic device and storage medium
CN113211450A (en) * 2021-06-17 2021-08-06 宝武集团马钢轨交材料科技有限公司 Train wheel visual identification positioning calibration device and method
CN113639641A (en) * 2021-09-18 2021-11-12 中国计量大学 Workpiece reference surface positioning detection device and method
CN115629043A (en) * 2022-11-28 2023-01-20 中国科学院过程工程研究所 Zinc concentrate material recovery sampling detection method and detection system
CN117400256A (en) * 2023-11-21 2024-01-16 扬州鹏顺智能制造有限公司 Industrial robot continuous track control method based on visual images

Similar Documents

Publication Publication Date Title
CN104786226A (en) Posture and moving track positioning system and method of robot grabbing online workpiece
CN204585232U (en) Robot pose and movement trajectory positioning system for grabbing an online workpiece
US9197810B2 (en) Systems and methods for tracking location of movable target object
CN102448679B (en) Method and system for extremely precise positioning of at least one object in the end position in space
US20160291571A1 (en) System and method for aligning a coordinated movement machine reference frame with a measurement system reference frame
Palmieri et al. A comparison between position-based and image-based dynamic visual servoings in the control of a translating parallel manipulator
CN102135776A (en) Industrial robot control system based on visual positioning and control method thereof
CN103895042A (en) Industrial robot workpiece positioning grabbing method and system based on visual guidance
Liu et al. Pose alignment of aircraft structures with distance sensors and CCD cameras
CN105073348A (en) A robot system and method for calibration
CN107150032A (en) Workpiece identification and sorting device and method based on multiple image acquisition devices
Kim Computer vision assisted virtual reality calibration
CN106737859A (en) Method for calibrating external parameters of a sensor and robot based on an invariant plane
Zhu et al. Kinematic self-calibration method for dual-manipulators based on optical axis constraint
CN115042175A (en) Method for adjusting tail end posture of mechanical arm of robot
Qiao Advanced sensor and target development to support robot accuracy degradation assessment
Hefele et al. Robot pose correction using photogrammetric tracking
JPH07237158A (en) Position-attitude detecting method and device thereof and flexible production system
Xu et al. Industrial robot base assembly based on improved Hough transform of circle detection algorithm
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
Fan et al. An automatic robot unstacking system based on binocular stereo vision
Yang et al. A coaxial vision assembly algorithm for un-centripetal holes on large-scale stereo workpiece using multiple-dof robot
Chen et al. Application of visual servoing to an X-ray based welding inspection robot
Qiao Advanced sensing development to support robot accuracy assessment and improvement
Qingda et al. Workpiece posture measurement and intelligent robot grasping based on monocular vision

Legal Events

Date Code Title Description
C06  Publication
PB01 Publication
EXSB Decision made by SIPO to initiate substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20150722)