CN111452034A - Double-camera machine vision intelligent industrial robot control system and control method - Google Patents


Info

Publication number
CN111452034A
Authority
CN
China
Prior art keywords
camera
coordinate system
image
robot
industrial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910052198.0A
Other languages
Chinese (zh)
Inventor
张胜
Current Assignee
Guangdong Ruobo Intelligent Robot Co ltd
Original Assignee
Guangdong Ruobo Intelligent Robot Co ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Ruobo Intelligent Robot Co ltd filed Critical Guangdong Ruobo Intelligent Robot Co ltd
Priority to CN201910052198.0A
Publication of CN111452034A
Legal status: Pending

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 — Programme-controlled manipulators
    • B25J9/16 — Programme controls
    • B25J9/1602 — Programme controls characterised by the control system, structure, architecture
    • B25J9/161 — Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1612 — Programme controls characterised by the hand, wrist, grip control
    • B25J9/1628 — Programme controls characterised by the control loop
    • B25J19/00 — Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 — Sensing devices
    • B25J19/021 — Optical sensing devices
    • B25J19/023 — Optical sensing devices including video camera means

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

A control system and control method for a double-camera machine vision intelligent industrial robot that remedy the shortcomings of the prior art. The system comprises an industrial robot, a robot controller, an industrial touch platform and a dual-camera device. The dual-camera device comprises a camera A and a camera B and is in communication connection with the industrial touch platform; the industrial touch platform is in communication connection with the robot controller; the robot controller is connected with the industrial robot; camera A is mounted on the end shaft arm of the industrial robot, and camera B is mounted above the flow work platform at the target placement end. Built from an industrial robot, a robot controller, an industrial touch panel all-in-one machine and a double-camera device, the system is clear and concise in composition and connection, highly operable and practical, can be quickly deployed in industrial assembly, handling and sorting systems, and meets the intelligent-upgrade demand of replacing manual labour with robots.

Description

Double-camera machine vision intelligent industrial robot control system and control method
Technical Field
The invention relates to the technical field of robots, in particular to a control system and a control method of an intelligent industrial robot with double-camera machine vision.
Background
Most current intelligent industrial robot systems with machine vision use a single camera. They detect the target object and the target position in one of the following ways, each with drawbacks:
1) only the target object is detected, not the target position, so there is a risk of an abnormal target position;
2) the target position is detected first and the target object afterwards, which makes the work poorly timed;
3) one system detects the target object and a second system detects the target position, which leads to too many systems, complex connections and delayed processing.
Disclosure of Invention
The invention provides a control system and a control method for a double-camera machine vision intelligent industrial robot, which overcome the technical defects in how the related art detects the target object and the target position.
In order to achieve the purpose, the invention adopts the following technical scheme:
The double-camera machine vision intelligent industrial robot control system comprises an industrial robot, a robot controller, an industrial touch platform and a double-camera device. The double-camera device comprises a camera A and a camera B; the double-camera device is in communication connection with the industrial touch platform, the industrial touch platform is in communication connection with the robot controller, and the robot controller is connected with the industrial robot. The camera A is mounted on the end shaft arm of the industrial robot, and the camera B is mounted above the flow work platform at the target placement end.
Further, the dual-camera device and the industrial touch panel all-in-one machine are connected through a USB or gigabit Ethernet.
Furthermore, the industrial touch platform is an industrial touch panel all-in-one machine.
Furthermore, the industrial touch platform and the robot controller are connected through a serial port, and the robot controller and the industrial robot are connected through a motor interface and an encoder interface.
Further, the camera A is fixedly mounted on the end shaft arm of the industrial robot with an L-shaped perforated steel plate.
A control method of a double-camera machine vision intelligent industrial robot comprises the following steps:
s1, determining the selection type of the camera A and the camera B according to the application requirement of the robot in the system needing to be adapted;
s2, determining the relation between a camera A coordinate system and a robot coordinate system through camera calibration, determining the relation between a camera B coordinate system and the robot coordinate system, converting the visual coordinate of the camera A into a coordinate in the robot coordinate system, and converting the visual coordinate of the camera B into a coordinate in the robot coordinate system;
s3, the industrial touch platform analyzes and processes the image information collected by the camera, and converts the pose of the target object in the image into the pose in the machine coordinate system, so as to obtain the coordinates of the grabbed position and the installed position of the target;
s4, the industrial touch platform judges the image analysis output result shot by the camera A, picks up the object or the part, rotates and adjusts the object or the part to a proper position, and then moves to the position above the flow water work platform at the target placement end;
s5, the industrial touch platform judges the image analysis output result shot by the camera B and determines that the target position is free of obstacles or the object position required to be assembled by the target position is correct;
and S6, the industrial touch platform judges according to the image analysis output result shot by the camera A and the image analysis output result shot by the camera B, and determines whether to place the object at the target position or stop the action immediately to give an alarm.
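A minimal sketch of the camera-to-robot relation determined in step S2, under the simplifying assumption that the work surface is a plane at fixed height, so the mapping from pixel coordinates to robot coordinates reduces to a 2-D affine transform fitted from a few taught point pairs (all numeric values below are illustrative, not from the patent):

```python
import numpy as np

def fit_pixel_to_robot(pixels, robots):
    """Least-squares affine map [x_r, y_r] = [u, v, 1] @ P, fitted from at
    least three pixel/robot point pairs taken on the (planar) work surface."""
    n = len(pixels)
    design = np.hstack([pixels, np.ones((n, 1))])        # rows of [u, v, 1]
    params, *_ = np.linalg.lstsq(design, robots, rcond=None)
    return params                                        # 3x2 matrix P

def pixel_to_robot(params, uv):
    """Convert one pixel coordinate to robot coordinates."""
    return np.append(uv, 1.0) @ params

# Three taught correspondences (pixel -> robot coordinates in mm)
px = np.array([[100.0, 100.0], [400.0, 100.0], [100.0, 300.0]])
rb = np.array([[0.0, 0.0], [150.0, 0.0], [0.0, 100.0]])
P = fit_pixel_to_robot(px, rb)
target = pixel_to_robot(P, np.array([400.0, 300.0]))     # -> [150.0, 100.0]
```

With more than three pairs the least-squares fit also averages out small measurement noise in the taught points.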
Further, the camera calibration in step S2 comprises:
determining the camera's internal and external parameters through internal-parameter calibration and external-parameter calibration, and establishing the relation between the camera's machine coordinates and image coordinates from these parameters;
based on this internal/external parameter relation, the camera is calibrated in the following steps:
s21, printing a black and white checkerboard, and pasting the checkerboard on a flat plate surface as a standard calibration object;
s22, shooting a plurality of photos in different directions for the calibration object by adjusting the direction of the calibration object;
s23, extracting black and white checkerboard corner points from the picture;
s24, estimating internal parameters and external parameters of the camera under the condition of ideal distortion-free;
s25, estimating a distortion coefficient under the actual radial distortion by using a least square method;
and S26, finally, optimizing and estimating through a maximum likelihood method, and determining the camera parameters.
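Step S25 can be illustrated with a tiny least-squares fit. Assuming the single-coefficient radial model p_d = p·(1 + k1·r²) on normalized image points (a simplification of the full distortion model; all values are synthetic), the coefficient k1 has a closed-form least-squares estimate:

```python
import numpy as np

def estimate_k1(ideal, distorted):
    """Least-squares estimate of the radial distortion coefficient k1,
    assuming the single-term model p_d = p * (1 + k1 * r**2) with
    normalized image points p and r**2 = x**2 + y**2."""
    r2 = np.sum(ideal**2, axis=1)              # r^2 for each point
    # Model: distorted - ideal = k1 * (r^2 * ideal); one-column least squares
    a = (ideal * r2[:, None]).ravel()
    b = (distorted - ideal).ravel()
    return float(np.dot(a, b) / np.dot(a, a))  # closed-form 1-D solution

# Synthetic check: distort points with a known k1, then recover it
pts = np.array([[0.1, 0.2], [0.3, -0.1], [-0.2, 0.25], [0.05, -0.3]])
k1_true = -0.08
dist = pts * (1 + k1_true * np.sum(pts**2, axis=1))[:, None]
k1_est = estimate_k1(pts, dist)                # -> -0.08
```

In practice the joint refinement of S26 (maximum-likelihood optimization over all parameters) is what tools such as OpenCV's `calibrateCamera` perform internally.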
Further, the S3 specifically includes:
s31, the industrial touch platform obtains required position information and angular information by carrying out preprocessing, feature extraction, template matching and position extraction operation on image information shot by the camera A, identifies objects or parts in a visual field and outputs the positions and angular orientations of the objects or the parts;
and S32, the industrial touch platform obtains the required position information and angular information by carrying out preprocessing, feature extraction, template matching and position extraction operation on the image shot by the camera B, identifies the object or part in the view field and outputs the position and angular orientation of the object or part.
Further, the internal and external parameters of the camera are determined through internal reference calibration and external reference calibration, and the relation between the machine coordinate of the camera and the image coordinate is established according to the internal and external parameters of the camera;
the method specifically comprises the following steps:
Let the machine coordinate system (x_w, y_w, z_w) be a three-dimensional rectangular coordinate system;
the camera coordinate system (x_c, y_c, z_c) is also a three-dimensional rectangular coordinate system: its origin lies at the optical centre of the lens, its x and y axes are parallel to the two sides of the image plane, and its z axis is the optical axis of the lens, perpendicular to the image plane;
machine coordinate system and camera coordinate system relationship:
\[
\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}
=
\begin{bmatrix} R & t \\ \mathbf{0}^T & 1 \end{bmatrix}
\begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}
\]

where R is a 3×3 rotation matrix, t is a 3×1 translation vector, (x_c, y_c, z_c, 1)^T are the homogeneous coordinates in the camera coordinate system, and (x_w, y_w, z_w, 1)^T are the homogeneous coordinates in the machine coordinate system;
the pixel coordinate system uov is a two-dimensional rectangular coordinate system, the origin o is located at the upper left corner of the image, the u-axis and the v-axis are respectively parallel to two sides of the image surface, and the unit of the coordinate axis in the pixel coordinate system is a pixel;
the image coordinate system XOY is in a translation relationship with the pixel coordinate system uov, the origin of the image coordinate system is the intersection point of the optical axis of the camera and the phase plane, i.e. the central point of the image, the unit of the coordinate axis is usually millimeter, and the X axis and the Y axis are respectively parallel to the u axis and the v axis; the two coordinate systems are thus transformed as follows:
\[
\begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
=
\begin{bmatrix} 1/dX & 0 & u_0 \\ 0 & 1/dY & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ 1 \end{bmatrix}
\]

where dX and dY are the physical sizes of a pixel along the X and Y axes, and (u_0, v_0) are the principal-point coordinates;
determining the relation between any point P in space and the projection point P' on the image according to the pinhole imaging principle, and expressing the relation by a matrix:
\[
s \begin{bmatrix} X \\ Y \\ 1 \end{bmatrix}
=
\begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}
\]

where s is a scale factor and f is the effective focal length, (x, y, z, 1)^T are the homogeneous coordinates of the spatial point P in the camera coordinate system oxyz, and (X, Y, 1)^T are the homogeneous coordinates of the projected image point P' in the image coordinate system OXY;
combining the above yields the relation between the pixel coordinate system and the machine coordinate system:

\[
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= M_1 M_2 M_3
\begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}
\]

where M_1 is the parameter matrix between pixel coordinates and image coordinates, M_2 is the projection parameter matrix, and M_3 is the parameter matrix between camera coordinates and machine coordinates.
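The composed mapping can be checked numerically. In the sketch below the camera parameters (focal length f, pixel sizes dX and dY, principal point, pose R and t) are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Illustrative parameters: 8 mm lens, 5 um pixels, VGA principal point
f, dX, dY, u0, v0 = 8.0, 0.005, 0.005, 320.0, 240.0
R = np.eye(3)                      # camera axes aligned with machine axes
t = np.array([0.0, 0.0, 500.0])    # camera 500 mm above the machine origin

M1 = np.array([[1/dX, 0, u0], [0, 1/dY, v0], [0, 0, 1]])     # image -> pixel
M2 = np.array([[f, 0, 0, 0], [0, f, 0, 0], [0, 0, 1, 0]])    # projection
M3 = np.vstack([np.hstack([R, t[:, None]]), [0, 0, 0, 1]])   # machine -> camera

def machine_to_pixel(p_w):
    """Project a machine-coordinate point (mm) to pixel coordinates."""
    suv = M1 @ M2 @ M3 @ np.append(p_w, 1.0)   # homogeneous s*[u, v, 1]
    return suv[:2] / suv[2]                    # divide out the scale factor s

uv = machine_to_pixel(np.array([10.0, -20.0, 0.0]))   # -> [352.0, 176.0]
```

Inverting this mapping for a point on a known work plane (fixed z) is what lets the system turn a detected pixel position back into a grasp coordinate.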
The step S1 further includes:
The model selection of camera A and camera B is determined according to the mounting position, positioning accuracy, working distance and field of view of the robot in the assembly, handling or sorting system to be adapted.
According to the technical scheme, compared with the prior art, the invention has the following beneficial effects:
1. The invention comprises an industrial robot, a robot controller, an industrial touch panel all-in-one machine and a dual-camera device (camera A and camera B). The system is clear and concise in composition and connection, highly operable and practical, can be quickly deployed in industrial assembly, handling and sorting systems, and meets the intelligent-upgrade demand of replacing manual labour with robots.
2. In engineering applications the invention determines the model selection of camera A and camera B from the mounting position, positioning accuracy, working distance, field of view and so on of the robot in the assembly, handling or sorting system to be adapted, and achieves self-calibration through projection-error correction and coordinate matching; it is widely applicable and flexible.
3. The industrial touch panel all-in-one machine makes its judgement from the image-analysis output of camera A together with that of camera B, and so determines the robot's required action. This process fuses several key technologies: machine vision, sensing, image processing and automatic control.
4. In engineering applications the industrial touch panel all-in-one machine can be connected to a local or wide area network through gigabit Ethernet, enabling remote control, debugging and setup. This greatly eases maintenance and makes the machine more efficient, faster and more convenient to use.
Drawings
FIG. 1 is a schematic structural diagram of an embodiment of the present invention;
FIG. 2 is a control block diagram of an embodiment of the present invention;
FIG. 3 is a flow chart of a camera calibration algorithm according to an embodiment of the invention;
FIG. 4 is a flow chart of the control of the mounting system in accordance with an embodiment of the present invention;
FIG. 5 is a flow chart of positioning installation according to an embodiment of the present invention;
FIG. 6 shows a mounting method of a camera A according to an embodiment of the present invention;
fig. 7 is a schematic view of an industrial robot control operation interface according to an embodiment of the present invention;
fig. 8 is an internal block diagram of a robot controller according to an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings:
As shown in fig. 1, the dual-camera machine vision intelligent industrial robot control system of the embodiment of the present invention includes an industrial robot, a robot controller, an industrial touch platform and a dual-camera device, where the industrial touch platform is an industrial touch panel all-in-one machine, and the dual-camera device includes a camera A and a camera B.
The dual-camera device is connected with the industrial touch platform through USB or gigabit Ethernet; the industrial touch platform is connected with the robot controller through a serial port; and the robot controller is connected with the industrial robot through a motor interface and an encoder interface.
As shown in fig. 6, camera A is mounted directly on the end shaft arm of the industrial robot, fixed with an L-shaped perforated steel plate that carries the camera, lens and light source as one unit; camera B is mounted above the flow work platform at the target placement end. The working process covers camera calibration, image acquisition and processing, and robot motion control.
Camera calibration of camera A and camera B comprises internal-parameter and external-parameter calibration. The internal parameters mainly include focal length, radial distortion and principal point (image centre); the external parameters mainly comprise the position and rotation angle of the camera relative to world coordinates (here called machine coordinates). The relation between machine coordinates and image coordinates can be established from these internal and external parameters, as follows:
machine coordinate system (x)w,yw,zw) Namely a three-dimensional rectangular coordinate system, and the spatial positions of the camera and the object to be measured can be described by taking the three-dimensional rectangular coordinate system as a reference. The position of the machine coordinate system can be freely determined according to actual conditionsAnd (4) determining. Camera coordinate system (x)c,yc,zc) The three-dimensional rectangular coordinate system is also provided, the origin is positioned at the optical center of the lens, the x axis and the y axis are respectively parallel to two sides of the phase plane, and the z axis is the optical axis of the lens and is vertical to the image plane.
Machine coordinate system and camera coordinate system relationship:
\[
\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}
=
\begin{bmatrix} R & t \\ \mathbf{0}^T & 1 \end{bmatrix}
\begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}
\]

where R is a 3×3 rotation matrix, t is a 3×1 translation vector, (x_c, y_c, z_c, 1)^T are the homogeneous coordinates of the camera coordinate system, and (x_w, y_w, z_w, 1)^T are the homogeneous coordinates of the machine coordinate system.
The pixel coordinate system uov is a two-dimensional rectangular coordinate system, the origin o is located at the upper left corner of the image, and the u-axis and the v-axis are respectively parallel to two sides of the image plane, reflecting the arrangement of the pixels in the camera CCD/CMOS chip. The unit of the coordinate axis in the pixel coordinate system is a pixel (integer).
The image coordinate system XOY is in a translational relationship with the pixel coordinate system uov, the origin of the image coordinate system is the intersection (principal point) of the camera optical axis and the phase plane, i.e., the center point of the image, the coordinate axes are usually in millimeters (mm), and the X-axis and the Y-axis are parallel to the u-axis and the v-axis, respectively. The two coordinate systems are thus transformed as follows:
\[
\begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
=
\begin{bmatrix} 1/dX & 0 & u_0 \\ 0 & 1/dY & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ 1 \end{bmatrix}
\]

where dX and dY are the physical sizes of a pixel along the X and Y axes, and (u_0, v_0) are the principal-point (image-origin) coordinates.
The relationship between any point P in space and the projection point P' on the image can be determined according to the pinhole imaging principle and can be represented by a matrix:
\[
s \begin{bmatrix} X \\ Y \\ 1 \end{bmatrix}
=
\begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}
\]

where s is a scale factor (s ≠ 0), f is the effective focal length (the distance from the optical centre to the image plane), (x, y, z, 1)^T are the homogeneous coordinates of the spatial point P in the camera coordinate system oxyz, and (X, Y, 1)^T are the homogeneous coordinates of the projected image point P' in the image coordinate system OXY.
In summary, the relationship between the pixel coordinate system and the machine coordinate system can be obtained as follows:
\[
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= M_1 M_2 M_3
\begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}
\]
where M1 is referred to as a parameter matrix of pixel coordinates and image coordinates, M2 is referred to as a projection parameter matrix, and M3 is referred to as a parameter matrix of camera coordinates and machine coordinates.
According to the calibration relationship between the internal and external parameters of the camera a and the camera B, the calibration of the camera can be obtained according to the following steps, as shown in fig. 3:
1) printing a black and white checkerboard, and pasting the checkerboard on a flat plate surface as a standard calibration object;
2) shooting photos in different directions for the calibration object by adjusting the direction of the calibration object;
3) extracting the angular points of the black and white checkerboard from the picture;
4) assuming an ideal, distortion-free model, estimate the five internal parameters and the six external parameters;
5) estimating a distortion coefficient under the actual radial distortion by using a least square method;
6) refine the estimate by maximum-likelihood optimization to improve its precision.
After camera A and camera B are calibrated, they can capture images of the work platform on the assembly line. The industrial touch panel all-in-one machine analyzes and processes these images chiefly through image filtering, image enhancement and template matching. Gaussian filtering removes noise, enhances contrast, sharpens edges and highlights features; image enhancement emphasizes selected image characteristics so that the image is clear and feature extraction becomes simpler and more accurate; template matching locates the same target through correspondence, similarity and consistency analysis of image content, features, structure, relations, texture, grey levels and so on. In this way the object or part to be assembled in the field of view is identified and detected. Feature-based template matching is therefore the key step: once the feature-matching requirement is met, the position and angular orientation of the object follow from the matched features.
The whole process applies preprocessing, feature extraction, template matching and position extraction to the acquired image information to obtain the required position and angle information, and converts the pose of the target object in the image into a pose in the machine coordinate system, yielding the coordinates of the target's grasp position and mounting position. The control flow of the assembly system is shown in fig. 4.
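The template-matching step is typically delegated to a library call such as OpenCV's cv2.matchTemplate; as a transparent illustration, a brute-force normalized cross-correlation over a synthetic image might look like this (the image contents are made up for the example):

```python
import numpy as np

def match_template(image, template):
    """Brute-force normalized cross-correlation (the TM_CCOEFF_NORMED idea):
    return the top-left (row, col) of the best match in `image`."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Synthetic 20x20 image with a cross-shaped "part" at row 5, col 12
img = np.zeros((20, 20))
pat = np.array([[0.0, 1.0, 0.0], [1.0, 1.0, 1.0], [0.0, 1.0, 0.0]])
img[5:8, 12:15] = pat
loc = match_template(img, pat)   # -> (5, 12)
```

Real systems use the library routine for speed and then recover the angular orientation either from rotated template sets or from the matched feature geometry.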
The robot motion control is completed through human-computer interaction by intelligent robot control software on the industrial touch-control tablet all-in-one machine, and as shown in fig. 7, an intelligent robot control interface on the industrial touch-control tablet all-in-one machine mainly comprises a user login button, an exit button, a camera a calibration parameter setting dialog box, a camera B calibration parameter setting dialog box, a robot calibration dialog box, a communication setting dialog box, an image preprocessing dialog box, a template matching setting dialog box and the like.
The control principle of the robot is as follows: the Windows program calls a robot library function; the library function sends an instruction to the robot controller over the serial port; and the robot controller executes the instruction. Fig. 8 is an internal block diagram of the robot controller of the embodiment. Its main control chip is a 32-bit MPU with various high-speed computation functions such as differentiation, integration, Fourier transform and filtering, and it embeds an SRing communication function, an encoder count-conversion function, a pulse input/output function and a digital signal processing function. Because it is dedicated to servo-motor control, the design is simple and efficient, and the communication capability is particularly strong: in SRing mode one master station can serve up to 8 slave stations, with a transmission speed of 5 Mbps and 128 bits of data per channel.
The 32-bit MPU (central processing unit) comprises:
(1) Timing section: an internal DPLL (F9H3) driven by an external 40 MHz crystal oscillator generates a 200 MHz periodic pulse; divided-down cycles serve as the clock for encoding and decoding.
(2) Encoding unit: generates the various internal timers with periodic synchronization; a data structure is assembled from the data-memory contents and control bits, then modulated and encoded.
(3) Decoding unit: the data-receiving decoder RPLL demodulates incoming data and writes it to the data memory.
(4) Transmit data buffer: 64 W × 16-bit RAM.
(5) Receive data buffer: 64 W × 16-bit RAM.
(6) Transmit buffer: receives data from the MPU and stores it temporarily.
(7) Receive buffer: sends the temporarily stored data back to the MPU.
From the above it can be seen that the robot controller of the embodiment is highly integrated and miniaturized. Paired with a lightweight intelligent industrial robot, it can be deployed on a production site very conveniently; a user can generally install it unaided and perform simple debugging by following the manual. This greatly eases use, effectively broadens the application scenarios of the dual-camera machine vision intelligent industrial robot control system, and markedly raises its utilization.
The working principle of the embodiment is shown in fig. 2. Camera A and camera B capture images of the pick-up position and the placement position, convert them into digital image signals and transmit them to the industrial touch panel all-in-one machine. The all-in-one machine computes the scene positions of the targets from the images; the control program then generates control instructions and sends them to the robot controller over RS232. The robot controller interprets the instructions into pulse counts and distributes them to the servo motors, which act as instructed, completing the whole operation: the object or part seen by camera A is placed accurately at the target position seen by camera B.
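The patent does not specify the wire format of the RS232 instructions, so the frame layout below (header byte, command id, six little-endian 32-bit pulse counts, XOR checksum) is purely hypothetical; it only illustrates how a control instruction could be packed before being written to the serial port:

```python
import struct

# Hypothetical frame layout, illustrative only: the patent describes
# pulse counts sent to the servo motors but not the byte-level protocol.
HEADER, CMD_MOVE = 0xAA, 0x01

def build_move_frame(pulses):
    """Pack six per-joint pulse counts into one command frame:
    header (1 B) + command id (1 B) + 6 x int32 LE (24 B) + XOR checksum (1 B)."""
    assert len(pulses) == 6, "one pulse count per joint"
    body = struct.pack("<BB6i", HEADER, CMD_MOVE, *pulses)
    checksum = 0
    for b in body:
        checksum ^= b                 # simple XOR over all preceding bytes
    return body + bytes([checksum])

frame = build_move_frame([1000, -500, 0, 250, 0, 0])   # 27 bytes total
```

With pyserial the frame would then be written to the port with something like serial.Serial("COM3", 115200).write(frame); port name and baud rate are likewise assumptions.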
The positioning and installation process of the present embodiment is shown in fig. 5. Namely, the method for realizing the double-camera machine vision intelligent industrial robot system comprises the following steps:
(1) determining the model selection of camera A and camera B according to the mounting position, positioning accuracy, working distance, field of view and the like of the robot in the assembly, handling or sorting system to be adapted;
(2) determining the relation between a camera A coordinate system and a robot coordinate system through camera calibration, determining the relation between a camera B coordinate system and the robot coordinate system, converting the visual coordinate of the camera A into the coordinate in the robot coordinate system, and converting the visual coordinate of the camera B into the coordinate in the robot coordinate system;
(3) the industrial touch flat panel all-in-one machine identifies an object or a part in a view field by analyzing and processing an image shot by a camera A, and outputs the position and the angular direction of the object or the part;
(4) the industrial touch panel all-in-one machine identifies an object or a part in a view field by analyzing and processing an image shot by the camera B, and outputs the position and the angular direction of the object or the part;
(5) the industrial touch panel all-in-one machine judges an image analysis output result shot by the camera A, picks up an object or a part, rotates and adjusts the object or the part to a proper position, and then moves to a position above a flow work platform at a target placing end;
(6) the industrial touch panel all-in-one machine judges the image-analysis output of camera B and confirms that the target position is unobstructed or that the object onto which the target is to be assembled is correctly positioned;
(7) and the industrial touch flat panel all-in-one machine makes a judgment according to the image analysis output result shot by the camera A and the image analysis output result shot by the camera B, and determines whether to place the object at the target position or stop the action immediately to give an alarm.
In summary, the embodiment of the invention is a small-sized intelligent industrial robot system, which has the characteristics of flexibility, high efficiency and reliability, and the method is easy to realize and has strong operability in system design.
The above-mentioned embodiments are merely illustrative of the preferred embodiments of the present invention, and do not limit the scope of the present invention, and various modifications and improvements of the technical solutions of the present invention by those skilled in the art should fall within the protection scope of the present invention without departing from the design spirit of the present invention.

Claims (10)

1. A dual-camera machine-vision intelligent industrial robot control system, comprising an industrial robot, a robot controller, and an industrial touch platform, characterized in that: the system further comprises a dual-camera device comprising a camera A and a camera B; the dual-camera device is in communication connection with the industrial touch platform, the industrial touch platform is in communication connection with the robot controller, and the robot controller is connected to the industrial robot; the camera A is mounted on the end shaft arm of the industrial robot, and the camera B is mounted above the assembly-line work platform at the target placement end.
2. The dual-camera machine-vision intelligent industrial robot control system of claim 1, wherein: the dual-camera device is connected to the industrial touch-panel all-in-one machine via USB or Gigabit Ethernet.
3. The dual camera machine vision intelligent industrial robot control system of claim 2, wherein: the industrial touch platform is an industrial touch panel all-in-one machine.
4. The dual camera machine vision intelligent industrial robot control system of claim 3, wherein: the industrial touch platform is connected with the robot controller through a serial port, and the robot controller is connected with the industrial robot through a motor interface and an encoder interface.
5. The dual-camera machine-vision intelligent industrial robot control system of claim 4, wherein the camera A is fixedly mounted on the end shaft arm of the industrial robot by means of an L-shaped perforated steel plate.
6. A control method for a dual-camera machine-vision intelligent industrial robot, characterized in that:
the method comprises the following steps:
s1, determining the camera types of the camera A and the camera B according to the application requirements of the robot in the system to be adapted;
s2, determining, through camera calibration, the relationship between the camera A coordinate system and the robot coordinate system and the relationship between the camera B coordinate system and the robot coordinate system, and converting the visual coordinates of the camera A and the camera B into coordinates in the robot coordinate system;
s3, the industrial touch platform analyzes and processes the image information collected by the cameras, and converts the pose of the target object in the image into a pose in the machine coordinate system, thereby obtaining the coordinates of the grasping position and the mounting position of the target;
s4, the industrial touch platform evaluates the image-analysis output for the camera A, picks up the object or part, rotates it to the proper orientation, and then moves it above the assembly-line work platform at the target placement end;
s5, the industrial touch platform evaluates the image-analysis output for the camera B and confirms that the target position is free of obstacles and that the object to be assembled at the target position is correctly placed;
s6, based on the image-analysis outputs of the camera A and the camera B, the industrial touch platform decides whether to place the object at the target position or to stop the action immediately and give an alarm.
7. The dual-camera machine-vision intelligent industrial robot control method of claim 6, wherein: the camera calibration in step S2 comprises:
determining the internal and external parameters of a camera through intrinsic and extrinsic calibration, and establishing the relationship between the machine coordinates and the image coordinates of the camera from these parameters;
the camera calibration is carried out according to the following steps:
s21, printing a black-and-white checkerboard and attaching it to a flat plate surface as the standard calibration object;
s22, capturing several photographs of the calibration object from different orientations by adjusting its orientation;
s23, extracting the checkerboard corner points from the photographs;
s24, estimating the internal and external parameters of the camera under the ideal distortion-free assumption;
s25, estimating the coefficients of the actual radial distortion by the least-squares method;
s26, finally refining the estimates by maximum-likelihood optimization to determine the camera parameters.
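Step S25 can be illustrated with a small least-squares fit. The function below is a hypothetical NumPy sketch, assuming the ideal (undistorted) normalized points from step S24 are available and modeling only radial distortion with two coefficients:

```python
import numpy as np

def estimate_radial_distortion(ideal, observed):
    """Least-squares estimate of radial distortion coefficients k1, k2.

    ideal    : (N, 2) undistorted normalized image points (from step S24)
    observed : (N, 2) measured (distorted) points
    Model: observed = ideal * (1 + k1*r^2 + k2*r^4), with r^2 = x^2 + y^2.
    """
    r2 = np.sum(ideal ** 2, axis=1)  # squared radius of each ideal point
    # Stack the x- and y-equations into one linear system A @ [k1, k2] = b.
    A = np.concatenate([ideal[:, 0:1] * np.column_stack([r2, r2 ** 2]),
                        ideal[:, 1:2] * np.column_stack([r2, r2 ** 2])])
    b = np.concatenate([observed[:, 0] - ideal[:, 0],
                        observed[:, 1] - ideal[:, 1]])
    k, *_ = np.linalg.lstsq(A, b, rcond=None)
    return k  # array([k1, k2])
```

Steps S21-S26 as a whole correspond to Zhang's planar calibration method, for which library routines such as OpenCV's `calibrateCamera` perform S24-S26 internally.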
8. The dual-camera machine-vision intelligent industrial robot control method of claim 6, wherein step S3 specifically comprises:
s31, the industrial touch platform performs preprocessing, feature extraction, template matching, and position extraction on the image information captured by the camera A to obtain the required position and angular information, identifies the object or part in the field of view, and outputs its position and angular orientation;
s32, the industrial touch platform performs preprocessing, feature extraction, template matching, and position extraction on the image captured by the camera B to obtain the required position and angular information, identifies the object or part in the field of view, and outputs its position and angular orientation.
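The "template matching + position extraction" operations of steps S31/S32 can be sketched as a plain normalized cross-correlation search. This is a minimal illustrative stand-in, not the claimed implementation; a production system would typically use an optimized routine such as OpenCV's `matchTemplate`:

```python
import numpy as np

def match_template(image, template):
    """Locate a template in a grayscale image by normalized cross-correlation.

    Scans every window of the image, scores it against the (mean-removed)
    template, and returns the (row, col) of the best-matching top-left corner.
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt(np.sum(t ** 2))
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt(np.sum(w ** 2)) * t_norm
            score = np.sum(w * t) / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos
```

The returned pixel position would then be converted to robot coordinates through the calibration relationship of claim 9.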
9. The dual-camera machine-vision intelligent industrial robot control method of claim 7, wherein determining the internal and external parameters of the camera through intrinsic and extrinsic calibration, and establishing the relationship between the machine coordinates and the image coordinates of the camera from these parameters, specifically comprises:
let the machine coordinate system (x_w, y_w, z_w) be a three-dimensional rectangular coordinate system;
the camera coordinate system (x_c, y_c, z_c) is also a three-dimensional rectangular coordinate system, with its origin at the optical center of the lens, its x-axis and y-axis respectively parallel to the two sides of the image plane, and its z-axis along the optical axis of the lens, perpendicular to the image plane;
the relationship between the machine coordinate system and the camera coordinate system is:
$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}$$
where R is a 3×3 rotation matrix, t is a 3×1 translation vector, (x_c, y_c, z_c, 1)^T are the homogeneous coordinates in the camera coordinate system, and (x_w, y_w, z_w, 1)^T are the homogeneous coordinates in the machine coordinate system;
the pixel coordinate system uov is a two-dimensional rectangular coordinate system; its origin o is located at the upper-left corner of the image, the u-axis and v-axis are respectively parallel to the two sides of the image plane, and the coordinate axes are measured in pixels;
the image coordinate system XOY is related to the pixel coordinate system uov by a translation; the origin of the image coordinate system is the intersection of the camera optical axis with the image plane (i.e. the image center), its coordinate axes are usually in millimeters, and the X-axis and Y-axis are respectively parallel to the u-axis and v-axis; the transformation between the two coordinate systems is:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/dX & 0 & u_0 \\ 0 & 1/dY & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ 1 \end{bmatrix}$$
where dX and dY are the physical sizes of a pixel along the X and Y axes, and (u_0, v_0) are the principal point coordinates;
according to the pinhole imaging principle, the relationship between any point P in space and its projection point P' on the image, expressed in matrix form, is:
$$s \begin{bmatrix} X \\ Y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}$$
where s is a scale factor, f is the effective focal length, (x, y, z, 1)^T are the homogeneous coordinates of the spatial point P in the camera coordinate system oxyz, and (X, Y, 1)^T are the homogeneous coordinates of the projected image point P' in the image coordinate system OXY;
combining the above, the relationship between the pixel coordinate system and the machine coordinate system is obtained as:
$$s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M_1 M_2 M_3 \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}$$
where M_1 is the parameter matrix relating pixel coordinates to image coordinates, M_2 is the projection parameter matrix, and M_3 is the parameter matrix relating camera coordinates to machine coordinates.
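The chain of matrices in claim 9 can be composed numerically. The sketch below builds M_3 (machine to camera), M_2 (pinhole projection), and M_1 (image to pixel) exactly as defined above and projects a machine-coordinate point to pixel coordinates; the parameter values are hypothetical:

```python
import numpy as np

def machine_to_pixel(p_w, R, t, f, dX, dY, u0, v0):
    """Project a machine-coordinate point to pixel coordinates via M1·M2·M3.

    M3: [R t; 0 1] extrinsics (machine -> camera coordinates)
    M2: pinhole projection with effective focal length f
    M1: pixel-size scaling (dX, dY) and principal point (u0, v0)
    """
    M3 = np.eye(4)
    M3[:3, :3], M3[:3, 3] = R, t
    M2 = np.array([[f, 0, 0, 0],
                   [0, f, 0, 0],
                   [0, 0, 1, 0]], float)      # s*(X, Y, 1)^T = M2*(x, y, z, 1)^T
    M1 = np.array([[1 / dX, 0, u0],
                   [0, 1 / dY, v0],
                   [0, 0, 1]], float)         # (u, v, 1)^T = M1*(X, Y, 1)^T
    p = M1 @ M2 @ M3 @ np.append(p_w, 1.0)    # homogeneous pixel coordinates
    return p[:2] / p[2]                       # divide out the scale factor s
```

For example, with R = I, t = 0, f = 10 mm, dX = dY = 10 µm, and principal point (320, 240), a point at (0.1, 0.2, 1.0) m projects to pixel (420, 440).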
10. The dual-camera machine-vision intelligent industrial robot control method of claim 6, wherein step S1 further comprises:
determining the camera types of the camera A and the camera B according to the mounting position, positioning accuracy, working distance, and field of view required by the robot in the assembly, handling, or sorting system to be adapted.
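The selection criteria of claim 10 imply a simple sizing rule. As an illustrative, non-claimed calculation, the minimum sensor resolution along one axis follows from the field-of-view width and the required positioning precision; the safety factor is an assumption:

```python
import math

def min_sensor_resolution(fov_mm: float, precision_mm: float,
                          pixels_per_feature: float = 2.0) -> int:
    """Minimum pixel count along one axis so that one pixel spans no more
    than precision_mm / pixels_per_feature of the field of view.

    pixels_per_feature >= 2 is a common rule of thumb that leaves margin
    for sub-pixel localization; treat all values here as assumptions.
    """
    return math.ceil(fov_mm * pixels_per_feature / precision_mm)
```

For example, a 400 mm field of view with 0.5 mm required precision needs at least 1600 pixels along that axis, so a 1920-pixel-wide sensor would suffice.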
CN201910052198.0A 2019-01-21 2019-01-21 Double-camera machine vision intelligent industrial robot control system and control method Pending CN111452034A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910052198.0A CN111452034A (en) 2019-01-21 2019-01-21 Double-camera machine vision intelligent industrial robot control system and control method


Publications (1)

Publication Number Publication Date
CN111452034A true CN111452034A (en) 2020-07-28

Family

ID=71675389

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910052198.0A Pending CN111452034A (en) 2019-01-21 2019-01-21 Double-camera machine vision intelligent industrial robot control system and control method

Country Status (1)

Country Link
CN (1) CN111452034A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112083317A (en) * 2020-09-21 2020-12-15 深圳市控汇智能股份有限公司 High efficiency industrial robot detecting system
CN112739192A (en) * 2020-12-30 2021-04-30 深圳市卓兴半导体科技有限公司 Automatic positioning method and system of multi-station equipment and laminating equipment
CN112847321A (en) * 2021-01-04 2021-05-28 扬州市职业大学(扬州市广播电视大学) Industrial robot visual image recognition system based on artificial intelligence
CN113709362A (en) * 2021-08-05 2021-11-26 深圳光远智能装备股份有限公司 Dual-camera alignment system for precise positioning
CN114419437A (en) * 2022-01-12 2022-04-29 湖南视比特机器人有限公司 Workpiece sorting system based on 2D vision and control method and control device thereof
CN115511967A (en) * 2022-11-17 2022-12-23 歌尔股份有限公司 Visual positioning method, device and system
CN115981178A (en) * 2022-12-19 2023-04-18 广东若铂智能机器人有限公司 Simulation system and method for fish and aquatic product slaughtering
CN115981178B (en) * 2022-12-19 2024-05-24 广东若铂智能机器人有限公司 Simulation system for slaughtering fish and aquatic products

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030168317A1 (en) * 2002-01-14 2003-09-11 Fromme Christopher C. Conveyor belt inspection system and method
US20140046486A1 (en) * 2012-08-08 2014-02-13 Canon Kabushiki Kaisha Robot device
CN105234943A (en) * 2015-09-09 2016-01-13 大族激光科技产业集团股份有限公司 Industrial robot demonstration device and method based on visual recognition
US20160059419A1 (en) * 2014-09-03 2016-03-03 Canon Kabushiki Kaisha Robot apparatus and method for controlling robot apparatus
CN106695792A (en) * 2017-01-05 2017-05-24 中国计量大学 Tracking and monitoring system and method of stacking robot based on machine vision
CN106826817A (en) * 2017-01-11 2017-06-13 河北省自动化研究所 Double feedback mechanical arm automatic assembling and disassembling system and methods
CN107478203A (en) * 2017-08-10 2017-12-15 王兴 A kind of 3D imaging devices and imaging method based on laser scanning
WO2018043525A1 (en) * 2016-09-02 2018-03-08 倉敷紡績株式会社 Robot system, robot system control device, and robot system control method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘常 (Liu Chang): "双相机机器人视觉引导系统" [Dual-camera robot vision guidance system], 《设备管理与维修》 [Equipment Management and Maintenance] *


Similar Documents

Publication Publication Date Title
CN111452034A (en) Double-camera machine vision intelligent industrial robot control system and control method
CN107767423B (en) mechanical arm target positioning and grabbing method based on binocular vision
CN107255476B (en) Indoor positioning method and device based on inertial data and visual features
WO2019170164A1 (en) Depth camera-based three-dimensional reconstruction method and apparatus, device, and storage medium
US9995578B2 (en) Image depth perception device
CN110246177B (en) Automatic wave measuring method based on vision
CN110782496B (en) Calibration method, calibration device, aerial photographing equipment and storage medium
KR100776805B1 (en) Efficient image transmission method and apparatus using stereo vision processing for intelligent service robot system
CN111673735A (en) Mechanical arm control method and device based on monocular vision positioning
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
CN111178317A (en) Detection positioning method, system, device, electronic equipment and storage medium
CN108453739B (en) Stereoscopic vision positioning mechanical arm grabbing system and method based on automatic shape fitting
CN113379839B (en) Ground visual angle monocular vision odometer method based on event camera system
CN102785719A (en) Wall-climbing robot, system and method for shooting water gage images of ship
WO2020063058A1 (en) Calibration method for multi-degree-of-freedom movable vision system
CN108900775B (en) Real-time electronic image stabilization method for underwater robot
CN113361365A (en) Positioning method and device, equipment and storage medium
CN115714855A (en) Three-dimensional visual perception method and system based on stereoscopic vision and TOF fusion
CN110355758B (en) Machine following method and equipment and following robot system
CN111290584A (en) Embedded infrared binocular gesture control system and method
CN111640129B (en) Visual mortar recognition system applied to indoor wall construction robot
CN112509138B (en) LCOS-based high-precision three-dimensional reconstruction system for indoor plastering robot
Wang et al. A vision location system design of glue dispensing robot
CN102316272B (en) Remote controller control method, apparatus thereof and remote controller
CN107121131B (en) A kind of horizontal relative pose recognition methods of binocular camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
DD01 Delivery of document by public notice
Addressee: Guangdong Ruopai Intelligent Robot Co.,Ltd. (the principal of the patent)
Document name: Deemed withdrawal notice
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20200728