CN112008696A - Industrial robot system based on vision - Google Patents

Industrial robot system based on vision

Info

Publication number
CN112008696A
CN112008696A (application CN202010928493.0A)
Authority
CN
China
Prior art keywords
pose
offset
article
standard
industrial robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010928493.0A
Other languages
Chinese (zh)
Inventor
韩永远
单文泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Phoenix Nanjing Intelligent Manufacturing Technology Engineering Co ltd
Original Assignee
Phoenix Nanjing Intelligent Manufacturing Technology Engineering Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Phoenix Nanjing Intelligent Manufacturing Technology Engineering Co ltd filed Critical Phoenix Nanjing Intelligent Manufacturing Technology Engineering Co ltd
Priority to CN202010928493.0A priority Critical patent/CN112008696A/en
Publication of CN112008696A publication Critical patent/CN112008696A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/02 Sensing devices

Abstract

The invention relates to a vision-based industrial robot system, comprising: a robotic arm; a gripping mechanism mounted to the robotic arm and configured to grip an article; a vision system mounted at the gripping mechanism and configured to visually acquire information on the actual pose of an article to be gripped; a computing device configured to compute an offset between the actual pose of the article and a standard workpiece pose of the article; and a programmable logic controller coupled to the robotic arm, the gripping mechanism, and the computing device, the programmable logic controller being configured to perform pose compensation on the gripping action of the robotic arm based on the offset so that the gripping mechanism can accurately grip the article, wherein the pose compensation includes at least one of position compensation and attitude compensation.

Description

Industrial robot system based on vision
Technical Field
The present invention relates to the field of automation and, more particularly, to a vision-based industrial robot system.
Background
Against a background of rising labor costs, enterprise demand for automation and intelligent products continues to grow, and robots of different functions and types are gradually being adopted. The assembly robot, one of the most important types in the industrial field, is used ever more widely in production, and its design and development are mature. However, owing to factors such as assembly precision, price, and degree of intelligence, assembly robots remain difficult for most small and medium enterprises to adopt. In many assembly plants, significant manual labor is still required to assemble products.
In most application scenarios, industrial robots operate according to fixed programs: the initial and final poses of an object are specified in advance, and the quality of the work task is guaranteed by the positioning precision of the production line. High-quality operation therefore requires a relatively fixed line with high positioning precision, which reduces production flexibility and greatly increases cost; production-line flexibility and product quality are thus in tension.
Therefore, there is a need for a robotic assembly system that can address the above problems.
Disclosure of Invention
The invention relates to a vision-based industrial robot system, comprising: a robotic arm; a gripping mechanism mounted to the robotic arm and configured to grip an article; a vision system mounted at the gripping mechanism and configured to visually acquire information on the actual pose of an article to be gripped; a computing device configured to compute an offset between the actual pose of the article and a standard workpiece pose of the article; and a programmable logic controller coupled to the robotic arm, the gripping mechanism, and the computing device, the programmable logic controller being configured to perform pose compensation on the gripping action of the robotic arm based on the offset so that the gripping mechanism can accurately grip the article, wherein the pose compensation includes at least one of position compensation and attitude compensation.
The industrial robot system as described above may be configured to: establish a workpiece coordinate system at the position of an article to be gripped, with the tip rotation center of the robotic arm located at a standard gripping position in the established workpiece coordinate system; the programmable logic controller is configured to: rotate and translate the workpiece coordinate system according to the offset; and reposition the tip rotation center of the robotic arm to the standard gripping position in the rotated and translated workpiece coordinate system.
The industrial robot system as described above, the vision system being configured to calibrate the acquired image plane of the article against the workpiece coordinate system, so that the actual pose of the article determined by the vision system can be converted into actual coordinate values in the workpiece coordinate system; the computing device being configured to compute a position offset and an angle offset between the actual coordinate values of the article and standard coordinate values of the article, wherein the standard workpiece pose of the article corresponds to the standard coordinate values preset in the workpiece coordinate system; and the programmable logic controller being configured to rotate the workpiece coordinate system according to the calculated angle offset, translate it according to the calculated position offset, and reposition the tip rotation center of the robotic arm to the standard gripping position in the rotated and translated workpiece coordinate system.
The industrial robot system as described above, the vision system comprising a camera configured to capture an image of the article to be gripped, the computing device mapping the pixel locations of the article in the image to coordinate values in the workpiece coordinate system.
The industrial robot system as described above, the computing device being configured to: obtain the standard coordinate values (x, y, θ1) of the article; calculate the actual coordinate values (x1, y1, θ2) of the article from the image of the article; obtain the coordinate values (x0, y0) corresponding to the standard gripping position at which the rotation center of the robotic arm is located; calculate the angular offset β = θ2 − θ1 and the translational offsets:
offset_x = (x1 − x0)cosβ − (y1 − y0)sinβ + x0 − x,
offset_y = (x1 − x0)sinβ + (y1 − y0)cosβ + y0 − y,
and send the angular offset and the translational offsets to the programmable logic controller.
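The offset computation described above can be sketched in Python. This is a minimal illustration of the published formulas only; the function name, argument order, and tuple conventions are choices made here, not part of the patent:

```python
import math

def compute_offsets(standard, actual, center):
    """Compute the pose compensation offsets described in the patent.

    standard: (x, y, theta1)   standard coordinate values of the article
    actual:   (x1, y1, theta2) actual coordinate values from the vision system
    center:   (x0, y0)         standard gripping position of the arm's
                               tip rotation center
    Angles are in radians. Returns (beta, offset_x, offset_y).
    """
    x, y, theta1 = standard
    x1, y1, theta2 = actual
    x0, y0 = center

    # Angular offset between actual and standard article orientation.
    beta = theta2 - theta1

    # Rotate the actual point about the rotation center by beta, then
    # subtract the standard coordinates to get the translational offsets.
    offset_x = (x1 - x0) * math.cos(beta) - (y1 - y0) * math.sin(beta) + x0 - x
    offset_y = (x1 - x0) * math.sin(beta) + (y1 - y0) * math.cos(beta) + y0 - y
    return beta, offset_x, offset_y
```

As a sanity check, when the actual pose coincides with the standard pose all three offsets are zero, which matches the intent of the formulas.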
The industrial robot system as described above, comprising a 6-axis robot, the vision system and the gripping mechanism being mounted on the 6th axis of the robotic arm.
In the industrial robot system as described above, the standard workpiece pose of the article is preset or is captured by the vision system.
The industrial robot system as described above, the vision system further comprising a light source.
The industrial robot system as described above, the gripping mechanism includes a gripping jaw for gripping an article and a gripping jaw cylinder for driving the gripping jaw to move.
The industrial robot system as described above, configured to perform an assembly task.
Drawings
To further clarify the above and other advantages and features of embodiments of the present invention, a more particular description of embodiments of the present invention will be rendered by reference to the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope as claimed.
FIG. 1 is a schematic illustration of a robot in operation according to one embodiment of the present invention;
FIG. 2 is a schematic view of a grasping system that may be mounted to the robot of FIG. 1 according to an embodiment of the present invention;
FIG. 3 is a flow diagram of the operation of a vision-based positioning robotic system according to one embodiment of the present invention;
FIG. 4 is a schematic illustration of a mapping relationship between a camera image plane and a robot's object coordinate system according to one embodiment of the present invention;
FIG. 5 is a schematic diagram of a method for computing robot pose compensation within a computing device of a robotic system, according to one embodiment of the present invention;
fig. 6 is a schematic diagram for calculating an offset amount that a robot needs to compensate according to an embodiment of the present invention.
Detailed Description
The following detailed description refers to the accompanying drawings. The drawings show, by way of illustration, specific embodiments in which the claimed subject matter may be practiced. It is to be understood that the following detailed description is intended for purposes of illustration, and is not to be construed as limiting the invention; those skilled in the art, having the benefit of this disclosure, may effect numerous modifications thereto and changes may be made without departing from the scope and spirit of the claimed subject matter.
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of various described embodiments. It will be apparent, however, to one skilled in the art that the various embodiments described may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail as not to unnecessarily obscure aspects of the embodiments. Unless defined otherwise, technical and scientific terms used herein shall have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
Embodiments of the present application are exemplary implementations or examples. Reference in the specification to "an embodiment," "one embodiment," "some embodiments," "various embodiments," or "other embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments, of the technology. The various appearances "an embodiment," "one embodiment," or "some embodiments" are not necessarily all referring to the same embodiments. Elements or aspects from one embodiment may be combined with elements or aspects of another embodiment.
Visual guidance and positioning is an ideal tool for resolving the conflict between production-line flexibility and product quality. The technique consists mainly of perceiving the robot's specific working environment in real time through a vision system and adjusting the robot's actions according to changes in that environment, thereby guaranteeing accurate completion of the task. Ideally, the robot first grasps the approximate orientation of an object through visual observation; during gripping, it observes the deviation between the robotic arm and the object while moving, uses that deviation as reference data for adjusting direction, gradually corrects the arm's direction of movement, and thus grips the object accurately.
In this case, even a large error in the adjustment or positioning of the production line has little influence on the accuracy of the robot's operation. The vision system in effect provides an external closed-loop control mechanism, ensuring that the robot automatically compensates for errors caused by environmental changes.
Fig. 1 is a schematic illustration of a robot 100 in operation according to an embodiment of the present invention. As shown in fig. 1, the robot 100 may have robotic arms and a corresponding control system. In one non-limiting example of the invention, the robot 100 may be a 6-axis robot. The robot 100 may be an industrial robot, for example, configured for performing assembly tasks. However, it is to be understood that the robot 100 of the present invention is not limited to 6-axis robots, nor to being configured to perform assembly tasks.
Fig. 2 is a schematic diagram of a grasping system 200 that may be mounted to the robot of fig. 1 according to an embodiment of the present invention. The grasping system 200 may be mounted at the end of a robotic arm of a robot. Where the robot 100 is a 6-axis robot, the grasping system 200 may be mounted on the 6th axis, so that the robotic arm of the robot 100 can bring the grasping system 200 to a grasping position to grip an article.
The grasping system 200 may include a vision system 202 and a grasping mechanism 204. As shown in fig. 2, the vision system 202 and the grasping mechanism 204 may be coupled together in any manner. The vision system 202 may include a camera, and may additionally include a light source. The vision system 202 may be configured to visually acquire the actual position and posture of the article to be gripped; for example, its camera may capture an image of the article. The grasping mechanism 204 may include a gripping jaw 2044 for gripping the article and a jaw cylinder 2042 for driving the jaw 2044 to move.
For the grasping mechanism 204 shown in fig. 2, the position and posture of the grasping mechanism 204 must substantially correspond to the position and posture of the object for the gripping action to succeed. The posture may be, for example, the direction of the gripping action defined by the pair of jaws 2044 in fig. 2. In practice, for various reasons, the object's position and posture may change while it is transported to the grasping area of the robotic arm (a positioning deviation, a directional deflection, or both), which can leave the grasping mechanism 204 unable to complete the grip at the original gripping position and posture.
In the present application, both the position and the posture of an object are referred to collectively as its "pose".
In embodiments of the invention, the pose of the gripping action is corrected according to the actual pose information of the product obtained by the vision system 202.
FIG. 3 is a flow diagram 300 of the operation of a vision-positioning-based robotic system in accordance with one embodiment of the present invention. The robotic system may establish a workpiece coordinate system at the location of the article to be gripped; this coordinate system may be stored locally in the robotic system for later invocation. Before an article is gripped, the tip rotation center of the arm of the robot 100 may be assigned a standard grasping position in the pre-established workpiece coordinate system, expressed as coordinate values of the arm's rotation center in that coordinate system. The tip rotation center may also be assigned a standard grasping posture in the same coordinate system, for example a posture perpendicular or parallel to the direction of travel of the production line. Correspondingly, a product transported from the production line would, ideally, arrive in a standard workpiece pose. In the present invention, the tip rotation center of the robotic arm refers to the rotation center of the arm's farthest axis; for example, where the robot 100 is a 6-axis robot, it refers to the rotation center of the 6th axis of the robot 100.
In one embodiment, the standard workpiece pose of the product may be preset and input to the computing device 101 of the robotic system, or may be captured by the vision system 202 from a product placed in the standard pose. Ideally, if the system is well commissioned and nothing else disturbs the pose of the incoming product, the robotic arm can accurately grip a product in the standard workpiece pose using the standard grasping pose. In practice, however, the product transported from the production line may deviate from the expected standard workpiece pose for a variety of reasons; for example, the actual pose of the product may be shifted and/or rotated relative to the standard workpiece pose. According to embodiments of the present application, the offset between the product's actual pose and its standard workpiece pose can be calculated so that the robotic arm can perform pose compensation according to the offset, allowing the grasping mechanism to grip the product accurately.
Specifically, the above standard grasping pose (i.e., the standard position of the tip rotation center of the robotic arm, the corresponding position of the grasping mechanism 204, and the posture of the grasping mechanism 204) and/or the standard workpiece pose of the product may be input to the computing device 101. Alternatively, this information may be stored in the robotic system after commissioning, for later recall. Each time a product arrives on the production line, the camera 2022 may capture an image of the incoming product so that its actual pose can be determined: the image is sent to the computing device 101, which identifies the product's appearance features and determines the pixel locations corresponding to the product, thereby determining the product's pose (e.g., the posture angle of its long axis). As described below, the pixel position of the product in the camera image plane corresponds to coordinate values in the camera coordinate system. Furthermore, the camera image plane may be calibrated against the robot's workpiece coordinate system established above (this calibration may be performed by the vision system), so that coordinate values in the camera coordinate system can be converted into coordinate values in the workpiece coordinate system. The pixel position of the actual product can thus be converted, by the computing device 101, into the product's actual coordinate values in the workpiece coordinate system, and the computing device 101 can then calculate the offset between those actual coordinate values and the coordinate values of the standard workpiece pose.
The system of the present invention may further include a programmable logic controller (PLC) 102, which may be coupled to the robotic arm, the grasping mechanism, the computing device, and so on. The computing device 101 can transmit the calculated offset to the PLC 102; as a non-limiting example, this transmission may be over Ethernet, although any other available communication technology may be used. After receiving the pose offset, the PLC can control the robot to perform pose compensation so that the product can be gripped successfully in its actual pose. In a preferred embodiment, the pose compensation performed by the robot 100 under PLC control mainly comprises: moving and rotating the workpiece coordinate system according to the calculated pose offset so that, in the new coordinate system, the pose of the gripping action of the arm's tip rotation center corresponds to the standard grasping pose. In an exemplary embodiment, the PLC 102 transmits a control signal containing the calculated offset to the robot 100 over a fieldbus to effect the pose compensation. It will be appreciated that the method described in connection with fig. 3 does not limit the order of individual actions or operations; one or more of them may be performed in a different order or simultaneously.
Fig. 4 is a schematic diagram of the mapping relationship between the camera image plane and the robot's workpiece coordinate system according to an embodiment of the present invention. The workpiece coordinate plane corresponding to the robotic arm may be denoted A. A camera coordinate system may be established in the camera image plane, denoted O. For example, point O1 in the camera image plane may map to coordinate point A1 in the robot's coordinate system. The camera's coordinate space can be expressed in pixel values; since the size of each pixel is fixed (one pixel corresponds to a known size in millimeters), pixel values can be converted into actual coordinates and mapped to coordinates in the workpiece coordinate system.
The coordinates of the rotation center of the robotic arm may be obtained by capturing the rotation center with another camera (not shown) and converting from pixel coordinates to coordinates in the workpiece coordinate system; alternatively, they may be determined in advance and input and/or stored in the storage means of the robotic system. Further, from the image of the actual product captured by the camera, the product's actual coordinates in the camera coordinate system may be obtained by converting the camera's pixel coordinates to millimeters using the fixed proportional relationship, and those coordinates may then be converted into coordinates in the workpiece coordinate system.
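The pixel-to-coordinate mapping described above amounts to a fixed millimeters-per-pixel scale plus the calibrated relation between the camera plane and the workpiece coordinate system. A minimal sketch follows, assuming the simplest possible calibration (a pure scale and offset, with no rotation between the two planes; real calibrations may also involve a rotation component). All names are chosen here for illustration:

```python
def pixel_to_workpiece(px, py, mm_per_pixel, origin_x, origin_y):
    """Convert a pixel location (px, py) in the camera image plane to
    workpiece coordinates, assuming each pixel corresponds to a fixed
    millimeter size and the image origin maps to (origin_x, origin_y)
    in the workpiece coordinate system (illustrative calibration model).
    """
    return (origin_x + px * mm_per_pixel,
            origin_y + py * mm_per_pixel)
```

For example, with a 0.2 mm/pixel scale and the image origin calibrated to (10, 5) mm, pixel (100, 50) maps to (30, 15) mm in the workpiece coordinate system.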
As described above, the camera needs to identify the item and determine its pixel location. In one embodiment of the invention, for an item of any shape (regular or irregular), the vision system 202 may take a point on the item's outline as a reference point and use the pose of that reference point to represent the product pose. Correspondingly, both the standard workpiece pose and the actual pose of the product are expressed through this reference point.
FIG. 5 is a schematic diagram of a method for computing robot pose compensation within a computing device of a robotic system, according to one embodiment of the invention. It should be appreciated that FIG. 5 analyzes the computational principles within the robotic system and is intended to illustrate how the offset between the product's actual pose and the standard workpiece pose is determined from the various known poses. The offset may be calculated by the computing device 101 of the robotic system. In FIG. 5, A indicates the position point corresponding to the tip rotation center of the robotic arm.
Diagram i of fig. 5 shows the arm's rotation center at the initial standard grasping position, with the product in the standard workpiece pose. By way of example and not limitation, the standard workpiece pose shown in diagram i is as follows: the product center point is at point B, and the product posture (represented by the direction of the product's long axis) forms a 90-degree angle with the reference direction (e.g., the gripping direction of the pair of jaws 2044 in fig. 2). For ease of discussion, the jaws 2044 of fig. 2 are taken as the example, so a 90-degree angle serves as the example standard product posture; depending on the actual situation, however, a product in the standard workpiece pose may have its orientation at any suitable angle.
Diagram ii of fig. 5 shows the tip rotation center of the arm still at the initial standard grasping position, but the actual pose of the product deviating from the standard workpiece pose, i.e., in diagram ii the product deviates from its standard workpiece pose in position coordinates and/or orientation angle. By way of example and not limitation, suppose that while the product was conveyed to the arm its direction changed, so that its current posture forms a 60-degree angle with the gripping direction of the jaws; there is then a 30-degree deviation between the actual posture and the standard posture of the product.
Further, although not explicitly shown in diagram ii, it will be understood that in actual operation the current position of the product may also deviate from its standard position.
To adapt the gripping action of the robotic arm to the actual position and posture of the product, the position and posture of the gripping action must be adjusted. This adjustment is performed in the manner shown in diagrams iii and iv.
Diagram iii of fig. 5 shows the workpiece coordinate system being rotated to compensate for the product's posture deviation. In the rotated workpiece coordinate system, the product's posture angle again forms a 90-degree angle with the gripping direction of the jaws.
Diagram iv of fig. 5 shows the workpiece coordinate system being translated so that the product is repositioned at the standard grasping position.
In the updated workpiece coordinate system, the coordinates of the product's center point and the product's posture angle are determined anew; the resulting center-point coordinates and posture angle serve as the position and posture of the updated gripping action.
Fig. 6 is a schematic diagram for calculating the pose offset that the robot needs to compensate, according to an embodiment of the present invention. First, the known coordinates include: the standard grasping position A(x0, y0) of the tip rotation center of the robotic arm; the standard coordinates B(x, y, θ1) of the product; and the actual coordinates C(x1, y1, θ2) of the product. From these known coordinates, in conjunction with the coordinate system shown in fig. 6, the computing device 101 can calculate the rotation angle required to move the product, in virtually any pose and position, to the standard workpiece pose as β = θ2 − θ1, and the required translational offsets as:
offset_x = (x1 − x0)cosβ − (y1 − y0)sinβ + x0 − x,
offset_y = (x1 − x0)sinβ + (y1 − y0)cosβ + y0 − y.
By adding the calculated compensation values (the angle β, offset_x, and offset_y) to the initial standard placement pose, the robot can accurately grip the actual product in any posture and position.
The coordinate system of fig. 6 schematically shows a process that may help in understanding the offset calculation. The standard grasping position of the arm's rotation center is A(x0, y0), and the actual coordinates of the product are C(x1, y1) (angle not shown). First, the product's actual coordinates C(x1, y1) are rotated about the rotation center A by the angle β, giving the rotated point C'(x', y'), whose coordinates can be calculated as:
x' = (x1 − x0)cosβ − (y1 − y0)sinβ + x0,
y' = (x1 − x0)sinβ + (y1 − y0)cosβ + y0.
The rotated point C'(x', y') has the same angle as the product's standard coordinates B, so the translational offsets are obtained as the difference between C' and the standard coordinates B:
offset_x = (x1 − x0)cosβ − (y1 − y0)sinβ + x0 − x,
offset_y = (x1 − x0)sinβ + (y1 − y0)cosβ + y0 − y.
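The consistency of these formulas can be checked numerically: rotating the actual point C about the rotation center A by β and then subtracting the translational offsets must recover the standard coordinates B. The following sketch demonstrates this with arbitrary example values chosen purely for illustration (none of the numbers come from the patent):

```python
import math

def rotate_about(px, py, cx, cy, beta):
    """Rotate point (px, py) about center (cx, cy) by angle beta (radians)."""
    xr = (px - cx) * math.cos(beta) - (py - cy) * math.sin(beta) + cx
    yr = (px - cx) * math.sin(beta) + (py - cy) * math.cos(beta) + cy
    return xr, yr

# Illustrative values: rotation center A, standard pose B, actual pose C.
x0, y0 = 10.0, 5.0                             # A: standard grasping position
x, y, theta1 = 12.0, 8.0, math.radians(90)     # B: standard coordinates
x1, y1, theta2 = 14.0, 6.0, math.radians(60)   # C: actual coordinates

beta = theta2 - theta1                         # angular offset (here -30 deg)

# Translational offsets per the closed-form formulas above.
offset_x = (x1 - x0) * math.cos(beta) - (y1 - y0) * math.sin(beta) + x0 - x
offset_y = (x1 - x0) * math.sin(beta) + (y1 - y0) * math.cos(beta) + y0 - y

# Independent check: C rotated about A gives C'; subtracting the offsets
# from C' must land exactly on the standard coordinates B.
xp, yp = rotate_about(x1, y1, x0, y0, beta)
assert abs((xp - offset_x) - x) < 1e-9
assert abs((yp - offset_y) - y) < 1e-9
```

This confirms that the rotate-then-subtract derivation and the closed-form offset expressions agree term by term.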
After the rotation angle and the translational offsets are calculated, these offset parameters can be sent to the PLC, which can then control the robot to perform pose compensation according to them.
Although fig. 5 depicts moving the robotic arm directly to position it accurately relative to a product whose pose has deviated, and fig. 6 depicts moving the actual product in order to calculate the offset values, these are merely two ways of analyzing the deviation between the product's actual pose and its standard workpiece pose under the algorithm of the present invention. In a preferred embodiment, after the offset is calculated, the pose compensation actually performed by the robot under PLC control proceeds as follows: the workpiece coordinate system is first moved according to the calculated offset, and the arm is then moved and positioned to the standard grasping position in the moved workpiece coordinate system, based on the arm's initial standard grasping position (e.g., its coordinates) in the workpiece coordinate system; in this way the robot compensates for the actual product's pose deviation. It should further be noted that, during calculation, the rotation direction indicated by the camera image may not coincide with the direction in which the coordinate system actually needs to be rotated; this must be specified during robot commissioning. Fig. 6 illustrates an XY plane, but other coordinate systems may be established without departing from the concepts of the present invention.
The invention converts the camera's planar pixels into coordinates in the workpiece coordinate system, a calibration approach that is simple and convenient. Moreover, the present solution only needs to calculate the offset, move the workpiece coordinate system according to it, and move and position the arm to the standard grasping position; the absolute coordinate values to which the robot must move need not be calculated, which simplifies the computation. A further advantage of calculating the offset rather than absolute coordinate values is that it reveals directly, in real time, by exactly how much the article's actual pose deviates from the standard workpiece pose, enabling subsequent analysis of the offsets.
Furthermore, embodiments of the present invention are described herein primarily in the context of a robot performing assembly in an industrial process. However, it will be appreciated that the vision-based guidance and positioning system of the present invention may be applied in any other scenario in which the positioning of a robot must be adjusted according to the actual pose of an article. It is likewise understood that the object to be gripped by the robot is not limited to a specific product on a production line but may be any object to be grasped. The computing device described in this application may be any device having computing functionality.
Those skilled in the art will appreciate that the control signals referred to herein may be represented using a variety of different techniques and means. For example, the control signals described in the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The various illustrative logical modules and control circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic control device (e.g., a PLC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software as a computer program product, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a web site, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk (disk) and disc (disc), as used herein, includes Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk and blu-ray disc where disks (disks) usually reproduce data magnetically, while discs (discs) reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Those skilled in the art can make appropriate modifications and adaptations to the embodiments described in detail above without departing from the spirit and substance of the present invention. Therefore, it is intended that claimed subject matter not be limited to the particular examples disclosed, but that such claimed subject matter may also include all implementations falling within the scope of the appended claims, and equivalents thereof.

Claims (10)

1. A vision-based industrial robot system, comprising:
a mechanical arm;
a gripping mechanism mounted to the mechanical arm and configured to grip an article;
a vision system installed at the gripping mechanism and configured to visually acquire information on the actual pose of an article to be grasped;
a computing device configured to compute an offset between the actual pose of the article and a standard workpiece pose of the article; and
a programmable logic controller coupled to the mechanical arm, the gripping mechanism, and the computing device, the programmable logic controller being configured to perform pose compensation on the gripping action of the mechanical arm based on the offset so that the gripping mechanism can accurately grip the article, wherein the pose compensation includes at least one of position compensation and attitude compensation.
2. The industrial robot system of claim 1, wherein the industrial robot system is configured to:
establish a workpiece coordinate system at the position of the article to be grabbed, such that the tip rotation center of the mechanical arm is located at a standard grabbing position in the established workpiece coordinate system;
and the programmable logic controller is configured to:
perform a rotation operation and a translation operation on the workpiece coordinate system according to the offset; and
reposition the tip rotation center of the mechanical arm to the standard grabbing position in the rotated and translated workpiece coordinate system.
3. An industrial robot system according to claim 2, wherein
the vision system is configured to calibrate the acquired image plane of the article against the workpiece coordinate system, so that the actual pose of the article determined by the vision system can be converted into actual coordinate values in the workpiece coordinate system;
the computing device is configured to compute a position offset and an angle offset between the actual coordinate values of the article and standard coordinate values of the article, wherein the standard workpiece pose of the article corresponds to the standard coordinate values preset in the workpiece coordinate system; and
the programmable logic controller is configured to rotate the workpiece coordinate system according to the calculated angle offset, translate the workpiece coordinate system according to the calculated position offset, and reposition the tip rotation center of the mechanical arm to the standard grabbing position in the rotated and translated workpiece coordinate system.
4. An industrial robot system according to claim 3, wherein
the vision system includes a camera configured to capture an image of the article to be grasped, and the computing device maps pixel locations of the article to be grasped in the image to coordinate values in the workpiece coordinate system.
5. An industrial robot system according to claim 4, wherein
the computing device is configured to:
obtain the standard coordinate values (x, y, θ1) of the article;
calculate the actual coordinate values (x1, y1, θ2) of the article from the image of the article;
obtain the coordinate values (x0, y0) of the standard grabbing position at which the rotation center of the mechanical arm is located;
calculate the angular offset β = θ2 − θ1 and the translational offsets
Offset_x = (x1 − x0)cos β − (y1 − y0)sin β + x0 − x,
Offset_y = (x1 − x0)sin β + (y1 − y0)cos β + y0 − y; and
send the angular offset and the translational offsets to the programmable logic controller.
6. An industrial robot system according to any of the claims 1-5, wherein
the industrial robot system comprises a 6-axis robot, and the vision system and the gripping mechanism are mounted on the 6th axis of the mechanical arm.
7. An industrial robot system according to any of the claims 1-5,
the standard workpiece pose of the article is pre-set or is captured by the vision system.
8. An industrial robot system according to any of the claims 1-5,
the vision system also includes a light source.
9. An industrial robot system according to any of the claims 1-5, wherein
the gripping mechanism comprises a clamping jaw for grabbing the article and a clamping-jaw air cylinder for driving the clamping jaw to move.
10. An industrial robot system according to any of the claims 1-5,
the industrial robot system is configured to perform an assembly task.
CN202010928493.0A 2020-09-07 2020-09-07 Industrial robot system based on vision Pending CN112008696A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010928493.0A CN112008696A (en) 2020-09-07 2020-09-07 Industrial robot system based on vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010928493.0A CN112008696A (en) 2020-09-07 2020-09-07 Industrial robot system based on vision

Publications (1)

Publication Number Publication Date
CN112008696A true CN112008696A (en) 2020-12-01

Family

ID=73517010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010928493.0A Pending CN112008696A (en) 2020-09-07 2020-09-07 Industrial robot system based on vision

Country Status (1)

Country Link
CN (1) CN112008696A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112692840A (en) * 2020-12-10 2021-04-23 安徽巨一科技股份有限公司 Mechanical arm positioning guiding and calibrating method based on machine vision cooperation
CN112792818A (en) * 2021-02-03 2021-05-14 深圳市冠运智控科技有限公司 Visual alignment method for rapidly guiding mechanical arm to grab target
CN113084808A (en) * 2021-04-02 2021-07-09 上海智能制造功能平台有限公司 Monocular vision-based 2D plane grabbing method for mobile mechanical arm
CN114199126A (en) * 2021-12-01 2022-03-18 大族激光科技产业集团股份有限公司 Non-contact positioning system and method
CN114378825A (en) * 2022-01-21 2022-04-22 四川长虹智能制造技术有限公司 Multi-camera visual positioning method and system and electronic equipment
CN115008477A (en) * 2022-08-09 2022-09-06 苏州华兴源创科技股份有限公司 Manipulator movement compensation method, manipulator movement compensation device and computer-readable storage medium
CN115465661A (en) * 2022-08-29 2022-12-13 济南星辉数控机械科技有限公司 Method and system for accurately grabbing cut plate
CN116153824A (en) * 2023-04-20 2023-05-23 沈阳和研科技股份有限公司 Discharging precision compensation method based on visual algorithm
CN116954178A (en) * 2023-09-18 2023-10-27 深圳市新技智能设备有限公司 Operation control method and system based on visual positioning

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1228923A (en) * 1985-02-08 1987-11-03 Technical Arts Corporation High speed scanning method and apparatus
CN101097132A (en) * 2006-06-30 2008-01-02 廊坊智通机器人系统有限公司 Workpieces reference frame marking method based on relative measurement
US20150159987A1 (en) * 2013-12-10 2015-06-11 Mitutoyo Corporation Multi-axis type three-dimensional measuring apparatus
CN107685329A (en) * 2017-10-16 2018-02-13 河南森源电气股份有限公司 A kind of robot workpiece positioning control system and method
CN108345307A (en) * 2018-03-05 2018-07-31 菲尼克斯(南京)智能制造技术工程有限公司 Vision guide system and visual guide method
CN109015660A (en) * 2018-09-29 2018-12-18 天津七所高科技有限公司 A kind of robot pose adjustment device and method of view-based access control model
CN109483539A (en) * 2018-11-19 2019-03-19 施努卡(苏州)智能装备有限公司 Vision positioning method
CN110160446A (en) * 2019-06-17 2019-08-23 珠海格力智能装备有限公司 Localization method, device, storage medium and the system of material assembly
CN110202576A (en) * 2019-06-14 2019-09-06 福耀集团(福建)机械制造有限公司 A kind of workpiece two-dimensional visual guidance crawl detection system and method
CN110757504A (en) * 2019-09-30 2020-02-07 宜宾职业技术学院 Positioning error compensation method of high-precision movable robot
US20200206913A1 (en) * 2019-01-01 2020-07-02 Giant AI Software Compensated Robotics

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1228923A (en) * 1985-02-08 1987-11-03 Technical Arts Corporation High speed scanning method and apparatus
CN101097132A (en) * 2006-06-30 2008-01-02 廊坊智通机器人系统有限公司 Workpieces reference frame marking method based on relative measurement
US20150159987A1 (en) * 2013-12-10 2015-06-11 Mitutoyo Corporation Multi-axis type three-dimensional measuring apparatus
CN107685329A (en) * 2017-10-16 2018-02-13 河南森源电气股份有限公司 A kind of robot workpiece positioning control system and method
CN108345307A (en) * 2018-03-05 2018-07-31 菲尼克斯(南京)智能制造技术工程有限公司 Vision guide system and visual guide method
CN109015660A (en) * 2018-09-29 2018-12-18 天津七所高科技有限公司 A kind of robot pose adjustment device and method of view-based access control model
CN109483539A (en) * 2018-11-19 2019-03-19 施努卡(苏州)智能装备有限公司 Vision positioning method
US20200206913A1 (en) * 2019-01-01 2020-07-02 Giant AI Software Compensated Robotics
CN110202576A (en) * 2019-06-14 2019-09-06 福耀集团(福建)机械制造有限公司 A kind of workpiece two-dimensional visual guidance crawl detection system and method
CN110160446A (en) * 2019-06-17 2019-08-23 珠海格力智能装备有限公司 Localization method, device, storage medium and the system of material assembly
CN110757504A (en) * 2019-09-30 2020-02-07 宜宾职业技术学院 Positioning error compensation method of high-precision movable robot

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HONGTAI CHENG: "Dynamic error modeling and compensation in high speed delta robot pick-and-place process" *
ZHUANG Qixin: "Feasible-graph-based method for optimizing probe axis orientation in five-axis on-machine measurement" *
ZHANG Jun et al. (eds.): "Photogrammetry and Remote Sensing Technology", 31 July 2015, Southwest Jiaotong University Press *
WANG Aimin: "Manufacturing Systems Engineering", 31 January 2017, Beijing Institute of Technology Press *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112692840A (en) * 2020-12-10 2021-04-23 安徽巨一科技股份有限公司 Mechanical arm positioning guiding and calibrating method based on machine vision cooperation
CN112792818A (en) * 2021-02-03 2021-05-14 深圳市冠运智控科技有限公司 Visual alignment method for rapidly guiding mechanical arm to grab target
CN112792818B (en) * 2021-02-03 2024-01-26 深圳市冠运智控科技有限公司 Visual alignment method for rapidly guiding manipulator to grasp target
CN113084808B (en) * 2021-04-02 2023-09-22 上海智能制造功能平台有限公司 Monocular vision-based 2D plane grabbing method for mobile mechanical arm
CN113084808A (en) * 2021-04-02 2021-07-09 上海智能制造功能平台有限公司 Monocular vision-based 2D plane grabbing method for mobile mechanical arm
CN114199126A (en) * 2021-12-01 2022-03-18 大族激光科技产业集团股份有限公司 Non-contact positioning system and method
CN114378825A (en) * 2022-01-21 2022-04-22 四川长虹智能制造技术有限公司 Multi-camera visual positioning method and system and electronic equipment
CN114378825B (en) * 2022-01-21 2023-05-12 四川长虹智能制造技术有限公司 Multi-camera visual positioning method and system and electronic equipment
CN115008477A (en) * 2022-08-09 2022-09-06 苏州华兴源创科技股份有限公司 Manipulator movement compensation method, manipulator movement compensation device and computer-readable storage medium
CN115465661A (en) * 2022-08-29 2022-12-13 济南星辉数控机械科技有限公司 Method and system for accurately grabbing cut plate
CN116153824B (en) * 2023-04-20 2023-08-25 沈阳和研科技股份有限公司 Discharging precision compensation method based on visual algorithm
CN116153824A (en) * 2023-04-20 2023-05-23 沈阳和研科技股份有限公司 Discharging precision compensation method based on visual algorithm
CN116954178A (en) * 2023-09-18 2023-10-27 深圳市新技智能设备有限公司 Operation control method and system based on visual positioning
CN116954178B (en) * 2023-09-18 2023-12-01 深圳市新技智能设备有限公司 Operation control method and system based on visual positioning

Similar Documents

Publication Publication Date Title
CN112008696A (en) Industrial robot system based on vision
CN110842928B (en) Visual guiding and positioning method for compound robot
JP6468741B2 (en) Robot system and robot system calibration method
JP6484213B2 (en) Robot system including a plurality of robots, robot control apparatus, and robot control method
US20040266276A1 (en) Connector gripping device, connector inspection system comprising the device, and connector connection system
CN111369625B (en) Positioning method, positioning device and storage medium
CN108927801B (en) Method and device for adjusting tail end attitude of mechanical arm
CN113715016B (en) Robot grabbing method, system, device and medium based on 3D vision
CN114043087A (en) Three-dimensional trajectory laser welding seam tracking attitude planning method
TW202124110A (en) Automated calibration system and method for workpiece coordinate frame of a robot
JP2020138292A (en) Robot system and control method
CN107756391B (en) Correction method of mechanical arm correction system
CN112809167B (en) Robot weld joint tracking method for all-position welding of large-curvature pipe fitting
CN116423526B (en) Automatic calibration method and system for mechanical arm tool coordinates and storage medium
WO2018214156A1 (en) Method of correcting locomotion control command of robot, and related apparatus for same
CN110549326B (en) Robot grinding and polishing processing pose adjusting method based on multiple active compliant controllers
CN110488751B (en) Graphite tray visual positioning system of automatic process line
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
JP2020138293A (en) Robot system and control method
TWI617405B (en) Correction method for robot arm correction system
CN110480685A (en) A kind of Agricultural vehicle wheel automatic production line vision manipulator
CN112792818B (en) Visual alignment method for rapidly guiding manipulator to grasp target
CN113043264B (en) Zero calibration method for integrated joint seven-axis robot
JP7384653B2 (en) Control device for robot equipment that controls the position of the robot
CN111283676B (en) Tool coordinate system calibration method and calibration device of three-axis mechanical arm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20221206