CN111890343A - Robot object collision detection method and device - Google Patents
- Publication number
- CN111890343A CN111890343A CN202010747264.9A CN202010747264A CN111890343A CN 111890343 A CN111890343 A CN 111890343A CN 202010747264 A CN202010747264 A CN 202010747264A CN 111890343 A CN111890343 A CN 111890343A
- Authority
- CN
- China
- Prior art keywords
- transported
- robot
- detected
- image
- article
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/08—Programme-controlled manipulators characterised by modular constructions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The application provides a robot object collision detection method and device. The method comprises the following steps: acquiring an image to be detected that includes an object stage and an article to be transported on the object stage; determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected; determining the deviation of the article to be transported according to the distance information between the article to be transported and each side wall of the object stage inner wall and a preset linear regression equation; and determining that the article to be transported has collided when its deviation is greater than a preset deviation threshold. Because the image of the object stage and the article to be transported on it is acquired in real time, and the distance information is determined from that image, the method detects in real time whether the article to be transported has collided, which improves the timeliness of the detection result.
Description
Technical Field
The invention relates to the field of robot control, and in particular to a robot object collision detection method and device.
Background
Robots are now widely used and save considerable human labor. However, when a robot transports goods, there is a risk that it knocks the goods against the side wall of the object stage and damages them.
In the prior art, for a robot that assembles parts, generally only the quality of the assembled product is inspected; if the product quality is found to be acceptable, it is concluded that no collision occurred while the robot transported the parts.
With this approach, however, whether a part collided during transportation is only discovered after assembly, so the detection result is poorly timed. A collision detection method that can monitor the robot's transportation process is therefore urgently needed, and is of great significance for reducing the breakage rate of parts.
Disclosure of Invention
Therefore, the technical problem to be solved by the present invention is to overcome the poor timeliness of prior-art robot object collision detection, and to that end a robot object collision detection method and device are provided.
A first aspect of the application provides a robot object collision detection method, which is applied to a robot carrying system comprising a robot and an object stage for placing an article to be transported. The method comprises the following steps:
acquiring an image to be detected that includes an object stage and an article to be transported on the object stage;
determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected;
determining the deviation of the article to be transported according to the distance information between the article to be transported and each side wall of the object stage inner wall and a preset linear regression equation;
and when the deviation of the article to be transported is greater than a preset deviation threshold, determining that the article to be transported has collided.
Optionally, determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected comprises:
respectively calculating the distance between the article to be transported and each side wall of the object stage inner wall.
Optionally, before determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected, the method further comprises:
judging, according to the image to be detected, whether the motion track of the robot is a preset standard track;
and when it is determined that the motion track of the robot is not the preset standard track, executing the step of determining the distance information between the article to be transported and the inner wall of the object stage according to the image to be detected.
Optionally, the method further comprises:
when it is determined that the motion track of the robot is the preset standard track, determining the displacement difference of the article to be transported according to the image to be detected and another image to be detected captured a preset time interval later;
and when the displacement difference of the article to be transported is greater than a preset displacement threshold, determining that the article to be transported has collided.
Optionally, acquiring the image to be detected that includes the object stage and the article to be transported on the object stage comprises:
acquiring a video of the object stage and the article to be transported on the object stage within a preset time period;
and splitting the video into frames at a preset time interval to obtain at least one image to be detected.
Optionally, the method further comprises: generating collision information when it is determined that the article to be transported has collided;
and raising an alarm according to the collision information to prompt an operator that the article to be transported has collided.
A second aspect of the present application provides a robot object collision detection device, comprising: an acquisition module, a first calculation module, a second calculation module and a detection module.
The acquisition module is used for acquiring an image to be detected, which comprises an object stage and an object to be transported on the object stage;
the first calculation module is used for determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected;
the second calculation module is configured to determine the deviation of the article to be transported according to the distance information between the article to be transported and each side wall of the object stage inner wall and a preset linear regression equation;
the detection module is configured to determine that the article to be transported has collided when the deviation of the article to be transported is greater than a preset deviation threshold.
Optionally, the first calculating module is specifically configured to: and respectively calculating the distance between the article to be transported and each side wall in the inner wall of the object stage.
Optionally, the first computing module is further configured to: judging whether the motion track of the robot is a preset standard track or not according to the image to be detected;
and when the motion track of the robot is determined not to be the preset standard track, executing the step of determining the distance information between the article to be transported and the inner wall of the object stage according to the image to be detected.
Optionally, the first calculation module is further configured to: when it is determined that the motion track of the robot is the preset standard track, determine the displacement difference of the article to be transported according to the image to be detected and another image to be detected captured a preset time interval later;
and when the displacement difference of the article to be transported is greater than a preset displacement threshold, determine that the article to be transported has collided.
Optionally, the obtaining module is specifically configured to: acquiring a video including an object stage and an object to be transported on the object stage within a preset time period;
and performing frame processing on the video according to a preset time interval to obtain at least one image to be detected.
Optionally, the detection module is further configured to: when determining that the article to be transported collides, generating collision information;
and alarming according to the collision information to prompt an operator that the article to be transported collides.
A third aspect of the present application provides an electronic device, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes computer-executable instructions stored by the memory to cause the at least one processor to perform the method as set forth in the first aspect above and in various possible designs of the first aspect.
A fourth aspect of the present application provides a storage medium containing computer-executable instructions for performing a method as set forth in the first aspect above and in various possible designs of the first aspect when executed by a computer processor.
The technical solution of the present application has the following advantages:
according to the method and the device for detecting collision of the robot object carrying, an image to be detected comprising an object carrying table and an object to be transported on the object carrying table is obtained; determining distance information between the object to be transported and the inner wall of the objective table according to the image to be detected; determining the deviation of the object to be transported according to the distance information between the object to be transported and each side wall in the inner wall of the object stage and a preset linear regression function; and when the deviation of the to-be-transported object is larger than a preset deviation threshold value, determining that the to-be-transported object is collided. According to the detection method provided by the scheme, the object stage and the to-be-detected image of the to-be-transported object on the object stage are obtained in real time, and the distance information between the to-be-transported object and the inner wall of the object stage is determined according to the to-be-detected image, so that whether the to-be-transported object collides or not is detected in real time, and the timeliness of a detection result is improved.
Drawings
In order to illustrate the embodiments of the present invention or the prior-art technical solutions more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the following drawings show only some embodiments of the present invention, and that those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a robot object collision detection system according to an embodiment of the application;
fig. 2 is a schematic flowchart of a method for detecting collision of a robot object according to an embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of an exemplary image to be detected provided by an embodiment of the present application;
fig. 4 is a schematic flowchart of another robot object collision detection method provided in the embodiment of the present application;
fig. 5 is a schematic structural diagram of an exemplary image capturing device provided in an embodiment of the present application;
fig. 6 is a schematic flowchart of another method for detecting collision of a robot object according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a robot object collision detection device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions of the present invention will be described clearly and completely below with reference to the accompanying drawings. It should be understood that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
In the prior art, for a robot that assembles parts, generally only the quality of the assembled product is inspected; if the product quality is found to be acceptable, it is concluded that no collision occurred while the robot transported the parts. With this approach, however, whether a part collided during transportation is only discovered after assembly, so the detection result is poorly timed. Moreover, because a collision cannot be detected in time, some parts may be deformed by a collision during transportation; although quality inspection of the assembled product finds no performance defect, the deformation may cause performance failures and other faults during long-term use.
In order to solve the above problems, the robot object collision detection method and device provided in the embodiments of the present application acquire an image to be detected that includes an object stage and an article to be transported on the object stage; determine distance information between the article to be transported and the inner wall of the object stage according to the image to be detected; determine the deviation of the article to be transported according to the distance information between the article and each side wall of the object stage inner wall and a preset linear regression equation; and determine that the article to be transported has collided when its deviation is greater than a preset deviation threshold. Because the image to be detected is acquired in real time and the distance information is determined from it, whether the article to be transported has collided is detected in real time, which improves the timeliness of the detection result.
The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
First, the structure of the robot object collision detection system of the present application will be described:
the robot carrying collision detection method and device are suitable for detecting the collision condition of the articles transported by the robot in real time. As shown in fig. 1, the structural schematic diagram of a robot loading collision detection system according to an embodiment of the present invention mainly includes a robot loading system, an image capturing device, and an electronic device for detecting whether a collision occurs to an article to be transported, where the robot loading system includes a robot and a loading platform for placing the article to be transported. Specifically, in the process of carrying out article transportation at the robot, wherein, the robot mainly used will wait to transport article and place the objective table on, the image acquisition device gathers waiting to detect the image of waiting to transport article on objective table and the objective table on the objective table in real time to will wait to detect image transmission to electronic equipment, electronic equipment waits to detect the image according to obtaining, confirms the distance information between waiting to transport article and the objective table inner wall, whether collision has taken place with the article of waiting to transport in real time detection.
The embodiment of the application provides a robot object collision detection method to solve the technical problem that prior-art robot object collision detection methods have poor timeliness. The execution subject of the embodiment is an electronic device, such as a server, desktop computer, notebook computer, tablet computer, or any other electronic device that can be used to detect whether the article to be transported has collided.
As shown in fig. 2, which is a schematic flowchart of a robot object collision detection method according to an embodiment of the present application, the method is applied to a robot carrying system that comprises a robot and an object stage for placing an article to be transported, and comprises the following steps:

Step 201, acquiring an image to be detected that includes the object stage and the article to be transported on the object stage.
what need explain is, in order to guarantee the definition of the image of waiting to detect who obtains, can wait to detect image acquisition when image acquisition device, for image acquisition device provides the lighting apparatus of the preliminary specification of auxiliary lighting, utilize the lighting apparatus of preliminary specification to shine the objective table and the article of waiting to transport on the objective table to improve the image quality of waiting to detect the image that image acquisition device gathered. Wherein, the lighting apparatus who predetermines the specification can be annular lamp strip, also can be bar lamp strip, specifically can set up according to the operational environment of robot, does not do the restriction in this application embodiment.
Step 202, determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected.

In particular, the object stage may be a gondola-type carrier, i.e. it comprises a base plate on which the article to be transported is placed and an inner wall fixed around the base plate. The positional relationship between the article to be transported and the object stage is determined from the image to be detected, and the distance information between the edge of the article and the inner wall of the object stage is then determined from that relationship.
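To make the distance-measurement step concrete, the following is a minimal sketch and not the patented implementation: it assumes the image has already been segmented into a binary occupancy grid whose borders stand in for the stage inner walls, and it measures the pixel distance from the article to the left, right and front walls. The function name and the grid representation are illustrative assumptions.

```python
def edge_distances(grid):
    """Toy pixel-distance measurement on a binary occupancy grid.

    Cells equal to 1 mark the article to be transported; the grid
    borders stand in for the stage inner walls (a simplification --
    a real system would also segment the walls from the image).
    Returns (left, right, front) distances in cells.
    """
    rows = [r for r, row in enumerate(grid) if any(row)]
    cols = [c for row in grid for c, v in enumerate(row) if v]
    left = min(cols)                      # distance to the left inner wall
    right = len(grid[0]) - 1 - max(cols)  # distance to the right inner wall
    front = min(rows)                     # distance to the front inner wall
    return left, right, front
```

In practice the grid would come from thresholding or segmenting the acquired image; here it only serves to show how the three per-wall distances of fig. 3 could be read off.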
Step 203, determining the deviation of the article to be transported according to the distance information between the article to be transported and each side wall of the object stage inner wall and a preset linear regression equation.
It should be explained that the robot includes a gripper for gripping the article to be transported, and that the preset linear regression equation may be determined by related operators through simulation experiments; it characterizes the positional relationship between the moving robot gripper and the object stage.
For example, the preset linear regression equation may be L = k*z + b, where L denotes the motion track of the robot gripper, z denotes the vertical distance between the robot gripper and the ground, k denotes the regression coefficient, and b denotes a correction constant. The regression coefficient k and the correction constant b may be determined by an operator according to simulation experiment results or according to related working experience; the embodiments of the present application are not limited in this respect.
The deviation characterizes the degree to which the actual motion track of the article to be transported, as gripped by the robot gripper, departs from the preset linear regression equation. Specifically, the actual position coordinates of the article on the object stage are determined from the measured distances between the article and each side wall of the object stage inner wall, while the ideal position coordinates at which the gripper should place the article are determined from the preset linear regression equation. The difference between the actual and ideal position coordinates then gives the degree to which the article's actual motion track departed from the preset linear regression equation during transportation, i.e. the deviation.
In particular, in one embodiment, some articles may be deformed by a collision during transportation. For a deformed article, the relative position with respect to at least one side wall may change little, while the position with respect to the other side walls may change severely, and a secondary collision may even occur.
Therefore, to improve the accuracy of the deviation calculation, the distance between the article to be transported and each side wall of the object stage inner wall may be calculated separately.
As shown in fig. 3, which is a schematic structural diagram of an exemplary image to be detected provided by the embodiment of the present application, x1 denotes the distance between the article to be transported and the left side wall of the object stage inner wall, x2 denotes the distance between the article and the right side wall, and y denotes the distance between the article and the front side wall. Accordingly, the preset linear regression equations may include x1 = k*z + b, x2 = k*z + b and y = k*z + b, which respectively characterize the motion track of the robot gripper relative to each side wall.
Specifically, from the distance between the article to be transported and each side wall of the object stage inner wall, the motion change of the article relative to the object stage can be determined accurately, which improves the accuracy of the deviation calculation.
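As a hedged illustration of the per-wall comparison described above (the function names, coefficients and threshold are invented for the example and are not taken from the patent), each measured distance can be checked against its own preset line x_i = k*z + b, so that a shift toward one wall is not averaged away by the other walls:

```python
def per_wall_deviations(measured, lines, z):
    """Absolute residual of each measured wall distance from its
    preset regression line x_i = k_i*z + b_i.

    measured: (x1, x2, y) distances read from the image
    lines:    [(k, b), ...] one coefficient pair per side wall
    z:        vertical distance between the gripper and the ground
    """
    return [abs(x - (k * z + b)) for x, (k, b) in zip(measured, lines)]

def collided(measured, lines, z, threshold):
    """Flag a collision as soon as any single wall's residual exceeds
    the threshold, so a severe shift toward one wall is not masked."""
    return max(per_wall_deviations(measured, lines, z)) > threshold
```

With identical illustrative lines (k=0.1, b=4.0) and z=10, each wall's predicted distance is 5.0; a measurement of 9.0 against one wall alone is enough to trip the check.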
Step 204, when the deviation of the article to be transported is greater than a preset deviation threshold, determining that the article to be transported has collided.
Specifically, the preset deviation threshold may be set according to the actual situation. When the deviation of the article to be transported is greater than the preset deviation threshold, the motion track of the article has departed from the preset linear regression equation. Since the preset linear regression equation closely matches the motion track of the robot gripper, it can be concluded that the article to be transported has collided on the object stage, or at least is at high risk of collision.
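The overall decision logic of steps 202 to 204 can be strung together as follows. This is a simplified sketch under assumed inputs, not the patented implementation: the "image" is a stub dictionary that already carries the measured distances, and k, b, z and the threshold are arbitrary example values.

```python
def wall_distances(image):
    """Stub for step 202: in this sketch the 'image' is a dict that
    already carries the measured (x1, x2, y) wall distances."""
    return image["distances"]

def deviation(distances, k, b, z):
    """Step 203: sum of residuals of the measured distances from the
    value predicted by the preset line x = k*z + b."""
    predicted = k * z + b
    return sum(abs(d - predicted) for d in distances)

def detect_collision(image, k, b, z, threshold):
    """Step 204: a collision is flagged when the deviation exceeds
    the preset deviation threshold."""
    return deviation(wall_distances(image), k, b, z) > threshold
```

For example, with k=0.1, b=4.0 and z=10.0 the predicted distance to each wall is 5.0, so measurements far from 5.0 push the summed deviation over the threshold.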
On the basis of the above embodiments, the actual working environments of robots vary and robots move in different directions. To improve the applicability of the robot object collision detection method provided by the embodiments of the present application, as shown in fig. 4, which is a schematic flowchart of another robot object collision detection method provided by the embodiments of the present application, in one embodiment the method further comprises, before determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected (step 202):

Step 401, judging, according to the image to be detected, whether the motion track of the robot is a preset standard track;
when the motion track of the robot is determined not to be the preset standard track, a step of determining the distance between the article to be transported and the inner wall of the object stage according to the image to be detected is executed (step 202 is executed).
It should be explained that the positional relationship between the robot and the object stage can be determined from the image to be detected. Because the images to be detected are acquired in real time, the motion track of the robot can be judged from them.
Correspondingly, the method further comprises the following steps:
Step 402, when it is determined that the motion track of the robot is the preset standard track, determining the displacement of the article to be transported according to the image to be detected and another image to be detected captured a preset time interval later;
Step 403, when the displacement of the article to be transported is greater than a preset displacement threshold, determining that the article to be transported has collided.
It should be explained that the image acquisition device includes a camera, and that a motion track parallel to the main axis direction of the camera is taken as the preset standard track.
Fig. 5 is a schematic structural diagram of an exemplary image acquisition device provided in an embodiment of the present application. When the system is assembled, the image acquisition device may be installed above the object stage, i.e. the main axis of the camera points in the vertical direction. If the motion track of the robot gripper coincides with this vertical direction, the motion track of the robot is determined to be the preset standard track.
Illustratively, when it is determined that the motion track of the robot is the preset standard track, the following distances can be determined from the image to be detected at the current moment: the distance x1 between the article to be transported and the left side wall of the object stage inner wall, the distance x2 between the article and the right side wall, and the distance y between the article and the front side wall. From the image to be detected captured a preset time interval later, the corresponding distances x1', x2' and y' are determined, and the displacement of the article relative to each side wall is obtained as Δx1 = |x1 - x1'|, Δx2 = |x2 - x2'| and Δy = |y - y'|, where Δx1 denotes the displacement of the article to be transported relative to the left side wall, Δx2 its displacement relative to the right side wall, and Δy its displacement relative to the front side wall. When any one of Δx1, Δx2 and Δy is greater than the preset displacement threshold, it is determined that the article to be transported has collided; the preset displacement threshold may be set according to the actual situation.
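The displacement check for the standard-track branch can be sketched in a few lines; the tuple layout (x1, x2, y) and the function name are assumptions made for illustration only.

```python
def displacement_collision(prev, curr, threshold):
    """Standard-track branch (steps 402-403): compare wall distances
    (x1, x2, y) from two frames a preset interval apart; any per-wall
    shift above the preset displacement threshold counts as a collision.
    """
    return any(abs(a - b) > threshold for a, b in zip(prev, curr))
```

A small jitter in the measurements stays below the threshold, while a large jump toward any single wall triggers the collision decision.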
On the basis of the foregoing embodiments, in order to improve the reliability and timeliness of the acquired image to be detected, as shown in fig. 6, which is a schematic flowchart of another robot object collision detection method provided in the embodiment of the present application, in one embodiment, acquiring the image to be detected that includes the object stage and the article to be transported on the object stage (step 201) comprises:
Step 2011, acquiring a video of the object stage and the article to be transported on the object stage within a preset time period;
Step 2012, splitting the video into frames at a preset time interval to obtain at least one image to be detected.

Specifically, the image acquisition device captures video of the robot within the preset time period and sends it to the electronic device in real time; the electronic device stores the video and splits it into frames at the preset time interval to obtain at least one image to be detected.
For example, if the obtained video covers the period from 8:00 to 9:00 (that is, the preset time period is 8:00 to 9:00) and the preset time interval is 1 minute, the first image to be detected corresponds to 8:00 in the video and the second to 8:01. By analogy, a plurality of images to be detected are obtained; the specific framing method is not limited in the embodiments of the present application.
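A minimal sketch of mapping the preset sampling interval onto frame numbers of the stored video follows; the function name and parameters are illustrative assumptions, and a real system would then decode just those frames (for example with OpenCV's VideoCapture).

```python
def frame_indices(fps, duration_s, interval_s):
    """Return the frame numbers to extract from a stored video when it
    is sampled every interval_s seconds.

    fps:        frames per second of the video
    duration_s: length of the preset time period, in seconds
    interval_s: preset sampling interval, in seconds
    """
    step = int(round(fps * interval_s))    # frames between two samples
    total = int(round(fps * duration_s))   # total frames in the clip
    return list(range(0, total, step))
```

For the 8:00-9:00 example with a 1-minute interval, a 30 fps recording would be sampled at frames 0, 1800, 3600, and so on.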
Further, in one embodiment, when it is determined that the article to be transported has collided, collision information is generated, and an alarm is raised according to the collision information to prompt an operator that the article to be transported has collided.
Once prompted that the article has collided, the operator can inspect its quality in time: if the collision has caused a serious quality problem, the article is scrapped; meanwhile, corresponding fault detection can be performed on the robot to determine whether the collision was caused by a robot fault.
The alarm may be presented on an instrument display, or as prompt information via an indicator lamp, a prompt tone or other means; the embodiments of the present application are not limited in this respect.
According to the robot object collision detection method provided in the embodiment of the present application, an image to be detected including the object stage and the article to be transported on the object stage is acquired; distance information between the article to be transported and the inner wall of the object stage is determined according to the image to be detected; the deviation of the article to be transported is determined according to the distance information between the article to be transported and each side wall of the inner wall of the object stage and a preset linear regression function; and when the deviation of the article to be transported is greater than a preset deviation threshold, it is determined that the article to be transported has collided. In the detection method provided by this scheme, the image to be detected including the object stage and the article to be transported on it is acquired in real time, and the distance information between the article and the inner wall of the object stage is determined from that image, so that whether the article has collided is detected in real time and the timeliness of the detection result is improved.
The embodiment of the present application provides a robot object collision detection apparatus, which is used to solve the technical problem in the prior art that detection methods for robot object collisions have poor timeliness. Fig. 7 shows a schematic structural diagram of a robot object collision detection apparatus provided in an embodiment of the present application, where the apparatus 70 includes: an acquisition module 701, a first calculation module 702, a second calculation module 703 and a detection module 704.
The acquisition module 701 is configured to acquire an image to be detected including an object stage and an article to be transported on the object stage; the first calculation module 702 is configured to determine, according to the image to be detected, distance information between the article to be transported and the inner wall of the object stage; the second calculation module 703 is configured to determine the deviation of the article to be transported according to the distance information between the article to be transported and each side wall of the inner wall of the object stage and a preset linear regression function; the detection module 704 is configured to determine that the article to be transported has collided when the deviation of the article to be transported is greater than a preset deviation threshold.
Specifically, in an embodiment, the first calculation module 702 is specifically configured to: respectively calculate the distance between the article to be transported and each side wall of the inner wall of the object stage.
Specifically, in one embodiment, the first calculation module 702 is further configured to: judge, according to the image to be detected, whether the motion track of the robot is a preset standard track; and when it is determined that the motion track of the robot is not the preset standard track, determine the distance information between the article to be transported and the inner wall of the object stage according to the image to be detected.
Specifically, in one embodiment, the first calculation module 702 is further configured to: when it is determined that the motion track of the robot is the preset standard track, determine the displacement of the article to be transported according to the image to be detected and the image to be detected acquired a preset time interval earlier; and when the displacement of the article to be transported is greater than a preset displacement threshold, determine that the article to be transported has collided.
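The standard-track branch above reduces to comparing the article's position in two frames taken a preset interval apart. A minimal sketch, assuming the article's image-plane centre has already been extracted from each frame (the centre-extraction step and all names here are hypothetical, not from the patent):

```python
import math

def displacement(prev_center, cur_center):
    """Euclidean displacement of the article's image-plane centre
    between two frames taken a preset time interval apart."""
    return math.dist(prev_center, cur_center)

def collided_on_standard_track(prev_center, cur_center, disp_threshold):
    """On the preset standard track the article should barely move in
    the stage frame; displacement beyond the preset threshold is taken
    as a collision, per the branch described above."""
    return displacement(prev_center, cur_center) > disp_threshold
```

For instance, a centre that jumps from (0, 0) to (3, 4) pixels has moved 5 pixels, which trips a threshold of 4.9 but not one of 5.0.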
Specifically, in an embodiment, the obtaining module 701 is specifically configured to: acquiring a video including an object stage and an object to be transported on the object stage within a preset time period;
and performing framing processing on the video according to a preset time interval to obtain at least one image to be detected.
Specifically, in one embodiment, the detection module 704 is further configured to: generate collision information when it is determined that the article to be transported has collided; and give an alarm according to the collision information to prompt an operator that the article to be transported has collided.
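The collision-information step above can be sketched as assembling a small record plus a human-readable alarm message. The record's fields (article identifier, deviation value, timestamp) are illustrative assumptions; the patent does not specify the content of the collision information.

```python
import time

def make_collision_info(article_id, deviation):
    """Assemble a minimal collision record and an alarm message for the
    operator. Field names are illustrative, not from the patent."""
    info = {
        "article_id": article_id,
        "deviation": deviation,
        "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
    }
    message = (f"Collision detected for article {article_id} "
               f"(deviation={deviation})")
    return info, message
```

The message could then be routed to an instrument display, indicator lamp, or alert tone, matching the alarm modes the description leaves open.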
The robot object collision detection apparatus provided in the embodiment of the present application is configured to execute the robot object collision detection method provided in the foregoing embodiment; its implementation and principle are the same and are not repeated here.
The embodiment of the present application further provides an electronic device configured to execute the method provided in the foregoing embodiments.
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 80 includes: at least one processor 81 and memory 82;
wherein the memory stores computer-executable instructions, and execution of those instructions by the at least one processor causes the at least one processor to perform the method of any one of the foregoing embodiments.
The electronic device provided in the embodiment of the present application is configured to execute the robot object collision detection method provided in the foregoing embodiment; its implementation and principle are the same and are not repeated here.
The embodiment of the present application provides a storage medium containing computer-executable instructions; when a computer processor executes these instructions, the method provided in any one of the foregoing embodiments is implemented.
The storage medium containing computer-executable instructions according to the embodiment of the present application may be used to store the computer-executable instructions of the robot object collision detection method provided in the foregoing embodiment; the implementation and principle are the same and are not repeated here.
It should be understood that the above examples are only for clarity of illustration and are not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaustively list all embodiments here, and obvious variations or modifications derived therefrom remain within the scope of the present invention.
Claims (10)
1. A robot object collision detection method is applied to a robot object system, the robot object system comprises a robot and an object stage for placing an object to be transported, and the method is characterized by comprising the following steps:
acquiring an image to be detected comprising an object stage and an object to be transported on the object stage;
determining distance information between the article to be transported and the inner wall of the objective table according to the image to be detected;
determining the deviation of the article to be transported according to the distance information between the article to be transported and each side wall in the inner wall of the objective table and a preset linear regression function;
and when the deviation of the article to be transported is greater than a preset deviation threshold, determining that the article to be transported has collided.
2. The method for detecting collision of a robot object according to claim 1, wherein the determining the distance information between the object to be transported and the inner wall of the object stage based on the image to be detected comprises:
and respectively calculating the distance between the article to be transported and each side wall in the inner wall of the object stage.
3. The robot object collision detection method according to claim 1, wherein before the determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected, the method further comprises:
judging whether the motion track of the robot is a preset standard track or not according to the image to be detected;
and when the motion track of the robot is determined not to be the preset standard track, executing the step of determining the distance information between the article to be transported and the inner wall of the object stage according to the image to be detected.
4. The method of claim 3, further comprising:
when it is determined that the motion track of the robot is the preset standard track, determining the displacement of the article to be transported according to the image to be detected and the image to be detected a preset time interval apart;
and when the displacement of the article to be transported is greater than a preset displacement threshold, determining that the article to be transported has collided.
5. The robot object collision detection method according to claim 4, wherein the acquiring of the to-be-detected image including the object stage and the object to be transported on the object stage comprises:
acquiring a video including an object stage and an object to be transported on the object stage within a preset time period;
and performing framing processing on the video according to a preset time interval to obtain at least one image to be detected.
6. The robot object collision detection method according to any one of claims 1-5, further comprising:
when determining that the article to be transported collides, generating collision information;
and alarming according to the collision information to prompt an operator that the article to be transported collides.
7. A robot object collision detection device, comprising: the device comprises an acquisition module, a first calculation module, a second calculation module and a detection module;
the acquisition module is used for acquiring an image to be detected, which comprises an object stage and an object to be transported on the object stage;
the first calculation module is used for determining distance information between the article to be transported and the inner wall of the object stage according to the image to be detected;
the second calculation module is used for determining the deviation of the article to be transported according to the distance information between the article to be transported and each side wall in the inner wall of the objective table and a preset linear regression function;
the detection module is used for determining that the article to be transported has collided when the deviation of the article to be transported is greater than a preset deviation threshold.
8. The robot object collision detection device according to claim 7, wherein the first calculation module is specifically configured to:
and respectively calculating the distance between the article to be transported and each side wall in the inner wall of the object stage.
9. An electronic device, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executing the computer-executable instructions stored by the memory causes the at least one processor to perform the method of any one of claims 1-6.
10. A storage medium containing computer-executable instructions for performing the method of any one of claims 1-6 when executed by a computer processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010747264.9A CN111890343B (en) | 2020-07-29 | 2020-07-29 | Robot object collision detection method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010747264.9A CN111890343B (en) | 2020-07-29 | 2020-07-29 | Robot object collision detection method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111890343A true CN111890343A (en) | 2020-11-06 |
CN111890343B CN111890343B (en) | 2021-10-15 |
Family
ID=73182508
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010747264.9A Active CN111890343B (en) | 2020-07-29 | 2020-07-29 | Robot object collision detection method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111890343B (en) |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2221152A1 (en) * | 2009-02-19 | 2010-08-25 | ABB Technology AB | A robot system and a method for picking and placing components |
CN104385279A (en) * | 2014-09-26 | 2015-03-04 | 京东方科技集团股份有限公司 | Manipulator demonstration method and system |
CN105437261A (en) * | 2016-01-04 | 2016-03-30 | 杭州亚美利嘉科技有限公司 | Early warning method and device for tire wear of robot |
CN106393097A (en) * | 2015-07-30 | 2017-02-15 | 发那科株式会社 | Industrial robot system and control method thereof |
CN106969732A (en) * | 2016-01-13 | 2017-07-21 | 中联重科股份有限公司 | Study of Boom Cracking detection method, device, system and engineering machinery |
CN106985145A (en) * | 2017-04-24 | 2017-07-28 | 合肥工业大学 | One kind carries transfer robot |
CN107443430A (en) * | 2017-09-12 | 2017-12-08 | 珠海市微半导体有限公司 | The detection method of intelligent robot collision obstacle and build drawing method |
CN107962569A (en) * | 2017-11-23 | 2018-04-27 | 珠海格力电器股份有限公司 | A kind of collision checking method of robot, device and intelligent robot |
CN108015774A (en) * | 2017-12-15 | 2018-05-11 | 北京艾利特科技有限公司 | A kind of sensorless mechanical arm collision checking method |
JP2018176334A (en) * | 2017-04-10 | 2018-11-15 | キヤノン株式会社 | Information processing device, measurement device, system, interference determination method and article manufacturing method |
US10192195B1 (en) * | 2016-10-25 | 2019-01-29 | Amazon Technologies, Inc. | Techniques for coordinating independent objects with occlusions |
CN109397283A (en) * | 2018-01-17 | 2019-03-01 | 清华大学 | A kind of robot collision checking method and device based on velocity deviation |
CN110744551A (en) * | 2019-11-20 | 2020-02-04 | 上海非夕机器人科技有限公司 | Robot clamping jaw movement control method and device, robot and storage device |
CN110834334A (en) * | 2019-11-20 | 2020-02-25 | 常州捷佳创精密机械有限公司 | Control method and device for manipulator and processing tank equipment |
CN110893617A (en) * | 2018-09-13 | 2020-03-20 | 深圳市优必选科技有限公司 | Obstacle detection method and device and storage device |
CN111152226A (en) * | 2020-01-19 | 2020-05-15 | 吉利汽车研究院(宁波)有限公司 | Robot working track planning method and system |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113213054A (en) * | 2021-05-12 | 2021-08-06 | 深圳市海柔创新科技有限公司 | Adjustment method, device, equipment, robot and warehousing system of goods taking and placing device |
TWI801123B (en) * | 2021-05-12 | 2023-05-01 | 大陸商深圳市海柔創新科技有限公司 | Adjustment method, apparatus, and device of fetching and placing goods apparatus, robot, warehousing system, computer readable storage medium and computer program product |
CN113334392A (en) * | 2021-08-06 | 2021-09-03 | 成都博恩思医学机器人有限公司 | Mechanical arm anti-collision method and device, robot and storage medium |
CN113334392B (en) * | 2021-08-06 | 2021-11-09 | 成都博恩思医学机器人有限公司 | Mechanical arm anti-collision method and device, robot and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN111890343B (en) | 2021-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6392908B2 (en) | Visual sensor abnormality cause estimation system | |
CN111890343B (en) | Robot object collision detection method and device | |
CN110490995B (en) | Method, system, equipment and storage medium for monitoring abnormal running state of belt | |
CN111144426B (en) | Sorting method, sorting device, sorting equipment and storage medium | |
WO2019041952A1 (en) | Methods and systems for improved quality inspection of products using a robot | |
US20190084009A1 (en) | Projection instruction device, parcel sorting system, and projection instruction method | |
CN102189551A (en) | Robot system and transfer method | |
CN110621984A (en) | Method and system for improving quality inspection | |
CN111392402B (en) | Automatic grabbing method, device, equipment and storage medium | |
US10675659B2 (en) | Instruction projecting device, package sorting system and instruction projecting method | |
CN111191650B (en) | Article positioning method and system based on RGB-D image visual saliency | |
CN108858251A (en) | A kind of collision avoidance system of high-speed motion manipulator | |
JP2019184380A (en) | Imaging device and inspection system using imaging device | |
EP3434623A1 (en) | Projection indicator, cargo assortment system, and projection indicating method | |
Thirde et al. | A Real-Time Scene Understanding System for Airport Apron Monitoring. | |
Lemos et al. | Convolutional neural network based object detection for additive manufacturing | |
CN113211426A (en) | Robot fault diagnosis method and device, computer equipment and storage medium | |
US11370124B2 (en) | Method and system for object tracking in robotic vision guidance | |
Siam et al. | Fault tolerant control of an industrial manufacturing process using image processing | |
EP3689792B1 (en) | Package recognition device, package sorting system and package recognition method | |
EP3434625B1 (en) | Projection instruction device, parcel sorting system, and projection instruction method | |
CN113226666A (en) | Method and apparatus for monitoring a robotic system | |
US20240037952A1 (en) | Analysis device, analysis system, analysis method, and computer-readable medium | |
WO2023073780A1 (en) | Device for generating learning data, method for generating learning data, and machine learning device and machine learning method using learning data | |
WO2024053150A1 (en) | Picking system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||