CN114347040A - Method and device for picking up target object, robot and storage medium - Google Patents

Method and device for picking up target object, robot and storage medium

Info

Publication number
CN114347040A
CN114347040A · Application CN202210152273.2A · Granted as CN114347040B
Authority
CN
China
Prior art keywords
target object
robot
picking
pickup device
image
Prior art date
Legal status
Granted
Application number
CN202210152273.2A
Other languages
Chinese (zh)
Other versions
CN114347040B (en)
Inventor
张发恩
林国森
Current Assignee
Ainnovation Hefei Technology Co ltd
Original Assignee
Ainnovation Hefei Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Ainnovation Hefei Technology Co ltd filed Critical Ainnovation Hefei Technology Co ltd
Priority to CN202210152273.2A
Publication of CN114347040A
Application granted
Publication of CN114347040B
Status: Active (granted)

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application provides a method, an apparatus, a robot and a storage medium for picking up a target object. The method comprises: if a target object is recognized in an image captured by the camera, performing position recognition on the target object in the image to obtain the target object position; determining, according to the target object position, whether the target object is located within the working range of the pickup device, and if so, starting the pickup device to pick up the target object; if not, the robot adjusts its position so that, after the adjustment, the target object lies within the working range of the pickup device. This solves the technical problem of low manual pickup efficiency in the prior art.

Description

Method and device for picking up target object, robot and storage medium
Technical Field
The present application relates to the field of robotics, and in particular, to a method and an apparatus for picking up an object, a robot, and a storage medium.
Background
With rising expectations for the appearance of cities, regulations forbidding smoking in public places have become increasingly widespread, and no-smoking signs are commonly posted on streets and in restaurants to curb smoking to some extent.
Despite these no-smoking signs, many people still smoke on the street and even discard cigarette butts at will, which not only mars the city's appearance but can also contribute to the spread of disease.
In the existing approach, sanitation workers pick up the litter manually, and the pickup efficiency is low.
Disclosure of Invention
In view of the above, a method, an apparatus, a robot and a storage medium for picking up a target object are provided to improve the pickup efficiency of the target object.
In a first aspect, a method for picking up a target object is provided, applied to a robot in which a camera and a pickup device are arranged. The method comprises: if a target object is recognized in an image captured by the camera, performing position recognition on the target object in the image to obtain the target object position; determining, according to the target object position, whether the target object is located within the working range of the pickup device, and if so, starting the pickup device to pick up the target object; if not, the robot adjusts its position so that, after the adjustment, the target object lies within the working range of the pickup device.
With the above method, when a target object is recognized in an image captured by the camera, position recognition is performed on the target object in the image to obtain the target object position; whether the target object lies within the working range of the pickup device is then determined from that position. If it does, the pickup device is started to pick up the target object; if not, the robot adjusts its position so that the target object lies within the working range after the adjustment, and the target object is then picked up. Finding and picking up the target object is thus automated, pickup efficiency is improved, and the technical problem of low manual pickup efficiency is solved.
In one embodiment, determining whether the target object is located within the working range of the pickup device according to the target object position comprises: if the target object position lies within a designated area in the image, the target object is within the working range of the pickup device, where the probability that a target object in the designated area is successfully picked up by the pickup device is higher than the probability for target objects in other areas of the image; otherwise, the target object is not within the working range of the pickup device.
In the above embodiment, whether the target object lies within the working range of the pickup device is judged directly in image space; compared with first converting the target object position into another space and then judging, this improves processing efficiency.
In one embodiment, the designated area is a single designated pixel.
In the above embodiment, because the designated area is a single designated pixel, the target object position is determined more precisely than when the designated area contains at least two pixels, which further improves the pickup success rate of the pickup device.
In one embodiment, the robot performing position adjustment comprises: calculating a position offset from the target object position and the designated area; and the robot adjusting its position according to the position offset.
In the above embodiment, because the position offset is calculated from the target object position and the designated area, and the robot adjusts its position according to that offset, the robot moves toward the target object position guided by the offset rather than adjusting its position blindly.
In one embodiment, starting the pickup device to pick up the target object comprises: determining the relative position between the target object and the pickup device according to the target object position; determining working parameters of the pickup device according to the relative position; and starting the pickup device with those working parameters so that the pickup device picks up the target object accordingly.
Although the target object lies within the working range of the pickup device in the above embodiment, under otherwise identical conditions the pickup success rate is not the same at every point within that range: the closer a point is to the designated position point, the higher its success rate, and the farther away, the lower. To ensure that the target object can be picked up successfully in a single attempt from any point, the relative position between the target object and the pickup device is determined from the target object position, and the working parameters of the pickup device are adjusted according to that relative position; the working parameters fine-tune the pickup device so that pickup succeeds in one attempt.
In one embodiment, performing position recognition on the target object in the image to obtain the target object position comprises: inputting the image into a position recognition model to obtain a first vertex coordinate and a second vertex coordinate, output by the model, of a target box enclosing the target object; and obtaining the target object position from the first and second vertex coordinates of the target box.
In the above embodiment, because the target object position is obtained from the two vertex coordinates of the target box, the pickup device can pick up the target object more reliably; for example, the center coordinate of the target box can be treated as the center coordinate of the target object, so taking the box center as the target object position further aids pickup.
In one embodiment, the pickup device comprises a pipette device.
In the above embodiment, the pickup device comprises a pipette device, which, compared with a robotic arm or a fork, is simple in structure, does not require the fine and complex control that a robotic arm or fork does, and has a lower maintenance cost later on.
In a second aspect, an apparatus for picking up a target object is provided, applied to a robot in which a camera and a pickup device are arranged. The apparatus comprises: an acquisition unit configured to, if a target object is recognized in an image captured by the camera, perform position recognition on the target object in the image to obtain the target object position; a positioning unit configured to determine, according to the target object position, whether the target object lies within the working range of the pickup device, and if so, start the pickup device to pick up the target object; and an adjustment unit configured to, if not, have the robot adjust its position so that, after the adjustment, the target object lies within the working range of the pickup device.
In one embodiment, the positioning unit is specifically configured to: determine that the target object is within the working range of the pickup device if the target object position lies within a designated area in the image, where the probability that a target object in the designated area is successfully picked up by the pickup device is higher than for target objects in other areas of the image; and otherwise determine that the target object is not within the working range of the pickup device.
In one embodiment, the designated area is a single designated pixel.
In one embodiment, the adjustment unit is specifically configured to: calculate a position offset from the target object position and the designated area, the robot adjusting its position according to the position offset.
In one embodiment, the positioning unit is specifically configured to: determine the relative position between the target object and the pickup device according to the target object position; determine working parameters of the pickup device according to the relative position; and start the pickup device with those working parameters so that the pickup device picks up the target object accordingly.
In one embodiment, the acquisition unit is specifically configured to: input the image into a position recognition model to obtain a first vertex coordinate and a second vertex coordinate, output by the model, of a target box enclosing the target object; and obtain the target object position from the first and second vertex coordinates of the target box.
In one embodiment, the pickup device comprises a pipette device.
In a third aspect, a robot is provided, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method for picking up a target object described above.
In a fourth aspect, a computer-readable storage medium is provided, storing computer program instructions which, when read and executed by a processor, perform the steps of the method for picking up a target object described above.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings used in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should therefore not be regarded as limiting the scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation of a target object picking method in an embodiment of the present application;
FIG. 2 is a schematic view of a robot in an embodiment of the present application;
FIG. 3 is a schematic view of an image in an embodiment of the present application;
FIG. 4 is a schematic diagram of the structure of a pickup device for a target object in the embodiment of the present application;
fig. 5 is a block diagram of the internal structure of the robot in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present invention.
In one embodiment, a method for picking up a target object is provided. The execution subject of the method is a device capable of implementing it, and the device includes a robot, for example a robot dog. The robot is provided with a camera and a pickup device, and automatic pickup of the target object is achieved through the two, which improves pickup efficiency and solves the technical problem of low manual pickup efficiency in the prior art. The camera captures images, and the pickup device picks up the target object. In the following, to better explain the method provided in the embodiment of the present invention, a cigarette butt is used as the example target object; it should be understood that the target object is not limited to cigarette butts and may be another object, for example a face mask.
As shown in fig. 1, there is provided a method for picking up an object, including:
Step 100: if the image captured by the camera is recognized to include the target object, perform position recognition on the target object in the image to obtain the target object position.
In one example, the robot performs image recognition on the image captured by the camera; if it recognizes that the image includes a target object, for example a cigarette butt, it continues with position recognition to obtain the position of the target object in the image, i.e., the target object position.
In another example, a server performs the image recognition: the robot sends the captured image to the server, the server performs image recognition on it, and if the server recognizes that the image includes the target object it sends target-present information to the robot; after receiving this information, the robot determines that the image includes the target object and proceeds with position recognition of the target object in the image to obtain its position.
The target object position may be represented as a coordinate, for example the coordinate (x, y) of the target object in the image.
Position recognition of the target object in the image may be performed with an object detection method, for example SSD or YOLOv5.
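As a minimal sketch of the vertex-to-position step described later in the embodiments (the function name and the two-opposite-vertex box format are assumptions, not part of the patent), a detection box can be reduced to a single (x, y) target object position by taking its center:

```python
def box_center(x1, y1, x2, y2):
    """Reduce a detection box, given by two opposite vertex coordinates
    (x1, y1) and (x2, y2), to a single (x, y) target object position."""
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

# A box from (40, 60) to (80, 100) yields the center position (60, 80).
cx, cy = box_center(40, 60, 80, 100)
```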
Step 200: determine, according to the target object position, whether the target object is located within the working range of the pickup device; if so, start the pickup device to pick up the target object.
The working range of the pickup device is the range the device can reach while operating; in fig. 2, for example, it is the elliptical region on the pickup side of the device. When the target object lies within the working range, the robot is considered able to pick it up; when it does not, the robot is considered unable to pick it up.
Therefore, for the pickup device to pick up the target object, it must be determined whether the target object lies within the working range. In one example, the target object position is mapped into the world coordinate system to obtain world coordinate 1, and the world coordinate range 2 corresponding to the working range of the pickup device is obtained; if world coordinate 1 falls within world coordinate range 2, the target object is considered to lie within the working range, and otherwise not. In another example, if the physical point on the ground corresponding to the target object position lies within the physical ground area corresponding to the working range of the pickup device, the target object is considered to lie within the working range, and otherwise not.
When the target object is found to lie within the working range of the pickup device, the device is started to pick it up. Further optionally, if the pickup fails, the robot adjusts its position so that the target object can be picked up successfully after the adjustment.
Step 300: if not, the robot adjusts its position so that, after the adjustment, the target object lies within the working range of the pickup device.
If the target object is found not to lie within the working range of the pickup device, the robot must adjust its position in order to pick it up: for example, the robot determines the bearing of the target object, adjusts its position according to that bearing, and then starts the pickup device once the adjustment is complete.
Optionally, to ensure that the robot can pick up the target object after the adjustment, the position is verified again. Specifically, after adjusting its position the robot: controls the camera to capture an image; if the image is recognized to include the target object, performs position recognition on the target object to obtain its position; determines whether that position lies within the working range of the pickup device, and if so starts the device to pick up the target object; if not, the robot continues adjusting its position until the target object lies within the working range.
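The capture-recognize-adjust loop described above can be sketched as follows. This is a sketch only: every method name (`capture_image`, `detect_target`, `in_working_range`, `adjust_position`, `start_pickup`) and the attempt budget are hypothetical placeholders for the robot's actual interfaces, which the patent does not specify.

```python
def pick_up_target(robot, max_attempts=10):
    """Repeatedly capture, locate and adjust until the target object
    lies within the pickup device's working range, then pick it up.
    Returns True on pickup, False if the target is lost or the
    attempt budget is exhausted."""
    for _ in range(max_attempts):
        image = robot.capture_image()
        target = robot.detect_target(image)   # None if no target recognized
        if target is None:
            return False
        if robot.in_working_range(target):
            robot.start_pickup(target)
            return True
        robot.adjust_position(target)         # move toward the target
    return False
```

Bounding the number of attempts keeps the robot from oscillating forever when the target repeatedly drifts out of frame.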
With the above method for picking up a target object, when a target object is recognized in an image captured by the camera, position recognition is performed on the target object in the image to obtain the target object position; whether the target object lies within the working range of the pickup device is then determined from that position. If it does, the pickup device is started to pick up the target object; if not, the robot adjusts its position so that the target object lies within the working range after the adjustment, and the target object is then picked up. Finding and picking up the target object is thus automated, pickup efficiency is improved, and the technical problem of low manual pickup efficiency is solved.
In one embodiment, determining in step 200 whether the target object is located within the working range of the pickup device according to the target object position comprises:
Step 201: if the target object position lies within a designated area in the image, the target object lies within the working range of the pickup device, where the probability that a target object in the designated area is successfully picked up by the pickup device is higher than the probability for target objects in other areas of the image.
The designated area is an area in the image for which the probability that a target object located there is successfully picked up is higher than for the other areas of the image. In one example, the designated area is the area of the image onto which the working range of the pickup device is mapped. In another example, the designated area is a sub-area of that mapped area; in that case, if the target object position lies within the designated area, the probability of a successful pickup is higher than in the other sub-areas of the mapped area, and higher than in the parts of the image outside the mapped area.
Step 202: otherwise, the target object does not lie within the working range of the pickup device.
That is, if the target object position is not within the designated area in the image, the target object is not within the working range of the pickup device.
In the above embodiment, whether the target object lies within the working range of the pickup device is judged directly in image space; compared with first converting the target object position into another space and then judging, this improves processing efficiency.
In one embodiment, the designated area is a single designated pixel.
The designated pixel is a pixel in the image; specifically, it is the pixel onto which a designated position point within the working range of the pickup device is mapped. The designated position point can be understood as an optimal pickup position: the probability that a target object at the designated pixel is successfully picked up by the pickup device is higher than at any other pixel in the image. In fig. 2, for example, the designated position point is the center of the ellipse.
Because the designated area is a designated pixel, when the judgment is made in image space, step 201 becomes: if the target object position coincides with the designated pixel, the target object lies within the working range of the pickup device. For example, with target object position (x1, y1) and designated pixel (x2, y2), the target object lies within the working range if x1 = x2 and y1 = y2, and otherwise does not. When the judgment is made in another space, step 201 becomes: map the target object position into the world coordinate system to obtain world coordinate 1; obtain world coordinate 2 corresponding to the designated pixel; if world coordinate 1 equals world coordinate 2, the target object lies within the working range of the pickup device.
In the above embodiment, because the designated area is a single designated pixel, the target object position is determined more precisely than when the designated area contains at least two pixels, which further improves the pickup success rate of the pickup device.
In one embodiment, the robot performing position adjustment in step 300 comprises:
Step 301: calculate a position offset from the target object position and the designated area.
The position offset reflects how far and in what direction the target object position deviates from the designated area.
In one example, the position offset comprises the distance and azimuth angle of the target object position from the designated area. In a further example, when the designated area is a designated pixel, the distance and azimuth angle are calculated from the target object position and the coordinate of the designated pixel: with target object position (x1, y1) and designated pixel (x2, y2), the distance of the target object position from the designated area is ((x1 − x2)² + (y1 − y2)²)^0.5 and the azimuth angle is arctan((y1 − y2)/(x1 − x2)). In another further example, when the designated area contains at least two pixels, one pixel in the designated area is taken as the target pixel, the distance and azimuth angle between the target object position and that pixel are calculated in the same way, and the result is taken as the position offset.
In another example, the position offset comprises an abscissa offset and an ordinate offset. In a further example, when the designated area is a designated pixel, the abscissa offset is x2 − x1 and the ordinate offset is y2 − y1; in another further example, when the designated area contains at least two pixels, one pixel in the designated area is taken as the target pixel and the two offsets are calculated in the same way.
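The two offset formulations above can be sketched together as follows. This is a sketch, not the patent's implementation; `atan2` is substituted for the plain arctan in the text so the azimuth is still defined when x1 = x2.

```python
import math

def position_offset(target, pixel):
    """Offsets of target object position (x1, y1) from designated
    pixel (x2, y2): the Euclidean distance, the azimuth angle in
    radians, and the per-axis offsets (x2 - x1, y2 - y1) described
    in the text."""
    x1, y1 = target
    x2, y2 = pixel
    distance = math.hypot(x1 - x2, y1 - y2)
    azimuth = math.atan2(y1 - y2, x1 - x2)   # atan2 handles x1 == x2
    return distance, azimuth, (x2 - x1, y2 - y1)
```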
Step 302: the robot adjusts its position according to the position offset.
Correspondingly, when the position offset comprises the distance and azimuth angle between the target object position and the designated area, the robot adjusts its position according to that distance and angle. When the position offset comprises an abscissa offset and an ordinate offset, the robot adjusts its position according to the two offsets: an abscissa offset greater than 0 indicates the robot should move left, and less than 0 that it should move right; an ordinate offset greater than 0 indicates it should move forward, and less than 0 that it should move backward. Note that the calculated abscissa and ordinate offsets are offsets in image space, while the robot actually adjusts its position in physical space; the offsets must therefore be converted into physical distances (in cm, for example), and the robot then adjusts its position according to the converted physical distances.
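A sketch of turning the per-axis image offsets into a physical move. The sign conventions follow the text; the pixels-to-centimetres scale factor and the `move_*` methods are assumptions standing in for the robot's real calibration and motion interfaces.

```python
def offsets_to_move(dx_px, dy_px, cm_per_px):
    """Convert image-space offsets (x2 - x1, y2 - y1), in pixels,
    into a signed physical move in cm using a calibration factor."""
    return dx_px * cm_per_px, dy_px * cm_per_px

def apply_move(robot, dx_cm, dy_cm):
    """dx_cm > 0: move left, dx_cm < 0: move right;
    dy_cm > 0: move forward, dy_cm < 0: move backward."""
    if dx_cm > 0:
        robot.move_left(dx_cm)
    elif dx_cm < 0:
        robot.move_right(-dx_cm)
    if dy_cm > 0:
        robot.move_forward(dy_cm)
    elif dy_cm < 0:
        robot.move_backward(-dy_cm)
```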
In the above embodiment, since the position offset is calculated according to the target object position and the designated area, and the robot performs the position adjustment according to this offset, the robot approaches the target object position from the designated area along a known direction and thus reaches the target object position, thereby avoiding aimless position adjustment.
In one embodiment, step 200 activates a pickup device to pick up an object, comprising:
step 200A, determining the relative position between the target object and the pickup device according to the position of the target object.
In one example, the relative position between the target object and the pickup device is represented by an orientation, the orientation being one of upper left, lower left, upper right, and lower right. Specifically, if the target object is located within a direction angle range of 0° to 90°, the relative position is determined to be upper right; within 90° to 180°, upper left; within 180° to 270°, lower left; and within 270° to 360°, lower right.
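As a sketch (function name assumed), the orientation lookup reduces to range tests on the direction angle; assigning each 90° boundary to the next quadrant is an assumption here, since the patent's ranges overlap at the boundaries:

```python
def orientation_from_angle(angle_deg):
    """Map a direction angle in degrees to one of the four orientations."""
    a = angle_deg % 360
    if a < 90:
        return "upper right"   # 0° to 90°
    if a < 180:
        return "upper left"    # 90° to 180°
    if a < 270:
        return "lower left"    # 180° to 270°
    return "lower right"       # 270° to 360°
```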
In one example, the relative position between the target object and the pickup device is represented by the distance of the target object from the designated area. For example, the target object position is first mapped to first world coordinates in the world coordinate system, and a designated position point within the working range of the pickup device is mapped to second world coordinates in the world coordinate system; the distance between the target object and the pickup device can then be determined from the first world coordinates and the second world coordinates.
In another example, the relative position between the target object and the pickup device is represented by the distance and the direction angle of the target object from the designated area. For example, the target object position is first mapped to first world coordinates in the world coordinate system, and a designated position point within the working range of the pickup device is mapped to second world coordinates in the world coordinate system; the distance and the direction angle between the target object and the pickup device can then be determined from the first world coordinates and the second world coordinates.
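Both world-coordinate examples reduce to the same two-point computation; the following sketch assumes 2-D world coordinates (a real system might use 3-D coordinates obtained from camera calibration):

```python
import math

def relative_position(target_world, anchor_world):
    """Distance and direction angle between the target object's world
    coordinates and a designated point in the pickup device's working range."""
    tx, ty = target_world
    ax, ay = anchor_world
    distance = math.hypot(tx - ax, ty - ay)
    angle = math.degrees(math.atan2(ty - ay, tx - ax)) % 360
    return distance, angle
```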
And 200B, determining the working parameters of the pickup device according to the relative position.
The working parameters are the parameters used by the pickup device when it performs the pickup operation.
For example, the pickup device is a suction pipe device, and the working parameters include a suction value for the upper left region, a suction value for the lower left region, a suction value for the upper right region, and a suction value for the lower right region. If the relative position is upper left, the working parameters of the pickup device are determined as follows: upper left region suction value X1, lower left region suction value X2, upper right region suction value X3, and lower right region suction value X4, where X1 ≥ X2 and X3 ≥ X4, so that the suction force of the upper left region of the suction pipe device is greater than or equal to that of the lower left region, and the suction force of the upper right region is greater than or equal to that of the lower right region.
For another example, the pickup device is a suction device, the working parameter includes a suction value, and the relative position is a distance; in this case, the larger the distance, the larger the suction value, and the smaller the distance, the smaller the suction value may be.
For another example, the pickup device is a suction pipe device, the working parameters include a left region suction value and a right region suction value, and the relative position is a distance and a direction angle. If the direction angle is within the range of 270° to 90°, the larger the distance, the larger the right region suction value and, correspondingly, the larger the suction force of the right region of the suction pipe device; the smaller the distance, the smaller the right region suction value may be and, correspondingly, the smaller the suction force of the right region. If the direction angle is within the range of 90° to 270°, the larger the distance, the larger the left region suction value and, correspondingly, the larger the suction force of the left region of the suction pipe device; the smaller the distance, the smaller the left region suction value may be and, correspondingly, the smaller the suction force of the left region.
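The per-region parameter selection can be sketched as follows. The base and boost values are placeholders (the patent does not fix concrete suction values); the rule mirrors the X1 ≥ X2, X3 ≥ X4 relation above, where the regions on the target object's side receive at least as much suction as their counterparts:

```python
def suction_params(orientation, base=10.0, boost=5.0):
    """Illustrative per-region suction values for the suction pipe device.
    If the target is toward the upper side, the two upper regions get the
    boosted value; if toward the lower side, the two lower regions do."""
    upper = base + boost if "upper" in orientation else base
    lower = base + boost if "lower" in orientation else base
    return {"upper_left": upper, "lower_left": lower,
            "upper_right": upper, "lower_right": lower}
```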
And 200C, starting the picking device according to the working parameters so that the picking device picks up the target object according to the working parameters.
Since the working parameters have been determined, the pickup device is started directly according to the working parameters, so that the pickup device picks up the target object according to the working parameters.
Although the target object is located within the working range of the pickup device in the above embodiment, the pickup success rates of different position points within that working range are not all the same under the same conditions: the closer a position point is to the designated position point, the higher its pickup success rate; conversely, the farther a position point is from the designated position point, the lower its pickup success rate. Therefore, in order to ensure that the target object at any position point can be picked up successfully in one attempt, the relative position between the target object and the pickup device is determined according to the target object position, and the working parameters of the pickup device are adjusted according to that relative position; the working parameters thus finely control the pickup device so that a single pickup attempt can succeed.
In one embodiment, the step 100 of performing position recognition on the target object in the image to obtain the target object position includes:
step 101, inputting the image into a position recognition model to obtain a first vertex coordinate and a second vertex coordinate of a target frame surrounding the target object, which are output by the position recognition model.
The position recognition model is a model obtained by training in advance and used for recognizing the position of a target object. The target frame may be a rectangular frame or a square frame. The first vertex coordinate is the coordinate of one of the four vertices of the target frame, for example, the vertex at the upper left corner; the second vertex coordinate is the coordinate of another of the four vertices, for example, the vertex at the lower right corner. The first vertex coordinate and the second vertex coordinate are coordinates of different vertices.
Pre-training to obtain the position recognition model includes: acquiring a training image and the mark frame coordinates corresponding to the training image, where the training image includes a target object such as a cigarette end, and the mark frame coordinates are the first vertex coordinate and the second vertex coordinate of a frame surrounding the cigarette end in the training image; inputting the training image into the position recognition model to obtain a first predicted coordinate and a second predicted coordinate output by the model; obtaining a model loss according to the first predicted coordinate, the second predicted coordinate, and the mark frame coordinates; and training the position recognition model according to the model loss until the model loss is less than a preset model loss or the number of iterations reaches a preset number, at which point the training is finished.
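The stopping rule of the training procedure (loss below a preset threshold, or a preset iteration cap) can be sketched framework-agnostically; `model_step` is a hypothetical callable standing in for one forward/backward pass of the position recognition model:

```python
def train(model_step, max_iters, loss_threshold):
    """Run training steps until the loss drops below `loss_threshold`
    or `max_iters` iterations have been performed; return the
    iteration count reached and the final loss."""
    loss = float("inf")
    for it in range(1, max_iters + 1):
        loss = model_step()          # one forward/backward pass, returns loss
        if loss < loss_threshold:
            return it, loss          # early stop: loss below preset threshold
    return max_iters, loss           # stop: iteration cap reached
```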
Furthermore, in order to enable the position recognition model to better recognize cigarette ends of various brands and shapes, images of cigarette ends of various brands and shapes are collected in advance to obtain the training images, for example: a whole cigarette end of a mars or moon brand (i.e., one the user has smoked only slightly or not at all), a half-smoked cigarette end of a mars or moon brand, a whole cigarette end of a mars or moon brand that has been flattened, a half-smoked cigarette end of a mars or moon brand that has been flattened, and the like.
And 102, obtaining the position of the target object according to the first vertex coordinate and the second vertex coordinate of the target frame.
Assuming that the target object position is set to the center of the target frame, the first vertex coordinate is denoted (x0, y0) and the second vertex coordinate is denoted (x1, y1). If the first vertex coordinate is the coordinate of the vertex at the upper left corner of the target frame and the second vertex coordinate is the coordinate of the vertex at the lower right corner, then the target object position is ((x0+x1)/2, (y0+y1)/2).
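The center calculation above, as a short sketch (the function name is an assumption):

```python
def box_center(x0, y0, x1, y1):
    """Center of the target frame from its upper-left (x0, y0) and
    lower-right (x1, y1) vertex coordinates."""
    return ((x0 + x1) / 2, (y0 + y1) / 2)
```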
In the above embodiment, since the target object position is obtained according to the first vertex coordinate and the second vertex coordinate of the target frame, the pickup device can pick up the target object more easily; for example, the coordinates of the center of the target frame can be regarded as the coordinates of the center of the target object, so taking the center of the target frame as the target object position further facilitates pickup.
In one embodiment, the step 200 of starting the pickup device to pick up the target object includes: if the distance between the pickup device and the ground is a specified distance, starting the pickup device to pick up the target object.
The specified distance is a preset distance, for example 1.5 cm; when the distance between the pickup device and the ground equals the specified distance, the pickup device is considered able to pick up cigarette ends well. For example, the robot is provided with a distance sensor; the distance sensor is activated to detect the distance between the pickup device and the ground, and the specified distance is subtracted from the detected distance. If the difference is 0, the pickup device is started to pick up the target object; if the difference is less than 0, the robot raises its height to raise the pickup device, or raises the pickup device directly; and if the difference is greater than 0, the robot lowers its height to lower the pickup device, or lowers the pickup device directly.
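The height-adjustment rule above, sketched with an assumed function name; the tolerance parameter is an addition for illustration, since a real sensor reading will rarely equal the specified distance exactly:

```python
def height_action(measured_cm, specified_cm=1.5, tol=0.0):
    """Decide the vertical adjustment from a distance-sensor reading,
    using the rule on diff = measured - specified:
    0 -> pick, negative (too close) -> raise, positive (too far) -> lower."""
    diff = measured_cm - specified_cm
    if abs(diff) <= tol:
        return "pick"
    return "raise" if diff < 0 else "lower"
```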
In the above embodiment, the robot can adjust the distance between the pickup device and the ground. Even when the target object is located within the working range of the pickup device, it is still necessary to confirm whether the distance between the pickup device and the ground is the specified distance; only if it is the specified distance is the pickup device considered able to pick up the target object well, and only then is the pickup device started, thereby ensuring to a certain extent that a single pickup attempt succeeds.
In one embodiment, the pick-up device comprises a pipette device.
A pipette device is a device that picks up a target object by suction. As shown in fig. 2, the robot has four legs by which it can move, so that the robot can continuously patrol and pick up target objects.
After the suction pipe device is started, it may suck in dust and the like in addition to cigarette ends. In order to filter out dust and the like and prevent excessive dust from being sucked in, optionally, as shown in fig. 3, the other end of the suction pipe device is designed with a filter screen, so that small particles such as dust pass through the filter screen and return to the ground, while the target object cigarette end is retained at the filter screen.
In the above embodiment, the pickup device includes the pipette device, which is simple in structure, does not require the fine and complicated control that a robotic arm or mechanical claw does, and has a lower later-stage maintenance cost than a robotic arm or mechanical claw.
In one embodiment, as shown in fig. 4, there is provided a pickup apparatus 400 for an object, applied to a robot in which a camera and the pickup apparatus are provided; the pickup apparatus 400 for an object includes:
an obtaining unit 401, configured to perform position recognition on a target object in an image captured by the camera, if the image is recognized to include the target object, so as to obtain the target object position; a positioning unit 402, configured to determine whether the target object is located within the working range of the pickup device according to the target object position, and if so, to start the pickup device to pick up the target object; and an adjusting unit 403, configured to perform position adjustment on the robot if the target object is not located within the working range, so that the target object is located within the working range of the pickup device after the position adjustment.
In one embodiment, the positioning unit 402 is specifically configured to: determine that the target object is located within the working range of the pickup device if the target object position is located in a designated area in the image, the probability that a target object located in the designated area in the image is successfully picked up by the pickup device being higher than the probability that a target object located in other areas in the image is successfully picked up; and otherwise, determine that the target object is not located within the working range of the pickup device.
In one embodiment, the designated area is a designated pixel point.
In one embodiment, the adjusting unit 403 is specifically configured to: calculate the position offset according to the target object position and the designated area; and perform position adjustment on the robot according to the position offset.
In one embodiment, the positioning unit 402 is specifically configured to: determine the relative position between the target object and the pickup device according to the target object position; determine the working parameters of the pickup device according to the relative position; and start the pickup device according to the working parameters so that the pickup device picks up the target object according to the working parameters.
In an embodiment, the obtaining unit 401 is specifically configured to: inputting the image into a position recognition model to obtain a first vertex coordinate and a second vertex coordinate of a target frame surrounding the target object, which are output by the position recognition model; and obtaining the position of the target object according to the first vertex coordinate and the second vertex coordinate of the target frame.
In one embodiment, the pick-up device comprises a pipette device.
In one embodiment, as shown in fig. 5, a robot is provided having a camera and a pickup device disposed therein. The robot further comprises a processor, a memory, and a network interface connected through a system bus, wherein the memory comprises a nonvolatile storage medium and an internal memory. The nonvolatile storage medium of the robot stores an operating system and also stores a computer program which, when executed by the processor, causes the processor to implement the method for picking up a target object. Nonvolatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM). The internal memory may also store a computer program that, when executed by the processor, causes the processor to perform the method for picking up a target object. Those skilled in the art will appreciate that the configuration shown in fig. 5 is a block diagram of only a portion of the configuration associated with the present application and does not constitute a limitation on the robot to which the present application is applied; a particular robot may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
The method for picking up a target object provided by the present application can be implemented in the form of a computer program that can be run on a robot as shown in fig. 5. The memory of the robot may store the respective program templates constituting the pickup apparatus for a target object, for example, the obtaining unit 401, the positioning unit 402, and the adjusting unit 403.
A robot comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of:
if the image shot by the camera is recognized to comprise the target object, carrying out position recognition on the target object in the image to obtain the position of the target object;
determining whether the target object is located within the working range of the pickup device according to the position of the target object, and if so, starting the pickup device to pick up the target object;
if the target object is not located within the working range of the pickup device, the robot performs position adjustment so that the target object is located within the working range of the pickup device after the position adjustment.
In one embodiment, a computer readable storage medium is provided, storing a computer program that, when executed by a processor, causes the processor to perform the steps of:
if the image shot by the camera is recognized to comprise the target object, carrying out position recognition on the target object in the image to obtain the position of the target object;
determining whether the target object is located within the working range of the pickup device according to the position of the target object, and if so, starting the pickup device to pick up the target object;
if the target object is not located within the working range of the pickup device, the robot performs position adjustment so that the target object is located within the working range of the pickup device after the position adjustment.
It should be noted that the above-mentioned object pickup method, object pickup apparatus, robot, and computer-readable storage medium belong to a general inventive concept, and the contents in the embodiments of the object pickup method, object pickup apparatus, robot, and computer-readable storage medium are mutually applicable.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. The method for picking up the target object is characterized by being applied to a robot, wherein a camera and a picking-up device are arranged in the robot; the method for picking up the target object comprises the following steps:
if the image shot by the camera is recognized to comprise the target object, carrying out position recognition on the target object in the image to obtain the position of the target object;
determining whether the target object is located within the working range of the pickup device according to the position of the target object, and if so, starting the pickup device to pick up the target object;
if the target object is not located within the working range of the pickup device, the robot performs position adjustment so that the target object is located within the working range of the pickup device after the position adjustment.
2. The method of claim 1, wherein determining whether the object is within the working range of the pick-up device based on the object position comprises:
if the position of the object is located in the designated area in the image, the object is located in the working range of the pickup device, and the probability that the object located in the designated area in the image is successfully picked up by the pickup device is higher than the probability that the object located in other areas in the image is successfully picked up by the pickup device;
otherwise, the object is not located within the working range of the pick-up device.
3. The method of claim 2, wherein the designated area is a designated pixel.
4. The method of claim 2 or 3, wherein the robot performing position adjustment comprises:
calculating the position offset according to the position of the target object and the designated area;
and the robot adjusts the position according to the position offset.
5. The method of claim 1, wherein the activating the pick-up device to pick up the object comprises:
determining the relative position between the target object and the pickup device according to the position of the target object;
determining working parameters of the pickup device according to the relative position;
and starting the picking device according to the working parameters so that the picking device picks the target object according to the working parameters.
6. The method for picking up according to claim 1, wherein the identifying the position of the object in the image to obtain the position of the object comprises:
inputting the image into a position recognition model to obtain a first vertex coordinate and a second vertex coordinate of a target frame surrounding the target object, which are output by the position recognition model;
and obtaining the position of the target object according to the first vertex coordinate and the second vertex coordinate of the target frame.
7. A method according to any one of claims 1 to 3, or 5 to 6, wherein the pick-up means comprises a pipette device.
8. The pickup device of a target object is characterized by being applied to a robot, wherein a camera and the pickup device are arranged in the robot; the pickup device of the object comprises:
the acquisition unit is used for identifying the position of the target object in the image to obtain the position of the target object if the image shot by the camera is identified to contain the target object;
the positioning unit is used for determining whether the target object is positioned in the working range of the pickup device according to the position of the target object, and if so, starting the pickup device to pick up the target object;
and the adjusting unit is used for performing position adjustment on the robot if the target object is not located within the working range of the pickup device, so that the target object is located within the working range of the pickup device after the position adjustment.
9. A robot comprising a camera and a pick-up device, the robot further comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor when executing the computer program performing the steps of the method of picking up an object according to any of claims 1 to 7.
10. A computer-readable storage medium, having stored thereon computer program instructions, which, when read and executed by a processor, perform the steps of the method of picking up objects according to any one of claims 1 to 7.
CN202210152273.2A 2022-02-18 2022-02-18 Target object pickup method, device, robot and storage medium Active CN114347040B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210152273.2A CN114347040B (en) 2022-02-18 2022-02-18 Target object pickup method, device, robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210152273.2A CN114347040B (en) 2022-02-18 2022-02-18 Target object pickup method, device, robot and storage medium

Publications (2)

Publication Number Publication Date
CN114347040A true CN114347040A (en) 2022-04-15
CN114347040B CN114347040B (en) 2024-06-11

Family

ID=81094232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210152273.2A Active CN114347040B (en) 2022-02-18 2022-02-18 Target object pickup method, device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN114347040B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116619420A (en) * 2023-07-10 2023-08-22 国网江苏省电力有限公司南通供电分公司 Line auxiliary construction robot
WO2024125041A1 (en) * 2022-12-16 2024-06-20 美智光电科技股份有限公司 Sensing range calibration method and apparatus, and ceiling electric appliance, device and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1846951A (en) * 2005-04-11 2006-10-18 中国科学院自动化研究所 Control device and method for intelligent mobile robot capable of picking up article automatically
US7200260B1 (en) * 1999-04-08 2007-04-03 Fanuc Ltd Teaching model generating device
CN107030687A (en) * 2016-02-04 2017-08-11 上海晨兴希姆通电子科技有限公司 Position bias detecting method and module, crawl position calibration method, grasping system
CN108858202A (en) * 2018-08-16 2018-11-23 中国科学院自动化研究所 The control method of part grabbing device based on " to quasi- approach-crawl "
KR20200010916A (en) * 2018-07-23 2020-01-31 중앙대학교 산학협력단 Suction device of minute particles
CN111723782A (en) * 2020-07-28 2020-09-29 北京印刷学院 Deep learning-based visual robot grabbing method and system
CN111815708A (en) * 2020-07-17 2020-10-23 中国科学院自动化研究所 Service robot grabbing detection method based on dual-channel convolutional neural network
CN112025701A (en) * 2020-08-11 2020-12-04 浙江大华技术股份有限公司 Method, device, computing equipment and storage medium for grabbing object
CN112605986A (en) * 2020-11-09 2021-04-06 深圳先进技术研究院 Method, device and equipment for automatically picking up goods and computer readable storage medium
CN112720464A (en) * 2020-12-09 2021-04-30 深圳先进技术研究院 Target picking method based on robot system, electronic equipment and storage medium
CN113696178A (en) * 2021-07-29 2021-11-26 大箴(杭州)科技有限公司 Control method and system, medium and equipment for intelligent robot grabbing

Also Published As

Publication number Publication date
CN114347040B (en) 2024-06-11

Similar Documents

Publication Publication Date Title
CN114347040A (en) Method and device for picking up target object, robot and storage medium
CN103189827B (en) Object display apparatus and object displaying method
WO2019042426A1 (en) Augmented reality scene processing method and apparatus, and computer storage medium
CN110568447A (en) Visual positioning method, device and computer readable medium
CN110926330B (en) Image processing apparatus, image processing method, and program
CN107016348B (en) Face detection method and device combined with depth information and electronic device
US11348354B2 (en) Human body tracing method, apparatus and device, and storage medium
JP2000293670A (en) Method and device for automatically recognizing road sign of video picture and storage medium storing program for automatically recognizing road sign
CN109213202B (en) Goods placement method, device, equipment and storage medium based on optical servo
CN113213054A (en) Adjustment method, device, equipment, robot and warehousing system of goods taking and placing device
JP7354767B2 (en) Object tracking device and object tracking method
US9947106B2 (en) Method and electronic device for object tracking in a light-field capture
CN112605993A (en) Automatic file grabbing robot control system and method based on binocular vision guidance
JP6399485B2 (en) Travel path recognition device
JP2009217832A (en) Method and device for automatically recognizing road sign in video image, and storage medium which stores program of road sign automatic recognition
US10692230B2 (en) Document imaging using depth sensing camera
CN112184814B (en) Positioning method and positioning system
CN107194385A (en) A kind of intelligent vehicle license plate recognition system
JP3516118B2 (en) Object recognition method and object recognition device
CN109409387B (en) Acquisition direction determining method and device of image acquisition equipment and electronic equipment
CN113516685B (en) Target tracking method, device, equipment and storage medium
CN115205825A (en) Improved YOLOV 5-based traffic sign detection and identification method for driving video sequence images
CN114943954A (en) Parking space detection method, device and system
EP3944134A3 (en) Obstacle detection device, obstacle detection method, and storage medium storing obstacle detection program
CN114445494A (en) Image acquisition and processing method, image acquisition device and robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant