CN113696180A - Robot automatic recharging method and device, storage medium and robot system

Info

Publication number
CN113696180A
Authority
CN
China
Prior art keywords
robot
identification code
distance
preset
target
Prior art date
2021-08-31
Legal status
Pending
Application number
CN202111010936.9A
Other languages
Chinese (zh)
Inventor
刘磊
朱启波
魏伦灿
李烁林
曹益全
Current Assignee
Qianliyan Guangzhou Artificial Intelligence Technology Co ltd
Original Assignee
Qianliyan Guangzhou Artificial Intelligence Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Qianliyan Guangzhou Artificial Intelligence Technology Co ltd filed Critical Qianliyan Guangzhou Artificial Intelligence Technology Co ltd
Priority to CN202111010936.9A
Publication of CN113696180A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/023 Optical sensing devices including video camera means

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

The application relates to the technical field of intelligent robots, and provides a robot automatic recharging method and device, a storage medium and a robot system. The robot automatic recharging method comprises the following steps: when the robot meets the recharging condition, controlling the robot to move to a target position; shooting the feature identification code on the charging seat through a camera carried on the robot, wherein the outer boundary of the feature identification code is a black annular edge with a preset edge length determined based on the recognition distance of the robot, and a plurality of alternating black-and-white pixel blocks are arranged inside the code, the size of each pixel block likewise being determined based on the recognition distance of the robot; and determining the pose estimation of the robot according to the feature identification code, moving the robot according to the pose estimation, judging whether the robot is charged successfully, and, if so, completing the recharging. The method and device overcome the weaknesses of visual recharging in low-texture and dim-light scenes, and improve the stability and scene adaptability of the robot system.

Description

Robot automatic recharging method and device, storage medium and robot system
Technical Field
The application relates to the technical field of intelligent robots, in particular to a robot automatic recharging method, device, storage medium and robot system.
Background
With the continuous progress of science and technology, robots are widely used, for example as sweeping robots, home security robots and commercial service robots, and among their most important capabilities are automatic positioning and automatic recharging. At present, automatic positioning covers several technologies, such as inertial navigation, visual navigation and laser navigation, which offer autonomous navigation without assistance from external devices and have strong applicability.
Existing visual-navigation automatic recharging generally places a feature identification code for visual positioning on the charging seat, but existing feature identification codes are recognized poorly in low-texture and dim-light scenes, which easily causes the robot's positioning and recharging to fail.
Disclosure of Invention
The application provides a robot automatic recharging method and device, a storage medium and a robot system, aiming to solve the problem that existing feature identification codes are recognized poorly in low-texture and dim-light scenes, so that robot positioning and recharging are prone to failure.
To solve this problem, the application adopts the following technical scheme:
the application provides an automatic robot recharging method, which comprises the following steps:
when the robot meets the recharging condition, controlling the robot to move to a target position; the target position is a position in a preset relative position relation with the charging seat;
shooting the feature identification code on the charging seat through a camera carried on the robot; wherein the feature identification code is used for guiding the visual recognition of the robot, and has the following features: the outer boundary of the feature identification code is a black annular edge with a preset edge length, the preset edge length is determined based on the recognition distance of the robot, a plurality of alternating black-and-white pixel blocks are arranged inside the feature identification code, and the size of each pixel block is determined based on the recognition distance of the robot;
and determining pose estimation of the robot according to the feature identification code, moving the robot according to the pose estimation, judging whether the robot is charged successfully or not, and finishing recharging the robot if the robot is charged successfully.
Preferably, the pixel block is a rectangular pixel block, and the step of determining the pose estimation of the robot according to the feature identification code includes:
extracting a target rectangular pixel block positioned in the center of the feature identification code; the target rectangular pixel block is composed of a preset number of rectangular pixel blocks which enable the target rectangular pixel block to be square;
determining four vertexes and corresponding coordinate information of the target rectangular pixel block, and calculating the left-side pixel length of the target rectangular pixel block according to the two vertexes located at the leftmost side and the corresponding coordinate information;
calculating the right-side pixel length of the target rectangular pixel block according to the two vertexes located at the rightmost side and the corresponding coordinate information;
calculating according to the left-side pixel length, the right-side pixel length and an object image formula to obtain a left-side distance between the camera and the leftmost side of the target rectangular pixel block and a right-side distance between the camera and the rightmost side of the target rectangular pixel block;
and calculating according to the left side distance, the right side distance and a triangulation principle to obtain pose estimation of the robot.
Preferably, the step of determining the pose estimate of the robot according to the feature recognition code includes:
defining an area with a preset size in the feature identification code;
calculating the average value of all pixel points in the region;
taking a preset percentage of the average value as a target average value, setting the pixel value of the pixel point in the region smaller than the target average value as 0, and setting the pixel value of the pixel point in the region larger than or equal to the target average value as 1 to generate a target feature identification code;
and determining pose estimation of the robot according to the target feature identification code.
Further, after the step of moving the robot according to the pose estimation, the method further includes:
when the distance between the robot and the charging seat is detected to be smaller than a preset distance, acquiring distance information between the robot and the charging seat through a laser radar carried on the robot;
and controlling the movement of the robot according to the distance information.
Preferably, the step of detecting that the distance between the robot and the charging seat is less than a preset distance includes:
when the robot recognizes the last frame containing the complete feature identification code, calculating the distance between the robot and the charging seat;
acquiring depth information of the charging seat by using the laser radar, and calculating an error between the depth information and preset depth information;
judging whether the error is smaller than a preset error or not;
if so, taking the current distance between the robot and the charging seat as the preset distance.
Preferably, the step of determining whether the robot is successfully charged includes:
when receiving the charging information transmitted by the charging seat, judging that the robot is charged successfully; wherein the charging information is the charging confirmation transmitted by the infrared sensor on the charging seat after the robot has moved to a specified distance and triggered the photoelectric gate in the charging seat.
Preferably, the step of extracting the target rectangular pixel block located at the very center of the feature identification code includes:
traversing all outer contours of the feature identification codes, respectively screening the inner contour of each outer contour, and extracting the inner contour attribute of each inner contour of each outer contour;
judging whether the inner contour attribute meets a preset condition or not;
if yes, determining a target outer contour with the inner contour attribute meeting the preset condition from all the outer contours, and extracting the target rectangular pixel block according to the target outer contour.
The application provides a robot automatic recharging device, comprising:
the control module is used for controlling the robot to move to a target position when the robot meets the recharging condition; the target position is a position in a preset relative position relation with the charging seat;
the shooting module is used for shooting the feature identification code on the charging seat through a camera carried on the robot; wherein the feature identification code is used for guiding the visual recognition of the robot, and has the following features: the outer boundary of the feature identification code is a black annular edge with a preset edge length, the preset edge length is determined based on the recognition distance of the robot, a plurality of alternating black-and-white pixel blocks are arranged inside the feature identification code, and the size of each pixel block is determined based on the recognition distance of the robot;
and the moving module is used for determining pose estimation of the robot according to the feature identification code, moving the robot according to the pose estimation, judging whether the robot is charged successfully or not, and finishing recharging of the robot if the robot is charged successfully.
The application further provides a robot system comprising a charging seat and a robot, wherein the robot comprises a memory and a processor, the memory stores computer-readable instructions, and the computer-readable instructions, when executed by the processor, cause the processor to execute the steps of the robot automatic recharging method.
The present application provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the robot automatic recharging method defined in any one of the above.
Compared with the prior art, the technical scheme of the application has the following advantages:
according to the automatic robot recharging method and device, the storage medium and the robot system, when the robot meets recharging conditions, the robot is controlled to move to a target position; shooting the characteristic identification code on the charging seat through a camera carried on the robot; wherein, the characteristic identification code comprises the following characteristics: the outer boundary of the feature identification code is provided with a black annular edge with a preset edge length, the preset edge length is determined based on the identification distance of the robot, a plurality of black-white pixel blocks are arranged inside the feature identification code, and the size of each pixel block is determined based on the identification distance of the robot; and determining pose estimation of the robot according to the feature identification code, estimating the mobile robot according to the pose, judging whether the robot is charged successfully or not, and if so, finishing robot recharging. The characteristic identification code is characterized in that a black annular edge with preset side length is arranged on the outer boundary, a plurality of black-white pixel blocks are arranged in the characteristic identification code, the size of each pixel block is determined based on the identification distance of the robot, the characteristic identification code can adapt to charging at different distances, the defects of low texture and visual identification under the dark light condition of visual recharging are overcome, the stability and scene adaptability of a robot system are improved, and the charging success is ensured.
Drawings
FIG. 1 is a flowchart of a robot automatic recharging method according to an embodiment of the present disclosure;
FIG. 2 is a block diagram of an embodiment of an automatic robot recharging device of the present application;
fig. 3 is a block diagram of an internal structure of a robot according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
In some of the flows described in the specification, claims and figures of this application, a number of operations appear in a particular order, but it should be clearly understood that these operations may be executed out of the order in which they appear herein, or in parallel. The operation numbers, e.g. S11, S12, etc., are used merely to distinguish the operations and do not themselves represent any required order of execution. Additionally, the flows may include more or fewer operations, and the operations may be executed sequentially or in parallel. It should be noted that the descriptions of "first", "second", etc. herein are used to distinguish different messages, devices, modules, etc.; they do not represent a sequential order, nor do they limit "first" and "second" to different types.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those of ordinary skill in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, wherein the same or similar reference numerals refer to the same or similar elements or elements with the same or similar functions throughout. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, in an embodiment of a robot automatic recharging method provided in the present application, the robot automatic recharging method includes the following steps:
s11, when the robot meets the recharging condition, controlling the robot to move to the target position; the target position is a position in a preset relative position relation with the charging seat;
s12, shooting the feature identification code on the charging seat through a camera carried on the robot; wherein the feature identification code is used for guiding the robot to visually recognize, and the feature identification code comprises the following features: the outer boundary of the feature identification code is provided with a black annular edge with a preset edge length, the preset edge length is determined based on the identification distance of the robot, a plurality of pixel blocks which are black and white and alternate are arranged inside the feature identification code, and the size of each pixel block is determined based on the identification distance of the robot;
and S13, determining pose estimation of the robot according to the feature identification code, moving the robot according to the pose estimation, judging whether the robot is charged successfully or not, and if so, finishing recharging the robot.
As described in step S11, when the robot needs to be recharged to ensure continuous operation, for example because its battery power is insufficient, the robot may be considered to satisfy the recharging condition.
Preferably, the robot is determined to satisfy the recharging condition when at least one of the following conditions holds (a minimal check is sketched after the list):
the battery level of the robot is lower than a preset value;
the robot has finished all of its current tasks;
the robot receives a recharging instruction;
the charging process of the robot is passively interrupted.
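As a minimal illustration of this check, the following sketch tests the four conditions above; the field names and the low-battery threshold are illustrative assumptions, not values prescribed by this application.

```python
from dataclasses import dataclass

@dataclass
class RobotState:
    battery_level: float        # 0.0 - 1.0, current battery charge
    tasks_pending: int          # number of unfinished tasks
    recharge_requested: bool    # an explicit recharging instruction was received
    charge_interrupted: bool    # a previous charging process ended passively

LOW_BATTERY = 0.20  # assumed preset electric quantity value

def meets_recharge_condition(s: RobotState) -> bool:
    """The robot satisfies the recharging condition if any one item holds."""
    return (s.battery_level < LOW_BATTERY
            or s.tasks_pending == 0
            or s.recharge_requested
            or s.charge_interrupted)
```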
When determining the target position, the camera on the robot can capture an image of the charging seat from an arbitrary viewpoint, and the image is processed to obtain a target position with a fixed positional relation to the charging seat, for example a position five centimeters in front of the charging seat.
As described in step S12, because a low-texture feature identification code has a single color, global binarization easily confuses its outer contour with the background, so the outer-contour features are lost after global binarization. The outer boundary of the feature identification code is therefore set to a black annular edge with a preset edge length; this black ring effectively isolates the interior of the code from the external background, so that the outer contour can be distinguished from the background and recognized reliably. Specifically, the feature identification code guides the visual recognition of the robot, and the preset edge length is determined based on the recognition distance of the robot: for example, when the robot needs to start charging from farther away, the preset edge length can be made larger, so the preset edge length is proportional to the recognition distance of the robot. In addition, a plurality of alternating black-and-white pixel blocks are provided inside the feature identification code, and the size of each pixel block is likewise determined based on the recognition distance of the robot.
In an embodiment, the feature identification code is processed by an adaptive equalization method, which makes the scene binarization more discrete, retains more features and avoids the loss of low-texture features.
The feature identification code adopted by the application can effectively avoid interference from other feature textures in a low-texture scene, and also supports effective encryption, recognition and positioning of the features.
In step S13, the pose estimation of the robot is determined according to the feature identification code, the robot is moved according to the pose estimation, after the robot moves to a certain position, whether the robot is charged successfully is determined, and if yes, the robot recharging is completed.
Pose estimation plays a very important role in the field of computer vision, with wide application in controlling robots from visual-sensor pose estimates, robot navigation, augmented reality and other areas. The basis of pose estimation is to find corresponding points between the real world and their image projections, and then to apply a pose estimation method suited to the type of point pairs, such as 2D-2D, 2D-3D or 3D-3D. Methods for the same type of point pairs can further be divided into algebraic methods and nonlinear-optimization-based methods, such as direct linear transformation and bundle adjustment. The process of estimating pose from known points is generally called solving PnP.
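As a hedged illustration of the 2D-3D case mentioned above, the following sketch solves PnP with OpenCV's solvePnP; the tag size, corner coordinates and camera intrinsics are illustrative assumptions rather than values from this application.

```python
import cv2
import numpy as np

# 3D corners of the central square block in the tag's own frame (meters);
# tag_size is a hypothetical edge length.
tag_size = 0.10
object_points = np.array([[0.0, 0.0, 0.0],
                          [tag_size, 0.0, 0.0],
                          [tag_size, tag_size, 0.0],
                          [0.0, tag_size, 0.0]], dtype=np.float64)

# The same four corners detected in the image (pixels), e.g. from the
# contour screening described later in this document.
image_points = np.array([[320.0, 240.0],
                         [400.0, 242.0],
                         [398.0, 318.0],
                         [322.0, 316.0]], dtype=np.float64)

camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume an undistorted image

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
# rvec/tvec give the tag's pose in the camera frame; inverting this
# transform yields the camera (robot) pose relative to the charging seat.
```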
According to the robot automatic recharging method provided by the application, when the robot meets the recharging condition, the robot is controlled to move to a target position; the feature identification code on the charging seat is shot through a camera carried on the robot, wherein the outer boundary of the feature identification code is a black annular edge with a preset edge length determined based on the recognition distance of the robot, a plurality of alternating black-and-white pixel blocks are arranged inside the code, and the size of each pixel block is determined based on the recognition distance of the robot; the pose estimation of the robot is determined according to the feature identification code, the robot is moved according to the pose estimation, whether the robot is charged successfully is judged, and if so, recharging is completed. Because the outer boundary of the feature identification code carries a black annular edge with a preset edge length, its interior carries a plurality of black-and-white pixel blocks, and the size of each pixel block is determined based on the recognition distance of the robot, the code can adapt to charging at different distances, overcomes the weaknesses of visual recharging in low-texture and dim-light scenes, improves the stability and scene adaptability of the robot system, and ensures successful charging.
In an embodiment, the pixel block is a rectangular pixel block, and the step of determining the pose estimation of the robot according to the feature identification code includes:
extracting a target rectangular pixel block positioned in the center of the feature identification code; the target rectangular pixel block is composed of a preset number of rectangular pixel blocks which enable the target rectangular pixel block to be square;
determining four vertexes and corresponding coordinate information of the target rectangular pixel block, and calculating the left-side pixel length of the target rectangular pixel block according to the two vertexes located at the leftmost side and the corresponding coordinate information;
calculating the right-side pixel length of the target rectangular pixel block according to the two vertexes located at the rightmost side and the corresponding coordinate information;
calculating according to the left-side pixel length, the right-side pixel length and an object image formula to obtain a left-side distance between the camera and the leftmost side of the target rectangular pixel block and a right-side distance between the camera and the rightmost side of the target rectangular pixel block;
and calculating according to the left side distance, the right side distance and a triangulation principle to obtain pose estimation of the robot.
In this embodiment, after the feature identification code is acquired by the camera, the target rectangular pixel block located at the center of the feature identification code is extracted, where the target rectangular pixel block is formed by a preset number of rectangular pixel blocks that make the target rectangular pixel block in a square shape, for example, a target rectangular pixel block in a square shape is formed by four rectangular pixel blocks.
Then the four vertexes of the target rectangular pixel block are located and their coordinate information determined. The left-side pixel length is calculated from the coordinates of the two leftmost vertexes, and the right-side pixel length from the coordinates of the two rightmost vertexes. The distance between the camera and the leftmost side of the target rectangular pixel block is then calculated through the object-image formula to obtain the left-side distance, and likewise the distance between the camera and the rightmost side to obtain the right-side distance. Finally, the pose estimation of the robot is calculated from the left-side distance, the right-side distance and the triangulation principle, so the pose of the robot is determined accurately.
The object-image formula uses the pinhole imaging principle as applied to the camera pinhole model: the plane of the camera CCD is taken as the imaging plane, the camera lens as the pinhole, and the shot target as the object.
The right-side distance (and likewise the left-side distance) is calculated from the similar-triangles relation:
D*h = d*H,
where D is the distance from the camera CCD to the camera lens, H is the image height, d is the distance from the object to the camera lens, and h is the object height. In the camera model, D and h are constant values, that is, D*h = C.
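A minimal sketch of this relation: since D and h are constant, their product C = D*h can be calibrated once from a known distance, after which the object distance follows as d = C/H. The numbers below are illustrative assumptions.

```python
def calibrate_c(known_distance_m: float, image_height_px: float) -> float:
    """One-shot calibration: C = d * H measured at a known distance d."""
    return known_distance_m * image_height_px

def object_distance(c: float, image_height_px: float) -> float:
    """Recover the object distance d = C / H from the measured image height H."""
    return c / image_height_px

# Example: the block edge spans 120 px at a known distance of 1.0 m,
# so C = 120; later the edge spans 60 px, giving a distance of 2.0 m.
C = calibrate_c(1.0, 120.0)
print(object_distance(C, 60.0))  # -> 2.0
```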
The triangulation positioning principle is specifically as follows:
According to the law of cosines, a^2 = b^2 + c^2 - 2bc*cos A, so when the three sides of a triangle are known, the value of any interior angle can be calculated. In this triangle model, a is the distance from the camera to the leftmost side of the target rectangular pixel block, b is the distance from the camera to the rightmost side, and c is the physical left-to-right width of the pixel block itself, so all parameters of the triangle can be calculated.
In one embodiment, the step of determining a pose estimate of the robot from the feature identifier comprises:
defining an area with a preset size in the feature identification code;
calculating the average value of all pixel points in the region;
taking a preset percentage of the average value as a target average value, setting the pixel value of the pixel point in the region smaller than the target average value as 0, and setting the pixel value of the pixel point in the region larger than or equal to the target average value as 1 to generate a target feature identification code;
and determining pose estimation of the robot according to the target feature identification code.
This embodiment provides an improved feature recognition algorithm to address the following problem: in low-texture and dim-light scenes, a global binarization method lets the "background" of the picture carry too much weight during binarization, so local feature points are filtered out and the recognition effect degrades. The application therefore adopts a local adaptive mean binarization method to avoid interference from a large background.
Specifically, a region of size S is set inside the feature identification code (in actual testing, setting S to 1/8 of the shorter dimension of the camera resolution gave the best effect);
calculating a mean value D of all pixel points of the region S, and taking a preset percentage of the mean value as a target mean value, preferably, the preset percentage is 85%;
if a pixel value is less than 85% of the mean value D, the pixel is set to 0; otherwise it is set to 1.
By the method, the pixel point is not influenced by global conditions and is only related to adjacent pixels, so that the adaptability is stronger, and the binarization effect is better under the conditions of uneven illumination and overlarge background.
The step of calculating the mean value D of all the pixels in the region S is specifically as follows:
Assume the coordinates of the current pixel are (x, y), the side length of the square region S is s, and the pixel value at (x_i, y_i) is P_{x_i y_i}. The mean value is then
D = (1 / s^2) * Σ P_{x_i y_i},
where 0 ≤ x - s ≤ x_i ≤ x and 0 ≤ y - s ≤ y_i ≤ y.
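A minimal NumPy sketch of this local mean binarization, assuming a grayscale input image; using scipy's uniform_filter for the windowed mean is an implementation assumption, while the 85% factor and the S = 1/8 rule follow the text above.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_mean_binarize(gray: np.ndarray, s: int, percent: float = 0.85) -> np.ndarray:
    """gray: 2-D uint8 image; returns a 0/1 array of the same shape."""
    # Note: uniform_filter uses a centered window, a minor deviation from
    # the trailing (x-s..x, y-s..y) window in the formula above.
    local_mean = uniform_filter(gray.astype(np.float64), size=s)
    threshold = percent * local_mean
    return (gray >= threshold).astype(np.uint8)

# Per the text, s is chosen as 1/8 of the shorter image dimension:
# binary = local_mean_binarize(img, s=min(img.shape) // 8)
```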
In one embodiment, the step of extracting the target rectangular pixel block located at the very center of the feature identification code includes:
traversing all outer contours of the feature identification codes, respectively screening the inner contour of each outer contour, and extracting the inner contour attribute of each inner contour of each outer contour;
judging whether the inner contour attribute meets a preset condition or not;
if yes, determining a target outer contour with the inner contour attribute meeting the preset condition from all the outer contours, and extracting the target rectangular pixel block according to the target outer contour.
Contour extraction in traditional recognition technology generally establishes a two-level relation among all the contours of the feature identification code: the outer contour contains the inner contour, and the inner contour may in turn contain nested contours. The positions of the inner points are then screened through the outer contour. However, because the inner points can only represent a short distance, the inner contours (small pixel blocks) are densely arranged and close to each other, so the recognition distance is very short and mutual interference occurs easily. The application therefore screens the outer contours by means of the inner contours (the number and position of the squares are custom-designed). Specifically, all outer contours of the feature identification code are traversed, and the attributes of the inner contours (the small pixel-block contours inside the annular frame) of each outer contour are screened. When the inner-contour attributes satisfy all the conditions, for example when the coordinate information of the leftmost position in the inner contour meets the requirement, or the inner-contour attributes contain feature points matching the key information, the outer contour whose inner-contour attributes satisfy all the conditions is taken as the target outer contour, and the area it encloses is taken as the target rectangular pixel block. Using the target outer contour for positioning increases the positioning distance and makes full use of the larger size of the outer contour; contours that do not satisfy the conditions are filtered out, and quasi-square contours are screened out. The screening conditions are as follows:
contains a sub-contour;
has no parent contour;
the minimum bounding rectangle of the contour has an area of no less than 16 (anything smaller may be a noise point);
a quadrilateral fit with a maximum error of 4 is satisfied;
the fitted contour is a convex polygon;
the diagonals d1 and d2, the two adjacent sides d3 and d4, the contour area, and the perimeter p of the target rectangular pixel block satisfy: d3*4 > d4 && d4*4 > d3 && d3*d4 < area*1.5 && area > 15 && d1 >= 0.15*p && d2 >= 0.15*p.
The screened outer contour meets the following conditions:
has no parent contour;
the number of sub-contours satisfies: number of screened sub-contours < number of designed squares * 1.5, and number of screened sub-contours > number of designed squares * 0.5;
a quadrilateral fit with a maximum pixel error of 3 is satisfied;
quadrilaterals that are not approximately rectangular are filtered out;
sub-pixel extraction is performed on the four vertexes of the target rectangular pixel block;
At this point, the 4 vertices of the target rectangular pixel block have been located.
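The following OpenCV sketch illustrates this screening flow: keep contours that have sub-contours but no parent, check the sub-contour count against the designed number of squares, fit a convex quadrilateral, and refine its vertices to sub-pixel accuracy. The thresholds mirror the conditions listed above; everything else is an assumption, not the application's exact implementation.

```python
import cv2
import numpy as np

def find_target_quad(binary: np.ndarray, designed_squares: int):
    """Screen outer contours by their inner contours and return the four
    corners of the candidate target rectangular pixel block, or None."""
    contours, hierarchy = cv2.findContours(
        binary, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
    if hierarchy is None:
        return None
    hierarchy = hierarchy[0]  # rows: [next, prev, first_child, parent]
    for i, cnt in enumerate(contours):
        if hierarchy[i][2] == -1 or hierarchy[i][3] != -1:
            continue  # must contain sub-contours and have no parent contour
        x, y, w, h = cv2.boundingRect(cnt)
        if w * h < 16:
            continue  # too small, likely a noise point
        n_children = sum(1 for row in hierarchy if row[3] == i)
        if not (0.5 * designed_squares < n_children < 1.5 * designed_squares):
            continue  # sub-contour count must match the designed squares
        quad = cv2.approxPolyDP(cnt, 3, True)  # max pixel error of 3
        if len(quad) != 4 or not cv2.isContourConvex(quad):
            continue  # filter shapes that are not roughly rectangular
        return quad.reshape(4, 2).astype(np.float32)
    return None

def refine_corners(gray: np.ndarray, corners: np.ndarray) -> np.ndarray:
    """Sub-pixel extraction of the four located vertices."""
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01)
    return cv2.cornerSubPix(gray, corners.reshape(-1, 1, 2),
                            (5, 5), (-1, -1), criteria)
```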
This embodiment improves the binarization used in existing key-point positioning, so that a binarized pixel is not influenced by global conditions and depends only on its neighboring pixels, giving stronger adaptability and a better binarization effect under uneven illumination, a large background and a small feature identification code. Meanwhile, the outer contour is screened through the inner contours and is used for positioning, so the positioning distance is longer and the larger size of the outer contour is fully exploited.
In an embodiment, after the step of moving the robot according to the pose estimation, the method further comprises:
when the distance between the robot and the charging seat is detected to be smaller than a preset distance, acquiring distance information between the robot and the charging seat through a laser radar carried on the robot;
and controlling the movement of the robot according to the distance information.
Limited by the size of the robot charging seat, the distance over which visual recognition works is short, and the traditional visual positioning code cannot meet the requirement of long-range recognition in large scenes, so the robot handles abnormal situations poorly during recharging. Meanwhile, owing to the technical characteristics of an RGBD camera lens, there is a visual blind zone at close range, and recognition during motion easily fails in dim-light scenes, so purely visual recharging adapts poorly to different scenes. For example, owing to the technical limitations of the camera itself, when the camera is too close to the target there is a blind zone of a certain distance (the value differs between cameras and may typically be set to 10 cm) within which feature matching and positioning cannot be performed; if the robot relied only on odometer motion information at this point, recharging could fail or the device could be damaged.
Therefore, to overcome these problems, the application fuses the laser depth point-cloud data and automatically switches to laser matching to execute the recharging instruction within the close range, which greatly improves the scene adaptability of the automatic recharging technique and overcomes the close-range blind zone of visual recognition and the weakness of visual recognition in dim light.
In an embodiment, the step of detecting that the distance between the robot and the charging stand is less than a preset distance includes:
when the robot recognizes the last frame containing the complete feature identification code, calculating the distance between the robot and the charging seat;
acquiring depth information of the charging seat by using the laser radar, and calculating an error between the depth information and preset depth information;
judging whether the error is smaller than a preset error or not;
if so, taking the current distance between the robot and the charging seat as the preset distance.
In this embodiment, within the last preset distance of the recharging process, the camera can no longer recognize the feature identification code because of the focal-length limitation of the camera and the size of the code; at this point the robot can only rely on communication with the charging seat for motion control to prevent collision. However, the pose of the robot cannot be adjusted through communication with the charging seat alone, and external interference (such as the charging seat being moved, an abnormal power failure, or a foreign object entering) would make recharging fail. To improve the stability of the whole robot system, the invention uses the robot's built-in laser radar to obtain the distance information between the robot and the charging seat and performs a final stage of motion control.
Specifically, the depth information of the charging seat can be acquired with the laser radar, and inter-frame matching is performed on the depth information to determine the error between it and the preset depth information. The inter-frame matching algorithms include a laser intensity information inter-frame matching algorithm and a laser depth information inter-frame matching algorithm:
Laser intensity information inter-frame matching algorithm: the laser intensity information represents the intensity of photons in the invisible band that are emitted from the laser generator onto the object surface and reflected back to the laser receiver; all point-cloud data collected by the laser radar at a certain angular resolution or time is taken as one frame of data for the current period. As the robot moves, the environment undergoes small discrete changes; each frame of real-time intensity data from the laser radar serves as the perception of the environment at that moment, and the changes between two frames of laser intensity point clouds are compared and matched to obtain their mutual transformation relation. This is the laser intensity information inter-frame matching algorithm.
Laser depth information inter-frame matching algorithm: the laser depth information follows the same principle as the laser intensity information, except that it is distance information obtained by converting the photon time of flight, from emission to reception, through the speed of light. Compared with laser intensity information, the inter-frame environmental features are more distinct, the matching effect is better, and the scene applicability is wider.
It is then judged whether the error is smaller than the preset error; when it is, the current distance between the robot and the charging seat is taken as the preset distance, and within this distance the visual recognition positioning mode is switched to the laser positioning mode to achieve accurate short-range positioning and ensure successful charging.
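A minimal sketch of this hand-off decision, assuming the live lidar scan of the charging seat and the stored reference profile are aligned arrays of depths; the error metric and bound are illustrative assumptions.

```python
import numpy as np

PRESET_ERROR = 0.02  # meters; an assumed error bound

def should_switch_to_laser(lidar_depths: np.ndarray,
                           preset_depths: np.ndarray) -> bool:
    """Compare the live depth profile of the charging seat against the
    stored reference profile; switch modes when they agree closely."""
    error = float(np.mean(np.abs(lidar_depths - preset_depths)))
    return error < PRESET_ERROR

# In the recharge loop: when the last complete tag frame is recognized,
# compute the camera distance to the seat; if should_switch_to_laser(...)
# holds, record that distance as the preset distance and steer the rest
# of the approach from lidar depth alone.
```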
In one embodiment, the step of determining whether the robot is successfully charged includes:
when receiving the charging information transmitted by the charging seat, it is judged that the robot is charged successfully; the charging information is the charging confirmation transmitted by the infrared sensor on the charging seat after the robot has moved to the specified distance and triggered the photoelectric gate in the charging seat.
When the robot recognizes the last frame containing the complete feature identification code, the distance between the robot and the charging seat is calculated; at the same time the laser radar acquires the depth information of the charging seat for comparison, and if the error is within the set range, this distance is taken as the robot's final recharging distance. Beyond this point visual recognition and positioning fail, and the robot's motion is controlled and stopped through the laser depth information. When the robot moves to the specified distance, the photoelectric gate in the charging seat is triggered; the charging seat and the robot then perform a handshake confirmation through the infrared sensor, the charging seat is powered on after confirmation, the infrared sensor on the charging seat transmits the charging confirmation to the robot, and on receiving it the robot completes the charging instruction.
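The final docking sequence can be pictured with the following sketch; every device interface here (photogate, infrared send/receive, laser-guided stepping) is a hypothetical placeholder, since the application does not prescribe a driver API.

```python
import time

def dock(robot, seat, timeout_s: float = 30.0) -> bool:
    """Drive the final approach by laser until the photoelectric gate in
    the charging seat trips, then complete the infrared handshake."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if seat.photogate_triggered():
            robot.ir_send("HANDSHAKE")           # handshake confirmation
            if robot.ir_receive() == "CHARGE_CONFIRMED":
                robot.stop()                     # charging instruction done
                return True
        robot.step_toward_seat_by_laser_depth()  # laser-guided motion
        time.sleep(0.05)
    return False
```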
Referring to fig. 2, an embodiment of the present application further provides an automatic robot recharging device, wherein,
the control module 11 is used for controlling the robot to move to a target position when the robot meets the recharging condition; the target position is a position in a preset relative position relation with the charging seat;
a shooting module 12, configured to shoot the feature identification code on the charging seat through a camera mounted on the robot; wherein the feature identification code is used for guiding the visual recognition of the robot, and has the following features: the outer boundary of the feature identification code is a black annular edge with a preset edge length, the preset edge length is determined based on the recognition distance of the robot, a plurality of alternating black-and-white pixel blocks are arranged inside the feature identification code, and the size of each pixel block is determined based on the recognition distance of the robot;
and the moving module 13 is configured to determine pose estimation of the robot according to the feature identification code, move the robot according to the pose estimation, determine whether the robot is successfully charged, and complete recharging of the robot if the robot is successfully charged.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The robot system provided by the application comprises a charging seat and a robot, wherein the robot comprises a memory and a processor; the memory stores computer-readable instructions which, when executed by the processor, cause the processor to execute the steps of the robot automatic recharging method.
In one embodiment, as shown in fig. 3, the robot comprises a processor 402, a memory 403, an input unit 404, a display unit 405, etc. Those skilled in the art will appreciate that the device structure shown in fig. 3 does not constitute a limitation on all devices, and more or fewer components than those shown, or combinations of some components, may be included. The memory 403 may be used to store the computer program 401 and the functional modules, and the processor 402 runs the computer program 401 stored in the memory 403 to execute the various functional applications of the device and to process data. The memory may be an internal or external memory, or include both. The internal memory may comprise read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, or random access memory. The external memory may include a hard disk, a floppy disk, a ZIP disk, a USB disk, a magnetic tape, etc. The memories disclosed herein include, but are not limited to, these types; they are given by way of example only, not limitation.
The input unit 404 is used for receiving input signals and keywords entered by the user. The input unit 404 may include a touch panel and other input devices. The touch panel can collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel with a finger, a stylus or any other suitable object or accessory) and drive the corresponding connecting device according to a preset program; other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., play control keys, switch keys, etc.), a trackball, a mouse, a joystick and the like. The display unit 405 may be used to display information input by the user, information provided to the user and the various menus of the computer device, and may take the form of a liquid crystal display, an organic light-emitting diode display, or the like. The processor 402 is the control center of the computer device; it connects the various parts of the entire computer using various interfaces and lines, and performs various functions and processes data by running or executing the software programs and/or modules stored in the memory 403 and calling the data stored in the memory.
As one embodiment, the robot includes: one or more processors 402, a memory 403, and one or more computer programs 401, wherein the one or more computer programs 401 are stored in the memory 403 and configured to be executed by the one or more processors 402, and the one or more computer programs 401 are configured to perform the robot automatic recharging method of the above embodiments.
In one embodiment, the present application also proposes a storage medium storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the robot automatic recharging method described above. For example, the storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program, which may be stored in a storage medium and executed by a computer, and the processes of the embodiments of the methods may be included. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
In combination with the above embodiments, the application has the following beneficial effects:
according to the automatic robot recharging method and device, the storage medium and the robot system, when the robot meets recharging conditions, the robot is controlled to move to a target position; shooting the characteristic identification code on the charging seat through a camera carried on the robot; wherein, the characteristic identification code comprises the following characteristics: the outer boundary of the feature identification code is provided with a black annular edge with a preset edge length, the preset edge length is determined based on the identification distance of the robot, a plurality of black-white pixel blocks are arranged inside the feature identification code, and the size of each pixel block is determined based on the identification distance of the robot; and determining pose estimation of the robot according to the feature identification code, estimating the mobile robot according to the pose, judging whether the robot is charged successfully or not, and if so, finishing robot recharging. The characteristic identification code is characterized in that a black annular edge with preset side length is arranged on the outer boundary, a plurality of black-white pixel blocks are arranged in the characteristic identification code, the size of each pixel block is determined based on the identification distance of the robot, the characteristic identification code can adapt to charging at different distances, the defects of low texture and visual identification under the dark light condition of visual recharging are overcome, the stability and scene adaptability of a robot system are improved, and the charging success is ensured.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A robot automatic recharging method, comprising:
when the robot meets the recharging condition, controlling the robot to move to a target position; the target position is a position in a preset relative position relation with the charging seat;
shooting the feature identification code on the charging seat through a camera carried on the robot; wherein the feature identification code is used for guiding the visual recognition of the robot, and has the following features: the outer boundary of the feature identification code is a black annular edge with a preset edge length, the preset edge length is determined based on the recognition distance of the robot, a plurality of alternating black-and-white pixel blocks are arranged inside the feature identification code, and the size of each pixel block is determined based on the recognition distance of the robot;
and determining pose estimation of the robot according to the feature identification code, moving the robot according to the pose estimation, judging whether the robot is charged successfully or not, and finishing recharging the robot if the robot is charged successfully.
2. The method of claim 1, wherein the pixel block is a rectangular pixel block, and wherein the step of determining the pose estimate of the robot from the feature identifier comprises:
extracting a target rectangular pixel block positioned in the center of the feature identification code; the target rectangular pixel block is composed of a preset number of rectangular pixel blocks which enable the target rectangular pixel block to be square;
determining four vertexes and corresponding coordinate information of the target rectangular pixel block, and calculating the left-side pixel length of the target rectangular pixel block according to the two vertexes located at the leftmost side and the corresponding coordinate information;
calculating the right-side pixel length of the target rectangular pixel block according to the two vertexes located at the rightmost side and the corresponding coordinate information;
calculating according to the left-side pixel length, the right-side pixel length and an object image formula to obtain a left-side distance between the camera and the leftmost side of the target rectangular pixel block and a right-side distance between the camera and the rightmost side of the target rectangular pixel block;
and calculating according to the left side distance, the right side distance and a triangulation principle to obtain pose estimation of the robot.
3. The method of claim 1, wherein the step of determining a pose estimate for the robot from the feature identifier comprises:
defining an area with a preset size in the feature identification code;
calculating the average value of all pixel points in the region;
taking a preset percentage of the average value as a target average value, setting the pixel value of the pixel point in the region smaller than the target average value as 0, and setting the pixel value of the pixel point in the region larger than or equal to the target average value as 1 to generate a target feature identification code;
and determining pose estimation of the robot according to the target feature identification code.
4. The method of claim 1, wherein the step of moving the robot according to the pose estimate is further followed by:
when the distance between the robot and the charging seat is detected to be smaller than a preset distance, acquiring distance information between the robot and the charging seat through a laser radar carried on the robot;
and controlling the movement of the robot according to the distance information.
5. The method of claim 4, wherein the step of detecting that the distance between the robot and the charging seat is less than a preset distance comprises:
when the robot recognizes the last frame containing the complete feature identification code, calculating the distance between the robot and the charging seat;
acquiring depth information of the charging seat by using the laser radar, and calculating an error between the depth information and preset depth information;
judging whether the error is smaller than a preset error or not;
if so, taking the current distance between the robot and the charging seat as the preset distance.
6. The method of claim 1, wherein the step of judging whether the robot has been charged successfully comprises:
judging that the robot has been charged successfully upon receiving charging information transmitted by the charging seat; wherein the charging information is charging confirmation information transmitted by an infrared sensor on the charging seat after the robot has moved to a specified distance and triggered a photogate in the charging seat.
7. The method of claim 2, wherein the step of extracting the target rectangular pixel block located at the center of the feature identification code comprises:
traversing all outer contours of the feature identification code, screening the inner contours of each outer contour, and extracting the inner-contour attributes of each inner contour;
judging whether the inner-contour attributes meet a preset condition;
and if so, determining, from all the outer contours, a target outer contour whose inner-contour attributes meet the preset condition, and extracting the target rectangular pixel block according to the target outer contour.
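A sketch of this contour search using OpenCV's two-level contour hierarchy. The "preset condition" on inner-contour attributes is not specified in the claim, so a near-square bounding box of minimum size is used here as an assumed stand-in.

```python
import cv2
import numpy as np

def find_target_outer_contour(binary_img: np.ndarray):
    """Traverse outer contours, screen their inner contours, and return the
    outer contour whose inner contour meets the (assumed) preset condition."""
    contours, hierarchy = cv2.findContours(
        binary_img, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE)
    if hierarchy is None:
        return None
    # hierarchy[0][i] = [next, previous, first_child, parent]
    for idx, (_, _, _, parent) in enumerate(hierarchy[0]):
        if parent == -1:
            continue  # top-level (outer) contour; inspect inner contours only
        x, y, w, h = cv2.boundingRect(contours[idx])
        if w > 10 and 0.9 < w / float(h) < 1.1:  # assumed: roughly square
            return contours[parent]  # the target outer contour
    return None
```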
8. A robot automatic recharging device, comprising:
a control module configured to control the robot to move to a target position when the robot meets a recharging condition, the target position being in a preset relative positional relationship with the charging seat;
a shooting module configured to shoot, by a camera mounted on the robot, the feature identification code on the charging seat; wherein the feature identification code is used to guide visual recognition by the robot and has the following features: the outer boundary of the feature identification code carries a black annular border of a preset edge length, the preset edge length being determined based on the recognition distance of the robot, and the interior of the feature identification code is provided with a plurality of alternating black and white pixel blocks, the size of each pixel block being determined based on the recognition distance of the robot;
and a moving module configured to determine a pose estimate of the robot according to the feature identification code, move the robot according to the pose estimate, judge whether the robot has been charged successfully, and, if so, complete the recharging of the robot.
9. A robot system comprising a charging seat and a robot, characterized in that the robot comprises a memory and a processor, the memory having stored therein computer-readable instructions which, when executed by the processor, cause the processor to perform the steps of the robot automatic recharging method of any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, carries out the robot automatic recharging method according to any one of claims 1 to 7.
CN202111010936.9A 2021-08-31 2021-08-31 Robot automatic recharging method and device, storage medium and robot system Pending CN113696180A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111010936.9A CN113696180A (en) 2021-08-31 2021-08-31 Robot automatic recharging method and device, storage medium and robot system

Publications (1)

Publication Number Publication Date
CN113696180A 2021-11-26

Family

ID=78657822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111010936.9A Pending CN113696180A (en) 2021-08-31 2021-08-31 Robot automatic recharging method and device, storage medium and robot system

Country Status (1)

Country Link
CN (1) CN113696180A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109062207A * 2018-08-01 2018-12-21 深圳乐动机器人有限公司 Localization method and device for a charging cradle, robot, and storage medium
CN110543178A (en) * 2019-09-30 2019-12-06 深圳市银星智能科技股份有限公司 Robot recharging method and system, robot and charging station
CN110673612A (en) * 2019-10-21 2020-01-10 重庆邮电大学 Two-dimensional code guide control method for autonomous mobile robot
CN111697651A * 2020-06-17 2020-09-22 上海交通大学医学院 Charging pile, autonomous mobile device, and autonomous charging system
CN112183133A (en) * 2020-08-28 2021-01-05 同济大学 Aruco code guidance-based mobile robot autonomous charging method
CN112346453A (en) * 2020-10-14 2021-02-09 深圳市杉川机器人有限公司 Automatic robot recharging method and device, robot and storage medium
CN112748737A * 2020-12-28 2021-05-04 上海电机学院 Laser charging method based on trinocular visual pose estimation for a patrol robot
CN112886670A (en) * 2021-03-04 2021-06-01 武汉联一合立技术有限公司 Charging control method and device for robot, robot and storage medium
CN113258638A (en) * 2021-05-14 2021-08-13 江苏天策机器人科技有限公司 Intelligent robot charging method and system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114355889A (en) * 2021-12-08 2022-04-15 上海擎朗智能科技有限公司 Control method, robot charging stand, and computer-readable storage medium
CN114397886A (en) * 2021-12-20 2022-04-26 烟台杰瑞石油服务集团股份有限公司 Charging method and charging system
CN114397886B (en) * 2021-12-20 2024-01-23 烟台杰瑞石油服务集团股份有限公司 Charging method and charging system
CN115137255A (en) * 2022-06-29 2022-10-04 深圳市优必选科技股份有限公司 Charging abnormity processing method and device, readable storage medium and sweeping robot
CN115137255B (en) * 2022-06-29 2023-11-21 深圳市优必选科技股份有限公司 Charging abnormity processing method and device, readable storage medium and sweeping robot
CN115933706A (en) * 2023-02-07 2023-04-07 科大讯飞股份有限公司 Robot charging method and device, robot and robot system

Similar Documents

Publication Publication Date Title
CN113696180A (en) Robot automatic recharging method and device, storage medium and robot system
CN109890573B (en) Control method and device for mobile robot, mobile robot and storage medium
US10810456B2 (en) Apparatus and methods for saliency detection based on color occurrence analysis
EP3278058B1 (en) Imager for detecting visual light and infrared projected patterns
KR101776622B1 (en) Apparatus for recognizing location mobile robot using edge based refinement and method thereof
CN110622085A (en) Mobile robot and control method and control system thereof
CN106575438B (en) Combination of Stereoscopic and Structured Light Processing
CN108247647A A cleaning robot
US9460339B2 (en) Combined color image and depth processing
CN109213137A (en) sweeping robot, sweeping robot system and its working method
CN109241820B (en) Unmanned aerial vehicle autonomous shooting method based on space exploration
CN109671115A (en) The image processing method and device estimated using depth value
CN110084243B (en) File identification and positioning method based on two-dimensional code and monocular camera
KR20110011424A (en) Method for recognizing position and controlling movement of a mobile robot, and the mobile robot using the same
US20200145639A1 (en) Portable 3d scanning systems and scanning methods
CN110347153A A boundary recognition method and system, and a mobile robot
CN108459597A A mobile electronic device and method for handling tasks in a mission area
WO2020034963A1 (en) Charging device identification method, mobile robot and charging device identification system
CN111665826A (en) Depth map acquisition method based on laser radar and monocular camera and sweeping robot
CN112204345A (en) Indoor positioning method of mobile equipment, mobile equipment and control system
CN112034837A (en) Method for determining working environment of mobile robot, control system and storage medium
CN209991983U (en) Obstacle detection equipment and unmanned aerial vehicle
Zhong et al. Stairway detection using Gabor filter and FFPG
CN111736596A (en) Vehicle with gesture control function, gesture control method of vehicle, and storage medium
CN116795117A (en) Automatic recharging method and device for robot, storage medium and robot system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20211126