CN109612469B - Method for searching position of charging base by robot and robot - Google Patents

Publication number: CN109612469B (application CN201910031044.3A)
Authority: CN (China)
Prior art keywords: obstacle, image, intersection point, robot, boundary
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN109612469A
Inventors: 王晓佳, 谌鎏, 郭盖华
Current and original assignee: Shenzhen LD Robot Co Ltd
Filing history: application CN201910031044.3A filed by Shenzhen LD Robot Co Ltd; publication of CN109612469A; application granted; publication of CN109612469B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20 Instruments for performing navigational calculations
    • G01C 21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0219 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/12 Target-seeking control

Abstract

The invention belongs to the technical field of robots and provides a method for a robot to search for the position of its charging base, together with a corresponding robot. The method comprises: acquiring an electronic map of the area to be searched; determining a path for searching for the charging base position according to the distance between any two obstacles in the electronic map, the two obstacles being in a parallel positional relationship; and searching for the position of the charging base along the determined path. By relying on the distances between obstacles in the electronic map, the invention allows the charging base to be found quickly and efficiently even when its position has changed.

Description

Method for searching position of charging base by robot and robot
Technical Field
The present invention relates to the field of robot technology, and in particular to a method for a robot to search for the position of a charging base, a robot, and a computer-readable storage medium.
Background
Generally, a robot returns to its charging base to recharge after completing the current cleaning task, so that sufficient battery power is available for the next cleaning run.
In practice, however, the position of the charging base may change from time to time. Searching for the base purely according to the label information recorded in the electronic map then makes it difficult to find it quickly and efficiently.
A new technical solution is therefore needed to solve this problem.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method for searching a position of a charging base by a robot and a robot, which can search the charging base quickly and efficiently when the position of the charging base is changed.
A first aspect of an embodiment of the present invention provides a method for a robot to search for the position of a charging base, including:
acquiring an electronic map of an area to be searched, wherein the electronic map contains information of obstacles;
determining a path for searching for the position of a charging base according to the distance between any two obstacles in the electronic map, wherein the two obstacles are in a parallel positional relationship;
and searching the position of the charging base according to the determined searching path.
The second aspect of the embodiments of the present invention provides a method for searching a position of a charging base by a robot, including:
acquiring an electronic map of an area to be searched, wherein the electronic map contains information of obstacles;
scanning the boundary of an obstacle in the electronic map, obtaining an intersection point of a scanning line and the boundary of the obstacle, and recording the intersection point as a first intersection point;
controlling the first intersection point to shift a fourth preset distance along the scanning direction or the direction opposite to the scanning direction to obtain a first shift image;
carrying out gray filling on the first offset image to obtain a first filled image;
scanning an unfilled region in the first filling image to obtain an intersection point of a scanning line and the boundary of the unfilled region in the first filling image, and recording the intersection point as a second intersection point;
controlling the second intersection point to shift a fifth preset distance along the scanning direction or the direction opposite to the scanning direction to obtain a second shift image, wherein the scanning direction is the direction when scanning the unfilled region in the first filled image;
carrying out gray level filling on the second offset image to obtain a second filled image;
extracting a contour of an unfilled region in the second fill image and determining the extracted contour as a path for searching for a charging base position;
and searching the position of the charging base according to the determined searching path.
A third aspect of embodiments of the present invention provides a robot, including a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of the first aspect or the second aspect when executing the computer program.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect and/or the second aspect.
Compared with the prior art, the embodiments of the present invention can search for the charging base quickly and efficiently when its position has changed, and therefore have strong usability and practicability.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The following drawings show only some embodiments of the invention; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart illustrating a method for searching a position of a charging base by a robot according to an embodiment of the present invention;
fig. 2-a is a schematic flowchart of a method for searching a position of a charging base by a robot according to an embodiment of the present invention;
fig. 2-b is a schematic diagram of an electronic map constructed in the method for searching the position of the charging base by the robot according to the second embodiment of the present invention;
fig. 2-c is a schematic diagram of a first offset image in the method for searching the position of the charging base by the robot according to the second embodiment of the invention;
fig. 2-d is a schematic diagram of a first fill-in image in the method for searching the position of the charging base by the robot according to the second embodiment of the present invention;
fig. 2-e is a schematic diagram of a second offset image in the method for searching the position of the charging base by the robot according to the second embodiment of the present invention;
fig. 2-f is a schematic diagram of a second fill-in image in the method for searching the position of the charging base by the robot according to the second embodiment of the present invention;
fig. 2-g is a schematic outline of the unfilled area extracted from the second fill image in the method for searching the position of a charging base by a robot according to the second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a robot according to a third embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination", or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
It should be understood that, the sequence numbers of the steps in this embodiment do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation on the implementation process of the embodiment of the present invention.
It should be noted that, the descriptions of "first" and "second" in this embodiment are used to distinguish different regions, modules, and the like, and do not represent a sequential order, and the descriptions of "first" and "second" are not limited to be of different types.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Example one
Fig. 1 is a schematic flowchart of a method for searching a charging base position by a robot according to an embodiment of the present invention, where the method may include the following steps:
s101: and acquiring the electronic map of the area to be searched.
Wherein the robot includes, but is not limited to, an autonomous charging robot; the electronic map contains information of the obstacle, such as position information and/or attribute information of the obstacle.
In one embodiment, the types of obstacles include, but are not limited to, boundaries of an area to be searched (e.g., boundaries of a room) and obstacles within the area to be searched (e.g., a refrigerator).
In one embodiment, the electronic map may be constructed according to a cleaning instruction of a user.
In one embodiment, the user may issue the cleaning instruction through a remote controller, a mobile terminal, or a control panel on the robot body.
In an embodiment, the robot may also obtain an electronic map pre-stored locally or in another terminal device as the map required in step S101; for example, when the robot is a floor-sweeping robot, it may obtain the map by communicating with another floor-sweeping robot.
In one embodiment, the electronic map includes, but is not limited to, a grid map, a topological map, and a vector map.
In one embodiment, the electronic map may be constructed by a lidar sensor on the robot body.
In one embodiment, the pre-stored electronic map may also be retrieved by calling.
In one embodiment, if the electronic map is a grid map, the area covered by the obstacle is represented by a black grid, and the free area is represented by a white grid. It will be appreciated that the robot can only move within these free areas.
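The grid-map convention above can be sketched as follows. This is an illustrative model only, not code from the patent; the helper names (`make_grid_map`, `is_free`) are invented for the example.

```python
import numpy as np

def make_grid_map(height, width, obstacle_cells):
    """Binary occupancy grid: 1 = obstacle (black cell), 0 = free (white cell)."""
    grid = np.zeros((height, width), dtype=np.uint8)
    for r, c in obstacle_cells:
        grid[r, c] = 1
    return grid

def is_free(grid, r, c):
    """The robot may only move through free (white) cells."""
    return bool(grid[r, c] == 0)

# A 5x5 map with a single obstacle cell at row 2, column 2
grid = make_grid_map(5, 5, [(2, 2)])
```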
S102: and determining a path for searching the position of the charging base according to the distance between any two obstacles in the electronic map.
Wherein the arbitrary two obstacles are in a parallel positional relationship.
The distance here may be either the vertical (up-down) distance between two obstacles or the horizontal (left-right) distance between them; the parallel positional relationship therefore includes both vertically parallel and horizontally parallel arrangements. For example, the two obstacles may be two walls facing north and south, or the front and back of a refrigerator.
In one embodiment, determining a path for searching the position of the charging base according to the distance between any two obstacles in the electronic map comprises:
and calculating the distance between any two obstacles, judging whether the calculated distance is smaller than a first preset distance, if so, taking a set of points which are respectively away from the two obstacles by a second preset distance as a path for searching the position of the charging base, and if not, taking a set of points which are respectively away from the two obstacles by a third preset distance as a path for searching the position of the charging base.
In other words, let the calculated distance be d and let d_max be the farthest distance at which the robot can receive the infrared signal. The first preset distance is d1 = 2*d_max and the second preset distance is d2 = d/2. When d < 2*d_max, the set of points at distance d2 = d/2 from each of the two obstacles is taken as the path for searching the charging base; when d >= 2*d_max, the set of points at distance d3 = d_max from each obstacle is taken as the path instead.
The two obstacles may have the same or different properties; for example, they may be two walls in a parallel relationship, or a wall and a non-wall obstacle (e.g., a refrigerator) in a parallel relationship.
It should be further noted that the set of points may be a single point, a straight line, or a ring.
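The distance rule of step S102 might be sketched as below. This is a hedged reading of the text: the coordinate convention (offsets measured from the first obstacle) and the function name `search_offsets` are assumptions introduced for illustration.

```python
def search_offsets(d, d_max):
    """Offsets (measured from the first obstacle) at which the search path runs.

    Rule from step S102:
      d <  2*d_max -> a single midline at d/2 from both obstacles
      d >= 2*d_max -> one line at d_max from each obstacle
    """
    if d < 2 * d_max:
        return [d / 2]
    return [d_max, d - d_max]

print(search_offsets(3.0, 2.0))  # [1.5]  (midline: IR reachable from both sides)
print(search_offsets(6.0, 2.0))  # [2.0, 4.0]  (one line at d_max from each obstacle)
```

The first case keeps the robot within infrared range of both obstacles at once; the second falls back to hugging each obstacle at the maximum receivable range.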
S103: and searching the position of the charging base according to the determined searching path.
The charging base is an accessory of the robot that supplies it with electric energy; it has a built-in infrared transmitting module that continuously sends infrared signals toward the robot.
In one embodiment, the searching for the location of the charging base according to the determined search path includes:
judging the shape of the search path;
if the search path is annular, controlling the robot to search the position of the charging base along the clockwise direction or the anticlockwise direction of the ring;
if the shape of the search path is linear, controlling the robot to search the position of the charging base along the line in a reciprocating manner;
and if the shape of the search path is a point, controlling the robot to search the position of the charging base along the clockwise direction or the anticlockwise direction of the point.
In order to enable the built-in infrared receiving device of the robot to always face the boundary of the obstacle, in one embodiment, if the built-in infrared receiving device of the robot is located at the right side of the body, the robot is controlled to search for the position of the charging base along the counterclockwise direction of the ring or the point.
In order to enable the built-in infrared receiving device of the robot to always face the boundary of the obstacle, in one embodiment, if the built-in infrared receiving device of the robot is located at the left side of the body, the robot is controlled to search for the position of the charging base along the clockwise direction of the ring or the point.
In one embodiment, if the search path is in the shape of a point, the robot is controlled to rotate at least one turn clockwise or counterclockwise to search for the position of the charging dock.
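The shape-dependent traversal rules of step S103 could be dispatched as in this sketch; the string return values and the `ir_receiver_side` parameter are illustrative assumptions, not part of the patent.

```python
def search_strategy(path_shape, ir_receiver_side="right"):
    """Pick a traversal direction so the IR receiver keeps facing the obstacle boundary."""
    if path_shape == "line":
        return "reciprocate along the line"
    if path_shape in ("ring", "point"):
        # A right-mounted receiver keeps facing the boundary when moving
        # counterclockwise; a left-mounted one when moving clockwise.
        return "counterclockwise" if ir_receiver_side == "right" else "clockwise"
    raise ValueError("unknown path shape: " + path_shape)
```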
Thus, by using the distances between obstacles in the electronic map, the embodiment of the invention can search for the charging base quickly and efficiently even when its position has changed, and has strong usability and practicability.
Example two
Fig. 2-a is a schematic flowchart of a method for searching a charging base position by a robot according to a second embodiment of the present invention, where the method may include the following steps:
s201: and acquiring the electronic map of the area to be searched.
Step S201 is the same as step S101 in the first embodiment; for its specific implementation, refer to the description of step S101, which is not repeated here.
S202: and scanning the boundary of the obstacle in the electronic map, obtaining an intersection point of the scanning line and the boundary of the obstacle, and recording the intersection point as a first intersection point.
In one embodiment, the boundaries of the obstacles in the electronic map are longitudinally scanned sequentially from left to right.
In one embodiment, the distance between two adjacent longitudinal scanning lines is 5 cm.
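Step S202's longitudinal scan can be sketched as a column-by-column pass over the binary grid. The name `column_intersections` and the return format are invented for illustration; only the first and last obstacle cell in each column are recorded here as the upper and lower first intersection points.

```python
import numpy as np

def column_intersections(grid, col_step=1):
    """Scan a binary grid (1 = obstacle) column by column, left to right.

    For each scan line that hits an obstacle, record the first obstacle cell
    seen from the top and from the bottom as that column's first
    intersection points. Returns {column: (top_row, bottom_row)}.
    """
    hits = {}
    for c in range(0, grid.shape[1], col_step):
        occupied = np.flatnonzero(grid[:, c])
        if occupied.size:
            hits[c] = (int(occupied[0]), int(occupied[-1]))
    return hits

# A 6x4 map whose top and bottom rows are walls
g = np.zeros((6, 4), dtype=np.uint8)
g[0, :] = 1
g[5, :] = 1
print(column_intersections(g))  # {0: (0, 5), 1: (0, 5), 2: (0, 5), 3: (0, 5)}
```

With the 5 cm spacing mentioned above, `col_step` would be the number of grid cells corresponding to 5 cm at the map's resolution.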
S203: and controlling the first intersection point to shift a fourth preset distance along the scanning direction or the direction opposite to the scanning direction to obtain a first shift image.
And the fourth preset distance is less than or equal to the farthest distance at which the robot can receive the infrared signals.
In one embodiment, controlling the first intersection to shift by a fourth preset distance in the scanning direction or a direction opposite to the scanning direction, obtaining the first shift image includes:
judging the position of the first intersection point;
if the first intersection point is located at the boundary of a first obstacle, controlling the first intersection point to downwards shift by a fourth preset distance along the scanning direction according to the farthest distance that the robot can receive infrared signals, and obtaining a first shift image, wherein the first obstacle is an obstacle in the electronic map;
and if the first intersection point is located at the boundary of a second obstacle, controlling the first intersection point to upwards shift by a fourth preset distance along the direction opposite to the scanning direction according to the farthest distance that the robot can receive the infrared signal, and obtaining a first shift image, wherein the second obstacle is an obstacle in the electronic map, and is located below the first obstacle.
It should be noted that the position of the first intersection point may be determined based on the longitudinal scanning direction, for example, when the longitudinal scanning direction is from top to bottom, the first intersection point located in front of the scanning line belongs to an intersection point on the second obstacle boundary, and the first intersection point located behind the scanning line belongs to an intersection point on the first obstacle boundary.
S204: and carrying out gray filling on the first offset image to obtain a first filled image.
In an embodiment, the filling may be performed on a partial region in the first offset image, specifically:
and filling an area formed by all first intersection points which are away from the boundary of the first obstacle by a fourth preset distance and the boundary of the first obstacle, and an area formed by all first intersection points which are away from the boundary of the second obstacle by the fourth preset distance and the boundary of the second obstacle, so as to obtain a first filling image.
In one embodiment, the partial region in the first offset image may be laterally filled or longitudinally filled.
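Steps S203-S204 (offsetting the first intersection points and gray-filling the band between each boundary and its offset copy) might look like the following sketch for the vertical pass. The cell codes `FREE`/`OBSTACLE`/`GRAY` and the per-column implementation are assumptions made for this example.

```python
import numpy as np

FREE, OBSTACLE, GRAY = 0, 1, 2  # illustrative cell codes

def offset_and_fill(grid, k):
    """Vertical pass of steps S203-S204.

    Per column: shift the top boundary intersection down by k cells and the
    bottom one up by k cells, then gray-fill the band between each boundary
    and its shifted copy.
    """
    out = grid.copy()
    rows = grid.shape[0]
    for c in range(grid.shape[1]):
        occ = np.flatnonzero(grid[:, c] == OBSTACLE)
        if occ.size == 0:
            continue
        top, bot = int(occ[0]), int(occ[-1])
        out[top + 1 : min(top + 1 + k, rows), c] = GRAY  # band below top boundary
        out[max(bot - k, 0) : bot, c] = GRAY             # band above bottom boundary
    return out

g = np.zeros((8, 3), dtype=np.uint8)
g[0, :] = OBSTACLE
g[7, :] = OBSTACLE
filled = offset_and_fill(g, 2)  # rows 1-2 and 5-6 become gray; rows 3-4 stay free
```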
S205: and scanning the unfilled region in the first filling image, obtaining an intersection point of the scanning line and the unfilled region boundary in the first filling image, and recording the intersection point as a second intersection point.
In one embodiment, the unfilled regions in the first fill image are scanned laterally in a top-down sequence.
S206: and controlling the second intersection point to shift a fifth preset distance along the scanning direction or the direction opposite to the scanning direction to obtain a second shift image.
Wherein the scanning direction is a direction when scanning an unfilled region in the first fill image.
In one embodiment, the fifth predetermined distance is equal in value to the fourth predetermined distance in step S203.
In one embodiment, controlling the second intersection to shift by a fifth preset distance in the scanning direction or a direction opposite to the scanning direction, and obtaining the second shift image includes:
judging the position of the second intersection point;
if the second intersection point is located at the boundary of a third obstacle, controlling the second intersection point to shift to the right by a fifth preset distance along the scanning direction according to the farthest distance that the robot can receive the infrared signals, and obtaining a second shift image, wherein the third obstacle is an obstacle in the electronic map;
and if the second intersection point is located at the boundary of a fourth obstacle in the obstacles, controlling the second intersection point to shift a fifth preset distance to the left in a direction opposite to the scanning direction according to the farthest distance that the robot can receive the infrared signal, so as to obtain a second shift image, wherein the scanning direction is the direction when an unfilled area in the first filled image is scanned, the fourth obstacle is an obstacle in the electronic map, and the fourth obstacle is located on the right side of the third obstacle.
The position of the second intersection point may be determined based on the direction of the horizontal scanning; for example, when the horizontal scanning direction is from left to right, a second intersection point located in front of the scanning line belongs to the boundary of a fourth obstacle, and one located behind the scanning line belongs to the boundary of a third obstacle.
The first, second, third, and fourth obstacles may all be obstacles in the same room; for example, the first obstacle may be the north wall of the master bedroom, the second its south wall, the third its east wall, and the fourth its west wall.
S207: and carrying out gray level filling on the second offset image to obtain a second filled image.
In an embodiment, the gray scale filling may be performed on the second offset image with reference to a method for filling the first offset image, specifically: and filling an area formed by all second intersection points which are away from the boundary of the third obstacle by a fifth preset distance and the boundary of the third obstacle, and an area formed by all second intersection points which are away from the boundary of the fourth obstacle by the fifth preset distance and the boundary of the fourth obstacle, so as to obtain a second filling image.
S208: extracting an outline of an unfilled region in the second fill image, and determining the extracted outline as a path for searching a charging base.
In one embodiment, existing or future available methods may be employed to extract the outline of the unfilled region in the second fill image.
It should be understood that, after the outline of the unfilled area in the second filled image is obtained, the course of that outline is the motion track along which the robot searches for the charging base.
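One simple way to extract the outline of the unfilled region in step S208 (the patent does not prescribe any particular method) is to mark every free cell that lies on the image border or touches a non-free cell:

```python
import numpy as np

def unfilled_contour(image, free=0):
    """Mark free cells on the image border or with a non-free 4-neighbour.

    The True cells of the returned mask trace the search track for the robot.
    """
    h, w = image.shape
    contour = np.zeros((h, w), dtype=bool)
    for r in range(h):
        for c in range(w):
            if image[r, c] != free:
                continue
            if r in (0, h - 1) or c in (0, w - 1):
                contour[r, c] = True
            else:
                nbrs = (image[r - 1, c], image[r + 1, c],
                        image[r, c - 1], image[r, c + 1])
                contour[r, c] = any(v != free for v in nbrs)
    return contour
```

On real maps a library routine such as OpenCV's contour extraction would serve the same purpose more efficiently.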
S209: and searching the position of the charging base according to the determined searching path.
Step S209 is the same as step S103 in the first embodiment; for its specific implementation, refer to the description of step S103, which is not repeated here.
It should be noted that the fourth and fifth preset distances in this embodiment are named relative to the first, second, and third preset distances of the first embodiment; in value, the fourth and fifth preset distances may each be equal to the second preset distance and/or the third preset distance.
The above steps are further explained below using a specific application scenario. Assume the area to be cleaned is the four-room, one-hall apartment shown in fig. 2-b (where the numerals 1, 2, 3, and 4 denote room 1, room 2, room 3, and room 4, and the boundaries of each room are drawn as black grid cells). First, the boundaries of rooms 1 to 4 in the electronic map are longitudinally scanned from left to right, yielding the scan map (containing the first intersection points) shown in fig. 2-b. Second, the upper-boundary intersection points among the first intersection points are shifted downward along the scanning direction, and the lower-boundary intersection points upward against the scanning direction, by the preset offset distance, yielding the first offset image shown in fig. 2-c. The first offset image is then filled transversely or longitudinally to obtain the first filled image shown in fig. 2-d. Next, the unfilled blank grid regions in the first filled image are scanned transversely from top to bottom to obtain the intersection points of each horizontal scan line with the left and right boundaries of each room; the left intersection points are shifted rightward along the scanning direction and the right intersection points leftward against it by the preset offset distance, yielding the second offset image shown in fig. 2-e. Filling the second offset image transversely or longitudinally gives the second filled image shown in fig. 2-f. Finally, the outline of the unfilled blank grid region in the second filled image is extracted, giving the image shown in fig. 2-g, and the robot is controlled to search for the position of the charging base along this outline.
Thus, by contracting the boundaries of all areas to be searched inward, the embodiment of the invention not only obtains the motion track for searching for the charging base, but also avoids overlapping search areas during the search; the charging base can therefore be found quickly and efficiently, and the method has strong usability and practicability.
Example three
Fig. 3 is a schematic structural diagram of a robot according to a third embodiment of the present invention. As shown in fig. 3, the robot 3 of this embodiment includes: a processor 30, a memory 31, and a computer program 32 stored in the memory 31 and executable on the processor 30. When executing the computer program 32, the processor 30 implements the steps of the first method embodiment, such as steps S101 to S103 shown in fig. 1, or the steps of the second method embodiment, such as steps S201 to S209 shown in fig. 2-a.
The robot may include, but is not limited to, a processor 30, a memory 31. Those skilled in the art will appreciate that fig. 3 is merely an example of a robot 3 and does not constitute a limitation of the robot 3 and may include more or fewer components than shown, or some components in combination, or different components, e.g., the robot may also include input output devices, network access devices, buses, etc.
The Processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 31 may be an internal storage unit of the robot 3, such as a hard disk or a memory of the robot 3. The memory 31 may also be an external storage device of the robot 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the robot 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the robot 3. The memory 31 is used for storing the computer program and other programs and data required by the robot. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art may appreciate that the modules, units, and/or method steps of the various examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, etc. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. A method of a robot searching for a charging base location, comprising:
acquiring an electronic map of an area to be searched, wherein the electronic map contains information of obstacles;
scanning the boundary of an obstacle in the electronic map, obtaining an intersection point of a scanning line and the boundary of the obstacle, and recording the intersection point as a first intersection point;
controlling the first intersection point to shift by a fourth preset distance along the scanning direction or the direction opposite to the scanning direction to obtain a first offset image;
performing gray filling on the first offset image to obtain a first filled image;
scanning an unfilled region in the first filled image to obtain an intersection point of a scanning line and the boundary of the unfilled region in the first filled image, and recording the intersection point as a second intersection point;
controlling the second intersection point to shift by a fifth preset distance along the scanning direction or the direction opposite to the scanning direction to obtain a second offset image, wherein the scanning direction is the direction used when scanning the unfilled region in the first filled image;
performing gray filling on the second offset image to obtain a second filled image;
extracting a contour of an unfilled region in the second filled image and determining the extracted contour as a path for searching for a charging base position;
and searching the position of the charging base according to the determined searching path.
2. The method of claim 1, wherein controlling the first intersection point to shift by a fourth preset distance along the scanning direction or the direction opposite to the scanning direction to obtain a first offset image comprises:
judging the position of the first intersection point;
if the first intersection point is located at the boundary of a first obstacle, controlling the first intersection point to shift downward by the fourth preset distance along the scanning direction according to the farthest distance at which the robot can receive infrared signals, to obtain a first offset image, wherein the first obstacle is an obstacle in the electronic map;
and if the first intersection point is located at the boundary of a second obstacle, controlling the first intersection point to shift upward by the fourth preset distance along the direction opposite to the scanning direction according to the farthest distance at which the robot can receive infrared signals, to obtain a first offset image, wherein the second obstacle is an obstacle in the electronic map and is located below the first obstacle.
3. The method of claim 2, wherein performing gray filling on the first offset image to obtain a first filled image comprises:
filling the area formed between the boundary of the first obstacle and all first intersection points located the fourth preset distance away from that boundary, and the area formed between the boundary of the second obstacle and all first intersection points located the fourth preset distance away from that boundary, to obtain a first filled image.
4. The method according to any one of claims 1 to 3, wherein controlling the second intersection point to shift by a fifth preset distance along the scanning direction or the direction opposite to the scanning direction to obtain a second offset image comprises:
judging the position of the second intersection point;
if the second intersection point is located at the boundary of a third obstacle, controlling the second intersection point to shift rightward by the fifth preset distance along the scanning direction according to the farthest distance at which the robot can receive infrared signals, to obtain a second offset image, wherein the third obstacle is an obstacle in the electronic map;
and if the second intersection point is located at the boundary of a fourth obstacle, controlling the second intersection point to shift leftward by the fifth preset distance along the direction opposite to the scanning direction according to the farthest distance at which the robot can receive infrared signals, to obtain a second offset image, wherein the scanning direction is the direction used when scanning the unfilled region in the first filled image, the fourth obstacle is an obstacle in the electronic map, and the fourth obstacle is located on the right side of the third obstacle.
5. The method of claim 4, wherein performing gray filling on the second offset image to obtain a second filled image comprises:
filling the area formed between the boundary of the third obstacle and all second intersection points located the fifth preset distance away from that boundary, and the area formed between the boundary of the fourth obstacle and all second intersection points located the fifth preset distance away from that boundary, to obtain a second filled image.
6. The method of claim 1, wherein searching for the position of the charging base according to the determined search path comprises:
judging the shape of the search path;
if the shape of the search path is a ring, controlling the robot to search for the position of the charging base along the ring in the clockwise or counterclockwise direction;
if the shape of the search path is a line, controlling the robot to search for the position of the charging base by reciprocating along the line;
and if the shape of the search path is a point, controlling the robot to search for the position of the charging base around the point in the clockwise or counterclockwise direction.
7. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor realizes the method steps of any of claims 1 to 6 when executing the computer program.
8. A computer-readable storage medium, in which a computer program is stored which, when being executed by one or more processors, carries out the steps of the method according to any one of claims 1 to 6.
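The two offset-and-fill passes of claims 1 to 5 can be illustrated on a grid map. The sketch below is not the patented implementation: it is a minimal NumPy rendering under assumed conventions (a boolean occupancy grid with True = obstacle, the "fourth" and "fifth" preset distances taken as cell counts `d_v` and `d_h`, and function names `offset_and_fill` / `contour_of_unfilled` invented for illustration).

```python
import numpy as np

def offset_and_fill(occ, d_v, d_h):
    """Two-pass shrink of the free region, mimicking claims 1-5.

    occ : 2D bool array, True = obstacle cell.
    d_v : vertical offset in cells (the "fourth preset distance").
    d_h : horizontal offset in cells (the "fifth preset distance").
    Returns a bool image where True = obstacle or gray-filled cell.
    """
    filled = occ.copy()
    h, w = occ.shape
    # Pass 1: vertical scan lines.  The intersection on the upper ("first")
    # obstacle boundary is shifted down by d_v, the intersection on the
    # lower ("second") obstacle boundary is shifted up, and the swept
    # bands between boundary and shifted point are filled.
    for x in range(w):
        ys = np.flatnonzero(occ[:, x])
        if ys.size == 0:
            continue
        filled[ys[0]:min(ys[0] + d_v + 1, h), x] = True
        filled[max(ys[-1] - d_v, 0):ys[-1] + 1, x] = True
    # Pass 2: horizontal scan lines over the still-unfilled region; the
    # left ("third") boundary shifts right, the right ("fourth") shifts left.
    for y in range(h):
        xs = np.flatnonzero(~filled[y, :])
        if xs.size == 0:
            continue
        filled[y, xs[0]:min(xs[0] + d_h, xs[-1] + 1)] = True
        filled[y, max(xs[-1] - d_h + 1, xs[0]):xs[-1] + 1] = True
    return filled

def contour_of_unfilled(filled):
    """Unfilled cells that touch a filled cell (4-neighbourhood): the
    extracted contour used as the charging-base search path."""
    padded = np.pad(filled, 1, constant_values=True)
    near_filled = (padded[:-2, 1:-1] | padded[2:, 1:-1] |
                   padded[1:-1, :-2] | padded[1:-1, 2:])
    return ~filled & near_filled
```

For a 12x12 room whose border cells are obstacles and with d_v = d_h = 2, the unfilled region shrinks to a 6x6 block and the extracted path is its 20-cell border ring.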
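Claim 6 dispatches on the shape of the extracted search path (ring, line, or point). The patent does not specify how the shape is judged; one simple illustrative heuristic, assuming the path is given as the cell coordinates of its contour, is to examine the rank of the centered coordinate matrix:

```python
import numpy as np

def classify_path(ys, xs):
    """Classify a search path from the row/column coordinates of its cells:
    a single (or repeated) cell is a point, collinear cells form a line,
    and anything else is treated as a ring (illustrative heuristic only)."""
    pts = np.stack([np.asarray(ys, float), np.asarray(xs, float)], axis=1)
    if len(pts) == 0:
        return "empty"
    if np.allclose(pts, pts[0]):
        return "point"
    # Collinear points yield a rank-1 matrix after subtracting the centroid.
    centered = pts - pts.mean(axis=0)
    return "line" if np.linalg.matrix_rank(centered) <= 1 else "ring"
```

Per claim 6, the robot would then patrol clockwise or counterclockwise around a ring or point, or reciprocate along a line.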
CN201910031044.3A 2019-01-14 2019-01-14 Method for searching position of charging base by robot and robot Active CN109612469B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910031044.3A CN109612469B (en) 2019-01-14 2019-01-14 Method for searching position of charging base by robot and robot


Publications (2)

Publication Number Publication Date
CN109612469A CN109612469A (en) 2019-04-12
CN109612469B true CN109612469B (en) 2020-05-22

Family

ID=66018629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910031044.3A Active CN109612469B (en) 2019-01-14 2019-01-14 Method for searching position of charging base by robot and robot

Country Status (1)

Country Link
CN (1) CN109612469B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111708357B (en) * 2019-09-17 2023-07-18 深圳银星智能集团股份有限公司 Cleaning end condition identification method and device and sweeping robot
CN113720344A (en) * 2021-08-30 2021-11-30 深圳市银星智能科技股份有限公司 Path searching method and device, intelligent device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160048347A * 2014-10-24 2016-05-04 Nautilus Hyosung Co., Ltd. An automatic docking system of mobile robot charging station and the method thereof
CN107703933A (en) * 2016-08-09 2018-02-16 深圳光启合众科技有限公司 Charging method, device and the equipment of robot
CN108627171A (en) * 2018-04-20 2018-10-09 杭州晶智能科技有限公司 The intelligent method of localization for Mobile Robot recharging base based on wireless signal strength gradient

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100523367B1 (en) * 2000-10-26 2005-10-20 마츠시다 덴코 가부시키가이샤 Self-controlling movement device having obstacle avoidance function
CN1287722C (en) * 2002-06-21 2006-12-06 泰怡凯电器(苏州)有限公司 Method for identifying automatic dust collector cleanable area and obstacle area
CN103645733B (en) * 2013-12-02 2014-08-13 江苏建威电子科技有限公司 A robot automatically finding a charging station and a system and method for automatically finding a charging station thereof
KR20180021595A (en) * 2016-08-22 2018-03-05 엘지전자 주식회사 Moving Robot and controlling method
CN106647747B (en) * 2016-11-30 2019-08-23 北京儒博科技有限公司 A kind of robot charging method and device
CN109388093B (en) * 2017-08-02 2020-09-15 苏州珊口智能科技有限公司 Robot attitude control method and system based on line feature recognition and robot
CN109062207B (en) * 2018-08-01 2021-09-24 深圳乐动机器人有限公司 Charging seat positioning method and device, robot and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"An application of charging management for mobile robot transportation in laboratory environments";Hui Liu 等;《2013 IEEE International Instrumentation and Measurement Technology Conference (I2MTC)》;20130715;全文 *
"一种室内清洁机器人返回路径规划算法";林丹;《重庆科技学院学报》;20100228(第1期);全文 *

Also Published As

Publication number Publication date
CN109612469A (en) 2019-04-12

Similar Documents

Publication Publication Date Title
CN110307838B (en) Robot repositioning method and device, computer-readable storage medium and robot
CN109737974B (en) 3D navigation semantic map updating method, device and equipment
EP2713309B1 (en) Method and device for detecting drivable region of road
CN109887033A (en) Localization method and device
CN111486855A (en) Indoor two-dimensional semantic grid map construction method with object navigation points
CN111198378B (en) Boundary-based autonomous exploration method and device
CN109612469B (en) Method for searching position of charging base by robot and robot
WO2007052191A2 (en) Filling in depth results
CN112700552A (en) Three-dimensional object detection method, three-dimensional object detection device, electronic apparatus, and medium
CN113112491B (en) Cliff detection method, cliff detection device, robot and storage medium
CN112513563A (en) Work machine transported object specifying device, work machine transported object specifying method, completion model production method, and learning dataset
CN112509027A (en) Repositioning method, robot, and computer-readable storage medium
CN115014328A (en) Dynamic loading method, device, equipment and medium for grid map
CN108550134B (en) Method and device for determining map creation effect index
CN111609854A (en) Three-dimensional map construction method based on multiple depth cameras and sweeping robot
CN113761255A (en) Robot indoor positioning method, device, equipment and storage medium
CN111179413B (en) Three-dimensional reconstruction method, device, terminal equipment and readable storage medium
CN113440054B (en) Method and device for determining range of charging base of sweeping robot
CN114744721A (en) Charging control method of robot, terminal device and storage medium
CN112747757A (en) Method and device for providing radar data, computer program and computer-readable storage medium
CN113932793A (en) Three-dimensional coordinate positioning method and device, electronic equipment and storage medium
US11227166B2 (en) Method and device for evaluating images, operating assistance method, and operating device
CN112184901A (en) Depth map determination method and device
Kim et al. Mobile robot localization by matching 2d image features to 3d point cloud
CN112614181B (en) Robot positioning method and device based on highlight target

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 room 1601, building 2, Vanke Cloud City phase 6, Tongfa South Road, Xili community, Xili street, Nanshan District, Shenzhen City, Guangdong Province (16th floor, block a, building 6, Shenzhen International Innovation Valley)

Patentee after: Shenzhen Ledong robot Co.,Ltd.

Address before: 518000 16th floor, building B1, Nanshan wisdom garden, 1001 Xueyuan Avenue, Taoyuan Street, Nanshan District, Shenzhen City, Guangdong Province

Patentee before: SHENZHEN LD ROBOT Co.,Ltd.
