CN112578787A - Object searching method, device and storage medium


Info

Publication number
CN112578787A
Authority
CN
China
Prior art keywords
self
sensor
moving robot
area
target object
Prior art date
Legal status
Granted
Application number
CN201910945132.4A
Other languages
Chinese (zh)
Other versions
CN112578787B (en)
Inventor
尹慧慧
单俊杰
Current Assignee
Ecovacs Robotics Suzhou Co Ltd
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Priority to CN201910945132.4A
Publication of CN112578787A
Application granted
Publication of CN112578787B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231: ... using optical position detecting means
    • G05D 1/0234: ... using optical markers or beacons
    • G05D 1/0236: ... using optical markers or beacons in combination with a laser
    • G05D 1/0212: ... with means for defining a desired trajectory
    • G05D 1/0223: ... involving speed control of the vehicle
    • G05D 1/0225: ... involving docking at a fixed facility, e.g. base station or loading bay
    • G05D 1/0238: ... using obstacle or wall sensors
    • G05D 1/024: ... using obstacle or wall sensors in combination with a laser
    • G05D 1/0242: ... using non-visible light signals, e.g. IR or UV signals
    • G05D 1/0246: ... using a video camera in combination with image processing means
    • G05D 1/0253: ... extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D 1/0255: ... using acoustic signals, e.g. ultrasonic signals
    • G05D 1/0259: ... using magnetic or electromagnetic means
    • G05D 1/0261: ... using magnetic plots
    • G05D 1/0263: ... using magnetic strips
    • G05D 1/0276: ... using signals provided by a source external to the vehicle
    • G05D 1/028: ... using a RF signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Embodiments of the present application provide an object searching method, an object searching device, and a storage medium. In some embodiments, a first exploration area is determined on a map corresponding to the environment in which a self-moving robot is located, the first exploration area lying within the signal sensing range of a first sensor; the first sensor is used to sense a preset signal emitted by a target object; and if the first sensor does not receive the preset signal, the first exploration area is marked as not containing the target object. By using the first sensor carried by the self-moving robot to search for the preset signal within a designated area, the area in which the target object is located can be determined quickly and accurately, which effectively improves the efficiency and accuracy with which the self-moving robot explores for a target object in an unfamiliar area.

Description

Object searching method, device and storage medium
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to an object search method, device, and storage medium.
Background
With the continuous development of artificial intelligence technology, various intelligent robots, such as logistics robots, floor-sweeping robots, and greeting robots, are increasingly entering people's lives.
Taking a sweeping robot as an example, the robot needs to return to its charging dock for charging when its battery is low. When recharging is needed, however, the position of the charging dock may not be known, either because the robot has entered an unfamiliar environment or because the position of the dock changes dynamically.
Disclosure of Invention
Aspects of the present disclosure provide an object search method, apparatus, and storage medium to enable a search for a location of a target object such as a charging stand.
The embodiment of the application provides an object searching method, which comprises the following steps:
determining a first exploration area on a map corresponding to the environment where the self-moving robot is located, wherein the first exploration area is within a signal sensing range of a first sensor;
the first sensor is used for sensing a preset signal emitted by a target object;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the first exploration area.
Embodiments of the present application provide a computer-readable storage medium storing a computer program that, when executed by one or more processors, causes the one or more processors to perform actions comprising:
determining a first exploration area on a map corresponding to the environment where the self-moving robot is located, wherein the first exploration area is within a signal sensing range of a first sensor;
the first sensor is used for sensing a preset signal emitted by a target object;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the first exploration area.
An embodiment of the present application provides a self-moving robot, comprising: a machine body on which one or more processors, one or more memories storing a computer program, and a first sensor are provided;
the one or more processors to execute the computer program to:
determining a first exploration area on a map corresponding to the environment where the self-moving robot is located, wherein the first exploration area is within a signal sensing range of a first sensor;
the first sensor is used for sensing a preset signal emitted by a target object;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the first exploration area.
In some embodiments of the present application, the self-moving robot turns on the first sensor, which searches for the preset signal of the target object in a first exploration area corresponding to the first position where the self-moving robot is currently located, the area lying within the signal sensing range of the first sensor. For example, an infrared emitter is installed on the target object and emits an infrared signal, and an infrared receiver for receiving that signal is installed on the self-moving robot. If the self-moving robot receives the infrared signal at the first position, the target object is located in the first exploration area, and the found target object is marked on the map. If the self-moving robot does not receive the infrared signal at the first position, the first exploration area is marked as not containing the target object, and the self-moving robot continues to search for the target object in other unexplored areas. By using the first sensor carried by the self-moving robot to search for the preset signal within a designated area, the area in which the target object is located can be determined quickly and accurately, which effectively improves the efficiency and accuracy with which the self-moving robot explores for a target object in an unfamiliar area.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of an object search method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a self-moving robot exploring a target object according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a target object marking method according to an embodiment of the present application;
fig. 4 is a flowchart illustrating a method for area exploration according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram illustrating a method for determining a position of a first target point according to an embodiment of the present disclosure;
fig. 6 is a schematic flowchart of a method for traveling from a mobile robot to a first target point according to an embodiment of the present disclosure;
fig. 7a is a flowchart illustrating a method for searching an area after a mobile robot moves to a first target point according to an embodiment of the present disclosure;
FIG. 7b is a schematic diagram illustrating a method for determining a position of a second target point according to an embodiment of the present application;
fig. 8 is a schematic flowchart of a method for a cleaning robot to search for a charging seat according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a self-moving robot according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the embodiments of the present invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise; "a plurality of" generally includes at least two, but does not exclude the case of at least one.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that an article or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such article or system. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other like elements in the article or system that comprises the element.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
In the present application, the self-moving robot can walk autonomously and perform corresponding service functions, and may also have capabilities such as computation, communication, and Internet access. The self-moving robot in the embodiments of the present application may be an unmanned aerial vehicle, an unmanned ground vehicle, or the like. The basic service function of the self-moving robot differs according to the application scenario; the self-moving robot may be a sweeping robot, a following robot, a greeting robot, and so on. For example, for a sweeping robot applied in homes, office buildings, shopping malls, and similar scenes, the basic service function is to sweep the floor in the scene; for a glass-cleaning robot applied in such scenes, the basic service function is to clean the glass; for a following robot, the basic service function is to follow a target object; and the basic service function of a greeting robot is to welcome a customer and guide the customer to a destination.
In order to improve the autonomous working capacity of the self-moving robot, the application provides an object searching method. Fig. 1 is a schematic flowchart of an object search method according to an embodiment of the present application. The method comprises the following steps:
101: determining a first exploration area on a map corresponding to the environment where the self-moving robot is located, wherein the first exploration area is within a signal perception range of the first sensor; the first sensor is used for sensing a preset signal emitted by a target object.
102: if the first sensor does not receive the preset signal, marking that the target object does not exist in the first exploration area.
When the self-moving robot searches for a target object, in some application scenarios the target object may actively transmit a preset signal, which may be, for example, an infrared signal or a Bluetooth signal. Correspondingly, a first sensor capable of recognizing the preset signal is installed on the self-moving robot. Whichever type of first sensor is used to receive the preset signal, the self-moving robot has a corresponding signal sensing range. For example, the target object may be a charging dock and the corresponding preset signal an infrared signal; at least one infrared receiver (i.e., at least one first sensor) is installed on the self-moving robot, and when the self-moving robot receives the infrared signal within the effective distance of the infrared signal, it has found the charging dock and marks its position.
In other application scenarios, the target object may be sensed by the self-moving robot; in other words, the self-moving robot actively acquires the preset signal, and a near-field communication chip, such as an RFID or NFC tag, may be mounted on the target object. In this scenario, a corresponding near-field communication chip needs to be provided on the target object in advance, so that the target object can be identified contactlessly as the self-moving robot approaches and its position in the exploration area can be determined.
In some application scenarios, the self-moving robot may actively acquire the preset signal in the form of image information captured by a camera on the self-moving robot, so that no additional electronic device for identification needs to be installed on the target object. For example, a camera arranged on the self-moving robot can clearly capture images of an object; the self-moving robot performs image recognition on the captured image information and compares it with an image of the target object to obtain a recognition result, thereby completing the search for the target object.
For ease of understanding, the following description takes as an example a target object that actively emits an infrared signal. Fig. 2 is a schematic diagram of a self-moving robot exploring a target object according to an embodiment of the present disclosure. An infrared transmitter is installed on the target object (e.g., a charging dock), and correspondingly an infrared receiver needs to be installed on the self-moving robot; for example, three first sensors may be installed so that the self-moving robot can more easily receive the preset signal transmitted by the target object. Of course, the number of first sensors arranged on the self-moving robot can be set according to user requirements. For example, if the effective detection angle of each first sensor is 90 degrees, three first sensors meet the requirement for finding a target object, giving a total detection range of 270 degrees; if the effective detection angle of each first sensor is 30 degrees, nine first sensors are needed.
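The sensor-count arithmetic above can be stated compactly. The following is a minimal sketch only; the function name and the assumption that receivers tile the required angle without overlap are illustrative, not part of the claimed embodiments:

```python
import math

def receivers_needed(required_coverage_deg: float, receiver_fov_deg: float) -> int:
    """Smallest number of receivers whose combined field of view covers the
    required angular range, assuming the fields of view do not overlap."""
    return math.ceil(required_coverage_deg / receiver_fov_deg)

# Values from the example: 270 degrees of coverage.
print(receivers_needed(270, 90))  # 3 receivers with a 90-degree detection angle
print(receivers_needed(270, 30))  # 9 receivers with a 30-degree detection angle
```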
It should be noted that the signal sensing range of the first sensor mentioned herein includes both an effective signal distance and a signal coverage angle. For example, the effective sensing distance of some first sensors is 3 meters and the signal coverage angle is 90°. If the self-moving robot is not within the signal sensing range, it cannot find the charging dock and continues searching.
The area covered by the signal sensing range can be understood as the explored area. In other words, when the self-moving robot moves to the first position, both the area covered by the first sensor during the movement and the area covered once the robot reaches the first position can be understood as the first exploration area corresponding to the first position; that is, the first position lies within the first exploration area.
When the self-moving robot is at the first position, each first sensor continuously detects whether the preset signal is present nearby. If none of the first sensors receives the preset signal at the first position, it may be concluded that the target object is not present in the first exploration area.
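As a rough illustration of steps 101 and 102, the following sketch assumes a grid map and a hypothetical sensor interface (cells_in_range and signal_received are invented names, not part of the embodiments):

```python
from enum import Enum

class Cell(Enum):
    UNEXPLORED = 0
    NO_TARGET = 1    # explored; marked as not containing the target object
    EXPLORED = 2     # explored; the preset signal was received from here

def check_exploration_area(grid, robot_pos, first_sensor):
    """Mark the first exploration area according to whether the preset signal
    emitted by the target object was received by the first sensor.

    grid         : dict mapping (x, y) map cells to Cell states
    robot_pos    : the first position of the self-moving robot
    first_sensor : exposes cells_in_range(pos) and signal_received()
    """
    area = first_sensor.cells_in_range(robot_pos)   # first exploration area
    received = first_sensor.signal_received()       # preset signal perceived?
    for cell in area:
        grid[cell] = Cell.EXPLORED if received else Cell.NO_TARGET
    return received   # True: the target lies somewhere in this area
```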
If a first sensor of the self-moving robot receives the preset signal at the first position, the position of the target object is marked within the first exploration area so that the self-moving robot can find the target object again later. For example, if the target object is a charging dock, after the self-moving robot marks its position on the map, it can later find the dock for charging according to the mark. Note that before the self-moving robot can mark the position of the target object on a map, a map must first be created. Specifically, the self-moving robot can use a second sensor to build a map of the environment it is in; the map contains basic information about the current environment and may not yet contain obstacle information. The self-moving robot therefore needs to actively perform object searching and obstacle recognition on the basis of the current map, marking the positions of the target object and of obstacles on the map. After obtaining the map, the self-moving robot can determine the first position and the corresponding first exploration area on it. The second sensor here may be a laser distance sensor (LDS).
It is easily understood that, when marking the position of the target object, its specific position may be determined from the signal strength of the preset signal and the angle of the preset signal relative to the first sensor. If high accuracy is required, then after the first sensor receives the preset signal, the self-moving robot continues to move toward the target object until it touches the target object or comes very close to it (close enough that the remaining distance can be neglected and does not affect positioning), so that the position of the target object is determined accurately. Marking the position of the target object in the first exploration area can be understood as marking the target object on the map corresponding to the current environment, so that the self-moving robot can later approach or avoid the marked target object.
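One possible way to turn such a reading into a rough target position is sketched below; the inverse-square strength-to-distance model and the helper names are assumptions made for illustration only:

```python
import math

def estimate_target_position(robot_pose, relative_angle_rad, signal_strength,
                             strength_at_1m=1.0):
    """Rough target position from one reading of the preset signal.

    robot_pose         : (x, y, heading_rad) of the self-moving robot
    relative_angle_rad : bearing of the signal relative to the robot heading,
                         as reported by the receiving first sensor
    signal_strength    : received strength, assumed to fall off as 1/d^2
    """
    x, y, heading = robot_pose
    distance = math.sqrt(strength_at_1m / max(signal_strength, 1e-9))
    bearing = heading + relative_angle_rad
    return (x + distance * math.cos(bearing),
            y + distance * math.sin(bearing))
```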
As an alternative embodiment, if the self-moving robot does not find the target object in the first exploration area, it continues to search other areas. The specific search method is shown in fig. 3, which is a flowchart of a target object marking method provided by an embodiment of the present application, comprising the following steps:
301: controlling the self-moving robot to move from a first location to a second location in the first exploration area.
302: and determining a second exploration area corresponding to the second position on the map according to the signal perception range of the first sensor.
303: and if the first sensor does not receive the preset signal, marking that the target object does not exist in the second exploration area.
304: and if the first sensor receives a preset signal transmitted by a target object, marking the position of the target object in the second exploration area.
The first position and the second position may be determined by the self-moving robot based on a predetermined travel path and travel mode. For example, they may be the positions at the start and end of one detection cycle of the self-moving robot: the first detection cycle corresponds to the first position and the second detection cycle corresponds to the second position. Alternatively, the position may be updated each time the distance moved by the self-moving robot reaches a certain threshold; the position before the update is called the first position, the position after the update is called the second position, and the search result for the first position already passed can be marked on the map.
The self-moving robot leaves the first position and continues to move to the second position. The first sensor detects whether the preset signal emitted by the target object is present both while the self-moving robot moves from the first position to the second position and after it reaches the second position. If the first sensor does not receive the preset signal at the second position, it may be concluded that the target object is not present in the second exploration area. If the first sensor receives the preset signal at the second position, the position of the target object is marked within the second exploration area. It is easily understood that, when determining the position of the target object, its specific position may be determined from the signal strength of the preset signal and the angle of the preset signal relative to the first sensor.
For example, the target object is a charging dock for charging the self-moving robot. Suppose the self-moving robot is placed in a new environment that it needs to explore and recognize. During exploration and recognition, the self-moving robot first addresses the charging problem and immediately begins searching for the charging dock, finding it autonomously without human help. The charging dock is connected through a plug to a power supply in the wall or floor, which continuously supplies it with electrical energy. An infrared transmitter that emits the preset signal is arranged on the charging dock, so that the self-moving robot can determine the position of the dock through the first sensor according to the received preset signal and then mark the dock at the corresponding location on the map. Note that in the new environment the self-moving robot can create a map using the second sensor (i.e., the LDS sensor).
When the self-moving robot is placed in a new environment, it may work first and begin searching for the charging dock automatically once it detects that its battery is low. After finding the charging dock, the self-moving robot can charge itself automatically.
The charging dock may charge the self-moving robot by contact charging: matching charging interfaces are provided on the charging dock and on the self-moving robot, and charging takes place when the two interfaces are in contact. To ensure that the charging interfaces make reliable contact, the two interfaces may be provided with opposite magnetic poles so that they attract each other as they approach and are held together for reliable charging.
The charging dock may also charge the self-moving robot wirelessly: an induction coil is provided on the self-moving robot, which only needs to move close to the charging dock to be charged by electromagnetic induction.
With this scheme, the self-moving robot can autonomously explore an unfamiliar environment and find the charging dock without human assistance, thereby realizing automatic charging, improving work efficiency, and reducing personnel workload.
During the search for the target object, surrounding obstacles can also be detected with the first sensor. The first sensor used to detect obstacles may be an infrared sensor, an ultrasonic sensor, or a separately provided camera, and is not specifically limited here. When the self-moving robot detects an obstacle in the first exploration area with the first sensor, the position of the detected obstacle is marked within the first exploration area; similarly, when an obstacle is detected in the second exploration area, its position is marked within the second exploration area.
In some application scenarios, it is also desirable to distinguish the type of obstacle, for example a living obstacle (such as a cat or dog), a fixed obstacle (such as a wall), or a movable obstacle (such as a trash can). Distinguishing obstacle types can be done automatically by the self-moving robot or with human assistance. Further, marking may be selective according to the obstacle type: fixed obstacles are clearly marked on the map, whereas movable obstacles, living creatures, and the like are not marked, because their positions are not fixed.
Marking the obstacles in areas where searching and recognition have been completed, such as the first exploration area and the second exploration area, makes it easier for the self-moving robot to plan its walking route. For example, after the self-moving robot has found the target object and later wants to move to it again, it can plan a route with the shortest distance or shortest time according to the obstacle marking results, which effectively increases the speed at which it reaches the target object (for example, a charging dock). Note that since movable obstacles, living obstacles, and the like were left unmarked during obstacle marking, their influence is not considered when planning the walkable route.
It is easy to understand that when the self-moving robot automatically searches for the target object in a new environment, it can accurately find and mark the position of the target object through a process of recognizing and exploring the unfamiliar area. Many area exploration methods exist, such as strategies that always turn left or always turn right first. However, the walking track guided by some existing area exploration methods may cover the same ground repeatedly, which makes area exploration inefficient.
Therefore, in an optional embodiment, a region exploration method is provided, so that the region exploration efficiency can be improved by the self-moving robot, and the target object can be found more quickly. Fig. 4 is a schematic flowchart of a region exploration method according to an embodiment of the present application, including the following steps:
401: at least one boundary point on the adjacent boundary line of the explored area and the unexplored area is obtained.
402: and determining one boundary point used for guiding the self-moving robot to conduct area exploration from the at least one boundary point as a first target point.
403: and marking an unexplored area corresponding to the signal sensing range of the first sensor in the process that the self-moving robot moves to the first target point as an explored area.
404: and determining a corresponding area exploration strategy according to the result of the self-moving robot moving to the first target point.
The result of the self-moving robot moving to the first target point as referred to herein includes: the first result is: the self-moving robot does not move to the first target point; the second result is: the self-moving robot successfully moves to the first target point.
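In the spirit of steps 401-404, the boundary points and the first target point might be obtained as sketched below; the grid representation and helper names are assumptions, and the nearest-point choice is just one of the strategies discussed later:

```python
def boundary_points(explored, unexplored):
    """Step 401: cells of the explored area adjacent to an unexplored cell."""
    frontier = []
    for (x, y) in explored:
        neighbours = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        if any(n in unexplored for n in neighbours):
            frontier.append((x, y))
    return frontier

def pick_first_target(frontier, robot_pos):
    """Step 402: choose one boundary point to guide area exploration,
    here simply the one nearest to the robot's current position."""
    if not frontier:
        return None
    return min(frontier,
               key=lambda c: (c[0] - robot_pos[0]) ** 2 + (c[1] - robot_pos[1]) ** 2)
```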
To facilitate understanding of the area exploration method, the following embodiments take a sweeping self-moving robot as an example.
The user places the sweeping self-moving robot in a completely unexplored, unfamiliar environment; no part of the current environment has yet been explored, so there is as yet no boundary line between explored and unexplored areas. In this case the self-moving robot needs to define the first target point itself within the unexplored area. Specifically, fig. 5 is a schematic diagram of determining the position of a first target point according to an embodiment of the present application. The sweeping self-moving robot is provided with at least one infrared first sensor with an effective detection angle of 90 degrees and an effective detection distance of 3 meters. At its current initial position, the sweeping self-moving robot turns on the infrared first sensor to detect the surrounding environment and identify whether obstacles exist within the detection range. If no obstacle is found, one point is selected from the detected obstacle-free area as the first target point, so that this point guides the sweeping self-moving robot in exploring the area. When selecting the first target point, the self-moving robot may pick a point at random from the obstacle-free area, or a point located at an intermediate position may be chosen.
While the sweeping self-moving robot moves toward the first target point, the infrared first sensor is used to recognize the surroundings, so an initial explored area is obtained; from this, the explored area, the unexplored area, and the boundary line between them can be determined. It should be noted that, when the first target point is determined, its distance from the current initial position of the sweeping self-moving robot is smaller than the effective detection distance of the infrared first sensor and larger than a precision threshold (for example, with a precision threshold of 1 meter, the distance between the initial position and the first target point may be 2 meters).
It should be noted that the sweeping self-moving robot moves within a certain precision range and may have errors in self-positioning and/or target positioning: for example, when the robot intends to reach the second position, it may consider itself to have arrived once it is within twenty centimeters of that position. The precision threshold is therefore chosen to be significantly larger than the precision range of the robot's movement, for example 1 meter.
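For the case where nothing has been explored yet, the very first target point might be chosen as sketched below, using the 3-meter detection distance and 1-meter precision threshold from the example; the sampling strategy itself is an assumption:

```python
import math
import random

def pick_initial_target(robot_pos, heading_rad, obstacle_free_bearings,
                        effective_range_m=3.0, precision_threshold_m=1.0):
    """Pick a point in a detected obstacle-free direction, farther from the
    robot than the precision threshold but within the sensor's range."""
    if not obstacle_free_bearings:
        return None
    bearing = heading_rad + random.choice(obstacle_free_bearings)
    distance = random.uniform(precision_threshold_m, effective_range_m)  # e.g. ~2 m
    return (robot_pos[0] + distance * math.cos(bearing),
            robot_pos[1] + distance * math.sin(bearing))
```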
If the result of the self-moving robot's movement toward the first target point is the first result, that is, the sweeping self-moving robot fails to reach the first target point, the reasons may include the following:
for example, if a sweeping self-moving robot is provided with an infrared first sensor for area exploration, the infrared first sensor cannot identify the existence of an obstacle due to good light transmittance of a glass door or a glass wall.
For example, in the current environment, there are many light sources and the light is strong, which causes significant interference to the infrared first sensor, and the infrared first sensor cannot accurately identify the position of the obstacle.
For example, a low step exists on the floor; the wheels of the sweeping self-moving robot cannot climb over the step, but the step is clearly lower than the infrared first sensor, so the sensor cannot recognize the presence of the obstacle.
In order to effectively solve the problem that the first target point cannot be reached, a method as shown in fig. 6 may be adopted. Fig. 6 is a flowchart illustrating a method for traveling from a mobile robot to a first target point according to an embodiment of the present disclosure. The method specifically comprises the following steps: 601: and if the self-moving robot collides with an obstacle after moving to the second position, the self-moving robot is continuously controlled to move to the first target point by changing the walking direction of the self-moving robot. 602: and if the change times of the walking direction reach preset times, updating the first target point to the second position, and marking the position of the obstacle in the second exploration area. 603: and controlling the self-moving robot to determine a second target point and exploring the target object.
In the situations listed above, the sweeping self-moving robot cannot accurately recognize the existence of the obstacle, or cannot accurately recognize its size and position, and will keep moving toward the predetermined first target point. As a result, problems easily arise such as a crash caused by endlessly executing an unachievable operation, the robot becoming stuck on a step and unable to move, or the robot repeatedly colliding with an obstacle it cannot recognize and being damaged.
For example, while the sweeping self-moving robot moves toward the first target point, its travel route is blocked by an unrecognizable obstacle. The robot tries to get past it by changing its walking direction; if the number of attempted direction changes reaches 5 (assuming the preset number is 5) and the robot is still blocked by the unrecognizable obstacle, the self-moving robot gives up the first target point.
The sweeping self-moving robot then searches, according to its current second position, for the next boundary point on the boundary line to serve as a second target point. Specifically, the sweeping self-moving robot determines the coordinates of the second position, then determines whether an unexplored area exists and, if so, further determines the boundary line between the explored area and the unexplored area. Since there are many boundary points on the boundary line, the boundary point closest to the second position is chosen as the second target point.
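The retry-and-give-up behaviour of steps 601-603 can be summarised as follows; move_towards, bumped, and change_walking_direction are hypothetical stand-ins for the robot's motion interface, not names used by the embodiments:

```python
def go_to_target(robot, grid, target, max_direction_changes=5):
    """Try to reach `target`; after too many collisions with an obstacle the
    robot cannot recognise, give up, mark the obstacle, and report failure so
    that the caller can choose a second target point."""
    changes = 0
    while not robot.at(target):
        robot.move_towards(target)
        if robot.bumped():                          # collided with an obstacle
            changes += 1
            if changes >= max_direction_changes:    # preset number reached
                grid.mark_obstacle(robot.position())
                return False                        # caller picks the next boundary point
            robot.change_walking_direction()
    return True
```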
The searched area referred to herein is a first search area, a second search area, and the like in which the self-moving robot has completed searching while moving to the first target point.
If the result of the self-moving robot's movement toward the first target point is the second result, that is, the sweeping self-moving robot reaches the first target point successfully, the method shown in fig. 7a is then adopted; fig. 7b is a schematic diagram of determining the position of a second target point according to an embodiment of the present application, and fig. 7a is a flowchart of a method for area exploration after the self-moving robot moves to the first target point. As shown in the figure: 701: after the sweeping self-moving robot reaches the first target point, it determines the boundary point with the smallest distance to the first target point, among at least one boundary point, as the second target point, and then explores the area according to the second target point. 702: if no second target point is found after the sweeping self-moving robot has reached the first target point, the area exploration ends.
In practical applications, "moving to the first target point" may mean that the sweeping robot coincides exactly with the first target point, or that it does not coincide exactly and some distance remains; for example, if the movement precision of the sweeping robot is 0.1 meter, the robot can be considered to have reached the first target point once it is within 0.1 meter of it. Therefore, when determining the second target point, this embodiment selects the boundary point on the boundary line that is closest to the first target point. In practice, the boundary point closest to the current position of the sweeping self-moving robot may be selected instead.
As shown in fig. 7b, after the sweeping self-moving robot has successfully moved to the first target point, it further checks whether an unexplored area exists around it; if so, a second target point is determined on the boundary line between the explored and unexplored areas. Once the second target point is determined, the sweeping self-moving robot continues from the first target point toward the second target point. Note that the travel trajectory of the sweeping self-moving robot from its current position to the first target point, or from the first target point to the second target point, is not necessarily a straight line, and the arrows in fig. 7b do not indicate the travel trajectory. The specific movement process is the same as or similar to the movement toward the first target point in the foregoing embodiment and is not repeated here; reference may be made to that embodiment.
As can be seen from the foregoing, the target point is determined on the boundary line between the explored area and the unexplored area. As one option, since there may be many boundary points on a given boundary line, the boundary point closest to the current position of the sweeping self-moving robot can be selected as the first or second target point, which guides the robot to explore nearby areas first. Of course, if needed, the boundary point farthest from the current position may instead be selected as the first or second target point.
In some cases, however, several boundary points may be equally close to the current position of the self-moving robot. For example, the signal sensing area of the first sensor may be fan-shaped, with the self-moving robot at the center of the fan-shaped explored area; every boundary point on the arc where the explored area meets the unexplored area is then the same distance from the robot, so the closest boundary point is not unique. In this case the first or second target point can be determined in various ways: one is to pick a boundary point at random from the boundary line; another is to pick the boundary point at the middle of the arc of the fan-shaped explored area. These ways of choosing among several equally near boundary points are only examples, and in practice the user may choose a suitable method according to the actual situation.
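One way to break such a tie is sketched below, assuming the boundary points are listed in order along the arc; taking the middle point is just one of the options mentioned above:

```python
def pick_target_with_tie_break(frontier, robot_pos):
    """Nearest boundary point; if several are equally near (e.g. grid cells on
    a fan-shaped arc centred on the robot), take the middle one of that group,
    assuming `frontier` is ordered along the boundary line."""
    if not frontier:
        return None
    def dist2(c):
        return (c[0] - robot_pos[0]) ** 2 + (c[1] - robot_pos[1]) ** 2
    best = min(dist2(c) for c in frontier)
    nearest = [c for c in frontier if dist2(c) == best]
    return nearest[len(nearest) // 2]
```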
To aid understanding of the whole process by which the self-moving robot searches for the target object, the following takes a sweeping self-moving robot searching for a charging dock as an example. Fig. 8 is a schematic flowchart of a method for a sweeping self-moving robot to search for a charging dock according to an embodiment of the present disclosure. The method is as follows:
801: first, a current target point is selected from an area searched by the self-sweeping mobile robot without obstacles, and a specific method for selecting the current target point is not described herein, and the embodiment corresponding to fig. 4 may be specifically referred to.
802: in the exploration process of the self-moving robot for sweeping, the exploration information can be updated in time according to a certain period. The search information includes the position of the marking target object, the position of the marking obstacle, and the like described above.
803: and judging whether the sweeping self-moving robot searches the preset signal of the charging seat.
The self-moving robot can keep searching for the preset signal of the charging dock while moving to the current target point. For example, while the robot is working and sweeping the floor, it turns on its infrared receiver and continuously checks whether an infrared signal sent by the charging dock is present.
804: if the floor sweeping self-moving robot does not search the preset signal of the charging seat, further judging whether the floor sweeping self-moving robot moves to the current target point.
If the sweeping self-moving robot has not obtained a marking result for the charging dock by the time it reaches the current target point, a next target point is determined on the boundary between the explored area and the unexplored area. The explored area consists of the exploration areas corresponding to each of the positions passed while moving from the first position to the current target point;
controlling the sweeping self-moving robot to move from the current target point to the next target point;
determining a third exploration area corresponding to a third position on the map according to the signal perception range of the first sensor, wherein the third position is any position in the process of moving from the current target point to the next target point;
if the preset signal is not received through the first sensor, mark that the charging dock is not present in the third exploration area, and proceed to step 802 to update the exploration information.
805: if the floor sweeping self-mobile robot receives the preset signal of the charging seat, the position of the charging seat in the exploration area is marked, and then the exploration of the charging seat is finished.
If the infrared receiver of the sweeping self-moving robot receives the infrared signal transmitted by the charging dock, the position of the charging dock is marked within the third exploration area and the search for the charging dock ends. Once the charging dock has been marked on the map, whenever the sweeping self-moving robot needs to recharge it can quickly and automatically find the dock according to the marking result, without repeating the search process shown in fig. 1.
Of course, several charging docks may be installed in the search area at the same time, in which case the sweeping robot can continue searching and mark the position of each charging dock. The self-moving robot can then choose the nearest dock to charge at, which saves exploration time; at the same time, the charging needs of several self-moving robots can be met.
806: and if the sweeping robot does not move to the current target point from the mobile robot, judging whether an invisible obstacle is encountered. The invisible obstacle mentioned here has already been explained in the embodiment corresponding to fig. 6, and is not repeated here. If the sweep self-moving robot does not encounter an invisible obstacle, step 802 is executed to update the exploration information.
807: if the robot meets invisible obstacles, whether the change times of changing the walking direction of the self-moving robot during sweeping exceeds the preset times or not is judged.
Specifically, each time the self-moving robot encounters an invisible obstacle, it changes its walking direction and keeps trying to reach the current target point.
808: and if the number of times of changing the walking direction of the self-moving robot during sweeping exceeds the preset number of times, marking the obstacle in the tried exploration area.
809: the next target point (next target point) located on the border of the explored and unexplored area is sought. If the number of times of changing the walking direction of the mobile robot during sweeping does not exceed the preset number of times, the tried search area is marked as an obstacle, and step 802 is executed to update the search information.
810: sweeping is performed whether the mobile robot finds the next target point. If so, the step 802 is executed to update the exploration information, and then the above steps are continuously executed. If not, the exploration of the charging seat is finished.
Because every target point determined by the sweeping self-moving robot is set on the boundary line between the explored area and the unexplored area, the robot is always guided to explore toward unexplored areas. This effectively avoids repeatedly recognizing or searching areas that have already been explored and thus effectively improves area exploration efficiency.
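Reusing the helpers sketched earlier, the overall dock-search loop of steps 801-810 might look roughly as follows; every interface shown here (grid, robot, and sensor methods) is an assumed stand-in for the components described above, not the claimed implementation:

```python
def search_charging_dock(robot, grid, sensor, max_direction_changes=5):
    """High-level sketch of steps 801-810: returns True once the charging dock
    is found and marked, False when no unexplored area remains."""
    target = pick_initial_target(robot.position(), robot.heading(),      # step 801
                                 sensor.obstacle_free_bearings())
    while target is not None:
        grid.update_exploration_info(robot, sensor)                      # step 802
        if sensor.signal_received():                                     # steps 803, 805
            grid.mark_dock(estimate_target_position(robot.pose(),
                                                    sensor.relative_angle(),
                                                    sensor.strength()))
            return True
        # Steps 804 and 806-808: try to reach the target; invisible obstacles
        # are marked after the preset number of direction changes.
        go_to_target(robot, grid, target, max_direction_changes)
        # Steps 809-810: next target point on the explored/unexplored boundary.
        frontier = boundary_points(grid.explored_cells(), grid.unexplored_cells())
        target = pick_first_target(frontier, robot.position())
    return False
```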
Fig. 9 is a schematic structural diagram of a self-moving robot according to an embodiment of the present application. The self-moving robot comprises a machine body, one or more processors 902, one or more memories 903 storing computer programs, and first sensors 905, where the first sensors 905 include at least one first sensor 905 deployed externally on the self-moving robot and other first sensors 905 installed on the machine body to maintain the basic functions of the self-moving device. In addition, the self-moving device may include other necessary components such as a second sensor 901 and a power supply component 904.
The at least one external first sensor is used for acquiring preset signals within respective signal sensing ranges;
one or more processors 902 for executing a computer program for:
determining a first exploration area on a map corresponding to the environment where the self-moving robot is located, wherein the first exploration area is within a signal perception range of the first sensor; the first sensor is used for sensing a preset signal emitted by a target object;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the first exploration area.
Optionally, the one or more processors 902 are configured to mark, if the first sensor receives the preset signal, that the target object exists in the first exploration area, so that the self-moving robot can find the target object again later.
Optionally, one or more processors 902 for controlling the self-moving robot to move from the first location to a second location; determining a second exploration area corresponding to the second position on the map according to the signal perception range of the first sensor; if the preset signal is not received through the first sensor, marking that the target object does not exist in the second exploration area; and if a preset signal transmitted by a target object is received through the first sensor, marking the position of the target object in the second exploration area.
Optionally, the target object that the self-moving robot searches for may be a charging dock.
Optionally, if the one or more first sensors 905 detect an obstacle within the first exploration area, the one or more processors 902 mark the location of the obstacle within the first exploration area.
Optionally, the first position is an initial position. One or more processors 902 are configured to determine a first target point on the map, the first target point being no further from the first location than the signal perception range, the second location being any location in the process of moving from the first location to the first target point.
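For illustration, a minimal sketch of choosing a first target point whose distance from the first position does not exceed the sensor's perception range is shown below; the heading choice and example values are assumptions.

```python
import math

def first_target_point(first_pos, heading_rad, perception_range):
    """Place the target point at most perception_range away from first_pos."""
    x, y = first_pos
    return (x + perception_range * math.cos(heading_rad),
            y + perception_range * math.sin(heading_rad))

print(first_target_point((0.0, 0.0), heading_rad=0.0, perception_range=3.0))
# -> (3.0, 0.0): the farthest point that is still within the sensing range
```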
Optionally, if the self-moving robot collides with an obstacle after moving to the second position, the one or more processors 902 continue to control the self-moving robot to move to the first target point by changing the walking direction of the self-moving robot; and if the change times of the walking direction reach preset times, updating the first target point to the second position, and marking the position of the obstacle in the second exploration area.
Optionally, the one or more processors 902 are configured to determine a second target point on a boundary between an explored area and an unexplored area if the labeling result of the target object is not obtained when the self-moving robot moves to the first target point, where the explored area is composed of an exploration area corresponding to each of a plurality of positions in the process of moving from the first position to the first target point; controlling the self-moving robot to move from the first target point to the second target point; determining a third exploration area corresponding to a third position on the map according to the signal perception range of the first sensor, wherein the third position is any position in the process of moving from the first target point to the second target point; if the preset signal is not received through the first sensor, marking that the target object does not exist in the third exploration area; and if a preset signal transmitted by a target object is received through the first sensor, marking the position of the target object in the third exploration area.
Optionally, the first sensor is at least one infrared receiver.
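As a trivial sketch of how readings from several infrared receivers might be checked, assuming each receiver exposes a signal-strength value; the receiver layout and threshold are hypothetical.

```python
def receivers_detecting_signal(ir_readings, threshold=0.5):
    """Return the indices of receivers that detect the dock's coded signal."""
    return [i for i, reading in enumerate(ir_readings) if reading >= threshold]

# Three receivers mounted at different orientations; the second one sees the dock.
hits = receivers_detecting_signal([0.1, 0.8, 0.0])
print(hits or "no preset signal received")  # -> [1]
```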
Optionally, the one or more processors 902 may be further configured to: construct, by using a second sensor, a map corresponding to the environment where the self-moving robot is located; and determine, by using the second sensor, a first location of the self-moving robot in the map.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program. When executed by one or more processors, the computer program causes the one or more processors to perform the steps in the method embodiments corresponding to fig. 1-8.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer readable media do not include transitory computer readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. An object searching method, which is suitable for a self-moving robot comprising at least one sensor, is characterized in that the method comprises the following steps:
determining a first exploration area on a map corresponding to the environment where the self-moving robot is located, wherein the first exploration area is within a signal sensing range of a first sensor;
the first sensor is used for sensing a preset signal emitted by a target object;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the first exploration area.
2. The method of claim 1, further comprising: and if the first sensor receives the preset signal, marking that the target object exists in the first exploration area.
3. The method of claim 1, wherein said marking of the absence of the target object within the first exploration area further comprises:
controlling the self-moving robot to move from a first location to a second location in the first exploration area;
determining a second exploration area corresponding to the second position on the map according to the signal perception range of the first sensor;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the second exploration area;
and if the first sensor receives a preset signal transmitted by a target object, marking the position of the target object in the second exploration area.
4. The method of claim 1, further comprising:
and if an obstacle is detected in the first exploration area, marking the position of the obstacle in the first exploration area.
5. The method of claim 3, wherein the first position is an initial position, the method further comprising:
and determining a first target point on the map, wherein the distance between the first target point and the first position is not more than the signal perception range, and the second position is any position in the process of moving from the first position to the first target point.
6. The method of claim 5, further comprising:
if the self-moving robot collides with an obstacle after moving to the second position, the self-moving robot is continuously controlled to move to the first target point by changing the walking direction of the self-moving robot;
and if the change times of the walking direction reach preset times, updating the first target point to the second position, and marking the position of the obstacle in the second exploration area.
7. The method of claim 5, further comprising:
if the marking result of the target object is not obtained when the self-moving robot moves to the first target point, determining a second target point on the boundary of an explored area and an unexplored area, wherein the explored area consists of exploring areas corresponding to a plurality of positions in the process of moving from the first position to the first target point;
controlling the self-moving robot to move from the first target point to the second target point;
determining a third exploration area corresponding to a third position on the map according to the signal perception range of the first sensor, wherein the third position is any position in the process of moving from the first target point to the second target point;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the third exploration area;
and if the first sensor receives a preset signal transmitted by a target object, marking the position of the target object in the third exploration area.
8. The method of claim 1, further comprising, prior to determining the first exploration area on a map corresponding to the environment in which the self-moving robot is located:
the self-moving robot utilizes a second sensor to construct a map corresponding to the environment where the self-moving robot is located;
the self-moving robot determines a first location of the self-moving robot in the map using the second sensor.
9. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by one or more processors, causes the one or more processors to perform acts comprising:
determining a first exploration area on a map corresponding to the environment where the self-moving robot is located, wherein the first exploration area is within a signal sensing range of a first sensor;
the first sensor is used for sensing a preset signal emitted by a target object;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the first exploration area.
10. A self-moving robot, comprising: the machine body is provided with one or more processors, one or more memories for storing computer programs and a first sensor;
the one or more processors execute the computer program to:
determining a first exploration area on a map corresponding to the environment where the self-moving robot is located, wherein the first exploration area is within a signal sensing range of a first sensor;
the first sensor is used for sensing a preset signal emitted by a target object;
if the first sensor does not receive the preset signal, marking that the target object does not exist in the first exploration area.
CN201910945132.4A 2019-09-30 2019-09-30 Object searching method, device and storage medium Active CN112578787B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910945132.4A CN112578787B (en) 2019-09-30 2019-09-30 Object searching method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910945132.4A CN112578787B (en) 2019-09-30 2019-09-30 Object searching method, device and storage medium

Publications (2)

Publication Number Publication Date
CN112578787A true CN112578787A (en) 2021-03-30
CN112578787B CN112578787B (en) 2022-11-18

Family

ID=75117290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910945132.4A Active CN112578787B (en) 2019-09-30 2019-09-30 Object searching method, device and storage medium

Country Status (1)

Country Link
CN (1) CN112578787B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150151646A1 (en) * 2012-01-17 2015-06-04 Sharp Kabushiki Kaisha Self-propelled electronic device
CN106814739A (en) * 2017-04-01 2017-06-09 珠海市微半导体有限公司 A kind of mobile robot recharges control system and control method
CN108037759A (en) * 2017-12-05 2018-05-15 福玛特机器人科技股份有限公司 Sweeping robot recharges system and recharges paths planning method
CN109062207A (en) * 2018-08-01 2018-12-21 深圳乐动机器人有限公司 Localization method, device, robot and the storage medium of cradle

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113440054A (en) * 2021-06-30 2021-09-28 北京小狗吸尘器集团股份有限公司 Method and device for determining range of charging base of sweeping robot
CN114543808A (en) * 2022-02-11 2022-05-27 杭州萤石软件有限公司 Indoor relocation method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN112578787B (en) 2022-11-18

Similar Documents

Publication Publication Date Title
CN106980320B (en) Robot charging method and device
CN109062207B (en) Charging seat positioning method and device, robot and storage medium
US9020682B2 (en) Autonomous mobile body
CN107041718B (en) Cleaning robot and control method thereof
CN109407675B (en) Obstacle avoidance method and chip for robot returning seat and autonomous mobile robot
EP3782774B1 (en) Mobile robot
KR101494224B1 (en) Autonomous mobile body
EP3951543A1 (en) Mobile device recharging method and mobile device
US7734385B2 (en) Traveling control method, medium, and apparatus for autonomous navigation
KR101430103B1 (en) Golf ball pick up robot
KR102350533B1 (en) Method of configuring position based on vision information and robot implementing thereof
EP2336802A1 (en) Enhanced visual landmark for localization
US11188753B2 (en) Method of using a heterogeneous position information acquisition mechanism in an operating space and robot and cloud server implementing the same
CN112578787B (en) Object searching method, device and storage medium
CN112214015A (en) Self-moving robot and recharging method, system and computer storage medium thereof
CN108108850B (en) Motion device, path searching control method thereof and device with storage function
CN101847011B (en) Portable area positioning and covering method for mobile robot
CN111694360B (en) Method and device for determining position of sweeping robot and sweeping robot
CN108061886A (en) The recharging method and sweeping robot of sweeping robot
CN112987743B (en) Quick seat finding method for robot, chip and robot
CN114061561A (en) Intelligent navigation system
CN116661458A (en) Robot travel control method, robot, and storage medium
WO2022222678A1 (en) Intelligent mowing system and intelligent mowing device
Noaman et al. Landmarks exploration algorithm for mobile robot indoor localization using VISION sensor
JP7149531B2 (en) Leading device and leading method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant