CN110134117B - Mobile robot repositioning method, mobile robot and electronic equipment - Google Patents

Mobile robot repositioning method, mobile robot and electronic equipment

Info

Publication number
CN110134117B
Authority
CN
China
Prior art keywords
mobile robot
data set
target
image data
preset
Prior art date
Legal status
Active
Application number
CN201810126863.1A
Other languages
Chinese (zh)
Other versions
CN110134117A (en)
Inventor
朱云飞
郭斌
朱建华
Current Assignee
Hangzhou Ezviz Software Co Ltd
Original Assignee
Hangzhou Ezviz Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Ezviz Software Co Ltd filed Critical Hangzhou Ezviz Software Co Ltd
Priority to CN201810126863.1A priority Critical patent/CN110134117B/en
Publication of CN110134117A publication Critical patent/CN110134117A/en
Application granted granted Critical
Publication of CN110134117B publication Critical patent/CN110134117B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device

Abstract

The application provides a mobile robot repositioning method, a mobile robot and an electronic device. The mobile robot repositioning method comprises the following steps: controlling the mobile robot to move to a predetermined reference position; acquiring a target image data set with an image acquisition device of the mobile robot; determining, from the reference data sets contained in a preset feature database, a target reference data set that matches the target image data set, each reference data set being an image data set collected at a predetermined reference position while the mobile robot had position information; and determining the target associated coordinates corresponding to the target reference data set as the coordinates of the current position of the mobile robot, the target associated coordinates being the coordinates of the reference position corresponding to the target reference data set. This scheme enables fast and effective repositioning of the mobile robot.

Description

Mobile robot repositioning method, mobile robot and electronic equipment
Technical Field
The present disclosure relates to the field of mobile robots, and in particular, to a mobile robot repositioning method, a mobile robot, and an electronic device.
Background
A mobile robot is a machine that carries out work automatically: it can accept human commands, run pre-programmed routines, and act according to policies formulated with artificial intelligence techniques. The task of today's mobile robots is to assist or replace humans in work such as production, construction, or dangerous operations.
A mobile robot can construct a map of a scene according to a map construction algorithm and then perform location-based work on the constructed map. In some cases, however, the mobile robot needs to be relocated. Relocation means that the mobile robot, placed at an arbitrary position within a known map environment, must find its own coordinates in that map. For example, when the mobile robot is restarted after a shutdown, relocation is required.
Therefore, how to quickly and effectively realize the relocation of the mobile robot is an urgent problem to be solved.
Disclosure of Invention
In view of this, the present application provides a method for repositioning a mobile robot, a mobile robot and an electronic device, so as to quickly and effectively reposition the mobile robot.
Specifically, the method is realized through the following technical scheme:
in a first aspect, an embodiment of the present application provides a mobile robot repositioning method, including:
controlling the mobile robot to move to a preset reference position;
acquiring a target image dataset with an image acquisition device of the mobile robot;
determining a target reference data set matched with the target image data set from each reference data set contained in a preset feature database; wherein each reference data set is: an image data set collected at predetermined respective reference positions when the mobile robot has position information;
determining the target associated coordinates corresponding to the target reference data set as the coordinates of the current position of the mobile robot; and the target associated coordinates are coordinates of a reference position corresponding to the target reference data set.
Optionally, the respective reference positions are:
and the included angle positions of the target scene where the mobile robot is located.
Optionally, the step of controlling the mobile robot to move to a predetermined reference position includes:
and controlling the mobile robot to move to a preset included angle position based on the distance measuring sensor and the gyroscope of the mobile robot.
Optionally, the step of controlling the mobile robot to move to a predetermined included angle position based on the distance measuring sensor and the gyroscope of the mobile robot includes:
controlling a mobile robot to rotate in situ by a preset angle, and sampling data by using a ranging sensor and a gyroscope of the mobile robot in the rotating process to obtain a distance data set and an angle data set;
converting each sampling point in the data sampling process to a two-dimensional coordinate system based on the distance data set and the angle data set, and fitting to obtain at least one straight line in the two-dimensional coordinate system based on each sampling point in the two-dimensional coordinate system;
controlling the mobile robot to move along any straight line obtained by fitting, and adjusting the mobile robot to a target position when the mobile robot is detected to be blocked by a target obstacle to advance, wherein the target position is as follows: and the position which is away from the target obstacle by a first distance and is away from the obstacle corresponding to the straight line by a second distance.
Optionally, the respective reference positions are:
respective positions at a predetermined distance from the first auxiliary signal device;
or,
respective positions at which a second auxiliary signal device receives a network signal, emitted by the mobile robot, at a predetermined strength value.
Optionally, the step of acquiring a target image data set by using an image acquisition device of the mobile robot includes:
and controlling the mobile robot to rotate to a preset deflection angle in situ, acquiring image data by using image acquisition equipment of the mobile robot after the mobile robot rotates to the preset deflection angle, and constructing a target image data set based on the acquired image data.
Optionally, the step of acquiring a target image data set by using an image acquisition device of the mobile robot includes:
and controlling the mobile robot to rotate to a plurality of different preset deflection angles in situ in sequence, acquiring image data by using image acquisition equipment of the mobile robot after each rotation to one preset deflection angle, and constructing a target image data set based on the acquired image data.
Optionally, the method for relocating a mobile robot provided in the embodiment of the present application further includes:
replacing the target reference data set in the feature database with the target image data set when a predetermined replacement condition is satisfied.
In a second aspect, an embodiment of the present application provides a mobile robot, including:
the mobile control unit is used for controlling the mobile robot to move to a preset reference position;
A data set obtaining unit for acquiring a target image data set by using an image acquisition device of the mobile robot;
a target reference data set determining unit configured to determine a target reference data set matching the target image data set from each reference data set included in a preset feature database; wherein each reference data set is: an image data set collected at predetermined respective reference positions when the mobile robot has position information;
the coordinate determination unit is used for determining the target associated coordinates corresponding to the target reference data set as the coordinates of the current position of the mobile robot; and the target associated coordinates are coordinates of a reference position corresponding to the target reference data set.
Optionally, the respective reference positions are:
and the included angle positions of the target scene where the mobile robot is located.
Optionally, the mobile control unit is specifically configured to:
and controlling the mobile robot to move to a preset included angle position based on the distance measuring sensor and the gyroscope of the mobile robot.
Optionally, the mobile control unit is specifically configured to:
controlling a mobile robot to rotate in situ by a preset angle, and sampling data by using a ranging sensor and a gyroscope of the mobile robot in the rotating process to obtain a distance data set and an angle data set;
Converting each sampling point in the data sampling process to a two-dimensional coordinate system based on the distance data set and the angle data set, and fitting to obtain at least one straight line in the two-dimensional coordinate system based on each sampling point in the two-dimensional coordinate system;
controlling the mobile robot to move along any straight line obtained by fitting, and adjusting the mobile robot to a target position when the mobile robot is detected to be blocked by a target obstacle to advance, wherein the target position is as follows: and the position which is away from the target obstacle by a first distance and is away from the obstacle corresponding to the straight line by a second distance.
Optionally, the respective reference positions are:
respective positions at a predetermined distance from the first auxiliary signal device;
or,
respective positions at which a second auxiliary signal device receives a network signal, emitted by the mobile robot, at a predetermined strength value.
Optionally, the data set obtaining unit is specifically configured to:
and controlling the mobile robot to rotate to a preset deflection angle in situ, acquiring image data by using image acquisition equipment of the mobile robot after the mobile robot rotates to the preset deflection angle, and constructing a target image data set based on the acquired image data.
Optionally, the data set obtaining unit is specifically configured to:
and controlling the mobile robot to rotate to a plurality of different preset deflection angles in situ in sequence, acquiring image data by using image acquisition equipment of the mobile robot after each rotation to one preset deflection angle, and constructing a target image data set based on the acquired image data.
Optionally, the mobile robot provided in the embodiment of the present application further includes:
a replacement unit configured to replace the target reference data set in the feature database with the target image data set when a predetermined replacement condition is satisfied.
In a third aspect, an embodiment of the present application provides an electronic device, including: an internal bus, a memory, a processor, and a communication interface; the processor, the communication interface and the memory communicate with one another through the internal bus; and the memory is used for storing machine-executable instructions corresponding to the mobile robot repositioning method;
the processor is configured to read the machine-readable instructions on the memory, and execute the instructions to implement the mobile robot relocation method provided in the first aspect.
In the method provided by the embodiment of the application, after the mobile robot is controlled to move to a preset reference position, the image acquisition equipment of the mobile robot is used for acquiring a target image data set; and determining a target reference data set matched with the target image data set from all reference data sets contained in a preset feature database, namely determining which specific reference position the mobile robot currently moves to, and further determining target associated coordinates corresponding to the target reference data set as coordinates of the current position of the mobile robot. Therefore, the repositioning of the mobile robot can be quickly and effectively realized through the scheme.
Drawings
Fig. 1 is a flowchart of a mobile robot repositioning method according to an embodiment of the present disclosure;
fig. 2(a) is a schematic diagram of coordinates formed by converting each sampling point into a two-dimensional coordinate system, and fig. 2(b) is a schematic diagram of at least one straight line obtained based on fitting of the coordinate points shown in fig. 2 (a);
fig. 3(a) is a top view of a room where the mobile robot is located, fig. 3(b) is a schematic diagram of a grid map of the room where the mobile robot is located, and fig. 3(c) is a schematic diagram of the mobile robot moving to an included angle position corresponding to the included angle a;
Fig. 4 is a schematic structural diagram of a mobile robot according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
In order to quickly and effectively realize the relocation of a mobile robot, the embodiment of the application provides a mobile robot relocation method, a mobile robot and electronic equipment.
First, a mobile robot repositioning method provided in an embodiment of the present application will be described.
It should be noted that the executing body of the mobile robot repositioning method provided by the embodiment of the present application may be a mobile robot repositioning device. In a specific application, the mobile robot repositioning device may run on the mobile robot itself, on a cloud server corresponding to the mobile robot, or on a client device corresponding to the mobile robot.
The triggering condition of relocation may be: restarting after shutdown, or losing map coordinates due to other reasons, or receiving a user-specified repositioning instruction, and the like. The precondition for repositioning the mobile robot in the target scene is: a map is previously constructed about the target scene according to a map construction algorithm so that the mobile robot can have position information before relocation, and the mobile robot relocation apparatus can previously construct a feature database and associate the feature database with coordinates. It can be understood that any mapping algorithm may be applied to the mapping process of the target scene, and the mapping process does not relate to the invention point of the embodiment of the present application, and therefore, is not limited herein.
In addition, the target scene related to the present application may be an indoor scene, such as an indoor home, an underground parking lot, or an office building. Of course, the target scene is not limited to indoor scenes; the method provided in the present application can also be applied to small outdoor scenes such as open parking lots, parks, and factory areas. As shown in fig. 1, a mobile robot repositioning method may include:
S101, controlling the mobile robot to move to a preset reference position;
In order to quickly and effectively realize the relocation of the mobile robot, the relocation idea proposed in the embodiment of the present application is as follows: the mobile robot repositioning device controls the mobile robot to move to some reference position, and then determines, based on an image data set, which reference position has been reached, so that the coordinates can be determined and relocation completed. Based on this idea, when the mobile robot repositioning device detects that a relocation condition is met, it can control the mobile robot to move to a preset reference position and then carry out the subsequent image acquisition process, where the preset reference position is any one of a plurality of predetermined reference positions.
It is emphasized that the respective reference positions are positions that can be determined independently of map information. In particular applications, there are various specific forms of the respective reference positions, which are exemplified below.
Optionally, in a specific implementation manner, the respective reference positions are the included angle positions of the target scene where the mobile robot is located. In terms of the angle value, the included angle corresponding to an included angle position may or may not be a right angle; in terms of what forms the angle, the included angle may be formed by walls, by non-wall objects, or by a wall together with a non-wall object. For example, referring to the top view of the room shown in fig. 3(a), the included angles A, B, C, D and E exist in the room, and the predetermined reference position to which the mobile robot moves may be any one of the included angle positions corresponding to A, B, C, D and E.
When the reference positions are the included angle positions of the target scene where the mobile robot is located, the mobile robot repositioning device may, for example, control the mobile robot to move to a predetermined included angle position based on the distance measuring sensor and the gyroscope of the mobile robot, although other approaches are possible. Specifically, the process of controlling the mobile robot to move to a predetermined included angle position based on the distance measuring sensor and the gyroscope of the mobile robot may include:
controlling the mobile robot to rotate in place by a preset angle, and sampling data by using a ranging sensor and a gyroscope of the mobile robot in the rotating process to obtain a distance data set and an angle data set;
converting each sampling point in the data sampling process to a two-dimensional coordinate system based on the distance data set and the angle data set, and fitting to obtain at least one straight line in the two-dimensional coordinate system based on each sampling point in the two-dimensional coordinate system;
controlling the mobile robot to move along any straight line obtained by fitting, and adjusting the mobile robot to a target position when the mobile robot is detected to be blocked by a target obstacle, wherein the target position is as follows: and the position which is away from the target obstacle by a first distance and is away from the obstacle corresponding to the straight line by a second distance.
The distance data set has a one-to-one correspondence with the angle data set. Any prior-art conversion method may be used to convert the sampling points acquired during data sampling into the two-dimensional coordinate system based on the distance data set and the angle data set, and any prior-art line-fitting method may be used to fit at least one straight line from the sampling points in the two-dimensional coordinate system. For example, a schematic diagram of the coordinates obtained by converting the sampled points into the two-dimensional coordinate system is shown in fig. 2(a), and a schematic diagram of at least one straight line obtained by fitting the coordinate points in fig. 2(a) is shown in fig. 2(b).
It can be understood that each fitted straight line is an obstacle boundary, so controlling the mobile robot to move along any fitted straight line means moving it along some obstacle boundary. When the mobile robot is detected to be blocked by a target obstacle and can no longer advance, it can be judged that the mobile robot has approached an included angle. Because there are several possible positions near an included angle while a reference position must be unique, once the mobile robot is judged to be near the included angle it can be adjusted, based on the distance measuring sensor, to the target position, namely the position that is a first distance from the target obstacle and a second distance from the obstacle corresponding to the straight line. The predetermined angle may be set according to the specific application, for example 360 degrees, 300 degrees, 240 degrees or 180 degrees; similarly, the first distance and the second distance may be set according to the specific application, and setting them to the same value ensures that the mobile robot settles at a unique included angle position no matter from which direction it approaches the angle.
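As an illustration of the sampling-and-fitting step above, the following Python sketch converts distance/angle samples into 2-D points and extracts obstacle-boundary lines with a RANSAC-style search followed by a least-squares refinement. It is a minimal sketch under assumed interfaces: the function names, thresholds, and the particular fitting method are illustrative choices, not the implementation required by this application.

```python
# Minimal sketch: polar samples (ranging sensor + gyroscope) -> 2-D points -> fitted lines.
import numpy as np

def samples_to_points(distances, angles_deg):
    """Convert polar samples to 2-D points in the robot-centred coordinate system."""
    angles = np.radians(np.asarray(angles_deg))
    d = np.asarray(distances)
    return np.column_stack((d * np.cos(angles), d * np.sin(angles)))

def fit_lines(points, inlier_tol=0.05, min_inliers=20, iterations=200):
    """Greedily extract straight lines (obstacle boundaries) from the sampled points."""
    lines, remaining = [], points.copy()
    rng = np.random.default_rng(0)
    while len(remaining) >= min_inliers:
        best_inliers = None
        for _ in range(iterations):
            p1, p2 = remaining[rng.choice(len(remaining), 2, replace=False)]
            direction = p2 - p1
            norm = np.linalg.norm(direction)
            if norm < 1e-6:
                continue
            normal = np.array([-direction[1], direction[0]]) / norm
            dist = np.abs((remaining - p1) @ normal)        # point-to-line distances
            inliers = dist < inlier_tol
            if best_inliers is None or inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        if best_inliers is None or best_inliers.sum() < min_inliers:
            break
        # Refine the line over its inliers with a least-squares (SVD) fit.
        pts = remaining[best_inliers]
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)
        lines.append((centroid, vt[0]))                     # (point on line, unit direction)
        remaining = remaining[~best_inliers]
    return lines
```

Each returned line is represented as a point on the line and a unit direction vector, which is enough information to drive the robot along the corresponding obstacle boundary until it is blocked by the target obstacle.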
Optionally, in another specific implementation manner, the respective reference positions may be: respective positions at a predetermined distance from the first auxiliary signal device. Accordingly, the step of controlling the mobile robot to move to a predetermined reference position may include: controlling the mobile robot to move along a preset trajectory, monitoring the distance between the mobile robot and the first auxiliary signal device during the movement, and, when the monitored distance equals the predetermined distance, determining that the mobile robot has moved to a predetermined reference position, i.e., a position at the predetermined distance from the first auxiliary signal device.
It will be appreciated that the mobile robot may be Bluetooth enabled, in which case the first auxiliary signal device may be a Bluetooth-enabled device with a fixed location, and the distance between the first auxiliary signal device and the mobile robot may be determined from Bluetooth ranging. Alternatively, the mobile robot may comprise a signal transmitter and the first auxiliary signal device a signal receiver, and the distance between them may be determined from the signal transmission time. The signal transmitter and receiver may be, but are not limited to, optical or acoustic; the preset trajectory may be generated randomly by the mobile robot repositioning device or according to a particular algorithm.
Optionally, in another specific implementation manner, the respective reference positions may be: respective positions at which a second auxiliary signal device receives a network signal, emitted by the mobile robot, at a predetermined strength value. Accordingly, the step of controlling the mobile robot to move to a predetermined reference position may include: controlling the mobile robot to move along a preset trajectory, monitoring the strength of the network signal received by the second auxiliary signal device during the movement, and, when that strength reaches the predetermined strength value, determining that the mobile robot has moved to a predetermined reference position, i.e., a position at which the second auxiliary signal device receives the network signal at the predetermined strength value.
The second auxiliary signal device is a device that has a wireless access module and a fixed position, and the mobile robot may carry a wifi detector so that it can create a wireless network through the detector and measure the strength of the network signal received by the second auxiliary signal device. The wifi detector may be a prior-art wifi probe, i.e., a device used to detect wifi terminals in the surrounding environment. Its basic operating principle is to create a wifi hotspot and collect information about reachable terminals in the surroundings, such as unique identification information, a wifi signal strength value, a wifi signal duration, and so on, where the unique identification information of a wifi terminal may be a MAC (Media Access Control) address. Of course, the specific form of the wifi detector is not limited to a wifi probe.
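Both auxiliary-signal variants above reduce to the same control loop: drive along a preset trajectory and stop when a monitored measurement (the distance to the first auxiliary signal device, or the signal strength reported by the second auxiliary signal device) reaches its predetermined value. The sketch below assumes hypothetical robot and measurement interfaces (robot.move_to and the measure callback) purely for illustration.

```python
# Minimal, hypothetical sketch of moving to a signal-defined reference position.
def move_to_signal_reference(robot, trajectory, measure, target, tolerance):
    """Follow `trajectory` until `measure()` is within `tolerance` of `target`."""
    for waypoint in trajectory:
        robot.move_to(waypoint)        # assumed motion primitive
        value = measure()              # e.g. Bluetooth distance or reported signal strength
        if abs(value - target) <= tolerance:
            return True                # predetermined reference position reached
    return False

# Illustrative usage (all names assumed):
# reached = move_to_signal_reference(robot, planned_path,
#                                    measure=lambda: bt.distance_to_anchor(),
#                                    target=1.0, tolerance=0.05)
```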
It should be noted that the specific forms of the reference positions given above are only examples and should not be construed as limiting the embodiments of the present application.
S102, acquiring a target image data set by using image acquisition equipment of the mobile robot;
after the mobile robot repositioning device controls the mobile robot to move to a preset reference position, a target image data set can be acquired by using image acquisition equipment of the mobile robot, and then the preset reference position is analyzed based on the target image data set. The image capturing device may specifically be a camera, but is not limited to this.
Optionally, in a specific implementation manner, the step of acquiring the target image dataset by using the image acquisition device of the mobile robot may include:
and controlling the mobile robot to rotate to a preset deflection angle in situ, acquiring image data by using image acquisition equipment of the mobile robot after the mobile robot rotates to the preset deflection angle, and constructing a target image data set based on the acquired image data.
In this specific implementation manner, after the mobile robot rotates in situ to the predetermined deflection angle, at least one picture may be acquired by the image acquisition device, or at least one video may be recorded by the image acquisition device, and then the acquired at least one picture or the acquired at least one video may be used as the content of the target image data set. And, the predetermined deflection angle is an angle offset based on a reference direction, wherein the reference direction may be set according to a specific application, and a specific angle value of the predetermined deflection angle may also be set according to the specific application. It will be appreciated that in a particular application, the predetermined deflection angle may be determined in accordance with the indoor layout such that the captured target image dataset contains richer item information.
Optionally, in another specific implementation manner, the step of acquiring the target image data set by using the image acquisition device of the mobile robot may include:
the mobile robot is controlled to rotate to a plurality of different preset deflection angles in situ in sequence, after the mobile robot rotates to a preset deflection angle each time, image data are collected by using image collection equipment of the mobile robot, and a target image data set is constructed on the basis of the collected image data.
In this specific implementation manner, after the mobile robot rotates to a predetermined deflection angle in situ each time, at least one picture may be acquired by the image acquisition device, or at least one video may be recorded by the image acquisition device, and then the at least one picture or the at least one video corresponding to each of the acquired predetermined deflection angles is used as the content of the target image data set, where the target image data set corresponds to data content at multiple angles. Also, any one of the predetermined deflection angles is an angle that is offset based on a reference direction, wherein the reference direction may be set according to a specific application, and a plurality of different predetermined deflection angles may be set according to the specific application.
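A minimal sketch of the acquisition step described above is given below: the robot is rotated in place to one or more predetermined deflection angles and one picture is grabbed at each angle. Here robot.rotate_to is an assumed motion primitive, the default angles are illustrative, and OpenCV's VideoCapture is used only as one possible image acquisition interface.

```python
# Minimal sketch: build the target image data set at predetermined deflection angles.
import cv2

def acquire_target_dataset(robot, camera_index=0, deflection_angles_deg=(0, 90, 180, 270)):
    dataset = []
    cap = cv2.VideoCapture(camera_index)
    try:
        for angle in deflection_angles_deg:        # illustrative angle values
            robot.rotate_to(angle)                 # rotate in place to the deflection angle (assumed API)
            ok, frame = cap.read()                 # capture one picture at this angle
            if ok:
                dataset.append((angle, frame))
    finally:
        cap.release()
    return dataset
```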
S103, determining a target reference data set matched with the target image data set from each reference data set contained in a preset feature database; wherein each reference data set is: an image data set collected at predetermined respective reference positions when the mobile robot has position information;
it should be noted that, before relocation, the mobile robot relocation apparatus may pre-construct a feature database, where the feature database includes a plurality of reference data sets. After obtaining the target image dataset, a target reference dataset matching the target image dataset may be determined from each reference dataset contained in a preset feature database. It is understood that, for constructing the feature database, when the mobile robot has position information, the mobile robot relocating device may control the mobile robot to move to each reference position, and collect the reference data set at each reference position, where a specific implementation manner of the mobile robot relocating device controlling the mobile robot to move to each reference position may refer to a specific implementation manner of controlling the mobile robot to move to a predetermined reference position. In addition, in order to ensure that the target image data set is effectively matched with each reference data set, the specific implementation mode of acquiring the target image data set by using the image acquisition equipment of the mobile robot is the same as the specific acquisition mode of each reference data set.
There are various specific implementation manners for determining, from the reference data sets contained in the preset feature database, a target reference data set matched with the target image data set, for example: the determination is based on similarity, the matching number of feature points or a neural network model. For clarity of the scheme and layout, the following description will be made by way of example.
S104, determining the target associated coordinates corresponding to the target reference data set as the coordinates of the current position of the mobile robot; wherein the target associated coordinates are coordinates of a reference position corresponding to the target reference data set.
When the feature database is constructed, each reference data set may be associated with the coordinates of the corresponding reference position, that is, the associated coordinates corresponding to each reference data set are predetermined. In this way, after the target reference data set is determined, the target associated coordinates corresponding to the target reference data set can be directly determined as the coordinates of the current position of the mobile robot, so that the relocation of the mobile robot is completed.
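The association between reference data sets and coordinates can be kept in a simple structure; the sketch below (with assumed field names and placeholder entries) shows how, once the target reference data set has been determined, obtaining the coordinates reduces to a direct lookup.

```python
# Minimal sketch of the feature database association; field names and values are assumptions.
feature_database = [
    {"reference_id": "corner_A", "dataset": None, "coords": (0.0, 5.2)},   # placeholder entries
    {"reference_id": "corner_B", "dataset": None, "coords": (4.8, 5.2)},
]

def coords_of(target_reference_id):
    """Return the target associated coordinates for the matched reference data set."""
    for entry in feature_database:
        if entry["reference_id"] == target_reference_id:
            return entry["coords"]
    return None
```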
It should be noted that, in order to prevent the matching degree from continuously decreasing as the scene changes slightly over a long period of time, and thereby improve the stability of the algorithm, the method provided in the embodiment of the present application may further include:
Replacing the target reference data set in the feature database with the target image data set when a predetermined replacement condition is satisfied.
Wherein, the predetermined replacement condition may be: reaching a predetermined time point; alternatively, an alternate instruction is obtained.
Of course, the predetermined replacement condition may also be: a condition relating to a degree of matching of the target reference data set with the target image data set. For example: the matching degree may be a similarity degree, and accordingly, the predetermined replacement condition may be: the similarity of the current time is reduced relative to the similarity of the previous time, or the similarity of the current time is lower than a preset similarity threshold, and the like; alternatively, the matching degree may be a feature point matching number, and the corresponding predetermined replacement condition may be: the feature point matching number of this time is decreased relative to the feature point matching number of the last time, or the feature point matching number of this time is lower than a predetermined number threshold, and so on.
It should be emphasized that the predetermined replacement conditions described above are merely examples and should not be construed as limiting the embodiments of the present application.
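A hedged sketch of the replacement step follows, reusing the database entries from the earlier sketch. The condition shown (the current similarity falling below the previous similarity or below a fixed threshold) is one of the illustrative replacement conditions mentioned above, with an assumed threshold value.

```python
# Minimal sketch: replace the matched reference data set when a replacement condition holds.
def maybe_replace(entry, target_dataset, similarity_now, similarity_prev, threshold=0.6):
    if similarity_now < similarity_prev or similarity_now < threshold:
        entry["dataset"] = target_dataset   # overwrite the stored reference data set
        return True
    return False
```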
In the method provided by the embodiment of the application, after the mobile robot is controlled to move to a preset reference position, the image acquisition equipment of the mobile robot is used for acquiring a target image data set; and determining a target reference data set matched with the target image data set from all reference data sets contained in a preset feature database, namely determining which specific reference position the mobile robot currently moves to, and further determining target associated coordinates corresponding to the target reference data set as coordinates of the current position of the mobile robot. Therefore, the repositioning of the mobile robot can be quickly and effectively realized through the scheme.
For clarity of layout and solution, the following description will be given by way of example of a specific implementation of determining a target reference data set matching the target image data set from each reference data set included in a preset feature database:
optionally, in a specific implementation manner, the step of determining, from each reference data set included in the preset feature database, a target reference data set matching the target image data set may include:
respectively calculating the similarity of the target image data set and each reference data set;
and taking the reference data set with the highest corresponding similarity as a target reference data set matched with the target image data set.
It will be appreciated that when the target image data set and each reference data set contain a picture, the specific process of calculating the similarity between the target image data set and any reference data set may include:
calculating the similarity between the picture contained in the target image data set and the picture contained in the reference data set;
and taking the calculated similarity as the similarity of the target image data set and the reference data set.
When the target image data set and each reference data set include at least two pictures, a specific process of calculating the similarity between the target image data set and any reference data set may include:
Calculating the similarity between at least two pictures contained in the target image data set and each picture in the reference data set;
and taking the calculated highest similarity as the similarity between the target image data set and the reference data set, or taking the average value of the calculated similarities as the similarity between the target image data set and the reference data set.
When the target image data set and each reference data set contain at least one video segment, the specific process of calculating the similarity between the target image data set and any reference data set may include:
calculating the similarity between each video frame in the video contained in the target image data set and each video frame in the video contained in the reference data set;
and taking the calculated highest similarity as the similarity between the target image data set and the reference data set, or taking the average value of the calculated similarities as the similarity between the target image data set and the reference data set.
It can be understood that the calculation method of the similarity between any two pictures may be any calculation method of the similarity in the prior art.
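The following sketch illustrates the similarity-based matching described above. Picture similarity is computed here with an HSV colour-histogram correlation via OpenCV, which is only one common choice and not necessarily the method used by this application; the highest (or average) pairwise similarity is taken as the data-set similarity, and the reference data set with the best score is selected.

```python
# Minimal sketch of similarity-based matching between image data sets.
import cv2
import numpy as np

def picture_similarity(img_a, img_b):
    """Correlation of HSV colour histograms, one of many possible similarity measures."""
    hists = []
    for img in (img_a, img_b):
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        h = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
        hists.append(cv2.normalize(h, h).flatten())
    return cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_CORREL)

def dataset_similarity(target_imgs, reference_imgs, use_max=True):
    scores = [picture_similarity(t, r) for t in target_imgs for r in reference_imgs]
    return max(scores) if use_max else float(np.mean(scores))

def best_matching_reference(target_imgs, reference_datasets):
    """reference_datasets: iterable of (reference_id, list_of_images) pairs (assumed layout)."""
    return max(reference_datasets,
               key=lambda item: dataset_similarity(target_imgs, item[1]))
```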
Optionally, in a specific implementation manner, the step of determining, from each reference data set included in the preset feature database, a target reference data set matching the target image data set may include:
Extracting first-class image characteristic points of pictures or video frames contained in the target image data set;
matching the first type of image feature points with second type of image feature points of pictures or video frames contained in each reference data set respectively;
and taking the reference data set with the highest matching number as a target reference data set matched with the target image data set.
It can be understood that image feature points reflect the essential features of an image and can be used to identify the target objects in the image, so image matching can be completed by matching feature points. The mobile robot repositioning device may use the ORB (Oriented FAST and Rotated BRIEF) algorithm, a fast feature point extraction and description algorithm, to extract the image feature points, but is not limited to this.
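A sketch of this feature-point variant is shown below: ORB keypoints and descriptors are extracted with OpenCV and matched with a Hamming-distance brute-force matcher, and the reference data set yielding the most matches is taken as the target reference data set. The descriptor-distance cut-off is an illustrative assumption.

```python
# Minimal sketch of ORB feature-point matching between the target picture and reference pictures.
import cv2

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def match_count(img_query, img_ref):
    gray_q = cv2.cvtColor(img_query, cv2.COLOR_BGR2GRAY) if img_query.ndim == 3 else img_query
    gray_r = cv2.cvtColor(img_ref, cv2.COLOR_BGR2GRAY) if img_ref.ndim == 3 else img_ref
    _, des_q = orb.detectAndCompute(gray_q, None)
    _, des_r = orb.detectAndCompute(gray_r, None)
    if des_q is None or des_r is None:
        return 0
    matches = matcher.match(des_q, des_r)
    # Keep only reasonably close descriptor pairs (illustrative distance cut-off).
    return sum(1 for m in matches if m.distance < 64)

def best_reference_by_features(query_img, reference_datasets):
    """reference_datasets: iterable of (reference_id, list_of_images) pairs (assumed layout)."""
    return max(reference_datasets,
               key=lambda item: max(match_count(query_img, ref) for ref in item[1]))
```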
Optionally, in another specific implementation manner, the step of determining, from each reference data set included in the preset feature database, a target reference data set matching the target image data set may include:
and determining a target reference data set matched with the target image data set from each reference data set contained in a preset feature database based on a pre-trained neural network model.
The neural network model may be a convolutional neural network model, a recurrent neural network model, or the like.
It can be understood that a neural network model may be trained using a large number of scene pictures with object name labels, and after the training is completed, the neural network model may identify objects included in the pictures input into the neural network model. In this way, the mobile robot relocating device can determine the object contained in the target image data set through the neural network model, and if the object contained in the target image data set is the same as the object contained in a reference data set and the change range of the pixel position occupied by the object in the picture is lower than a predetermined range threshold value, the reference data set can be used as the target reference data set matched with the target image data set.
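The following sketch only illustrates the comparison logic described above. Here detect_objects stands in for any pre-trained detection model returning (label, bounding box) pairs and is a hypothetical placeholder, as is the position-change threshold.

```python
# Minimal sketch: compare detected objects and their pixel positions between two pictures.
def objects_match(target_img, reference_img, detect_objects, max_shift_ratio=0.2):
    """detect_objects(img) -> iterable of (label, (x1, y1, x2, y2)); hypothetical interface."""
    target_objs = {label: box for label, box in detect_objects(target_img)}
    ref_objs = {label: box for label, box in detect_objects(reference_img)}
    if set(target_objs) != set(ref_objs):          # the same objects must be present in both
        return False
    for label, box in target_objs.items():
        ref_box = ref_objs[label]
        shift = max(abs(box[i] - ref_box[i]) for i in range(4))
        if shift > max_shift_ratio * max(target_img.shape[:2]):
            return False                           # pixel-position change exceeds the threshold
    return True
```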
It should be noted that, the specific implementation manner of determining the target reference data set matching the target image data set from the reference data sets contained in the preset feature database is merely an example, and should not be construed as a limitation to the embodiment of the present application.
The method for relocating the mobile robot provided by the embodiment of the present application is described below with reference to specific application examples.
Assuming that a target scene where the mobile robot is located is a room, the room has five reference positions, which are: angle positions A, B, C, D and E, each reference position being L away from the respective angle edge, and true north being the reference direction of the mobile robot, i.e. true north being 0 degrees. The top view of the room can be seen in fig. 3(a), wherein the upper left black rectangular area in fig. 3(a) is used to represent an obstacle, and the same upper left black rectangular area in fig. 3(b) and 3(c) is used to represent an obstacle. In addition, M in fig. 3(a) indicates a mobile robot, and 01 indicates an image capturing apparatus of the mobile robot.
In order to realize relocation, the early data processing process is as follows:
The mobile robot constructs a map of the room according to a mapping algorithm so that it has position information; the grid map shown in fig. 3(b) is the map of the room constructed by the mobile robot. While the mobile robot has position information, the mobile robot repositioning device controls it, based on the ranging sensor and the gyroscope, to move to each included angle position; at each included angle position the robot is controlled to rotate in place, and after it has rotated to the deflection angle theta the image acquisition device collects image data to form a reference data set, the collected image data being one picture. Each included angle position is at a distance L from the edges of the corresponding included angle.
Furthermore, as can be seen from the map of the room, the coordinates of the mobile robot when it moves to the included angle position A are (X0, Yn), so the reference data set collected by the mobile robot at the included angle position A can be associated with the coordinates (X0, Yn); that is, the associated coordinates of the reference data set corresponding to the included angle position A are (X0, Yn). The associated coordinates of the reference data sets corresponding to the included angle positions B, C, D and E are determined in the same way.
Based on the previous data processing process, the relocation process is as follows:
When relocation is needed, the mobile robot repositioning device controls the mobile robot, based on the distance measuring sensor and the gyroscope, to move to a preset included angle position: here, the included angle position A shown in fig. 3(c).
The mobile robot repositioning device controls the mobile robot to rotate to a deflection angle theta in situ, image data are collected by using image collecting equipment of the mobile robot, and a target image data set is constructed based on the collected image data; the acquired image data is a picture, which can be referred to as FA shown in fig. 3 (c).
The mobile robot repositioning device calculates the similarity between the pictures contained in the target image data set and the pictures contained in each reference data set;
and determines the associated coordinates corresponding to the reference data set with the highest similarity, namely (X0, Yn), as the coordinates of the current position of the mobile robot, completing the relocation of the mobile robot.
Therefore, the mobile robot can be quickly and effectively repositioned by the scheme in the specific embodiment.
Corresponding to the method embodiment, the embodiment of the application also provides a mobile robot. As shown in fig. 4, the mobile robot may include:
a movement control unit 410 for controlling the mobile robot to move to a predetermined reference position;
a data set obtaining unit 420 for acquiring a target image data set using an image acquisition device of the mobile robot;
a target reference data set determining unit 430, configured to determine, from each reference data set included in a preset feature database, a target reference data set that matches the target image data set; wherein each reference data set is: an image data set collected at predetermined respective reference positions when the mobile robot has position information;
a coordinate determining unit 440, configured to determine a target associated coordinate corresponding to the target reference data set as a coordinate of a current position of the mobile robot; and the target associated coordinates are coordinates of a reference position corresponding to the target reference data set.
According to the mobile robot provided by the embodiment of the application, after the mobile robot is controlled to move to a preset reference position, the image acquisition equipment of the mobile robot is used for acquiring a target image data set; and determining a target reference data set matched with the target image data set from all reference data sets contained in a preset feature database, namely determining which specific reference position the mobile robot currently moves to, and further determining target associated coordinates corresponding to the target reference data set as coordinates of the current position of the mobile robot. Therefore, the repositioning of the mobile robot can be quickly and effectively realized through the scheme.
Optionally, the respective reference positions are:
and the included angle positions of the target scene where the mobile robot is located.
Optionally, the mobile control unit is specifically configured to:
and controlling the mobile robot to move to a preset included angle position based on the distance measuring sensor and the gyroscope of the mobile robot.
Optionally, the mobile control unit 410 is specifically configured to:
controlling a mobile robot to rotate in situ by a preset angle, and sampling data by using a ranging sensor and a gyroscope of the mobile robot in the rotating process to obtain a distance data set and an angle data set;
Converting each sampling point in the data sampling process to a two-dimensional coordinate system based on the distance data set and the angle data set, and fitting to obtain at least one straight line in the two-dimensional coordinate system based on each sampling point in the two-dimensional coordinate system;
controlling the mobile robot to move along any straight line obtained by fitting, and adjusting the mobile robot to a target position when the mobile robot is detected to be blocked by a target obstacle to advance, wherein the target position is as follows: and the position which is away from the target obstacle by a first distance and is away from the obstacle corresponding to the straight line by a second distance.
Optionally, the respective reference positions are:
respective positions at a predetermined distance from the first auxiliary signal device;
or,
respective positions at which a second auxiliary signal device receives a network signal, emitted by the mobile robot, at a predetermined strength value.
Optionally, the data set obtaining unit 420 is specifically configured to:
and controlling the mobile robot to rotate to a preset deflection angle in situ, acquiring image data by using image acquisition equipment of the mobile robot after the mobile robot rotates to the preset deflection angle, and constructing a target image data set based on the acquired image data.
Optionally, the data set obtaining unit 420 is specifically configured to:
and controlling the mobile robot to rotate to a plurality of different preset deflection angles in situ in sequence, acquiring image data by using image acquisition equipment of the mobile robot after each rotation to one preset deflection angle, and constructing a target image data set based on the acquired image data.
Optionally, the mobile robot may further include:
a replacement unit configured to replace the target reference data set in the feature database with the target image data set when a predetermined replacement condition is satisfied.
Corresponding to the method embodiment, the embodiment of the application also provides an electronic device; as shown in fig. 5, the electronic device includes: an internal bus 510, a memory 520, a processor 530, and a communication interface (Communications Interface) 540; the processor 530, the communication interface 540 and the memory 520 communicate with one another through the internal bus 510;
the memory 520 is configured to store machine-executable instructions corresponding to the mobile robot relocation method;
the processor 530 is configured to read the machine-readable instructions on the memory 520 and execute the instructions to implement a mobile robot relocation method provided by the present application. The mobile robot repositioning method comprises the following steps:
Controlling the mobile robot to move to a preset reference position;
acquiring a target image dataset with an image acquisition device of the mobile robot;
determining a target reference data set matched with the target image data set from each reference data set contained in a preset feature database; wherein each reference data set is: an image data set collected at predetermined respective reference positions when the mobile robot has position information;
determining the target associated coordinates corresponding to the target reference data set as the coordinates of the current position of the mobile robot; and the target associated coordinates are coordinates of a reference position corresponding to the target reference data set.
In this embodiment, for the description of the specific steps of the mobile robot repositioning method, reference may be made to the description in the method embodiments provided in this application, which is not described herein again.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (13)

1. A mobile robot relocation method, comprising:
controlling a mobile robot to rotate in situ by a preset angle, and sampling data by using a ranging sensor and a gyroscope of the mobile robot in the rotating process to obtain a distance data set and an angle data set;
converting each sampling point in the data sampling process to a two-dimensional coordinate system based on the distance data set and the angle data set, and fitting to obtain at least one straight line in the two-dimensional coordinate system based on each sampling point in the two-dimensional coordinate system;
controlling the mobile robot to move along any straight line obtained by fitting, and adjusting the mobile robot to a target position which is preset as a reference position when the mobile robot is detected to be blocked by a target obstacle to advance, wherein the target position is as follows: a position which is a first distance away from the target obstacle and a second distance away from the obstacle corresponding to the straight line;
Acquiring a target image dataset by using an image acquisition device of the mobile robot;
determining a target reference data set matched with the target image data set from each reference data set contained in a preset feature database; wherein each reference data set is: an image data set collected at predetermined respective reference positions when the mobile robot has position information;
determining the target associated coordinates corresponding to the target reference data set as the coordinates of the current position of the mobile robot; and the target associated coordinates are coordinates of a reference position corresponding to the target reference data set.
2. The method of claim 1, wherein the reference position is:
and the included angle position of the target scene where the mobile robot is located.
3. The method of claim 1, wherein the respective reference positions are:
respective positions at a predetermined distance from the first auxiliary signal device;
or,
respective positions at which a second auxiliary signal device receives the network signal at a predetermined strength value; wherein the network signal is emitted by the mobile robot.
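The two alternative criteria in claim 3 amount to a simple qualification test for candidate reference positions; the following sketch, whose helper name, parameter names, and numeric values are hypothetical and not taken from the patent, shows one way such a test could look.

```python
from typing import Optional

def qualifies_as_reference_position(distance_to_first_device_m: Optional[float] = None,
                                    rssi_at_second_device_dbm: Optional[float] = None,
                                    predetermined_distance_m: float = 0.5,
                                    predetermined_rssi_dbm: float = -45.0,
                                    distance_tolerance_m: float = 0.05,
                                    rssi_tolerance_db: float = 3.0) -> bool:
    """A position qualifies if it lies at the predetermined distance from the first
    auxiliary signal device, or if the second auxiliary signal device receives the
    robot-emitted network signal at the predetermined strength value (within tolerance)."""
    by_distance = (distance_to_first_device_m is not None and
                   abs(distance_to_first_device_m - predetermined_distance_m) <= distance_tolerance_m)
    by_signal_strength = (rssi_at_second_device_dbm is not None and
                          abs(rssi_at_second_device_dbm - predetermined_rssi_dbm) <= rssi_tolerance_db)
    return by_distance or by_signal_strength
```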
4. The method according to any one of claims 1 to 3, wherein the step of acquiring a target image dataset with an image acquisition device of the mobile robot comprises:
controlling the mobile robot to rotate in situ to a preset deflection angle, acquiring image data by using the image acquisition device of the mobile robot after the mobile robot rotates to the preset deflection angle, and constructing the target image data set based on the acquired image data.
5. The method according to any of claims 1-3, wherein the step of acquiring a target image dataset with an image acquisition device of the mobile robot comprises:
controlling the mobile robot to rotate in situ to a plurality of different preset deflection angles in sequence, acquiring image data by using the image acquisition device of the mobile robot after each rotation to one of the preset deflection angles, and constructing the target image data set based on the acquired image data.
6. The method of any one of claims 1-3, further comprising:
replacing the target reference data set in the feature database with the target image data set when a predetermined replacement condition is satisfied.
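Claims 4-6 together describe collecting images at one or more preset deflection angles, building the target image data set from them, matching it against the reference data sets, and optionally replacing the matched reference data set. The sketch below shows one possible realization using off-the-shelf ORB features from OpenCV; robot.rotate_to and robot.capture_image are hypothetical stand-ins for the robot's own motion and camera interfaces, the match-distance threshold is arbitrary, and the sketch assumes each data set holds images for the same ordered list of deflection angles.

```python
import cv2

ORB = cv2.ORB_create(nfeatures=500)
MATCHER = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def build_target_image_data_set(robot, preset_angles_deg):
    """Rotate in place to each preset deflection angle, grab an image, and keep its
    ORB descriptors; the resulting list is the target image data set."""
    data_set = []
    for angle in preset_angles_deg:
        robot.rotate_to(angle)             # hypothetical in-place rotation API
        frame = robot.capture_image()      # hypothetical camera API (grayscale image)
        _, descriptors = ORB.detectAndCompute(frame, None)
        data_set.append(descriptors if descriptors is not None else [])
    return data_set

def determine_target_reference_data_set(target_data_set, reference_data_sets):
    """Return the index of the reference data set with the most good descriptor matches."""
    def score(reference_data_set):
        good = 0
        for target_des, ref_des in zip(target_data_set, reference_data_set):
            if len(target_des) == 0 or len(ref_des) == 0:
                continue
            matches = MATCHER.match(target_des, ref_des)
            good += sum(1 for m in matches if m.distance < 40)  # heuristic threshold
        return good
    return max(range(len(reference_data_sets)),
               key=lambda i: score(reference_data_sets[i]))

def maybe_replace(feature_database, matched_index, target_data_set, replacement_condition):
    """Claim 6: replace the matched (target) reference data set with the target image
    data set when the predetermined replacement condition is satisfied."""
    if replacement_condition:
        feature_database[matched_index] = target_data_set
```

Keeping the replacement step separate mirrors claim 6, which makes the replacement contingent on a predetermined condition rather than performing it on every successful repositioning.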
7. A mobile robot, comprising:
a mobile control unit, configured to control the mobile robot to rotate in place by a preset angle and, in the rotating process, perform data sampling by using a ranging sensor and a gyroscope of the mobile robot to obtain a distance data set and an angle data set; convert each sampling point in the data sampling process to a two-dimensional coordinate system based on the distance data set and the angle data set, and fit at least one straight line in the two-dimensional coordinate system based on the sampling points in the two-dimensional coordinate system; and control the mobile robot to move along any straight line obtained by fitting and, when it is detected that a target obstacle blocks the mobile robot from advancing, adjust the mobile robot to a target position that is preset as a reference position, wherein the target position is: a position at a first distance from the target obstacle and at a second distance from the obstacle corresponding to the straight line;
a data set obtaining unit, configured to acquire a target image data set by using an image acquisition device of the mobile robot;
a target reference data set determining unit, configured to determine a target reference data set matched with the target image data set from each reference data set contained in a preset feature database; wherein each reference data set is: an image data set collected at a corresponding predetermined reference position when the mobile robot has position information;
a coordinate determination unit, configured to determine the target associated coordinates corresponding to the target reference data set as the coordinates of the current position of the mobile robot; wherein the target associated coordinates are the coordinates of the reference position corresponding to the target reference data set.
8. The mobile robot of claim 7, wherein the reference positions are:
a corner (included-angle) position of the target scene in which the mobile robot is located.
9. The mobile robot of claim 7, wherein the respective reference positions are:
respective positions at a predetermined distance from the first auxiliary signal device;
or,
respective positions at which a second auxiliary signal device receives the network signal at a predetermined strength value; wherein the network signal is emitted by the mobile robot.
10. A mobile robot according to any of claims 7-9, characterized in that the data set obtaining unit is specifically adapted to:
controlling the mobile robot to rotate in situ to a preset deflection angle, acquiring image data by using the image acquisition device of the mobile robot after the mobile robot rotates to the preset deflection angle, and constructing the target image data set based on the acquired image data.
11. A mobile robot according to any of claims 7-9, characterized in that the data set obtaining unit is specifically adapted to:
controlling the mobile robot to rotate in situ to a plurality of different preset deflection angles in sequence, acquiring image data by using the image acquisition device of the mobile robot after each rotation to one of the preset deflection angles, and constructing the target image data set based on the acquired image data.
12. The mobile robot of any one of claims 7-9, further comprising:
a replacement unit configured to replace the target reference data set in the feature database with the target image data set when a predetermined replacement condition is satisfied.
13. An electronic device, comprising: an internal bus, a memory, a processor, and a communication interface; wherein the processor, the communication interface, and the memory communicate with one another through the internal bus; the memory is configured to store machine-readable instructions corresponding to the mobile robot repositioning method; and
the processor is configured to read the machine-readable instructions from the memory and execute them to implement the mobile robot repositioning method of any one of claims 1-6.
CN201810126863.1A 2018-02-08 2018-02-08 Mobile robot repositioning method, mobile robot and electronic equipment Active CN110134117B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810126863.1A CN110134117B (en) 2018-02-08 2018-02-08 Mobile robot repositioning method, mobile robot and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810126863.1A CN110134117B (en) 2018-02-08 2018-02-08 Mobile robot repositioning method, mobile robot and electronic equipment

Publications (2)

Publication Number Publication Date
CN110134117A (en) 2019-08-16
CN110134117B (en) 2022-07-29

Family

ID=67567594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810126863.1A Active CN110134117B (en) 2018-02-08 2018-02-08 Mobile robot repositioning method, mobile robot and electronic equipment

Country Status (1)

Country Link
CN (1) CN110134117B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110307838B (en) * 2019-08-26 2019-12-10 深圳市优必选科技股份有限公司 Robot repositioning method and device, computer-readable storage medium and robot
CN111158374A (en) * 2020-01-10 2020-05-15 惠州拓邦电气技术有限公司 Repositioning method, repositioning system, mobile robot and storage medium
CN113343739B (en) * 2020-03-02 2022-07-22 杭州萤石软件有限公司 Relocating method of movable equipment and movable equipment
CN112509027B (en) * 2020-11-11 2023-11-21 深圳市优必选科技股份有限公司 Repositioning method, robot, and computer-readable storage medium
CN112729302B (en) * 2020-12-15 2024-03-29 深圳供电局有限公司 Navigation method and device for inspection robot, inspection robot and storage medium
CN114910020B (en) * 2021-02-09 2023-11-21 北京小米机器人技术有限公司 Positioning method and device of movable equipment, movable equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4216772B2 (en) * 2004-06-17 2009-01-28 株式会社東芝 Self-position identification device and self-position identification method
US9014848B2 (en) * 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US20120065829A1 (en) * 2010-09-15 2012-03-15 Jyh-Cheng Yu Wall-following Moving Device
CN103565344B (en) * 2012-08-08 2017-04-19 科沃斯机器人股份有限公司 Self-moving robot and walking method thereof
AU2016365422A1 (en) * 2015-12-04 2018-06-28 Magic Leap, Inc. Relocalization systems and methods

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102129679A (en) * 2010-12-02 2011-07-20 湖南农业大学 Local positioning system and method
CN103302658A (en) * 2012-03-16 2013-09-18 株式会社安川电机 Robot system
CN106054878A (en) * 2016-06-03 2016-10-26 中国计量大学 Inertial guidance vehicle navigation method based on two-dimensional code positioning, and inertial guidance vehicle
CN106291517A (en) * 2016-08-12 2017-01-04 苏州大学 The indoor cloud robot angle localization method optimized with visual information based on position
CN206085054U (en) * 2016-09-21 2017-04-12 贵州师范学院 Environment robot of family based on sound reachs location upon delta -T

Also Published As

Publication number Publication date
CN110134117A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
CN110134117B (en) Mobile robot repositioning method, mobile robot and electronic equipment
US10518414B1 (en) Navigation method, navigation system, movement control system and mobile robot
CN109643127B (en) Map construction, positioning, navigation and control method and system, and mobile robot
CN109901590B (en) Recharging control method of desktop robot
US11151281B2 (en) Video monitoring method for mobile robot
CN108297115B (en) Autonomous repositioning method for robot
CN103491339B (en) Video acquiring method, equipment and system
EP4050449A1 (en) Method and device for robot positioning, smart robot, and storage medium
KR20120067013A (en) Apparatus and method for indoor localization based on camera
CN107665508B (en) Method and system for realizing augmented reality
CN108748184B (en) Robot patrol method based on regional map identification and robot equipment
WO2019001237A1 (en) Mobile electronic device, and method in mobile electronic device
CN105116886A (en) Robot autonomous walking method
WO2019019819A1 (en) Mobile electronic device and method for processing tasks in task region
CN110737798B (en) Indoor inspection method and related product
CN105262991A (en) Two-dimensional code based substation equipment object recognition method
CN104535047A (en) Multi-agent target tracking global positioning system and method based on video stitching
CN106352871A (en) Indoor visual positioning system and method based on artificial ceiling beacon
CN103632044A (en) Camera topology building method and device based on geographic information system
CN111679664A (en) Three-dimensional map construction method based on depth camera and sweeping robot
CN111950440A (en) Method, device and storage medium for identifying and positioning door
CN114529621B (en) Household type graph generation method and device, electronic equipment and medium
WO2018228258A1 (en) Mobile electronic device and method therein
CN111609854A (en) Three-dimensional map construction method based on multiple depth cameras and sweeping robot
CN111856499B (en) Map construction method and device based on laser radar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant