CN115830576A - Mobile robot repositioning method and device and mobile robot - Google Patents


Info

Publication number
CN115830576A
CN115830576A
Authority
CN
China
Prior art keywords
mobile robot
indication mark
local
navigation
robot
Prior art date
Legal status
Pending
Application number
CN202211599322.3A
Other languages
Chinese (zh)
Inventor
赖有仿
何婉君
熊金冰
Current Assignee
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application relates to the technical field of robots and discloses a mobile robot repositioning method and device and a mobile robot. The method comprises: when loss of positioning information is detected, the mobile robot enters a local map exploration mode; in this mode, local path planning and navigation are carried out while identifying whether an indication mark is present; when an indication mark is identified, relative navigation is performed using the pose information of the mark until the text information in the mark is recognized by a character recognition technology; and the robot is repositioned using the pose information and text information of the indication mark combined with a global map. The method can reposition the robot in scenes lacking rich structural features and, in particular for some low-cost robots, effectively solves the repositioning problem when the robot's performance or hardware is limited.

Description

Mobile robot repositioning method and device and mobile robot
Technical Field
The application relates to the technical field of robots, in particular to a mobile robot repositioning method and device and a mobile robot.
Background
With the development of intelligent robots, mobile robots are frequently used for automatic door-to-door delivery of objects large and small in places such as hospitals and office buildings, which is a great convenience. However, some scenes have repetitive layout structures and lack unique features — for example, finding a particular room or office along a corridor of rooms with similar appearance. In particular, some low-cost mobile robots achieve global positioning using only a single-line laser radar, an odometer, and a global grid map, and easily lose their own positioning information in such scenes. In practice, once positioning is lost, it is often difficult to relocalize successfully, so the task is interrupted and cannot continue.
Disclosure of Invention
In view of this, embodiments of the present application provide a mobile robot repositioning method and apparatus, and a mobile robot, which address the problem that a robot is difficult to reposition after losing its positioning during movement.
In a first aspect, an embodiment of the present application provides a mobile robot repositioning method, including:
when the loss of the positioning information is detected, the mobile robot enters a local map exploration mode;
under the local map exploration mode, local path planning and navigation are carried out, and local path images are collected in real time to identify whether indication marks exist in the local path images or not;
when the indication mark is identified, relative navigation is carried out according to the pose information of the indication mark, so that the mobile robot gradually approaches the indication mark until text information in the indication mark is identified through a character recognition technology, and the relative navigation is stopped;
and determining the current position of the mobile robot in a global map by utilizing the pose information and the text information of the indication identifier and combining the global map.
In some embodiments, the performing local path planning and navigation includes:
acquiring point cloud data of the current moment in real time through a laser radar device arranged on the mobile robot to construct a local map under a robot coordinate system;
and carrying out grid idle or occupied state analysis on the local map to determine all idle state areas in the local map, selecting a navigation target point from the idle state areas meeting the conditions, carrying out local path planning, and further carrying out mobile exploration according to the generated local path.
In some embodiments, the selecting a navigation target point from an idle state area satisfying a condition includes:
and taking one idle state area with the largest area or the area exceeding a preset threshold value in all the idle state areas as a target idle area, and taking the central area position of the target idle area as a navigation target point at the next moment.
In some embodiments, the selecting a navigation target point from an idle state area satisfying a condition includes:
determining an expansion radius based on the diameter of the mobile robot, and combining the expansion radius with the center of the mobile robot as a circle center to form a circle;
and uniformly dividing the circle according to the size of a preset fan shape, searching the local map by using the circle to obtain the position with the largest number of the fan shapes occupied by the idle state area, and taking the position as a navigation target point at the next moment.
In some embodiments, the performing relative navigation with the pose information of the indicator when the indicator is recognized comprises:
when the indication mark is identified, segmenting an indication mark pixel region in the local path image through semantics, extracting centroid and depth information of the indication mark pixel region, and calculating to obtain a normal vector of the indication mark;
acquiring pose information of the indication mark under a camera coordinate system of the mobile robot by using an included angle between the center of mass and the normal vector, and mapping the pose information under the camera coordinate system to a robot coordinate system to obtain the pose information of the indication mark under the robot coordinate system;
and setting the pose information under the robot coordinate system as a relative target point, and starting relative navigation.
In some embodiments, the determining the current position of the mobile robot in the global map using the pose information of the indicator, the text information, and in combination with the global map includes:
and inquiring the global position of the indication identifier in a pre-constructed global map according to the text information of the indication identifier, and performing coordinate mapping by combining the pose information of the indication identifier in the robot coordinate system to obtain the current position of the mobile robot in the global map.
In some embodiments, the mobile robot repositioning method further comprises:
and after the position in the global map is acquired again, the local map exploration mode is switched to a global map navigation mode, and the current task is continuously executed.
In a second aspect, an embodiment of the present application provides a mobile robot relocating device, including:
the mode switching module is used for entering a local map exploration mode when the loss of the positioning information is detected;
the local exploration module is used for planning and navigating a local path in the local map exploration mode and acquiring a local path image in real time to identify whether an indication mark exists in the local path image;
the relative navigation module is used for performing relative navigation according to the pose information of the indication mark when the indication mark is identified, so that the mobile robot gradually approaches the indication mark until the text information in the indication mark is identified through a character recognition technology, and stopping the relative navigation;
and the global positioning module is used for obtaining the current position of the mobile robot in a global map by utilizing the pose information and the text information of the indication mark and combining the global map.
In a third aspect, an embodiment of the present application provides a mobile robot, which includes a processor and a memory, where the memory stores a computer program, and the processor is configured to execute the computer program to implement the mobile robot relocation method described above.
In a fourth aspect, the present application provides a readable storage medium, which stores a computer program, and when the computer program is executed on a processor, the computer program implements the above-mentioned mobile robot relocation method.
The embodiment of the application has the following beneficial effects:
according to the mobile robot repositioning method, when the loss of the positioning information is detected, a local map exploration mode is entered; under a local map exploration mode, carrying out local path planning and navigation and identifying whether an indication mark exists; then, when the indication mark is identified, relative navigation is carried out by using the pose information of the indication mark until the text information in the indication mark is identified by a character recognition technology; and finally, performing global repositioning by using the pose information and the text information of the indication marks and combining a global map. According to the method, the free exploration of a local map is utilized, the identification of the indication mark is carried out by combining an image identification technology, and then the indication mark is subjected to relative navigation so as to carry out text identification of the indication mark, so that the relocation of the robot can be realized under the scene with an insufficient structural feature, and especially for some low-cost robots, the relocation problem under the condition that the performance or hardware of the robots is limited can be effectively solved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
FIG. 1 is a flow chart of a mobile robot repositioning method according to an embodiment of the application;
FIG. 2 is a flow chart illustrating local path exploration of a mobile robot repositioning method according to an embodiment of the application;
FIG. 3 is a first schematic diagram illustrating selection of a target free area in a mobile robot relocation method according to an embodiment of the present application;
FIG. 4 is a second schematic diagram illustrating selection of a target free area in a mobile robot relocation method according to an embodiment of the present application;
FIG. 5 is a flow chart illustrating relative navigation of a mobile robot repositioning method according to an embodiment of the application;
FIG. 6 is a schematic diagram of an embodiment of the present application showing a mobile robot relocating device;
fig. 7 shows a schematic structural diagram of a mobile robot according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
Hereinafter, the terms "including", "having", and their derivatives, which may be used in various embodiments of the present application, are intended to indicate only specific features, numbers, steps, operations, elements, components, or combinations of the foregoing, and should not be construed as first excluding the existence of, or adding to, one or more other features, numbers, steps, operations, elements, components, or combinations of the foregoing. Furthermore, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the various embodiments of the present application belong. The terms (such as those defined in commonly used dictionaries) should be interpreted as having a meaning that is consistent with their contextual meaning in the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein in various embodiments.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Under conditions of repeated structures and a lack of distinctive features, some low-cost mobile robots are prone to losing their positioning information and then being difficult to reposition. Loss of positioning information means the mobile robot cannot determine its position in the global map — for example, because odometry data is lost, or because too many similar structural features in the environment cause feature extraction to fail or be misrecognized. Once the positioning information is lost, the robot cannot position and navigate using the global map, so the robot's task cannot continue.
To avoid interruption of task execution, a new repositioning method is provided that builds on the mobile robot's original hardware — such as a single-line laser radar, an odometer, and a camera device — and combines free exploration of a local map with semantic recognition and text recognition, using the recognized character information to establish a link with the global map and thereby complete repositioning. The mobile robot repositioning method is described below with reference to some specific embodiments.
Fig. 1 shows a flowchart of a mobile robot relocation method according to an embodiment of the present application. Exemplarily, the mobile robot relocation method comprises the steps of:
and S110, when the loss of the positioning information is detected, the mobile robot enters a local map searching mode.
The local map exploration mode is a navigation working mode set in the mobile robot, and in the local map exploration mode, a local path with the current position of the robot as a starting point is usually planned according to local map information constructed by the robot in real time. The distance of the local path is generally short, and may be, for example, 1 to 2 meters per movement, or may be shorter or longer, and may be determined according to actual circumstances. The global map navigation mode is opposite to the local map exploration mode, and in the global map navigation mode, when the mobile robot executes a task, the mobile robot can acquire the position of the mobile robot in the global map in real time and move towards the target position in the global map. It is understood that the global map is represented in a world coordinate system.
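The two working modes and the switch between them can be sketched as a small state machine (a minimal illustration; the class and method names are hypothetical, not from the patent):

```python
from enum import Enum, auto

class NavMode(Enum):
    GLOBAL_NAVIGATION = auto()  # default: localize and plan on the global map
    LOCAL_EXPLORATION = auto()  # fallback: explore using the local map only

class ModeManager:
    """Minimal mode-switch logic: lose the global pose -> explore locally;
    regain it -> resume global navigation and the interrupted task."""
    def __init__(self):
        self.mode = NavMode.GLOBAL_NAVIGATION

    def on_localization_update(self, pose_ok):
        if not pose_ok:
            self.mode = NavMode.LOCAL_EXPLORATION
        else:
            self.mode = NavMode.GLOBAL_NAVIGATION
        return self.mode
```

In a real robot the `pose_ok` signal would come from the localization module's own health check (e.g. odometry dropout or scan-matching failure).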
Exemplarily, the mobile robot performs global path planning and navigation in a global map navigation mode by default, and during the moving process, the laser radar can acquire point cloud data in real time to construct a local map, and the odometer is used to calculate the moving distance of the robot, and the global map is combined to determine the position information of the robot in the global map. However, once the robot loses the positioning information for some reason, the robot automatically switches from the global map navigation mode to the local map exploration mode, and starts to perform obstacle avoidance and local path planning based on the local map.
And S120, planning and navigating a local path in a local map exploration mode, and acquiring a local path image in real time to identify whether an indication mark exists in the local path image.
Exemplarily, in the local map exploration mode, the mobile robot performs exploration navigation by using the local map, and performs local path planning by analyzing an idle state area in the local map and selecting a navigation target point from the idle state area, so that the generated local path can be ensured to avoid obstacles in the way.
In an embodiment, as shown in fig. 2, the performing of the local path planning and navigation in step S120 may include the following sub-steps:
and S121, acquiring point cloud data of the current moment in real time through a laser radar device arranged on the mobile robot to construct a local map under a robot coordinate system.
It can be understood that, since the laser radar device is disposed on the robot, the local map constructed based on the real-time point cloud data is represented in the robot coordinate system. Also, the environmental information contained in the local map is limited, which is related to the detection view angle of the laser radar apparatus.
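A minimal sketch of rasterizing one laser scan into a local occupancy grid in the robot coordinate system (the grid size, resolution, and function name are illustrative assumptions, not the patent's implementation):

```python
import math

def scan_to_local_grid(ranges, angle_min, angle_inc, size=40, resolution=0.1):
    """Rasterize a 2-D laser scan into a small occupancy grid centred on the
    robot (robot frame: x forward, y left). 1 = occupied, 0 = free/unknown.
    Grid size (cells) and resolution (metres/cell) are illustrative defaults."""
    grid = [[0] * size for _ in range(size)]
    half = size // 2  # the robot sits at cell (half, half)
    for i, r in enumerate(ranges):
        if r is None or math.isinf(r) or math.isnan(r):
            continue  # no return for this beam
        a = angle_min + i * angle_inc
        gx = int(round(r * math.cos(a) / resolution)) + half
        gy = int(round(r * math.sin(a) / resolution)) + half
        if 0 <= gx < size and 0 <= gy < size:
            grid[gy][gx] = 1  # cell hit by a laser return -> occupied
    return grid
```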
And S122, carrying out grid idle or occupation state analysis on the local map to determine all idle state areas in the local map.
A grid map divides the environment into a number of uniformly sized grids, each of which is either in an occupied state, indicating the presence of an obstacle, or an idle (free) state, indicating no obstacle. In this embodiment, the local map is rasterized and the state of each grid is determined, so as to find all areas of the local map that are in an idle state. Local path planning is then carried out within these idle state areas to ensure that the planned local path avoids obstacles.
And S123, selecting a navigation target point from the idle state area meeting the conditions, planning a local path, and then performing mobile exploration according to the generated local path.
When local path planning is performed, a navigation target point needs to be selected first. Considering that the idle state areas are not necessarily connected — there are usually several of them, or the free space may be divided into several relatively regular regions — in an embodiment, the idle state area with the largest area among all idle state areas (e.g., area S0 shown in fig. 3) may be used as the target idle area, and the central area position of the target idle area used as the navigation target point for the next moment. The central area position may be the exact center of the target idle area or any position within a central range containing the center, selected according to requirements and not limited here. Alternatively, any idle state area whose area exceeds a preset threshold may be used as the target idle area, which is likewise not limited here. It can be understood that a free area whose area is the largest or exceeds the preset threshold has fewer obstacles, allowing the robot to pass through smoothly.
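The free-region analysis and largest-area selection described above can be sketched as a flood fill over the occupancy grid (a simplified illustration; 0 denotes a free cell, 1 an occupied cell, and the function name is hypothetical):

```python
from collections import deque

def largest_free_region_center(grid):
    """Group the free cells (value 0) of an occupancy grid into 4-connected
    regions by flood fill, pick the region with the largest area, and return
    its centroid cell (x, y) as the next navigation target point."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for sy in range(h):
        for sx in range(w):
            if grid[sy][sx] != 0 or seen[sy][sx]:
                continue
            region, queue = [], deque([(sx, sy)])
            seen[sy][sx] = True
            while queue:
                x, y = queue.popleft()
                region.append((x, y))
                for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                    if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] == 0 \
                            and not seen[ny][nx]:
                        seen[ny][nx] = True
                        queue.append((nx, ny))
            if len(region) > len(best):
                best = region  # keep the largest free region found so far
    if not best:
        return None  # no free space at all
    cx = int(sum(x for x, _ in best) / len(best))
    cy = int(sum(y for _, y in best) / len(best))
    return (cx, cy)
```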
As an alternative, considering that the robot has a certain width and must avoid colliding with obstacles, in another embodiment the navigation target point may be selected as shown in fig. 4: an expansion (inflation) radius is determined based on the diameter of the mobile robot, and a circle is formed with the center of the mobile robot as the circle center and the expansion radius as its radius; the circle is then divided uniformly into sectors of a preset size, and the local map is searched with this circle to find the position at which the largest number of sectors fall within idle state areas — that is, the position where the overlap between the sector areas and the free space is greatest — which is then taken as the navigation target point for the next moment.
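A rough sketch of this circle-and-sector selection rule, simplified to one sample point per sector (the parameter names and the one-sample-per-sector approximation are assumptions, not the patent's exact procedure):

```python
import math

def best_circle_target(grid, resolution, robot_diameter, n_sectors=8):
    """For every free cell, centre a circle with the robot's inflation radius,
    split it into n_sectors equal sectors, and score the cell by how many
    sector sample points (one per sector, at the sector's mid-angle) land on
    free cells. The highest-scoring cell is the next navigation target."""
    h, w = len(grid), len(grid[0])
    radius = robot_diameter / 2.0 / resolution  # inflation radius in cells
    best_cell, best_score = None, -1
    for cy in range(h):
        for cx in range(w):
            if grid[cy][cx] != 0:
                continue  # only free cells are candidate targets
            score = 0
            for k in range(n_sectors):
                a = 2.0 * math.pi * (k + 0.5) / n_sectors  # sector mid-angle
                sx = cx + int(round(radius * math.cos(a)))
                sy = cy + int(round(radius * math.sin(a)))
                if 0 <= sx < w and 0 <= sy < h and grid[sy][sx] == 0:
                    score += 1
            if score > best_score:
                best_cell, best_score = (cx, cy), score
    return best_cell, best_score
```

A full implementation would test the whole sector footprint rather than a single sample point per sector.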
A navigation target point is then selected from the target idle area, and an optimal local path is generated with a local planner. The local path planning itself may be implemented with published planning methods and is neither limited nor detailed here.
In this embodiment, in the process of local map exploration, the image recognition function is started to search for the indication identifier in the current local path. The indicator refers to a marker having a position or direction indication during navigation, and may include, but is not limited to, a sign displaying a house number, an indicator light displaying a direction indication, and the like. It will be appreciated that in most scenarios, a corresponding indication may be set in order to facilitate the user to find the corresponding room or a certain location. These indicators are typically pre-recorded in the global map.
In order to reacquire the robot's position in the global map, this embodiment searches for an existing indicator, such as a sign, on the local path using image recognition: the mobile robot may acquire a local path image with its installed camera device and then perform indicator recognition on the image — for example, a pixel region that may be an indicator sign can be segmented by semantic segmentation, with pixel-level classification used to determine whether the region is indeed a sign.
If an effective indication mark is identified, the segmented pixel area where the indication mark is located can be further processed to obtain the pose information of the indication mark, and then a relative navigation task is started. Of course, if no valid indicator is identified from the current local map, the search may be continued in the next local map until a valid indicator is identified. The semantic segmentation technique can be implemented by using an existing correlation network, and is not described herein.
S130, when the indication mark is identified, relative navigation is carried out according to the pose information of the indication mark, the mobile robot is enabled to gradually approach the indication mark until the text information in the indication mark is identified through a character recognition technology, and the relative navigation is stopped.
Relative navigation means moving along positions and directions that gradually approach the identified indication mark, taking the mark as the target point. It will be appreciated that, since the text on the indication mark is typically small and the mobile robot has a limited height, the robot needs to move close before the text information can be recognized.
In one embodiment, as shown in fig. 5, in the above step S130, performing relative navigation with the pose information indicating the marker includes:
s131, when the indication mark is identified, segmenting an indication mark pixel region in the local path image through semantics, extracting centroid and depth information of the indication mark pixel region, and calculating to obtain a normal vector of the indication mark.
Exemplarily, when image acquisition is performed with a depth camera, the depth information of the indicator pixel region can be extracted directly, and the centroid of the indicator pixel region can be computed with image moments or similar techniques. In one embodiment, the indicator pixel region image is converted to a binary image; the number of pixels greater than zero serves as the region's area, each pixel's coordinates are multiplied by its value and summed, and the ratio of that sum to the sum of all pixel values gives the centroid position. Of course, the centroid can also be obtained in other ways; this is only an example.
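The binary-image centroid computation described above is the raw-image-moment formula; a minimal sketch:

```python
def binary_centroid(mask):
    """Centroid of a segmented indicator region via raw image moments:
    m00 sums the pixel values, m10/m01 sum the value-weighted coordinates,
    and the centroid is (m10/m00, m01/m00) -- the ratio described above."""
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v > 0:
                m00 += v
                m10 += x * v
                m01 += y * v
    if m00 == 0:
        return None  # empty mask: no centroid
    return (m10 / m00, m01 / m00)
```

In practice a library routine (e.g. OpenCV's moments) would be used instead of the explicit double loop.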
Furthermore, from the depth information of the pixel points in the indicator pixel region, a plane can be fitted as the plane of the indication mark by choosing the plane whose total distance to the selected group of points is minimal (a least-squares fit); the indicator's normal vector is then the normal of this plane.
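One common way to realize such a minimum-distance plane fit is a least-squares fit via SVD (an assumed implementation choice; the patent does not specify the solver):

```python
import numpy as np

def plane_normal(points):
    """Least-squares plane through a set of 3-D points (e.g. back-projected
    pixels of the indicator region): centre the points, then take the right
    singular vector with the smallest singular value -- the direction of
    least variance -- as the unit normal of the best-fit plane."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)  # rows of vt are right singular vectors
    n = vt[-1]                          # smallest-singular-value direction
    return n / np.linalg.norm(n)
```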
S132, acquiring the pose information of the indicator in the camera coordinate system of the mobile robot by using the included angle between the centroid and the normal vector, and mapping the pose information in the camera coordinate system to the robot coordinate system to obtain the pose information of the indicator in the robot coordinate system.
The pose includes position and direction information: the position can be determined from the centroid of the indicator pixel region, and the direction from the angle between the centroid and the normal vector. It can be understood that the pose solved above is the indicator's pose in the camera coordinate system; it can then be mapped to the robot coordinate system by a coordinate transformation, i.e., using the transformation matrix between the camera coordinate system and the robot coordinate system.
If described by an expression:

P^r = T^r_c · P^c

where P^r is the pose of the indication mark in the robot coordinate system r, P^c is its pose in the camera coordinate system c, and T^r_c is the transformation matrix between the camera coordinate system and the robot coordinate system.
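The mapping can be sketched with homogeneous 4×4 transforms (the aligned-axes extrinsics in the example at the bottom are a made-up illustration, not a real calibration):

```python
import numpy as np

def make_transform(rotation, translation):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation matrix and a
    3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def camera_to_robot(T_rc, p_cam):
    """Map a 3-D point from the camera frame c into the robot frame r,
    i.e. P^r = T^r_c . P^c from the expression above."""
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous point
    return (T_rc @ p)[:3]

# Made-up extrinsics: camera 0.2 m ahead of and 0.5 m above the robot base,
# axes aligned. A point 1 m ahead of the camera then lies at (1.2, 0, 0.5)
# in the robot frame.
T_rc = make_transform(np.eye(3), [0.2, 0.0, 0.5])
```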
And S133, setting the pose information in the robot coordinate system as a relative target point, and performing relative navigation.
It will be appreciated that by unifying the pointing indicia to be represented in the robot coordinate system, the local map is also represented in the robot coordinate system, thus facilitating relative navigation.
Exemplarily, a path to the indicator may be planned again with the local planner. During relative navigation the mobile robot gradually approaches the indicator while a text recognition function, such as OCR (optical character recognition), is enabled, so that the robot continuously attempts to recognize text information on the indicator — a house number, for example. Once the specific characters are recognized, the current relative navigation task can be stopped.
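The stop condition of the relative-navigation loop — step closer until the text becomes readable — can be sketched with injected callables standing in for the motion, camera, and recognition interfaces (all hypothetical; in practice `recognize_text` might wrap an OCR engine such as Tesseract):

```python
def relative_navigation(step_toward, capture_image, recognize_text,
                        max_steps=50):
    """Sketch of the relative-navigation loop: keep stepping toward the
    indicator's relative target point, running character recognition on each
    new frame, and stop as soon as readable text comes back."""
    for _ in range(max_steps):
        frame = capture_image()
        text = recognize_text(frame)
        if text:
            return text  # close enough to read the sign -> stop navigating
        step_toward()    # otherwise move one increment closer
    return None          # gave up without reading any text
```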
And S140, obtaining the current position of the mobile robot in the global map by using the pose information and the text information of the indication mark and combining the global map.
Exemplarily, the global position of the indicator in a pre-constructed global map can be queried according to the text information on the indicator, and coordinate mapping is performed in combination with pose information of the indicator in a robot coordinate system to obtain the current position of the mobile robot in the global map, so that relocation is completed. It will be appreciated that the coordinate mapping herein is primarily a conversion between the robot coordinate system to the world coordinate system.
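The final coordinate mapping — recovering the robot's global pose from the indicator's known global pose and its measured pose in the robot frame — can be sketched with homogeneous transforms (assuming both poses are available as full 4×4 transforms, which the patent does not state explicitly):

```python
import numpy as np

def relocalize(T_world_mark, T_robot_mark):
    """Recover the robot's pose in the global/world frame from (a) the
    indicator's global pose, looked up in the global map by its text, and
    (b) the indicator's pose measured in the robot frame:
        T_world_robot = T_world_mark . inv(T_robot_mark)
    All poses are 4x4 homogeneous transforms."""
    return T_world_mark @ np.linalg.inv(T_robot_mark)
```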
As an optional solution, the mobile robot relocation method further includes:
and after the position in the global map is acquired again, switching the local map exploration mode to the global map navigation mode, and continuously executing the current task.
It can be understood that after the position information of the robot in the global map is acquired again, the robot may be switched to global positioning, so that the time for freely searching the local path may be reduced, and the execution efficiency of the task may be improved.
According to the mobile robot repositioning method, when positioning information is lost, a local map exploration mode is entered; the local map is explored in this mode while identifying any existing indication mark, after which relative navigation toward the mark is carried out until the text information in the mark is recognized by a technology such as OCR; finally, global repositioning is performed using the pose information and text information of the indication mark combined with a global map. Because repositioning does not depend heavily on structural features, the robot can be repositioned in scenes lacking rich structural features; in particular, for some low-cost robots, the repositioning problem under limited performance or hardware can be effectively solved.
Fig. 6 shows a schematic structural diagram of a mobile robot relocating device according to an embodiment of the application. Exemplarily, the mobile robot relocating device includes:
and a mode switching module 110, configured to, when the loss of the positioning information is detected, enter a local map exploration mode by the mobile robot.
The local exploration module 120 is configured to perform local path planning and navigation in a local map exploration mode, and acquire a local path image in real time to identify whether an indication identifier exists in the local path image.
A relative navigation module 130, configured to, when an indication mark is identified, perform relative navigation using the pose information of the mark so that the mobile robot gradually approaches it, and to stop the relative navigation once the text information in the mark has been identified by a character recognition technology.
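The relative-navigation step — steering the robot toward the mark until OCR can succeed — could be as simple as a proportional controller. The gains, stop distance, and command convention below are invented for illustration, not specified by the patent:

```python
import math

def approach_command(sign_pose, stop_dist=0.5, k_lin=0.5, k_ang=1.0):
    """Toy relative-navigation step: given the mark's pose (x, y, heading)
    in the robot frame, return a (linear, angular) velocity command that
    drives toward the mark and stops at `stop_dist` to attempt OCR."""
    x, y, _ = sign_pose
    dist = math.hypot(x, y)
    if dist <= stop_dist:
        return 0.0, 0.0                # close enough: halt and try to read
    bearing = math.atan2(y, x)         # angle of the mark off the robot heading
    v = k_lin * (dist - stop_dist)     # linear speed, proportional to range
    w = k_ang * bearing                # angular speed, steer toward the mark
    return v, w
```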
A global positioning module 140, configured to determine the current position of the mobile robot in the global map using the pose information and the text information of the indication mark in combination with the global map.
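The global positioning step amounts to an SE(2) frame inversion: knowing the mark's pose in the world (looked up in the global map by its text) and the mark's pose in the robot frame, recover the robot's world pose. A sketch, with (x, y, theta) pose tuples as an assumed convention:

```python
import math

def robot_global_pose(sign_global, sign_in_robot):
    """Recover the robot's global pose from the mark's global pose and the
    mark's pose measured in the robot frame.  This is the standard 2D rigid
    transform inversion T_world_robot = T_world_sign * inverse(T_robot_sign);
    the tuple convention is an assumption, not taken from the patent."""
    sx, sy, st = sign_global
    rx, ry, rt = sign_in_robot
    theta = st - rt                                   # robot heading in the world
    # Subtract the robot-to-mark offset, rotated into the world frame.
    x = sx - (rx * math.cos(theta) - ry * math.sin(theta))
    y = sy - (rx * math.sin(theta) + ry * math.cos(theta))
    return x, y, theta
```

For example, a mark at world pose (5, 0, 0) seen 2 m straight ahead places the robot at (3, 0, 0).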
It is to be understood that the device of this embodiment corresponds to the mobile robot repositioning method of the above embodiment, and that the alternatives of that embodiment apply equally here; the description is therefore not repeated.
The application also provides a mobile robot. For example, the mobile robot may be a wheeled robot, a legged robot, or the like; its physical form is not limited. It can be applied in a variety of scenes, especially places with repetitive structure and no unique features, such as office buildings or hospitals with similar decoration styles.
As shown in Fig. 7, for example, the mobile robot includes a processor 11, a memory 12 and a sensing unit 13. The sensing unit 13 may include, but is not limited to, basic hardware such as a camera device, a lidar device and an odometer. In this application, the camera device may be used to collect environmental image information while the robot moves, for image recognition and the like; the lidar device may be used to collect spatial point cloud information during movement, for constructing maps, image point clouds and the like; and the odometer may be used to measure the distance the robot has moved, among other things. In other scenarios the sensing unit 13 may of course serve other functions, which are not limited here. The memory 12 stores a computer program, and the processor 11 executes it so that the mobile robot performs the functions of the respective modules in the above mobile robot repositioning method or repositioning device.
The processor 11 may be an integrated circuit chip with signal processing capability. The processor may be a general-purpose processor, including at least one of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU) and a Network Processor (NP); a Digital Signal Processor (DSP); an Application-Specific Integrated Circuit (ASIC); a Field-Programmable Gate Array (FPGA) or other programmable logic device; or discrete gate, transistor logic or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor that implements or executes the methods, steps and logical blocks disclosed in the embodiments of the present application.
The memory 12 may be, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory stores a computer program, and after receiving an execution instruction the processor executes that program accordingly.
The present application also provides a readable storage medium storing the computer program used in the above mobile robot. When executed on a processor, the program implements the mobile robot repositioning method of the above embodiment, which includes: when loss of positioning information is detected, the mobile robot enters a local map exploration mode; in the local map exploration mode, local path planning and navigation are performed, and local path images are captured in real time to identify whether an indication mark is present in them; when an indication mark is identified, relative navigation is performed according to the pose information of the indication mark so that the mobile robot gradually approaches the mark, and the relative navigation stops once the text information in the mark has been identified by a character recognition technology; and the current position of the mobile robot in the global map is determined using the pose information and the text information of the indication mark in combination with the global map.
It will be appreciated that the alternatives described above for the mobile robot repositioning method apply equally to this embodiment and are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative and, for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, each functional module or unit in each embodiment of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the portions of it that substantially contribute over the prior art, may be embodied as a software product stored on a storage medium, which includes instructions for causing a computer device (which may be a smart phone, a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The storage medium may be a magnetic disk, an optical disk, or any of various other media that can store program code.
The above description covers only specific embodiments of the present application, but the scope of the present application is not limited thereto; any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed here shall be covered by the scope of the present application.

Claims (10)

1. A mobile robot relocation method, comprising:
when the loss of the positioning information is detected, the mobile robot enters a local map exploration mode;
under the local map exploration mode, local path planning and navigation are carried out, and local path images are collected in real time to identify whether indication marks exist in the local path images or not;
when the indication mark is identified, relative navigation is carried out according to the pose information of the indication mark, so that the mobile robot gradually approaches the indication mark until text information in the indication mark is identified through a character recognition technology, and the relative navigation is stopped;
and determining the current position of the mobile robot in a global map by using the pose information and the text information of the indication mark and combining the global map.
2. The mobile robot relocating method according to claim 1, wherein the performing of the local path planning and navigation comprises:
acquiring point cloud data of the current moment in real time through a laser radar device arranged on the mobile robot to construct a local map under a robot coordinate system;
and carrying out grid idle/occupied state analysis on the local map to determine all idle state areas in the local map, selecting a navigation target point from an idle state area satisfying a condition, carrying out local path planning, and then carrying out mobile exploration according to the generated local path.
3. The mobile robot relocation method according to claim 2, wherein said selecting a navigation target point from an idle-state area satisfying a condition comprises:
and taking one idle state area with the largest area, or with an area exceeding a preset threshold value, among all idle state areas as a target idle area, and taking the central position of the target idle area as a navigation target point at the next moment.
4. The mobile robot relocation method according to claim 2, wherein said selecting a navigation target point from an idle-state area satisfying a condition comprises:
determining an expansion radius based on the diameter of the mobile robot, and forming a circle with the center of the mobile robot as the circle center and the expansion radius as the radius;
and uniformly dividing the circle into sectors of a preset size, searching the local map with the circle to find the position at which the largest number of sectors fall within an idle state area, and taking that position as a navigation target point at the next moment.
5. The mobile robot relocation method according to claim 1, wherein said performing relative navigation with pose information of the indicator upon recognizing the indicator includes:
when the indication mark is identified, semantically segmenting the indication mark pixel region in the local path image, extracting centroid and depth information of the indication mark pixel region, and calculating the normal vector of the indication mark;
acquiring pose information of the indication mark under a camera coordinate system of the mobile robot by using an included angle between the center of mass and the normal vector, and mapping the pose information under the camera coordinate system to a robot coordinate system to obtain the pose information of the indication mark under the robot coordinate system;
and setting the pose information under the robot coordinate system as a relative target point, and starting relative navigation.
6. The mobile robot relocation method according to claim 5, wherein the determining the current position of the mobile robot in the global map using the pose information and the text information of the indication mark in combination with the global map comprises:
and inquiring the global position of the indication mark in a pre-constructed global map according to the text information of the indication mark, and performing coordinate mapping by combining the pose information of the indication mark in the robot coordinate system to obtain the current position of the mobile robot in the global map.
7. The mobile robot relocation method according to any one of claims 1 to 6, further comprising:
and after the position in the global map is reacquired, switching from the local map exploration mode to a global map navigation mode, and continuing to execute the current task.
8. A mobile robot relocating device, comprising:
the mode switching module is used for entering a local map exploration mode when the loss of the positioning information is detected;
the local exploration module is used for planning and navigating a local path in the local map exploration mode and acquiring a local path image in real time to identify whether an indication mark exists in the local path image;
the relative navigation module is used for performing relative navigation by using the pose information of the indication mark when the indication mark is identified, so that the mobile robot gradually approaches the indication mark, and for stopping the relative navigation once the text information in the indication mark is identified by a character recognition technology;
and the global positioning module is used for obtaining the current position of the mobile robot in the global map by utilizing the pose information and the text information of the indication mark and combining the global map.
9. A mobile robot, characterized in that the mobile robot comprises a processor and a memory, the memory storing a computer program, and the processor being configured to execute the computer program to implement the mobile robot relocation method of any one of claims 1-7.
10. A readable storage medium, characterized in that it stores a computer program which, when executed on a processor, implements a mobile robot relocation method according to any one of claims 1-7.
CN202211599322.3A 2022-12-12 2022-12-12 Mobile robot repositioning method and device and mobile robot Pending CN115830576A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211599322.3A CN115830576A (en) 2022-12-12 2022-12-12 Mobile robot repositioning method and device and mobile robot

Publications (1)

Publication Number Publication Date
CN115830576A true CN115830576A (en) 2023-03-21

Family

ID=85546977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211599322.3A Pending CN115830576A (en) 2022-12-12 2022-12-12 Mobile robot repositioning method and device and mobile robot

Country Status (1)

Country Link
CN (1) CN115830576A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117589154A (en) * 2024-01-19 2024-02-23 深圳竹芒科技有限公司 Relocation method of self-mobile device, self-mobile device and readable storage medium
CN117589154B (en) * 2024-01-19 2024-05-24 深圳竹芒科技有限公司 Relocation method of self-mobile device, self-mobile device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination