CN111121754A - Mobile robot positioning navigation method and device, mobile robot and storage medium - Google Patents

Info

Publication number: CN111121754A
Application number: CN201911413449.XA
Authority: CN (China)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Inventors: 汤煜 (Tang Yu), 刘志超 (Liu Zhichao), 赵勇胜 (Zhao Yongsheng), 熊友军 (Xiong Youjun), 庞建新 (Pang Jianxin)
Current assignee: Shenzhen Ubtech Technology Co., Ltd.
Original language: Chinese (zh)


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Inertial navigation combined with non-inertial navigation instruments
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information


Abstract

The embodiments of this application belong to the technical field of robots and disclose a mobile robot positioning and navigation method and apparatus, a mobile robot, and a computer-readable storage medium. The method comprises: acquiring a grid map of a target area, where the mobile robot carries an ultra-wideband tag; determining first ultra-wideband base station coordinates of the ultra-wideband base stations in the robot grid map coordinate system; obtaining absolute position point information of the robot in the grid map from the first ultra-wideband base station coordinates and the mobile robot's ultra-wideband positioning signal; and, based on the grid map, performing robot positioning and navigation with the absolute position point information and sensor signals, the sensor signals comprising at least one of lidar signals, inertial navigation signals, odometer signals, and visual sensor signals. By using UWB positioning signals to assist the robot's positioning and navigation, the embodiments of this application can improve the positioning and navigation accuracy of the mobile robot.

Description

Mobile robot positioning navigation method and device, mobile robot and storage medium
Technical Field
The present application relates to the field of robotics, and in particular to a mobile robot positioning and navigation method and apparatus, a mobile robot, and a computer-readable storage medium.
Background
With the continuous development of the robot technology, the application of the robot is more and more extensive.
Currently, in order to move freely and safely indoors, a mobile robot needs to know both its own position in a map and the map position of its target point. Positioning and navigation methods for mobile robots include inertial navigation, lidar positioning and navigation, visual positioning and navigation, and so on; some methods fuse data from multiple sensors, for example lidar, odometer, and inertial navigation data.
However, in scenes such as airports, warehouses, and shopping malls, the lidar cannot produce an effective position fix because of its limited range. Because such scenes are highly similar and repetitive, visual positioning often produces false matches and positioning errors. Fusing multiple sensors not only increases the complexity of the algorithm but also accumulates large errors when the system runs for a long time. In short, the accuracy of existing mobile robot positioning and navigation methods is low.
Disclosure of Invention
The embodiments of the present application provide a mobile robot positioning and navigation method and apparatus, a mobile robot, and a computer-readable storage medium, so as to solve the problem of low accuracy in existing mobile robot positioning and navigation methods.
In a first aspect, an embodiment of the present application provides a mobile robot positioning and navigation method, including:
acquiring a grid map of a target area, where the target area is provided with ultra-wideband base stations in advance and the mobile robot carries an ultra-wideband tag;
determining a first ultra-wideband base station coordinate of the ultra-wideband base station under a grid map coordinate system of the robot;
obtaining absolute position point information of the robot in the grid map according to the first ultra-wideband base station coordinate and the ultra-wideband positioning signal of the mobile robot;
and based on the grid map, performing robot positioning navigation through the absolute position point information and sensor signals, wherein the sensor signals comprise at least one of laser radar signals, inertial navigation signals, odometer signals and visual sensor signals.
In the embodiments of this application, the ultra-wideband base station coordinates in the robot grid map coordinate system and the ultra-wideband positioning signal together determine the robot's current position on the grid map. This absolute position point information is then fused with other sensor signals for positioning and navigation; that is, the ultra-wideband positioning signal assists the mobile robot's positioning and navigation by providing its current position in the grid map. As a result, the mobile robot can be positioned accurately even in large or repetitive scenes, accumulated error is avoided, and the accuracy of positioning and navigation is improved.
In one possible implementation manner of the first aspect, determining first ultra-wideband base station coordinates of the ultra-wideband base station in a grid map coordinate system of the robot includes:
determining a first coordinate of a target position point selected from the grid map in a robot grid map coordinate system;
determining a second coordinate of the target position point under an ultra-wideband coordinate system;
calculating a transformation matrix between the ultra-wideband coordinate system and the grid map coordinate system of the robot according to the first coordinate and the second coordinate;
acquiring a second ultra-wideband base station coordinate of the ultra-wideband base station under the ultra-wideband coordinate system;
and transforming the second ultra-wideband base station coordinate into a first ultra-wideband base station coordinate under the grid map coordinate system of the robot according to the transformation matrix.
In a possible implementation manner of the first aspect, determining first coordinates of a target location point selected from the grid map in the grid map coordinate system of the robot includes:
acquiring pixel coordinates of the target position point in an image coordinate system;
and calculating a first coordinate of the target position point in the robot grid map coordinate system according to the pixel coordinate of the origin of the robot grid map coordinate system, the pixel coordinate of the target position point and the image resolution of the grid map.
In a possible implementation manner of the first aspect, calculating a first coordinate of the target position point in the robot grid map coordinate system according to a pixel coordinate of an origin of the robot grid map coordinate system, a pixel coordinate of the target position point, and an image resolution of the grid map includes:
calculating, via

    A_x = (A_u - x_0) * r
    A_y = (A_v - y_0) * r

the first coordinate of each target position point in the robot grid map coordinate system;
where (A_x, A_y) is the first coordinate of the target position point in the robot grid map coordinate system, (A_u, A_v) is the pixel coordinate of the target position point, (x_0, y_0) is the pixel coordinate of the origin of the robot grid map coordinate system, and r is the image resolution of the grid map.
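As an illustrative sketch (not part of the patent text), the pixel-to-map conversion described above can be written as follows; the function name and the sign convention of the linear relation between pixel offset and map coordinate are assumptions:

```python
def pixel_to_map(a_u, a_v, x0, y0, r):
    """Convert the pixel coordinate (a_u, a_v) of a target position point
    into a first coordinate (a_x, a_y) in the robot grid map coordinate
    system, given the pixel coordinate (x0, y0) of the map origin and the
    grid map image resolution r (metres per pixel)."""
    a_x = (a_u - x0) * r
    a_y = (a_v - y0) * r
    return a_x, a_y
```

For instance, on a 0.05 m/pixel map, a point 10 pixels along u and 20 pixels along v from the origin maps to 0.5 m and 1.0 m along the corresponding map axes.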
In a possible implementation manner of the first aspect, transforming the second ultra-wideband base station coordinate into a first ultra-wideband base station coordinate in the grid map coordinate system of the robot according to the transformation matrix includes:
converting, via N'_k = T * N_k, the two-dimensional part of each second ultra-wideband base station coordinate into the two-dimensional part of a first ultra-wideband base station coordinate in the robot grid map coordinate system;
obtaining the first ultra-wideband base station coordinate in the robot grid map coordinate system from the two-dimensional part of each first ultra-wideband base station coordinate together with the height value of the corresponding second ultra-wideband base station coordinate;
where T is the transformation matrix, the first ultra-wideband base station coordinate is N'_k(x, y, z), and the second ultra-wideband base station coordinate is N_k(x, y, z);
N'_k(x, y) is the two-dimensional part of the first ultra-wideband base station coordinate, N_k(x, y) is the two-dimensional part of the second ultra-wideband base station coordinate, and z is the height value of the second ultra-wideband base station coordinate.
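A minimal sketch of this step (helper name assumed, not from the patent): apply a 3x3 homogeneous 2-D transformation matrix T to the planar part of a base-station coordinate and carry the height value over unchanged.

```python
import numpy as np

def transform_base_station(T, n_k):
    """Given a second UWB base-station coordinate n_k = (x, y, z) in the
    UWB coordinate system and a 3x3 homogeneous 2-D transformation matrix
    T, return the first base-station coordinate in the robot grid map
    coordinate system: the 2-D part is N'_k = T * N_k, the height z is
    kept as-is."""
    xy_h = np.array([n_k[0], n_k[1], 1.0])  # homogeneous 2-D point
    x_p, y_p, _ = T @ xy_h                  # transformed planar coordinates
    return (x_p, y_p, n_k[2])               # height value carried over
```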
In a possible implementation manner of the first aspect, acquiring a grid map of a target area includes:
and scanning the target area through a laser radar to obtain a grid map of the target area.
In a second aspect, an embodiment of the present application provides a mobile robot positioning navigation device, including:
the grid map acquisition module is used for acquiring a grid map of a target area; the target area is provided with an ultra-wide band base station in advance, and the mobile robot is loaded with an ultra-wide band tag;
the determining module is used for determining a first ultra-wideband base station coordinate of the ultra-wideband base station in a grid map coordinate system of the robot;
the position point information obtaining module is used for obtaining absolute position point information of the robot in the grid map according to the first ultra-wideband base station coordinate and the ultra-wideband positioning signal of the mobile robot;
and the positioning navigation module is used for performing robot positioning navigation through the absolute position point information and sensor signals based on the grid map, wherein the sensor signals comprise at least one of laser radar signals, inertial navigation signals, odometer signals and visual sensor signals.
In a possible implementation manner of the second aspect, the determining module is specifically configured to:
determining a first coordinate of a target position point selected from the grid map in a robot grid map coordinate system;
determining a second coordinate of the target position point under an ultra-wideband coordinate system;
calculating a transformation matrix between the ultra-wideband coordinate system and the grid map coordinate system of the robot according to the first coordinate and the second coordinate;
acquiring a second ultra-wideband base station coordinate of the ultra-wideband base station under the ultra-wideband coordinate system;
and transforming the second ultra-wideband base station coordinate into a first ultra-wideband base station coordinate under the grid map coordinate system of the robot according to the transformation matrix.
In a third aspect, an embodiment of the present application provides a mobile robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method according to any one of the above first aspects.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the method according to any one of the above first aspects.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the method of any one of the first aspect.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a schematic diagram of a prior art positioning and navigation scheme;
FIG. 2 is a schematic view of positioning and navigation provided by an embodiment of the present application;
fig. 3 is a schematic block diagram of a flow chart of a mobile robot positioning and navigation method according to an embodiment of the present disclosure;
FIG. 4 is a schematic block diagram of a process for transforming coordinates of a UWB base station according to an embodiment of the present application;
fig. 5 is a block diagram of a mobile robot positioning and navigation device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Positioning and navigation of a mobile robot generally include the steps of mapping, positioning, navigating and the like, wherein mapping refers to building a map representing the surrounding environment. Positioning refers to positioning the mobile robot at the current position. Navigation means that after the current position is located, a path from the current position to the target point position is planned, and then navigation is performed from the current position to the target point position according to the planned path.
In positioning and navigation, the robot's positioning accuracy determines whether its starting point and its subsequent movement path are correct, and the positioning accuracy of prior-art positioning and navigation methods is low.
Several positioning and navigation schemes in the prior art are described below with reference to fig. 1.
Referring to fig. 1, a schematic diagram of prior-art positioning and navigation solutions is shown, covering solution one, solution two, and solution three. As shown in fig. 1, these three are the robot positioning solutions most common in the prior art. They can be used separately or in combination, but each has drawbacks that result in low positioning accuracy.
The first scheme is as follows:
the scheme is an inertial positioning navigation scheme, the mobile robot obtains acceleration by using an accelerometer, a gyroscope and the like, and the position of the robot is positioned through acceleration integration. Under the scheme, under the condition of long-time use, a drift phenomenon can be generated, and the positioning navigation accuracy is influenced.
Scheme II:
and carrying out matching place or AMCL positioning through a laser radar sensor, a speedometer and the like to obtain the position of the robot. And the laser radar has a good positioning and navigation effect under the condition of a small scene. However, in some large scenes, such as airports, shopping malls, etc., these scenes are quite open and the lidar sensor may not reach the edge. The robot cannot use laser data or an odometer to correct the position of the robot during traveling, so that the actual traveling route and the planned route are different, and the robot considers that the robot reaches the target point but does not actually reach the target point, which may be specifically referred to as a positioning navigation diagram shown in fig. 2.
As shown in fig. 2, the starting point of the robot is the current position point of the robot, and the target point of the robot is the position point to which the robot needs to go. The ideal route between the robot starting point and the robot target point is a travel route planned for the mobile robot. The actual route of the robot is an actual traveling route obtained by the mobile robot using the second scheme. It can be seen that the actual arrival point of the mobile robot is not coincident with the target point due to inaccurate positioning and navigation.
The third scheme is as follows:
the robot position, namely the visual positioning navigation, is determined by collecting the surrounding environment and matching the pictures in the picture library, and the situation of positioning error can be generated for scenes with repetition or scenes with high similarity.
In research, the inventors found that the existing positioning and navigation schemes shown in fig. 1 suffer from positioning and navigation errors and low accuracy mainly because a relatively coarse absolute position point is lacking. If the position coordinates of such an absolute position point were available, it could assist the robot in positioning and navigation: by fusing the absolute position point information with other sensor data and correcting the mobile robot's current position, the robot could still position and navigate accurately in large or repetitive scenes, without accumulating errors even after moving for a long time, thereby improving positioning and navigation accuracy.
In the embodiments of this application, ultra-wideband (UWB, used below for brevity) positioning is used to assist the mobile robot in positioning and navigation: the UWB positioning signal serves as absolute position point information to correct the mobile robot's current position, thereby improving positioning and navigation accuracy.
In order to use the UWB to assist the robot in moving and navigating, a relation between a UWB coordinate system and a robot grid map coordinate system needs to be established first, and UWB base station coordinates under the UWB coordinate system are converted into UWB base station coordinates under the robot grid map coordinate system; then, the converted UWB base station coordinates are utilized, and the current absolute position point information of the robot in the grid map is obtained through a UWB tag carried on the robot; and finally, the absolute position point information is utilized, and signals of other sensors are fused to correct the current position of the robot.
The mobile robot positioning and navigation scheme provided by the embodiment of the application can be applied to a mobile robot, and the specific type of the mobile robot can be any, for example, the mobile robot is a sweeping robot. The embodiment of the present application does not set any limit to the specific type of the mobile robot.
Referring to fig. 3, a schematic block flow diagram of a mobile robot positioning and navigation method provided in an embodiment of the present application may include the following steps:
s301, acquiring a grid map of a target area; the target area is provided with an ultra-wide band base station in advance, and the mobile robot is provided with an ultra-wide band tag.
The target area is the area in which the mobile robot is guided to move, and it may be any type of area. For example, the target area may be an airport or a shopping mall; such scenes are relatively open and contain many similar, repeated sub-scenes.
UWB base stations are deployed in advance at appropriate positions in the target area, and a certain number of UWB tags are installed on the mobile robot; the numbers of base stations and tags can be set as required. While the mobile robot moves, UWB positioning signals are obtained through the base stations and the tags.
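As background on how such UWB positioning signals arise (an illustrative sketch, not taken from the patent): UWB tags and base stations typically measure distance from the signal's time of flight, for example via single-sided two-way ranging.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def twr_distance(t_round, t_reply):
    """Single-sided two-way ranging: the tag's measured round-trip time
    minus the base station's known reply delay equals twice the time of
    flight; multiplying by the speed of light yields the distance."""
    time_of_flight = (t_round - t_reply) / 2.0
    return time_of_flight * SPEED_OF_LIGHT
```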
In specific application, a target area can be scanned through a high-precision laser radar, and a grid map of the target area is established.
Step S302, determining a first ultra-wideband base station coordinate of the ultra-wideband base station in a grid map coordinate system of the robot.
The first ultra-wideband base station coordinates are the UWB base station coordinates expressed in the robot grid map coordinate system. In general, UWB base station coordinates are expressed in the UWB coordinate system; the UWB base station coordinates in the robot grid map coordinate system are obtained by transforming them using a transformation matrix between the UWB coordinate system and the robot grid map coordinate system. Specifically, first determine the transformation matrix between the two coordinate systems as well as the UWB base station coordinates in the UWB coordinate system; then transform the UWB base station coordinates into the robot grid map coordinate system according to the transformation matrix.
In a specific application, the first ultra-wideband base station coordinates may be computed in advance and stored locally on the mobile robot, in which case they are obtained by query; alternatively, they may be computed in real time rather than in advance.
And S303, obtaining absolute position point information of the robot in the grid map according to the first ultra-wideband base station coordinate and the ultra-wideband positioning signal of the mobile robot.
In a specific application, while the mobile robot moves, the UWB positioning signal is obtained through interaction between the carried UWB tag and the UWB base stations. The position of the UWB tag in the robot grid map coordinate system is then determined from the UWB base station coordinates in that coordinate system; this is the robot's current position in the grid map and serves as the absolute position point information.
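The patent does not spell out how the tag position follows from the base-station coordinates and the ranging signal; one standard way (a sketch under that assumption) is linearised least-squares trilateration:

```python
import numpy as np

def uwb_position(stations, ranges):
    """Estimate the 2-D tag position from base-station coordinates
    (already expressed in the grid map frame) and measured ranges.
    Subtracting the first range equation from the others cancels the
    quadratic terms, leaving a linear system A p = b."""
    stations = np.asarray(stations, dtype=float)  # shape (n, 2), n >= 3
    ranges = np.asarray(ranges, dtype=float)      # shape (n,)
    x0, y0 = stations[0]
    A = 2.0 * (stations[1:] - stations[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(stations[1:] ** 2, axis=1) - x0 ** 2 - y0 ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position  # (x, y) in the grid map coordinate system
```

With more than three base stations the least-squares solution also averages out some ranging noise.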
And S304, based on the grid map, carrying out robot positioning navigation through absolute position point information and sensor signals, wherein the sensor signals comprise at least one of laser radar signals, inertial navigation signals, odometer signals and visual sensor signals.
It should be noted that the UWB positioning signal may assist the robot in positioning and navigating, and although the UWB positioning signal is relatively coarse, it may provide an absolute position point for the mobile robot. Based on the absolute position point, signals of other sensors are fused to correct the current position of the mobile robot.
The sensor signal may be one or more of a lidar signal, an inertial navigation signal, an odometer signal, and a visual sensor signal, which may be determined based on a positioning navigation scheme specifically used by the mobile robot.
For example, when the mobile robot adopts inertial navigation, the sensor signal is an inertial navigation signal, and at this time, the UWB positioning signal and the inertial navigation signal are fused to correct the current position of the mobile robot, so that the mobile robot does not drift.
For another example, when the mobile robot employs visual navigation, the sensor signal is a visual sensor signal, and at this time, an absolute position is provided for the robot through the UWB positioning signal, and in some repeated similar scenes, the robot can still correct the current position through the visual signal and the UWB positioning signal.
For another example, when the mobile robot navigates by lidar, the sensor signals include an odometer signal and a lidar positioning signal. In large open scenes the robot cannot correct its own position from the laser positioning data or the odometer signal alone, but the UWB positioning signal provides an absolute, if relatively coarse, position coordinate with which to correct the robot's current position.
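The patent leaves the fusion rule unspecified; as a deliberately simple stand-in (a Kalman filter would be the usual choice in practice), a complementary-filter correction nudges the drift-prone dead-reckoned position toward the coarse but absolute UWB fix:

```python
def correct_pose(dead_reckoned, uwb_fix, gain=0.2):
    """Blend a dead-reckoned (x, y) position with an absolute UWB (x, y)
    fix.  `gain` in [0, 1] weights the UWB correction: 0 ignores UWB,
    1 snaps fully to the UWB fix."""
    x_dr, y_dr = dead_reckoned
    x_uwb, y_uwb = uwb_fix
    return (x_dr + gain * (x_uwb - x_dr),
            y_dr + gain * (y_uwb - y_dr))
```

Because the UWB term is absolute, repeated corrections bound the accumulated drift instead of letting it grow without limit.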
It can be understood that the positioning and navigation process includes positioning and navigation, and after the current position of the robot is corrected by fusing other sensor signals with absolute position point information, a path from the current position to the target point is planned, and then the robot navigates to the target point according to the path.
In the embodiments of this application, the ultra-wideband positioning signal, together with the ultra-wideband base station coordinates in the robot grid map coordinate system, determines the robot's current position on the grid map, and this absolute position point information is fused with other sensor signals for positioning and navigation. In other words, the ultra-wideband positioning signal assists the mobile robot's positioning and navigation by supplying its current position in the grid map, so the mobile robot can be positioned accurately in large or repetitive scenes, accumulated error is avoided, and positioning and navigation accuracy is improved.
The calculation of the coordinates of the first ultra-wideband base station will now be described.
Referring to fig. 4, which is a schematic flow chart of a UWB base station coordinate transformation process provided in an embodiment of the present application, the specific process for determining coordinates of a first ultra-wideband base station of an ultra-wideband base station in a grid map coordinate system of a robot may include:
step S401, determining a first coordinate of a target position point selected from a grid map in a robot grid map coordinate system.
It should be noted that the target position points are position points selected from the grid map according to requirements; their number and locations may in principle be arbitrary. However, to improve the accuracy of the first ultra-wideband base station coordinates, the selected target position points should cover the mapped area as fully as possible and should be salient points. The selection principles are therefore: cover the whole mapped area as much as possible, and pick salient points from the mapping process, such as intersections or inflection points of walls.
After the target position points are selected, first coordinates of the target position points in a grid map coordinate system of the robot are measured or calculated.
In specific application, the grid map is an image, and the first coordinate of the target position point in the grid map coordinate system of the robot can be calculated according to the resolution of the image and the pixel coordinate of the target position point. Specifically, pixel coordinates of the target position point in the image coordinate system may be obtained first; and calculating a first coordinate of the target position point in the robot grid map coordinate system according to the pixel coordinate of the origin of the robot grid map coordinate system, the pixel coordinate of the target position point and the image resolution of the grid map.
Further, the first coordinate of each target position point in the robot grid map coordinate system can be calculated by:

Ax = (Au - x0) * r

Ay = (Av - y0) * r

wherein (Ax, Ay) is the first coordinate of the target position point in the robot grid map coordinate system, (Au, Av) is the pixel coordinate of the target position point, (x0, y0) is the pixel coordinate of the origin of the robot grid map coordinate system, and r is the image resolution of the grid map.
And respectively calculating the first coordinate of each target position point in the grid map coordinate system of the robot according to the formula.
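The pixel-to-map conversion above can be sketched as a small helper. Variable names follow the text; the treatment of the image y-axis direction is an assumption of this sketch, since some grid-map conventions flip it.

```python
# Convert a pixel coordinate (Au, Av) on the grid-map image into the first
# coordinate (Ax, Ay) in the robot grid map coordinate system, given the
# origin pixel (x0, y0) and the image resolution r (metres per pixel).
def pixel_to_map(Au, Av, x0, y0, r):
    Ax = (Au - x0) * r
    Ay = (Av - y0) * r
    return Ax, Ay
```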
And S402, determining a second coordinate of the target position point in the ultra-wideband coordinate system.
In a specific application, a total station can be used for measuring the coordinates of each position point in the UWB coordinate system.
And S403, calculating a transformation matrix between the ultra-wideband coordinate system and the grid map coordinate system of the robot according to the first coordinate and the second coordinate.
Specifically, after obtaining a first coordinate of the same target position point in the robot grid map coordinate system and a second coordinate in the UWB coordinate system, a transformation matrix for transforming from the UWB coordinate system to the robot grid map coordinate system may be calculated. The method of calculating the transformation Matrix may be, but is not limited to, Iterative Closest Point (ICP), Homography Matrix (Homography Matrix), or the like.
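As one concrete, hedged way to obtain such a transformation matrix from the paired coordinates, a least-squares 2-D rigid transform can be fitted to the correspondences with an SVD, in the spirit of the point-registration step inside ICP. The helper `fit_rigid_2d` and its homogeneous-matrix output are assumptions of this sketch, not the patent's implementation.

```python
import numpy as np

def fit_rigid_2d(src, dst):
    """Least-squares rigid transform T (3x3 homogeneous) with dst ~ T @ src.

    src, dst: (N, 2) arrays of corresponding points, e.g. the target
    position points in the UWB frame (src) and map frame (dst).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)          # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dc - R @ sc
    T = np.eye(3)
    T[:2, :2] = R
    T[:2, 2] = t
    return T
```

Three or more non-collinear target position points are enough to determine the rotation and translation, which matches the advice above that the selected points should be spread over the map.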
And S404, acquiring a second ultra-wideband base station coordinate of the ultra-wideband base station in the ultra-wideband coordinate system.
In a specific application, a total station may be used to measure the coordinates of each UWB base station, which are the coordinates in the total station coordinate system (i.e., UWB coordinate system).
And S405, transforming the second ultra-wideband base station coordinate into the first ultra-wideband base station coordinate in the grid map coordinate system of the robot according to the transformation matrix.
In specific application, the two-dimensional coordinate of each second ultra-wideband base station coordinate is first transformed, through N'k = T * Nk, into the two-dimensional coordinate of the first ultra-wideband base station coordinate in the robot map coordinate system.
Then, the first ultra-wideband base station coordinate in the robot grid map coordinate system is obtained based on the two-dimensional coordinate of each first ultra-wideband base station coordinate and the height value of the corresponding second ultra-wideband base station coordinate;
wherein T is the transformation matrix, the first ultra-wideband base station coordinate is N'k(x, y, z), and the second ultra-wideband base station coordinate is Nk(x, y, z);
N'k(x, y) is the two-dimensional part of the first ultra-wideband base station coordinate, Nk(x, y) is the two-dimensional part of the second ultra-wideband base station coordinate, and z is the height value of the second ultra-wideband base station coordinate.
It should be noted that the coordinates of each UWB base station are three-dimensional, and may be specifically represented as (x, y, z), where z refers to height information of the UWB base station, and this height information is directly measured and does not change with the change of the transformation matrix.
In other words, in the process of transforming by using the transformation matrix, only the two-dimensional coordinates of the UWB base station, i.e., (x, y) of the coordinates of the UWB base station, need to be transformed, and the z of the coordinates of the UWB base station does not need to be transformed. The z (i.e. height value) of the UWB base station coordinates is invariant, i.e. the z of the UWB base station coordinates in the UWB coordinate system is the same as the z in the robot grid map coordinate system.
In specific application, the two-dimensional coordinates of the UWB base station can be transformed by using a transformation matrix, and then the three-dimensional coordinates of the robot grid map coordinate system can be obtained by adding the (x, y) of the UWB base station under the robot grid map coordinate system obtained by transformation and the height information of the base station.
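This two-step transform (rotate and translate the horizontal coordinates, then carry the measured height over unchanged) can be sketched as follows; the helper name is an assumption of this sketch.

```python
import numpy as np

def transform_base_station(T, Nk):
    """Map a base-station coordinate Nk = (x, y, z) from the UWB frame
    into the robot grid map frame using the 3x3 homogeneous 2-D matrix T.

    Only (x, y) pass through T; the directly measured height z is
    reattached unchanged, as described in the text.
    """
    x, y, z = Nk
    xp, yp, _ = T @ np.array([x, y, 1.0])
    return (float(xp), float(yp), z)
```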
For example, after obtaining the grid map, a robot grid map coordinate system XOY is first constructed. Three salient points in the grid map are selected and marked as B, C and D, with pixel coordinates (Bu, Bv), (Cu, Cv) and (Du, Dv). The image resolution of the grid map is r, and the pixel coordinate of the origin of the XOY coordinate system is O(xo, yo). The coordinates of points B, C and D in the XOY coordinate system are then calculated as follows:

Bx = (Bu - xo) * r,  By = (Bv - yo) * r

Cx = (Cu - xo) * r,  Cy = (Cv - yo) * r

Dx = (Du - xo) * r,  Dy = (Dv - yo) * r

wherein the coordinates of points B, C and D in the XOY coordinate system are (Bx, By), (Cx, Cy) and (Dx, Dy).
Next, a UWB positioning system is constructed, with UWB base stations installed in the target area in advance. A total station coordinate system xoy (namely the UWB coordinate system) is constructed, and the total station measures points B, C and D to obtain their coordinates (bx, by, bz), (cx, cy, cz) and (dx, dy, dz) in the xoy coordinate system.
At the same time, the coordinates of each UWB base station are measured by the total station, namely N1(x, y, z), N2(x, y, z) and N3(x, y, z).
Then, from the coordinates (bx, by, bz), (cx, cy, cz), (dx, dy, dz) of points B, C and D in the xoy coordinate system and the coordinates (Bx, By), (Cx, Cy), (Dx, Dy) of points B, C and D in the XOY coordinate system, the transformation matrix T from the UWB coordinate system xoy to the robot grid map coordinate system XOY is calculated. It will be appreciated that the height information in the UWB coordinate system need not be used in calculating the transformation matrix.
Finally, through N'k = T * Nk, the coordinates N1(x, y), N2(x, y) and N3(x, y) of the UWB base stations in the xoy coordinate system are transformed into coordinates of the UWB base stations in the XOY coordinate system. The height information of each UWB base station (i.e., z) is then appended to obtain the full coordinates of the UWB base stations in the XOY coordinate system.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 5 shows a block diagram of a mobile robot positioning and navigation apparatus according to an embodiment of the present application, which corresponds to the mobile robot positioning and navigation method according to the foregoing embodiment, and only shows portions related to the embodiment of the present application for convenience of description.
Referring to fig. 5, the apparatus may include:
a grid map obtaining module 51, configured to obtain a grid map of a target area; the target area is provided with an ultra-wideband base station in advance, and the mobile robot is loaded with an ultra-wideband tag;
the determining module 52 is configured to determine a first ultra-wideband base station coordinate of the ultra-wideband base station in a grid map coordinate system of the robot;
a position point information obtaining module 53, configured to obtain absolute position point information of the robot in the grid map according to the first ultra-wideband base station coordinate and the ultra-wideband positioning signal of the mobile robot;
and a positioning navigation module 54 for performing robot positioning navigation based on the grid map by using the absolute position point information and sensor signals, wherein the sensor signals include at least one of laser radar signals, inertial navigation signals, odometer signals and visual sensor signals.
In a possible implementation manner, the determining module is specifically configured to:
determining a first coordinate of a target position point selected from a grid map in a robot grid map coordinate system;
determining a second coordinate of the target position point under the ultra-wideband coordinate system;
calculating a transformation matrix between the ultra-wideband coordinate system and the grid map coordinate system of the robot according to the first coordinate and the second coordinate;
acquiring a second ultra-wideband base station coordinate of the ultra-wideband base station in an ultra-wideband coordinate system;
and transforming the second ultra-wideband base station coordinate into the first ultra-wideband base station coordinate in the grid map coordinate system of the robot according to the transformation matrix.
In a possible implementation manner, the determining module is specifically configured to:
acquiring pixel coordinates of a target position point in an image coordinate system;
and calculating a first coordinate of the target position point in the robot grid map coordinate system according to the pixel coordinate of the origin of the robot grid map coordinate system, the pixel coordinate of the target position point and the image resolution of the grid map.
In a possible implementation manner, the determining module is specifically configured to:
by passing
Figure BDA0002350569780000141
Calculating a first coordinate of each target position point in a grid map coordinate system of the robot;
wherein (A)x,Ay) Is the first coordinate of the target position point in the grid map coordinate system of the robot, (A)u,Av) Is the pixel coordinate of the target location point, (x)0,y0) The coordinate of the pixel of the origin of the coordinate system of the grid map of the robot is shown, and r is the image resolution of the grid map.
In a possible implementation manner, the determining module is specifically configured to:
transforming, through N'k = T * Nk, the two-dimensional coordinate of each second ultra-wideband base station coordinate into the two-dimensional coordinate of the first ultra-wideband base station coordinate in the robot map coordinate system;
obtaining the first ultra-wideband base station coordinate in the robot grid map coordinate system based on the two-dimensional coordinate of each first ultra-wideband base station coordinate and the height value of the corresponding second ultra-wideband base station coordinate;
wherein T is the transformation matrix, the first ultra-wideband base station coordinate is N'k(x, y, z), and the second ultra-wideband base station coordinate is Nk(x, y, z);
N'k(x, y) is the two-dimensional part of the first ultra-wideband base station coordinate, Nk(x, y) is the two-dimensional part of the second ultra-wideband base station coordinate, and z is the height value of the second ultra-wideband base station coordinate.
In a possible implementation manner, the grid map obtaining module is specifically configured to:
and scanning the target area through the laser radar to obtain a grid map of the target area.
The mobile robot positioning and navigation device has the function of implementing the mobile robot positioning and navigation method described above. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function, and the modules may be software and/or hardware.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/modules, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and reference may be made to the part of the embodiment of the method specifically, and details are not described here.
Fig. 6 is a schematic structural diagram of a mobile robot according to an embodiment of the present application. As shown in fig. 6, the mobile robot 6 of this embodiment includes: at least one processor 60, a memory 61, and a computer program 62 stored in the memory 61 and executable on the at least one processor 60, the processor 60 implementing the steps in any of the various mobile robot positioning navigation method embodiments described above when executing the computer program 62.
The mobile robot may include, but is not limited to, a processor 60, a memory 61, a UWB tag 63, a movement mechanism 64, and a sensor 65. Those skilled in the art will appreciate that fig. 6 is merely an example of the mobile robot 6, and does not constitute a limitation of the mobile robot 6, and may include more or less components than those shown, or combine some of the components, or different components, such as input and output devices, network access devices, etc.
The processor 60 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 61 may in some embodiments be an internal storage unit of the mobile robot 6, such as a hard disk or a memory of the mobile robot 6. The memory 61 may also be an external storage device of the mobile robot 6 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the mobile robot 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the mobile robot 6. The memory 61 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiment of the present application provides a computer program product, which when running on a mobile robot, enables the mobile robot to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the apparatus/terminal device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash disk, a removable hard disk, a magnetic disk or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A mobile robot positioning and navigation method is characterized by comprising the following steps:
acquiring a grid map of a target area; the target area is provided with an ultra-wide band base station in advance, and the mobile robot is loaded with an ultra-wide band tag;
determining a first ultra-wideband base station coordinate of the ultra-wideband base station under a grid map coordinate system of the robot;
obtaining absolute position point information of the robot in the grid map according to the first ultra-wideband base station coordinate and the ultra-wideband positioning signal of the mobile robot;
and based on the grid map, performing robot positioning navigation through the absolute position point information and sensor signals, wherein the sensor signals comprise at least one of laser radar signals, inertial navigation signals, odometer signals and visual sensor signals.
2. The method of claim 1, wherein determining first ultra-wideband base station coordinates of the ultra-wideband base station in a robot grid map coordinate system comprises:
determining a first coordinate of a target position point selected from the grid map in a robot grid map coordinate system;
determining a second coordinate of the target position point under an ultra-wideband coordinate system;
calculating a transformation matrix between the ultra-wideband coordinate system and the grid map coordinate system of the robot according to the first coordinate and the second coordinate;
acquiring a second ultra-wideband base station coordinate of the ultra-wideband base station under the ultra-wideband coordinate system;
and transforming the second ultra-wideband base station coordinate into a first ultra-wideband base station coordinate under the grid map coordinate system of the robot according to the transformation matrix.
3. The method of claim 2, wherein determining first coordinates of a target location point selected from the grid map in the robot grid map coordinate system comprises:
acquiring pixel coordinates of the target position point in an image coordinate system;
and calculating a first coordinate of the target position point in the robot grid map coordinate system according to the pixel coordinate of the origin of the robot grid map coordinate system, the pixel coordinate of the target position point and the image resolution of the grid map.
4. The method of claim 3, wherein calculating a first coordinate of the target location point in the robot grid map coordinate system based on pixel coordinates of an origin of the robot grid map coordinate system, pixel coordinates of the target location point, and an image resolution of the grid map comprises:
through

Ax = (Au - x0) * r

Ay = (Av - y0) * r

calculating a first coordinate of each target position point in the robot grid map coordinate system;
wherein (Ax, Ay) is the first coordinate of the target position point in the robot grid map coordinate system, (Au, Av) is the pixel coordinate of the target position point, (x0, y0) is the pixel coordinate of the origin of the robot grid map coordinate system, and r is the image resolution of the grid map.
5. The method of claim 2, wherein transforming the second ultra-wideband base station coordinates into first ultra-wideband base station coordinates in the robot grid map coordinate system according to the transformation matrix comprises:
transforming, through N'k = T * Nk, the two-dimensional coordinate of each second ultra-wideband base station coordinate into the two-dimensional coordinate of the first ultra-wideband base station coordinate in the robot map coordinate system;
obtaining a first ultra-wideband base station coordinate in the robot grid map coordinate system based on the two-dimensional coordinate of each first ultra-wideband base station coordinate and the height value of the corresponding second ultra-wideband base station coordinate;
wherein T is the transformation matrix, the first ultra-wideband base station coordinate is N'k(x, y, z), and the second ultra-wideband base station coordinate is Nk(x, y, z);
N'k(x, y) is the two-dimensional part of the first ultra-wideband base station coordinate, Nk(x, y) is the two-dimensional part of the second ultra-wideband base station coordinate, and z is the height value of the second ultra-wideband base station coordinate.
6. The method of claim 1, wherein obtaining a grid map of a target area comprises:
and scanning the target area through a laser radar to obtain a grid map of the target area.
7. A mobile robot positioning navigation device, comprising:
the grid map acquisition module is used for acquiring a grid map of a target area; the target area is provided with an ultra-wide band base station in advance, and the mobile robot is loaded with an ultra-wide band tag;
the determining module is used for determining a first ultra-wideband base station coordinate of the ultra-wideband base station in a grid map coordinate system of the robot;
the position point information obtaining module is used for obtaining absolute position point information of the robot in the grid map according to the first ultra-wideband base station coordinate and the ultra-wideband positioning signal of the mobile robot;
and the positioning navigation module is used for performing robot positioning navigation through the absolute position point information and sensor signals based on the grid map, wherein the sensor signals comprise at least one of laser radar signals, inertial navigation signals, odometer signals and visual sensor signals.
8. The apparatus of claim 7, wherein the determination module is specifically configured to:
determining a first coordinate of a target position point selected from the grid map in a robot grid map coordinate system;
determining a second coordinate of the target position point under an ultra-wideband coordinate system;
calculating a transformation matrix between the ultra-wideband coordinate system and the grid map coordinate system of the robot according to the first coordinate and the second coordinate;
acquiring a second ultra-wideband base station coordinate of the ultra-wideband base station under the ultra-wideband coordinate system;
and transforming the second ultra-wideband base station coordinate into a first ultra-wideband base station coordinate under the grid map coordinate system of the robot according to the transformation matrix.
9. A mobile robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 6.
CN201911413449.XA 2019-12-31 2019-12-31 Mobile robot positioning navigation method and device, mobile robot and storage medium Pending CN111121754A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911413449.XA CN111121754A (en) 2019-12-31 2019-12-31 Mobile robot positioning navigation method and device, mobile robot and storage medium


Publications (1)

Publication Number Publication Date
CN111121754A true CN111121754A (en) 2020-05-08

Family

ID=70506537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911413449.XA Pending CN111121754A (en) 2019-12-31 2019-12-31 Mobile robot positioning navigation method and device, mobile robot and storage medium

Country Status (1)

Country Link
CN (1) CN111121754A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111474518A (en) * 2020-05-25 2020-07-31 浙江大华技术股份有限公司 Positioning method, fusion positioning base station, and storage medium
CN111649748A (en) * 2020-06-16 2020-09-11 湖北友系互联科技有限公司 Indoor navigation method and system
CN111708362A (en) * 2020-05-22 2020-09-25 上海黑眸智能科技有限责任公司 Mobile robot laser recharging method, system and terminal
CN111983559A (en) * 2020-08-14 2020-11-24 Oppo广东移动通信有限公司 Indoor positioning navigation method and device
CN112132929A (en) * 2020-09-01 2020-12-25 北京布科思科技有限公司 Grid map marking method based on depth vision and single line laser radar
CN112179352A (en) * 2020-09-30 2021-01-05 北京小米移动软件有限公司 Space map construction method and device, movement control method and device, and medium
CN112462758A (en) * 2020-11-06 2021-03-09 深圳市优必选科技股份有限公司 Drawing establishing method and device, computer readable storage medium and robot
CN112702699A (en) * 2020-12-21 2021-04-23 南京大学 Indoor positioning method fusing UWB and LiDAR
CN112697151A (en) * 2020-12-24 2021-04-23 北京百度网讯科技有限公司 Method, apparatus and storage medium for determining initial point of mobile robot
CN112947435A (en) * 2021-02-04 2021-06-11 沈阳仪表科学研究院有限公司 Navigation control method for wall-climbing robot
CN113110496A (en) * 2021-05-08 2021-07-13 珠海市一微半导体有限公司 Mobile robot mapping method and system
CN113791625A (en) * 2021-09-30 2021-12-14 深圳市优必选科技股份有限公司 Full coverage path generation method and device, terminal equipment and storage medium
CN114061563A (en) * 2021-10-15 2022-02-18 深圳优地科技有限公司 Method and device for judging reasonability of target point, terminal equipment and storage medium
CN114371710A (en) * 2022-01-07 2022-04-19 牧原肉食品有限公司 Mobile robot navigation method and device based on reflective columns and readable storage medium
CN114777757A (en) * 2022-03-14 2022-07-22 深圳市优必选科技股份有限公司 Beacon map construction method, device, equipment and storage medium
CN115308684A (en) * 2022-07-05 2022-11-08 广州晓网电子科技有限公司 Uwb ultra-wideband indoor positioning method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105865451A (en) * 2016-04-19 2016-08-17 深圳市神州云海智能科技有限公司 Method and device applied to indoor location of mobile robot
CN106052684A (en) * 2016-06-16 2016-10-26 济南大学 Mobile robot IMU/UWB/code disc loose combination navigation system and method adopting multi-mode description
CN107478214A (en) * 2017-07-24 2017-12-15 杨华军 A kind of indoor orientation method and system based on Multi-sensor Fusion
CN108012326A (en) * 2017-12-07 2018-05-08 珠海市微半导体有限公司 The method and chip of robot monitoring pet based on grating map
CN109388149A (en) * 2018-09-26 2019-02-26 杭州电子科技大学 A kind of control method of the intelligent check system based on unmanned plane
CN110375730A (en) * 2019-06-12 2019-10-25 深圳大学 The indoor positioning navigation system merged based on IMU and UWB


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Lu Jingyu et al., "Design of an Indoor Positioning System for Mobile Robots Based on Ultra-Wideband", Application of Electronic Technique *
Wang Fang et al., "UKF Localization Algorithm for Indoor Robots Based on Ultra-Wideband and Dead Reckoning", Navigation Positioning and Timing *
Wang Hui et al., "Design of Navigation, Guidance and Control Systems for Air Defense Missiles", 31 August 2017 *

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111708362A (en) * 2020-05-22 2020-09-25 上海黑眸智能科技有限责任公司 Mobile robot laser recharging method, system and terminal
CN111474518B (en) * 2020-05-25 2023-07-14 浙江大华技术股份有限公司 Positioning method, fusion positioning base station and storage medium
CN111474518A (en) * 2020-05-25 2020-07-31 浙江大华技术股份有限公司 Positioning method, fusion positioning base station, and storage medium
CN111649748A (en) * 2020-06-16 2020-09-11 湖北友系互联科技有限公司 Indoor navigation method and system
CN111983559A (en) * 2020-08-14 2020-11-24 Oppo广东移动通信有限公司 Indoor positioning navigation method and device
CN112132929A (en) * 2020-09-01 2020-12-25 北京布科思科技有限公司 Grid map marking method based on depth vision and single-line laser radar
CN112132929B (en) * 2020-09-01 2024-01-26 北京布科思科技有限公司 Grid map marking method based on depth vision and single-line laser radar
CN112179352A (en) * 2020-09-30 2021-01-05 北京小米移动软件有限公司 Space map construction method and device, movement control method and device, and medium
CN112462758B (en) * 2020-11-06 2022-05-06 深圳市优必选科技股份有限公司 Map building method and device, computer readable storage medium and robot
CN112462758A (en) * 2020-11-06 2021-03-09 深圳市优必选科技股份有限公司 Map building method and device, computer readable storage medium and robot
CN112702699A (en) * 2020-12-21 2021-04-23 南京大学 Indoor positioning method fusing UWB and LiDAR
CN112702699B (en) * 2020-12-21 2021-12-03 南京大学 Indoor positioning method fusing UWB and LiDAR
CN112697151B (en) * 2020-12-24 2023-02-21 北京百度网讯科技有限公司 Method, apparatus, and storage medium for determining initial point of mobile robot
CN112697151A (en) * 2020-12-24 2021-04-23 北京百度网讯科技有限公司 Method, apparatus and storage medium for determining initial point of mobile robot
CN112947435A (en) * 2021-02-04 2021-06-11 沈阳仪表科学研究院有限公司 Navigation control method for wall-climbing robot
CN113110496A (en) * 2021-05-08 2021-07-13 珠海市一微半导体有限公司 Mobile robot mapping method and system
CN113110496B (en) * 2021-05-08 2024-05-07 珠海一微半导体股份有限公司 Mobile robot mapping method and system
CN113791625A (en) * 2021-09-30 2021-12-14 深圳市优必选科技股份有限公司 Full coverage path generation method and device, terminal equipment and storage medium
CN114061563A (en) * 2021-10-15 2022-02-18 深圳优地科技有限公司 Method and device for judging reasonability of target point, terminal equipment and storage medium
CN114061563B (en) * 2021-10-15 2024-04-05 深圳优地科技有限公司 Target point rationality judging method, device, terminal equipment and storage medium
CN114371710A (en) * 2022-01-07 2022-04-19 牧原肉食品有限公司 Mobile robot navigation method and device based on reflective columns and readable storage medium
CN114371710B (en) * 2022-01-07 2024-04-30 牧原肉食品有限公司 Navigation method, equipment and readable storage medium of mobile robot based on reflective column
CN114777757A (en) * 2022-03-14 2022-07-22 深圳市优必选科技股份有限公司 Beacon map construction method, device, equipment and storage medium
CN115308684A (en) * 2022-07-05 2022-11-08 广州晓网电子科技有限公司 UWB ultra-wideband indoor positioning method and device

Similar Documents

Publication Publication Date Title
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
CN111108342B (en) Visual odometry and pairwise alignment for high definition map creation
US10509983B2 (en) Operating device, operating system, operating method, and program therefor
JP7236565B2 (en) POSITION AND ATTITUDE DETERMINATION METHOD, APPARATUS, ELECTRONIC DEVICE, STORAGE MEDIUM AND COMPUTER PROGRAM
KR20220053513A (en) Image data automatic labeling method and device
Acharya et al. BIM-Tracker: A model-based visual tracking approach for indoor localisation using a 3D building model
US11227395B2 (en) Method and apparatus for determining motion vector field, device, storage medium and vehicle
CN111174799A (en) Map construction method and device, computer readable medium and terminal equipment
KR20220025028A (en) Method and device for building beacon map based on visual beacon
CN110936383A (en) Obstacle avoiding method, medium, terminal and device for robot
CN110470333B (en) Calibration method and device of sensor parameters, storage medium and electronic device
CN110501036A (en) Calibration checking method and device for sensor parameters
CN112556685B (en) Navigation route display method and device, storage medium and electronic equipment
EP3961583B1 (en) Method for detecting obstacle, electronic device, roadside device and cloud control platform
CN113587934B (en) Robot, indoor positioning method and device and readable storage medium
CN111353453B (en) Obstacle detection method and device for vehicle
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN112232275A (en) Obstacle detection method, system, equipment and storage medium based on binocular recognition
CN115456898A (en) Method and device for building image of parking lot, vehicle and storage medium
CN117824667A (en) Fusion positioning method and medium based on two-dimensional code and laser
JP2023503750A (en) Robot positioning method and apparatus, device, and storage medium
CN116698014A (en) Map fusion and splicing method based on multi-robot laser SLAM and visual SLAM
CN115060268A (en) Fusion positioning method, system, equipment and storage medium for machine room
CN114323038A (en) Outdoor positioning method fusing binocular vision and 2D laser radar
CN113932793A (en) Three-dimensional coordinate positioning method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200508