CN112414403B - Robot positioning and attitude determining method, equipment and storage medium - Google Patents

Robot positioning and attitude determining method, equipment and storage medium

Info

Publication number
CN112414403B
Authority
CN
China
Prior art keywords
robot
point cloud
coordinate system
inertial navigation
laser point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110097185.2A
Other languages
Chinese (zh)
Other versions
CN112414403A (en)
Inventor
甘雨
杨世忠
刘望城
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Beidou Microchip Data Technology Co., Ltd.
Original Assignee
Hunan Beidou Microchip Data Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Beidou Microchip Data Technology Co., Ltd.
Priority to CN202110097185.2A
Publication of CN112414403A
Application granted
Publication of CN112414403B
Legal status: Active

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 — Navigation by using measurements of speed or acceleration
    • G01C21/12 — Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 — Inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a positioning and attitude determination method, equipment and storage medium for a robot. The method first constructs a prior map from the collected laser point cloud data and inertial navigation data. Second, curvature calculation is performed on the laser point cloud data to extract laser point cloud features, and the initial position attitude of the robot is obtained from the inertial navigation position attitude. Then, based on the initial position attitude, the current position attitude of the robot in the laser point cloud coordinate system is obtained and a local tracking map is constructed. Next, similar environments are matched to obtain the position attitude conversion parameters between the prior map and the local tracking map. Finally, the current position attitude of the robot in the laser point cloud coordinate system is corrected. The invention improves the positioning and attitude determination accuracy of an autonomously moving robot without depending on external signals; by applying curvature calculation to the laser point cloud data and using a grid map for the tracking calculation during positioning, it also greatly reduces the pressure of data transmission.

Description

Robot positioning and attitude determining method, equipment and storage medium
Technical Field
The invention relates to the technical field of robot positioning and navigation, and in particular to a positioning and attitude determination method, equipment and storage medium for a robot.
Background
As robot development and application gradually deepen, positioning and attitude determination provides the foundation for the autonomous navigation and positioning of robots; it is one of the core technologies driving the wide application of robots, and it has also become one of the bottlenecks limiting the improvement of robot service capability.
In related positioning and attitude determination technology, robot positioning generally combines differential GNSS with inertial navigation to obtain position and attitude information. GNSS positioning is easily disturbed by external factors such as weather; inertial navigation has high instantaneous positioning accuracy, but its errors accumulate over time and degrade accuracy. Combining the two lets each compensate for the other's deficiencies, but the result is still active positioning: it requires external GNSS signal input, is affected by uncontrollable external conditions, and its positioning effect is therefore uncontrollable.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. Therefore, the invention provides a robot positioning and attitude determining method, equipment and a storage medium, which do not need additional GNSS positioning equipment to provide positioning information and can completely avoid the problem of weak positioning signals caused by external factors.
In a first aspect of the present invention, there is provided a positioning and attitude determination method for a robot, applied to a control tool, the robot comprising a lidar and an inertial navigation measurement unit and moving within a defined area, the method comprising the steps of:
constructing a prior map according to the laser point cloud data and inertial navigation data collected by the robot within the defined area;
performing curvature calculation on the laser point cloud data to extract laser point cloud features, and acquiring, according to a repositioning method and based on the extracted laser point cloud features and the inertial navigation position attitude of the robot, the initial position attitude corresponding to the robot's inertial navigation position attitude in the prior map;
acquiring the current position attitude of the robot in a laser point cloud coordinate system based on the initial position attitude, and constructing a local tracking point cloud map based on the current position attitude of the robot in the laser point cloud coordinate system;
converting the local tracking point cloud map into a local tracking grid map, and acquiring the position attitude conversion parameters between the prior map and the local tracking grid map according to a similar environment detection method;
and correcting the current position attitude of the robot in the laser point cloud coordinate system based on the position attitude conversion parameters.
According to the embodiment of the invention, at least the following technical effects are achieved:
the method comprises the steps of firstly, acquiring laser point cloud data and inertial navigation data in a limited area according to a robot, and constructing a prior map in the limited area; secondly, curvature calculation is carried out on the laser point cloud data to extract laser point cloud characteristics, and the inertial navigation position attitude of the robot is combined to obtain the corresponding initial position attitude of the inertial navigation position attitude of the robot in the prior map according to a repositioning method; then, based on the initial position posture, obtaining the current position posture of the robot in a laser point cloud coordinate system, and constructing a local tracking point cloud map according to the current position posture of the robot in the laser point cloud coordinate system; then, converting the local tracking point cloud map into a local tracking grid map, and performing similar environment matching according to a similar environment detection method to obtain position and attitude conversion parameters between the prior map and the local tracking grid map; and finally, correcting the current position posture of the robot in the laser point cloud coordinate system based on the position posture conversion parameters. According to the method, additional GNSS positioning equipment is not needed to provide positioning information, an external signal is not needed, and the positioning and attitude determining precision of the robot in the autonomous movement is improved. In addition, the method adopts a curvature calculation mode for the laser point cloud data, only effective information in the laser point cloud data is reserved, redundant data is eliminated, and a grid map is also adopted for tracking calculation during positioning, so that the calculation amount is reduced, the requirement on calculation force is lowered, and meanwhile, the pressure of data transmission is greatly reduced.
In a second aspect of the present invention, there is provided a positioning and attitude determination device for a robot, comprising: at least one control processor and a memory communicatively connected to the at least one control processor; the memory stores instructions executable by the at least one control processor to enable the at least one control processor to perform the positioning and attitude determination method for a robot according to the first aspect of the invention.
In a third aspect of the present invention, a computer-readable storage medium is provided, which stores computer-executable instructions for causing a computer to perform the positioning and attitude determination method for a robot according to the first aspect of the present invention.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart of a positioning and attitude determination method for a robot according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an embodiment of a positioning and attitude determination method for a robot according to the present invention;
fig. 3 is a schematic structural diagram of a positioning and attitude determination device of a robot according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the existing positioning and attitude determination technology, robot positioning generally combines differential GNSS with inertial navigation to obtain position and attitude information. GNSS positioning is easily disturbed by external factors such as weather; inertial navigation has high instantaneous positioning accuracy, but its errors accumulate over time and degrade accuracy. Combining the two lets each compensate for the other's deficiencies, but the result is still active positioning: it requires external GNSS signal input, is affected by uncontrollable external conditions, and its positioning effect is therefore uncontrollable.
In order to overcome these disadvantages, and referring to fig. 1 and 2, an embodiment of the present invention provides a positioning and attitude determination method for a robot, applied to a control tool, the robot comprising a lidar and an inertial navigation measurement unit and moving within a defined area, the method comprising the steps of:
Step S101, constructing a point cloud grid map from the laser point cloud data collected by the robot within the defined area, and combining it with the inertial navigation data to serve as a prior map;
Step S102, performing curvature calculation on the laser point cloud data to extract features, and, combining the inertial navigation position attitude, obtaining the initial position attitude corresponding to the robot's current inertial navigation position attitude in the prior map according to a repositioning method;
Step S103, acquiring the current position attitude of the robot in the laser point cloud coordinate system based on the initial position attitude, and constructing a local tracking point cloud map based on the current position attitude;
Step S104, processing the local tracking point cloud map to convert it into a local tracking grid map, and acquiring the position attitude conversion parameters between the prior map and the local tracking grid map according to a similar environment detection method;
Step S105, correcting the current position attitude of the robot in the laser point cloud coordinate system based on the position attitude conversion parameters, and jumping back to step S103.
The method needs no additional GNSS positioning equipment to provide positioning information and does not depend on external signals, which improves the positioning and attitude determination accuracy of the robot during autonomous movement. It applies curvature calculation to the laser point cloud data, retaining only the effective information and eliminating redundant data, and uses a grid map for the tracking calculation during positioning, which reduces the computational load and the required computing power while greatly reducing the pressure of data transmission.
As an optional implementation manner, the control tool in this embodiment may be the robot's own control system (a control system with a processor at its core), a server in communication connection with the robot, or an upper computer (e.g., a PC) in communication connection with the robot, so that various control modes of the robot can be realized.
Based on the above embodiment, step S101 specifically includes the steps of:
The inertial navigation position attitude is constructed in the data format $(C, A, F)$, where $C$ represents the coordinates, $A$ represents the attitude, and $F$ represents the identification.
The point cloud grid map is constructed in the data format $(F', D, M, M_r, M_c)$, where $F'$ is an identification matching the format of $F$, $D$ represents the current point cloud frame data, $M$ represents a matrix with a fixed number of rows and columns, each position of which stores the elevation mean of the laser points falling within the corresponding real geographic coordinate range, $M_r$ is the single-row matrix obtained by taking the mean of each column of $M$, and $M_c$ is the single-column matrix obtained by taking the mean of each row of $M$.
The point cloud frame data $D$ has the format $D = (p, rgb, I, t)$, where $p$ represents the point coordinate information in the laser point cloud, $rgb$ represents the color information, $I$ represents the laser intensity information, and $t$ represents the time information.
The inertial navigation position attitude and the point cloud grid map constructed in these formats are stored as prior map data for the defined area to generate the prior map.
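To make the record layout concrete, these formats can be sketched as Python dataclasses; the class and field names below are illustrative choices, not identifiers taken from the patent:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class InsPose:
    coords: np.ndarray     # coordinates C (x, y, z)
    attitude: np.ndarray   # attitude A (e.g., roll, pitch, yaw)
    frame_id: int          # identification F

@dataclass
class PointCloudFrame:
    xyz: np.ndarray        # point coordinates p, shape (N, 3)
    rgb: np.ndarray        # color information, shape (N, 3)
    intensity: np.ndarray  # laser intensity I, shape (N,)
    stamp: np.ndarray      # per-point time information t, shape (N,)

@dataclass
class GridMapEntry:
    frame_id: int            # identification F' matched to InsPose.frame_id
    frame: PointCloudFrame   # current point cloud frame data D
    M: np.ndarray            # fixed r x s elevation-mean matrix
    row_means: np.ndarray    # single-row matrix M_r: column means of M
    col_means: np.ndarray    # single-column matrix M_c: row means of M
```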
Based on the above embodiment, step S102 specifically includes the steps of:
Step S1021, extracting laser point cloud features from the collected laser point cloud data.
Features are first extracted according to the point-to-point curvature within the laser point cloud, where the curvature is calculated as follows:
(1) The m points at each edge of every scanning line of the lidar cannot satisfy the curvature calculation condition of having m points on both the left and the right, so they do not participate in the calculation;
(2) For any target laser point, n points are selected on each of its left and right sides on the scanning line where the point is located, with n < m;
(3) The average difference $d_x$ between the selected points and the target laser point in the x coordinate direction is calculated:
$d_x = \frac{1}{2n}\sum_{j=-n,\,j\neq 0}^{n}(x_0 - x_j)$
where $(x_0, y_0, z_0)$ is the target laser point and $(x_j, y_j, z_j)$ are the selected neighboring points;
(4) The average difference $d_y$ in the y coordinate direction is calculated likewise:
$d_y = \frac{1}{2n}\sum_{j=-n,\,j\neq 0}^{n}(y_0 - y_j)$
(5) As is the average difference $d_z$ in the z coordinate direction:
$d_z = \frac{1}{2n}\sum_{j=-n,\,j\neq 0}^{n}(z_0 - z_j)$
(6) Using the obtained $d_x$, $d_y$ and $d_z$, the three-dimensional curvature $c$ is calculated as:
$c = \sqrt{d_x^2 + d_y^2 + d_z^2}$
After the three-dimensional curvature c is obtained, feature points are selected against a three-dimensional curvature threshold t to distinguish corner feature points from planar feature points.
Corner feature point selection conditions:
(1) At most N points are selected, starting from the point with the largest three-dimensional curvature, and only points with curvature greater than t may be selected;
(2) If any of the five points around a candidate point has already been selected as a corner feature point, the candidate is skipped and a point with smaller curvature is selected instead.
Planar feature point selection conditions:
(1) At most N points are selected, starting from the point with the smallest three-dimensional curvature, and only points with curvature smaller than t may be selected;
(2) If any of the five points around a candidate point has already been selected as a planar feature point, the candidate is skipped and a point with larger curvature is selected instead.
This yields the laser point cloud features (i.e., the corner point and planar point features) in the laser coordinate system.
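For illustration, a minimal Python sketch of this curvature computation and feature selection follows; the function names, the shared ±5-point suppression window, and the NaN handling at the scan-line edges are assumptions made for the sketch, not details fixed by the patent:

```python
import numpy as np

def curvature(points: np.ndarray, n: int) -> np.ndarray:
    """Three-dimensional curvature of each point on one scan line.

    points: (P, 3) array of x, y, z along a single scan line.
    n: number of neighbors taken on each side of the target point.
    Edge points without n neighbors on both sides get NaN and are
    excluded from feature selection.
    """
    P = len(points)
    c = np.full(P, np.nan)
    for i in range(n, P - n):
        neighbors = np.r_[points[i - n:i], points[i + 1:i + n + 1]]
        d = (points[i] - neighbors).mean(axis=0)  # (dx, dy, dz)
        c[i] = np.sqrt((d ** 2).sum())
    return c

def select_features(points, c, t, N):
    """Split points into corner (c > t) and planar (c < t) features,
    at most N of each, skipping a candidate whose 5-point neighborhood
    already contains a selected feature (crude suppression)."""
    corner, planar = [], []
    order = np.argsort(c)            # ascending curvature; NaNs sort last
    valid = ~np.isnan(c)
    taken = np.zeros(len(points), dtype=bool)
    for i in order[::-1]:            # largest curvature first
        if valid[i] and c[i] > t and len(corner) < N \
                and not taken[max(0, i - 5):i + 6].any():
            corner.append(i); taken[i] = True
    for i in order:                  # smallest curvature first
        if valid[i] and c[i] < t and len(planar) < N \
                and not taken[max(0, i - 5):i + 6].any():
            planar.append(i); taken[i] = True
    return points[corner], points[planar]
```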
Step S1022, extracting the current inertial navigation position attitude of the robot from the acquired inertial navigation data.
From the acquired inertial navigation data, the position and attitude of the robot in the inertial navigation coordinate system are calculated, denoted $T_{ins}$.
Step S1023, converting the laser point cloud feature data into the inertial navigation coordinate system according to the fixed rotation and translation relationship between the inertial navigation coordinate system and the lidar coordinate system, combining it with the inertial navigation position attitude data, registering it with the point cloud frame data in the prior map, and acquiring the position attitude of the origin of the inertial navigation coordinate system in the prior map.
A point cloud registration algorithm is used with $T_{ins}$ as the initial registration value to register the laser point cloud features with the point cloud frame data in the prior map, determining, in the prior map coordinate system, the position attitude of the origin of the inertial navigation coordinate system at the time of robot repositioning, denoted $T_{org}$.
Step S1024, obtaining the initial position attitude corresponding to the current inertial navigation position attitude of the robot in the prior map, from the current inertial navigation position attitude of the robot and the position attitude of the origin of the inertial navigation coordinate system in the prior map.
The initial position attitude $T_{init}$ corresponding to the current inertial navigation position attitude of the robot in the prior map coordinate system is obtained by the following formula:
$T_{init} = T_{org} \cdot T_{ins}$
where $T_{init}$ represents the initial position attitude, in the prior map, corresponding to the robot's inertial navigation position attitude.
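A short sketch of this composition with 4x4 homogeneous transforms, assuming (as one plausible reading) that the origin pose left-multiplies the current inertial navigation pose; the numeric values are placeholders:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pose_to_matrix(xyz, quat_xyzw) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a position and a quaternion."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_quat(quat_xyzw).as_matrix()
    T[:3, 3] = xyz
    return T

# T_org: pose of the inertial-navigation origin in the prior-map frame,
#        recovered by registering the extracted features against the
#        prior map's point cloud frames (step S1023).
# T_ins: current robot pose in the inertial-navigation frame (step S1022).
T_org = pose_to_matrix([12.0, -3.5, 0.2], [0, 0, 0.1736, 0.9848])
T_ins = pose_to_matrix([1.0, 0.5, 0.0], [0, 0, 0, 1])

T_init = T_org @ T_ins  # initial pose of the robot in the prior map
```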
Based on the above embodiment, step S103 specifically includes the steps of:
First, according to the initial position attitude $T_{init}$ and the fixed rotation and translation relationship between the inertial navigation coordinate system and the lidar coordinate system, the current inertial navigation position attitude of the robot is converted to obtain the current position attitude of the robot in the laser point cloud coordinate system:
$T_{cur} = T_{init} \cdot \begin{bmatrix} R_{ext} & t_{ext} \\ 0 & 1 \end{bmatrix}$
where $T_{cur}$ represents the current position attitude of the robot in the laser point cloud coordinate system, $T_{init}$ represents the initial position attitude of the robot in the prior map, $R_{ext}$ represents the fixed rotation relationship between the inertial navigation coordinate system and the lidar coordinate system, and $t_{ext}$ represents the fixed translation relationship between the two.
Then, a local tracking point cloud map is constructed from $T_{cur}$, the currently acquired laser point cloud, and the currently acquired inertial navigation position attitude.
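A sketch of applying this fixed extrinsic rotation and translation under the same homogeneous-transform convention; the extrinsic values here are placeholders, not calibrated parameters:

```python
import numpy as np

# Initial pose of the robot in the prior map (from step S102);
# identity here only as a placeholder.
T_init = np.eye(4)

# Fixed rotation/translation between the inertial navigation and lidar
# frames; illustrative values, not real calibration.
R_ext = np.eye(3)
t_ext = np.array([0.10, 0.0, 0.25])  # lever arm in meters

T_ext = np.eye(4)
T_ext[:3, :3] = R_ext
T_ext[:3, 3] = t_ext

# Current pose of the robot in the laser point cloud coordinate system.
T_cur = T_init @ T_ext
```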
Based on the above embodiment, step S104 specifically includes the steps of:
First, the local tracking point cloud map at the current moment is processed. With the x direction of the laser coordinate system as the starting direction and the laser center as the origin of the coordinate system, the area is divided at a fixed distance and angle: each laser point is assigned, by its distance $d$ and angle $\theta$, to a matrix with a fixed number of rows $r$ and columns $s$. The row number $u$ and column number $v$ corresponding to each laser point are calculated as:
$u = \left\lfloor \frac{d}{\Delta d} \right\rfloor, \qquad v = \left\lfloor \frac{\theta}{\Delta\theta} \right\rfloor$
where $\Delta d$ and $\Delta\theta$ are the fixed distance and angle steps, and points farther than a preset distance threshold $d_{th}$ are discarded.
Each position in the matrix stores the height mean $H$ of all laser points of the local tracking point cloud map falling within that position range:
$H = \frac{1}{k}\sum_{l=1}^{k} z_l$
where $k$ is the number of laser points in the cell and $z_l$ are their heights.
The matrix is then processed by taking the mean of each column and of each row, giving the row matrix and the column matrix of the local tracking grid map.
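A minimal Python sketch of this polar binning; the distance step, angle step and threshold defaults are illustrative, not values fixed by the patent:

```python
import numpy as np

def polar_grid(points: np.ndarray, dd: float = 1.0,
               dtheta: float = np.deg2rad(6.0), d_th: float = 50.0):
    """Bin lidar points (N, 3) into a fixed r x s polar grid centered on the
    laser origin, measuring angles from the x axis; each cell stores the mean
    height H of its points."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    d = np.hypot(x, y)
    theta = np.arctan2(y, x) % (2.0 * np.pi)
    keep = d <= d_th                       # discard points beyond the threshold
    r = int(np.ceil(d_th / dd))
    s = int(np.ceil(2.0 * np.pi / dtheta))
    u = np.minimum((d[keep] / dd).astype(int), r - 1)          # row: distance
    v = np.minimum((theta[keep] / dtheta).astype(int), s - 1)  # column: angle
    H = np.zeros((r, s))
    counts = np.zeros((r, s))
    np.add.at(H, (u, v), z[keep])
    np.add.at(counts, (u, v), 1)
    H = np.divide(H, counts, out=np.zeros_like(H), where=counts > 0)
    row_matrix = H.mean(axis=0, keepdims=True)  # column means -> single row
    col_matrix = H.mean(axis=1, keepdims=True)  # row means -> single column
    return H, row_matrix, col_matrix
```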
Then, a binary tree index model is established for the matrix in the prior map, and binary tree index models are established for the row matrix and the column matrix of the local tracking grid map. An iterative search is performed on these binary tree index models to calculate the environment similarity and the identification. A threshold is set on the environment similarity: only when the similarity exceeds the threshold is the local tracking grid map judged to be sufficiently similar to the prior map data, so that the robot's position is determined to belong to a specific area of the prior map. The identification is then used to register the point cloud frame data, yielding the position attitude conversion parameter $\Delta T$.
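The patent does not fix the binary tree implementation or the similarity measure; one plausible sketch indexes the prior map's row/column-mean descriptors with a KD-tree and converts the nearest-neighbor distance into a similarity:

```python
import numpy as np
from scipy.spatial import cKDTree

def find_similar_area(prior_descriptors, prior_ids, query_descriptor, sim_th):
    """prior_descriptors: (K, D) stacked row/column-mean descriptors of the
    prior map entries; prior_ids: their identifications; query_descriptor:
    (D,) descriptor of the local tracking grid map. sim_th and the
    distance-to-similarity mapping are assumptions, not patent values."""
    tree = cKDTree(prior_descriptors)
    dist, idx = tree.query(query_descriptor, k=1)
    similarity = 1.0 / (1.0 + dist)   # one simple similarity definition
    if similarity > sim_th:
        return prior_ids[idx]         # id selects the point cloud frame
                                      # used for the final registration
    return None
```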
Based on the above embodiment, step S105 specifically includes the steps of:
Based on the current position attitude $T_{cur}$ of the robot, the position attitude conversion parameter $\Delta T$ is used to make the correction, obtaining the corrected position attitude $T_{cur}'$:
$T_{cur}' = \Delta T \cdot T_{cur}$
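Applying the correction under the same convention as the earlier sketches (left-multiplication by the conversion parameter is an assumption about the composition order):

```python
import numpy as np

T_cur = np.eye(4)    # current pose in the laser point cloud frame (step S103)
delta_T = np.eye(4)  # position attitude conversion parameter (step S104)

# Corrected pose; the loop then returns to step S103 with this pose.
T_cur_corrected = delta_T @ T_cur
```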
Referring to fig. 3, an embodiment of the present invention provides a robot positioning and attitude determination device, which may be any type of smart terminal, such as a mobile phone, a tablet computer, a personal computer, and the like.
Specifically, the positioning and attitude determination device of the robot comprises: one or more control processors and memory. The control processor and memory may be connected by a bus or other means.
The memory, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the positioning and attitude determination device of the robot in the embodiments of the present invention. The control processor executes the positioning and attitude determination method of the above method embodiments by running the non-transitory software programs, instructions and modules stored in the memory.
The memory may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required for at least one function. Furthermore, the memory may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory optionally includes memory located remotely from the control processor; such remote memory may be connected to the positioning and attitude determination device of the robot via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory and, when executed by the one or more control processors, perform the positioning and attitude determination method for a robot in the above method embodiments.
An embodiment of the present invention further provides a computer-readable storage medium storing computer-executable instructions which, when executed by one or more control processors, cause the one or more control processors to execute the positioning and attitude determination method for a robot in the above method embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the embodiments can be implemented by software plus a general hardware platform. Those skilled in the art will also appreciate that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (9)

1. A positioning and attitude determination method for a robot, applied to a control tool, the robot comprising a lidar and an inertial navigation measurement unit and moving within a defined area, the method comprising the steps of:
constructing a prior map according to the laser point cloud data and inertial navigation data collected by the robot within the defined area;
performing curvature calculation on the laser point cloud data to extract laser point cloud features, and acquiring, according to a repositioning method and based on the extracted laser point cloud features and the inertial navigation position attitude of the robot, the initial position attitude corresponding to the robot's inertial navigation position attitude in the prior map;
acquiring the current position attitude of the robot in a laser point cloud coordinate system based on the initial position attitude, and constructing a local tracking point cloud map based on the current position attitude of the robot in the laser point cloud coordinate system;
converting the local tracking point cloud map into a local tracking grid map, and acquiring the position attitude conversion parameters between the prior map and the local tracking grid map according to a similar environment detection method;
and correcting the current position attitude of the robot in the laser point cloud coordinate system based on the position attitude conversion parameters.
2. The positioning and attitude determination method for a robot according to claim 1, wherein constructing a prior map according to the laser point cloud data and inertial navigation data collected by the robot within the defined area comprises the steps of:
obtaining an inertial navigation position attitude from the inertial navigation data, the inertial navigation position attitude having the data format $(C, A, F)$, where $C$ represents the coordinates, $A$ represents the attitude, and $F$ represents the identification;
acquiring a point cloud grid map from the laser point cloud data, the point cloud grid map having the data format $(F', D, M, M_r, M_c)$, where $F'$ is an identification matching $F$, $D$ represents the current point cloud frame data, $M$ represents a matrix with a fixed number of rows and columns, each position of which stores the elevation mean of the laser points falling within the corresponding real geographic coordinate range, $M_r$ is the single-row matrix obtained by taking the mean of each column of $M$, and $M_c$ is the single-column matrix obtained by taking the mean of each row of $M$; the point cloud frame data $D$ has the format $D = (p, rgb, I, t)$, where $p$ represents the point coordinate information in the laser point cloud, $rgb$ represents the color information, $I$ represents the laser intensity information, and $t$ represents the time information;
and constructing the prior map from the acquired inertial navigation position attitude and point cloud grid map.
3. The positioning and attitude determination method for a robot according to claim 1, wherein acquiring the initial position attitude corresponding to the robot's inertial navigation position attitude in the prior map according to the repositioning method comprises the steps of:
converting the extracted laser point cloud features into the inertial navigation coordinate system according to the fixed rotation and translation relationship between the inertial navigation coordinate system and the lidar coordinate system, combining them with the inertial navigation position attitude of the robot, registering the laser point cloud features with the point cloud frame data in the prior map, and acquiring the position attitude of the origin of the inertial navigation coordinate system in the prior map;
and obtaining the initial position attitude corresponding to the current inertial navigation position attitude of the robot in the prior map from the current inertial navigation position attitude of the robot and the position attitude of the origin of the inertial navigation coordinate system in the prior map.
4. The positioning and attitude determination method for a robot according to claim 3, wherein acquiring the current position attitude of the robot in a laser point cloud coordinate system based on the initial position attitude, and constructing a local tracking point cloud map based on the current position attitude of the robot in the laser point cloud coordinate system, comprises the steps of:
converting the inertial navigation position attitude of the robot according to the initial position attitude and the fixed rotation and translation relationship between the inertial navigation coordinate system and the lidar coordinate system, to obtain the current position attitude of the robot in the laser point cloud coordinate system:
$T_{cur} = T_{init} \cdot \begin{bmatrix} R_{ext} & t_{ext} \\ 0 & 1 \end{bmatrix}$
where $T_{cur}$ represents the current position attitude of the robot in the laser point cloud coordinate system, $T_{init}$ represents the initial position attitude of the robot in the prior map, $R_{ext}$ represents the fixed rotation relationship between the inertial navigation coordinate system and the lidar coordinate system, and $t_{ext}$ represents the fixed translation relationship between the inertial navigation coordinate system and the lidar coordinate system;
and constructing a local tracking point cloud map from the current position attitude of the robot in the laser point cloud coordinate system, the currently acquired laser point cloud, and the currently acquired inertial navigation position attitude.
5. The positioning and attitude determination method for a robot according to claim 4, wherein acquiring the position attitude conversion parameters between the prior map and the local tracking grid map according to a similar environment detection method comprises the steps of:
acquiring the matrix data of the prior map and of the local tracking grid map respectively;
establishing binary tree index models for the matrix data of the prior map and for the matrix data of the local tracking grid map respectively, and performing an iterative search on the two binary tree index models to calculate the environment similarity and the identification;
and if the environment similarity is greater than a threshold, registering against the point cloud frame data of the prior map using the identification to obtain the position attitude conversion parameters.
6. The positioning and attitude determination method for a robot according to claim 5, wherein correcting the current position attitude of the robot in the laser point cloud coordinate system based on the position attitude conversion parameters comprises:
$T_{cur}' = \Delta T \cdot T_{cur}$
where $\Delta T$ represents the position attitude conversion parameter, $T_{cur}$ represents the current position attitude of the robot in the laser point cloud coordinate system, and $T_{cur}'$ represents the corrected current position attitude of the robot in the laser point cloud coordinate system.
7. The positioning and attitude determination method for a robot according to any one of claims 1 to 6, wherein the control tool comprises one of:
a control system of the robot; or
a server in communication connection with the robot; or
an upper computer in communication connection with the robot.
8. A positioning and attitude determination device for a robot, characterized by comprising: at least one control processor and a memory communicatively connected to the at least one control processor; wherein the memory stores instructions executable by the at least one control processor to enable the at least one control processor to perform the positioning and attitude determination method for a robot according to any one of claims 1 to 7.
9. A computer-readable storage medium storing computer-executable instructions for causing a computer to perform the positioning and attitude determination method for a robot according to any one of claims 1 to 7.
CN202110097185.2A 2021-01-25 2021-01-25 Robot positioning and attitude determining method, equipment and storage medium Active CN112414403B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110097185.2A CN112414403B (en) 2021-01-25 2021-01-25 Robot positioning and attitude determining method, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110097185.2A CN112414403B (en) 2021-01-25 2021-01-25 Robot positioning and attitude determining method, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112414403A CN112414403A (en) 2021-02-26
CN112414403B 2021-04-16

Family

ID=74783020

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110097185.2A Active CN112414403B (en) 2021-01-25 2021-01-25 Robot positioning and attitude determining method, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112414403B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113011517B (en) * 2021-03-30 2024-09-17 上海商汤临港智能科技有限公司 Positioning result detection method and device, electronic equipment and storage medium
CN113112478B (en) * 2021-04-15 2023-12-15 深圳市优必选科技股份有限公司 Pose recognition method and terminal equipment
CN115220009A (en) * 2021-04-15 2022-10-21 阿里巴巴新加坡控股有限公司 Data processing method and device, electronic equipment and computer storage medium
CN113965556B (en) * 2021-10-21 2023-10-31 飞纳经纬科技(北京)有限公司 Method, device and system for 3D real-time display of inertial navigation gesture based on web

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201116959D0 (en) * 2011-09-30 2011-11-16 Bae Systems Plc Vehicle localisation with 2d laser scanner and 3d prior scans
US9082008B2 (en) * 2012-12-03 2015-07-14 Honeywell International Inc. System and methods for feature selection and matching
US10565457B2 (en) * 2017-08-23 2020-02-18 Tusimple, Inc. Feature matching and correspondence refinement and 3D submap position refinement system and method for centimeter precision localization using camera-based submap and LiDAR-based global map
CN111060099B (en) * 2019-11-29 2023-08-04 畅加风行(苏州)智能科技有限公司 Real-time positioning method for unmanned automobile
US11725944B2 (en) * 2020-03-02 2023-08-15 Apollo Intelligent Driving Technology (Beijing) Co, Ltd. Method, apparatus, computing device and computer-readable storage medium for positioning
CN111427061A (en) * 2020-06-15 2020-07-17 北京云迹科技有限公司 Robot mapping method and device, robot and storage medium
CN112180382B (en) * 2020-09-28 2024-03-08 知行汽车科技(苏州)股份有限公司 Constant-speed model-based self-adaptive 3D-LSLAM positioning method, device and system

Also Published As

Publication number Publication date
CN112414403A (en) 2021-02-26

Similar Documents

Publication Publication Date Title
CN112414403B (en) Robot positioning and attitude determining method, equipment and storage medium
CN108986161B (en) Three-dimensional space coordinate estimation method, device, terminal and storage medium
WO2021233029A1 (en) Simultaneous localization and mapping method, device, system and storage medium
CN111968177B (en) Mobile robot positioning method based on fixed camera vision
CN108297115B (en) Autonomous repositioning method for robot
CN109425348B (en) Method and device for simultaneously positioning and establishing image
WO2021016854A1 (en) Calibration method and device, movable platform, and storage medium
CN111311650A (en) Point cloud data registration method and device and storage medium
CN113409459A (en) Method, device and equipment for producing high-precision map and computer storage medium
CN114463308B (en) Visual inspection method, device and processing equipment for visual angle photovoltaic module of unmanned aerial vehicle
CN110634138A (en) Bridge deformation monitoring method, device and equipment based on visual perception
CN114485698A (en) Intersection guide line generating method and system
CN113313765B (en) Positioning method, positioning device, electronic equipment and storage medium
CN117075171B (en) Pose information determining method, device and equipment of laser radar and storage medium
CN112219225A (en) Positioning method, system and movable platform
CN114092771A (en) Multi-sensing data fusion method, target detection device and computer equipment
CN113686240B (en) Positioning method and device based on electric power pole tower, computer equipment and storage medium
CN113763468B (en) Positioning method, device, system and storage medium
CN112001247B (en) Multi-target detection method, equipment and storage device
CN114155415A (en) Multi-data fusion vehicle detection method, system, equipment and storage medium
CN112598736A (en) Map construction based visual positioning method and device
CN113313764B (en) Positioning method, positioning device, electronic equipment and storage medium
CN116660916B (en) Positioning method, mapping method and electronic equipment for orchard mobile robot
CN118379330B (en) Target tracking method based on four cameras, clustered robot and storage medium
CN118411499B (en) Target detection method based on tetrahedron model, clustered robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A positioning and attitude determination method, equipment and storage medium of robot

Effective date of registration: 20220408

Granted publication date: 20210416

Pledgee: Bank of Changsha Co., Ltd., Science and Technology Branch

Pledgor: Hunan Beidou Microchip Data Technology Co., Ltd.

Registration number: Y2022980003980

PC01 Cancellation of the registration of the contract for pledge of patent right
PC01 Cancellation of the registration of the contract for pledge of patent right

Granted publication date: 20210416

Pledgee: Bank of Changsha Co., Ltd., Science and Technology Branch

Pledgor: Hunan Beidou Microchip Data Technology Co., Ltd.

Registration number: Y2022980003980