CN117999207A - Motion state estimation method and device - Google Patents

Motion state estimation method and device

Info

Publication number
CN117999207A
Authority
CN
China
Prior art keywords
yaw angle
grid
point cloud
grid map
yaw
Prior art date
Legal status
Pending
Application number
CN202180100707.8A
Other languages
Chinese (zh)
Inventor
郑争兴
羌波
刘志洋
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN117999207A


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A motion state estimation method and device. The method includes: acquiring a first local grid map (S501), wherein the value of each grid in the first local grid map is used for indicating static points accumulated in the grid based on a first historical point cloud; and registering the static points in a first frame point cloud with the first local grid map to obtain a first yaw angle (S502). The motion state estimation method and device can improve the estimation accuracy of the yaw angle.

Description

Motion state estimation method and device Technical Field
The application relates to the technical field of intelligent driving, in particular to a motion state estimation method and device.
Background
Intelligent driving is a popular direction in the automotive field. As the number of vehicle-mounted sensors continues to grow, vehicles are becoming increasingly intelligent, while the challenges to vehicle driving safety are also becoming more serious.
Vehicle motion state information is an important factor affecting intelligent driving safety. Fig. 1 shows a schematic diagram of the relationship between vehicle motion state information and intelligent driving functions. As shown in fig. 1, core intelligent driving functions such as high-precision positioning, decision planning and motion control need the transverse and longitudinal speeds and the yaw rate provided by the motion state estimation module. These functions are closely related to the driving safety of the vehicle, so the accuracy requirement on vehicle motion state estimation is high. As shown in fig. 1, the sensors that can be used for vehicle motion state estimation include: gyroscopes, wheel speed meters, radars (radio detection and ranging, Radar), cameras, lidars, and the like. Conventionally, a wheel speed meter is used for estimating the transverse and longitudinal speeds of a vehicle, and a gyroscope is used for estimating the yaw rate of the vehicle. The accuracy of the transverse and longitudinal speeds estimated by the wheel speed meter is limited by factors such as tire pressure and wheel slip, and the yaw rate estimated by the gyroscope has a zero bias; therefore, the estimation accuracy of the conventional method can hardly meet the requirements of intelligent driving functions.
Radar has Doppler velocity measurement capability and offers better velocity measurement accuracy than a camera or a lidar. Since a vehicle with a driver-assistance function is basically equipped with at least one radar, estimating the motion state of the vehicle based on radar does not require additional sensor cost, and at the same time has a certain accuracy advantage over other sensors. In the related art, however, when radar is used to estimate the yaw angle of the vehicle, there are problems of large deviation and low accuracy.
Disclosure of Invention
In view of this, a motion state estimation method and apparatus are proposed that can improve the accuracy of estimating the yaw angle.
In a first aspect, an embodiment of the present application provides a motion state estimation method, including:
Acquiring a first local grid map, wherein the value of each grid in the first local grid map is used for indicating static points accumulated in the grid based on a first historical point cloud; registering a static point in a first frame point cloud with the first local grid map to obtain a first yaw angle, wherein the first yaw angle represents the yaw angle of the target device when the first frame point cloud is acquired.
In the embodiment of the application, the static points in the current point cloud are registered with the local grid map in which the static points in the historical point clouds are accumulated, which compensates for the sparsity of static points in a single-frame point cloud, effectively improves the accuracy of the yaw angle of the target device, and thereby improves the estimation accuracy and reliability of the yaw rate.
In a first possible implementation manner of the method according to the first aspect, the method further includes: acquiring a second local grid map, wherein the value of each grid in the second local grid map is used for indicating static points accumulated in the grid based on a second historical point cloud; registering static points in a second frame point cloud with the second local grid map to obtain a second yaw angle, wherein the second yaw angle represents the yaw angle of the target device when the second frame point cloud is acquired, and the acquisition time of the second frame point cloud is before the acquisition time of the first frame point cloud.
In a second possible implementation manner of the method according to the first possible implementation manner of the first aspect, the method further includes: a first yaw rate is determined based on the first yaw angle, the second yaw angle, and a first time interval, the first time interval representing a time interval between a time of acquisition of the second frame point cloud and a time of acquisition of the first frame point cloud, the first yaw rate representing a yaw rate of the target device within the first time interval.
In the embodiment of the application, by improving the accuracy of the yaw angle of the target device, the accuracy and reliability of the yaw rate estimation of the target device are improved.
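For illustration only, a minimal Python sketch of this relation follows; the function name and the angle-wrapping step are assumptions, not taken from the embodiment:

```python
import math

def first_yaw_rate(first_yaw: float, second_yaw: float, first_time_interval: float) -> float:
    """Yaw rate over the first time interval from the two registered yaw angles."""
    diff = (first_yaw - second_yaw + math.pi) % (2.0 * math.pi) - math.pi  # wrap to [-pi, pi)
    return diff / first_time_interval

# Example: first_yaw_rate(0.105, 0.100, 0.05) -> 0.1 rad/s
```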
In a third possible implementation manner of the method according to the first aspect or any one of the possible implementation manners of the first aspect, the registering the static point in the first frame point cloud with the first local grid map to obtain a first yaw angle includes: acquiring a plurality of candidate yaw angles; determining a grid accumulation value corresponding to each candidate yaw angle, wherein the grid accumulation value corresponding to the candidate yaw angle is used for representing the distribution condition of the static points accumulated based on the first history point cloud in the first local grid map under the condition of rotating to the candidate yaw angle; comparing a grid accumulation value corresponding to each candidate yaw angle with a stored grid accumulation value, and updating the stored grid accumulation value and a stored yaw angle based on a comparison result, wherein the stored grid accumulation value is used for representing the distribution condition of static points accumulated based on a first history point cloud in the first local grid map under the condition of rotating to the stored yaw angle; the first yaw angle is determined from the stored yaw angles.
In the embodiment of the application, a plurality of candidate yaw angles are generated and traversed to find the yaw angle with the maximum grid accumulation value, which reduces the computational cost of the search and at the same time avoids getting trapped in a local optimum.
In a fourth possible implementation manner of the method according to the third possible implementation manner of the first aspect, the obtaining a plurality of candidate yaw angles includes: compensating the second yaw angle according to the second yaw rate and the first time interval to obtain a compensated yaw angle, wherein the second yaw rate represents the yaw rate of the target equipment in the second time interval, and the ending time of the second time interval is the starting time of the first time interval; and generating the plurality of candidate yaw angles according to a preset angle resolution within a preset angle range of the compensation yaw angle.
In the embodiment of the application, in a single-radar scenario, the compensation yaw angle is estimated based on the motion state information of the previous period, and the candidate yaw angles are generated around the compensation yaw angle, so that the candidate yaw angles generated in the single-radar scenario are near the actual yaw angle, thereby improving the accuracy of the first yaw angle.
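A minimal Python sketch of this candidate generation follows, assuming a hypothetical preset angle range of 2° and preset angle resolution of 0.1° (neither value is specified in the embodiment):

```python
import numpy as np

def candidate_yaw_angles(second_yaw: float, second_yaw_rate: float, first_time_interval: float,
                         angle_range: float = np.deg2rad(2.0),
                         angle_resolution: float = np.deg2rad(0.1)) -> np.ndarray:
    """Generate candidate yaw angles around the compensation yaw angle."""
    compensation_yaw = second_yaw + second_yaw_rate * first_time_interval  # compensate the second yaw angle
    return np.arange(compensation_yaw - angle_range,
                     compensation_yaw + angle_range + 0.5 * angle_resolution,
                     angle_resolution)
```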
In a fifth possible implementation manner of the method according to the third possible implementation manner of the first aspect, the obtaining a plurality of candidate yaw angles includes: acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars; acquiring radar estimated yaw angles according to the plurality of radar speeds and the plurality of mounting angles; and generating the plurality of candidate yaw angles according to the preset angle resolution within the preset angle range of the radar estimated yaw angle.
In the embodiment of the application, in a multi-radar scenario, the yaw angle estimated based on the plurality of radars is further adjusted and corrected, thereby improving the accuracy of the first yaw angle.
In a sixth possible implementation manner of the method according to any one of the third possible implementation manner to the fifth possible implementation manner of the first aspect, the determining a grid cumulative value corresponding to each candidate yaw angle includes: for any one of the plurality of candidate yaw angles: rotating a plurality of static points in the first frame point cloud according to the candidate yaw angle to obtain a plurality of first rotation points, wherein each static point in the first frame point cloud corresponds to one first rotation point; according to a preset grid resolution, carrying out rasterization on the plurality of first rotation points to obtain grids of each first rotation point in the first local grid map; and accumulating the grid values of each first rotation point to obtain grid accumulated values corresponding to the candidate yaw angles.
In the embodiment of the application, by accumulating the values of the grids in which the first rotation points are located and comparing the accumulated values, a candidate yaw angle closer to the actual yaw angle can be found, thereby improving the accuracy of the first yaw angle.
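The traversal described above can be sketched in Python as follows; the map origin parameter `origin_xy`, the handling of points outside the map, and taking the mean of tied stored yaw angles are assumptions, since the embodiment does not fix these details:

```python
import numpy as np

def grid_accumulation_value(static_points_xy: np.ndarray,  # (N, 2) static points of the first frame
                            candidate_yaw: float,
                            grid_map: np.ndarray,           # first local grid map, shape (rows, cols)
                            grid_resolution: float,
                            origin_xy: tuple) -> float:
    """Sum of map values in the grids hit by the static points rotated to the candidate yaw angle."""
    c, s = np.cos(candidate_yaw), np.sin(candidate_yaw)
    rotated = static_points_xy @ np.array([[c, -s], [s, c]]).T          # first rotation points
    cols = np.floor((rotated[:, 0] - origin_xy[0]) / grid_resolution).astype(int)
    rows = np.floor((rotated[:, 1] - origin_xy[1]) / grid_resolution).astype(int)
    inside = (rows >= 0) & (rows < grid_map.shape[0]) & (cols >= 0) & (cols < grid_map.shape[1])
    return float(grid_map[rows[inside], cols[inside]].sum())

def best_yaw(static_points_xy, candidates, grid_map, grid_resolution, origin_xy):
    """Traverse the candidates and keep the yaw angle(s) with the largest accumulation value."""
    stored_value, stored_yaws = -np.inf, []
    for yaw in candidates:
        value = grid_accumulation_value(static_points_xy, yaw, grid_map, grid_resolution, origin_xy)
        if value > stored_value:
            stored_value, stored_yaws = value, [yaw]
        elif value == stored_value:
            stored_yaws.append(yaw)                          # keep all yaw angles tied for the maximum
    return float(np.mean(stored_yaws))                       # e.g. take the mean of the stored yaw angles
```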
In a seventh possible implementation manner of the method according to any one of the third to sixth possible implementation manners of the first aspect, the updating the stored grid accumulation value and the stored yaw angle based on the comparison result includes: updating the stored grid accumulation value to the grid accumulation value corresponding to the candidate yaw angle and updating the stored yaw angle to the candidate yaw angle in the case that the grid accumulation value corresponding to the candidate yaw angle is larger than the stored grid accumulation value; and in the case that the grid accumulation value corresponding to the candidate yaw angle is equal to the stored grid accumulation value, keeping the stored grid accumulation value unchanged, and adding the candidate yaw angle to the stored yaw angles.
In the embodiment of the application, the accuracy of the first yaw angle can be improved by finding the candidate yaw angle with the largest corresponding grid accumulation value.
In an eighth possible implementation manner of the method according to the first aspect or any one of the possible implementation manners of the first aspect, the acquiring a first local grid map includes: acquiring a third local grid map, wherein the value of each grid in the third local grid map represents the number of static points accumulated in the grid based on the first historical point cloud; determining a first translation distance according to a first movement speed, the first time interval and a first translation allowance, wherein the first movement speed represents the speed of the target equipment in the first time interval under a ground coordinate system, the first translation allowance represents a difference between the translation amount of a local grid map and the actual translation amount of the target equipment in a second time interval, the ending moment of the second time interval is the same as the starting moment of the first time interval, and the first translation distance is used for indicating the actual translation amount of the target equipment in the first time interval; determining the number of the translation grids according to the first translation distance and a preset grid resolution; and carrying out translation processing on the third local grid map according to the number of the translation grids to obtain the first local grid map, wherein the grids of the first local grid map comprise overlapped grids overlapped with the third local grid map and extended grids which are not overlapped with the third local grid map, the values of the overlapped grids are consistent with the values of corresponding grids in the third local grid map, and the values of the extended grids are initial values.
In the embodiment of the application, the accuracy of the local grid map alignment is improved by translating the local grid map, so that the accuracy of the first yaw angle obtained by registration is improved.
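A Python sketch of such a translation step follows, under assumed conventions: the translation distance includes the previous translation allowance, the sub-grid remainder is carried forward, and the map content shifts toward lower indices as the target device advances; the exact conventions are not fixed by the embodiment:

```python
import numpy as np

def translate_local_grid_map(old_map: np.ndarray, move_speed_xy, first_time_interval: float,
                             margin_xy, grid_resolution: float, initial_value: float = 0.0):
    """Shift the local grid map by a whole number of grids and carry the sub-grid remainder
    forward as the new translation allowance."""
    new_map = np.full_like(old_map, initial_value)
    shifts, new_margin = [], []
    for speed, margin in zip(move_speed_xy, margin_xy):          # x axis, then y axis
        distance = speed * first_time_interval + margin          # first translation distance incl. previous allowance
        n_grids = int(np.floor(distance / grid_resolution))      # number of translation grids
        shifts.append(n_grids)
        new_margin.append(distance - n_grids * grid_resolution)  # remainder kept for the next period
    rows, cols = old_map.shape
    dx, dy = shifts                                              # assumed shift convention
    if abs(dy) >= rows or abs(dx) >= cols:
        return new_map, tuple(new_margin)                        # shift larger than the map: no overlap remains
    # Overlapped grids keep their accumulated values; extended grids keep the initial value.
    new_map[max(-dy, 0):rows + min(-dy, 0), max(-dx, 0):cols + min(-dx, 0)] = \
        old_map[max(dy, 0):rows + min(dy, 0), max(dx, 0):cols + min(dx, 0)]
    return new_map, tuple(new_margin)
```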
In a ninth possible implementation manner of the method according to the eighth possible implementation manner of the first aspect, the method further includes: acquiring the first frame point cloud, wherein the first frame point cloud is used for indicating azimuth angles and Doppler speeds of a plurality of target points; extracting a static point in the first frame point cloud from the plurality of target points based on azimuth angles and Doppler speeds of the plurality of target points; determining a first radar speed according to the azimuth angle and the Doppler speed of the static point, wherein the first radar speed represents the speed of the radar in the first time interval under a radar coordinate system; the first moving speed is determined according to the first radar speed, the mounting angle of the first radar, and the second yaw angle.
In the embodiment of the application, in a single radar scene, after determining the speed of the radar in a radar coordinate system based on the azimuth angle and the Doppler speed, the speed of the target device in the coordinate system of the target device is determined based on the speed of the radar in the radar coordinate system.
In a tenth possible implementation form of the method according to the eighth possible implementation form of the first aspect, the method further comprises: acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars; acquiring a first target device speed according to the radar speeds and the installation angles, wherein the first target device speed represents the speed of the target device in the first time interval under a target device coordinate system; the first movement speed is determined from the first target device speed and the second yaw angle.
In the embodiment of the application, under a multi-radar scene, the speed of the target device under the coordinate system of the target device is measured by a plurality of radars.
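A simplified Python sketch of this multi-radar fusion follows; averaging the per-radar speeds and neglecting lever-arm effects of the yaw rotation are simplifying assumptions, and the embodiment may instead use, e.g., a least-squares combination:

```python
import numpy as np

def vehicle_speed_from_radars(radar_speeds, mounting_angles):
    """Rotate each radar's (vrx, vry) into the target device frame and average them."""
    speeds = []
    for (vrx, vry), beta in zip(radar_speeds, mounting_angles):
        c, s = np.cos(beta), np.sin(beta)
        speeds.append((c * vrx - s * vry, s * vrx + c * vry))
    return np.mean(np.asarray(speeds), axis=0)                   # first target device speed (Vx, Vy)

def movement_speed(device_speed_xy, second_yaw):
    """Rotate the device-frame speed by the second yaw angle into the ground frame."""
    c, s = np.cos(second_yaw), np.sin(second_yaw)
    vx, vy = device_speed_xy
    return np.array([c * vx - s * vy, s * vx + c * vy])          # first movement speed in the ground coordinate system
```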
In an eleventh possible implementation form of the method according to the first aspect as such or any of the preceding possible implementation forms of the first aspect, the method further comprises: rotating a plurality of static points in the first frame point cloud according to the first yaw angle to obtain a plurality of second rotation points, wherein each static point in the first frame point cloud corresponds to one second rotation point; according to the preset grid resolution, carrying out rasterization on the plurality of second rotation points to obtain grids of each second rotation point in the first local grid map; and accumulating the number of the second rotation points in the grids and the values of the grids in the first local grid map aiming at the grids where each second rotation point is located to obtain an updated first local grid map.
In the embodiment of the application, the accumulated static points of the first frame point cloud used in the current period are added to the local grid map used in the current period, so that the local grid map is updated, which facilitates its use in the next period and improves the accuracy of the yaw rate estimation in the next period.
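A Python sketch of this map update follows (the `origin_xy` parameter and the out-of-map check are assumptions):

```python
import numpy as np

def update_local_grid_map(grid_map: np.ndarray, static_points_xy: np.ndarray,
                          first_yaw: float, grid_resolution: float, origin_xy: tuple) -> np.ndarray:
    """Rotate the static points of the current frame by the first yaw angle, rasterize them,
    and add the per-grid point counts to the local grid map."""
    c, s = np.cos(first_yaw), np.sin(first_yaw)
    rotated = static_points_xy @ np.array([[c, -s], [s, c]]).T        # second rotation points
    cols = np.floor((rotated[:, 0] - origin_xy[0]) / grid_resolution).astype(int)
    rows = np.floor((rotated[:, 1] - origin_xy[1]) / grid_resolution).astype(int)
    inside = (rows >= 0) & (rows < grid_map.shape[0]) & (cols >= 0) & (cols < grid_map.shape[1])
    np.add.at(grid_map, (rows[inside], cols[inside]), 1)              # accumulate point counts per grid
    return grid_map
```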
In a twelfth possible implementation manner of the method according to the first aspect or any one of the possible implementation manners of the first aspect, the method further includes: according to the preset map size and the preset grid resolution, an initial local grid map is constructed, the value of each grid in the initial local grid map is an initial value, and the origin of coordinates of the initial local grid map is at a preset fixed position of the target equipment; and respectively setting an initial yaw angle, an initial yaw rate, an initial translation allowance and an initial stored yaw angle, wherein the value of a grid accumulated value corresponding to the initial stored yaw angle is an initial value.
In a second aspect, an embodiment of the present application provides a motion state estimation apparatus, the apparatus including:
a first acquisition module for acquiring a first local grid map, wherein the value of each grid in the first local grid map is used for indicating static points accumulated in the grid based on a first history point cloud;
the first registration module is used for registering the static point in the first frame point cloud with the first local grid map acquired by the first acquisition module to obtain a first yaw angle, wherein the first yaw angle represents the yaw angle of the target equipment when the first frame point cloud is acquired.
In the embodiment of the application, the static points in the current point cloud are registered with the local grid map in which the static points in the historical point clouds are accumulated, which compensates for the sparsity of static points in a single-frame point cloud, effectively improves the accuracy of the yaw angle of the target device, and thereby improves the estimation accuracy and reliability of the yaw rate.
In a first possible implementation manner of the apparatus according to the second aspect, the apparatus further includes:
A second acquisition module for acquiring a second local grid map, wherein the value of each grid in the second local grid map is used for indicating static points accumulated in the grid based on a second history point cloud;
the second registration module is used for registering the static point in the second frame point cloud with the second local grid map acquired by the second acquisition module to obtain a second yaw angle, the second yaw angle represents the yaw angle of the target equipment when the second frame point cloud is acquired, and the acquisition time of the second frame point cloud is before the acquisition time of the first frame point cloud.
In a second possible implementation manner of the apparatus according to the first possible implementation manner of the second aspect, the apparatus further includes:
The first determining module is configured to determine a first yaw rate based on the first yaw angle obtained by the first registering module, the second yaw angle obtained by the second registering module, and a first time interval, where the second yaw angle represents a yaw angle of the target device when a second frame point cloud is acquired, the acquisition time of the second frame point cloud is before the acquisition time of the first frame point cloud, the first time interval represents a time interval between the acquisition time of the second frame point cloud and the acquisition time of the first frame point cloud, and the first yaw rate represents a yaw rate of the target device in the first time interval.
In a third possible implementation manner of the apparatus according to the second aspect or any one of the possible implementation manners of the second aspect, the first registration module includes:
An acquisition unit configured to acquire a plurality of candidate yaw angles;
a first determining unit configured to determine a grid accumulation value corresponding to each candidate yaw angle, where the grid accumulation value corresponding to the candidate yaw angle is used to represent a distribution of the static points accumulated based on the first history point cloud in the first local grid map when the static points are rotated to the candidate yaw angle;
A comparison unit configured to compare a grid accumulation value corresponding to each candidate yaw angle with a stored grid accumulation value, and update the stored grid accumulation value and the stored yaw angle based on a comparison result, where the stored grid accumulation value is used to represent a distribution of the static points accumulated based on the first history point cloud in the first local grid map when rotated to the stored yaw angle;
And a second determination unit configured to determine the first yaw angle from the stored yaw angles.
In a fourth possible implementation manner of the apparatus according to the third possible implementation manner of the second aspect, the obtaining unit is configured to:
compensating the second yaw angle according to the second yaw rate and the first time interval to obtain a compensated yaw angle, wherein the second yaw rate represents the yaw rate of the target equipment in the second time interval, and the ending time of the second time interval is the starting time of the first time interval;
and generating the plurality of candidate yaw angles according to a preset angle resolution within a preset angle range of the compensation yaw angle.
In a fifth possible implementation manner of the apparatus according to the third possible implementation manner of the second aspect, the obtaining unit is configured to:
acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars;
acquiring radar estimated yaw angles according to the plurality of radar speeds and the plurality of mounting angles;
And generating the plurality of candidate yaw angles according to the preset angle resolution within the preset angle range of the radar estimated yaw angle.
According to any one of the third to fifth possible implementation manners of the second aspect, in a sixth possible implementation manner of the apparatus, the first determining unit is configured to:
For any one of the plurality of candidate yaw angles:
Rotating a plurality of static points in the first frame point cloud according to the candidate yaw angle to obtain a plurality of first rotation points, wherein each static point in the first frame point cloud corresponds to one first rotation point;
according to a preset grid resolution, carrying out rasterization on the plurality of first rotation points to obtain grids of each first rotation point in the first local grid map;
And accumulating the grid values of each first rotation point to obtain grid accumulated values corresponding to the candidate yaw angles.
According to any one of the third to sixth possible implementation manners of the second aspect, in a seventh possible implementation manner of the apparatus, the comparing unit is configured to:
Updating the stored grid accumulation value to the grid accumulation value corresponding to the candidate yaw angle and updating the stored yaw angle to the candidate yaw angle under the condition that the grid accumulation value corresponding to the candidate yaw angle is larger than the stored grid accumulation value;
and in the case that the grid accumulation value corresponding to the candidate yaw angle is equal to the stored grid accumulation value, keeping the stored grid accumulation value unchanged, and adding the candidate yaw angle to the stored yaw angles.
In an eighth possible implementation manner of the apparatus according to the second aspect or any one of the possible implementation manners of the second aspect, the first obtaining module is configured to:
Acquiring a third local grid map, wherein the value of each grid in the third local grid map represents static points accumulated in the grid based on the first history point cloud;
Determining a first translation distance according to a first movement speed, the first time interval and a first translation allowance, wherein the first movement speed represents the speed of the target equipment in the first time interval under a ground coordinate system, the first translation allowance represents a difference between the translation amount of a local grid map and the actual translation amount of the target equipment in a second time interval, the ending moment of the second time interval is the same as the starting moment of the first time interval, and the first translation distance is used for indicating the actual translation amount of the target equipment in the first time interval;
determining the number of the translation grids according to the first translation distance and a preset grid resolution;
And carrying out translation processing on the third local grid map according to the number of the translation grids to obtain the first local grid map, wherein the grids of the first local grid map comprise overlapped grids overlapped with the third local grid map and extended grids which are not overlapped with the third local grid map, the values of the overlapped grids are consistent with the values of corresponding grids in the third local grid map, and the values of the extended grids are initial values.
In a ninth possible implementation manner of the apparatus according to the eighth possible implementation manner of the second aspect, the apparatus further includes:
a third acquisition module, configured to acquire the first frame point cloud, where the first frame point cloud is used to indicate azimuth angles and doppler speeds of multiple target points;
An extracting module, configured to extract a static point in the first frame point cloud from the plurality of target points based on azimuth angles and doppler velocities of the plurality of target points;
A second determining module, configured to determine a first radar speed according to an azimuth angle and a doppler speed of the static point, where the first radar speed represents a speed of the radar in the first time interval under a radar coordinate system;
And a third determining module for determining the first moving speed according to the first radar speed, the mounting angle of the first radar and the second yaw angle.
In a tenth possible implementation manner of the apparatus according to the eighth possible implementation manner of the second aspect, the apparatus further includes:
a fourth acquisition module for acquiring a plurality of radar speeds and a plurality of installation angles based on a plurality of radars;
A fifth obtaining module, configured to obtain a first target device speed according to the plurality of radar speeds and the plurality of installation angles, where the first target device speed represents a speed of the target device in the first time interval under a target device coordinate system;
and a fourth determining module, configured to determine the first movement speed according to the first target device speed and the second yaw angle.
In an eleventh possible implementation form of the apparatus according to the second aspect as such or any of the possible implementation forms of the second aspect, the apparatus further comprises:
The rotating module is used for rotating the plurality of static points in the first frame point cloud according to the first yaw angle to obtain a plurality of second rotation points, wherein each static point in the first frame point cloud corresponds to one second rotation point;
The processing module is used for rasterizing the plurality of second rotation points according to a preset grid resolution to obtain grids of each second rotation point in the first local grid map;
And the accumulating module is used for accumulating the number of the second rotation points in the grids and the values of the grids in the first local grid map aiming at the grids where each second rotation point is positioned to obtain an updated first local grid map.
In a twelfth possible implementation manner of the apparatus according to the second aspect or any one of the possible implementation manners of the second aspect, the apparatus further includes:
the construction module is used for constructing an initial local grid map according to the preset map size and the preset grid resolution, wherein the value of each grid in the initial local grid map is an initial value, and the origin of coordinates of the initial local grid map is at a preset fixed position of the target equipment;
The setting module is used for respectively setting an initial yaw angle, an initial yaw rate, an initial translation allowance and an initial stored yaw angle, wherein the value of a grid accumulated value corresponding to the initial stored yaw angle is an initial value.
In a third aspect, an embodiment of the present application provides a motion state estimation apparatus, the apparatus including a processor; a memory for storing processor-executable instructions; wherein the processor is configured to implement the motion state estimation method of the first aspect or one or several of the plurality of possible implementations of the first aspect when executing the instructions.
In a fourth aspect, embodiments of the present application provide an electronic device, which may perform the motion state estimation method of the first aspect or one or more of the multiple possible implementations of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer readable storage medium having stored thereon computer program instructions which when executed by a processor implement the motion state estimation method of the first aspect or one or more of the possible implementations of the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product comprising computer readable code, or a non-transitory computer readable storage medium carrying computer readable code, wherein when the computer readable code runs in an electronic device, a processor in the electronic device performs the motion state estimation method of the first aspect or one or more of the possible implementations of the first aspect.
In a seventh aspect, a terminal is provided, which comprises the motion state estimation device of the above second or third aspect. Further, the terminal can be intelligent transportation equipment, intelligent manufacturing equipment, intelligent robots, intelligent household equipment or mapping equipment and the like. In some possible implementations, the terminal is a vehicle, such as an intelligent driving vehicle, an unmanned vehicle, or a vehicle with a driving assistance function.
These and other aspects of the application will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features and aspects of the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram showing the relationship of vehicle motion status information to intelligent driving functions;
fig. 2 is a schematic diagram of an implementation scenario of a motion state estimation method according to an embodiment of the present application;
FIG. 3 shows an exemplary schematic of a coordinate system provided by an embodiment of the present application;
FIG. 4 shows an exemplary schematic of extracted static points in an embodiment of the application;
FIG. 5 shows an exemplary schematic diagram of a local grid map according to an embodiment of the application;
Fig. 6 illustrates a motion state estimation method provided by an embodiment of the present application;
FIG. 7 illustrates an exemplary schematic view of a local grid map of an embodiment of the present application;
FIG. 8 is a schematic diagram of a rasterization process for a first rotation point in an embodiment of the present application;
fig. 9 is a flowchart of an embodiment of a motion state estimation method according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a motion state estimation device according to an embodiment of the present application;
fig. 11 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Various exemplary embodiments, features and aspects of the application will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
In addition, numerous specific details are set forth in the following description in order to provide a better illustration of the application. It will be understood by those skilled in the art that the present application may be practiced without some of these specific details. In some instances, well known methods, procedures, components, and circuits have not been described in detail so as not to obscure the present application.
The embodiment of the application provides a motion state estimation method, which can be applied to scenarios in which a radar is used for motion state estimation and can improve the estimation accuracy of the yaw rate. For example, target devices that can perform motion state estimation by using the motion state estimation method provided by the embodiment of the application include, but are not limited to, vehicles, robots, unmanned aerial vehicles, and the like. In the embodiment of the present application, a vehicle is taken as an example to describe the provided motion state estimation method, and the relevant parameters of the target device may refer to the relevant parameters of the vehicle; for example, the first target device speed information may refer to the first vehicle speed information, and the yaw rate of the target device in the first time interval may refer to the yaw rate of the vehicle in the first time interval. The motion state estimation processes and parameters of other target devices such as robots and unmanned aerial vehicles can refer to those of the vehicle and are not described in detail in the embodiment of the application.
Fig. 2 is a schematic diagram of an implementation scenario of a motion state estimation method according to an embodiment of the present application. As shown in fig. 2, this implementation scenario is a vehicle motion state estimation scenario. A radar and motion state estimation module is included in the vehicle. The number of radars may be one or more. The radar may be deployed around the vehicle. The motion state estimation module may be deployed in an intelligent driving platform. The intelligent driving platform can be arranged at the positions of a vehicle driving position, a vehicle central control area and the like. One or more radars may access the motion state estimation module through a coaxial cable or the like.
The radar may provide a point cloud to the motion state estimation module, and the point cloud may be used to indicate a plurality of target points, as well as the distance of each target point from the radar, the azimuth angle of each target point, and the Doppler velocity of each target point. The Doppler velocity of a target point is the radial velocity of the target point relative to the radar, measured by applying the Doppler principle. In the case of multiple radars, the point clouds provided by the multiple radars can be spliced into one frame of point cloud; the embodiment of the application does not limit the splicing manner of the point clouds. The radar may periodically provide a point cloud to the motion state estimation module. That is, each frame of point cloud indicates, for its acquisition time, the distance of each target point from the radar, the azimuth angle of each target point, and the Doppler velocity of each target point. In the embodiment of the application, the radar includes, but is not limited to, a microwave radar, a millimeter wave radar, and the like. The radar deployed in the vehicle may use any one of the above radars alone or may use any several of the above radars at the same time.
The target point indicated by the point cloud may be a static point or a dynamic point. Wherein the static points may represent objects whose position is unchanged in the geodetic coordinate system, such as roadside buildings, traffic lights, stationary vehicles, etc. The dynamic point may represent an object whose position has changed in the geodetic coordinate system, for example, a running vehicle, a walking passer-by, a goods in transit, or the like.
The motion state estimation module can obtain motion state information of the vehicle, such as the transverse and longitudinal speeds of the vehicle (including the lateral speed and the longitudinal speed), the yaw rate of the vehicle, and the like, according to the point cloud provided by the radar.
The intelligent driving platform can realize intelligent driving functions such as high-precision positioning, decision planning, motion control and the like based on the motion state information of the vehicle provided by the motion state estimation module.
The principle of the motion state estimation method provided by the application is as follows: the static points in the current point cloud are registered with a local grid map in which the static points of the historical point clouds have been accumulated, which compensates for the sparsity of static points in a single-frame point cloud, effectively improves the estimation accuracy and reliability of the vehicle yaw angle, and thereby improves the estimation accuracy and reliability of the vehicle yaw rate.
In order to facilitate understanding of the various speeds involved in the embodiments of the present application, the coordinate system involved in the embodiments of the present application will be first described below. Fig. 3 shows an exemplary schematic diagram of a coordinate system provided by an embodiment of the present application. As shown in fig. 3, the embodiment of the present application involves three coordinate systems, namely, a ground coordinate system, a vehicle coordinate system, and a radar coordinate system.
The origin of coordinates of the ground coordinate system is a fixed position on the ground, and the origin of coordinates of the ground coordinate system does not change with the movement of the vehicle. The X-axis direction and the Y-axis direction of the ground coordinate system are shown as X1 and Y1, respectively, in fig. 3.
The origin of coordinates of the vehicle coordinate system is a fixed position on the vehicle, such as the center of the vehicle or the center of a center console of the vehicle, and the origin of coordinates of the vehicle coordinate system moves synchronously in the ground coordinate system along with the movement of the vehicle. The X-axis direction and the Y-axis direction of the vehicle coordinate system are shown as X2 and Y2, respectively, in fig. 3.
The origin of coordinates of the radar coordinate system is a fixed position on the radar, such as the center of the radar, and the origin of coordinates of the radar coordinate system can synchronously move in the ground coordinate system along with the movement of the vehicle where the radar is located. The X-axis direction and the Y-axis direction of the radar coordinate system are shown as X3 and Y3, respectively, in fig. 3.
The z-axis directions of the ground coordinate system, the vehicle coordinate system, and the radar coordinate system are all the vertical direction perpendicular to the ground, and are not shown in fig. 3. In the embodiment of the application, the specific positions of the origin of the ground coordinate system, the origin of the vehicle coordinate system, and the origin of the radar coordinate system are not limited. It will be appreciated that different vehicles share the same ground coordinate system, each vehicle has a separate vehicle coordinate system, and each radar on a vehicle has a separate radar coordinate system.
As shown in fig. 3, the vehicle coordinate system is obtained by rotating the ground coordinate system counterclockwise, and the rotation angle is called the yaw angle, denoted yaw_est; the radar coordinate system is obtained by rotating the vehicle coordinate system counterclockwise, and the rotation angle is called the radar mounting angle, denoted β. It can be seen that the radar coordinate system can be obtained by rotating the ground coordinate system counterclockwise by an angle of (yaw_est+β).
In the embodiment of the application, the speed of the vehicle in the ground coordinate system is referred to as a moving speed, the speed of the vehicle in the vehicle coordinate system is referred to as a vehicle speed, and the speed of the radar in the radar coordinate system is referred to as a radar speed.
As shown in fig. 3, in the vehicle coordinate system, the speed of the vehicle in the X2 direction is referred to as the longitudinal speed of the vehicle and may be denoted by V_x; the speed of the vehicle in the Y2 direction is referred to as the lateral speed of the vehicle and may be denoted by V_y; the speed at which the vehicle rotates about the z-axis is referred to as the yaw rate and may be denoted by Yawrate. The lateral speed and the longitudinal speed of the vehicle are collectively referred to as the transverse and longitudinal speeds of the vehicle. In the embodiment of the application, the transverse and longitudinal speeds of the vehicle may also be referred to as the vehicle speed. The angle between the direction of the longitudinal speed of the vehicle (i.e., the direction of V_x) and the positive direction of the X-axis of the ground coordinate system (i.e., the X1 direction) is referred to as the yaw angle of the vehicle and may be denoted by yaw_est.
As shown in fig. 3, in the radar coordinate system, the speed of the radar in the X3 direction is referred to as the longitudinal speed of the radar and may be denoted by V_rx; the speed of the radar in the Y3 direction is referred to as the lateral speed of the radar and may be denoted by V_ry. Since the radar is fixedly connected to the vehicle, the radar rotates about the z-axis at the same speed as the vehicle, which is also the yaw rate and is denoted by Yawrate. The lateral speed and the longitudinal speed of the radar are collectively referred to as the transverse and longitudinal speeds of the radar. In the embodiment of the present application, the transverse and longitudinal speeds of the radar may also be referred to as the speed of the radar. The angle between the positive direction of the X-axis of the radar coordinate system (i.e., the X3 direction) and the positive direction of the X-axis of the vehicle coordinate system (i.e., the X2 direction) is referred to as the radar mounting angle and may be denoted by β.
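The frame relations above can be restated compactly with standard 2D rotation matrices (the counterclockwise rotation convention of fig. 3 is assumed):

$$R(\alpha)=\begin{bmatrix}\cos\alpha & -\sin\alpha\\ \sin\alpha & \cos\alpha\end{bmatrix},\qquad v_{\text{ground}} = R(\mathrm{yaw\_est})\,v_{\text{vehicle}},\qquad v_{\text{vehicle}} = R(\beta)\,v_{\text{radar}},$$

$$\Rightarrow\; v_{\text{ground}} = R(\mathrm{yaw\_est}+\beta)\,v_{\text{radar}}.$$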
In the embodiment of the application, the motion state estimation is performed in a period iteration mode, and the parameter of the last period is needed to be used in each period for motion state estimation. In the embodiment of the application, the current estimation period may be referred to as an Mth period, the last estimation period is referred to as an Mth-1 th period, and the next estimation period is referred to as an Mth+1th period. M is a positive integer. For example, the parameters for the M-1 th cycle are used for motion state estimation in the M-1 th cycle, and the parameters for the M-1 th cycle are used for motion state estimation in the M+1 th cycle.
Based on the principle of the motion state estimation method provided by the embodiment of the application, the yaw angle of the vehicle is determined by registering the static points with the local grid map, that is, by determining the rotation angle of the vehicle at which the positions of the static points match the positions of the static points accumulated in the local grid map. Therefore, in the embodiment of the present application, it is necessary to extract static points and to acquire a local grid map.
The process of extracting the static point in the mth period will be described first. As previously described, the multiple target points indicated in a frame of point cloud may be static points or dynamic points. The essence of extracting static points from multiple target points is to filter dynamic points from multiple target points. For each target point indicated by the point cloud used in the mth period, the motion state estimation module may determine a static doppler velocity of the target point in the mth period according to equation one:
$$\hat{v}_{d}^{M} = -\left(\hat{V}_{rx}^{M-1}\cos\theta_{M} + \hat{V}_{ry}^{M-1}\sin\theta_{M}\right) \quad\text{(equation one)}$$

wherein $\hat{v}_{d}^{M}$ represents the static Doppler velocity of the target point in the Mth period, $\theta_{M}$ represents the azimuth angle of the target point in the Mth period, $\hat{V}_{rx}^{M-1}$ represents the estimated longitudinal speed of the radar in the M-1 th period, and $\hat{V}_{ry}^{M-1}$ represents the estimated lateral speed of the radar in the M-1 th period.
In the case where one target point is a static point, the target point does not move in the ground coordinate system, and the relative motion between the target point and the vehicle is caused entirely by the motion of the vehicle. Thus, the static Doppler velocity of the target point can be calculated from the longitudinal speed and the lateral speed of the radar estimated in the previous period.
The point cloud used in the Mth period indicates the actual Doppler velocity of each target point in the Mth period. The motion state estimation module can compare the static Doppler velocity of a target point in the Mth period with the actual Doppler velocity of the target point in the Mth period to obtain the difference between the two.
In the case where the difference between the two is greater than a preset threshold, it indicates that the difference between the two is large and the target point moves significantly in the ground coordinate system; the target point can therefore be determined to be a high-speed dynamic point and filtered out.
In the case where the difference between the two is less than or equal to the preset threshold, it indicates that the difference between the two is small and the target point does not move significantly in the ground coordinate system; in this case, the target point may be further processed to determine whether it is a static point or a low-speed dynamic point (e.g., a walking pedestrian, a vehicle traveling at low speed, etc.). In one possible implementation, the RANSAC algorithm may be used for further static point extraction, filtering out low-speed dynamic points. The preset threshold may be set as needed, for example, to 0.1 m/s; the embodiment of the application does not limit the preset threshold.
Once the dynamic points (including both low-speed and high-speed dynamic points) are filtered out of the target points, the extraction of the static points is completed.
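A Python sketch of this coarse dynamic-point check follows; the sign convention of the Doppler projection matches the reconstruction of equation one above, and the 0.1 m/s default is only the example threshold mentioned in the text:

```python
import math

def is_high_speed_dynamic(doppler_velocity: float, azimuth: float,
                          prev_radar_vx: float, prev_radar_vy: float,
                          threshold: float = 0.1) -> bool:
    """Compare the measured Doppler velocity with the static Doppler velocity predicted
    from the previous period's radar velocity (equation one)."""
    static_doppler = -(prev_radar_vx * math.cos(azimuth) + prev_radar_vy * math.sin(azimuth))
    return abs(doppler_velocity - static_doppler) > threshold   # larger gap -> high-speed dynamic point

# Points passing this check would still go through e.g. RANSAC to reject low-speed dynamic points.
```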
Fig. 4 shows an exemplary schematic of extracted static points in an embodiment of the application. As shown in fig. 4, three static points are extracted, whose Doppler velocities in the Mth period are vr_1, vr_2 and vr_3, respectively, and whose azimuth angles are θ_1, θ_2 and θ_3, respectively.
In one possible implementation, the transverse and longitudinal speeds of the radar may be determined based on the actual Doppler velocities and azimuth angles of the extracted static points. For example, the transverse and longitudinal speeds of the radar in the Mth period can be determined based on the Doppler velocities vr_1, vr_2 and vr_3 and the azimuth angles θ_1, θ_2 and θ_3 of the three static points shown in fig. 4. Specifically, formula two can be constructed from the relation between the Doppler velocity and the azimuth angle of each static point and the transverse and longitudinal speeds of the radar, and formula two can then be solved by the least squares method to obtain the transverse and longitudinal speeds of the radar:
$$v_{r,n}^{M} = -\left(\cos\theta_{n}^{M}\,V_{rx}^{M} + \sin\theta_{n}^{M}\,V_{ry}^{M}\right),\quad n = 1, 2, \ldots, N \quad\text{(formula two)}$$

wherein n is an integer greater than or equal to 1 and less than or equal to N, N represents the number of static points, $v_{r,n}^{M}$ represents the Doppler velocity of the nth static point in the point cloud used in the Mth period, $\theta_{n}^{M}$ represents the azimuth angle of the nth static point in the point cloud used in the Mth period, $V_{rx}^{M}$ represents the longitudinal speed of the radar in the Mth period, and $V_{ry}^{M}$ represents the lateral speed of the radar in the Mth period. Stacking the N equations yields an overdetermined linear system in $V_{rx}^{M}$ and $V_{ry}^{M}$, which is solved by the least squares method.
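A Python sketch of this least-squares solution follows; the stacked matrix form and the placeholder numbers in the comment are illustrative assumptions:

```python
import numpy as np

def radar_velocity_least_squares(doppler: np.ndarray, azimuth: np.ndarray) -> tuple:
    """Estimate the radar's longitudinal/lateral speed from N static points (formula two) by least squares."""
    # Each static point contributes: doppler_n = -(cos(theta_n) * Vrx + sin(theta_n) * Vry)
    A = -np.column_stack((np.cos(azimuth), np.sin(azimuth)))
    (vrx, vry), *_ = np.linalg.lstsq(A, doppler, rcond=None)
    return float(vrx), float(vry)

# Example with three static points as in Fig. 4 (placeholder numbers, not from the patent):
# vrx, vry = radar_velocity_least_squares(np.array([-4.9, -5.0, -4.8]),
#                                         np.array([0.1, 0.0, -0.1]))
```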
The local grid map that needs to be used in registration is described below. In the embodiment of the application, the local grid map refers to a grid map within a certain range around the vehicle, and the origin of coordinates of the local grid map is at a fixed position of the vehicle. The local grid map moves with the movement of the vehicle. For example, the origin of coordinates of the local grid map may be at the center of the vehicle, or may be at another location of the vehicle, which is not limited in this embodiment of the present application.
Fig. 5 shows an exemplary schematic diagram of a local grid map according to an embodiment of the application. As shown in fig. 5, the origin of coordinates of the local grid map is at the center of the vehicle, the local grid map has a size of 7 m×8 m, and the grid resolution is 1 m×1 m. That is, the local grid map has 8 grids in the x-direction and 7 grids in the y-direction of the vehicle coordinate system. When the vehicle 1 is at position 1, the local grid map is the gray area shown in fig. 5. When the vehicle 1 is at position 2, the local grid map is the hatched area shown in fig. 5. It can be seen that as the vehicle moves, the local grid map moves accordingly, and the distance the vehicle moves is the same as the distance the local grid map moves. In embodiments of the present application, the value of each grid in the local grid map may be used to indicate the static points accumulated in the grid based on the historical point cloud. In one example, the value of each grid in the local grid map may be used to indicate the number of static points accumulated in the grid based on the historical point cloud. In yet another example, the value of each grid in the local grid map may be used to indicate the density of static points accumulated in the grid based on the historical point cloud, where the density is obtained by dividing the number of static points accumulated in the grid by the area of the grid. For example, if the number of static points accumulated in a grid based on the historical point cloud is 10 and the area of the grid is 1 square meter, the density of static points accumulated in that grid is 10 per square meter. In the embodiment of the application, the value of each grid in the local grid map actually represents the likelihood that static points exist in the grid. It will be appreciated that when the number or density of static points accumulated in a grid based on the historical point cloud is larger, the likelihood that static points exist in the grid is greater; when the number or density of static points accumulated in a grid is smaller, the likelihood that static points exist in the grid is smaller. In the embodiment of the application, the local grid map is constructed from the number or the density of the accumulated static points, which reduces the computational cost of constructing the local grid map compared with traditional methods such as Bayesian filtering.
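For illustration, a minimal Python structure for such a local grid map follows, using the 7 m×8 m size and 1 m×1 m resolution of fig. 5; the class and field names are hypothetical:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class LocalGridMap:
    """Grid of accumulated static-point counts around the vehicle (7 m x 8 m, 1 m x 1 m grids as in Fig. 5)."""
    counts: np.ndarray                      # value of each grid: accumulated static points
    grid_resolution: float = 1.0            # metres per grid

    @classmethod
    def initial(cls, size_y_m: float = 7.0, size_x_m: float = 8.0, resolution: float = 1.0):
        shape = (int(size_y_m / resolution), int(size_x_m / resolution))
        return cls(np.zeros(shape, dtype=np.int32), resolution)    # initial value 0: no static points yet

    def density(self) -> np.ndarray:
        """Static-point density per grid, i.e. count divided by grid area (e.g. 10 points / 1 m^2 = 10 per m^2)."""
        return self.counts / (self.grid_resolution ** 2)
```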
It will be appreciated that some initialization process is required to achieve the motion state estimation for cycle 1. In the embodiment of the application, the initialization processing comprises the following steps: local grid map initialization, yaw angle initialization, yaw rate initialization, translation margin initialization, and the like.
Wherein the local grid map initialization may include: the map size and the grid resolution of the local grid map are preset, and the initial local grid map is built according to the preset map size and the preset grid resolution. The initial local grid map represents the local grid map at the initial time. The origin of coordinates of the initial partial grid map may be preset at a fixed position of the vehicle, and a value of each grid in the initial partial grid map is an initial value. By way of example, the initial value here may be 0, representing that no static points have yet existed in the respective grids in the initial local grid map. Alternatively, the initial value may be another value, which is not limited by the present application.
The yaw angle initialization may include: setting an initial yaw angle. The initial yaw angle represents the yaw angle at the initial time; for example, the initial yaw angle may have a value of 0. The yaw rate initialization may include: setting an initial yaw rate. The initial yaw rate represents the yaw rate at the initial time; for example, the initial yaw rate may have a value of 0. Alternatively, the initial yaw rate may be another value, which is not limited by the present application.
As shown in fig. 5, the local grid map is translated by a whole number of grids. In reality, however, the translation distance of the vehicle in the ground coordinate system does not necessarily correspond to a whole number of grids, so when the translation distance of the vehicle in the ground coordinate system is converted into a number of translation grids, a remainder of the translation distance is left over. This remainder is referred to herein as the translation margin. That is, the translation margin represents the gap between the translation amount of the local grid map and the actual translation amount of the vehicle. The translation margin includes the translation margin of the local grid map in the x-axis direction and in the y-axis direction under the ground coordinate system, and may need to be reserved for use in the next cycle. The translation margin initialization may include: setting an initial translation margin. The initial translation margin includes a translation margin in the x-axis direction and a translation margin in the y-axis direction, both of which are 0. Alternatively, the initial translation margin in the x-axis direction and the initial translation margin in the y-axis direction may take other values, which is not limited in the present application.
The initial partial grid map, the initial yaw angle, the initial yaw rate, and the initial translational margin are described above, and the parameters related to the mth cycle are described below. The mth period may represent any one motion state estimation period.
The point cloud to which the static point used in the mth period registration belongs is referred to as a first frame point cloud. The point cloud to which the static point used in the M-1 th period registration belongs is referred to as a second frame point cloud. The acquisition time of the second frame point cloud is before the acquisition time of the first frame point cloud. The time interval between the acquisition time of the second frame point cloud and the acquisition time of the first frame point cloud is the first time interval. The first yaw angle represents a yaw angle of the vehicle when the first frame point cloud is acquired. The second yaw angle represents a yaw angle of the vehicle when the second frame point cloud is acquired. The yaw rate estimated in the mth period is referred to as a first yaw rate. The vehicle speed estimated in the mth period is referred to as a first vehicle speed.
The local grid map used at the time of registration of the mth period is referred to as a first local grid map. The local grid map used in the M-1 th period registration is referred to as a second local grid map. The local grid map obtained after the end of the M-1 th period is referred to as a third local grid map. Specifically, the third partial raster map represents a partial raster map obtained after accumulating the static points in the second frame point cloud on the basis of the second partial raster map used at the time of registration of the M-1 th cycle. In the embodiment of the application, the static points in the second frame point cloud are accumulated on the second local grid map, so that on one hand, the accuracy of the positions of the static points in the local grid map can be improved, and on the other hand, the calculated amount is reduced.
The value of each grid in the first local grid map may be used to indicate static points in the grid that are accumulated based on the first historical point cloud. The first historical point cloud represents a point cloud acquired before the first frame point cloud is acquired. In one example, the first historical point cloud may include all point clouds collected in the 1 st through M-1 st cycles. In yet another example, the first historical point cloud may include point clouds selected at intervals of time from among all the point clouds acquired in the 1 st cycle through the M-1 st cycle. In one example, the value of each grid in the first local grid map may be used to indicate the number or density of static points in the grid that are accumulated based on the first historical point cloud. In the embodiment of the application, the number or the density of the accumulated static points is used for representing the probability of existence of the static points in each grid, so that the calculated amount is reduced, and the calculation efficiency is improved.
The value of each grid in the second local grid map may be used to indicate static points in the grid that are accumulated based on the second historical point cloud. The second historical point cloud represents the point cloud acquired before the second frame point cloud is acquired. In one example, the second historical point cloud may include all point clouds collected in the 1st through M-2th cycles. In yet another example, the second historical point cloud may include point clouds selected at intervals of time from among all the point clouds acquired in the 1st through M-2th cycles. In one example, the value of each grid in the second local grid map may be used to indicate the number or density of static points in the grid that are accumulated based on the second historical point cloud.
In an embodiment of the present application, the first history point cloud includes the second history point cloud and the second frame point cloud. That is, the value of each grid in the first local grid map is used to indicate the static points accumulated in the grid based on the second historical point cloud and the second frame point cloud.
The value of each grid in the third local grid map is used to indicate the static points in that grid that are accumulated based on the first historical point cloud. The difference between the first local grid map and the third local grid map is that their positions may be different. The origins of the first and third local grid maps are both at a fixed position of the vehicle, and the position of the vehicle in the ground coordinate system may shift during the first time interval; therefore, the local grid map around the vehicle shifts from the position of the third local grid map to the position of the first local grid map. In other words, the first local grid map is obtained by translating the third local grid map, and the amount of translation of the third local grid map is determined by the amount of translation of the vehicle over the first time interval. Take the local grid map at position 1 as the third local grid map and the local grid map at position 2 as the first local grid map shown in fig. 5 as an example: in the first time interval, the vehicle 1 moves from position 1 to position 2, and the local grid map changes from the third local grid map to the first local grid map.
The motion state estimation method provided by the embodiment of the application is described below by taking the mth period as an example. The duration of the mth period is the first time interval. The duration of the M-1 th period is the second time interval. Fig. 6 illustrates a motion state estimation method provided by an embodiment of the present application. The method may be applied to the motion state estimation module shown in fig. 2. As shown in fig. 6, the method may include:
step S501, a first local grid map is acquired.
Wherein the value of each grid in the first local grid map is used to indicate static points in the grid that are accumulated based on the first historical point cloud.
As described above, in the embodiment of the present application, the first local grid map may be acquired based on the third local grid map. In one possible implementation, step S501 may include: acquiring the third local grid map; determining a first translation distance according to the first moving speed, the first time interval and the first translation margin; determining the number of translation grids according to the first translation distance and the preset grid resolution; and translating the third local grid map according to the number of translation grids to obtain the first local grid map.
Wherein the value of each grid in the third local grid map represents the static points accumulated in the grid based on the first history point cloud, which may specifically be the number of static points or the density of static points. The first moving speed represents the speed of the vehicle in the ground coordinate system within the first time interval. The first translation margin represents the gap between the translation amount of the local grid map and the actual translation amount of the vehicle in the second time interval, where the end time of the second time interval is the same as the start time of the first time interval. The first translation distance is used to indicate the actual translation amount of the vehicle within the first time interval. The grids of the first local grid map include overlapping grids that overlap with the third local grid map and expanded grids that do not overlap with the third local grid map; the values of the overlapping grids are consistent with the values of the corresponding grids in the third local grid map, and the values of the expanded grids are initial values.
In a single radar scenario, the process of acquiring the first moving speed may include: acquiring the first frame point cloud; extracting a static point in the first frame point cloud from the plurality of target points based on azimuth angles and Doppler speeds of the plurality of target points; determining a first radar speed according to the azimuth angle and the Doppler speed of the static point; the first moving speed is determined according to the first radar speed, the mounting angle of the first radar, and the second yaw angle.
The first frame point cloud is used for indicating azimuth angles and Doppler speeds of a plurality of target points. The first radar speed represents a speed of the radar in the radar coordinate system within the first time interval.
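As a hedged illustration of this step, the sketch below estimates the first radar speed from the azimuth angles and Doppler speeds of the extracted static points by least squares, assuming the commonly used model in which the Doppler speed of a static point is the negative projection of the radar velocity onto the line of sight; the exact relation used by the application is its formula two, and the function and variable names here are hypothetical.

import numpy as np

def estimate_radar_speed(azimuth_rad, doppler_mps):
    # Least-squares estimate of the radar velocity (V_rx, V_ry) in the radar coordinate
    # system, assuming doppler ≈ -(V_rx*cos(azimuth) + V_ry*sin(azimuth)) for static points.
    A = np.column_stack((np.cos(azimuth_rad), np.sin(azimuth_rad)))
    v, *_ = np.linalg.lstsq(A, -np.asarray(doppler_mps), rcond=None)
    return v[0], v[1]   # first radar speed (longitudinal, lateral)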
In one example, the first translation distance of the vehicle over the first time interval may be determined by formula three:

Move_x_M = (V_rx_M · cos(β + Yaw_est_{M-1}) − V_ry_M · sin(β + Yaw_est_{M-1})) · T + Move_x_remain_{M-1}
Move_y_M = (V_rx_M · sin(β + Yaw_est_{M-1}) + V_ry_M · cos(β + Yaw_est_{M-1})) · T + Move_y_remain_{M-1}    (formula three)

Where Move_x_M represents the translation distance of the first local grid map relative to the third local grid map in the x-axis direction of the ground coordinate system in the Mth cycle (i.e., the component of the first translation distance in the x-axis direction), and Move_y_M represents the translation distance in the y-axis direction (i.e., the component of the first translation distance in the y-axis direction). V_rx_M represents the longitudinal speed of the radar in the radar coordinate system in the Mth cycle (i.e., the component of the first radar speed in the x-axis direction of the radar coordinate system), and V_ry_M represents the lateral speed of the radar in the radar coordinate system (i.e., the component of the first radar speed in the y-axis direction of the radar coordinate system). β represents the mounting angle of the radar. T denotes the duration of the first time interval. Yaw_est_{M-1} represents the yaw angle of the vehicle estimated in the M-1th cycle (i.e., the second yaw angle).

It will be appreciated that, in formula three, V_rx_M · cos(β + Yaw_est_{M-1}) represents the projection of the radar's longitudinal speed in the radar coordinate system onto the x-axis direction of the ground coordinate system, and V_ry_M · sin(β + Yaw_est_{M-1}) represents the projection of the radar's lateral speed onto the x-axis direction of the ground coordinate system; correspondingly, their combination gives the component of the first radar speed in the x-axis direction of the ground coordinate system. Likewise, V_rx_M · sin(β + Yaw_est_{M-1}) and V_ry_M · cos(β + Yaw_est_{M-1}) represent the projections of the radar's longitudinal and lateral speeds onto the y-axis direction of the ground coordinate system, and their combination gives the component of the first radar speed in the y-axis direction of the ground coordinate system.

Move_x_remain_{M-1} represents the component of the translation margin of the M-1th cycle in the x-axis direction of the ground coordinate system (i.e., the component of the first translation margin in the x-axis direction of the ground coordinate system). Move_y_remain_{M-1} represents the component of the translation margin of the M-1th cycle in the y-axis direction of the ground coordinate system (i.e., the component of the first translation margin in the y-axis direction of the ground coordinate system).
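The following Python sketch mirrors the reconstructed formula three above: it projects the first radar speed through the mounting angle and the second yaw angle into the ground coordinate system, integrates over the first time interval, and adds the translation margin of the previous cycle. The function and argument names are illustrative only.

import numpy as np

def first_translation_distance_single_radar(v_rx, v_ry, beta, yaw_prev, T,
                                            remain_x_prev, remain_y_prev):
    # Project the radar velocity (radar coordinate system) into the ground coordinate
    # system via the mounting angle beta and the second yaw angle yaw_prev, then
    # integrate over the first time interval T and add the previous translation margin.
    ang = beta + yaw_prev
    move_x = (v_rx * np.cos(ang) - v_ry * np.sin(ang)) * T + remain_x_prev
    move_y = (v_rx * np.sin(ang) + v_ry * np.cos(ang)) * T + remain_y_prev
    return move_x, move_y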
In a multi-radar scenario, the acquiring of the first moving speed may include: acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars; acquiring the first vehicle speed according to the radar speeds and the installation angles; a first movement speed is determined based on the first vehicle speed and the second yaw angle.
Wherein a first vehicle speed represents a speed of the vehicle in the first time interval in a vehicle coordinate system.
In one example, the first translation distance of the vehicle over the first time interval may be determined by formula four:

Move_x_M = (V_x_M · cos(Yaw_est_{M-1}) − V_y_M · sin(Yaw_est_{M-1})) · T + Move_x_remain_{M-1}
Move_y_M = (V_x_M · sin(Yaw_est_{M-1}) + V_y_M · cos(Yaw_est_{M-1})) · T + Move_y_remain_{M-1}    (formula four)

For Move_x_M, Move_y_M, T, Yaw_est_{M-1}, Move_x_remain_{M-1} and Move_y_remain_{M-1}, reference may be made to formula three, and they will not be described here again. V_x_M represents the longitudinal speed of the vehicle in the vehicle coordinate system in the Mth cycle (i.e., the component of the first vehicle speed in the x-axis direction of the vehicle coordinate system), and V_y_M represents the lateral speed of the vehicle in the vehicle coordinate system in the Mth cycle (i.e., the component of the first vehicle speed in the y-axis direction of the vehicle coordinate system).

It will be appreciated that, in formula four, V_x_M · cos(Yaw_est_{M-1}) and V_y_M · sin(Yaw_est_{M-1}) represent the projections of the vehicle's longitudinal and lateral speeds in the vehicle coordinate system onto the x-axis direction of the ground coordinate system; correspondingly, their combination gives the component of the first vehicle speed in the x-axis direction of the ground coordinate system. Likewise, V_x_M · sin(Yaw_est_{M-1}) and V_y_M · cos(Yaw_est_{M-1}) represent the projections onto the y-axis direction of the ground coordinate system, and their combination gives the component of the first vehicle speed in the y-axis direction of the ground coordinate system.
After determining the first translation distance, the motion state estimation module may determine the number of translation grids according to the first translation distance and the preset grid resolution, for example, according to formula five:

GridMove_x_M = int(Move_x_M / Grid_resolution)
GridMove_y_M = int(Move_y_M / Grid_resolution)    (formula five)

Where int represents a rounding operation. GridMove_x_M represents the number of translation grids in the x-axis direction of the ground coordinate system in the Mth cycle. GridMove_y_M represents the number of translation grids in the y-axis direction of the ground coordinate system in the Mth cycle. Grid_resolution represents the preset grid resolution.
It will be appreciated that there may be a gap between the movement in whole grids and the actual movement of the vehicle. This gap is the margin of the translation distance for the Mth cycle, which may be reserved for use in the next cycle (i.e., the M+1th cycle) and may be determined by formula six:

Move_x_remain_M = Move_x_M − GridMove_x_M · Grid_resolution
Move_y_remain_M = Move_y_M − GridMove_y_M · Grid_resolution    (formula six)

Where Move_x_remain_M represents the component of the translation margin of the Mth cycle in the x-axis direction of the ground coordinate system, and Move_y_remain_M represents the component of the translation margin of the Mth cycle in the y-axis direction of the ground coordinate system. For Move_x_remain_{M-1} and Move_y_remain_{M-1} used in formulas three and four, reference may be made to the acquisition process of Move_x_remain_M and Move_y_remain_M in formula six, with the parameters of the Mth cycle replaced by those of the M-1th cycle.
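A short sketch of formulas five and six, converting the first translation distance into a whole number of translation grids and keeping the remainder as the translation margin for the next cycle; the names are illustrative.

def translation_grids_and_margin(move_x, move_y, grid_resolution):
    # Formula five: whole number of grids to translate (rounding toward zero).
    grid_move_x = int(move_x / grid_resolution)
    grid_move_y = int(move_y / grid_resolution)
    # Formula six: remainder of the translation distance, reserved for the next cycle.
    remain_x = move_x - grid_move_x * grid_resolution
    remain_y = move_y - grid_move_y * grid_resolution
    return grid_move_x, grid_move_y, remain_x, remain_y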
Fig. 7 shows an exemplary schematic of a local grid map of an embodiment of the present application. The moving speed of the vehicle in the ground coordinate system within the first time interval is the first moving speed. As shown in fig. 7, the vehicle moves from one location to another during the first time interval. The values of the individual grids in the third local grid map remain unchanged. The grids of the first local grid map that overlap with the third local grid map are referred to as overlapping grids, and the grids of the first local grid map that do not overlap with the third local grid map are referred to as expanded grids. For each overlapping grid in the first local grid map, such as each grid in the gray area of fig. 7, the value of the grid in the third local grid map may be taken as the value of the grid in the first local grid map. For each expanded grid in the first local grid map, such as each grid in the diagonal-line area shown in fig. 7, an initial value may be taken as the value of the grid in the first local grid map. The initial value may be set as needed; for example, the initial value may be 0.
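A possible array-level sketch of this translation step: overlapping grids keep their accumulated values and expanded grids receive the initial value. The index convention (a positive grid shift moving old content toward lower indices) is an assumption for illustration, not part of the application.

import numpy as np

def translate_grid_map(counts, grid_move_x, grid_move_y, init_value=0.0):
    # Shift the third local grid map by the number of translation grids to obtain the
    # first local grid map: overlapping grids copy their values, expanded grids are
    # filled with the initial value.
    shifted = np.full_like(counts, init_value)
    nx, ny = counts.shape
    src_x = slice(max(grid_move_x, 0), nx + min(grid_move_x, 0))
    src_y = slice(max(grid_move_y, 0), ny + min(grid_move_y, 0))
    dst_x = slice(max(-grid_move_x, 0), nx + min(-grid_move_x, 0))
    dst_y = slice(max(-grid_move_y, 0), ny + min(-grid_move_y, 0))
    shifted[dst_x, dst_y] = counts[src_x, src_y]
    return shifted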
Step S502, registering static points in the first frame point cloud with the first local grid map to obtain a first yaw angle.
Wherein the first yaw angle represents a yaw angle of the vehicle when the first frame point cloud is acquired. In this step, the static point is first extracted from the first frame point cloud, and the process of extracting the static point is described above, which is not described herein. After the static points are extracted from the first frame point cloud and the first local grid map is obtained, the motion state estimation module may register the static points in the first frame point cloud with the first local grid map to obtain a first yaw angle.
In the embodiment of the application, registering the static points in the first frame point cloud with the first local grid map means rotating the first frame point cloud so that the positions of the static points in the first frame point cloud in the ground coordinate system coincide with the positions of the grids in which more static points have been accumulated in the first local grid map. The larger the value of a grid in the first local grid map, the larger the number or density of static points accumulated in that grid, and the higher the probability that static points actually exist in it. Therefore, after the first frame point cloud is rotated, when the static points fall on the grids with more accumulated static points in the first local grid map, it can be determined that the registration between the static points in the first frame point cloud and the first local grid map is completed; at this time, the first Yaw angle Yaw_est can be obtained from the included angle between the direction of the first frame point cloud (i.e., the X3 direction shown in fig. 3) and the X1 direction.
In one possible implementation, step S502 may include: acquiring a plurality of candidate yaw angles; determining a grid accumulation value corresponding to each candidate yaw angle, comparing the grid accumulation value corresponding to each candidate yaw angle with the stored grid accumulation values, and updating the stored grid accumulation values and the stored yaw angles based on the comparison result; the first yaw angle is determined from the stored yaw angles.
Wherein the grid cumulative value corresponding to a candidate yaw angle may be used to represent how the static points of the first frame point cloud are distributed, in the first local grid map in which static points accumulated from the first history point cloud are recorded, in the case that the vehicle is rotated to that candidate yaw angle. The stored grid cumulative value may be used to represent the corresponding distribution in the case that the vehicle is rotated to the stored yaw angle.
In one possible implementation, determining the grid cumulative value for each candidate yaw angle includes: for any one of the plurality of candidate yaw angles: rotating a plurality of static points in the first frame point cloud according to the candidate yaw angle to obtain a plurality of first rotation points, wherein each static point in the first frame point cloud corresponds to one first rotation point; according to a preset grid resolution, carrying out rasterization on the plurality of first rotation points to obtain grids of each first rotation point in the first local grid map; and accumulating the grid values of each first rotation point to obtain grid accumulated values corresponding to the candidate yaw angles.
In the embodiment of the present application, the candidate Yaw angle is denoted as Yaw. And traversing each candidate yaw angle, and rotating a plurality of static points in the first frame point cloud according to the candidate yaw angle to obtain a plurality of rotation points, wherein the obtained rotation points are the first rotation points. Each static point in the first frame point cloud corresponds to a first rotation point. That is, the number of static points in the first frame point cloud is the same as the number of first rotation points. In one possible implementation, when the vehicle is rotated to any one candidate yaw angle, the included angle between the X3 direction and the X1 direction may be determined according to the positional relationship between the vehicle coordinate system and the radar coordinate system shown in fig. 3; according to the included angle between the X3 direction and the X1 direction and the position relationship between the radar and the static points shown in FIG. 4, the included angle between each static point in the first frame point cloud and the X1 direction shown in FIG. 3 can be determined; and rotating each static point in the first frame point cloud, and obtaining a first rotating point corresponding to each static point when the included angle between each static point and the X1 direction shown in FIG. 3 is the determined included angle. And then, according to the preset grid resolution, carrying out rasterization processing on the plurality of first rotation points to obtain grids of each first rotation point in the first local grid map.
Fig. 8 shows a schematic diagram of a rasterization process of a first rotation point in an embodiment of the present application. As shown in fig. 8, the point cloud to be processed includes a plurality of first rotation points. The rasterization of the point cloud to be processed is to determine the position of each first rotation point in the first local grid map. Specifically, the position of each first rotation point in the first local grid map may be determined according to the distance between the first rotation point and the radar and the azimuth angle. Thereafter, the number of first rotation points in each grid may be determined, and as shown in fig. 8, there is one first rotation point in the 1 st row and 1 st column, and the number of first rotation points in the 1 st row and 1 st column grids is 1. The number of the first rotation points in other grids may be determined in the same manner, as shown in fig. 8, and will not be described again here. Based on the grids where each first rotation point is located, the value of the grid where each first rotation point is located can be obtained, and the values of the grids where each first rotation point is located are accumulated, so that the grid accumulated value corresponding to the candidate yaw angle can be obtained.
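The sketch below illustrates the computation of the grid accumulation value for one candidate yaw angle: the static points are rotated, rasterized at the preset grid resolution, and the values of the grids they fall into are summed. The map origin convention, function and parameter names are assumptions for illustration.

import numpy as np

def grid_accumulation_value(static_xy, candidate_yaw, counts, grid_resolution,
                            origin_xy=(0.0, 0.0)):
    # Rotate the static points of the first frame point cloud by the candidate yaw angle
    # (first rotation points), rasterize them into the first local grid map, and sum the
    # map values of the grids they fall into; points outside the map are ignored.
    c, s = np.cos(candidate_yaw), np.sin(candidate_yaw)
    rotated = np.asarray(static_xy) @ np.array([[c, s], [-s, c]])
    ix = np.floor((rotated[:, 0] - origin_xy[0]) / grid_resolution).astype(int)
    iy = np.floor((rotated[:, 1] - origin_xy[1]) / grid_resolution).astype(int)
    valid = (ix >= 0) & (ix < counts.shape[0]) & (iy >= 0) & (iy < counts.shape[1])
    return counts[ix[valid], iy[valid]].sum()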
After each of the grid accumulated values corresponding to one of the candidate yaw angles is obtained, the grid accumulated value corresponding to the candidate yaw angle may be compared with the stored grid accumulated value, and the stored grid accumulated value and the stored yaw angle may be updated based on the comparison result. Wherein the initial stored grid cumulative value may be 0 and the initial stored yaw angle may be set to null.
In one example, updating the stored grid cumulative value and the stored yaw angle based on the comparison result includes: updating the stored grid accumulation value to the grid accumulation value corresponding to the candidate yaw angle and updating the stored yaw angle to the candidate yaw angle under the condition that the grid accumulation value corresponding to the candidate yaw angle is larger than the stored grid accumulation value; under the condition that the grid accumulated value corresponding to the candidate yaw angle is equal to the stored grid accumulated value, keeping the stored grid accumulated value unchanged, and increasing the candidate yaw angle in the stored yaw angle; and when the grid accumulated value corresponding to the candidate yaw angle is smaller than the stored grid accumulated value, keeping the stored grid accumulated value and the stored yaw angle unchanged.
Since the static points represent objects whose coordinate positions do not move in the ground coordinate system, the radar can still detect a target at the same position after the vehicle moves; that is, in the point clouds obtained by the radar at different times, the position of a static point within the measuring range is unchanged in the ground coordinate system. If the point clouds of each frame are accumulated, the number of static points in a grid where static points actually exist keeps increasing. For example, if a grid receives 2 static points per frame, the grid accumulates 6 static points after three frames of point clouds; if a grid receives 3 static points per frame, the grid accumulates 12 static points after four frames of point clouds. The greater the number of accumulated static points, the greater the likelihood that static points actually exist in the grid.
In the case where the grid accumulation value corresponding to one candidate yaw angle is larger than the stored grid accumulation value, it indicates that the positions of the static points measured after the vehicle (or radar) rotates to the candidate yaw angle are more accurate than the positions of the static points measured after the vehicle rotates to the stored yaw angle; that is, the candidate yaw angle is closer to the actual yaw angle of the vehicle than the stored yaw angle. Therefore, in this case, the stored grid accumulation value may be updated to the grid accumulation value corresponding to the candidate yaw angle, and the stored yaw angle may be updated to the candidate yaw angle.
In the case where the grid cumulative value corresponding to one candidate yaw angle is equal to the stored grid cumulative value, it indicates that the positions of the static points measured after the vehicle (or radar) rotates to the candidate yaw angle are as accurate as the positions measured after the vehicle rotates to the stored yaw angle; that is, the candidate yaw angle and the stored yaw angle are almost equally close to the actual yaw angle of the vehicle. Therefore, in this case, the stored grid accumulation value may be kept unchanged, and the candidate yaw angle may be added to the stored yaw angles. It will be appreciated that different candidate yaw angles may be equally close to the actual yaw angle of the vehicle, so there may be one or more stored yaw angles. For example, if the actual yaw angle is 20 degrees, the stored yaw angles may include 19.9 degrees and 20.1 degrees, or may include only 20 degrees.
In the case where the grid accumulation value corresponding to one candidate yaw angle is smaller than the stored grid accumulation value, it indicates that the positions of the static points measured after the vehicle rotates to the candidate yaw angle are less accurate than the positions measured after the vehicle rotates to the stored yaw angle; that is, the stored yaw angle is closer to the actual yaw angle of the vehicle than the candidate yaw angle. Therefore, in this case, the stored grid accumulation value and the stored yaw angle can be kept unchanged.
After all the grid accumulation values corresponding to the candidate yaw angles are traversed, the yaw angle closest to the actual yaw angle of the vehicle among all the candidate yaw angles can be obtained. That is, the stored yaw angle is the closest to the actual yaw angle of the vehicle among all the candidate yaw angles. It is contemplated that the number of stored yaw angles may be one or more. Therefore, in the case where the number of stored Yaw angles is plural, the average value of the plural stored candidate Yaw angles may be taken to obtain the Yaw angle yaw_est M, i.e., the first Yaw angle, which is finally estimated in the mth cycle.
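Putting the comparison rule together, the following sketch traverses all candidate yaw angles, keeps the candidates with the largest grid accumulation value, and averages ties to obtain the first yaw angle; it reuses the hypothetical grid_accumulation_value sketch above.

def estimate_first_yaw(candidate_yaws, static_xy, counts, grid_resolution):
    # Traverse the candidate yaw angles; keep the stored yaw angle(s) with the largest
    # grid accumulation value and average ties to obtain the first yaw angle.
    stored_value = 0.0
    stored_yaws = []
    for yaw in candidate_yaws:
        value = grid_accumulation_value(static_xy, yaw, counts, grid_resolution)
        if value > stored_value:
            stored_value, stored_yaws = value, [yaw]
        elif value == stored_value:
            stored_yaws.append(yaw)
    return sum(stored_yaws) / len(stored_yaws)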
In the embodiment of the application, the number of candidate yaw angles can be set as needed, for example according to the requirements on computing capacity, real-time performance and precision. When the computing capacity is weak, the real-time requirement is high or the accuracy requirement is low, fewer candidate yaw angles can be set, for example 10 candidate yaw angles; when the computing capacity is strong, the real-time requirement is low or the accuracy requirement is high, more candidate yaw angles can be set, for example 50 candidate yaw angles. The embodiment of the application does not limit the number of candidate yaw angles. The process of acquiring the candidate yaw angles involved in the above process is described below. In a single radar scene, the lateral and longitudinal speeds of the radar itself are accurate, but the lateral and longitudinal speeds of the vehicle and the yaw angle obtained directly from the radar's lateral and longitudinal speeds and mounting angle have limited accuracy. In a multi-radar scene, the lateral and longitudinal speeds of the vehicle estimated from the lateral and longitudinal speeds and mounting angles of each of the plurality of radars are accurate and can be directly used as the final output result. Therefore, in the embodiment of the present application, the process of acquiring the plurality of candidate yaw angles differs between a single radar scene and a multi-radar scene.
In a single radar scenario, acquiring the plurality of candidate yaw angles includes: compensating the second yaw angle according to the second yaw rate and the first time interval to obtain a compensated yaw angle; and generating the plurality of candidate yaw angles according to a preset angular resolution within a preset angle range of the compensated yaw angle.
In one example, compensating the second yaw angle according to the second yaw rate and the first time interval means: the angle through which the vehicle rotates during the first time interval is estimated according to the yaw rate of the previous cycle, and the yaw angle estimated in the previous cycle is compensated by this estimated rotation angle to obtain a possible yaw angle, i.e., the compensated yaw angle. Specifically, the second yaw rate is multiplied by the first time interval to obtain the possible rotation angle, and the second yaw angle is added to this rotation angle to obtain the compensated yaw angle.
It will be appreciated that the second yaw rate is the yaw rate during the second time interval, and the yaw rate of the vehicle during the first time interval is quite likely to differ from the second yaw rate. Thus, the resulting compensated yaw angle may be different from, but will generally be in the vicinity of, the yaw angle actually reached by the vehicle when the first frame point cloud is acquired. Accordingly, a plurality of candidate yaw angles can be generated at a preset angular resolution within a preset angle range around the compensated yaw angle.
The preset angle range may be set as needed. For example, the preset angle range may extend from 1 degree below the compensated yaw angle to 1 degree above the compensated yaw angle, or from 0.5 degrees below the compensated yaw angle to 1.5 degrees above the compensated yaw angle. In the case where the vehicle is turning (for example, when the yaw rate has been large for the previous several cycles), the preset angle range may be set large. In the case where the vehicle is traveling straight (for example, when the yaw rate has been small for the previous several cycles), the preset angle range may be set small. The embodiment of the application does not limit the setting mode and principle of the preset angle range, nor the value of the preset angle range.
The preset angular resolution may be set as required, for example, the preset angular resolution may be 0.1 degree, 0.5 degree, 1 degree, or the like. The smaller the preset angular resolution, the larger the number of candidate yaw angles obtained later, the larger the calculation amount, but the higher the accuracy of the corresponding first yaw angle. The larger the preset angular resolution, the smaller the number of candidate yaw angles obtained later, the smaller the calculation amount, but the lower the accuracy of the corresponding first yaw angle. For example, the compensated yaw angle is 20 degrees, and the predetermined angle range is a decrease of one degree to an increase of one degree, that is, the predetermined angle range is 19 degrees to 21 degrees. If the preset angular resolution is 0.2 degrees, the obtained candidate yaw angles are 19 degrees, 19.2 degrees, 19.4 degrees, 19.6 degrees, 19.8 degrees, 20 degrees, 20.2 degrees, 20.4 degrees, 20.6 degrees, 20.8 degrees and 21 degrees. If the preset angular resolution is 0.5 degrees, the obtained candidate yaw angles are 19 degrees, 19.5 degrees, 20 degrees, 20.5 degrees, and 21 degrees. It can be seen that, in the case where the preset resolution is small, the number of obtained candidate yaw angles is large, the gap between the candidate yaw angles is small, and therefore, the accuracy of the obtained first yaw angle is higher.
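A minimal sketch of candidate yaw angle generation in the single radar scene, using the compensated yaw angle and, as defaults, the ±1 degree range and 0.2 degree resolution of the example above (all angles in the same unit); the names and defaults are illustrative.

import numpy as np

def candidate_yaws_single_radar(second_yaw, second_yawrate, T,
                                angle_range=1.0, angle_resolution=0.2):
    # Compensated yaw angle = second yaw angle + second yaw rate * first time interval;
    # candidates are spread around it at the preset angular resolution.
    compensated = second_yaw + second_yawrate * T
    return np.arange(compensated - angle_range,
                     compensated + angle_range + 1e-9, angle_resolution)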
The second yaw rate represents a yaw rate of the vehicle in a second time interval, and an end time of the second time interval is a start time of the first time interval.
In a multi-radar scenario, the acquiring a plurality of candidate yaw angles includes: acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars; acquiring radar estimated yaw angles according to a plurality of radar speeds and a plurality of installation angles; and generating a plurality of candidate yaw angles according to the preset angle resolution within the preset angle range of the radar estimated yaw angle.
The process of obtaining the radar estimated yaw angle according to the plurality of radar speeds and the plurality of mounting angles may refer to the related art and will not be described here. Since the radar estimated yaw angle is not accurate enough, in the embodiment of the application it needs to be adjusted. Specifically, a plurality of candidate yaw angles may be generated according to a preset angular resolution within a preset angle range of the radar estimated yaw angle. The preset angle range and the preset angular resolution may be set in the same manner as in the single radar scene, and will not be described in detail.
In the prior art, in a single radar scene the static points in the obtained point cloud are sparse, so the estimated yaw angle may differ considerably from the actual yaw angle, and the lateral and longitudinal speeds and the yaw angle estimated in a single radar scene have limited accuracy. In the motion state estimation method provided by the embodiment of the application, the compensated yaw angle in a single radar scene is estimated based on the motion state information of the previous cycle, so it lies near the actual yaw angle and only needs to be further adjusted. In the multi-radar case, the lateral and longitudinal speeds calculated from the radar calibration parameters have high precision, but because of the insufficient height measurement capability of the radars and the influence of the calibration parameters, the radar estimated yaw angle only lies near the actual yaw angle and needs to be further adjusted and corrected. Both the compensated yaw angle and the radar estimated yaw angle are thus estimated yaw angles in the vicinity of the actual yaw angle.
In the embodiment of the application, a plurality of candidate yaw angles are generated around the compensation yaw angle or the radar estimated yaw angle, and traversing optimization is performed in the candidate yaw angles to obtain the yaw angle with the maximum grid accumulation value.
In the embodiment of the application, through the step S501 and the step S502, the static points in the current point cloud and the local grid map in which the static points in the history point cloud are accumulated are registered, so that the situation of sparse static points in the single-frame point cloud is compensated, and the accuracy of the yaw angle of the vehicle is effectively improved.
In a possible implementation manner, the motion state estimation method provided by the embodiment of the present application may further include a process of obtaining the second yaw angle, which specifically includes: and acquiring a second local grid map, and registering static points in the second frame point cloud with the second local grid map to obtain a second yaw angle.
The value of each grid in the second local grid map may be used to indicate a static point in the grid that is accumulated based on a second history point cloud, the second yaw angle representing a yaw angle of the vehicle when a second frame point cloud is acquired, the acquisition time of the second frame point cloud being before the acquisition time of the first frame point cloud. The second local grid map, the second history point cloud, the second frame point cloud, and the second yaw angle have been described previously, and are not described here again. The process of acquiring the second yaw angle may refer to the process of acquiring the first yaw angle provided in step S501 and step S502, and will not be described here.
On the basis of this, the yaw rate Yawrate M of the mth cycle, that is, the first yaw rate, can be determined based on the first yaw angle, the second yaw angle, and the first time interval.
In one example, the first yaw rate may be obtained by first obtaining the difference between the first yaw angle and the second yaw angle and then dividing the difference by the first time interval.
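In code form, this is simply (names illustrative):

def first_yaw_rate(first_yaw, second_yaw, first_time_interval):
    # First yaw rate = change in yaw angle between the two registrations / first time interval.
    return (first_yaw - second_yaw) / first_time_interval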
In the embodiment of the application, the static points in the current point cloud are registered with the local grid map in which the static points in the historical point cloud are accumulated, so that the situation of sparse static points in the single-frame point cloud is made up, the accuracy of the yaw angle of the vehicle is effectively improved, and the estimation precision and reliability of the yaw angle speed are further improved.
In one possible implementation, the method further includes: rotating a plurality of static points in the first frame point cloud according to the first yaw angle to obtain a plurality of second rotation points, wherein each static point in the first frame point cloud corresponds to one second rotation point; rasterizing the plurality of second rotation points according to the preset grid resolution to obtain the grid of each second rotation point in the first local grid map; and, for the grid where each second rotation point is located, accumulating the number of second rotation points in the grid and the value of the grid in the first local grid map to obtain an updated first local grid map.
In the (m+1) -th period, the first local grid map obtained in the (M) -th period is required to be used, and therefore, in the embodiment of the application, the first local grid map is required to be updated according to the first frame point cloud obtained in the (M) -th period.
Specifically, each static point in the first frame point cloud may be rotated according to the first yaw angle, and each rotated static point may be referred to as a second rotation point. And carrying out rasterization processing on the second rotation point according to the preset rasterization resolution. The process of rotating the plurality of static points in the first frame point cloud according to the first yaw angle to obtain a plurality of second rotation points may refer to the process of rotating each static point in the first frame point cloud according to the candidate yaw angle to obtain a plurality of first rotation points, and the manner of performing the rasterization processing on the second rotation points may refer to the manner of performing the rasterization processing on the first rotation points, which will not be described herein.
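A hedged sketch of this map update: the static points are rotated by the first yaw angle (second rotation points), rasterized, and their per-grid counts are added to the values of the first local grid map; the origin convention and names are assumptions for illustration.

import numpy as np

def update_local_grid_map(counts, static_xy, first_yaw, grid_resolution,
                          origin_xy=(0.0, 0.0)):
    # Rotate, rasterize, and accumulate the number of second rotation points per grid
    # onto the existing values of the first local grid map.
    c, s = np.cos(first_yaw), np.sin(first_yaw)
    rotated = np.asarray(static_xy) @ np.array([[c, s], [-s, c]])
    ix = np.floor((rotated[:, 0] - origin_xy[0]) / grid_resolution).astype(int)
    iy = np.floor((rotated[:, 1] - origin_xy[1]) / grid_resolution).astype(int)
    valid = (ix >= 0) & (ix < counts.shape[0]) & (iy >= 0) & (iy < counts.shape[1])
    np.add.at(counts, (ix[valid], iy[valid]), 1)
    return counts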
In addition, the process of updating the second local grid map used for registration in the M-1th cycle to obtain the third local grid map may refer to the above process of updating the first local grid map in the Mth cycle, and is not described here again.
The above method estimates the motion state information of the radar (including the lateral and longitudinal speeds and the yaw rate) in the radar coordinate system, and this motion state information needs to be converted into the motion state information of the vehicle in the vehicle coordinate system; the installation position of the radar relative to the center of the vehicle is shown in fig. 3. Since the radar is fixedly connected with the vehicle, the radar yaw rate is the same as the yaw rate of the vehicle, and only the lateral and longitudinal speeds need to be converted. The specific conversion process may refer to formula seven:

V_x_M = V_rx_M · cos β − V_ry_M · sin β + Yawrate_M · y_r
V_y_M = V_rx_M · sin β + V_ry_M · cos β − Yawrate_M · x_r    (formula seven)

Where V_x_M represents the longitudinal speed of the vehicle in the vehicle coordinate system in the Mth cycle (i.e., the component of the first vehicle speed in the x-axis direction of the vehicle coordinate system), and V_y_M represents the lateral speed of the vehicle in the vehicle coordinate system in the Mth cycle (i.e., the component of the first vehicle speed in the y-axis direction of the vehicle coordinate system). β is the mounting angle of the radar. V_rx_M represents the longitudinal speed of the radar in the radar coordinate system in the Mth cycle (i.e., the component of the first radar speed in the x-axis direction of the radar coordinate system), and V_ry_M represents the lateral speed of the radar in the radar coordinate system (i.e., the component of the first radar speed in the y-axis direction of the radar coordinate system). Yawrate_M denotes the yaw rate of the vehicle in the Mth cycle, i.e., the first yaw rate, and (x_r, y_r) denotes the installation position of the radar relative to the center of the vehicle shown in fig. 3.

It can be appreciated that in a multi-radar scenario, the above conversion is not required, and the lateral and longitudinal speeds of the vehicle estimated from the multiple radars can be directly used as V_x_M and V_y_M.
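For illustration, the sketch below mirrors the reconstructed formula seven: the radar velocity is rotated into the vehicle coordinate system through the mounting angle, and the rotational component caused by the yaw rate at the radar mounting position (x_r, y_r) relative to the vehicle center is removed. The lever-arm terms and their signs are an assumption based on fig. 3, and the names are hypothetical.

import numpy as np

def radar_to_vehicle_speed(v_rx, v_ry, beta, yawrate, x_r=0.0, y_r=0.0):
    # Rotate the first radar speed into the vehicle coordinate system and compensate
    # for the yaw-rate-induced velocity at the radar mounting position (assumed terms).
    v_x = v_rx * np.cos(beta) - v_ry * np.sin(beta) + yawrate * y_r
    v_y = v_rx * np.sin(beta) + v_ry * np.cos(beta) - yawrate * x_r
    return v_x, v_y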
In the embodiment of the application, a point cloud acquisition module sends a first frame of point cloud to a point cloud processing module, and the point cloud processing module obtains the longitudinal speed, the transverse speed and the yaw rate of the vehicle based on the first frame of point cloud. In one possible implementation, the point cloud processing module may include a static point extraction module, a radar speed estimation module, a local grid map translation module, a registration module, a vehicle motion state estimation module, a static point rotation module, a local grid map update module, and a local grid map storage module. Fig. 9 is a flowchart of an embodiment of a motion state estimation method according to an embodiment of the present application. The implementation in fig. 9 may be motion state estimation for any one period, and the superscript M is omitted in fig. 9. As shown in fig. 9, the method may include:
In step S601, the point cloud collecting module provides the first frame point cloud adopted by the mth period to the static point extracting module.
In step S602, the static point extraction module extracts static point information from the first frame point cloud.
The static point information comprises Doppler speed of each static point in the first frame point cloud, azimuth angle and distance of each static point relative to the radar. In this step, extracting the static point information from the first frame point cloud includes filtering the dynamic points in the first frame point cloud to obtain the static points included in the first frame point cloud, where the static point information of each static point can be obtained from the information provided by the first frame point cloud. The process of filtering the dynamic points in the first frame point cloud has been described above, and will not be described herein.
In step S603, the static point extraction module sends the extracted static point information to the radar speed estimation module.
In step S604, the radar speed estimation module obtains the longitudinal speed V rx and the transverse speed V ry of the mth period radar according to the doppler speed and the azimuth angle of each static point in the static point information.
This step can be referred to as formula two.
In step S605, the radar speed estimation module sends the longitudinal speed V rx and the lateral speed V ry of the mth periodic radar to the local grid map translation module.
Since the origin of coordinates of the local grid map is at a fixed position of the vehicle, the local grid map moves with the movement of the vehicle. In the step, the radar speed estimation module sends the longitudinal speed and the transverse speed of the radar to the local grid map translation module, so that the local grid map translation module can conveniently determine the grid number of the local grid map translation in the Mth period.
In step S606, the local grid map translation module acquires a third local grid map obtained after the M-1 st period is completed from the local grid map storage module.
The local grid map storage module may be configured to store, at the end of each cycle, the local grid map in which the static points of the previous history point clouds have been accumulated. For example, the local grid map storage module may store the third local grid map acquired in this step, where the value of each grid in the third local grid map represents the static points accumulated in that grid based on the first historical frame point cloud. In this step, the local grid map translation module obtains the third local grid map from the local grid map storage module, so that the local grid map translation module can determine the reference for the local grid map translation in the Mth cycle, that is, the local grid map on the basis of which the translation is carried out.
In step S607, the local grid map translation module determines the number of translation grids based on the longitudinal speed V rx and the lateral speed V ry of the radar, the first time interval, the translation margin of the M-1th cycle, and the preset grid resolution, and translates the third local grid map according to the number of translation grids to obtain the first local grid map.
The translation margin of the M-1th cycle is the first translation margin. This step may be implemented with reference to formula three, and will not be described here again. In this step, the local grid map translation module translates the third local grid map obtained after the M-1th cycle ends according to the number of translation grids, so that the first local grid map can be used during registration in the Mth cycle. Translating the third local grid map includes: determining the position of the first local grid map according to the position of the third local grid map and the number of translation grids; and determining, according to the positions of the first and third local grid maps, the overlapping grids of the first local grid map that overlap with the third local grid map and the expanded grids of the first local grid map that do not overlap with the third local grid map. The overlapping grids may directly use the values of the corresponding grids in the third local grid map; the expanded grids may use an initial value. Once the position of the first local grid map and the value of each grid in the first local grid map are both determined, the first local grid map is obtained. In the embodiment of the application, this way of determining the overlapping grid values not only accumulates static points and improves accuracy, but also reduces the amount of calculation and improves efficiency.
In step S608, the local grid map translation module sends the first local grid map to the registration module.
In step S609, the static point extraction module sends the static point information to the registration module.
In the embodiment of the application, the yaw angle of the vehicle is determined by registering the static points in the current point cloud (i.e., the first frame point cloud) with the local grid map (i.e., the first local grid map) in which the static points of the history point cloud are accumulated. Therefore, the local grid translation module and the static point extraction module need to respectively send the first local grid map and the static point information to the registration module for registration processing.
In step S610, the registration module determines, according to the static point information in the first frame point cloud and the value of each grid in the first local grid map, the position of each static point in the first frame point cloud in the first local grid map, obtains a first Yaw angle yaw_est of the vehicle in the mth period, and determines the first Yaw rate Yawrate based on the first Yaw angle of the vehicle in the mth period, the second Yaw angle of the M-1 th period, and the first time interval.
In an embodiment of the present application, the value of each grid in the first local grid map indicates the static points accumulated in the grid based on the first historical frame point cloud. If the value of a certain grid in the first local grid map is larger, the probability that a static target exists in the grid is larger; if the value is smaller, the probability is smaller. The static points in the first frame point cloud also represent static targets, and a static target does not move in the ground coordinate system. The registration module may therefore rotate the first frame point cloud so that each static point in the first frame point cloud falls on a grid with a larger value in the first local grid map. Rotating the first frame point cloud represents rotating the radar, and correspondingly the vehicle; the yaw angle of the vehicle at this point is the first yaw angle.
In step S611, the registration module sends the first yaw rate Yawrate to the vehicle motion state estimation module.
In step S612, the radar speed estimation module transmits the longitudinal speed V rx of the radar and the lateral speed V ry of the radar to the vehicle motion state estimation module.
In step S613, the vehicle motion state estimation module determines the longitudinal speed V x of the vehicle, the lateral speed V y of the vehicle, and the yaw rate Yawrate of the vehicle based on the received longitudinal speed V rx of the radar, the lateral speed V ry of the radar, and the first yaw rate Yawrate.
In step S614, the registration module sends the first Yaw angle yaw_est to the stationary point rotation module.
In step S615, the static point extraction module sends the extracted static point to the static point rotation module.
In step S616, the static point rotation module rotates the plurality of static points in the first frame point cloud by the first Yaw angle yaw_est according to the first Yaw angle yaw_est, so as to obtain a plurality of corresponding second rotation points.
In step S617, the static rotation module sends the plurality of second rotation points to the local grid map updating module.
In step S618, the local grid map translation module sends the first local grid map to the local grid map update module.
Step S619, the local grid map updating module performs rasterization processing on a plurality of second rotation points according to a preset grid resolution to obtain grids of each second rotation point in the first local grid map; and accumulating the number of the second rotation points in the grids and the values of the grids in the first local grid map aiming at the grids where each second rotation point is positioned to obtain an updated first local grid map.
In step S620, the local grid map updating module sends the updated first local grid map to the local grid map storage module for storage.
In step S621, the radar speed estimation module sends the radar speed to the static point extraction module for the next period, and the static point extraction module performs static point extraction.
In the embodiment of the application, the static points in the current point cloud are registered with the local grid map in which the static points in the historical point cloud are accumulated, so that the situation of sparse static points in the single-frame point cloud is made up, the accuracy of the yaw angle of the vehicle is effectively improved, and the estimation precision and reliability of the yaw angle speed are further improved.
In one possible implementation manner, the point cloud acquisition module and the point cloud processing module shown in fig. 9 may be disposed in a radar, and the radar completes the acquisition of the point cloud and the processing of the point cloud, so as to obtain the motion state information of the vehicle. In another possible implementation manner, the point cloud collecting module shown in fig. 9 may be disposed in a radar, and the point cloud processing module may be disposed in an autopilot system or an auxiliary driving system of the vehicle, where the radar collects point clouds, and the autopilot system or the auxiliary driving system processes the point clouds, so as to obtain movement state information of the vehicle. According to the embodiment of the application, the setting positions and the setting modes of the point cloud acquisition module and the point cloud processing module are not limited.
The above-described method can be realized by the following means.
Fig. 10 is a schematic structural diagram of a motion state estimation device according to an embodiment of the present application. As shown in fig. 10, the apparatus 700 may include:
A first obtaining module 701, configured to obtain a first local grid map, where a value of each grid in the first local grid map is used to indicate a static point in the grid that is accumulated based on a first history point cloud;
The first registration module 702 is configured to register a static point in a first frame point cloud with the first local grid map acquired by the first acquisition module 701, so as to obtain a first yaw angle, where the first yaw angle represents a yaw angle of the target device when the first frame point cloud is acquired.
In one possible implementation, the apparatus further includes:
A second acquisition module for acquiring a second local grid map, wherein the value of each grid in the second local grid map is used for indicating static points accumulated in the grid based on a second history point cloud;
the second registration module is used for registering the static point in the second frame point cloud with the second local grid map acquired by the second acquisition module to obtain a second yaw angle, the second yaw angle represents the yaw angle of the target equipment when the second frame point cloud is acquired, and the acquisition time of the second frame point cloud is before the acquisition time of the first frame point cloud.
In one possible implementation, the apparatus further includes:
The first determining module is configured to determine a first yaw rate based on the first yaw angle obtained by the first registering module, the second yaw angle obtained by the second registering module, and a first time interval, where the first time interval represents a time interval between an acquisition time of the second frame point cloud and an acquisition time of the first frame point cloud, and the first yaw rate represents a yaw rate of the target device in the first time interval.
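A minimal sketch of the first determining module: the first yaw rate is the change between the two registered yaw angles divided by the first time interval. The wrap-around handling below is an assumption; the embodiment only specifies the three inputs.

```python
import math

def first_yaw_rate(first_yaw, second_yaw, first_time_interval):
    # Normalize the angle difference to [-pi, pi) before dividing by the interval.
    diff = (first_yaw - second_yaw + math.pi) % (2.0 * math.pi) - math.pi
    return diff / first_time_interval
```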
In one possible implementation, the first registration module includes:
An acquisition unit configured to acquire a plurality of candidate yaw angles;
a first determining unit configured to determine a grid accumulation value corresponding to each candidate yaw angle, where the grid accumulation value corresponding to the candidate yaw angle is used to represent a distribution of the static points accumulated based on the first history point cloud in the first local grid map when the static points are rotated to the candidate yaw angle;
A comparison unit configured to compare a grid accumulation value corresponding to each candidate yaw angle with a stored grid accumulation value, and update the stored grid accumulation value and the stored yaw angle based on a comparison result, where the stored grid accumulation value is used to represent a distribution of the static points accumulated based on the first history point cloud in the first local grid map when rotated to the stored yaw angle;
And a second determination unit configured to determine the first yaw angle from the stored yaw angles.
In a possible implementation manner, the acquiring unit is configured to:
Compensating the second yaw angle according to the second yaw rate and the first time interval to obtain a compensated yaw angle, wherein the second yaw rate represents the yaw rate of the target equipment in the second time interval, and the ending time of the second time interval is the starting time of the first time interval;
and generating the plurality of candidate yaw angles according to a preset angle resolution within a preset angle range of the compensation yaw angle.
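A sketch of this variant of the acquiring unit, assuming the compensation is a constant-yaw-rate prediction over the first time interval; angle_range and angle_resolution stand in for the preset angle range and the preset angle resolution.

```python
import numpy as np

def candidate_yaw_angles(second_yaw, second_yaw_rate, first_time_interval,
                         angle_range, angle_resolution):
    # Compensate the second yaw angle forward over the first time interval.
    compensated_yaw = second_yaw + second_yaw_rate * first_time_interval
    # Enumerate candidates within +/- angle_range around the compensated yaw angle.
    return np.arange(compensated_yaw - angle_range,
                     compensated_yaw + angle_range + 0.5 * angle_resolution,
                     angle_resolution)
```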
In a possible implementation manner, the acquiring unit is configured to:
acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars;
acquiring radar estimated yaw angles according to the plurality of radar speeds and the plurality of mounting angles;
And generating the plurality of candidate yaw angles according to the preset angle resolution within the preset angle range of the radar estimated yaw angle.
In one possible implementation manner, the first determining unit is configured to:
For any one of the plurality of candidate yaw angles:
Rotating a plurality of static points in the first frame point cloud according to the candidate yaw angle to obtain a plurality of first rotation points, wherein each static point in the first frame point cloud corresponds to one first rotation point;
according to a preset grid resolution, carrying out rasterization on the plurality of first rotation points to obtain grids of each first rotation point in the first local grid map;
and accumulating the value of the grid in which each first rotation point is located to obtain the grid accumulation value corresponding to the candidate yaw angle.
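A sketch of the first determining unit for a single candidate yaw angle, reusing the illustrative rasterization convention from the earlier update sketch; again, every name is illustrative rather than taken from the embodiment.

```python
import numpy as np

def grid_accumulation_value(grid_map, static_points, candidate_yaw,
                            resolution, origin):
    """Rotate the static points by the candidate yaw angle, rasterize them,
    and sum the map values of the cells the first rotation points fall into."""
    c, s = np.cos(candidate_yaw), np.sin(candidate_yaw)
    rotated = static_points @ np.array([[c, -s], [s, c]]).T   # first rotation points
    cells = np.floor((rotated - origin) / resolution).astype(int)
    score = 0.0
    for ix, iy in cells:
        if 0 <= ix < grid_map.shape[0] and 0 <= iy < grid_map.shape[1]:
            score += grid_map[ix, iy]
    return score
```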
In a possible implementation, the comparing unit is configured to:
Updating the stored grid accumulation value to the grid accumulation value corresponding to the candidate yaw angle and updating the stored yaw angle to the candidate yaw angle under the condition that the grid accumulation value corresponding to the candidate yaw angle is larger than the stored grid accumulation value;
and in the case that the grid accumulation value corresponding to the candidate yaw angle is equal to the stored grid accumulation value, keeping the stored grid accumulation value unchanged and adding the candidate yaw angle to the stored yaw angles.
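Building on the grid_accumulation_value sketch above, the comparison unit and the second determination unit can be pictured as a best-score search with tie handling; the initial stored value of -1 and the final averaging over tied yaw angles are assumptions, since the embodiment only requires that the first yaw angle be determined from the stored yaw angles.

```python
def register_first_yaw(grid_map, static_points, candidate_yaws,
                       resolution, origin):
    stored_value = -1.0      # initial stored grid accumulation value (assumed)
    stored_yaws = []         # stored yaw angles
    for yaw in candidate_yaws:
        value = grid_accumulation_value(grid_map, static_points, yaw,
                                        resolution, origin)
        if value > stored_value:        # strictly larger: replace stored value and yaw
            stored_value, stored_yaws = value, [yaw]
        elif value == stored_value:     # equal: add the candidate to the stored yaws
            stored_yaws.append(yaw)
    # One possible way to determine the first yaw angle from the stored yaws.
    return sum(stored_yaws) / len(stored_yaws)
```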
In one possible implementation manner, the first obtaining module is configured to:
Acquiring a third local grid map, wherein the value of each grid in the third local grid map represents the number of static points accumulated in the grid based on the first historical point cloud;
Determining a first translation distance according to a first movement speed, the first time interval and a first translation allowance, wherein the first movement speed represents the speed of the target equipment in the first time interval under a ground coordinate system, the first translation allowance represents a difference between the translation amount of a local grid map and the actual translation amount of the target equipment in a second time interval, the ending moment of the second time interval is the same as the starting moment of the first time interval, and the first translation distance is used for indicating the actual translation amount of the target equipment in the first time interval;
determining the number of the translation grids according to the first translation distance and a preset grid resolution;
And carrying out translation processing on the third local grid map according to the number of the translation grids to obtain the first local grid map, wherein the grids of the first local grid map comprise overlapped grids overlapped with the third local grid map and extended grids which are not overlapped with the third local grid map, the values of the overlapped grids are consistent with the values of corresponding grids in the third local grid map, and the values of the extended grids are initial values.
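The translation of the local grid map can be sketched as follows; combining the previous translation allowance with the distance travelled, rounding to whole cells, and carrying the sub-cell remainder forward as the new allowance are assumptions consistent with, but not dictated by, the description above.

```python
import numpy as np

def translate_grid_map(grid_map, ground_speed, first_time_interval,
                       prev_allowance, resolution):
    """Shift the local grid map by the motion of the target device, keep the
    overlapping cells, reset the newly exposed (extended) cells to the
    initial value, and return the residual translation allowance."""
    # First translation distance: motion in this interval plus the leftover allowance.
    distance = np.asarray(ground_speed) * first_time_interval + np.asarray(prev_allowance)
    shift = np.round(distance / resolution).astype(int)   # number of translation grids
    allowance = distance - shift * resolution             # remainder for the next interval

    translated = np.zeros_like(grid_map)                  # extended cells start at 0
    nx, ny = grid_map.shape
    # Shifts larger than the map size are not handled in this sketch.
    src_x = slice(max(0, shift[0]), nx + min(0, shift[0]))
    dst_x = slice(max(0, -shift[0]), nx + min(0, -shift[0]))
    src_y = slice(max(0, shift[1]), ny + min(0, shift[1]))
    dst_y = slice(max(0, -shift[1]), ny + min(0, -shift[1]))
    translated[dst_x, dst_y] = grid_map[src_x, src_y]     # overlapping cells keep their values
    return translated, allowance
```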
In one possible implementation, the apparatus further includes:
a third acquisition module, configured to acquire the first frame point cloud, where the first frame point cloud is used to indicate azimuth angles and doppler speeds of multiple target points;
An extracting module, configured to extract a static point in the first frame point cloud from the plurality of target points based on azimuth angles and doppler velocities of the plurality of target points;
A second determining module, configured to determine a first radar speed according to an azimuth angle and a doppler speed of the static point, where the first radar speed represents a speed of the radar in the first time interval under a radar coordinate system;
And a third determining module for determining the first movement speed according to the first radar speed, the mounting angle of the first radar and the second yaw angle.
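A sketch of the third determining module: the radar-frame velocity is rotated into the vehicle frame with the mounting angle and then into the ground frame with the second yaw angle. The 2-D counter-clockwise rotation convention is an assumption.

```python
import numpy as np

def rot2d(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

def first_movement_speed(first_radar_speed, mounting_angle, second_yaw):
    # Radar frame -> target device (vehicle) frame -> ground frame.
    v_vehicle = rot2d(mounting_angle) @ np.asarray(first_radar_speed)
    return rot2d(second_yaw) @ v_vehicle
```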
In one possible implementation, the apparatus further includes:
a fourth acquisition module for acquiring a plurality of radar speeds and a plurality of installation angles based on a plurality of radars;
A fifth obtaining module, configured to obtain the first target device speed according to the plurality of radar speeds and the plurality of installation angles, where the first target device speed represents a speed of the target device in the first time interval under a target device coordinate system;
and a fourth determining module, configured to determine the first movement speed according to the first target device speed and the second yaw angle.
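For the multi-radar variant, one simple (assumed) fusion is to rotate each radar's velocity into the vehicle frame with its mounting angle, average the results, and then rotate into the ground frame with the second yaw angle; the embodiment does not prescribe a specific fusion rule.

```python
import numpy as np

def rot2d(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

def first_target_device_speed(radar_speeds, mounting_angles):
    # Rotate each radar-frame velocity into the vehicle frame and average them.
    in_vehicle = [rot2d(a) @ np.asarray(v) for v, a in zip(radar_speeds, mounting_angles)]
    return np.mean(in_vehicle, axis=0)

def first_movement_speed_multi(radar_speeds, mounting_angles, second_yaw):
    # Vehicle frame -> ground frame using the second yaw angle.
    return rot2d(second_yaw) @ first_target_device_speed(radar_speeds, mounting_angles)
```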
In one possible implementation, the apparatus further includes:
The rotating module is used for rotating the plurality of static points in the first frame point cloud according to the first yaw angle to obtain a plurality of second rotation points, wherein each static point in the first frame point cloud corresponds to one second rotation point;
The processing module is used for rasterizing the plurality of second rotation points according to a preset grid resolution to obtain grids of each second rotation point in the first local grid map;
And the accumulating module is used for accumulating the number of the second rotation points in the grids and the values of the grids in the first local grid map aiming at the grids where each second rotation point is positioned to obtain an updated first local grid map.
In one possible implementation, the apparatus further includes:
the construction module is used for constructing an initial local grid map according to the preset map size and the preset grid resolution, wherein the value of each grid in the initial local grid map is an initial value, and the origin of coordinates of the initial local grid map is at a preset fixed position of the target equipment;
The setting module is used for respectively setting an initial yaw angle, an initial yaw rate, an initial translation allowance and an initial stored yaw angle, wherein the value of a grid accumulated value corresponding to the initial stored yaw angle is an initial value.
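A sketch of the initialization described by the construction and setting modules; the concrete map size, resolution and initial values below are purely illustrative.

```python
import numpy as np

PRESET_MAP_SIZE_M = 100.0    # preset map size (illustrative)
PRESET_RESOLUTION_M = 0.2    # preset grid resolution (illustrative)

n_cells = int(PRESET_MAP_SIZE_M / PRESET_RESOLUTION_M)
grid_map = np.zeros((n_cells, n_cells))    # every cell starts at the initial value 0

state = {
    "yaw": 0.0,                            # initial yaw angle
    "yaw_rate": 0.0,                       # initial yaw rate
    "translation_allowance": np.zeros(2),  # initial translation allowance
    "stored_yaw": [0.0],                   # initial stored yaw angle
    "stored_value": 0.0,                   # its grid accumulation value starts at an initial value
}
```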
In the embodiment of the application, the static points in the current point cloud are registered with the local grid map in which the static points of the historical point clouds are accumulated, which compensates for the sparsity of static points in a single-frame point cloud, effectively improves the accuracy of the vehicle yaw angle, and improves the estimation precision and reliability of the yaw rate.
An embodiment of the present application provides a motion state estimation apparatus, including: a processor and a memory for storing processor-executable instructions; wherein the processor is configured to implement the above-described method when executing the instructions.
The embodiment of the application provides electronic equipment. The electronic device may perform the above-described method. Fig. 11 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
As shown in fig. 11, the electronic device may include at least one processor 301, memory 302, input-output devices 303, and a bus 304. The following describes each constituent element of the electronic device in detail with reference to fig. 11:
The processor 301 is a control center of the electronic device, and may be one processor or a collective term for a plurality of processing elements. For example, the processor 301 may be a CPU, an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present application, for example: one or more digital signal processors (DSPs), or one or more field programmable gate arrays (FPGAs).
Among other things, the processor 301 may perform various functions of the electronic device by running or executing software programs stored in the memory 302, and invoking data stored in the memory 302.
In a particular implementation, processor 301 may include one or more CPUs, such as CPU 0 and CPU 1 shown in the figures, as an example.
In a particular implementation, as one embodiment, an electronic device may include multiple processors, such as processor 301 and processor 305 shown in FIG. 11. Each of these processors may be a single-core processor (single-CPU) or a multi-core processor (multi-CPU). A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Memory 302 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 302 may be stand-alone and coupled to the processor 301 via a bus 304. The memory 302 may also be integrated with the processor 301.
The input-output device 303 is used for communicating with other devices or communication networks, for example an Ethernet, a radio access network (RAN), or a wireless local area network (WLAN). The input-output device 303 may include all or part of a baseband processor and may optionally include a radio frequency (RF) processor. The RF processor is used for receiving and transmitting RF signals, and the baseband processor is used for processing baseband signals converted from RF signals or baseband signals to be converted into RF signals.
In a specific implementation, as an embodiment, the input output device 303 may include a transmitter and a receiver. Wherein the transmitter is used for transmitting signals to other devices or communication networks, and the receiver is used for receiving signals transmitted by other devices or communication networks. The transmitter and receiver may be independent or may be integrated.
Bus 304 may be an industry standard architecture (ISA) bus, a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like. The bus may be classified as an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 11, but this does not mean that there is only one bus or only one type of bus.
The device structure shown in fig. 11 does not constitute a limitation of the electronic device, which may include more or fewer components than shown, or combine certain components, or use a different arrangement of components.
Embodiments of the present application provide a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method.
Embodiments of the present application provide a computer program product comprising a computer readable code, or a non-transitory computer readable storage medium carrying computer readable code, which when run in a processor of an electronic device, performs the above method.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanical encoding device such as a punch card or a raised structure in a groove having instructions stored thereon, and any suitable combination of the foregoing.
The computer readable program instructions or code described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present application may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including object oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the application are implemented by personalizing electronic circuitry, such as programmable logic circuitry, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), with state information of computer-readable program instructions.
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by hardware, such as circuits or ASICs (application-specific integrated circuits) that perform the corresponding functions or acts, or by combinations of hardware and software, such as firmware.
Although the invention is described herein in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
The foregoing description of embodiments of the application has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (31)

  1. A method of motion state estimation, the method comprising:
    acquiring a first local grid map, wherein the value of each grid in the first local grid map is used for indicating static points accumulated in the grid based on a first historical point cloud;
    Registering a static point in a first frame point cloud with the first local grid map to obtain a first yaw angle, wherein the first yaw angle represents the yaw angle of target equipment when the first frame point cloud is acquired.
  2. The method according to claim 1, wherein the method further comprises:
    Acquiring a second local grid map, wherein the value of each grid in the second local grid map is used for indicating static points accumulated in the grid based on a second historical point cloud;
    registering static points in a second frame point cloud with the second local grid map to obtain a second yaw angle, wherein the second yaw angle represents the yaw angle of the target device when the second frame point cloud is acquired, and the acquisition time of the second frame point cloud is before the acquisition time of the first frame point cloud.
  3. The method according to claim 2, wherein the method further comprises:
    a first yaw rate is determined based on the first yaw angle, the second yaw angle, and a first time interval, the first time interval representing a time interval between a time of acquisition of the second frame point cloud and a time of acquisition of the first frame point cloud, the first yaw rate representing a yaw rate of the target device within the first time interval.
  4. The method of claim 3, wherein the registering a static point in a first frame point cloud with the first local grid map to obtain a first yaw angle comprises:
    Acquiring a plurality of candidate yaw angles;
    Determining a grid accumulation value corresponding to each candidate yaw angle, wherein the grid accumulation value corresponding to the candidate yaw angle is used for representing the distribution condition of the static points accumulated based on the first history point cloud in the first local grid map under the condition of rotating to the candidate yaw angle;
    Comparing a grid accumulation value corresponding to each candidate yaw angle with a stored grid accumulation value, and updating the stored grid accumulation value and a stored yaw angle based on a comparison result, wherein the stored grid accumulation value is used for representing the distribution condition of static points accumulated based on a first history point cloud in the first local grid map under the condition of rotating to the stored yaw angle;
    the first yaw angle is determined from the stored yaw angles.
  5. The method of claim 4, wherein the obtaining a plurality of candidate yaw angles comprises:
    Compensating the second yaw angle according to the second yaw rate and the first time interval to obtain a compensated yaw angle, wherein the second yaw rate represents the yaw rate of the target equipment in the second time interval, and the ending time of the second time interval is the starting time of the first time interval;
    and generating the plurality of candidate yaw angles according to a preset angle resolution within a preset angle range of the compensation yaw angle.
  6. The method of claim 4, wherein the obtaining a plurality of candidate yaw angles comprises:
    acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars;
    acquiring radar estimated yaw angles according to the plurality of radar speeds and the plurality of mounting angles;
    And generating the plurality of candidate yaw angles according to the preset angle resolution within the preset angle range of the radar estimated yaw angle.
  7. The method according to any one of claims 4 to 6, wherein the determining a grid cumulative value for each candidate yaw angle includes:
    For any one of the plurality of candidate yaw angles:
    Rotating a plurality of static points in the first frame point cloud according to the candidate yaw angle to obtain a plurality of first rotation points, wherein each static point in the first frame point cloud corresponds to one first rotation point;
    according to a preset grid resolution, carrying out rasterization on the plurality of first rotation points to obtain grids of each first rotation point in the first local grid map;
    and accumulating the value of the grid in which each first rotation point is located to obtain the grid accumulation value corresponding to the candidate yaw angle.
  8. The method according to any one of claims 4 to 7, wherein updating the stored grid accumulation value and the stored yaw angle based on the comparison result includes:
    Updating the stored grid accumulation value to the grid accumulation value corresponding to the candidate yaw angle and updating the stored yaw angle to the candidate yaw angle under the condition that the grid accumulation value corresponding to the candidate yaw angle is larger than the stored grid accumulation value;
    and in the case that the grid accumulation value corresponding to the candidate yaw angle is equal to the stored grid accumulation value, keeping the stored grid accumulation value unchanged and adding the candidate yaw angle to the stored yaw angles.
  9. The method of any of claims 4 to 8, wherein the acquiring a first local grid map comprises:
    Acquiring a third local grid map, wherein the value of each grid in the third local grid map represents static points accumulated in the grid based on the first history point cloud;
    Determining a first translation distance according to a first movement speed, the first time interval and a first translation allowance, wherein the first movement speed represents the speed of the target equipment in the first time interval under a ground coordinate system, the first translation allowance represents a difference between the translation amount of a local grid map and the actual translation amount of the target equipment in a second time interval, the ending moment of the second time interval is the same as the starting moment of the first time interval, and the first translation distance is used for indicating the actual translation amount of the target equipment in the first time interval;
    determining the number of the translation grids according to the first translation distance and a preset grid resolution;
    And carrying out translation processing on the third local grid map according to the number of the translation grids to obtain the first local grid map, wherein the grids of the first local grid map comprise overlapped grids overlapped with the third local grid map and extended grids which are not overlapped with the third local grid map, the values of the overlapped grids are consistent with the values of corresponding grids in the third local grid map, and the values of the extended grids are initial values.
  10. The method according to claim 9, wherein the method further comprises:
    acquiring the first frame point cloud, wherein the first frame point cloud is used for indicating azimuth angles and Doppler speeds of a plurality of target points;
    Extracting a static point in the first frame point cloud from the plurality of target points based on azimuth angles and Doppler speeds of the plurality of target points;
    Determining a first radar speed according to the azimuth angle and the Doppler speed of the static point, wherein the first radar speed represents the speed of the radar in the first time interval under a radar coordinate system;
    determining the first movement speed according to the first radar speed, the mounting angle of the first radar, and the second yaw angle.
  11. The method according to claim 9, wherein the method further comprises:
    acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars;
    acquiring a first target device speed according to the radar speeds and the installation angles, wherein the first target device speed represents the speed of the target device in the first time interval under a target device coordinate system;
    determining the first movement speed according to the first target device speed and the second yaw angle.
  12. The method according to any one of claims 1 to 11, further comprising:
    Rotating a plurality of static points in the first frame point cloud according to the first yaw angle to obtain a plurality of second rotation points, wherein each static point in the first frame point cloud corresponds to one second rotation point;
    according to the preset grid resolution, carrying out rasterization on the plurality of second rotation points to obtain grids of each second rotation point in the first local grid map;
    And accumulating the number of the second rotation points in the grids and the values of the grids in the first local grid map aiming at the grids where each second rotation point is positioned to obtain an updated first local grid map.
  13. The method according to any one of claims 1 to 12, further comprising:
    according to the preset map size and the preset grid resolution, an initial local grid map is constructed, the value of each grid in the initial local grid map is an initial value, and the origin of coordinates of the initial local grid map is at a preset fixed position of the target equipment;
    and respectively setting an initial yaw angle, an initial yaw rate, an initial translation allowance and an initial stored yaw angle, wherein the value of a grid accumulated value corresponding to the initial stored yaw angle is an initial value.
  14. A motion state estimation apparatus, the apparatus comprising:
    a first acquisition module for acquiring a first local grid map, wherein the value of each grid in the first local grid map is used for indicating static points accumulated in the grid based on a first history point cloud;
    The first registration module is used for registering the static point in the first frame point cloud with the first local grid map acquired by the first acquisition module to obtain a first yaw angle, and the first yaw angle represents the yaw angle of the target device when the first frame point cloud is acquired.
  15. The apparatus of claim 14, wherein the apparatus further comprises:
    A second acquisition module for acquiring a second local grid map, wherein the value of each grid in the second local grid map is used for indicating static points accumulated in the grid based on a second history point cloud;
    the second registration module is used for registering the static point in the second frame point cloud with the second local grid map acquired by the second acquisition module to obtain a second yaw angle, the second yaw angle represents the yaw angle of the target equipment when the second frame point cloud is acquired, and the acquisition time of the second frame point cloud is before the acquisition time of the first frame point cloud.
  16. The apparatus of claim 15, wherein the apparatus further comprises:
    The first determining module is configured to determine a first yaw rate based on the first yaw angle obtained by the first registering module, the second yaw angle obtained by the second registering module, and a first time interval, where the first time interval represents a time interval between an acquisition time of the second frame point cloud and an acquisition time of the first frame point cloud, and the first yaw rate represents a yaw rate of the target device in the first time interval.
  17. The apparatus of claim 16, wherein the first registration module comprises:
    An acquisition unit configured to acquire a plurality of candidate yaw angles;
    A first determining unit configured to determine a grid accumulation value corresponding to each candidate yaw angle, where the grid accumulation value corresponding to the candidate yaw angle is used to represent a distribution of the static points accumulated based on the first history point cloud in the first local grid map when the static points are rotated to the candidate yaw angle;
    A comparison unit configured to compare a grid accumulation value corresponding to each candidate yaw angle with a stored grid accumulation value, and update the stored grid accumulation value and the stored yaw angle based on a comparison result, where the stored grid accumulation value is used to represent a distribution of the static points accumulated based on the first history point cloud in the first local grid map when rotated to the stored yaw angle;
    And a second determination unit configured to determine the first yaw angle from the stored yaw angles.
  18. The apparatus of claim 17, wherein the acquisition unit is configured to:
    compensating the second yaw angle according to the second yaw rate and the first time interval to obtain a compensated yaw angle, wherein the second yaw rate represents the yaw rate of the target equipment in the second time interval, and the ending time of the second time interval is the starting time of the first time interval;
    and generating the plurality of candidate yaw angles according to a preset angle resolution within a preset angle range of the compensation yaw angle.
  19. The apparatus of claim 17, wherein the acquisition unit is configured to:
    acquiring a plurality of radar speeds and a plurality of mounting angles based on a plurality of radars;
    acquiring radar estimated yaw angles according to the plurality of radar speeds and the plurality of mounting angles;
    And generating the plurality of candidate yaw angles according to the preset angle resolution within the preset angle range of the radar estimated yaw angle.
  20. The apparatus according to any one of claims 17 to 19, wherein the first determining unit is configured to:
    For any one of the plurality of candidate yaw angles:
    Rotating a plurality of static points in the first frame point cloud according to the candidate yaw angle to obtain a plurality of first rotation points, wherein each static point in the first frame point cloud corresponds to one first rotation point;
    according to a preset grid resolution, carrying out rasterization on the plurality of first rotation points to obtain grids of each first rotation point in the first local grid map;
    and accumulating the value of the grid in which each first rotation point is located to obtain the grid accumulation value corresponding to the candidate yaw angle.
  21. The apparatus according to any one of claims 17 to 20, wherein the comparing unit is configured to:
    Updating the stored grid accumulation value to the grid accumulation value corresponding to the candidate yaw angle and updating the stored yaw angle to the candidate yaw angle under the condition that the grid accumulation value corresponding to the candidate yaw angle is larger than the stored grid accumulation value;
    and in the case that the grid accumulation value corresponding to the candidate yaw angle is equal to the stored grid accumulation value, keeping the stored grid accumulation value unchanged and adding the candidate yaw angle to the stored yaw angles.
  22. The apparatus of any one of claims 17 to 21, wherein the first acquisition module is configured to:
    Acquiring a third local grid map, wherein the value of each grid in the third local grid map represents static points accumulated in the grid based on the first history point cloud;
    Determining a first translation distance according to a first movement speed, the first time interval and a first translation allowance, wherein the first movement speed represents the speed of the target equipment in the first time interval under a ground coordinate system, the first translation allowance represents a difference between the translation amount of a local grid map and the actual translation amount of the target equipment in a second time interval, the ending moment of the second time interval is the same as the starting moment of the first time interval, and the first translation distance is used for indicating the actual translation amount of the target equipment in the first time interval;
    determining the number of the translation grids according to the first translation distance and a preset grid resolution;
    And carrying out translation processing on the third local grid map according to the number of the translation grids to obtain the first local grid map, wherein the grids of the first local grid map comprise overlapped grids overlapped with the third local grid map and extended grids which are not overlapped with the third local grid map, the values of the overlapped grids are consistent with the values of corresponding grids in the third local grid map, and the values of the extended grids are initial values.
  23. The apparatus of claim 22, wherein the apparatus further comprises:
    a third acquisition module, configured to acquire the first frame point cloud, where the first frame point cloud is used to indicate azimuth angles and doppler speeds of multiple target points;
    An extracting module, configured to extract a static point in the first frame point cloud from the plurality of target points based on azimuth angles and doppler velocities of the plurality of target points;
    A second determining module, configured to determine a first radar speed according to an azimuth angle and a doppler speed of the static point, where the first radar speed represents a speed of the radar in the first time interval under a radar coordinate system;
    And a third determining module for determining the first movement speed according to the first radar speed, the mounting angle of the first radar and the second yaw angle.
  24. The apparatus of claim 22, wherein the apparatus further comprises:
    a fourth acquisition module for acquiring a plurality of radar speeds and a plurality of installation angles based on a plurality of radars;
    A fifth obtaining module, configured to obtain a first target device speed according to the plurality of radar speeds and the plurality of installation angles, where the first target device speed represents a speed of the target device in the first time interval under a target device coordinate system;
    and a fourth determining module, configured to determine the first movement speed according to the first target device speed and the second yaw angle.
  25. The apparatus according to any one of claims 14 to 24, further comprising:
    The rotating module is used for rotating the plurality of static points in the first frame point cloud according to the first yaw angle to obtain a plurality of second rotation points, wherein each static point in the first frame point cloud corresponds to one second rotation point;
    The processing module is used for rasterizing the plurality of second rotation points according to a preset grid resolution to obtain grids of each second rotation point in the first local grid map;
    And the accumulating module is used for accumulating the number of the second rotation points in the grids and the values of the grids in the first local grid map aiming at the grids where each second rotation point is positioned to obtain an updated first local grid map.
  26. The apparatus according to any one of claims 14 to 25, further comprising:
    the construction module is used for constructing an initial local grid map according to the preset map size and the preset grid resolution, wherein the value of each grid in the initial local grid map is an initial value, and the origin of coordinates of the initial local grid map is at a preset fixed position of the target equipment;
    The setting module is used for respectively setting an initial yaw angle, an initial yaw rate, an initial translation allowance and an initial stored yaw angle, wherein the value of a grid accumulated value corresponding to the initial stored yaw angle is an initial value.
  27. A motion state estimation apparatus, comprising:
    A processor;
    a memory for storing processor-executable instructions;
    wherein the processor is configured to implement the method of any one of claims 1 to 13 when executing the instructions.
  28. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the method of any of claims 1 to 13.
  29. A computer program product comprising computer readable code, or a non-transitory computer readable storage medium carrying computer readable code, which when run in an electronic device, a processor in the electronic device performs the method of any one of claims 1 to 13.
  30. A terminal comprising a motion state estimation device according to any one of claims 14 to 26 or a motion state estimation device according to claim 27.
  31. The terminal of claim 30, wherein the terminal is a vehicle.
CN202180100707.8A 2021-11-18 2021-11-18 Motion state estimation method and device Pending CN117999207A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/131425 WO2023087202A1 (en) 2021-11-18 2021-11-18 Motion state estimation method and apparatus

Publications (1)

Publication Number Publication Date
CN117999207A true CN117999207A (en) 2024-05-07

Family

ID=86396136

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180100707.8A Pending CN117999207A (en) 2021-11-18 2021-11-18 Motion state estimation method and device

Country Status (2)

Country Link
CN (1) CN117999207A (en)
WO (1) WO2023087202A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116859406B (en) * 2023-09-05 2023-11-28 武汉煜炜光学科技有限公司 Calculation method and device for vehicle speed based on laser radar

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005014954A1 (en) * 2005-04-01 2006-10-05 Audi Ag Measured yaw angle deviation determining method for motor vehicle, involves determining measured yaw angle and distance of assigned point of vehicle to lane limit, from data and determining real yaw angle from distance and speed information
CN108162976A (en) * 2017-12-21 2018-06-15 江苏大学 A kind of vehicle running state method of estimation based on sparse grid quadrature Kalman filtering
CN110045376B (en) * 2019-04-28 2021-06-01 森思泰克河北科技有限公司 Drivable region acquisition method, computer-readable storage medium, and terminal device
CN111684382B (en) * 2019-06-28 2024-06-11 深圳市卓驭科技有限公司 Mobile platform state estimation method, system, mobile platform and storage medium
CN112767485B (en) * 2021-01-26 2023-07-07 哈尔滨工业大学(深圳) Point cloud map creation and scene identification method based on static semantic information
CN113093221A (en) * 2021-03-31 2021-07-09 东软睿驰汽车技术(沈阳)有限公司 Generation method and device of grid-occupied map

Also Published As

Publication number Publication date
WO2023087202A1 (en) 2023-05-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination