CN114594467A - Course angle determining method and device, electronic equipment and storage medium - Google Patents

Course angle determining method and device, electronic equipment and storage medium

Info

Publication number
CN114594467A
Authority
CN
China
Prior art keywords
detection point
timestamp
determining
detection
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210185678.6A
Other languages
Chinese (zh)
Inventor
赵加友
朱奇峰
方勇军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202210185678.6A
Publication of CN114594467A
Legal status: Pending

Classifications

    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/66 Radar-tracking systems; analogous systems
    • G01S17/66 Tracking systems using electromagnetic waves other than radio waves
    • G01S7/41 Analysis of the radar echo signal for target characterisation; target signature; target cross-section
    • G01S7/4802 Analysis of the lidar echo signal for target characterisation
    • G06F18/25 Pattern recognition; fusion techniques
    • G06N3/045 Neural networks; combinations of networks
    • G06N3/08 Neural networks; learning methods
    • G06T7/10 Image analysis; segmentation; edge detection

Abstract

The application discloses a course angle determining method and apparatus, an electronic device, and a storage medium, which are used to improve the accuracy of determining the course angle of a moving vehicle. In the embodiments of the application, a target scene is first continuously detected with a millimeter wave radar and a laser radar to obtain a first detection point cluster and a second detection point cloud, respectively. The first detection point cluster and the second detection point cloud are then subjected to time synchronization processing and space synchronization processing. The coordinates of each synchronized first detection point are expanded into a region, and the second detection points falling within that expansion region are determined. Each such second detection point is associated with the first detection point, and the mutually associated points are subjected to information fusion to obtain fusion detection points. Finally, the fusion detection point cloud formed by the fusion detection points is segmented to obtain a fusion detection point cloud corresponding to each vehicle in the target scene, and the course angle of each vehicle is determined based on its fusion detection point cloud.

Description

Course angle determining method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of radar technologies, and in particular, to a method and an apparatus for determining a course angle, an electronic device, and a storage medium.
Background
The course angle of a moving vehicle is a very important parameter when estimating the vehicle's motion state at the next moment.
In the related art, the course angle of a moving vehicle is usually estimated with either a millimeter wave radar or a laser radar alone. When a millimeter wave radar is used, the radar first measures the position of the vehicle and its Doppler information, and a corresponding course angle estimation model is then established in combination with a selected tracking model; the measurement accuracy of this approach is limited by the millimeter wave radar's error in measuring the position of the moving vehicle, its Doppler measurement error, and the error of the course angle estimation model. When a laser radar is used, the point cloud obtained by detection is first segmented, the shape of the point cloud is then estimated, and the course angle of the moving vehicle is derived from that shape.
Disclosure of Invention
The present application aims to provide a course angle determining method and apparatus, an electronic device, and a storage medium, so as to improve the accuracy of determining the course angle of a moving vehicle.
In a first aspect, an embodiment of the present application provides a method for determining a heading angle, where the method includes:
continuously detecting a target scene by adopting a millimeter wave radar to obtain a continuous multi-frame first detection point cluster, and continuously detecting the target scene by adopting a laser radar to obtain a continuous multi-frame second detection point cloud; wherein the target scene comprises at least one vehicle;
performing time synchronization processing and space synchronization processing on the first detection point cluster and the second detection point cloud to obtain a first detection point cluster and a second detection point cloud which have association relation in time and space;
performing area extension on the coordinate of each first detection point in the first detection point cluster after synchronous processing according to the standard deviation to obtain an extension area of the first detection point, and determining a second detection point in the extension area according to the coordinate of the second detection point; wherein the standard deviation is a standard deviation of a physical parameter associated with the first detection point;
associating a second detection point in the extension area with the first detection point, and performing information fusion on the first detection point and the second detection point which are associated with each other to obtain a fusion detection point; the fusion detection points form a fusion detection point cloud;
and carrying out segmentation processing on the fusion detection point cloud formed by the fusion detection points to obtain a fusion detection point cloud corresponding to each vehicle in the target scene, and determining the course angle of each vehicle based on the fusion detection point cloud.
In this method, the millimeter wave radar and the laser radar jointly detect the moving vehicle. The first detection point cluster and the second detection point cloud obtained by detection are associated with each other, and each first detection point is fused with its corresponding second detection point, so that a fusion detection point carries both the information detected by the millimeter wave radar and the information detected by the laser radar; the course angle is finally estimated from the fusion detection points, which improves the accuracy of the estimate.
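For orientation, the following is a minimal Python sketch of the five steps above; each stage is passed in as a callable because the disclosure elaborates the stages separately below, and all names are illustrative assumptions rather than the claimed interfaces.

```python
# A bird's-eye sketch of the claimed method; all helper callables and
# names are hypothetical illustrations, not the patented implementation.

def estimate_course_angles(radar_frames, lidar_frames, *,
                           synchronize, expand_region,
                           associate_and_fuse, segment_by_vehicle,
                           course_angle_of):
    course_angles = {}
    # Steps 1-2: frames come from continuous detection; time and space
    # synchronization yields associated cluster/cloud pairs.
    for cluster, cloud in synchronize(radar_frames, lidar_frames):
        # Step 3: one expansion region per first detection point.
        regions = [expand_region(point) for point in cluster]
        # Step 4: second detection points falling in a region are
        # associated with its first detection point and fused with it.
        fused_cloud = associate_and_fuse(cluster, regions, cloud)
        # Step 5: segment the fused cloud per vehicle, then estimate
        # one course angle per vehicle.
        for vehicle_id, vehicle_cloud in segment_by_vehicle(fused_cloud):
            course_angles[vehicle_id] = course_angle_of(vehicle_cloud)
    return course_angles
```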
In some possible embodiments, the time synchronization processing of the first detection point cluster and the second detection point cloud comprises:
for each second timestamp, determining a first calculation timestamp and a second calculation timestamp corresponding to the second timestamp; the first timestamps are obtained by marking each frame of first detection point clusters obtained by the millimeter wave radar through continuous detection of the target scene at a first preset frequency, and the second timestamps are obtained by marking each frame of second detection point clouds obtained by the laser radar through continuous detection of the target scene at a second preset frequency; wherein the first calculation timestamp is the first timestamp chronologically preceding and least spaced from the second timestamp, and the second calculation timestamp is the first timestamp chronologically following and least spaced from the second timestamp;
and determining an associated first detection point cluster having an association relation with a second detection point cloud corresponding to the second time stamp based on the first calculation time stamp, the second calculation time stamp and the second time stamp.
In the application, the first detection point cluster and the second detection point cloud are time-synchronized based on the marked timestamps, and the first detection point cluster and second detection point cloud that have an association relation in time are determined, which further improves the accuracy of the subsequent course angle determination.
In some possible embodiments, the determining, based on the first calculation timestamp, the second calculation timestamp and the second timestamp, an associated first detection point cluster having an association relation with the second detection point cloud corresponding to the second timestamp comprises:
acquiring third timestamps, wherein the third timestamps are marked at the second preset frequency, starting a first time interval after it is determined that the laser radar has marked its first second timestamp; wherein the first time interval is obtained from the first preset frequency;
for each third timestamp, determining a target second timestamp, namely the second timestamp that chronologically precedes the third timestamp and is least spaced from it;
acquiring a first calculation timestamp and a second calculation timestamp corresponding to the target second timestamp;
determining an associated first detection point cluster based on the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp;
and taking the associated first detection point cluster as a first detection point cluster which has an association relation in time with a second detection point cloud corresponding to the target second timestamp.
In the application, the third timestamps ensure that, by the time the associated first detection point cluster and second detection point cloud are computed, the data of both have already been detected, which guarantees the timeliness of the computation.
In some possible embodiments, the determining an associated first detection point cluster based on the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp comprises:
performing linear interpolation processing on the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster; or
performing mean value processing on the coordinates of the first detection points in the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster.
In the application, in order to ensure the accuracy of the finally obtained course angle, the associated first detection point cluster is obtained by combining the first detection point cluster corresponding to the first calculation timestamp with the first detection point cluster corresponding to the second calculation timestamp, which makes the course angle more accurate.
In some possible embodiments, the spatial synchronization processing of the first detection point cluster and the second detection point cloud includes:
performing coordinate transformation on the first detection points based on a rotation parameter and a translation parameter, and performing coordinate transformation on the second detection points based on the same rotation parameter and translation parameter, to obtain the coordinates of the first detection points and the coordinates of the second detection points after space synchronization; the rotation parameter is used for rotating the coordinate axes of the coordinate system corresponding to the first detection points and of the coordinate system corresponding to the second detection points so that they coincide with the coordinate axes of the space synchronization coordinate system; the translation parameter is used for determining the coordinate components of the origin of the space synchronization coordinate system in the coordinate system corresponding to the first detection points and in the coordinate system corresponding to the second detection points.
In the application, the first detection point cluster and the second detection point cloud are spatially synchronized so that the coordinates of the first detection points in the first detection point cluster and the coordinates of the second detection points in the second detection point cloud can be expressed in the same coordinate system.
In some possible embodiments, the rotation parameter and the translation parameter are obtained according to the following method:
measuring a target object by using a laser radar to obtain at least one measuring point of the target object and a coordinate corresponding to the measuring point; determining the mean value of the coordinates of the measuring points;
measuring the target object by adopting a millimeter wave radar to obtain a first coordinate of the target object;
determining a rotation parameter and a translation parameter based on the mean of the coordinates and the first coordinate.
In the application, the translation parameter and the rotation parameter are determined based on the measured values of the millimeter wave radar and the laser radar to the target object, so that the spatial synchronization of the first detection point cluster and the second detection point cloud is more accurate.
In some possible embodiments, the coordinates of the first detection point include a radial distance, an azimuth angle, and a pitch angle;
the area expansion of the coordinates of each first detection point in the first detection point cluster after the synchronous processing according to the standard deviation comprises:
performing the following process for each first detection point in the first detection point cluster:
determining the radial distance standard deviation, the azimuth angle standard deviation and the pitch angle standard deviation of the first detection point according to the signal-to-noise ratio of the first detection point;
determining a first difference between a radial distance of the first detection point and the radial distance standard deviation, and determining a first sum between the radial distance of the first detection point and the radial distance standard deviation;
determining an expansion area of the radial distance according to the first difference value and the first sum value;
determining a second difference between the azimuth of the first detection point and the azimuth standard deviation, and determining a second sum between the azimuth of the first detection point and the azimuth standard deviation;
determining an extension area of the azimuth according to the second difference value and the second sum value;
determining a third difference between the pitch angle of the first detection point and the pitch angle standard deviation, and determining a third sum between the pitch angle of the first detection point and the pitch angle standard deviation;
determining an expansion area of the pitch angle according to the third difference and the third sum;
and determining the expansion area of the first detection point according to the expansion area of the radial distance, the expansion area of the azimuth angle and the expansion area of the pitch angle.
In the embodiments of the application, region expansion of the first detection points ensures that each second detection point has a corresponding first detection point.
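The region construction above can be sketched as follows. The disclosure states only that the three standard deviations are determined from the signal-to-noise ratio of the first detection point, so the 1/sqrt(SNR) scaling and the base values used here are assumptions.

```python
import numpy as np

# A sketch of the gating-region construction; the 1/sqrt(SNR) model and
# the base deviations are illustrative assumptions.

def stddevs_from_snr(snr, base=(0.5, 2.0, 2.0)):
    """Return (sigma_range_m, sigma_azimuth_deg, sigma_pitch_deg),
    shrinking as the SNR grows (illustrative model)."""
    scale = 1.0 / np.sqrt(max(snr, 1.0))
    return tuple(b * scale for b in base)

def expansion_region(radial_m, azimuth_deg, pitch_deg, snr):
    """Builds the three intervals [value - sigma, value + sigma]."""
    s_r, s_az, s_el = stddevs_from_snr(snr)
    return {
        "radial":  (radial_m - s_r, radial_m + s_r),          # 1st diff/sum
        "azimuth": (azimuth_deg - s_az, azimuth_deg + s_az),  # 2nd diff/sum
        "pitch":   (pitch_deg - s_el, pitch_deg + s_el),      # 3rd diff/sum
    }

def falls_in_region(region, radial_m, azimuth_deg, pitch_deg):
    """True when a second detection point lies inside all three intervals."""
    return (region["radial"][0] <= radial_m <= region["radial"][1]
            and region["azimuth"][0] <= azimuth_deg <= region["azimuth"][1]
            and region["pitch"][0] <= pitch_deg <= region["pitch"][1])
```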
In some possible embodiments, associating a second detection point that falls within the extension area of a first detection point with that first detection point comprises:
if a plurality of second detection points fall in the extension area of the first detection point, determining the second detection point with the maximum signal-to-noise ratio in the extension area;
and taking the second detection point with the maximum signal-to-noise ratio as the second detection point associated with the first detection point.
In some possible embodiments, the first detection point cluster comprises a plurality of first detection points, and the second detection point cloud comprises a plurality of second detection points; the information fusion of a first detection point and a second detection point that are associated with each other to obtain a fusion detection point comprises the following steps:
the following procedure is performed for each second detection point:
taking the physical parameter information of the first detection point associated with the second detection point as the physical parameter information of the fusion detection point corresponding to the second detection point;
taking the coordinates of the second detection point after the space synchronization as the coordinates of the fusion detection point corresponding to the second detection point;
and forming the fusion detection point based on the physical parameter information of the fusion detection point and the coordinates of the fusion detection point.
In the application, the physical parameter information of the first detection point and the coordinates of the second detection point are adopted, so that the fusion detection point combines the advantages of the laser radar and the millimeter wave radar.
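Association and fusion can be sketched together as below; the dictionary field names are assumptions, and the gating predicate comes from the region expansion step described above.

```python
# A joint sketch of association and fusion: each first detection point
# keeps the lidar point with the highest SNR among those inside its
# expansion region, and the fused point takes the radar point's physical
# parameters together with the lidar point's coordinates.

def associate_and_fuse(first_points, second_points, falls_in_region):
    """first_points: dicts with 'region', 'doppler', 'rcs', 'snr'.
    second_points: dicts with 'coords' (assumed already converted to the
    radar's radial/azimuth/pitch terms after spatial synchronization)
    and 'snr'. falls_in_region is the gating predicate."""
    fused_cloud = []
    for fp in first_points:
        inside = [sp for sp in second_points
                  if falls_in_region(fp["region"], *sp["coords"])]
        if not inside:
            continue
        # If several second detection points fall in the region, keep
        # the one with the maximum signal-to-noise ratio.
        sp = max(inside, key=lambda p: p["snr"])
        fused_cloud.append({
            "coords": sp["coords"],    # coordinates from the lidar point
            "doppler": fp["doppler"],  # physical parameters from the radar
            "rcs": fp["rcs"],
            "snr": fp["snr"],
        })
    return fused_cloud
```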
In some possible embodiments, determining a heading angle for each vehicle based on the fused detection point cloud comprises:
performing, for each frame of the fused detection point cloud for each vehicle:
determining an initial course angle of the vehicle based on the fused detection point cloud;
screening out target fusion detection points according to the signal-to-noise ratio of each fusion detection point in the fusion detection point cloud;
determining a wheel position difference of the vehicle based on the fused detection point cloud and a fused detection point cloud of a previous frame;
and inputting the initial course angle, the target fusion detection point and the wheel position difference into a pre-trained motion model to obtain the course angle of the vehicle.
According to the method and the device, the course angle of the vehicle is obtained based on the fusion detection point cloud, and the accuracy of estimating the course angle of the vehicle is improved.
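A per-frame sketch of this procedure follows. The motion model is described only as pre-trained, so it appears here as an opaque callable; the initial course angle and wheel position difference computations shown are plausible stand-ins, since their details are deferred to the embodiments of figs. 10 and 11.

```python
import numpy as np

# A per-frame, per-vehicle sketch; the motion model is injected, and the
# initial heading / wheel-difference helpers are illustrative assumptions.

def course_angle_for_frame(cloud_xyz, cloud_snr, prev_cloud_xyz,
                           motion_model, snr_threshold=10.0):
    """cloud_xyz: (N, 3) fused points of one vehicle in one frame;
    cloud_snr: (N,) SNR per fused point; prev_cloud_xyz: previous frame."""
    # Initial course angle from the dominant planar direction of the
    # cloud (an assumption; the disclosure defers details to fig. 11).
    centered = cloud_xyz[:, :2] - cloud_xyz[:, :2].mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    initial_angle = float(np.arctan2(vt[0, 1], vt[0, 0]))

    # Screen target fusion detection points by SNR, per the embodiment.
    targets = cloud_xyz[cloud_snr >= snr_threshold]

    # Wheel position difference: displacement of the lowest points
    # (where the wheels would be) between consecutive frames.
    lowest = cloud_xyz[np.argsort(cloud_xyz[:, 2])[:4]].mean(axis=0)
    prev_lowest = prev_cloud_xyz[np.argsort(prev_cloud_xyz[:, 2])[:4]].mean(axis=0)
    wheel_diff = lowest - prev_lowest

    return motion_model(initial_angle, targets, wheel_diff)
```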
In a second aspect, the present application also provides a heading angle determining apparatus, the apparatus comprising:
the detection module is used for continuously detecting a target scene by adopting a millimeter wave radar to obtain a continuous multi-frame first detection point cluster, and continuously detecting the target scene by adopting a laser radar to obtain a continuous multi-frame second detection point cloud; wherein the target scene comprises at least one vehicle;
the synchronization module is used for carrying out time synchronization processing and space synchronization processing on the first detection point cluster and the second detection point cloud to obtain the first detection point cluster and the second detection point cloud which have association relation in time and space;
the extension module is used for carrying out area extension on the coordinate of each first detection point in the first detection point cluster after synchronous processing according to the standard deviation to obtain an extension area of the first detection point, and determining a second detection point in the extension area according to the coordinate of the second detection point; wherein the standard deviation is a standard deviation of a physical parameter associated with the first detection point;
the fusion module is used for associating the second detection point in the expansion area with the first detection point and carrying out information fusion on the first detection point and the second detection point which are associated with each other to obtain a fusion detection point; the fusion detection points form fusion detection point cloud;
and the course angle determining module is used for carrying out segmentation processing on the fusion detection point cloud formed by the fusion detection points to obtain the fusion detection point cloud corresponding to each vehicle in the target scene, and determining the course angle of each vehicle based on the fusion detection point cloud.
In some possible embodiments, the synchronization module, when performing time synchronization processing on the first detection point cluster and the second detection point cloud, is configured to:
for each second timestamp, determining a first calculation timestamp and a second calculation timestamp corresponding to the second timestamp; the first timestamps are obtained by marking each frame of first detection point clusters obtained by the millimeter wave radar through continuous detection of the target scene at a first preset frequency, and the second timestamps are obtained by marking each frame of second detection point clouds obtained by the laser radar through continuous detection of the target scene at a second preset frequency; wherein the first calculation timestamp is the first timestamp chronologically preceding and least spaced from the second timestamp, and the second calculation timestamp is the first timestamp chronologically following and least spaced from the second timestamp;
and determining an associated first detection point cluster having an association relation with a second detection point cloud corresponding to the second time stamp based on the first calculation time stamp, the second calculation time stamp and the second time stamp.
In some possible embodiments, the synchronization module, when determining, based on the first calculation timestamp, the second calculation timestamp and the second timestamp, an associated first detection point cluster having an association relation with the second detection point cloud corresponding to the second timestamp, is configured to:
acquiring third timestamps, wherein the third timestamps are marked at the second preset frequency, starting a first time interval after it is determined that the laser radar has marked its first second timestamp; wherein the first time interval is obtained from the first preset frequency;
for each third timestamp, determining a target second timestamp, namely the second timestamp that chronologically precedes the third timestamp and is least spaced from it;
acquiring a first calculation timestamp and a second calculation timestamp corresponding to the target second timestamp;
determining an associated first detection point cluster based on the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp;
and taking the associated first detection point cluster as the first detection point cluster which has an association relation in time with the second detection point cloud corresponding to the target second timestamp.
In some possible embodiments, the synchronization module, when determining the associated first detection point cluster based on the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp, is configured to:
performing linear interpolation processing on the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster; or
performing mean value processing on the coordinates of the first detection points in the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster.
In some possible embodiments, the synchronization module, when performing the spatial synchronization processing on the first detection point cluster and the second detection point cloud, is configured to:
performing coordinate conversion on the first detection points based on a rotation parameter and a translation parameter, and performing coordinate conversion on the second detection points based on the same rotation parameter and translation parameter, to obtain the coordinates of the first detection points and the coordinates of the second detection points after space synchronization; the rotation parameter is used for rotating the coordinate axes of the coordinate system corresponding to the first detection points and of the coordinate system corresponding to the second detection points so that they coincide with the coordinate axes of the space synchronization coordinate system; the translation parameter is used for determining the coordinate components of the origin of the space synchronization coordinate system in the coordinate system corresponding to the first detection points and in the coordinate system corresponding to the second detection points.
In some possible embodiments, the rotation parameter and the translation parameter are obtained according to the following method:
measuring a target object by using a laser radar to obtain at least one measuring point of the target object and a coordinate corresponding to the measuring point; determining the mean value of the coordinates of the measuring points;
measuring the target object by adopting a millimeter wave radar to obtain a first coordinate of the target object;
determining a rotation parameter and a translation parameter based on the mean of the coordinates and the first coordinate.
In some possible embodiments, the coordinates of the first detection point include a radial distance, an azimuth angle, and a pitch angle;
the extension module, when performing area extension of coordinates of each first detection point in the synchronized first detection point cluster according to a standard deviation, is configured to:
performing the following process for each first detection point in the first detection point cluster:
determining the radial distance standard deviation, the azimuth angle standard deviation and the pitch angle standard deviation of the first detection point according to the signal-to-noise ratio of the first detection point;
determining a first difference between a radial distance of the first detection point and the radial distance standard deviation, and determining a first sum between the radial distance of the first detection point and the radial distance standard deviation;
determining an expansion area of the radial distance according to the first difference value and the first sum value;
determining a second difference between the azimuth of the first detection point and the azimuth standard deviation, and determining a second sum between the azimuth of the first detection point and the azimuth standard deviation;
determining an extension area of the azimuth according to the second difference value and the second sum value;
determining a third difference between the pitch angle of the first detection point and the pitch angle standard deviation, and determining a third sum between the pitch angle of the first detection point and the pitch angle standard deviation;
determining an expansion area of the pitch angle according to the third difference and the third sum;
and determining the expansion area of the first detection point according to the expansion area of the radial distance, the expansion area of the azimuth angle and the expansion area of the pitch angle.
In some possible embodiments, the fusion module, when associating a second detection point falling in the extension area of a first detection point with that first detection point, is configured to:
if a plurality of second detection points fall in the extension area of the first detection point, determining the second detection point with the maximum signal-to-noise ratio in the extension area;
and taking the second detection point with the maximum signal-to-noise ratio as the second detection point associated with the first detection point.
In some possible embodiments, the first detection point cluster comprises a plurality of first detection points, and the second detection point cloud comprises a plurality of second detection points; the information fusion of a first detection point and a second detection point that are associated with each other to obtain a fusion detection point comprises the following steps:
the following procedure is performed for each second detection point:
taking the physical parameter information of the first detection point associated with the second detection point as the physical parameter information of the fusion detection point corresponding to the second detection point;
taking the coordinates of the second detection point after the space synchronization as the coordinates of the fusion detection point corresponding to the second detection point;
and forming the fusion detection point based on the physical parameter information of the fusion detection point and the coordinates of the fusion detection point.
In some possible embodiments, the heading angle determination module, when determining the heading angle of each vehicle based on the fused detection point cloud, is configured to:
performing, for each frame of the fused detection point cloud for each vehicle:
determining an initial course angle of the vehicle based on the fused detection point cloud;
screening out target fusion detection points according to the signal-to-noise ratio of each fusion detection point in the fusion detection point cloud;
determining a wheel position difference of the vehicle based on the fused detection point cloud and a fused detection point cloud of a previous frame;
and inputting the initial course angle, the target fusion detection point and the wheel position difference into a pre-trained motion model to obtain the course angle of the vehicle.
In a third aspect, another embodiment of the present application further provides an electronic device, including at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform any one of the methods provided by the embodiments of the first aspect of the present application.
In a fourth aspect, another embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and the computer program is configured to cause a computer to execute any one of the methods provided in the first aspect of the present application.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments of the present application will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is an application scenario diagram of a course angle determining method according to an embodiment of the present application;
fig. 2 is an overall flowchart of a course angle determining method according to an embodiment of the present disclosure;
fig. 3 is a schematic time stamp mark diagram of a course angle determining method according to an embodiment of the present application;
fig. 4 is a schematic diagram of a first detection point cluster determined to be associated in a course angle determination method according to an embodiment of the present application;
fig. 5 is a schematic diagram of a third timestamp of a heading angle determining method according to an embodiment of the present application;
fig. 6 is a schematic diagram of a first detection point cluster determined to be associated in a course angle determination method provided in the embodiment of the present application;
FIG. 7 is a schematic view of parameter determination of a course angle determining method according to an embodiment of the present application;
fig. 8A is a schematic area expansion diagram of a method for determining a heading angle according to an embodiment of the present application;
fig. 8B is a schematic diagram of an extended area of a method for determining a heading angle according to an embodiment of the present application;
fig. 9 is an information fusion diagram of a course angle determining method according to an embodiment of the present application;
FIG. 10 is a schematic view of a determined course angle of a course angle determining method according to an embodiment of the present application;
FIG. 11 is a schematic diagram illustrating a method for determining an initial course angle according to an embodiment of the present disclosure;
fig. 12 is an overall flowchart of a method for determining a heading angle according to an embodiment of the present disclosure;
FIG. 13 is a schematic view of a device for determining a course angle according to an embodiment of the present disclosure;
fig. 14 is a schematic view of an electronic device of a method for determining a heading angle according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
It is noted that the terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The inventors have found that the course angle of a moving vehicle is a very important parameter when estimating the vehicle's motion state at the next moment.
In the related art, the course angle of a moving vehicle is usually estimated with either a millimeter wave radar or a laser radar alone. When a millimeter wave radar is used, the radar first measures the position of the vehicle and its Doppler information, and a corresponding course angle estimation model is then established in combination with a selected tracking model; the measurement accuracy of this approach is limited by the millimeter wave radar's error in measuring the position of the moving vehicle, its Doppler measurement error, and the error of the course angle estimation model. When a laser radar is used, the point cloud obtained by detection is first segmented, the shape of the point cloud is then estimated, and the course angle of the moving vehicle is derived from that shape.
In view of the above, the present application provides a course angle determining method and apparatus, an electronic device and a storage medium to solve the above problems. The inventive concept of the present application can be summarized as follows: first, a target scene is continuously detected with a millimeter wave radar and a laser radar to obtain continuous multi-frame first detection point clusters and continuous multi-frame second detection point clouds, respectively; time synchronization processing and space synchronization processing are then performed on the first detection point cluster and the second detection point cloud; the coordinates of each first detection point in the synchronized first detection point cluster are expanded into a region according to the standard deviation to obtain an expansion region of the first detection point, and the second detection points in the expansion region are determined according to their coordinates, where the standard deviation is the standard deviation of a physical parameter associated with the first detection point; the second detection points in the expansion region are then associated with the first detection point, and the mutually associated first and second detection points are subjected to information fusion to obtain fusion detection points, which form a fusion detection point cloud; finally, the fusion detection point cloud formed by the fusion detection points is segmented to obtain a fusion detection point cloud corresponding to each vehicle in the target scene, and the course angle of each vehicle is determined based on its fusion detection point cloud.
In order to facilitate understanding of a method for determining a heading angle provided in an embodiment of the present application, the following detailed description is provided with reference to the accompanying drawings:
Fig. 1 is a view of an application scenario of the course angle determining method in an embodiment of the present application. The figure includes a laser radar 10, a millimeter wave radar 20, a server 30 and a memory 40. The millimeter wave radar 20 continuously detects the target scene to obtain continuous multi-frame first detection point clusters, and the laser radar 10 continuously detects the target scene to obtain continuous multi-frame second detection point clouds; the target scene comprises at least one vehicle. The laser radar and the millimeter wave radar report the second detection point clouds and first detection point clusters they acquire to the server. The server 30 performs time synchronization processing and space synchronization processing on the first detection point cluster and the second detection point cloud to obtain a first detection point cluster and a second detection point cloud that have an association relation in time and space. It then expands the coordinates of each first detection point in the synchronized first detection point cluster into a region according to the standard deviation to obtain an expansion region of the first detection point, and determines the second detection points in the expansion region according to their coordinates, where the standard deviation is the standard deviation of a physical parameter associated with the first detection point. It associates the second detection points in the expansion region with the first detection point and performs information fusion on the mutually associated first and second detection points to obtain fusion detection points, which form a fusion detection point cloud. Finally, it segments the fusion detection point cloud formed by the fusion detection points to obtain a fusion detection point cloud corresponding to each vehicle in the target scene, and determines the course angle of each vehicle based on its fusion detection point cloud.
Although only a single server and memory are described in detail, those skilled in the art will understand that the illustrated lidar 10, millimeter wave radar 20, server 30 and memory 40 merely represent the roles these components play in the technical solution of the application. The description of individual devices is for ease of illustration and does not imply any limitation on their number, type or location. It should be noted that the underlying concepts of the example embodiments of the present application are unchanged if additional modules are added to, or individual modules removed from, the environment shown in FIG. 1.
It should be noted that the memory in the embodiments of the present application may be, for example, a cache system, hard disk storage, memory storage, or the like. In addition, the course angle determining method provided by the application is applicable not only to the application scenario shown in FIG. 1 but also to any device that needs to determine a course angle.
Fig. 2 shows a schematic flow chart of the course angle determining method provided in an embodiment of the present application, in which:
in step 201: continuously detecting a target scene by adopting a millimeter wave radar to obtain a continuous multi-frame first detection point cluster, and continuously detecting the target scene by adopting a laser radar to obtain a continuous multi-frame second detection point cloud; wherein the target scene comprises at least one vehicle;
in step 202: performing time synchronization processing and space synchronization processing on the first detection point cluster and the second detection point cloud to obtain the first detection point cluster and the second detection point cloud which have association relation in time and space;
in step 203: performing area extension on the coordinate of each first detection point in the first detection point cluster after synchronous processing according to the standard deviation to obtain an extension area of the first detection point, and determining a second detection point in the extension area according to the coordinate of the second detection point; wherein the standard deviation is a standard deviation of a physical parameter associated with the first detection point;
in step 204: associating a second detection point in the expansion area with the first detection point, and performing information fusion on the first detection point and the second detection point which are associated with each other to obtain a fusion detection point; fusing the detection points to form a fused detection point cloud;
in step 205: and carrying out segmentation processing on the fusion detection point cloud formed by the fusion detection points to obtain a fusion detection point cloud corresponding to each vehicle in the target scene, and determining the course angle of each vehicle based on the fusion detection point cloud.
For ease of understanding, the steps shown in fig. 2 are described in detail below:
First, the information contained in the first detection point cluster and the second detection point cloud is described. The high-performance millimeter wave radar continuously detects the target scene, and in the resulting continuous multi-frame first detection point clusters, each frame of the first detection point cluster comprises the first detection points and, for each first detection point, its radial distance, Doppler velocity information, azimuth angle, pitch angle information, Radar Cross Section (RCS), trace point quality, first timestamp, signal-to-noise ratio and other information. The laser radar continuously detects the target scene, and in the resulting continuous multi-frame second detection point clouds, each frame of the second detection point cloud comprises the second detection points and a second timestamp.
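As a concrete illustration of these two data shapes, the following containers are one plausible layout; the field names and units are assumptions, not the patented format.

```python
from dataclasses import dataclass

# Illustrative containers for the two sensor outputs described above.

@dataclass
class FirstDetectionPoint:       # one millimeter wave radar detection
    radial_distance: float       # m
    doppler_velocity: float      # m/s
    azimuth: float               # deg
    pitch: float                 # deg
    rcs: float                   # radar cross section
    quality: float               # trace point quality
    timestamp: float             # first timestamp, s
    snr: float                   # signal-to-noise ratio, dB

@dataclass
class SecondDetectionPoint:      # one laser radar (lidar) return
    x: float                     # m
    y: float                     # m
    z: float                     # m
    timestamp: float             # second timestamp, s
```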
The course angle determining method provided in the embodiments of the present application is described in detail below with reference to the first detection point cluster and the second detection point cloud:
1. Time synchronization
In this application, in order to ensure that the finally obtained course angle of the moving vehicle is more accurate, time synchronization needs to be performed on the first detection point cluster and the second detection point cloud, which can be implemented as follows: for each second timestamp, a first calculation timestamp and a second calculation timestamp corresponding to the second timestamp are determined, and the associated first detection point cluster having an association relation with the second detection point cloud corresponding to the second timestamp is determined based on the first calculation timestamp, the second calculation timestamp and the second timestamp.
Wherein: the first timestamp is obtained by marking each frame of first detection point clusters obtained by the millimeter wave radar through continuous detection on the target scene according to a first preset frequency, and the second timestamp is obtained by marking each frame of second detection point clouds obtained by the laser radar through continuous detection on the target scene according to a second preset frequency; the first calculated timestamp is a first timestamp chronologically preceding and least spaced from the second timestamp, and the second calculated timestamp is a first timestamp chronologically following and least spaced from the second timestamp.
For example: the frequency of the millimeter wave radar is 20 hz, the frequency of the laser radar is 12 hz, the first timestamp and the second timestamp are shown in fig. 3, the first calculated timestamp of timestamp t1 is a0, and the second calculated timestamp is a 1.
Determining, based on the first calculation timestamp, the second calculation timestamp, and the second timestamp, an associated first detection point cluster having an association relationship with the second detection point cloud corresponding to the second timestamp, which may be specifically implemented as the steps shown in fig. 4:
in step 401: acquiring third timestamps, wherein the third timestamps are marked at the second preset frequency, starting a first time interval after it is determined that the laser radar has marked its first second timestamp; the first time interval is obtained from the first preset frequency;
for example: assuming that the laser radar frequency is 12 Hz and the millimeter wave radar frequency is 20 Hz, the third timestamps are marked at 12 Hz, and the first time interval is the reciprocal of the millimeter wave radar frequency, i.e. 50 ms; the third timestamps are therefore as shown in fig. 5.
In step 402: for each third timestamp, determining a target second timestamp that is least chronologically different from and before the third timestamp;
for example: continuing with the example of FIG. 5 and taking c0 as an example, its target second timestamp is b0.
In step 403: acquiring a first calculation timestamp and a second calculation timestamp corresponding to a target second timestamp;
taking fig. 5 as an example, the first calculation timestamp corresponding to the target second timestamp b0 is a0, and the second calculation timestamp is a1.
In step 404: determining a related first detection point cluster based on a first detection point cluster corresponding to the first calculation timestamp and a first detection point cluster corresponding to the second calculation timestamp;
in the embodiments of the present application, this can be implemented in either of the following two ways:
1) performing linear interpolation processing on the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster.
For example: denote the first detection point cluster corresponding to the first calculation timestamp as point cluster A and the first detection point cluster corresponding to the second calculation timestamp as point cluster B. As shown in fig. 6, the points of cluster A and cluster B are associated according to their coordinates, that is, the two points with the closest coordinates are taken as an associated pair: a1 is associated with a1′, a2 with a2′, and so on, up to an with an′. Linear interpolation is then performed on each associated pair to obtain interpolation points, and the associated first detection point cluster is obtained from these interpolation points.
2) performing mean value processing on the coordinates of the first detection points in the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster.
For example: continuing with fig. 6, point cluster A and point cluster B are first associated according to the coordinates of their points, that is, the two points with the closest coordinates are taken as an associated pair (a1 with a1′, a2 with a2′, and so on, up to an with an′). The coordinates of each pair of mutually associated points are then averaged to obtain the associated first detection point cluster.
In step 405: and taking the associated first detection point cluster as a first detection point cluster which has an association relation in time with a second detection point cloud corresponding to the target second timestamp.
2. Spatial synchronization
In the embodiment of the present application, in order to facilitate the subsequent course angle calculation, the first detection point cluster and the second detection point cloud need to be spatially synchronized, which may be implemented as follows: coordinate conversion is performed on the first detection points based on a rotation parameter and a translation parameter, and on the second detection points based on the same rotation parameter and translation parameter, to obtain the coordinates of the first detection points and the coordinates of the second detection points after spatial synchronization. The rotation parameter is used to rotate the coordinate axes of the coordinate system corresponding to the first detection points and of the coordinate system corresponding to the second detection points so that they coincide with the coordinate axes of the space synchronization coordinate system; the translation parameter determines the coordinate components of the origin of the space synchronization coordinate system in the coordinate system corresponding to the first detection points and in the coordinate system corresponding to the second detection points. In the present application, the translation parameter and the rotation parameter may be determined using the steps shown in fig. 7, in which:
in step 701: measuring a target object by using a laser radar to obtain at least one measuring point of the target object and a coordinate corresponding to the measuring point; determining the mean value of the coordinates of the measuring points;
in step 702: measuring a target object by adopting a millimeter wave radar to obtain a first coordinate of the target object;
in step 703: a rotation parameter and a translation parameter are determined based on the mean of the coordinates and the first coordinate.
It should be understood that, in a specific implementation, the execution order of step 701 and step 702 is not limited in the present application; that is, step 701 may be executed before step 702, step 702 may be executed before step 701, or the two steps may be executed at the same time.
For example: a small metal plate (the target object) is placed directly in front of the millimeter wave radar so that it meets the far-field measurement condition of the millimeter wave radar. The millimeter wave radar can then accurately measure the position of the metal plate, with a horizontal accuracy of 0.1 degree and a pitch accuracy of 0.2 degree, while the laser radar can accurately measure the spatial position of the static metal plate. Because the laser radar has a high azimuth measurement resolution, a plurality of measurement points are obtained on the metal plate; the mean value L of the coordinates of these points can then be combined with the millimeter wave radar measurement value Rm and formula 1 to determine the translation parameter and the rotation parameter.
L = R · Rm + T (formula 1)
Wherein: l is the average value of the laser radar measuring points, Rm is the coordinate value of the millimeter wave radar measuring points, R is the rotation parameter, and T is the translation parameter.
In summary, by moving the metal plate to several positions, a plurality of pairs of values L and Rm can be obtained, from which the rotation parameter and the translation parameter can be solved.
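For convenience of understanding, a sketch of one way to solve formula 1 is given below: with at least three non-collinear plate positions it becomes an over-determined rigid-transform problem with a closed-form SVD (Kabsch) solution. The solver choice is an assumption of this sketch rather than prescribed by this application; `lidar_means` and `radar_points` are N×3 arrays of corresponding L and Rm values.

```python
import numpy as np

def solve_rotation_translation(lidar_means: np.ndarray,
                               radar_points: np.ndarray):
    """Least-squares fit of L = R @ Rm + T over N >= 3 correspondences."""
    centroid_l = lidar_means.mean(axis=0)
    centroid_r = radar_points.mean(axis=0)
    # Cross-covariance of the centered point sets.
    h = (radar_points - centroid_r).T @ (lidar_means - centroid_l)
    u, _, vt = np.linalg.svd(h)
    r = vt.T @ u.T
    if np.linalg.det(r) < 0:  # guard against a reflection solution
        vt[-1, :] *= -1
        r = vt.T @ u.T
    t = centroid_l - r @ centroid_r
    return r, t

def to_sync_frame(points: np.ndarray, r: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply the calibration of formula 1 to map sensor coordinates
    into the spatial synchronization frame."""
    return points @ r.T + t
```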
3. Region expansion
In the present application, the region extension of each first detection point in the first detection point cluster may be implemented as the steps shown in fig. 8A, as follows:
in step 801: determining a radial distance standard deviation, an azimuth angle standard deviation and a pitch angle standard deviation of the first detection point according to the signal-to-noise ratio of the first detection point;
in step 802: determining a first difference value between the radial distance of the first detection point and the standard deviation of the radial distance, and determining a first sum value between the radial distance of the first detection point and the standard deviation of the radial distance;
in step 803: determining an expansion area of the radial distance according to the first difference value and the first sum value;
in step 804: determining a second difference value between the azimuth angle of the first detection point and the azimuth angle standard difference, and determining a second sum value between the azimuth angle of the first detection point and the azimuth angle standard difference;
in step 805: determining an extension area of the azimuth angle according to the second difference value and the second sum value;
in step 806: determining a third difference value between the pitch angle of the first detection point and the pitch angle standard deviation, and determining a third sum value between the pitch angle of the first detection point and the pitch angle standard deviation;
in step 807: determining an expansion area of the pitch angle according to the third difference value and the third sum value;
in step 808: and determining the expansion area of the first detection point according to the expansion area of the radial distance, the expansion area of the azimuth angle and the expansion area of the pitch angle.
It should be noted that the present application does not limit the order in which the expansion area of the radial distance, the expansion area of the azimuth angle and the expansion area of the pitch angle are determined; the skilled person can set the execution order as required, and fig. 8A shows only one embodiment.
For example: the coordinates of the first detection point A are (r, a, e), where r represents the radial distance, a represents the azimuth angle and e represents the pitch angle. The extension area of the radial distance is then (r − Δr, r + Δr), the extension area of the azimuth angle is (a − Δa, a + Δa), and the extension area of the pitch angle is (e − Δe, e + Δe), where Δr is the standard deviation of r, Δa is the standard deviation of a, and Δe is the standard deviation of e; the expanded area of point A is shown in fig. 8B.
When determining the standard deviations, a radar target simulator may be used for testing. For example, when measuring the radial distance standard deviation, the target is set at a fixed distance, the reflected power of the target is then adjusted, multiple measurements are taken at each adjustment, and the sample standard deviation is calculated. When calculating the azimuth standard deviation, the reflected power is set at different azimuths according to the antenna directional pattern, multiple measurements are likewise taken at each adjustment, and the sample standard deviation of the azimuth is calculated.
In summary, the expansion area of the first detection point can be determined according to the expansion area of the radial distance, the expansion area of the azimuth angle and the expansion area of the pitch angle.
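A minimal sketch of this expansion step is given below, assuming the SNR-to-standard-deviation mapping is supplied as a lookup function calibrated as described above; the helper name `stddev_from_snr` is illustrative:

```python
from typing import Callable, Tuple

Region = Tuple[float, float]

def expand_point(r: float, a: float, e: float, snr: float,
                 stddev_from_snr: Callable[[float], Tuple[float, float, float]]
                 ) -> Tuple[Region, Region, Region]:
    """Build the (r, a, e) expansion region of one first detection point.

    stddev_from_snr returns (dr, da, de): the radial-distance, azimuth and
    pitch standard deviations calibrated for this signal-to-noise ratio.
    """
    dr, da, de = stddev_from_snr(snr)
    return ((r - dr, r + dr),   # steps 802-803: radial distance
            (a - da, a + da),   # steps 804-805: azimuth angle
            (e - de, e + de))   # steps 806-807: pitch angle
```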
4. Association
In the application, when a second detection point falling in a corresponding area of a first detection point is associated with the first detection point, if a plurality of second detection points fall in an extension area of the first detection point, the second detection point with the largest signal-to-noise ratio in the extension area is determined; and taking the second detection point with the largest signal-to-noise ratio as the second detection point associated with the first detection point.
For example: if the coordinates of the first detection point are (4, 5, 10) and its extension area is [(4−0.2, 4+0.2), (5−0.3, 5+0.3), (10−0.1, 10+0.1)], the second detection points falling in the extension area are determined. If detection points A, B, C and D fall in the extension area, with a signal-to-noise ratio of 124 for point A, 106 for point B, 114 for point C and 103 for point D, then point A, which has the largest signal-to-noise ratio, is taken as the second detection point associated with the first detection point.
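A sketch of this max-SNR association is given below, assuming each second detection point carries (r, a, e) coordinates and a signal-to-noise ratio; the data layout is an assumption of the sketch:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Region = Tuple[float, float]

@dataclass
class SecondPoint:
    r: float      # radial distance
    a: float      # azimuth angle
    e: float      # pitch angle
    snr: float    # signal-to-noise ratio

def associate(region: Tuple[Region, Region, Region],
              candidates: List[SecondPoint]) -> Optional[SecondPoint]:
    """Return the in-region second detection point with the largest SNR."""
    (r_lo, r_hi), (a_lo, a_hi), (e_lo, e_hi) = region
    in_region = [p for p in candidates
                 if r_lo <= p.r <= r_hi
                 and a_lo <= p.a <= a_hi
                 and e_lo <= p.e <= e_hi]
    return max(in_region, key=lambda p: p.snr, default=None)
```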
5. Fusion
In the present application, when information fusion is performed on the first detection point and the second detection point which are associated with each other, the steps shown in fig. 9 are performed for each second detection point, where:
in step 901: taking the physical parameter information of the first detection point associated with the second detection point as the physical parameter information of the fusion detection point corresponding to the second detection point;
in step 902: taking the coordinates of the second detection point after the space synchronization as the coordinates of the fusion detection point corresponding to the second detection point;
in step 903: and forming a fused detection point based on the physical parameter information of the fused detection point and the coordinates of the fused detection point.
For example: the first detection point A and the second detection point B are associated detection points. The physical parameter information of the first detection point A, namely the Doppler velocity information, RCS, trace quality, first timestamp and signal-to-noise ratio, is used as the physical parameter information of the fusion detection point C, and the spatially synchronized coordinates of the second detection point B are used as the coordinates of the fusion detection point C.
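A sketch of the fusion step is given below; the field names follow the example above, but the data layout is otherwise an assumption:

```python
from dataclasses import dataclass

@dataclass
class FirstPoint:          # millimeter wave radar detection
    doppler: float         # Doppler velocity information
    rcs: float             # radar cross section
    trace_quality: float
    timestamp: float       # first timestamp
    snr: float

@dataclass
class FusedPoint:
    x: float               # lidar coordinates after spatial
    y: float               # synchronization (step 902)
    z: float
    doppler: float         # radar physical parameters (step 901)
    rcs: float
    trace_quality: float
    timestamp: float
    snr: float

def fuse(first: FirstPoint, second_xyz: tuple) -> FusedPoint:
    """Combine radar physics with lidar geometry into one point (step 903)."""
    x, y, z = second_xyz
    return FusedPoint(x, y, z, first.doppler, first.rcs,
                      first.trace_quality, first.timestamp, first.snr)
```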
6. Determining a heading angle
In the application, because more than one moving vehicle may appear in the target scene, the fusion detection point cloud needs to be segmented before the initial course angle of a vehicle is determined based on it, and a pre-trained segmentation model is adopted for this segmentation. The segmentation model in this application may be a Transformer model or a deep neural network (DNN) model. When training the segmentation model, a conventional network model training method may be adopted, for example: first obtain point clouds containing a plurality of vehicles together with the segmentation results of those point clouds; use the point clouds as the input of the DNN and the segmentation results as the expected output to train the DNN model, adjusting the parameters of the DNN model according to the difference between the output result and the segmentation result until training converges. The method for training the network model is not limited in this application, and technicians can choose a training method according to requirements.
In the present application, determining the heading angle of each vehicle based on the fused detection points may be implemented as the steps shown in fig. 10, wherein:
in step 1001: determining an initial course angle of the vehicle based on the fused detection point cloud;
in the application, in order to avoid the waste of computing power caused by all the fused detection point clouds entering the subsequent calculation, the fused detection point clouds need to be filtered before the initial course angle of the vehicle is determined, so that static point clouds are removed. This may specifically be implemented as: determine the absolute Doppler velocity of each fusion detection point in the fusion detection point cloud in order to decide whether the point cloud is moving. For example, if there is a fusion detection point whose absolute Doppler value is not equal to zero, the fusion detection point cloud is considered to be moving; if no fusion detection point with a non-zero absolute Doppler value exists in the fusion detection point cloud, the point cloud is considered static and is removed.
Besides determining the motion state of the fused detection point cloud from its absolute Doppler values, whether the point cloud moves can also be determined from the position information of the fused detection points in two consecutive frames, which can be implemented as follows: determine the position information of each fusion detection point in the current frame of the fused detection point cloud and in the previous frame; if a fusion detection point whose position has changed exists, the fused detection point cloud can be considered to be moving.
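A sketch of the Doppler-based filter is given below, reusing the `FusedPoint` type from the fusion sketch above; the small tolerance `eps`, standing in for "not equal to zero" under measurement noise, is an assumption:

```python
from typing import List

def is_moving(cloud: List[FusedPoint], eps: float = 1e-3) -> bool:
    """A cloud is moving if any point has a non-zero absolute Doppler value."""
    return any(abs(p.doppler) > eps for p in cloud)

def drop_static_clouds(clouds: List[List[FusedPoint]]) -> List[List[FusedPoint]]:
    """Keep only the moving clouds before heading-angle estimation."""
    return [c for c in clouds if is_moving(c)]
```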
When determining the initial course angle for the filtered fused detection point clouds, shape estimation may be performed on each segmented fused detection point cloud, for example: take a plane at the same height, and then use a Searching Rectangular Boundary Filter (SRBF) method. The course angle azimuth value lies between −180 degrees and 180 degrees; a coarse search may be used first and then a fine search, with a coarse search angle interval of 2 degrees and a fine search interval of 0.2 degree. The coarse search and the fine search work in the same way, so the coarse search is taken as an example:
as shown in fig. 11, a candidate heading angle azimuth value of 0 degree is selected; the X axis is the direction this angle points to, and the Y axis is the direction perpendicular to the X axis, satisfying the right-hand rule. The fused detection point cloud is then projected into this coordinate system, and the extreme values of the abscissa and the ordinate are calculated to fit a bounding rectangle. For each fused detection point in the fused detection point cloud, the distance from the point to each boundary of the fitted rectangle is calculated, each distance being a positive value. The smallest of these is selected, and the minimum distances of all points are then summed, giving the accumulated minimum distance at that angle.
For example: taking point A in fig. 11 as an example, the distances from point A to the four boundaries are l1, l2, l3 and l4; since l1 is the smallest, l1 is selected as the minimum distance of point A.
The above operations are performed for all candidate angles, and the angle that minimizes the accumulated minimum distance is then selected as the initial course angle of the vehicle. The fine search follows the same process as above and is not described again.
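A sketch of the coarse search is given below, assuming the cloud has already been cut at a common height and projected as an N×2 array; the rectangle is fitted as the axis-aligned bounding box in the rotated frame, per the description above:

```python
import numpy as np

def accumulated_min_distance(points_xy: np.ndarray, angle_deg: float) -> float:
    """Rotate the cloud into the frame whose X axis points along angle_deg,
    fit the bounding rectangle, and sum each point's distance to its
    nearest rectangle edge."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), np.sin(theta)],
                    [-np.sin(theta), np.cos(theta)]])
    p = points_xy @ rot.T
    x_min, y_min = p.min(axis=0)
    x_max, y_max = p.max(axis=0)
    # Distances to the four boundaries; all non-negative inside the box.
    d = np.stack([p[:, 0] - x_min, x_max - p[:, 0],
                  p[:, 1] - y_min, y_max - p[:, 1]], axis=1)
    return float(d.min(axis=1).sum())

def coarse_heading(points_xy: np.ndarray, step_deg: float = 2.0) -> float:
    """Coarse SRBF search over [-180, 180) degrees in 2-degree steps;
    the fine search repeats this around the winner with a 0.2-degree step."""
    angles = np.arange(-180.0, 180.0, step_deg)
    costs = [accumulated_min_distance(points_xy, a) for a in angles]
    return float(angles[int(np.argmin(costs))])
```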
In a specific implementation, the length or the width of the fitted rectangle may be small. When the length or width is smaller than a certain threshold value, it can be considered that only one edge has been detected; in that case the course angle of the previous frame of point cloud can be used as the course angle of the current frame of fused detection point cloud.
In step 1002: screening out target fusion detection points according to the signal-to-noise ratio of each fusion detection point in the fusion detection point cloud;
in the present application, in order to ensure the accuracy of the determined course angle, the target fusion detection points are selected based on the signal-to-noise ratio, which may be specifically implemented as: select the four fusion detection points with the largest signal-to-noise ratio as the target fusion detection points; or set a signal-to-noise ratio threshold value and take the fusion detection points whose signal-to-noise ratio is larger than the threshold value as the target fusion detection points. This is not limited in this application.
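Both screening rules in one sketch, again reusing the `FusedPoint` type from above; the function names are illustrative:

```python
from typing import List

def top_k_by_snr(cloud: List[FusedPoint], k: int = 4) -> List[FusedPoint]:
    """Rule 1: keep the k fusion detection points with the largest SNR."""
    return sorted(cloud, key=lambda p: p.snr, reverse=True)[:k]

def above_snr_threshold(cloud: List[FusedPoint],
                        threshold: float) -> List[FusedPoint]:
    """Rule 2: keep every point whose SNR exceeds the threshold."""
    return [p for p in cloud if p.snr > threshold]
```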
In step 1003: determining a wheel position difference of the vehicle based on the fused detection point cloud and the fused detection point cloud of the previous frame;
when the wheel position difference of the vehicle is determined, cluster analysis may be performed on the fused detection point cloud: the outline of the vehicle is obtained from the fused detection point cloud, and the measured wheel positions are obtained from that outline. The same processing is then applied to the previous frame of the fused detection point cloud to obtain its wheel positions, from which the wheel position difference of the vehicle is obtained.
The wheel positions in the fused detection point cloud can also be determined with a DNN model and combined with the wheel positions in the previous frame of fused detection point cloud to obtain the position difference between the two frames. Alternatively, the wheel positions can be determined with reference to the typical position offset between the wheels and the vehicle body, or by analyzing the distribution of micro-Doppler signals at different positions; the wheel position difference is then obtained in the same way.
In the present application, in order to make the determined heading angle more accurate, the wheel position difference is taken as the position difference of the driving wheels, that is: if the vehicle is a rear-drive vehicle, the wheel position difference is the position difference of the rear wheels; if the vehicle is a front-drive vehicle, it is the position difference of the front wheels; and if the vehicle is a four-wheel-drive vehicle, it is the position difference of all four wheels.
In step 1004: and inputting the initial course angle, the target fusion detection point and the wheel position difference into a pre-trained motion model to obtain the course angle of the vehicle.
The following description takes as an example the input of four target fusion detection points and the rear-wheel position difference of a rear-drive vehicle:
the motion model can be a constant turn rate and velocity model (CTRV), and the filtering method can be an Extended Kalman Filter (EKF). The output of the motion model is [V, YawRate, Orientation], where V represents the vehicle body speed, YawRate represents the vehicle body yaw rate, and Orientation represents the vehicle heading angle. The inputs are [Orientation_Mea, VDoppler1, VDoppler2, VDoppler3, VDoppler4, ΔP], where Orientation_Mea is the initial heading angle, VDoppler1 to VDoppler4 are the Doppler information of the four target fusion detection points, and ΔP is the wheel position difference. When the extended Kalman filtering method is used for the calculation, the measurement variance and the system variance are needed. The measurement variance of Orientation_Mea can be obtained by accumulating, according to the intensity of the fusion detection points in the fused detection point cloud, the sample variance obtained when shape estimation is performed on the point cloud; the variance of the VDoppler measurements is determined according to the signal-to-noise ratio of the fusion detection points in the fused detection point cloud. The system variance can also be obtained by combining the deviation between the motion model output at the previous moment and the actual course angle of the vehicle with the corresponding deviation at the moment before that, for example: the difference between the deviations at the two preceding moments can be calculated, and the system variance determined according to the relationship between this difference and a threshold value.
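A minimal sketch of the CTRV prediction step used inside such an EKF is given below; the state layout and discretization are the standard CTRV form rather than taken from this application, and the measurement update that consumes the Doppler and wheel-difference inputs is omitted:

```python
import numpy as np

def ctrv_predict(state: np.ndarray, dt: float) -> np.ndarray:
    """CTRV motion model, state = [x, y, v, orientation, yaw_rate].

    Assumes constant speed and constant yaw rate over one interval dt.
    """
    x, y, v, yaw, yaw_rate = state
    if abs(yaw_rate) > 1e-6:
        # Arc motion while the vehicle is turning.
        x += (v / yaw_rate) * (np.sin(yaw + yaw_rate * dt) - np.sin(yaw))
        y += (v / yaw_rate) * (np.cos(yaw) - np.cos(yaw + yaw_rate * dt))
    else:
        # Straight-line limit as the yaw rate approaches zero.
        x += v * np.cos(yaw) * dt
        y += v * np.sin(yaw) * dt
    yaw += yaw_rate * dt
    return np.array([x, y, v, yaw, yaw_rate])
```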
For convenience of understanding, the following describes an overall flow of a heading angle estimation method provided in an embodiment of the present application in detail, as shown in fig. 12, where:
in step 1201: continuously detecting a target scene by adopting a millimeter wave radar to obtain a continuous multi-frame first detection point cluster, and continuously detecting the target scene by adopting a laser radar to obtain a continuous multi-frame second detection point cloud;
in step 1202: performing time synchronization processing and space synchronization processing on the first detection point cluster and the second detection point cloud to obtain the first detection point cluster and the second detection point cloud which have association relation in time and space;
in step 1203: performing area extension on the coordinate of each first detection point in the first detection point cluster after synchronous processing according to the standard deviation to obtain an extension area of the first detection point, and determining a second detection point in the extension area according to the coordinate of the second detection point;
in step 1204: associating a second detection point in the extension area with the first detection point, and performing information fusion on the first detection point and the second detection point which are associated with each other to obtain a fusion detection point; fusing the detection points to form a fused detection point cloud;
in step 1205: performing segmentation processing on the fusion detection point cloud formed by the fusion detection points to obtain a fusion detection point cloud corresponding to each vehicle in the target scene;
in step 1206: determining an initial course angle of the vehicle based on the fused detection point cloud;
in step 1207: screening out target fusion detection points according to the signal-to-noise ratio of each fusion detection point in the fusion detection point cloud;
in step 1208: determining a wheel position difference of the vehicle based on the fused detection point cloud and the fused detection point cloud of the previous frame;
in step 1209: and inputting the initial course angle, the target fusion detection point and the wheel position difference into a pre-trained motion model to obtain the course angle of the vehicle.
As shown in fig. 13, based on the same inventive concept, there is provided a heading angle determining apparatus 1300 including:
the detection module 13001 is configured to perform continuous detection on a target scene by using a millimeter wave radar to obtain a continuous multi-frame first detection point cluster, and perform continuous detection on the target scene by using a laser radar to obtain a continuous multi-frame second detection point cloud; wherein the target scene comprises at least one vehicle;
a synchronization module 13002, configured to perform time synchronization processing and space synchronization processing on the first detection point cluster and the second detection point cloud to obtain the first detection point cluster and the second detection point cloud which have an association relationship in time and space;
an extension module 13003, configured to perform area extension on coordinates of each first detection point in the first detection point cluster after the synchronization processing according to a standard deviation to obtain an extension area of the first detection point, and determine a second detection point falling in the extension area according to coordinates of the second detection point; wherein the standard deviation is a standard deviation of a physical parameter associated with the first detection point;
a fusion module 13004, configured to associate a second detection point that falls in the extension area with the first detection point, and perform information fusion on the associated first detection point and second detection point to obtain a fusion detection point; the fusion detection points form a fusion detection point cloud;
and a course angle determining module 13005, configured to perform segmentation processing on the fusion detection point cloud formed by the fusion detection points to obtain a fusion detection point cloud corresponding to each vehicle in the target scene, and determine a course angle of each vehicle based on the fusion detection point cloud.
In some possible embodiments, the synchronization module 13002, when performing time synchronization processing on the first detection point cluster and the second detection point cluster, is configured to:
for each second timestamp, determining a first computation timestamp and a second computation timestamp corresponding to the second timestamp; the first timestamp is obtained by marking each frame of first detection point clusters obtained by the millimeter wave radar through continuous detection on the target scene according to a first preset frequency, and the second timestamp is obtained by marking each frame of second detection point clusters obtained by the laser radar through continuous detection on the target scene according to a second preset frequency; wherein the first calculated timestamp is a first timestamp chronologically preceding and least spaced from the second timestamp, and the second calculated timestamp is a first timestamp chronologically following and least spaced from the second timestamp;
and determining an associated first detection point cluster having an association relation with a second detection point cloud corresponding to the second time stamp based on the first calculation time stamp, the second calculation time stamp and the second time stamp.
In some possible embodiments, the synchronization module 13002, when performing determining, based on the first calculated timestamp, the second calculated timestamp, and the second timestamp, that the associated first detection point cluster having an association relationship with the second detection point cloud corresponding to the second timestamp, is configured to:
acquiring a third timestamp, wherein the third timestamp is obtained by marking with a second preset frequency at a first time interval after the first second timestamp of the laser radar is determined to be marked; wherein the first time interval is obtained according to a second preset frequency;
for each third timestamp, determining a target second timestamp that is least chronologically different from and before the third timestamp;
acquiring a first calculation timestamp and a second calculation timestamp corresponding to the target second timestamp;
determining a related first detection point cluster based on a first detection point cluster corresponding to the first computation timestamp and a first detection point cluster corresponding to the second computation timestamp;
and taking the related first detection point cluster as a first detection point cluster which has a correlation relation in time with a second detection point cloud corresponding to the target second timestamp.
In some possible embodiments, the synchronization module 13002, when performing determining the associated first detection point cluster based on the first detection point cluster corresponding to the first computation timestamp and the first detection point cluster corresponding to the second computation timestamp, is configured to:
performing linear interpolation processing on the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster; or
performing mean value processing on the coordinates of first detection points in the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster.
In some possible embodiments, the synchronization module 13002, when performing the spatial synchronization process on the first detection point cluster and the second detection point cluster, is configured to:
performing coordinate conversion on a first detection point based on a rotation parameter and a translation parameter, and performing coordinate conversion on a second detection point based on the rotation parameter and the translation parameter, to obtain the coordinates of the first detection point and the coordinates of the second detection point after spatial synchronization; the rotation parameter is used for rotating the coordinate axes of the coordinate system corresponding to the first detection point and the coordinate system corresponding to the second detection point so that they coincide with the coordinate axes of the spatial synchronization coordinate system; the translation parameter is used for determining the coordinate components of the origin of the spatial synchronization coordinate system in the coordinate system corresponding to the first detection point and the coordinate system corresponding to the second detection point.
In some possible embodiments, the rotation parameter and the translation parameter are obtained according to the following method:
measuring a target object by using a laser radar to obtain at least one measuring point of the target object and a coordinate corresponding to the measuring point; determining the mean value of the coordinates of the measuring points;
measuring the target object by adopting a millimeter wave radar to obtain a first coordinate of the target object;
determining a rotation parameter and a translation parameter based on the mean of the coordinates and the first coordinate.
In some possible embodiments, the coordinates of the first detection point include a radial distance, an azimuth angle, and a pitch angle;
when the extension module 13003 performs area extension on the coordinates of each first detection point in the synchronized first detection point cluster according to the standard deviation, the extension module is configured to:
performing the following process for each first detection point in the first detection point cluster:
determining the radial distance standard deviation, the azimuth angle standard deviation and the pitch angle standard deviation of the first detection point according to the signal-to-noise ratio of the first detection point;
determining a first difference value between the radial distance of the first detection point and the radial distance standard deviation, and determining a first sum value between the radial distance of the first detection point and the radial distance standard deviation;
determining an expansion area of the radial distance according to the first difference value and the first sum value;
determining a second difference between the azimuth of the first detection point and the azimuth standard deviation, and determining a second sum between the azimuth of the first detection point and the azimuth standard deviation;
determining an extension area of the azimuth according to the second difference value and the second sum value;
determining a third difference value between the pitch angle of the first detection point and the pitch angle standard deviation, and determining a third sum value between the pitch angle of the first detection point and the pitch angle standard deviation;
determining an expansion area of the pitch angle according to the third difference value and the third sum value;
and determining the expansion area of the first detection point according to the expansion area of the radial distance, the expansion area of the azimuth angle and the expansion area of the pitch angle.
In some possible embodiments, the fusion module 13004, when performing associating a second detection point falling in the corresponding region of the first detection point with the first detection point, is configured to:
if the plurality of second detection points fall in the extension area of the first detection point, determining the second detection point with the maximum signal-to-noise ratio in the extension area;
and taking the second detection point with the maximum signal-to-noise ratio as the second detection point associated with the first detection point.
In some possible embodiments, the first detection point cluster includes a plurality of first detection points, and the second detection point cloud includes a plurality of second detection points; the fusion module 13004, when performing information fusion on the first detection point and the second detection point which are associated with each other to obtain a fusion detection point, is configured to:
the following procedure is performed for each second detection point:
taking the physical parameter information of the first detection point associated with the second detection point as the physical parameter information of the fusion detection point corresponding to the second detection point;
taking the coordinates of the second detection point after the space synchronization as the coordinates of the fusion detection point corresponding to the second detection point;
and forming the fused detection point based on the physical parameter information of the fused detection point and the coordinates of the fused detection point.
In some possible embodiments, the heading angle determination module 13005, when performing determining the heading angle for each vehicle based on the fused detection point cloud, is configured to:
performing, for each frame of the fused detection point clouds for each vehicle:
determining an initial course angle of the vehicle based on the fused detection point cloud;
screening out target fusion detection points according to the signal-to-noise ratio of each fusion detection point in the fusion detection point cloud;
determining a wheel position difference of the vehicle based on the fused detection point cloud and a fused detection point cloud of a previous frame;
and inputting the initial course angle, the target fusion detection point and the wheel position difference into a pre-trained motion model to obtain the course angle of the vehicle.
Having described the heading angle determination method and apparatus of the exemplary embodiments of the present application, an electronic device according to another exemplary embodiment of the present application is next described.
As will be appreciated by one skilled in the art, aspects of the present application may be embodied as a system, method or program product. Accordingly, various aspects of the present application may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module" or "system."
In some possible implementations, an electronic device according to the present application may include at least one processor, and at least one memory. Wherein the memory stores program code which, when executed by the processor, causes the processor to perform the steps in the course angle determination method according to various exemplary embodiments of the present application described above in the present specification.
The electronic device 130 according to this embodiment of the present application is described below with reference to fig. 14. The electronic device 130 shown in fig. 14 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 14, the electronic device 130 is represented in the form of a general electronic device. The components of the electronic device 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 that connects the various system components (including the memory 132 and the processor 131).
Bus 133 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The memory 132 may include readable media in the form of volatile memory, such as Random Access Memory (RAM)1321 and/or cache memory 1322, and may further include Read Only Memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which or some combination thereof may comprise an implementation of a network environment.
The electronic device 130 may also communicate with one or more external devices 134 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 130, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 130 to communicate with one or more other electronic devices. Such communication may occur via input/output (I/O) interfaces 135. Also, the electronic device 130 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 136. As shown, network adapter 136 communicates with other modules for electronic device 130 over bus 133. It should be understood that although not shown in FIG. 14, other hardware and/or software modules may be used in conjunction with electronic device 130, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In some possible embodiments, aspects of a heading angle determination method provided herein may also be embodied in a form of a program product including program code for causing a computer device to perform the steps of a heading angle determination method according to various exemplary embodiments of the present application described above in this specification when the program product is run on the computer device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for course angle determination of embodiments of the present application may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be executable on an electronic device. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the consumer electronic device, partly on the consumer electronic device, as a stand-alone software package, partly on the consumer electronic device and partly on a remote electronic device, or entirely on the remote electronic device or server. In the case of remote electronic devices, the remote electronic devices may be connected to the consumer electronic device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external electronic device (for example, through the internet using an internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, the features and functions of two or more units described above may be embodied in one unit, according to embodiments of the application. Conversely, the features and functions of one unit described above may be further divided into embodiments by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (13)

1. A method of determining a heading angle, the method comprising:
continuously detecting a target scene by adopting a millimeter wave radar to obtain a continuous multi-frame first detection point cluster, and continuously detecting the target scene by adopting a laser radar to obtain a continuous multi-frame second detection point cloud; wherein the target scene comprises at least one vehicle;
performing time synchronization processing and space synchronization processing on the first detection point cluster and the second detection point cloud to obtain a first detection point cluster and a second detection point cloud which have association relation in time and space;
performing area extension on the coordinate of each first detection point in the first detection point cluster after synchronous processing according to the standard deviation to obtain an extension area of the first detection point, and determining a second detection point in the extension area according to the coordinate of the second detection point; wherein the standard deviation is a standard deviation of a physical parameter associated with the first detection point;
associating a second detection point in the extension area with the first detection point, and performing information fusion on the first detection point and the second detection point which are associated with each other to obtain a fusion detection point; the fusion detection points form fusion detection point cloud;
and carrying out segmentation processing on the fusion detection point cloud formed by the fusion detection points to obtain a fusion detection point cloud corresponding to each vehicle in the target scene, and determining the course angle of each vehicle based on the fusion detection point cloud.
2. The method of claim 1, wherein the time-synchronizing the first detection point cluster and the second detection point cluster comprises:
for each second timestamp, determining a first computation timestamp and a second computation timestamp corresponding to the second timestamp; the first timestamp is obtained by marking each frame of first detection point clusters obtained by the millimeter wave radar through continuous detection on the target scene according to a first preset frequency, and the second timestamp is obtained by marking each frame of second detection point clusters obtained by the laser radar through continuous detection on the target scene according to a second preset frequency; the first calculated timestamp is a first timestamp chronologically preceding and least spaced from the second timestamp, and the second calculated timestamp is a first timestamp chronologically following and least spaced from the second timestamp;
and determining an associated first detection point cluster having an association relation with a second detection point cloud corresponding to the second time stamp based on the first calculation time stamp, the second calculation time stamp and the second time stamp.
3. The method of claim 2, wherein determining an associated first cluster of detection points having an association relationship with a second cloud of detection points corresponding to the second timestamp based on the first, second, and second calculated timestamps comprises:
acquiring a third timestamp, wherein the third timestamp is obtained by marking with a second preset frequency at a first time interval after the first second timestamp of the laser radar is determined to be marked; wherein the first time interval is obtained according to a first preset frequency;
for each third timestamp, determining a target second timestamp that is least chronologically different from the third timestamp and precedes the third timestamp;
acquiring a first calculation timestamp and a second calculation timestamp corresponding to the target second timestamp;
determining a related first detection point cluster based on a first detection point cluster corresponding to the first computation timestamp and a first detection point cluster corresponding to the second computation timestamp;
and taking the related first detection point cluster as a first detection point cluster which has a correlation relation in time with a second detection point cloud corresponding to the target second timestamp.
4. The method of claim 3, wherein determining an associated first cluster of detection points based on the first cluster of detection points corresponding to the first computation timestamp and the first cluster of detection points corresponding to the second computation timestamp comprises:
performing linear interpolation processing on a first detection point cluster corresponding to the first calculation timestamp and a first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster; or
performing mean value processing on the coordinates of first detection points in the first detection point cluster corresponding to the first calculation timestamp and the first detection point cluster corresponding to the second calculation timestamp to obtain the associated first detection point cluster.
5. The method of claim 1, wherein spatially synchronizing the first and second clusters of detection points comprises:
performing coordinate transformation on a first detection point based on a rotation parameter and a translation parameter, and performing coordinate transformation on a second detection point based on the rotation parameter and the translation parameter, to obtain the coordinates of the first detection point and the coordinates of the second detection point after spatial synchronization; the rotation parameter is used for rotating the coordinate axes of the coordinate system corresponding to the first detection point and the coordinate system corresponding to the second detection point so that they coincide with the coordinate axes of the spatial synchronization coordinate system; the translation parameter is used for determining the coordinate components of the origin of the spatial synchronization coordinate system in the coordinate system corresponding to the first detection point and the coordinate system corresponding to the second detection point.
6. The method according to claim 5, wherein the rotation parameter and the translation parameter are obtained according to the following method:
measuring a target object by using a laser radar to obtain at least one measuring point of the target object and a coordinate corresponding to the measuring point; determining the mean value of the coordinates of the measuring points;
measuring the target object by adopting a millimeter wave radar to obtain a first coordinate of the target object;
determining a rotation parameter and a translation parameter based on the mean of the coordinates and the first coordinate.
7. The method of claim 1, wherein the coordinates of the first detection point include a radial distance, an azimuth angle, and a pitch angle;
the area expansion of the coordinates of each first detection point in the first detection point cluster after the synchronous processing according to the standard deviation comprises:
performing the following process for each first detection point in the first detection point cluster:
determining the radial distance standard deviation, the azimuth angle standard deviation and the pitch angle standard deviation of the first detection point according to the signal-to-noise ratio of the first detection point;
determining a first difference between a radial distance of the first detection point and the radial distance standard deviation, and determining a first sum between the radial distance of the first detection point and the radial distance standard deviation;
determining an expansion area of the radial distance according to the first difference value and the first sum value;
determining a second difference between the azimuth of the first detection point and the azimuth standard deviation, and determining a second sum between the azimuth of the first detection point and the azimuth standard deviation;
determining an extension area of the azimuth according to the second difference value and the second sum value;
determining a third difference value between the pitch angle of the first detection point and the pitch angle standard deviation, and determining a third sum value between the pitch angle of the first detection point and the pitch angle standard deviation;
determining an expansion area of the pitch angle according to the third difference value and the third sum value;
and determining the expansion area of the first detection point according to the expansion area of the radial distance, the expansion area of the azimuth angle and the expansion area of the pitch angle.
8. The method of claim 1, wherein associating a second detection point that falls within a corresponding area of the first detection point with the first detection point comprises:
if the plurality of second detection points fall in the extension area of the first detection point, determining the second detection point with the maximum signal-to-noise ratio in the extension area;
and taking the second detection point with the maximum signal-to-noise ratio as the second detection point associated with the first detection point.
9. The method of claim 1, wherein the first detection point cluster comprises a plurality of first detection points and the second detection point cloud comprises a plurality of second detection points; the information fusion of the first detection point and the second detection point which are mutually associated to obtain a fusion detection point comprises:
the following procedure is performed for each second detection point:
taking the physical parameter information of the first detection point associated with the second detection point as the physical parameter information of the fusion detection point corresponding to the second detection point;
taking the coordinates of the second detection point after the space synchronization as the coordinates of the fusion detection point corresponding to the second detection point;
and forming the fusion detection point based on the physical parameter information of the fusion detection point and the coordinates of the fusion detection point.
10. The method of claim 1, wherein determining a heading angle for the each vehicle based on the fused detection point cloud comprises:
performing, for each frame of the fused detection point clouds for each vehicle:
determining an initial course angle of the vehicle based on the fused detection point cloud;
screening out target fusion detection points according to the signal-to-noise ratio of each fusion detection point in the fusion detection point cloud;
determining a wheel position difference of the vehicle based on the fused detection point cloud and a fused detection point cloud of a previous frame;
and inputting the initial course angle, the target fusion detection point and the wheel position difference into a pre-trained motion model to obtain the course angle of the vehicle.
11. A heading angle determining apparatus, the apparatus comprising:
the detection module is used for continuously detecting a target scene by adopting a millimeter wave radar to obtain a continuous multi-frame first detection point cluster, and continuously detecting the target scene by adopting a laser radar to obtain a continuous multi-frame second detection point cloud; wherein the target scene comprises at least one vehicle;
the synchronization module is used for carrying out time synchronization processing and space synchronization processing on the first detection point cluster and the second detection point cloud to obtain the first detection point cluster and the second detection point cloud which have association relation in time and space;
the extension module is used for carrying out area extension on the coordinate of each first detection point in the first detection point cluster after synchronous processing according to the standard deviation to obtain an extension area of the first detection point, and determining a second detection point in the extension area according to the coordinate of the second detection point; wherein the standard deviation is a standard deviation of a physical parameter associated with the first detection point;
the fusion module is used for associating the second detection point in the expansion area with the first detection point and carrying out information fusion on the first detection point and the second detection point which are associated with each other to obtain a fusion detection point; the fusion detection points form fusion detection point cloud;
and the course angle determining module is used for carrying out segmentation processing on the fusion detection point cloud formed by the fusion detection points to obtain the fusion detection point cloud corresponding to each vehicle in the target scene, and determining the course angle of each vehicle based on the fusion detection point cloud.
12. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to implement the method of any one of claims 1 to 10.
13. A computer storage medium, characterized in that it stores a computer program that causes a computer to perform the method according to any one of claims 1 to 10.
CN202210185678.6A 2022-02-28 2022-02-28 Course angle determining method and device, electronic equipment and storage medium Pending CN114594467A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210185678.6A CN114594467A (en) 2022-02-28 2022-02-28 Course angle determining method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210185678.6A CN114594467A (en) 2022-02-28 2022-02-28 Course angle determining method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114594467A true CN114594467A (en) 2022-06-07

Family

ID=81807555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210185678.6A Pending CN114594467A (en) 2022-02-28 2022-02-28 Course angle determining method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114594467A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115236645A (en) * 2022-09-23 2022-10-25 北京小马易行科技有限公司 Laser radar attitude determination method and attitude determination device
CN115236645B (en) * 2022-09-23 2023-01-24 北京小马易行科技有限公司 Laser radar attitude determination method and attitude determination device

Similar Documents

Publication Publication Date Title
CN108664841B (en) Dynamic and static target object identification method and device based on laser point cloud
KR101628154B1 (en) Multiple target tracking method using received signal strengths
EP3693759B1 (en) System and method for tracking motion of target in indoor environment
CN103605126B (en) Radio frequency identification speed measurement method and device
CN108226860B (en) RSS (received signal strength) -based ultra-wideband mixed dimension positioning method and positioning system
EP3910533B1 (en) Method, apparatus, electronic device, and storage medium for monitoring an image acquisition device
CN103047982B (en) Adaptive target tracking method based on angle information
US20230236280A1 (en) Method and system for positioning indoor autonomous mobile robot
CN111563450A (en) Data processing method, device, equipment and storage medium
CN113075648B (en) Clustering and filtering method for unmanned cluster target positioning information
CN111856507A (en) Environment sensing implementation method, intelligent mobile device and storage medium
CN114594467A (en) Course angle determining method and device, electronic equipment and storage medium
CN115728803A (en) System and method for continuously positioning urban driving vehicle
CN114137562B (en) Multi-target tracking method based on improved global nearest neighbor
CN116415202A (en) Multi-source data fusion method, system, electronic equipment and storage medium
Daniş et al. An indoor localization dataset and data collection framework with high precision position annotation
CN111208542A (en) Motion trail control system, device and method for automatic mobile traffic facility
CN117724059A (en) Multi-source sensor fusion track correction method based on Kalman filtering algorithm
CN111753901B (en) Data fusion method, device, system and computer equipment
CN115151836A (en) Method for detecting a moving object in the surroundings of a vehicle and motor vehicle
CN112965076A (en) Multi-radar positioning system and method for robot
CN115327529A (en) 3D target detection and tracking method fusing millimeter wave radar and laser radar
CN113219452B (en) Distributed multi-radar joint registration and multi-target tracking method under unknown vision field
CN111077517A (en) Vehicle detection tracking method and device
US20230025579A1 (en) High-definition mapping

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination