CN117783572A - Vehicle speed estimation method, system and platform suitable for multiple scenes - Google Patents
- Publication number
- CN117783572A CN117783572A CN202311755974.6A CN202311755974A CN117783572A CN 117783572 A CN117783572 A CN 117783572A CN 202311755974 A CN202311755974 A CN 202311755974A CN 117783572 A CN117783572 A CN 117783572A
- Authority
- CN
- China
- Prior art keywords
- vehicle
- data
- speed
- detection point
- generating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Traffic Control Systems (AREA)
Abstract
The invention discloses a vehicle speed estimation method, system and platform applicable to multiple scenes. The method acquires first data of the detection points corresponding to the own vehicle in a scene and generates the own-vehicle longitudinal speed corresponding to the first data, where the first data include detection point distance data, detection point speed data and detection point angle data; it then generates the stationary-point speed distribution corresponding to the own-vehicle body from the longitudinal speed combined with second data corresponding to the vehicle body, where the second data include vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data; finally, it generates the vehicle speed data corresponding to the vehicle body in real time from the stationary-point speed distribution. Together with the corresponding system and platform, this achieves the effect of acquiring an accurate vehicle speed in real time, i.e. a more accurate vehicle speed is obtained even under high vehicle acceleration (starting, sudden braking and the like).
Description
Technical Field
The invention belongs to the technical field of vehicle speed estimation processing, and particularly relates to a vehicle speed estimation method, a vehicle speed estimation system and a vehicle speed estimation platform suitable for multiple scenes.
Background
The millimeter wave radar is one of the key sensors in ADAS; its basic functions include distance, speed and angle measurement. Because of its speed measurement principle, the speed it measures is a relative speed.
The existing own-vehicle speed information is obtained from the vehicle body information and therefore carries a certain time delay. During normal driving the influence of this delay is small, but in individual scenes such as rapid acceleration or deceleration the obtained vehicle speed differs greatly from the actual speed, which can lead to errors in judging the motion state of targets and thus affect subsequent fusion or functions.
In addition, patent CN111308458A describes a vehicle speed estimation method based on a vehicle-mounted millimeter wave radar: the distance, speed and angle of all targets are obtained and the relative speed between each target and the normal direction of the vehicle is calculated; the speeds of all static targets are counted and confidence matching is performed to obtain the three target clusters with the largest numbers of points; the vehicle speed is then estimated from the speed estimation confidence within the clusters. That scheme bases its speed statistics on static targets and reduces speed estimation errors by smoothing and filtering, but it does not consider the body posture of the target vehicle, so the calculated relative speed in the normal direction is inaccurate, and in scenes with few static targets, such as congested slow traffic or open areas, effective static target clusters cannot be obtained.
Therefore, in order to overcome the above drawbacks, there is an urgent need to design and develop a vehicle speed estimation method, system and platform suitable for multiple scenarios.
Disclosure of Invention
In order to overcome the defects and difficulties in the prior art, the invention provides a vehicle speed estimation method, system and platform suitable for multiple scenes, so as to acquire an accurate vehicle speed in real time, i.e. to obtain a more accurate vehicle speed under high vehicle acceleration (starting, sudden braking and the like).
A first object of the present invention is to provide a method for estimating a vehicle speed suitable for multiple scenarios; a second object of the present invention is to provide a vehicle speed estimation system suitable for multiple scenarios; a third object of the present invention is to provide a vehicle speed estimation platform suitable for multiple scenarios.
The first object of the present invention is achieved by:
acquiring first data of detection points corresponding to the self-vehicle in a scene, and generating the longitudinal speed of the self-vehicle corresponding to the first data according to the first data; wherein the first data includes detection point distance data, detection point speed data, and detection point angle data;
generating a stationary point speed distribution condition corresponding to the vehicle body by combining second data corresponding to the vehicle body according to the vehicle longitudinal speed; wherein the second data comprises vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data;
and generating the vehicle speed data corresponding to the vehicle body in real time according to the static point speed distribution condition.
Further, the calculation formula of the own-vehicle longitudinal speed is as follows:
V_otg = V / cosθ   (1)
where V_otg is the own-vehicle longitudinal speed, V is the radial speed of the detection point, and θ is the azimuth angle of the detection point.
Further, the generating a stationary point speed distribution corresponding to the vehicle body according to the vehicle longitudinal speed and the second data corresponding to the vehicle body, further includes:
creating a first threshold value corresponding to the dynamic and static properties of the detection point; the first threshold value is used for judging the dynamic and static attribute threshold of the detection point;
determining to generate first detection point data corresponding to the longitudinal speed of the vehicle and smaller than the first threshold value.
Further, the calculation formula of the first threshold value is as follows:
Threshold1 = V_err + cosθ·lon + sinθ·lat   (2)
where V_err is the minimum baseline error, lon is the longitudinal velocity correction, and lat is the lateral velocity correction.
Further, the generating a stationary point speed distribution corresponding to the vehicle body according to the vehicle longitudinal speed and the second data corresponding to the vehicle body, further includes:
creating a second threshold corresponding to the dynamic and static attributes of the detection points according to the scene in which the own vehicle is located; the second threshold is a threshold on the number of valid points meeting the first threshold;
it is determined to generate second detection point data corresponding to the own vehicle longitudinal speed and smaller than the second threshold value.
Further, the generating a stationary point speed distribution corresponding to the vehicle body according to the vehicle longitudinal speed and the second data corresponding to the vehicle body, further includes:
creating a histogram parameter corresponding to the histogram by combining the histogram statistics; wherein the histogram parameters include the range, group distance and group number;
generating group data corresponding to the detection points according to the histogram parameters; wherein the group data comprises a group number of each detection point;
according to the group data, sequentially comparing the number of detection points in each group, and generating group number data corresponding to the number of the detection points; the group number data is the group number where the detection point number group is located.
Further, the generating, in real time, the vehicle speed data corresponding to the vehicle body according to the rest point speed distribution condition, further includes:
smoothing deflection angle data corresponding to the vehicle body by combining linear filtering;
creating a third threshold value corresponding to the longitudinal acceleration of the vehicle, and judging whether the vehicle speed data is abnormal or not according to the third threshold value; wherein the third threshold is 30% of the maximum longitudinal acceleration of the vehicle.
The second object of the present invention is achieved by: the system is applied to the self-vehicle speed estimation method; the system comprises:
the acquisition generation unit is used for acquiring first data of detection points corresponding to the vehicle in the scene and generating longitudinal speed of the vehicle corresponding to the first data according to the first data; wherein the first data includes detection point distance data, detection point speed data, and detection point angle data;
the first generation unit is used for generating a stationary point speed distribution situation corresponding to the self-vehicle body according to the longitudinal speed of the self-vehicle and combining second data corresponding to the self-vehicle body; wherein the second data comprises vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data;
and the second generation unit is used for generating the vehicle speed data corresponding to the vehicle body in real time according to the static point speed distribution condition.
Further, the calculation formula of the own-vehicle longitudinal speed is as follows:
V_otg = V / cosθ   (1)
where V_otg is the own-vehicle longitudinal speed, V is the radial speed of the detection point, and θ is the azimuth angle of the detection point;
the first generating unit further includes:
the first creating module is used for creating a first threshold value corresponding to the dynamic and static properties of the detection point; the first threshold value is used for judging the dynamic and static attribute threshold of the detection point;
a first determination module for determining to generate first detection point data corresponding to the longitudinal speed of the vehicle and smaller than the first threshold;
the calculation formula of the first threshold value is as follows:
Threshold1 = V_err + cosθ·lon + sinθ·lat   (2)
where V_err is the minimum baseline error, lon is the longitudinal velocity correction, and lat is the lateral velocity correction;
and/or, the first generating unit further comprises:
the second creating module is used for creating a second threshold corresponding to the dynamic and static attributes of the detection points according to the scene in which the own vehicle is located; the second threshold is a threshold on the number of valid points meeting the first threshold;
a second determination module configured to determine to generate second detection point data corresponding to the own vehicle longitudinal speed and smaller than the second threshold;
and/or, the first generating unit further comprises:
the third creating module is used for creating a histogram parameter corresponding to the histogram by combining the histogram statistics; wherein the histogram parameters include the range, group distance and group number;
the first generation module is used for generating group data corresponding to the detection points according to the histogram parameters; wherein the group data comprises a group number of each detection point;
the second generation module is used for sequentially comparing the number of the detection points in each group according to the group data and generating group number data corresponding to the number of the detection points; the group number data is the group number where the detection point number group is located;
and/or, the second generating unit further includes:
the first processing module is used for smoothing the deflection angle data corresponding to the own-vehicle body by means of linear filtering;
the third judging module is used for creating a third threshold value corresponding to the longitudinal acceleration of the vehicle and judging whether the vehicle speed data has abnormality or not according to the third threshold value; wherein the third threshold is 30% of the maximum longitudinal acceleration of the vehicle.
The third object of the present invention is achieved by: the system comprises a processor, a memory and a vehicle speed estimation platform control program suitable for multiple scenes; the processor executes the vehicle speed estimation platform control program suitable for multiple scenes, the vehicle speed estimation platform control program suitable for multiple scenes is stored in the memory, and the vehicle speed estimation platform control program suitable for multiple scenes realizes the vehicle speed estimation method suitable for multiple scenes.
By acquiring first data of the detection points corresponding to the own vehicle in a scene and generating the own-vehicle longitudinal speed corresponding to the first data (the first data including detection point distance data, detection point speed data and detection point angle data); generating the stationary-point speed distribution corresponding to the own-vehicle body from the longitudinal speed combined with second data corresponding to the vehicle body (the second data including vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data); and generating the vehicle speed data corresponding to the vehicle body in real time from the stationary-point speed distribution, together with the corresponding system and platform, the effect of acquiring an accurate vehicle speed in real time is achieved, i.e. a more accurate vehicle speed can be obtained under high vehicle acceleration (starting, sudden braking and the like).
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for estimating a vehicle speed suitable for multiple scenarios according to the present invention;
FIG. 2 is a schematic diagram of a vehicle speed estimation process according to an embodiment of a vehicle speed estimation method suitable for multiple scenarios;
FIG. 3 is a schematic diagram of a real-time expressway scene according to an embodiment of a method for estimating a vehicle speed for multiple scenes;
FIG. 4 is a schematic diagram of a vehicle speed estimation system suitable for multiple scenarios according to the present invention;
FIG. 5 is a schematic diagram of a vehicle speed estimation platform suitable for multiple scenarios according to the present invention;
the achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
For a better understanding of the present invention, its objects, technical solutions and advantages, further description of the present invention will be made with reference to the drawings and detailed description, and further advantages and effects will be readily apparent to those skilled in the art from the present disclosure.
The invention may be practiced or carried out in other embodiments and details within the scope and range of equivalents of the various features and advantages of the invention.
It should be noted that, if directional indications (such as up, down, left, right, front, and rear … …) are included in the embodiments of the present invention, the directional indications are merely used to explain the relative positional relationship, movement conditions, etc. between the components in a specific posture (as shown in the drawings), and if the specific posture is changed, the directional indications are correspondingly changed.
In addition, if there is a description of "first", "second", etc. in the embodiments of the present invention, the description of "first", "second", etc. is for descriptive purposes only and is not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. Secondly, the technical solutions of the embodiments may be combined with each other, but it is necessary to be based on the fact that those skilled in the art can realize the technical solutions, and when the technical solutions are contradictory or cannot be realized, the technical solutions are considered to be absent and are not within the scope of protection claimed in the present invention.
Preferably, the method for estimating the speed of the vehicle applicable to multiple scenes is applied to one or more terminals or servers. The terminal is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), an embedded device, and the like.
The terminal can be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server and the like. The terminal can perform man-machine interaction with a client through a keyboard, a mouse, a remote controller, a touch pad or voice control equipment and the like.
The invention discloses a method, a system and a platform for estimating the speed of a vehicle applicable to multiple scenes.
Fig. 1 is a flowchart of a method for estimating a vehicle speed suitable for multiple scenes according to an embodiment of the present invention.
In this embodiment, the vehicle speed estimation method applicable to multiple scenes may be applied to a terminal or a fixed terminal with a display function, where the terminal is not limited to a personal computer, a smart phone, a tablet computer, a desktop computer or an integrated machine with a camera, and the like.
The vehicle speed estimation method suitable for multiple scenes can also be applied to a hardware environment formed by a terminal and a server connected with the terminal through a network. Networks include, but are not limited to: a wide area network, a metropolitan area network, or a local area network. The vehicle speed estimation method suitable for multiple scenes in the embodiment of the invention can be executed by a server, a terminal or both.
For example, for a terminal that needs to perform a vehicle speed estimation suitable for multiple scenes, the vehicle speed estimation function suitable for multiple scenes provided by the method of the present invention may be directly integrated on the terminal, or a client for implementing the method of the present invention may be installed. For example, the method provided by the invention can also be operated on a server and other devices in the form of a software development kit (Software Development Kit, SDK), an interface suitable for the multi-scene speed estimation function is provided in the form of the SDK, and the terminal or other devices can realize the speed estimation function suitable for the multi-scene speed estimation function through the provided interface. The invention is further elucidated below in connection with the accompanying drawings.
As shown in fig. 1, the present invention provides a method for estimating a vehicle speed suitable for multiple scenes, the method comprising the following steps:
s1, acquiring first data of detection points corresponding to the vehicle in a scene, and generating longitudinal speed of the vehicle corresponding to the first data according to the first data; wherein the first data includes detection point distance data, detection point speed data, and detection point angle data;
s2, generating a stationary point speed distribution situation corresponding to the self-vehicle body according to the longitudinal speed of the self-vehicle and combining second data corresponding to the self-vehicle body; wherein the second data comprises vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data;
and S3, generating the vehicle speed data corresponding to the vehicle body in real time according to the stationary point speed distribution condition.
The calculation formula of the own-vehicle longitudinal speed is as follows:
V_otg = V / cosθ   (1)
where V_otg is the own-vehicle longitudinal speed, V is the radial speed of the detection point, and θ is the azimuth angle of the detection point.
The generating a stationary point speed distribution situation corresponding to the vehicle body according to the vehicle longitudinal speed and the second data corresponding to the vehicle body, further comprises:
s21, creating a first threshold value corresponding to the dynamic and static properties of the detection point; the first threshold value is used for judging the dynamic and static attribute threshold of the detection point;
s22, judging to generate first detection point data which corresponds to the longitudinal speed of the vehicle and is smaller than the first threshold value.
The calculation formula of the first threshold value is as follows:
Threshold1 = V_err + cosθ·lon + sinθ·lat   (2)
where V_err is the minimum baseline error, lon is the longitudinal velocity correction, and lat is the lateral velocity correction.
The generating a stationary point speed distribution situation corresponding to the vehicle body according to the vehicle longitudinal speed and the second data corresponding to the vehicle body, further comprises:
S23, creating a second threshold corresponding to the dynamic and static attributes of the detection points according to the scene in which the own vehicle is located; the second threshold is a threshold on the number of valid points meeting the first threshold;
s24, judging to generate second detection point data which corresponds to the longitudinal speed of the vehicle and is smaller than the second threshold value.
The generating a stationary point speed distribution situation corresponding to the vehicle body according to the vehicle longitudinal speed and the second data corresponding to the vehicle body, further comprises:
s25, creating a histogram parameter corresponding to the histogram by combining the histogram statistics; wherein the histogram parameters include the range, group distance and group number;
s26, generating group data corresponding to the detection points according to the histogram parameters; wherein the group data comprises a group number of each detection point;
s27, sequentially comparing the number of detection points in each group according to the group data, and generating group number data corresponding to the number of the detection points; the group number data is the group number where the detection point number group is located.
Generating the vehicle speed data corresponding to the vehicle body in real time according to the stationary point speed distribution condition, and further comprising:
s31, smoothing deflection angle data corresponding to the vehicle body by combining linear filtering;
s32, creating a third threshold value corresponding to the longitudinal acceleration of the vehicle, and judging whether the vehicle speed data is abnormal or not according to the third threshold value; wherein the third threshold is 30% of the maximum longitudinal acceleration of the vehicle.
Specifically, the embodiment of the invention provides a vehicle speed estimation method based on millimeter wave radar. The method relies on histogram statistics and on information such as vehicle speed, wheel speed and acceleration output by the vehicle body sensors, and adaptively selects among the speed estimation results calculated by the different methods according to the actual scene.
In order to achieve the above purpose, the invention adopts the following technical scheme:
step 1: RVA information (distance, speed and angle) of all detection points of the current frame is obtained, and the longitudinal speed of each detection point is calculated:
V otg =V/cosθ (1)
wherein: v is the radial speed of the detection point, and θ is the azimuth angle of the detection point.
Step 2: acquire part of the body information from the vehicle body sensors: own-vehicle speed V_veh, deflection angle (yaw rate) yawRate, vehicle longitudinal acceleration longAcc, and the wheel speeds wheelSpeed[4].
Step 3: use a histogram to count the speed distribution of the absolutely stationary points.
First, select the valid data, i.e. the detection points whose longitudinal speed V_otg satisfies Threshold1, the threshold for judging the dynamic/static attribute of a detection point:
Threshold1 = V_err + cosθ·lon + sinθ·lat   (2)
where V_err is the minimum baseline error, lon is the longitudinal velocity correction and lat is the lateral velocity correction, all set through offline data statistics.
Secondly, set the histogram parameters: the range, the group distance and the number of groups.
Set the range D:
D = V_max - V_min   (3)
The range only needs to cover the possible target speeds; by default the maximum speed V_max is 50 m/s and the minimum speed V_min is 0 m/s.
The group distance d (d ∈ (0, D)) is set by weighing the precision of the speed result against the computational load: the smaller the group distance, the more accurate the final speed obtained from the histogram statistics, but the larger the corresponding computational load; it can be adjusted according to actual use.
Set the number of groups: K = D/d.
Then judge whether the number of valid detection points meets a set threshold Threshold2. This threshold is adjusted according to the actual application scene; considering that individual scenes contain few static targets, a value of 1-10 is recommended. If the number of valid detection points meets Threshold2, continue; otherwise go to step 4.
Next, calculate the group number in which each detection point falls:
index = V_otg/d + (D - 1) + 0.5   (4)
Finally, count the number of detection points falling into each group and compare the group counts in turn to obtain the group number index_max of the group with the largest number of detection points, then calculate the estimated speed:
V_est1 = -(index_max - (D - 1)) × d   (5)
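The whole of step 3 can be sketched in Python as follows. Two points are assumptions of this sketch rather than statements of the patent: the stationary-point gate compares V_otg with the body-reported speed -V_veh within Threshold1, and the bin offset is taken as K - 1, which reproduces the worked example below (index_max = 105 gives V_est1 = 23.50 m/s), whereas formulas (4) and (5) print D - 1.

```python
import math

def estimate_speed_histogram(detections, v_veh, v_err, lon, lat,
                             v_min=0.0, v_max=50.0, d=0.25, threshold2=5):
    """Sketch of Step 3: histogram statistics over stationary detections.

    detections: iterable of (radial_speed_mps, azimuth_rad) tuples.
    v_veh:      body-reported own-vehicle speed in m/s.
    Returns (v_est1, n_valid); v_est1 is None when Threshold2 is not met.
    """
    D = v_max - v_min                              # range, formula (3)
    K = int(round(D / d))                          # number of groups
    counts = [0] * K
    n_valid = 0
    for v, theta in detections:
        c = math.cos(theta)
        if abs(c) < 1e-3:                          # guard added in this sketch
            continue
        v_otg = v / c                              # formula (1)
        thr1 = v_err + math.cos(theta) * lon + math.sin(theta) * lat   # formula (2)
        if abs(v_otg + v_veh) > thr1:              # assumed stationary-point gate
            continue
        n_valid += 1
        index = int(v_otg / d + (K - 1) + 0.5)     # formula (4), offset K - 1 assumed
        if 0 <= index < K:
            counts[index] += 1
    if n_valid < threshold2:                       # too few static points -> no estimate
        return None, n_valid
    index_max = counts.index(max(counts))
    v_est1 = -(index_max - (K - 1)) * d            # formula (5), offset K - 1 assumed
    return v_est1, n_valid
```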
step 4: the deflection angle yawRate is smoothed using linear filtering using the following formula:
yawRate - k =yawRate k ×K+yawRate - k-1 ×(1-K),...K∈[0,1] (6)
wherein: yawRate k For the measured value obtained from the car body information of the current frame, yawRate - k-1 For the smooth value of the previous frame, yawRate- k K is a linear coefficient (K is adjusted according to actual use conditions so that the deflection angle variance after smoothing is within a certain range) for the smoothing value of the current frame.
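Formula (6) is a first-order (exponential) smoothing filter; a minimal Python sketch is shown below, where the default coefficient value is only a placeholder.

```python
def smooth_yaw_rate(measured: float, prev_smoothed: float, k: float = 0.2) -> float:
    """Formula (6): first-order smoothing of the yaw rate.

    k in [0, 1] is the linear coefficient; 0.2 is a placeholder here, the patent
    says it is tuned so the smoothed yaw-rate variance stays within a chosen range.
    """
    assert 0.0 <= k <= 1.0
    return measured * k + prev_smoothed * (1.0 - k)
```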
Step 5: judge the validity of the wheel speed information, i.e. eliminate abnormal wheel speed data. The wheel speed information is considered invalid when the own-vehicle speed and the rear wheel speeds satisfy condition (7): the own vehicle is in a motion state while the left rear wheel speed wheelSpeed[3] or the right rear wheel speed wheelSpeed[4] is 0; here 1e-6 (scientific notation for 1 × 10^-6, i.e. 0.000001) is used as the zero tolerance. If the condition holds, the wheel speed information is considered invalid and step 6 is entered; otherwise the wheel speed information is considered valid and step 7 is entered.
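A possible formalisation of the step 5 check is sketched below; the explicit comparisons against the 1e-6 tolerance are an assumed reading of condition (7), which the text states only verbally.

```python
EPS = 1e-6  # the 1e-6 tolerance mentioned in the patent text

def wheel_speed_valid(v_veh: float, wheel_speed_rl: float, wheel_speed_rr: float) -> bool:
    """Sketch of Step 5: wheel-speed information is invalid when the own vehicle
    is moving while the left-rear or right-rear wheel speed reads zero."""
    moving = abs(v_veh) > EPS
    rear_wheel_zero = wheel_speed_rl < EPS or wheel_speed_rr < EPS
    return not (moving and rear_wheel_zero)
```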
Step 6: calculate the final estimated own-vehicle speed V_comp. If the wheel speed information was judged invalid in step 5, two cases are considered: 1) the number of valid detection points in step 3 does not meet the threshold; 2) there is a large difference between V_est1 and V_est2, i.e. |V_est1 - V_est2| > 10d. If either of the two conditions is satisfied, V_comp = V_veh; otherwise V_comp = V_est1.
Step 7: if the wheel speed information was judged valid in step 5, enter this step.
First, calculate the vehicle speed V_est2 from the left rear and right rear wheel speeds:
V_est2 = (wheelSpeed[3] + wheelSpeed[4]) / 2   (8)
Secondly, judge whether the vehicle longitudinal acceleration longAcc meets a set threshold Threshold3; this threshold is set from offline data statistics and is generally 30% of the maximum longitudinal acceleration of the vehicle. Three cases are considered: 1) the number of valid detection points in step 3 does not meet the threshold; 2) there is a large difference between V_est1 and V_est2, i.e. |V_est1 - V_est2| > 5d; 3) longAcc meets the emergency braking condition (i.e. longAcc is greater than 60% of the maximum longitudinal acceleration). If the longitudinal acceleration meets the threshold and any one of the three conditions is satisfied, V_comp = V_est2; otherwise V_comp = V_est1.
Finally, if longAcc does not meet Threshold3, two cases are considered: 1) a turning scene, i.e. yawRate is large (|yawRate| > ω, where ω is adjusted according to the actual application scene); 2) there is a large difference between V_est1 and V_est2, i.e. |V_est1 - V_est2| > 5d. If the two conditions are satisfied at the same time, V_comp = V_est1; otherwise V_comp = V_est2.
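Steps 6 and 7 together form the adaptive selection of the final speed V_comp; the following Python sketch summarises that decision logic. The maximum longitudinal acceleration a_max, the turning threshold omega and the reading of "meets Threshold3" as |longAcc| ≥ Threshold3 are assumptions of this sketch.

```python
def select_final_speed(v_veh, v_est1, v_est2, long_acc, yaw_rate,
                       wheel_speed_ok, d=0.25, a_max=10.0, omega=0.1):
    """Sketch of Steps 6-7: adaptive choice of the final own-vehicle speed V_comp.

    v_est1: histogram estimate from Step 3 (None if Threshold2 was not met).
    v_est2: rear-wheel-speed estimate; v_veh: body-reported speed.
    a_max and omega are illustrative placeholders set offline / per scene.
    """
    hist_ok = v_est1 is not None
    diff = abs(v_est1 - v_est2) if hist_ok else 0.0

    if not wheel_speed_ok:                         # Step 6
        if not hist_ok or diff > 10 * d:
            return v_veh                           # fall back to body-reported speed
        return v_est1

    # Step 7: wheel-speed information is valid.
    threshold3 = 0.3 * a_max                       # Threshold3 = 30% of max. longitudinal acceleration
    if abs(long_acc) >= threshold3:                # "meets Threshold3" read as exceeding it
        emergency = abs(long_acc) > 0.6 * a_max    # emergency-braking condition
        if not hist_ok or diff > 5 * d or emergency:
            return v_est2                          # trust the rear-wheel speed
        return v_est1
    # Low acceleration: prefer the wheel-speed estimate unless turning and disagreeing.
    turning = abs(yaw_rate) > omega
    if turning and diff > 5 * d:
        return v_est1
    return v_est2
```

In the expressway example below, wheel_speed_ok is true, |longAcc| stays below Threshold3 and the turning/disagreement conditions do not both hold, so the selection returns V_est2.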
The following describes the process flow of the scheme of the present invention with reference to the expressway scenario example of fig. 3, and the specific steps are as follows:
step 1: and selecting one frame of data, namely 160 detection points in total, and calculating the longitudinal speed of each detection point.
Step 2: acquire the vehicle body information (own-vehicle speed, deflection angle, longitudinal acceleration and wheel speeds of the current frame).
step 3: using histogram to count the speed distribution of absolute rest point:
first, threshold1 is calculated according to the following formula:
Threshold1=V err +cosθ·lon+sinθ·lat (2)
wherein V is err The dynamic and static thresholds corresponding to the detection points can be calculated by combining the azimuth angles of the detection points, and the number of the effective detection points is 146.
Secondly, setting a histogram parameter: extremely poor d=v max -V min =50-0=50, group distance d=0.25, and group number k=d/d=50/0.25=200.
Again, the Threshold2 is set to 5 and the number of valid detection points satisfies the Threshold2.
Finally, calculating the group number of each detection point and counting the number of detection points falling into each group, sequentially comparing the number of detection points of each group, and obtaining the group number index of the group with the highest number of detection points max =105, the estimated speed is calculated: v (V) est1 =-(index max -(D-1))×d=23.50。
Step 4: yawRate = 0.00175 for the current frame is calculated according to smoothing formula (6).
Step 5: the wheel speed information is judged to be valid, so step 7 is entered.
Step 6: skipping.
Step 7: calculate the final estimated own-vehicle speed V_comp.
First, calculate the vehicle speed from the left rear and right rear wheel speeds (converted to m/s):
V_est2 = (wheelSpeed[3] + wheelSpeed[4]) / 2 = (84.59 + 84.69) / 2 / 3.6 = 23.51.
Next, the longitudinal acceleration threshold Threshold3 is set to 3 m/s², and longAcc does not meet the threshold.
Finally, check the two cases: 1) a turning scene, i.e. yawRate is large (|yawRate| > ω, where ω is adjusted according to the actual application scene); 2) a large difference between V_est1 and V_est2, i.e. |V_est1 - V_est2| > 5d. The two conditions are not satisfied at the same time, so V_comp = V_est2 = 23.51 m/s.
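The arithmetic of this worked example can be checked with a few lines of Python (using the K - 1 offset discussed in step 3):

```python
# Numbers from the expressway example above (wheel speeds in km/h).
d = 0.25
K = 200
index_max = 105
v_est1 = -(index_max - (K - 1)) * d          # 23.50 m/s
v_est2 = (84.59 + 84.69) / 2 / 3.6           # 23.51 m/s after km/h -> m/s conversion
assert abs(v_est1 - 23.50) < 1e-9
assert abs(round(v_est2, 2) - 23.51) < 1e-9
# |v_est1 - v_est2| = 0.01 m/s < 5 * d = 1.25 m/s, so the final speed is v_est2.
```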
In order to achieve the above objective, the present invention further provides a vehicle speed estimation system suitable for multiple scenes, as shown in fig. 4, where the system is applied to the vehicle speed estimation method; the system comprises:
the acquisition generation unit is used for acquiring first data of detection points corresponding to the vehicle in the scene and generating longitudinal speed of the vehicle corresponding to the first data according to the first data; wherein the first data includes detection point distance data, detection point speed data, and detection point angle data;
the first generation unit is used for generating a stationary point speed distribution situation corresponding to the self-vehicle body according to the longitudinal speed of the self-vehicle and combining second data corresponding to the self-vehicle body; wherein the second data comprises vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data;
and the second generation unit is used for generating the vehicle speed data corresponding to the vehicle body in real time according to the static point speed distribution condition.
Further, the calculation formula of the own-vehicle longitudinal speed is as follows:
V_otg = V / cosθ   (1)
where V_otg is the own-vehicle longitudinal speed, V is the radial speed of the detection point, and θ is the azimuth angle of the detection point;
the first generating unit further includes:
the first creating module is used for creating a first threshold value corresponding to the dynamic and static properties of the detection point; the first threshold value is used for judging the dynamic and static attribute threshold of the detection point;
a first determination module for determining to generate first detection point data corresponding to the longitudinal speed of the vehicle and smaller than the first threshold;
the calculation formula of the first threshold value is as follows:
Threshold1 = V_err + cosθ·lon + sinθ·lat   (2)
where V_err is the minimum baseline error, lon is the longitudinal velocity correction, and lat is the lateral velocity correction;
and/or, the first generating unit further comprises:
the second creating module is used for creating a second threshold corresponding to the dynamic and static attributes of the detection points according to the scene in which the own vehicle is located; the second threshold is a threshold on the number of valid points meeting the first threshold;
a second determination module configured to determine to generate second detection point data corresponding to the own vehicle longitudinal speed and smaller than the second threshold;
and/or, the first generating unit further comprises:
the third creating module is used for creating a histogram parameter corresponding to the histogram by combining the histogram statistics; wherein the histogram parameters include the range, group distance and group number;
the first generation module is used for generating group data corresponding to the detection points according to the histogram parameters; wherein the group data comprises a group number of each detection point;
the second generation module is used for sequentially comparing the number of the detection points in each group according to the group data and generating group number data corresponding to the number of the detection points; the group number data is the group number where the detection point number group is located;
and/or, the second generating unit further includes:
the first processing module is used for smoothing the deflection angle data corresponding to the own-vehicle body by means of linear filtering;
the third judging module is used for creating a third threshold value corresponding to the longitudinal acceleration of the vehicle and judging whether the vehicle speed data has abnormality or not according to the third threshold value; wherein the third threshold is 30% of the maximum longitudinal acceleration of the vehicle.
In the embodiment of the system solution of the present invention, the specific details of the steps of the method involved in the vehicle speed estimation system suitable for multiple scenarios are described above, that is, the functional modules in the system are used to implement the steps or sub-steps in the embodiment of the method, which are not described herein.
In order to achieve the above objective, the present invention further provides a vehicle speed estimation platform suitable for multiple scenes, as shown in fig. 5, including a processor, a memory, and a vehicle speed estimation platform control program suitable for multiple scenes; the processor executes the vehicle speed estimation platform control program suitable for multiple scenes, the vehicle speed estimation platform control program suitable for multiple scenes is stored in the memory, and the vehicle speed estimation platform control program suitable for multiple scenes realizes the vehicle speed estimation method steps suitable for multiple scenes. For example:
s1, acquiring first data of detection points corresponding to the vehicle in a scene, and generating longitudinal speed of the vehicle corresponding to the first data according to the first data; wherein the first data includes detection point distance data, detection point speed data, and detection point angle data;
s2, generating a stationary point speed distribution situation corresponding to the self-vehicle body according to the longitudinal speed of the self-vehicle and combining second data corresponding to the self-vehicle body; wherein the second data comprises vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data;
and S3, generating the vehicle speed data corresponding to the vehicle body in real time according to the stationary point speed distribution condition.
The details of the steps are set forth above and are not repeated here.
In the embodiment of the invention, the processor built into the vehicle speed estimation platform suitable for multiple scenes may be composed of integrated circuits, for example a single packaged integrated circuit, or of a plurality of integrated circuits packaged with the same or different functions, including one or more central processing units (CPU), microprocessors, digital signal processing chips, graphics processors, various control chips and the like. The processor accesses the various components through various interfaces and lines, and performs the various functions and data processing of the multi-scene vehicle speed estimation by running or executing the programs or units stored in the memory and calling the data stored in the memory;
the memory is used for storing program codes and various data, is installed in a vehicle speed estimation platform suitable for multiple scenes, and realizes high-speed and automatic program or data access in the running process.
The memory includes read-only memory (ROM), random access memory (RAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), one-time programmable read-only memory (OTPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disc memory, magnetic disk memory, tape memory, or any other medium that a computer can use to carry or store data.
By acquiring first data of the detection points corresponding to the own vehicle in a scene and generating the own-vehicle longitudinal speed corresponding to the first data (the first data including detection point distance data, detection point speed data and detection point angle data); generating the stationary-point speed distribution corresponding to the own-vehicle body from the longitudinal speed combined with second data corresponding to the vehicle body (the second data including vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data); and generating the vehicle speed data corresponding to the vehicle body in real time from the stationary-point speed distribution, together with the corresponding system and platform, the effect of acquiring an accurate vehicle speed in real time is achieved, and a more accurate vehicle speed can be obtained under high vehicle acceleration (starting, sudden braking and the like).
The foregoing examples illustrate only a few embodiments of the invention and are described in detail herein without thereby limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.
Claims (10)
1. A method of estimating a speed of a vehicle for use in a plurality of scenarios, the method comprising the steps of:
acquiring first data of detection points corresponding to the self-vehicle in a scene, and generating the longitudinal speed of the self-vehicle corresponding to the first data according to the first data; wherein the first data includes detection point distance data, detection point speed data, and detection point angle data;
generating a stationary point speed distribution condition corresponding to the vehicle body by combining second data corresponding to the vehicle body according to the vehicle longitudinal speed; wherein the second data comprises vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data;
and generating the vehicle speed data corresponding to the vehicle body in real time according to the static point speed distribution condition.
2. The method for estimating speed of a vehicle according to claim 1, wherein the calculation formula of the longitudinal speed of the vehicle is as follows:
V_otg = V / cosθ   (1)
where V_otg is the own-vehicle longitudinal speed, V is the radial speed of the detection point, and θ is the azimuth angle of the detection point.
3. The method for estimating a vehicle speed for use in multiple scenarios according to claim 1, wherein the generating a stationary point speed distribution corresponding to the vehicle body according to the vehicle longitudinal speed in combination with the second data corresponding to the vehicle body further comprises:
creating a first threshold value corresponding to the dynamic and static properties of the detection point; the first threshold value is used for judging the dynamic and static attribute threshold of the detection point;
determining to generate first detection point data corresponding to the longitudinal speed of the vehicle and smaller than the first threshold value.
4. A method for estimating a vehicle speed for a plurality of scenes according to claim 3, wherein the calculation formula of the first threshold is as follows:
Threshold1 = V_err + cosθ·lon + sinθ·lat   (2)
where V_err is the minimum baseline error, lon is the longitudinal velocity correction, and lat is the lateral velocity correction.
5. A method for estimating a vehicle speed suitable for use in a multi-scenario according to claim 1 or 3, wherein said generating a stationary point speed distribution corresponding to a vehicle body from said vehicle longitudinal speed in combination with second data corresponding to the vehicle body further comprises:
creating a second threshold corresponding to the dynamic and static attributes of the detection points according to the scene in which the own vehicle is located; the second threshold is a threshold on the number of valid points meeting the first threshold;
it is determined to generate second detection point data corresponding to the own vehicle longitudinal speed and smaller than the second threshold value.
6. The method for estimating a vehicle speed for use in multiple scenarios according to claim 5, wherein the generating a stationary point speed distribution corresponding to the vehicle body according to the vehicle longitudinal speed in combination with the second data corresponding to the vehicle body further comprises:
creating a histogram parameter corresponding to the histogram by combining the histogram statistics; wherein the histogram parameters include the range, group distance and group number;
generating group data corresponding to the detection points according to the histogram parameters; wherein the group data comprises a group number of each detection point;
according to the group data, sequentially comparing the number of detection points in each group, and generating group number data corresponding to the number of the detection points; the group number data is the group number where the detection point number group is located.
7. The method for estimating a vehicle speed suitable for multiple scenes according to claim 1, wherein the generating vehicle speed data corresponding to a vehicle body in real time according to the stationary point speed distribution condition further comprises:
smoothing deflection angle data corresponding to the vehicle body by combining linear filtering;
creating a third threshold value corresponding to the longitudinal acceleration of the vehicle, and judging whether the vehicle speed data is abnormal or not according to the third threshold value; wherein the third threshold is 30% of the maximum longitudinal acceleration of the vehicle.
8. A vehicle speed estimation system adapted for use in multiple scenarios, characterized in that the system is applied to a vehicle speed estimation method according to any one of claims 1-7; the system comprises:
the acquisition generation unit is used for acquiring first data of detection points corresponding to the vehicle in the scene and generating longitudinal speed of the vehicle corresponding to the first data according to the first data; wherein the first data includes detection point distance data, detection point speed data, and detection point angle data;
the first generation unit is used for generating a stationary point speed distribution situation corresponding to the self-vehicle body according to the longitudinal speed of the self-vehicle and combining second data corresponding to the self-vehicle body; wherein the second data comprises vehicle speed data, vehicle deflection angle data, vehicle longitudinal acceleration data and vehicle wheel speed data;
and the second generation unit is used for generating the vehicle speed data corresponding to the vehicle body in real time according to the static point speed distribution condition.
9. The vehicle speed estimation system according to claim 8, wherein the calculation formula of the vehicle longitudinal speed is as follows:
V_otg = V / cosθ   (1)
where V_otg is the own-vehicle longitudinal speed, V is the radial speed of the detection point, and θ is the azimuth angle of the detection point;
the first generating unit further includes:
the first creating module is used for creating a first threshold value corresponding to the dynamic and static properties of the detection point; the first threshold value is used for judging the dynamic and static attribute threshold of the detection point;
a first determination module for determining to generate first detection point data corresponding to the longitudinal speed of the vehicle and smaller than the first threshold;
the calculation formula of the first threshold value is as follows:
Threshold1 = V_err + cosθ·lon + sinθ·lat   (2)
where V_err is the minimum baseline error, lon is the longitudinal velocity correction, and lat is the lateral velocity correction;
and/or, the first generating unit further comprises:
the second creating module is used for creating a second threshold corresponding to the dynamic and static attributes of the detection points according to the scene in which the own vehicle is located; the second threshold is a threshold on the number of valid points meeting the first threshold;
a second determination module configured to determine to generate second detection point data corresponding to the own vehicle longitudinal speed and smaller than the second threshold;
and/or, the first generating unit further comprises:
the third creating module is used for creating a histogram parameter corresponding to the histogram by combining the histogram statistics; wherein the histogram parameters include the range, group distance and group number;
the first generation module is used for generating group data corresponding to the detection points according to the histogram parameters; wherein the group data comprises a group number of each detection point;
the second generation module is used for sequentially comparing the number of the detection points in each group according to the group data and generating group number data corresponding to the number of the detection points; the group number data is the group number where the detection point number group is located;
and/or, the second generating unit further includes:
the first processing module is used for smoothing the deflection angle data corresponding to the own-vehicle body by means of linear filtering;
the third judging module is used for creating a third threshold value corresponding to the longitudinal acceleration of the vehicle and judging whether the vehicle speed data has abnormality or not according to the third threshold value; wherein the third threshold is 30% of the maximum longitudinal acceleration of the vehicle.
10. The vehicle speed estimation platform is characterized by comprising a processor, a memory and a vehicle speed estimation platform control program applicable to multiple scenes; wherein the processor executes the vehicle speed estimation platform control program suitable for multiple scenes, the vehicle speed estimation platform control program suitable for multiple scenes is stored in the memory, and the vehicle speed estimation platform control program suitable for multiple scenes realizes the vehicle speed estimation method suitable for multiple scenes according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311755974.6A CN117783572A (en) | 2023-12-20 | 2023-12-20 | Vehicle speed estimation method, system and platform suitable for multiple scenes |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311755974.6A CN117783572A (en) | 2023-12-20 | 2023-12-20 | Vehicle speed estimation method, system and platform suitable for multiple scenes |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117783572A true CN117783572A (en) | 2024-03-29 |
Family
ID=90386143
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311755974.6A Pending CN117783572A (en) | 2023-12-20 | 2023-12-20 | Vehicle speed estimation method, system and platform suitable for multiple scenes |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117783572A (en) |
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009086788A (en) * | 2007-09-28 | 2009-04-23 | Hitachi Ltd | Vehicle surrounding monitoring device |
JP2009236570A (en) * | 2008-03-26 | 2009-10-15 | Aisin Aw Co Ltd | Apparatus and method for detecting vehicle speed and program |
JP2014089505A (en) * | 2012-10-29 | 2014-05-15 | Daimler Ag | Other-vehicle detection apparatus |
WO2021000313A1 (en) * | 2019-07-04 | 2021-01-07 | 深圳市大疆创新科技有限公司 | Methods of using lateral millimeter wave radar to detect lateral stationary object and measure moving speed |
WO2021012254A1 (en) * | 2019-07-25 | 2021-01-28 | 深圳市大疆创新科技有限公司 | Target detection method, system, and mobile platform |
CN112550300A (en) * | 2019-09-25 | 2021-03-26 | 比亚迪股份有限公司 | Vehicle speed detection method and device, storage medium, electronic equipment and vehicle |
CN111308458A (en) * | 2020-02-21 | 2020-06-19 | 北京理工睿行电子科技有限公司 | Vehicle speed estimation method based on vehicle-mounted millimeter wave radar |
CN111845755A (en) * | 2020-06-10 | 2020-10-30 | 武汉理工大学 | Method for estimating longitudinal speed of vehicle |
CN111775950A (en) * | 2020-07-07 | 2020-10-16 | 清华大学苏州汽车研究院(吴江) | Vehicle reference speed measuring and calculating method, device, equipment, storage medium and system |
CN112109708A (en) * | 2020-10-26 | 2020-12-22 | 吉林大学 | Adaptive cruise control system considering driving behaviors and control method thereof |
CN113788021A (en) * | 2021-09-03 | 2021-12-14 | 东南大学 | Adaptive following cruise control method combined with preceding vehicle speed prediction |
CN113771857A (en) * | 2021-09-24 | 2021-12-10 | 北京易航远智科技有限公司 | Longitudinal speed estimation method and system for vehicle control |
CN116299302A (en) * | 2023-05-19 | 2023-06-23 | 南京隼眼电子科技有限公司 | Vehicle body speed determination method, radar system, and storage medium |
CN116577787A (en) * | 2023-05-30 | 2023-08-11 | 浙江飞碟汽车制造有限公司 | Vehicle motion state parameter estimation method based on vehicle millimeter wave radar |
Non-Patent Citations (3)
Title |
---|
IKRAM MZ, ET AL: "3-D Object Tracking in Millimeter-Wave Radar for Advanced Driver Assistance Systems", 《2013 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING》, 31 December 2013 (2013-12-31), pages 723 - 726, XP032566730, DOI: 10.1109/GlobalSIP.2013.6736993 * |
WANG CHANG; XU YUANXIN; FU RUI; GUO YINGSHI; YUAN WEI: "Identification algorithm of potentially dangerous targets for lane-change warning systems", Journal of Chang'an University (Natural Science Edition), vol. 35, no. 1, 15 January 2015 (2015-01-15), pages 98 - 105 *
GAO BOLIN; CHEN HUI: "State estimation of all-wheel-drive vehicles based on wheel force sensor information", Transactions of the Chinese Society for Agricultural Machinery, vol. 43, no. 12, 25 December 2012 (2012-12-25), pages 22 - 27 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |