CN110203204A - Vehicle-surroundings environment perception method - Google Patents

Vehicle-surroundings environment perception method

Info

Publication number
CN110203204A
Authority
CN
China
Prior art keywords
target
vehicle
effective
radar
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910411835.9A
Other languages
Chinese (zh)
Inventor
杨书艽
徐波
丁赞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Senyun Intelligent Technology Co Ltd
Original Assignee
Shenzhen Senyun Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Senyun Intelligent Technology Co Ltd filed Critical Shenzhen Senyun Intelligent Technology Co Ltd
Priority to CN201910411835.9A priority Critical patent/CN110203204A/en
Publication of CN110203204A publication Critical patent/CN110203204A/en
Pending legal-status Critical Current


Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 — Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 — Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for; electric constitutive elements
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 — Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 — Estimation or calculation of non-directly measurable driving parameters related to ambient conditions

Abstract

The present invention relates to the field of radar-based perception and specifically discloses a vehicle-surroundings environment perception method, comprising the steps of: S1, mounting several radars at positions around the vehicle and establishing communication with a data processing device; S2, the radars identifying targets from the environment around their mounting positions, generating corresponding target information, and sending the target information to the data processing device; S3, the data processing device screening effective targets out of the target information; S4, the data processing device fusing the target information of the effective targets to generate a vehicle-surroundings perception result. By screening and fusing the target information generated by the radars, the vehicle-surroundings environment perception method of the invention obtains a highly accurate perception of the vehicle's surroundings, which serves as the primary decision basis on which a vehicle behavior decision device controls the driving of the vehicle, thereby ensuring driving safety.

Description

Vehicle-surroundings environment perception method
Technical field
The present invention relates to the field of radar-based perception, and in particular to a vehicle-surroundings environment perception method.
Background technique
With the popularization of the automobile, using sensors to improve the recognition of objects on the road, to warn and assist the driver more safely, and thereby to improve the active safety of the vehicle has become an important direction of development in the field of intelligent transportation systems. Because a single sensor usually cannot perceive the vehicle's surroundings accurately, automotive environment perception systems are often equipped with multiple radars whose environmental information is fused to obtain the situation around the vehicle. However, the accuracy of the fused radar information in the prior art still needs to be improved, so achieving highly accurate perception of the vehicle's surroundings has become an urgent problem to be solved.
Summary of the invention
In view of the above technical problems in the prior art, the present invention provides a vehicle-surroundings environment perception method.
A vehicle-surroundings environment perception method comprises the following steps:
S1, mounting several radars at positions around the vehicle, and establishing communication with a data processing device;
S2, the radars identifying targets from the environment around their mounting positions, generating corresponding target information, and sending the target information to the data processing device;
S3, the data processing device screening effective targets out of the target information;
S4, the data processing device fusing the target information of the effective targets to generate a vehicle-surroundings perception result;
wherein S3 comprises the following steps:
S31, filtering empty targets and jamming targets out of the target information to obtain primary-selection data;
S32, screening the primary-selection data to determine the effective targets;
S33, judging the life cycle of each effective target, dividing it into the four stages of formation, sustain, tracking and extinction, and processing its target information according to the stage the effective target is in.
Further, in step S2, the target information includes the relative distance, relative angle and relative velocity between the vehicle and the target, as well as the target's ID and status information.
Further, step S31 also includes: after the empty targets and jamming targets are filtered out of the target information, rearranging the target IDs to obtain the primary-selection data.
Further, step S32 specifically includes:
S321, predicting the target information of a target in the next detection cycle as a predicted value;
S322, obtaining the target information of the target detected by the radar in the next detection cycle as a detected value;
S323, comparing the predicted value and the detected value and judging whether their difference is within a set range; if so, executing step S324, and if not, executing step S325;
S324, applying consistency processing to the target;
S325, applying inconsistency processing to the target;
S326, determining the effective targets according to the target information resulting from step S324.
Further, step S32 specifically includes:
S321', determining an effective identification range for vehicle driving according to road information;
S322', establishing a vehicle driving coordinate system according to the road information and the effective identification range;
S323', calculating the lateral position of each target in the vehicle driving coordinate system;
S324', screening out of the primary-selection data the targets whose lateral position lies within the effective identification range, and taking them as the effective targets.
Further, step S33 specifically includes:
S331, setting the number of times a target is continuously selected as N_1, the number of times an effective target's detected value is inconsistent with its predicted value as N_2, and the number of times an effective target is continuously lost as N_3; and setting the threshold of N_2 as T_w and the threshold of N_3 as T_L;
S332, judging the life cycle of each effective target, wherein: N_1 > 5 corresponds to the formation stage; the sustain stage holds when N_2 = 0, or when N_3 = 0 and N_2 < T_w; the tracking stage holds when N_2 < T_w and N_3 < T_L; and the extinction stage holds when N_2 > T_w or N_3 > T_L;
S333, handling each effective target according to the four stages of formation, sustain, tracking and extinction, wherein: when an effective target is in the formation stage, its new target information is used; when it is in the sustain stage, the target information of the former effective target is updated; when it is in the tracking stage, it is maintained using the predicted value; and when it is in the extinction stage, use of the former effective target's information is terminated.
Further, step S4 specifically includes:
S41, spatially synchronizing the target information generated by the radars at different positions, establishing a vehicle-center coordinate system and converting the spatially synchronized data into it, to obtain a spatial-synchronization calibration result;
S42, dividing the surroundings into regions according to the radar mounting positions, and fusing the target information of the effective targets according to the division result, to generate the vehicle-surroundings perception result;
S43, displaying the occupancy of the effective targets according to the road information, and selecting a passable region according to the driving requirements of the vehicle.
Further, step S41 specifically includes:
S411, establishing a radar coordinate system for each radar, and spatially synchronizing the target information of the effective targets by detection cycle;
S412, establishing the vehicle-center coordinate system according to the mounting positions of the radars on the vehicle, and converting the coordinates of the spatially synchronized data into it to obtain the spatial-synchronization calibration result.
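The coordinate conversion of step S412 can be sketched as a planar rigid transform from a radar's local frame into the vehicle-center frame. The function name and the mounting parameters below are illustrative assumptions, not taken from the patent:

```python
import math

def radar_to_vehicle(x_r, y_r, mount_x, mount_y, mount_yaw_deg):
    """Transform a point from a radar's local frame into the vehicle-centre
    frame: rotate by the radar's mounting yaw, then translate by its mount
    position. Mounting parameters are illustrative, not from the patent."""
    yaw = math.radians(mount_yaw_deg)
    x_v = mount_x + x_r * math.cos(yaw) - y_r * math.sin(yaw)
    y_v = mount_y + x_r * math.sin(yaw) + y_r * math.cos(yaw)
    return x_v, y_v
```

For example, a front-center radar mounted 2 m ahead of the vehicle center with zero yaw maps a target at (10, 1) in its own frame to (12, 1) in the vehicle-center frame.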
Further, step S42 comprises the steps of:
S421, obtaining the regions in which the radar coordinate systems overlap within the vehicle-center coordinate system;
S422, unifying the effective targets in each overlapping region, and calculating the target information of the effective targets in that region;
S423, fusing the target information of the effective targets in the vehicle-center coordinate system to generate the vehicle-surroundings perception result.
Further, step S43 specifically includes:
S431, obtaining the distribution of the effective targets from the vehicle-surroundings perception result, and displaying their occupancy according to the road information;
S432, selecting a passable region according to the driving requirements of the vehicle, and sending the passable-region selection result to a vehicle behavior decision device;
S433, the vehicle behavior decision device controlling the driving of the vehicle.
By screening and fusing the target information generated by the radars, the vehicle-surroundings environment perception method of this embodiment obtains a highly accurate perception result of the vehicle's surroundings, which serves as the primary decision basis on which the vehicle behavior decision device controls the driving of the vehicle, thereby ensuring driving safety.
Detailed description of the invention
To explain the embodiments of the present invention or the technical solutions of the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is the step flow chart of the vehicle-surroundings environment perception method of the embodiment of the present invention;
Fig. 2 is radar schematic view of the mounting position in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Fig. 3 is the step flow chart that step S3 is realized in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Fig. 4 is the arrangement schematic diagram of Target id in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Fig. 5 is the step flow chart that step S32 is realized in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Fig. 6 is that schematic diagram is established in coordinate system direction in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Fig. 7 is the step flow chart that realizes step S32 in the vehicle-surroundings environment perception method of another embodiment of the present invention;
Fig. 8 is the step flow chart that step S4 is realized in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Fig. 9 is the step flow chart that realizes step S41 in the vehicle-surroundings environment perception method of an embodiment of the present invention;
Figure 10 is establishment of coordinate system schematic diagram in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Figure 11 is the step flow chart that step S42 is realized in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Figure 12 is the identification range schematic diagram of radar in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Figure 13 is the step flow chart that step S43 is realized in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Figure 14 is detections of radar range schematic diagram in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Figure 15 is Lattice encoding schematic diagram in the vehicle-surroundings environment perception method of the embodiment of the present invention;
Figure 16 is grid division schematic diagram in the vehicle-surroundings environment perception method of the embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art from the embodiments of the present invention without creative work fall within the protection scope of the present invention.
As shown in Fig. 1, a vehicle-surroundings environment perception method of an embodiment of the present invention comprises the following steps:
Step S1: mounting several radars at positions around the vehicle, and establishing communication with a data processing device.
In the vehicle-surroundings environment perception method of this embodiment, the radars used for measurement are first mounted on the vehicle before the actual measurement. This embodiment does not limit the number or mounting positions of the radars. Optionally, as shown in Fig. 2, six radars, numbered 1 to 6, are installed on the vehicle at the front center, rear center, front left side, rear left side, front right side and rear right side respectively. After the radars are installed as described, communication with the data processing device is established; either wireless or wired communication may be used, which is not specifically limited here. Selecting six radars for detection is likewise only one embodiment and must not be interpreted as limiting the scope of the present invention. The data processing device in this embodiment processes the detection information generated by the radars; the specific hardware product that realizes this function is not limited by this embodiment.
As shown in Fig. 1, the vehicle-surroundings environment perception method of the embodiment of the present invention further includes:
Step S2: the radars identify targets from the environment around their mounting positions, generate corresponding target information, and send the target information to the data processing device.
Optionally, the radars of this embodiment are millimetre-wave radars, each able to recognize 64 targets; that is, the target information generated by each radar contains the information of 64 targets. This information may include the relative distance, relative angle and relative velocity between the vehicle and each target, as well as the target's ID and status information. The target ID distinguishes the targets from one another: the ID may be set as i with i ∈ {0, 1, 2, ..., 63}, so that the 64 numbers 0 to 63 represent the IDs of the 64 targets. Each target has a unique ID, and the target information is stored according to the parsing protocol.
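The per-target record described above can be sketched as a small data structure. The class and field names are illustrative assumptions, not from the patent's protocol:

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:
    """One of the up-to-64 targets reported by a millimetre-wave radar."""
    target_id: int    # unique ID i in 0..63
    distance: float   # relative distance to the own vehicle (m)
    angle: float      # relative angle (degrees)
    velocity: float   # relative velocity (m/s)
    status: int = 0   # status word from the radar protocol (assumed field)

def parse_frame(raw):
    """Wrap raw (id, distance, angle, velocity) tuples from one detection cycle."""
    return [RadarTarget(i, d, a, v) for (i, d, a, v) in raw]
```

A frame with fewer than 64 real detections would simply contain the protocol's default values for the unused slots, which the screening of step S3 later removes.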
As shown in Figure 1, the vehicle-surroundings environment perception method of the embodiment of the present invention, further includes:
Step S3: data processing equipment filters out effective target from target information.
Step S4: data processing equipment merges the target information of effective target, generates vehicle-surroundings environment sensing result.
Specifically, as shown in Fig. 3, the screening of effective targets in step S3 is as follows:
Step S31: filtering empty targets and jamming targets out of the target information to obtain the primary-selection data.
Some of the target information returned by a radar belongs to empty targets: when no target is detected, the radar still returns the system default values. The returned target information also contains invalid targets: when the vehicle jolts, the speed changes abruptly, or other interference occurs, invalid jamming targets that correspond to no physical object and have no practical meaning are produced. Therefore, before the target information is processed further, the empty targets and invalid targets generated by the radars must be rejected, to guarantee the accuracy of the perception of the vehicle's surroundings and to avoid misjudgments caused by the appearance of empty or jamming targets.
According to the general parsing protocol, the target information of an empty target is characterized by return values equal to the initial values: the relative distance and relative angle of the target with respect to the vehicle are zero, and the relative velocity is also at its initial value. A jamming target is characterized by a brief appearance and large jumps: its relative distance, relative angle and relative velocity jump, and it exists only for a short time. Based on these characteristics, the target information returned by the radar can be used to filter the targets and reject empty and jamming targets. Specifically, empty and jamming targets can be judged as follows:
Let the relative distance between the vehicle and target i be d_i, the relative angle α_i and the relative velocity v_i, where i is the target ID. Then:
an empty target satisfies d_i = 0 and α_i = 0, i.e. the measured relative distance and relative angle are zero;
a jamming target satisfies |d_i(n+1) − d_i(n)| ≥ A metres, or |α_i(n+1) − α_i(n)| ≥ B degrees, or |v_i(n+1) − v_i(n)| ≥ C metres per second, where n is the detection number and A, B, C are preset values. In this embodiment the values of A, B and C are used to judge whether a target jumps and exists only briefly; in general they depend on the sampling frequency of the radar, the driving speed of the vehicle, and so on. As described above, six radars are installed on the vehicle in this embodiment; since each radar's mounting direction differs, the directions of the obtained relative distances, relative angles and relative velocities also differ, so the absolute values of the target data are used to simplify the computation. For the radar at the front-center position, the acquisition frequency is 20 Hz, so the sampling interval is 0.05 s. Setting A = 5, B = 3, C = 5 means that, for the same target, if between two consecutive acquisitions the relative distance changes by more than 5 metres, the relative angle by more than 3 degrees, or the relative velocity by more than 5 metres per second, the target is judged to be a jamming target, because the target information of a jamming target jumps and follows no track. Of course, those skilled in the art can set the values of A, B and C themselves in a specific implementation; this embodiment is only illustrative. Furthermore, since jamming targets also appear only briefly, a target that is continuously detected fewer than 5 times, i.e. appears continuously for less than 0.25 s, is likewise regarded as a jamming target. After the target information corresponding to empty and jamming targets is filtered out in this way, the primary-selection data of step S31 are obtained.
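The filtering rules of step S31 can be sketched as follows, using the thresholds stated in the text (A = 5 m, B = 3°, C = 5 m/s at a 0.05 s detection period). The function names and the id → (d, angle, v) frame layout are illustrative assumptions:

```python
def is_empty(d, angle):
    # Empty target: distance and angle both at the protocol's zero/initial value.
    return d == 0 and angle == 0

def is_jump(prev, cur, A=5.0, B=3.0, C=5.0):
    # Jamming target: any field jumps beyond the thresholds between two
    # consecutive detection cycles (A metres, B degrees, C m/s).
    d0, a0, v0 = prev
    d1, a1, v1 = cur
    return abs(d1 - d0) >= A or abs(a1 - a0) >= B or abs(v1 - v0) >= C

def filter_targets(prev_frame, cur_frame):
    """Keep targets that are neither empty nor jumping.
    Frames map target id -> (distance, angle, velocity)."""
    keep = {}
    for i, cur in cur_frame.items():
        if is_empty(cur[0], cur[1]):
            continue
        if i in prev_frame and is_jump(prev_frame[i], cur):
            continue
        keep[i] = cur
    return keep
```

The short-appearance rule (fewer than 5 consecutive detections) would be applied on top of this, by counting how many cycles each surviving ID persists.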
Step S31 of the embodiment of the present invention further includes rearranging the target IDs after the empty and jamming targets are filtered out of the target information, to obtain the primary-selection data. As shown in Fig. 4, a is the ID ordering of the targets in the data generated by the radar, b is the ordering after the empty targets are filtered out, c is the ordering after the jamming targets are filtered out, and d is the rearranged target IDs.
When the vehicle is driving at high speed, targets are most easily lost or misjudged, so the primary-selection data obtained in step S31 after filtering out the empty and jamming targets need to be processed further, to achieve correct identification and stable tracking of the targets and to guarantee the correctness of the decisions of the autonomous vehicle.
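The ID rearrangement of Fig. 4 amounts to compacting the surviving, gappy radar IDs onto consecutive new IDs. A minimal sketch (the mapping convention is an assumption; the patent only shows the before/after orderings):

```python
def rearrange_ids(surviving_ids):
    """Map surviving (gappy) radar IDs onto consecutive IDs 0, 1, 2, ...,
    preserving their original order. Returns an old-id -> new-id mapping."""
    return {old: new for new, old in enumerate(sorted(surviving_ids))}
```

For example, if only targets 0, 3 and 7 survive the filtering, they are renumbered 0, 1 and 2 in the primary-selection data.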
Specifically, this embodiment further includes step S32: screening the primary-selection data to determine the effective targets.
Vehicle-surroundings environment perception is mostly applied in the field of autonomous vehicles, whose perception of the surrounding environment is the basis of automatic driving: an autonomous vehicle needs to receive all kinds of information about its surroundings, such as traffic signs, vehicles, pedestrians, road conditions and vehicle status, and to output corresponding decision and control information, such as steering, acceleration, braking and gear shifting. Radar can detect the environment around the vehicle, including distant obstacles, and provides decision references for the autonomous vehicle.
Specifically, as shown in Fig. 5, screening the primary-selection data to determine the effective targets is realized in this embodiment by the following steps:
S321', determining an effective identification range for vehicle driving according to road information.
In actual driving, the autonomous vehicle needs to identify the targets in its surroundings in order to generate the corresponding decisions. According to China's road engineering specifications, the width of a lane is generally 3.75 m; calculated for 8 lanes, the total width is 30 m, and taking the emergency lane and the central divider into account, the total road width can be taken as 40 m. The identification range on each side of the vehicle is therefore set to 20 m.
S322', establishing a vehicle driving coordinate system according to the road information and the effective identification range.
In practice, effective targets are normally within a limited range on either side of the vehicle. A coordinate system is established as shown in Fig. 6, where the dashed box represents the vehicle, the solid boxes represent the targets in the vehicle's surroundings, and the horizontal lines indicate the lane divisions.
S323', calculating the lateral position of each target in the vehicle driving coordinate system.
The y-direction distance in the vehicle driving coordinate system is used to judge whether a target lies in the effective identification range: y_i = d_i · sin(α_i), where d_i is the relative distance of target i from the vehicle, α_i its relative angle, y_i its lateral position coordinate, and |y_i| its lateral distance in the vehicle driving coordinate system.
S324', screening out of the primary-selection data the targets whose lateral position lies within the effective identification range, and taking them as the effective targets.
It is judged whether |y_i| ≤ y_0 holds, where y_0 is the threshold distance in the y direction that decides whether a target is within the effective identification range: a target whose y-direction distance is less than y_0 is considered an effective target. The value of y_0 should take the road conditions, the number of lanes and other factors into account: if y_0 is too small, effective targets tend to be missed, and if it is too large, unnecessary jamming targets are introduced. Therefore, according to the effective identification range of the lanes described above, this embodiment takes y_0 = 20 m.
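Steps S323' and S324' can be sketched directly from the formula y_i = d_i · sin(α_i) and the 20 m threshold; angles are assumed to be reported in degrees, and the candidate layout is illustrative:

```python
import math

Y0 = 20.0  # effective lateral identification range (m), from the 40 m road width

def lateral_offset(d, angle_deg):
    """y_i = d_i * sin(alpha_i): lateral position in the driving coordinate system."""
    return d * math.sin(math.radians(angle_deg))

def screen_effective(candidates, y0=Y0):
    """Keep primary-selection targets with |y_i| <= y0.
    candidates maps target id -> (distance, angle_deg)."""
    return [i for i, (d, a) in candidates.items()
            if abs(lateral_offset(d, a)) <= y0]
```

A target 10 m away at 30° sits 5 m to the side and is kept; one 100 m away at the same angle sits 50 m to the side and is rejected.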
The present invention also provides another embodiment. As shown in Fig. 7, step S32 includes:
S321, predicting the target information of a target in the next detection cycle as a predicted value.
In this embodiment, a third-order Kalman filter is used to predict the target information of the target's next detection cycle. The chosen state is the target information (d_n, v_n, a_n), where d_n, v_n and a_n are respectively the relative distance, relative velocity and relative acceleration of the target in the n-th detection cycle. The predicted value of the target in the next, i.e. (n+1)-th, detection cycle is then:
d_(n+1|n) = d_n + v_n·t + (1/2)·a_n·t², v_(n+1|n) = v_n + a_n·t, a_(n+1|n) = a_n.
In the above formulas, t is the detection period of the radar; the radar detection frequency of this embodiment is 20 Hz, so t = 0.05 s. d_(n+1|n) is the relative distance of the target predicted from the target information of the previous detection cycle; v_(n+1|n) is the predicted relative velocity; and a_(n+1|n) is the predicted relative acceleration.
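The one-step prediction can be sketched as the constant-acceleration time update of the third-order state (distance, velocity, acceleration). This is only the prediction step of the filter, under the assumption that the patent's (unshown) equations follow the standard constant-acceleration model:

```python
T = 0.05  # radar detection period (s) at 20 Hz

def predict_next(d, v, a, t=T):
    """One-step constant-acceleration prediction of (d, v, a) for cycle n+1:
    the time update of the third-order state [distance, velocity, acceleration]."""
    d_pred = d + v * t + 0.5 * a * t * t
    v_pred = v + a * t
    a_pred = a  # acceleration assumed constant over one cycle
    return d_pred, v_pred, a_pred
```

A full Kalman filter would also propagate a covariance and blend the prediction with the next measurement; the patent's gating in step S323 only compares this predicted value against the detected value.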
S322, obtaining the target information of the target detected by the radar in the next detection cycle as the detected value.
The target information detected by the radar in its (n+1)-th detection cycle is (d_(n+1), v_(n+1), a_(n+1)), where d_(n+1) is the relative distance (m) in the target information obtained in the radar's (n+1)-th detection cycle, v_(n+1) the relative velocity (m/s), and a_(n+1) the relative acceleration (m/s²).
S323, comparing the predicted value and the detected value and judging whether their difference is within a set range; if so, executing step S324, and if not, executing step S325.
The predicted value obtained by the third-order Kalman prediction and the detected value returned by the radar are judged according to the following criterion:
|d_(n+1|n) − d_(n+1)| ≤ d_0, |v_(n+1|n) − v_(n+1)| ≤ v_0, |a_(n+1|n) − a_(n+1)| ≤ a_0,
where d_0, v_0 and a_0 are the maximum errors allowed between the predicted and detected values. Both the measurement error of the radar itself and the error introduced by the third-order Kalman filtering contribute, and the values are obtained from experiment and experience.
By comparing the predicted and detected values of the (n+1)-th detection cycle: if every difference is less than or equal to the corresponding maximum error value, step S324 is executed; if any difference exceeds it, step S325 is executed.
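The consistency check of step S323 is a per-field gate on the prediction error. The patent gives d_0, v_0 and a_0 only by reference to experiment, so the numeric defaults below are purely illustrative:

```python
# Maximum allowed prediction-vs-detection errors. The patent derives these from
# radar accuracy and experiment; the numbers here are illustrative only.
D0, V0, A0 = 2.0, 1.0, 0.5

def is_consistent(pred, det, d0=D0, v0=V0, a0=A0):
    """True if predicted and detected (d, v, a) agree within the error bounds."""
    return all(abs(p - m) <= tol
               for p, m, tol in zip(pred, det, (d0, v0, a0)))
```

A consistent target goes to step S324 (consistency processing); an inconsistent one to step S325.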
S324, applying consistency processing to the target.
If the difference between the predicted and detected values of a target in the (n+1)-th detection cycle is less than or equal to the maximum error value, the target in the primary-selection data is consistent with the target obtained in the n-th detection cycle, and consistency processing is applied to it.
S325, applying inconsistency processing to the target.
If the difference between the predicted and detected values of a target in the (n+1)-th detection cycle is greater than the maximum error value, the target in the primary-selection data is inconsistent with the target obtained in the n-th detection cycle, and inconsistency processing is applied to it.
S326, determining the effective targets according to the target information resulting from step S324.
This embodiment screens the effective targets in the primary-selection data, realizing correct identification and stable tracking of the targets, so as to guarantee the correctness of the decisions of the autonomous vehicle. Step S32 of this embodiment may screen the effective targets either by steps S321' to S324' or by steps S321 to S326; more preferably, this embodiment may first determine preliminary effective targets by steps S321' to S324' and then further screen them by steps S321 to S326, obtaining a more accurate effective-target selection result.
This embodiment further includes step S33: judging the life cycle of each effective target, dividing it into the four stages of formation, sustain, tracking and extinction, and processing its target information according to the stage it is in.
The embodiment of the present invention introduces a life-cycle method for effective targets, to describe an effective target from formation to extinction. The life cycle of an effective target is the entire process, described by the relevant parameters, from its appearance to its final disappearance. FindTimes denotes the number of times a target is continuously selected in the primary-selection data; WrongTimes denotes the number of times an effective target's predicted value is inconsistent with the corresponding detected value; LostTimes denotes the number of times a tracked effective target is continuously lost. FindTimes, WrongTimes and LostTimes are denoted N_1, N_2 and N_3. WrongTimes is introduced because the following conditions can affect the tracking of an effective target: (1) when the vehicle speed is high, a target appears in the radar's detection range only for a short time; (2) the radar recognizes a false target; (3) because the radar has identification errors, a target outside the selected range may be misidentified as an effective target. LostTimes is introduced because the following conditions can affect the tracking of an effective target: (1) jolting or swaying of the own vehicle during driving interferes with the radar's identification of effective targets, causing a former effective target to be lost; (2) unstable operation of the radar itself causes an effective target to be lost momentarily; (3) another vehicle driving into the radar's identification range occludes a former effective target, causing it to be lost momentarily.
Specifically, step S33 includes:
S331, let N1 be the number of consecutive cycles in which a target is selected, N2 the number of times the detected value of the effective target is inconsistent with the predicted value, and N3 the number of consecutive cycles in which the effective target is lost; and set the threshold of N2 to Tw and the threshold of N3 to TL.
S332, judge the life cycle of the effective target, where: N1 > 5 is the formation stage; N2 = 0 or N3 = 0, with N2 < Tw, is the persistence stage; N2 < Tw and N3 < TL is the tracking stage; N2 > Tw or N3 > TL is the extinction stage.
The stages are summarized in Table 1 (Table 1: description of each stage of the effective-target life cycle).
For the values of Tw and TL, two situations must be considered. In the first, occasional interference or a brief loss occurs, but the original effective target is still a real effective target; if Tw and TL are too small, interference cannot be reliably excluded and the target is wrongly dropped. In the second, the original effective target no longer exists or is no longer an effective target; if Tw and TL are too large, the replacement of the effective target is delayed. Tw and TL therefore need to be set according to the length of the effective-target life cycle:
In the formula, with the maximum value of Tw and TL set to 20, replacing a target is delayed by at least 0.5 s, and the delay grows linearly with the time the original effective target has been tracked, up to 1 s.
S333, divide the effective targets into the four stages of formation, persistence, tracking and extinction, where: when an effective target is in the formation stage, the new target information is used; when it is in the persistence stage, the target information of the original effective target is updated; when it is in the tracking stage, the predicted value is held; and when it is in the extinction stage, the target information of the original effective target is no longer used.
Specifically, after the effective targets have been divided into the four stages of formation, persistence, tracking and extinction, a corresponding decision is made for the effective targets of each stage, as in Table 2:
Table 2: decision corresponding to each stage of the effective-target life cycle
Stage        Objective decision
Formation    Use the target information of the new effective target
Persistence  Update the target information of the original effective target
Tracking     Hold using the predicted value
Extinction   Stop using the target information of the original effective target
The life cycle of an effective target is described by the parameters above, its current stage is judged from how those parameters change, and a different decision is applied in each stage, realizing identification and tracking of targets under different traffic conditions.
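The stage decision of Table 2 can be sketched in code. The ordering of the overlapping conditions and the default thresholds Tw = TL = 20 are assumptions based on the surrounding text, not part of the patent itself:

```python
def life_cycle_stage(n1, n2, n3, tw=20, tl=20):
    """Classify an effective target into a life-cycle stage.

    n1 = FindTimes, n2 = WrongTimes, n3 = LostTimes; tw, tl are the
    thresholds of n2 and n3. The check order is an assumption, since the
    four conditions given in the text overlap.
    """
    if n2 > tw or n3 > tl:                      # N2 > Tw or N3 > TL
        return "extinction"
    if n2 == 0 and n3 == 0 and n1 == 6:
        return "formation"                      # just crossed N1 > 5
    if (n2 == 0 or n3 == 0) and n2 < tw:
        return "persistence"                    # N2 = 0 or N3 = 0, N2 < Tw
    if n2 < tw and n3 < tl:
        return "tracking"                       # N2 < Tw and N3 < TL
    return "extinction"

# Decision applied in each stage (Table 2).
DECISION = {
    "formation": "use the target information of the new effective target",
    "persistence": "update the target information of the original target",
    "tracking": "hold using the predicted value",
    "extinction": "stop using the original target information",
}
```

A tracker would evaluate this once per 50 ms detection cycle (20 Hz) after updating the three counters for each candidate target.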
As shown in Figure 1, a kind of vehicle-surroundings environment perception method of the embodiment of the present invention, further includes:
Step S4: the data processing device fuses the target information of the effective targets and generates the vehicle-surroundings environment perception result.
Specifically, as shown in figure 8, step S4 the following steps are included:
S41, spatially synchronize the target information generated by the radars at different positions, establish the vehicle-center coordinate system and apply a coordinate conversion to the spatially synchronized data, obtaining the spatial-synchronization calibration result.
S42, divide the detection area into regions according to the radar positions, fuse the target information of the effective targets according to the division result, and generate the vehicle-surroundings environment perception result.
S43, display the occupancy of the effective targets according to the road-condition information, and select the passable area according to the vehicle driving requirements.
Wherein, as shown in figure 9, step S41 is specifically realized by following steps:
S411, establish a radar coordinate system for each radar, and spatially synchronize the target information of the effective targets by detection cycle.
The environmental information a single radar can provide is limited in scope; fusing the information of several millimeter-wave radars greatly extends the range of environment perception and thereby enables obstacles to be avoided effectively. The data obtained by the radars therefore need to be spatially synchronized so that the information from the different radars matches; otherwise the accuracy of the fusion result suffers. The radars selected in the present embodiment share the same detection frequency of 20 Hz, so the target information can be received synchronously. Preferably, to guarantee the authenticity of the data, the target-information packets sent by the radars are received by multiple threads, parsed and then forwarded, and the detection ranges are redistributed during data fusion.
S412, establish the vehicle-center coordinate system according to the installation positions of the radars on the vehicle, and apply the corresponding coordinate conversion to the spatially synchronized data to obtain the spatial-synchronization calibration result.
The installation positions of the six radars are defined as shown in Figure 2. Define the radar coordinate systems oi-xi-yi, where i is the radar number (i = 1, 2, 3, 4, 5, 6), oi is the radar center, the xi axis is normal to the radar face and positive in the vehicle's driving direction, and the yi axis is parallel to the radar face, positive to the right and negative to the left. Define the vehicle-center coordinate system Oo-Xo-Yo, where Oo is the vehicle center and Xo is the vehicle's central axis, positive in the driving direction. As shown in Figure 10, take the radars at the front-center and front-left positions as an example, i.e. i = 1 and i = 3. From the radar identification ranges, the detection ranges of radar 1 and radar 3 share a common region. Taking radar 3 as an example, a3 is the lateral offset of the radar-3 coordinate system relative to the vehicle-center coordinate system, b3 is the longitudinal offset, and α3 is the installation angle of radar 3; according to the general installation requirements, α3 = 60°. The conversion between the radar-1 coordinate system and the vehicle-center coordinate system is:
X0 = x1 + 2
Y0 = y1
where 2 is the distance (m) from the vehicle center to the detection face of radar 1, obtainable by measurement.
As shown in Figure 10, choose a corresponding target point in the common detection region of radar 1 and radar 3; the position of this point is then available in both coordinate systems, and from the geometric relation between the two radar coordinate systems and the vehicle-center coordinate system:
XP0 = b3 + yP0 sin α3 + xP0 cos α3
YP0 = a3 - (yP0 cos α3 - xP0 sin α3)
where XP0 and YP0 are the position of the target point P0 in the vehicle-center coordinate system, and xP0 and yP0 are its position in the radar-3 coordinate system. The relations between the other radar coordinate systems and the vehicle-center coordinate system are obtained in the same way. Because the radars may be installed with errors, multiple calibrations are needed to guarantee the spatial synchronization of the six radars and the accuracy of the target information; for radar 3, for example, the calibrated variables include the lateral offset a3 and the longitudinal offset b3 of its coordinate system relative to the vehicle-center coordinate system, and the offset angle α3 between the radar coordinate system and the x axis of the vehicle-center coordinate system.
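The two conversions above can be written out directly. The sketch below is illustrative only; a3, b3 and α3 are taken to be the calibrated offsets described in the text:

```python
import math

def radar1_to_vehicle(x1, y1, d1=2.0):
    # Radar 1 (front centre): X0 = x1 + d1, Y0 = y1, where d1 is the
    # measured distance (m) from the vehicle centre to radar 1's face.
    return x1 + d1, y1

def radar3_to_vehicle(x3, y3, a3, b3, alpha3_deg=60.0):
    # Radar 3 (front left): XP0 = b3 + y*sin(a) + x*cos(a),
    #                       YP0 = a3 - (y*cos(a) - x*sin(a)),
    # with a3 the lateral offset, b3 the longitudinal offset and
    # alpha3 the installation angle of radar 3.
    a = math.radians(alpha3_deg)
    x_v = b3 + y3 * math.sin(a) + x3 * math.cos(a)
    y_v = a3 - (y3 * math.cos(a) - x3 * math.sin(a))
    return x_v, y_v
```

With a3 = b3 = 0 and α3 = 90°, the radar's x axis maps onto the vehicle's y axis, which is a quick sanity check on the sign conventions.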
Specifically, as shown in figure 11, step S42 is specifically included:
S421, obtain the regions in which the corresponding radar coordinate systems coincide in the vehicle-center coordinate system;
The radar identification ranges are shown in Figure 12: the detection regions of the left and right side radars overlap considerably with those of the front- and rear-center radars, and the main overlapping regions lie between the front/rear-center radars and the corresponding side radars. Because different radars differ in detection accuracy, and installation and calibration errors exist, target identification in the overlapping regions easily produces discrepancies; the same target may even return two or more pieces of target information, so the overlapping regions must be analyzed and processed.
Tests show that the front- and rear-center radars are better than the side radars in the accuracy and stability of target identification, and their installation accuracy is much higher than that of the side radars, so the allocation of detection ranges follows the principle of giving priority to the front- and rear-center radars. Taking radar 1 and radar 3 of Figure 2 as an example of detection-range allocation, the overlapping region of radar 1 and radar 3 lies mainly within radar 1's mid-range detection scope.
S422, unify the effective targets in the overlapping region, and compute the target information of the effective targets in the overlapping region;
With the positional relation between radar 1 and radar 3 and the division of their overlapping region as above, analyze the target information returned by the two radars, (X1i, Y1i) and (X3i, Y3i), and the velocity information V1i and V3i, where i ∈ (0, 1, ..., 63). If they satisfy:
|X1i - X3i| ≤ 0.2 m
|Y1i - Y3i| ≤ 0.2 m
|V1i - V3i| ≤ 2 m/s
then the two radars have identified the same target, and the position of the target can be taken as the average of the two returned values. The 0.2 m and 2 m/s above are settings based on the shape of the vehicle and the positional relation between the radars, and serve only as an example.
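A minimal sketch of this unification check, using the example thresholds above; the tuple layout (X, Y, V) is an assumption for illustration:

```python
def same_target(t1, t3, dx=0.2, dy=0.2, dv=2.0):
    """Return True when the radar-1 and radar-3 returns describe one target.

    t1, t3 are (X, Y, V) tuples in the vehicle-centre frame; the
    thresholds 0.2 m / 0.2 m / 2 m/s follow the example in the text.
    """
    return (abs(t1[0] - t3[0]) <= dx
            and abs(t1[1] - t3[1]) <= dy
            and abs(t1[2] - t3[2]) <= dv)

def unify(t1, t3):
    # State of the unified target: the mean of the two radar returns.
    return tuple((a + b) / 2.0 for a, b in zip(t1, t3))
```

For the 64 target slots (i ∈ 0..63), pairs that pass same_target would be replaced by their unify result; all other returns are kept as distinct targets.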
S423, fuse the target information of the effective targets in the vehicle-center coordinate system, and generate the vehicle-surroundings environment perception result.
The vehicle-surroundings environment perception result is generated from the target information of the effective targets in the overlapping regions computed in the previous step.
Specifically, as shown in figure 13, step S43 is specifically included:
S431, obtain the distribution of the effective targets from the vehicle-surroundings environment perception result, and display the occupancy of the effective targets according to the road-condition information;
S432, select the passable area according to the vehicle driving requirements, and send the passable-area selection result to the vehicle behavior decision device;
The distribution of targets is available from the radar environment perception; the targets are partitioned according to their positions so as to realize the passable-area selection.
S433, vehicle behavior decision making device control vehicle driving.
In the vehicle-center coordinate system, with the vehicle center as the coordinate origin, the real-time driving map of the vehicle is realized as a grid map. The real-time driving map has two main parts. On the left is the display of the radar-scanned targets, comprising an enlarged view and a global thumbnail; the enlarged view focuses on the region ahead of and to both sides of the vehicle, while the thumbnail mainly covers the region behind the vehicle. On the right is the vehicle coordinate view, which mainly displays the vehicle's GPS information. When the radar-scanned targets are extracted, they are displayed at the corresponding positions on the grid, and at the same time the target distribution is sent to the vehicle behavior decision unit, which controls the driving of the vehicle.
Optionally, the real-time driving map in the present embodiment is designed as follows:
The real-time driving map of the vehicle is a grid map of length L (cm) and width W (cm); each small grid is a square of side D (cm), and for convenience of calculation L and W are integral multiples of D. The map is encoded: each small grid is represented as (x, y, value), where x is the column of the grid, y is its row, and value is the data code filled into the grid, namely the data code output by each radar. x and y are of type unsigned short, and value is of type unsigned char. The system is provisionally set to L = 20000 cm, W = 8000 cm and D = 20 cm, giving Xmax = 12000, Xmin = -8000, Ymax = 2000 and Ymin = -2000: a rectangular coordinate system with the vehicle center as origin, covering 120 m ahead, 20 m on each side and 80 m behind the vehicle. The lined region in Figure 14 is the radar detection range, and targets are numbered according to the radar they come from. The detection region is divided into grids: taking point A in Figure 14 as the starting point, it is divided into squares of side 20 cm; the grid coding is shown in Figure 15, and the grid containing the vehicle center is (99, 399).
The lattice encoding of a target is specifically as follows:
Using the grid defined above, the effective targets detected by the radars are filled in according to their coordinate positions, and grids holding a target are shaded. If the position of a target in the vehicle-center coordinate system is (Xp, Yp), the position coordinates are converted to the lattice encoding (Px, Py) by the following rules:
(1) if Xp < Xmin, let Xp = Xmin;
(2) if Xp > Xmax, let Xp = Xmax - 0.2;
(3) if Yp < Ymin, let Yp = Ymin;
(4) if Yp > Ymax, let Yp = Ymax - 0.2;
The lattice encoding of the target point is then computed from the clamped coordinates, where D is the set grid width, which may be set to 0.2 m. If Px < 0, let Px = 0; if Px > (Xmax - Xmin)/D - 1, let Px = (Xmax - Xmin)/D - 1; if Py < 0, let Py = 0; if Py > (Ymax - Ymin)/D - 1, let Py = (Ymax - Ymin)/D - 1. The grid number of the target point is then computed, where Pindex is the grid number of the target in the vehicle driving map; if Pindex ≤ 0, let Pindex = 0, and if Pindex exceeds the maximum grid number, it is set to the maximum grid number.
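The clamping and encoding just described can be sketched as follows, working in metres. The floor-based index formula and the row-major grid number are assumptions, since the original formulas are reproduced only as figures:

```python
X_MIN, X_MAX = -80.0, 120.0   # m: 80 m behind to 120 m ahead of the vehicle
Y_MIN, Y_MAX = -20.0, 20.0    # m: 20 m to each side
D = 0.2                       # grid width, m

NX = round((X_MAX - X_MIN) / D)   # longitudinal cell count (1000)
NY = round((Y_MAX - Y_MIN) / D)   # lateral cell count (200)

def lattice_encode(xp, yp):
    """Clamp a vehicle-frame position and convert it to (Px, Py, Pindex)."""
    # Clamping rules (1)-(4) of the text.
    xp = min(max(xp, X_MIN), X_MAX - D)
    yp = min(max(yp, Y_MIN), Y_MAX - D)
    # Index formula (assumed floor), clamped to [0, N-1].
    px = min(max(int((xp - X_MIN) / D), 0), NX - 1)
    py = min(max(int((yp - Y_MIN) / D), 0), NY - 1)
    # Grid number (assumed row-major over the NX x NY grid).
    p_index = px * NY + py
    return px, py, p_index
```

Out-of-range positions are pinned to the map boundary rather than discarded, matching rules (1) to (4) above.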
If the lattice encoding of a target satisfies any of formulas (3) to (6) below, the grid represented by the corresponding grid number Pindex is marked as a passable point; otherwise, the grid represented by Pindex is marked as an impassable point.
In the present embodiment, the passable area is selected specifically as follows:
After the targets in the vehicle driving map are lattice-encoded, the grid map clearly shows the distribution of the targets, and the passable area is selected from that distribution by dividing the region around the vehicle, as shown in Figure 16. Set the passable width of the vehicle to Wp = 2.8 m and the path length to Ls = 5 m; let the left boundary of a passable area be Yleft, its right boundary Yright, its upper boundary Xup and its lower boundary Xdown. When Px > (-Xmin)/D, the grids in regions A and B ahead of the vehicle center are searched, where regions A and B satisfy:
The left and right boundaries of the passable area in regions A and B are then:
Yleft = Pymax-L (4)
Yright = Pymin-R (5)
In the above formulas, Pymax-L is the y code of the grid with the smallest lateral distance on the left of the vehicle within the search range, and Pymin-R is the y code of the grid with the smallest lateral distance on the right of the vehicle within the search range.
The upper and lower boundaries of the passable area in region A are:
Similarly, the other regions are searched to obtain the left and right boundary values of all regions. The section length is set to 5 m, so the boundary of each section is compared with the boundary of the first section:
ΔYleft-j0 = Yleft-j - Yleft-0
ΔYright-j0 = Yright-j - Yright-0
where Yleft-0 is the left boundary of the 1st passable area in front of the vehicle; Yleft-j is the left boundary of the j-th passable area in front of the vehicle; Yright-0 is the right boundary of the 1st passable area in front of the vehicle; and Yright-j is the right boundary of the j-th passable area in front of the vehicle.
where βleft-j0 is the angle between the left boundary of the j-th passable area and the left boundary of the first passable area, and βright-j0 is the angle between the right boundary of the j-th passable area and the right boundary of the first passable area, j = (1, 2, ...). If j > 1, the parameter values of the passable area are:
where Xmax is the longitudinal lattice code of the nearest target directly ahead of the vehicle, βleft-j0min is the smallest deflection angle of the passable-area left boundary, and βright-j0max is the largest deflection angle of the passable-area right boundary; the passable area of the vehicle is formed from these parameters.
If j = 1, the left and right boundaries of the region are each compared with the vehicle's minimum passable boundary and the larger of the two is selected; if j < 1, the region is impassable.
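The boundary formulas above are reproduced only as figures. As a simplified sketch of the underlying idea, one can scan a lateral row of the occupancy grid for a free gap at least Wp wide (Wp = 2.8 m, D = 0.2 m here); this row-scan formulation is an assumption, the text itself working section by section with boundary angles:

```python
def widest_passable_gap(row_occupied, d=0.2, wp=2.8):
    """Return (left, right) cell indices of the widest free lateral gap
    that is at least wp wide, or None if no gap qualifies.

    row_occupied: one lateral row of the grid map, True = cell holds a target.
    """
    need = round(wp / d)                 # required gap width in cells (14)
    best = None
    start, length = None, 0
    for i, occ in enumerate(list(row_occupied) + [True]):  # sentinel closes the last run
        if not occ:
            if start is None:
                start = i
            length += 1
        else:
            if start is not None and length >= need:
                if best is None or length > best[1] - best[0] + 1:
                    best = (start, start + length - 1)
            start, length = None, 0
    return best
```

Running this on each 5 m section and comparing the resulting boundaries between sections would approximate the section-by-section selection the text describes.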
The vehicle behavior decision device controls the driving of the vehicle according to the above passable-area selection result.
In the vehicle-surroundings environment perception method of the embodiment of the invention, the target information generated by the radars is screened and fused to obtain a highly accurate vehicle-surroundings environment perception result, which serves as the primary decision basis on which the vehicle behavior decision device controls the driving of the vehicle, thereby ensuring driving safety. The invention has been further described above through specific embodiments, but it should be understood that the specific description here is not to be construed as limiting the essence and scope of the invention; various modifications to the above embodiments made by one of ordinary skill in the art after reading this specification all fall within the scope protected by the invention.

Claims (10)

1. A vehicle-surroundings environment perception method, characterized by comprising the following steps:
S1, mounting several radars at positions around the vehicle, and establishing communication with a data processing device;
S2, the several radars identifying targets according to the surrounding environment at their positions and correspondingly generating target information, and sending the target information to the data processing device;
S3, the data processing device screening effective targets out of the target information;
S4, the data processing device fusing the target information of the effective targets, and generating a vehicle-surroundings environment perception result;
wherein S3 comprises the following steps:
S31, filtering null targets and interference targets out of the target information to obtain preliminary data;
S32, screening the preliminary data to determine the effective targets;
S33, judging the life cycle of the effective targets, dividing it into the four stages of formation, persistence, tracking and extinction, and processing the target information of each effective target according to the stage it is in.
2. The vehicle-surroundings environment perception method as claimed in claim 1, characterized in that the target information in step S2 includes the relative distance, relative angle and relative velocity between the vehicle itself and the target, the ID of the target, and the status information of the target.
3. The vehicle-surroundings environment perception method as claimed in claim 2, characterized in that step S31 further includes: after filtering the null targets and interference targets out of the target information, rearranging the IDs of the targets to obtain the preliminary data.
4. The vehicle-surroundings environment perception method as claimed in claim 3, characterized in that step S32 specifically includes:
S321, predicting the target information of the target in the next detection cycle as a predicted value;
S322, acquiring the target information of the target detected by the radar in the next detection cycle as a detected value;
S323, comparing the predicted value with the detected value and judging whether their difference is within a set range; if so, executing step S324, and if not, executing step S325;
S324, applying consistency processing to the target;
S325, applying inconsistency processing to the target;
S326, determining the effective targets according to the target information of step S324.
5. The vehicle-surroundings environment perception method as claimed in claim 3, characterized in that step S32 specifically includes:
S321', determining the effective identification range of vehicle driving according to road-condition information;
S322', establishing a vehicle driving coordinate system according to the road-condition information and the effective identification range;
S323', computing the lateral position of the target in the vehicle driving coordinate system;
S324', screening out of the preliminary data the targets whose lateral position lies within the effective identification range, as the effective targets.
6. The vehicle-surroundings environment perception method as claimed in claim 4, characterized in that step S33 specifically includes:
S331, letting N1 be the number of consecutive cycles in which a target is selected, N2 the number of times the detected value of the effective target is inconsistent with the predicted value, and N3 the number of consecutive cycles in which the effective target is lost; and setting the threshold of N2 to Tw and the threshold of N3 to TL;
S332, judging the life cycle of the effective target, where: N1 > 5 is the formation stage; N2 = 0 or N3 = 0, with N2 < Tw, is the persistence stage; N2 < Tw and N3 < TL is the tracking stage; N2 > Tw or N3 > TL is the extinction stage;
S333, dividing the effective targets into the four stages of formation, persistence, tracking and extinction, where: when the effective target is in the formation stage, the new target information is used; when the effective target is in the persistence stage, the target information of the original effective target is updated; when the effective target is in the tracking stage, the predicted value is held; and when the effective target is in the extinction stage, the target information of the original effective target is no longer used.
7. The vehicle-surroundings environment perception method as claimed in claim 6, characterized in that step S4 specifically includes:
S41, spatially synchronizing the target information generated by the radars at different positions, establishing a vehicle-center coordinate system and applying a coordinate conversion to the spatially synchronized data to obtain a spatial-synchronization calibration result;
S42, dividing the detection area into regions according to the radar positions, fusing the target information of the effective targets according to the division result, and generating the vehicle-surroundings environment perception result;
S43, displaying the occupancy of the effective targets according to road-condition information, and selecting the passable area according to the vehicle driving requirements.
8. The vehicle-surroundings environment perception method as claimed in claim 7, characterized in that step S41 specifically includes:
S411, establishing a radar coordinate system for each radar, and spatially synchronizing the target information of the effective targets by detection cycle;
S412, establishing the vehicle-center coordinate system according to the installation positions of the radars on the vehicle, and applying the corresponding coordinate conversion to the spatially synchronized data to obtain the spatial-synchronization calibration result.
9. The vehicle-surroundings environment perception method as claimed in claim 8, characterized in that step S42 includes the steps of:
S421, obtaining the regions in which the corresponding radar coordinate systems coincide in the vehicle-center coordinate system;
S422, unifying the effective targets in the overlapping regions, and computing the target information of the effective targets in the overlapping regions;
S423, fusing the target information of the effective targets in the vehicle-center coordinate system, and generating the vehicle-surroundings environment perception result.
10. The vehicle-surroundings environment perception method as claimed in claim 9, characterized in that step S43 specifically includes:
S431, obtaining the distribution of the effective targets from the vehicle-surroundings environment perception result, and displaying the occupancy of the effective targets according to road-condition information;
S432, selecting the passable area according to the vehicle driving requirements, and sending the passable-area selection result to a vehicle behavior decision device;
S433, the vehicle behavior decision device controlling the driving of the vehicle.
CN201910411835.9A 2019-05-17 2019-05-17 A kind of vehicle-surroundings environment perception method Pending CN110203204A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910411835.9A CN110203204A (en) 2019-05-17 2019-05-17 A kind of vehicle-surroundings environment perception method


Publications (1)

Publication Number Publication Date
CN110203204A true CN110203204A (en) 2019-09-06

Family

ID=67787523

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910411835.9A Pending CN110203204A (en) 2019-05-17 2019-05-17 A kind of vehicle-surroundings environment perception method

Country Status (1)

Country Link
CN (1) CN110203204A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110596694A (en) * 2019-09-20 2019-12-20 吉林大学 Complex environment radar multi-target tracking and road running environment prediction method
CN111854771A (en) * 2020-06-09 2020-10-30 北京百度网讯科技有限公司 Map quality detection processing method and device, electronic equipment and storage medium
CN112654879A (en) * 2020-12-11 2021-04-13 华为技术有限公司 Anti-interference method, device and system based on vehicle-mounted millimeter wave radar and vehicle
CN112712717A (en) * 2019-10-26 2021-04-27 华为技术有限公司 Information fusion method and system
WO2021097880A1 (en) * 2019-11-22 2021-05-27 惠州市德赛西威汽车电子股份有限公司 Mirror target removal method employing vehicle-mounted corner radar
CN113942511A (en) * 2021-10-19 2022-01-18 东风柳州汽车有限公司 Method, device and equipment for controlling passing of driverless vehicle and storage medium
CN115980676A (en) * 2023-01-10 2023-04-18 扬州宇安电子科技有限公司 Radar signal data analysis system and method based on big data

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080080313A1 (en) * 2006-09-28 2008-04-03 Brumley Blair H System and method for accoustic doppler velocity processing with a phased array transducer including using differently coded transmit pulses in each beam so that the cross-coupled side lobe error is removed
CN105182311A (en) * 2015-09-02 2015-12-23 四川九洲电器集团有限责任公司 Omnidirectional radar data processing method and system
CN108845509A (en) * 2018-06-27 2018-11-20 中汽研(天津)汽车工程研究院有限公司 A kind of adaptive learning algorithms algorithm development system and method
CN109085829A (en) * 2018-08-09 2018-12-25 北京智行者科技有限公司 A kind of sound state target identification method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘曰: "基于组合毫米波雷达的智能车环境感知方法", 《中国优秀硕士学位论文全文数据库 工程科技II辑》 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190906