TWI646306B - Method for analyzing error and existence probability of multi-sensor fusion of obstacle detection - Google Patents
- Publication number
- TWI646306B
- Authority
- TW
- Taiwan
- Prior art keywords
- obstacle
- fusion
- error
- information
- sensor
- Prior art date
- 2017-12-15
Landscapes
- Traffic Control Systems (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention provides an error and detection probability analysis method for multi-sensor fusion. An obstacle sensing step generates obstacle observation information, and an obstacle prediction step generates obstacle prediction information. An error-model offline establishment step builds a prior average error distribution function for each sensor, and a detection confidence establishment step builds prior detection probability information for each sensor. A tracking fusion step fuses this information with a tracking fusion method to produce fusion error variations, and a fusion-error-variation cumulative correction step corrects the cumulative value of the fusion error variations according to the prior detection probability information. By combining pre-processed sensor error analysis with a detection confidence model to fuse the detection information of multiple sensors, the method yields a highly reliable judgment of whether an obstacle exists.
Description
The invention relates to an error and detection probability analysis method, and in particular to an error and detection probability analysis method applied to multi-sensor fusion.
Vehicle computers are becoming increasingly capable. To improve driving safety and move toward autonomous driving, reliable detection and classification of obstacles ahead of the vehicle is especially important: detected obstacles are classified as cars, pedestrians, bicycles, utility poles, and other objects, with the classification categories determined by the system settings. The system can then decide, based on the obstacle class, whether to issue a brake warning, perform automatic emergency braking, or take some other action.
Many kinds of sensors are mounted on vehicles to detect obstacles; the most common are visual imaging systems and radar systems. A visual imaging system is installed on a vehicle to enhance object detection and other vision or positioning applications: a camera captures images from which objects (i.e., obstacles) are recognized, and these obstacles may be other vehicles, pedestrians, or even other objects on the road. A radar system, in turn, detects objects on the road by using radio waves to determine their distance, direction, or speed: the radar transmitter emits radio-wave pulses, any object in their path is struck by the pulses and reflects them, and a small fraction of the reflected radio-wave energy reaches the receiver, which is usually co-located with the transmitter.
Although these sensors can all detect obstacles, their reliability is often insufficient, and detection errors are easily large enough that obstacle position-tracking failures occur frequently. The market therefore lacks an error and detection probability analysis method for multi-sensor fusion that is both accurate and trustworthy, and the industry has been seeking a solution.
Accordingly, an object of the invention is to provide an error and detection probability analysis method for multi-sensor fusion. By combining pre-processed sensor error analysis with a detection confidence model, the method fuses the prior detection probability information of multiple sensors and corrects the cumulative value of the fusion error variations, yielding a highly credible judgment of whether an obstacle exists. A real-time kinematic GPS (RTK-GPS) measurement device is used beforehand to establish prior average error distribution functions for various environments, obstacles, and vehicle conditions, and the tracking result is dynamically corrected with these functions, producing fused obstacle information with smaller error and higher credibility. Furthermore, correcting the cumulative value of the fusion error variations with the prior detection probability information from the detection confidence establishment step, and using it as the basis for judging whether an obstacle exists, greatly increases the reliability of the judgment, achieves real-time computation, and solves the prior-art problems of excessive sensing error and low reliability.
According to one embodiment of the invention, an error and detection probability analysis method for multi-sensor fusion is provided for judging an obstacle in the traveling direction of a vehicle. The method includes an obstacle sensing step, an obstacle prediction step, an error-model offline establishment step, a detection confidence establishment step, a tracking fusion step, and a fusion-error-variation cumulative correction step. The obstacle sensing step provides a plurality of sensors that sense the obstacle and respectively produce a plurality of obstacle observation information; the obstacle prediction step provides a processor that produces a plurality of obstacle prediction information from the obstacle observation information. The error-model offline establishment step uses the processor to establish a prior average error distribution function for each sensor, and the detection confidence establishment step uses the processor to establish prior detection probability information for each sensor. The tracking fusion step uses the processor to fuse the obstacle observation information, the obstacle prediction information, and the prior average error distribution functions with a tracking fusion method, producing a plurality of fusion error variations and a plurality of fused obstacle information. Finally, the fusion-error-variation cumulative correction step uses the processor to correct the cumulative value of the fusion error variations according to the prior detection probability information, so as to judge whether the obstacle exists.
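For illustration only, the following Python sketch (not part of the patent disclosure) shows how the six claimed steps could chain together in one processing cycle. Every identifier is a hypothetical placeholder, and discounting each sensor's error variation by its prior detection probability is merely one plausible reading of the cumulative correction, not the patented formula.

```python
# Illustrative sketch of one processing cycle of the claimed method.
# All names (sense, predict, fuse, accum, ...) are hypothetical
# placeholders; the patent defines the steps but not an API.

def analyze_cycle(sensors, tracker, error_models, prior_detect_probs, threshold):
    observations = [s.sense() for s in sensors]                    # obstacle sensing step
    predictions = [tracker.predict(obs) for obs in observations]   # obstacle prediction step
    # error_models and prior_detect_probs were established offline, one per sensor
    fused, error_variations = tracker.fuse(observations, predictions, error_models)
    # cumulative correction: one plausible reading is to discount each
    # sensor's error variation by its prior detection probability
    for delta, p in zip(error_variations, prior_detect_probs):
        tracker.accum += delta * (1.0 - p)
    exists = tracker.accum <= threshold   # obstacle deemed present at or below threshold
    return fused, exists
```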
The method thus combines pre-processed sensor error analysis with a detection confidence model, fusing the prior detection probability information of multiple sensors and correcting the cumulative value of the fusion error variations to obtain a credible judgment of whether an obstacle exists. Using the prior detection probability information from the detection confidence establishment step to correct the cumulative value, as the basis for the existence judgment, greatly increases the reliability of the result, achieves real-time computation, and solves the prior-art problems of excessive sensing error and low reliability.
Further examples of this embodiment follow. In the tracking fusion step, the tracking fusion method may be a Kalman filter, and each fused obstacle information includes a fused obstacle position, a fused obstacle speed, and a fused obstacle class. One of the sensors may be a radar sensor (RADAR) and another a camera. Each obstacle observation information of the obstacle sensing step may include an observation position, and the error-model offline establishment step may place a real-time kinematic positioning module on the obstacle beforehand and drive it to produce a plurality of real-time kinematic positions; the processor then receives and computes the relative error between these positions and the observation positions to produce the prior average error distribution function. Each sensor has a field of view (FOV), and both the real-time kinematic positions and the observation positions lie within the FOV and correspond to the observed speed of the obstacle. In the fusion-error-variation cumulative correction step, the processor may store a preset cumulative threshold and compare it with the cumulative value to judge whether the obstacle exists: when the cumulative value is less than or equal to the preset cumulative threshold, the obstacle is deemed present; when the cumulative value exceeds the preset cumulative threshold, the obstacle is deemed absent.
According to another embodiment of the invention, an error and detection probability analysis method for multi-sensor fusion is provided for judging an obstacle in the traveling direction of a vehicle. The method includes an obstacle sensing step, an obstacle prediction step, an error-model offline establishment step, and a tracking fusion step. The obstacle sensing step provides a plurality of sensors that sense the obstacle and respectively produce a plurality of obstacle observation information, each including an observation position and an observation speed. The obstacle prediction step provides a processor that produces a plurality of obstacle prediction information from the obstacle observation information. The error-model offline establishment step uses the processor to establish a prior average error distribution function for each sensor: a real-time kinematic positioning module is placed on the obstacle beforehand and driven to produce a plurality of real-time kinematic positions (RTK-GPS), and the processor then receives and computes the relative error between these positions and the observation positions to produce the prior average error distribution function. The tracking fusion step uses the processor to fuse the obstacle observation information, the obstacle prediction information, and the prior average error distribution functions with a tracking fusion method, producing a plurality of fused obstacle information. Each sensor has a field of view (FOV), and both the real-time kinematic positions and the observation positions lie within the FOV and correspond to the observed speed of the obstacle.
In this way, the method pre-establishes prior average error distribution functions for various environments, obstacles, and vehicle conditions with an RTK-GPS measurement device and dynamically corrects the tracking result against this function database, producing fused obstacle information with smaller error and higher credibility.
Further examples of this embodiment follow. In the tracking fusion step, the tracking fusion method may be a Kalman filter, and each fused obstacle information includes a fused obstacle position, a fused obstacle speed, and a fused obstacle class. One of the sensors may be a radar sensor (RADAR) and another a camera. The method may further include a detection confidence establishment step, in which the processor establishes prior detection probability information for each sensor. In the tracking fusion step, the processor fuses the obstacle observation information, the obstacle prediction information, and the prior average error distribution functions with a tracking fusion method to produce a plurality of fusion error variations. The method may also include a fusion-error-variation cumulative correction step, in which the processor corrects the cumulative value of the fusion error variations according to the prior detection probability information, so as to judge whether the obstacle exists. In this step the processor stores a preset cumulative threshold and compares it with the cumulative value: when the cumulative value is less than or equal to the preset cumulative threshold, the obstacle is deemed present; when the cumulative value exceeds the preset cumulative threshold, the obstacle is deemed absent.
100, 100a‧‧‧error and detection probability analysis system for multi-sensor fusion
110‧‧‧vehicle
120‧‧‧obstacle
130‧‧‧grid
200‧‧‧sensor
300, 300a‧‧‧processor
S12, S21‧‧‧obstacle sensing step
S14, S22‧‧‧obstacle prediction step
S16, S23‧‧‧error-model offline establishment step
S18, S24‧‧‧tracking fusion step
310‧‧‧obstacle sensing module
320‧‧‧obstacle prediction module
330‧‧‧error-model offline establishment module
332‧‧‧real-time kinematic positioning module
340, 340a‧‧‧tracking fusion module
342‧‧‧fusion error variation
350‧‧‧detection confidence establishment module
352‧‧‧prior detection probability information
360‧‧‧fusion-error-variation cumulative correction module
362‧‧‧existence status information
370‧‧‧collision time calculation module
400, 400a‧‧‧error and detection probability analysis method for multi-sensor fusion
S25‧‧‧detection confidence establishment step
S26‧‧‧fusion-error-variation cumulative correction step
S27‧‧‧collision time calculation step
(x,y,v)‧‧‧obstacle observation information
(x,y)‧‧‧observation position
v‧‧‧observation speed
(x',y',v')‧‧‧obstacle prediction information
(x',y')‧‧‧predicted position
v'‧‧‧predicted speed
f(x,y,v)‧‧‧prior average error distribution function
(x",y",v")‧‧‧fused obstacle information
(x",y")‧‧‧fused obstacle position
v"‧‧‧fused obstacle speed
T1~T30‧‧‧time
FOV‧‧‧field of view
Fig. 1 is a block diagram of an error and detection probability analysis system for multi-sensor fusion according to an embodiment of the invention.
Fig. 2 is a flow chart of an error and detection probability analysis method for multi-sensor fusion according to an embodiment of the invention.
Fig. 3 is a schematic view of the measurement environment in which the obstacle carries the real-time kinematic positioning module according to an embodiment of the invention.
Fig. 4A shows the dynamic tracking result of the error-model offline establishment step of Fig. 2 when the sensor is a radar sensor.
Fig. 4B shows the error distribution of the error-model offline establishment step of Fig. 2 at an obstacle observation speed of 20 kph.
Fig. 4C shows the error distribution of the error-model offline establishment step of Fig. 2 at an obstacle observation speed of 60 kph.
Figs. 5A-5C show the error-model offline establishment step of Fig. 2 dynamically correcting the tracking result with the detection errors between multiple sensors at different times.
Fig. 6 is a block diagram of an error and detection probability analysis system for multi-sensor fusion according to another embodiment of the invention.
Fig. 7 is a flow chart of an error and detection probability analysis method for multi-sensor fusion according to another embodiment of the invention.
Fig. 8A shows the fusion error variations of the fusion-error-variation cumulative correction step of Fig. 7.
Fig. 8B shows the cumulative value of the fusion error variations of Fig. 8A.
Several embodiments of the invention are described below with reference to the drawings. For clarity, many practical details are explained together in the following description. It should be understood, however, that these practical details are not intended to limit the invention; in some embodiments they are unnecessary. In addition, to simplify the drawings, some conventional structures and elements are shown schematically, and repeated elements may be denoted by the same reference numerals.
Referring to Figs. 1 and 3, Fig. 1 is a block diagram of an error and detection probability analysis system 100 for multi-sensor fusion according to an embodiment of the invention; the system 100 includes a plurality of sensors 200 and a processor 300. Fig. 3 is a schematic view of the measurement environment of this embodiment, in which the obstacle 120 carries a real-time kinematic positioning module 332 and the vehicle 110 carries the sensors 200 and the processor 300. The system 100 is used to judge the obstacle 120 in the traveling direction of the vehicle 110.
The sensors 200 are mounted on the vehicle 110 and may be of different types. In this embodiment there are two sensors 200: one may be a radar sensor (RADAR) and the other a camera. The radar sensor senses the position and speed of the obstacle 120, while the camera senses the position of the obstacle 120 and identifies its class; the invention is not limited to this number of sensors or these sensor types.
The processor 300 is disposed on the vehicle 110 and is signal-connected to the sensors 200. The processor 300 may be an automotive electronic control unit (ECU), a microprocessor, or another electronic computing processor, and includes an obstacle sensing module 310, an obstacle prediction module 320, an error-model offline establishment module 330, and a tracking fusion module 340. The obstacle sensing module 310 uses the obstacle 120 signals sensed by the sensors 200 to produce a plurality of obstacle observation information (x,y,v); each includes an observation position (x,y), the sensed position of the obstacle 120, and an observation speed v, its sensed moving speed. The obstacle prediction module 320 produces, for each sensor 200, obstacle prediction information (x',y',v'), where the predicted position (x',y') is the predicted position of the obstacle 120 and the predicted speed v' its predicted moving speed. The error-model offline establishment module 330 establishes, for each sensor 200 and for different test scenarios, prior average error distribution functions f(x,y,v). A prior average error distribution function f(x,y,v) is the average of the error between the obstacle observation information (x,y,v) and the true information of the obstacle 120, and is used to dynamically correct the error between the obstacle observation information (x,y,v) and the obstacle prediction information (x',y',v'). For example, Figs. 4B and 4C show the error distributions of two different test scenarios, with obstacle observation speeds of 20 kph and 60 kph respectively; these two scenarios yield two different prior average error distribution functions f(x,y,v). The tracking fusion module 340 fuses the obstacle observation information (x,y,v), the obstacle prediction information (x',y',v'), and the prior average error distribution functions f(x,y,v) with a tracking fusion method to produce a plurality of fused obstacle information (x",y",v"); the tracking fusion method is a Kalman filter, the fused obstacle position (x",y") is the fused position of the obstacle 120, and the fused obstacle speed v" is its fused moving speed. In other words, the tracking fusion module 340 is signal-connected to the obstacle sensing module 310, the obstacle prediction module 320, and the error-model offline establishment module 330. Since this embodiment uses two sensors 200, a radar sensor and a camera, each sensor 200 has its own obstacle observation information (x,y,v), obstacle prediction information (x',y',v'), and prior average error distribution function f(x,y,v), from which fused obstacle information (x",y",v") with smaller error and higher credibility is produced.
Referring to Figs. 1-4C, Fig. 2 is a flow chart of an error and detection probability analysis method 400 for multi-sensor fusion according to an embodiment of the invention. Fig. 4A shows the dynamic tracking result of the error-model offline establishment step S16 of Fig. 2 when the sensor 200 is a radar sensor. Fig. 4B shows the error distribution of step S16 at an obstacle 120 observation speed v of 20 kph, and Fig. 4C shows the distribution at 60 kph. As shown, the method 400 is used to judge the obstacle 120 in the traveling direction of the vehicle 110 and includes an obstacle sensing step S12, an obstacle prediction step S14, an error-model offline establishment step S16, and a tracking fusion step S18.
The obstacle sensing step S12 provides a plurality of sensors 200 that sense the obstacle 120 and respectively produce a plurality of obstacle observation information (x,y,v); that is, after the sensors 200 sense the obstacle 120, the system produces the obstacle observation information (x,y,v) through the obstacle sensing module 310.
The obstacle prediction step S14 provides the processor 300, which produces a plurality of obstacle prediction information (x',y',v') for the respective sensors 200. In detail, after the sensors 200 sense the obstacle 120, the system produces the obstacle prediction information (x',y',v') for each sensor 200 through the obstacle prediction module 320.
The error-model offline establishment step S16 uses the processor 300 to establish a prior average error distribution function f(x,y,v) for each sensor 200. In detail, a real-time kinematic positioning module 332 is placed on the obstacle 120 beforehand and driven to produce a plurality of real-time kinematic positions (the symbol "○" in Fig. 4A and the horizontal axes of Figs. 4B and 4C). The error-model offline establishment module 330 of the processor 300 then receives and computes the relative error between the real-time kinematic positions and the observation positions (x,y) of the obstacle observation information (x,y,v) to produce the prior average error distribution function f(x,y,v); the observation positions (x,y) are marked "×" in Fig. 4A. The black dots in Figs. 4B and 4C are the differences, under different test scenarios, between the radar sensor's observation positions (x,y) of the preceding-vehicle obstacle 120 and the RTK-GPS real-time kinematic positions; these differences are used to build the prior average error distribution function f(x,y,v), which varies with the condition of the preceding-vehicle obstacle 120 and is shown as the trend lines in Figs. 4B and 4C. In addition, each sensor 200 has a field of view (FOV); the real-time kinematic positions and the observation positions (x,y) both lie within the FOV and correspond to the observation speed v of the obstacle 120, and within the FOV the processor 300 can form a plurality of grids 130 to locate the obstacle 120. The real-time kinematic positioning module 332 of this embodiment is an RTK-GPS device, so it is signal-connected to the error-model offline establishment module 330 through the global positioning system (GPS), and its measured positions are taken as the ground-truth information of the obstacle 120. The observation positions (x,y) produced by the radar sensor 200 sensing the obstacle 120 are compared against the real-time kinematic positions, and the differences build a database of prior average error distribution functions f(x,y,v) that serves as a correction reference. In other words, when the radar sensor 200 is later used, the observation position (x,y) can be corrected with the pre-established f(x,y,v) to obtain a more accurate fused obstacle position (x",y"), marked "△" in Fig. 4A. The details of this correction, the tracking fusion step S18, are described below.
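For illustration only, the following sketch shows one way the offline error model could be built from time-aligned logs of sensor observations and RTK-GPS ground truth, binning errors by observed speed in the spirit of the 20 kph and 60 kph scenarios of Figs. 4B and 4C. The bin width and the per-bin mean are assumptions; the patent does not publish the fitting procedure.

```python
import numpy as np

# Hypothetical offline construction of the prior average error
# distribution f(x, y, v): pair each radar observation with the
# RTK-GPS ground-truth position logged at the same timestamp,
# bin the errors by observed speed, and average within each bin.

def build_prior_error_model(obs_xy, rtk_xy, obs_speed_kph, bin_kph=20):
    errors = np.asarray(obs_xy, float) - np.asarray(rtk_xy, float)  # per-sample (dx, dy)
    bands = (np.asarray(obs_speed_kph) // bin_kph).astype(int)
    model = {}
    for band in np.unique(bands):
        mask = bands == band
        model[int(band) * bin_kph] = errors[mask].mean(axis=0)  # mean (dx, dy) per speed band
    return model  # lookup: speed band (kph) -> average observation error
```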
The tracking fusion step S18 uses the processor 300 to fuse the obstacle observation information (x,y,v), the obstacle prediction information (x',y',v'), and the prior average error distribution functions f(x,y,v) with a tracking fusion method, producing a plurality of fused obstacle information (x",y",v"). In detail, the tracking fusion method is a Kalman filter executed by the tracking fusion module 340, and each fused obstacle information (x",y",v") includes a fused obstacle position (x",y") and a fused obstacle speed v". In other embodiments the fused obstacle information (x",y",v") may also include a fused obstacle class, which may be a pedestrian, a car, or another kind of obstacle 120. The method 400 thus exploits pre-processed error analysis of the sensors 200: prior average error distribution functions f(x,y,v) for various environments, obstacles 120, and vehicle 110 conditions are pre-established with an RTK-GPS measurement device, and the tracking result is dynamically corrected with f(x,y,v), producing fused obstacle information (x",y",v") with smaller error and higher credibility.
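For illustration only, the following sketch applies a textbook Kalman measurement update after de-biasing the observation with the prior mean error from f(x, y, v). The state layout [x, y, v], the identity observation matrix, and the covariance shapes are assumptions; the patent names the Kalman filter but discloses no matrices.

```python
import numpy as np

# Textbook Kalman measurement update for one tracked obstacle, with the
# raw observation first de-biased by the prior mean error from f(x, y, v).

def kalman_fuse(x_pred, P_pred, z_obs, mean_err, R):
    z = np.asarray(z_obs, float) - np.asarray(mean_err, float)  # error-model correction
    H = np.eye(3)                              # assume we observe the full [x, y, v] state
    S = H @ P_pred @ H.T + R                   # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
    innovation = z - H @ x_pred                # mismatch between observation and prediction
    x_fused = x_pred + K @ innovation          # fused obstacle information (x", y", v")
    P_fused = (np.eye(3) - K @ H) @ P_pred
    return x_fused, P_fused, innovation        # innovation magnitude ~ fusion error variation
```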
Referring to Figs. 1, 2, 3, and 5A-5C, Figs. 5A-5C show the error-model offline establishment step S16 of Fig. 2 dynamically correcting the tracking result with the detection errors between the sensors 200 at different times. As shown, the symbol "×" is the observation position (x,y) sensed by the camera; "△" is the tracking position of the camera's observation (x,y) (after observation, prediction, and error-model correction); "*" is the observation position (x,y) sensed by the radar sensor; "○" is the tracking position of the radar sensor's observation (x,y) (after observation, prediction, and error-model correction); and "□" is the dynamically corrected tracking result after the camera and the radar sensor are fused. Figs. 5A-5C contain tracking results at 30 different times T1-T30, equally spaced and occurring in sequence. In general, a radar sensor senses the distance and position of the obstacle 120 more accurately than a camera; the invention uses the detection errors between the different sensors 200 to dynamically correct the result in real time and obtain a more credible tracking result.
Referring to Figs. 1 and 6, Fig. 6 is a block diagram of an error and detection probability analysis system 100a for multi-sensor fusion according to another embodiment of the invention. As shown, the system 100a judges the obstacle 120 in the traveling direction of the vehicle 110 and includes a plurality of sensors 200 and a processor 300a. The processor 300a includes an obstacle sensing module 310, an obstacle prediction module 320, an error-model offline establishment module 330, a tracking fusion module 340a, a detection confidence establishment module 350, a fusion-error-variation cumulative correction module 360, and a collision time calculation module 370. The sensors 200, the obstacle sensing module 310, the obstacle prediction module 320, and the error-model offline establishment module 330 are the same as the corresponding blocks of Fig. 1 and are not described again. In particular, the processor 300a further includes the tracking fusion module 340a, the detection confidence establishment module 350, the fusion-error-variation cumulative correction module 360, and the collision time calculation module 370.
The tracking fusion module 340a fuses the obstacle observation information (x,y,v), the obstacle prediction information (x',y',v'), and the prior average error distribution functions f(x,y,v) with a tracking fusion method to produce a plurality of fusion error variations 342 and a plurality of fused obstacle information (x",y",v"). In this embodiment the tracking fusion method is a Kalman filter, and there are two sensors 200: a radar sensor and a camera.
The detection confidence establishment module 350 is signal-connected to the sensors 200 and the error-model offline establishment module 330, and establishes prior detection probability information 352 for each different sensor 200. The prior detection probability information 352 represents the probability that a signal detected by the sensor 200 is true or false, and can be regarded as the detection confidence.
The fusion-error-variation cumulative correction module 360 is signal-connected to the tracking fusion module 340a and the detection confidence establishment module 350, and corrects the cumulative value of the fusion error variations 342 of the tracking fusion module 340a according to the prior detection probability information 352 of the detection confidence establishment module 350, so as to judge whether the obstacle 120 exists. In detail, the module 360 stores a preset cumulative threshold and compares it with the cumulative value: when the cumulative value is less than or equal to the preset cumulative threshold, the obstacle 120 is deemed present; conversely, when the cumulative value exceeds the preset cumulative threshold, the obstacle 120 is deemed absent. The module 360 further corrects the cumulative value of the fusion error variations 342 according to the prior detection probability information 352 and outputs the existence status information 362 of the obstacle 120 corresponding to the fused obstacle information (x",y",v").
The collision time calculation module 370 receives the fused obstacle information (x",y",v") and the existence status information 362 to compute the collision time between the vehicle 110 and the obstacle 120; this collision time can serve as a decision parameter for autonomous driving. The system thus combines pre-processed error analysis of the sensors 200 with the detection confidence model, fusing the prior detection probability information 352 of the sensors 200 and correcting the cumulative value of the fusion error variations 342 to obtain a credible judgment of whether the obstacle 120 exists. In addition, prior average error distribution functions f(x,y,v) for various environments, obstacles 120, and vehicle 110 conditions are pre-established with an RTK-GPS measurement device and used to dynamically correct the tracking result, producing fused obstacle information (x",y",v") with smaller error and higher credibility. Furthermore, correcting the cumulative value of the fusion error variations 342 with the prior detection probability information 352, as the basis for judging whether the obstacle 120 exists, greatly increases the reliability of the judgment and solves the prior-art problems of excessive sensing error and low reliability.
Referring to Figs. 5A-5C, 6, 7, 8A, and 8B, Fig. 7 is a flow chart of an error and detection probability analysis method 400a for multi-sensor fusion according to another embodiment of the invention. Fig. 8A shows the fusion error variations 342 of the fusion-error-variation cumulative correction step S26 of Fig. 7, and Fig. 8B shows the cumulative value of the fusion error variations 342 of Fig. 8A. As shown in Fig. 7, the method 400a includes an obstacle sensing step S21, an obstacle prediction step S22, an error-model offline establishment step S23, a tracking fusion step S24, a detection confidence establishment step S25, a fusion-error-variation cumulative correction step S26, and a collision time calculation step S27. Steps S21, S22, and S23 are the same as steps S12, S14, and S16 of Fig. 2 and are not described again. In particular, the method 400a further includes the tracking fusion step S24, the detection confidence establishment step S25, the fusion-error-variation cumulative correction step S26, and the collision time calculation step S27.
The tracking fusion step S24 uses the tracking fusion module 340a of the processor 300a to fuse the obstacle observation information (x,y,v), the obstacle prediction information (x',y',v'), and the prior average error distribution functions f(x,y,v) with a tracking fusion method, producing a plurality of fusion error variations 342 and a plurality of fused obstacle information (x",y",v"). In this embodiment the tracking fusion method is a Kalman filter, and there are two sensors 200: a radar sensor and a camera.
The detection confidence establishment step S25 uses the detection confidence establishment module 350 of the processor 300a to establish prior detection probability information 352 for each different sensor 200. The error-model offline establishment step S23 and the detection confidence establishment step S25 are correlated: the larger the variance of the prior average error distribution function f(x,y,v) (i.e., the less accurately the sensor 200 senses), the lower the detection confidence represented by the prior detection probability information 352; conversely, the smaller the variance of f(x,y,v) (i.e., the more accurately the sensor 200 senses), the higher the detection confidence represented by the prior detection probability information 352, that is, the higher the reliability of the sensor 200.
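For illustration only, one hypothetical way to turn the spread of the offline error model into a prior detection probability is the exponential mapping below; the patent states only that confidence falls as the variance of f(x, y, v) grows, so the functional form and the scale parameter are assumptions.

```python
import numpy as np

# Hypothetical mapping from error-model spread to prior detection
# probability: low variance (accurate sensing) gives confidence near 1,
# high variance gives low confidence. The exponential form is assumed.

def prior_detection_probability(error_samples, scale=1.0):
    variance = float(np.var(np.asarray(error_samples, float)))
    return float(np.exp(-variance / scale))  # value in (0, 1]
```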
The fusion-error-variation cumulative correction step S26 uses the fusion-error-variation cumulative correction module 360 of the processor 300a to correct the cumulative value of the fusion error variations 342 of the tracking fusion step S24 according to the prior detection probability information 352 of the detection confidence establishment step S25, so as to judge whether the obstacle 120 exists. In detail, in step S26 the module 360 stores a preset cumulative threshold and compares it with the cumulative value: when the cumulative value is less than or equal to the preset cumulative threshold, the obstacle 120 is deemed present; conversely, when the cumulative value exceeds the preset cumulative threshold, the obstacle 120 is deemed absent. The module 360 further corrects the cumulative value of the fusion error variations 342 according to the prior detection probability information 352 and outputs the existence status information 362 of the obstacle 120 corresponding to the fused obstacle information (x",y",v"). For example, in Figs. 5A-5C, 8A, and 8B, neither of the two sensors 200 obtains an observation position (x,y) of the obstacle 120 from time T19 to T24. Two situations are possible. In the first, both sensors 200 fail simultaneously, i.e., the radar sensor and the camera are both out of order; in this case the prior detection probability information 352 of the sensors 200 is low (e.g., 30%), meaning the detection confidence of the sensors 200 is low. In the second, both sensors 200 are normal but noise has shifted the observation position (x,y); in this case the prior detection probability information 352 is high (e.g., 90%), meaning the detection confidence of the sensors 200 is high. Moreover, the fusion error variations 342 of Fig. 8A are positive from T19 to T24, indicating that the error persists, and the cumulative value of the fusion error variations 342 in Fig. 8B keeps accumulating over T19 to T24 and grows ever larger. If the system sets the preset cumulative threshold to 2, then Fig. 8B shows the cumulative value exceeding 2, which means either "the sensors 200 cannot sense the obstacle 120" (the first situation) or "the obstacle 120 does not exist" (the second situation); which situation actually holds must be judged together with the prior detection probability information 352 (i.e., the detection confidence) of the sensors 200. A cumulative value of 2 or less means the obstacle 120 exists.
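For illustration only, the following sketch reproduces the accumulation and threshold logic of this example, with the threshold of 2 taken from Fig. 8B; the 0.5 split used to interpret the over-threshold case from the prior detection probability is an assumption, since the patent leaves that combined judgment unspecified.

```python
# Sketch of the cumulative correction with the example threshold of 2:
# while no observation supports the track (e.g. T19-T24), positive
# fusion error variations keep accumulating; once over the threshold,
# the prior detection probability decides how the case is read.

def judge_existence(accum, error_variation, prior_detect_prob, threshold=2.0):
    accum += error_variation
    if accum <= threshold:
        return accum, "obstacle present"
    if prior_detect_prob >= 0.5:   # sensors trusted -> obstacle really gone
        return accum, "obstacle absent"
    return accum, "sensor failure suspected"
```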
The collision time calculation step S27 uses the collision time calculation module 370 of the processor 300a to receive the fused obstacle information (x",y",v") and the existence status information 362 and compute the collision time between the vehicle 110 and the obstacle 120, which can serve as a decision parameter for autonomous driving. By using the prior detection probability information 352 to correct the cumulative value of the fusion error variations 342 as the basis for judging whether the obstacle 120 exists, the method 400a not only greatly increases the reliability of the judgment but can also be applied effectively to an Autonomous Emergency Braking System (AEB) and an Autonomous Driving System (ADS).
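For illustration only, a conventional range-over-closing-speed estimate of the collision time is sketched below; the patent does not specify the formula, and treating v" as the closing speed in km/h along the line of sight is an assumption.

```python
import math

# Minimal time-to-collision sketch from the fused obstacle
# information (x", y", v"); assumed, not taken from the patent.

def time_to_collision(fused_xy, fused_speed_kph):
    rng = math.hypot(fused_xy[0], fused_xy[1])   # range to obstacle (m)
    closing = fused_speed_kph / 3.6              # km/h -> m/s
    return math.inf if closing <= 0 else rng / closing
```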
As the above embodiments show, the invention has the following advantages. First, combining pre-processed sensor error analysis with a detection confidence model to fuse the prior detection probability information of multiple sensors and correct the cumulative value of the fusion error variations yields a credible judgment of whether an obstacle exists, with real-time results. Second, pre-establishing prior average error distribution functions for various environments, obstacles, and vehicle conditions with an RTK-GPS measurement device and dynamically correcting the tracking result with these functions produces fused obstacle information with smaller error and higher credibility. Third, correcting the cumulative value of the fusion error variations with the prior detection probability information from the detection confidence establishment step, as the basis for the existence judgment, greatly increases the reliability of the judgment and solves the prior-art problems of excessive sensing error and low reliability.
Although the invention has been disclosed in the above embodiments, they are not intended to limit it. Anyone skilled in the art may make various changes and refinements without departing from the spirit and scope of the invention; the scope of protection is therefore defined by the appended claims.
Claims (10)
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| TW106144262A TWI646306B (en) | 2017-12-15 | 2017-12-15 | Method for analyzing error and existence probability of multi-sensor fusion of obstacle detection |
Publications (2)

| Publication Number | Publication Date |
| --- | --- |
| TWI646306B | 2019-01-01 |
| TW201928294A | 2019-07-16 |
Family ID: 65803980
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| TW106144262A TWI646306B (en) | Method for analyzing error and existence probability of multi-sensor fusion of obstacle detection | 2017-12-15 | 2017-12-15 |
Families Citing this family (1)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| TWI848620B * | 2023-03-17 | 2024-07-11 | 國立中山大學 | Processing method for sensing signals and detection system |
Citations (5)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| JPH1073655A * | 1996-06-17 | 1998-03-17 | Bayerische Motoren Werke Ag | Method for measuring distance between vehicle and object |
| TWI253998B * | 2004-11-19 | 2006-05-01 | Jiun-Yuan Tseng | Method and apparatus for obstacle avoidance with camera vision |
| EP2330439A1 * | 2009-12-04 | 2011-06-08 | Valeo Vision | Obstacle-detection system for a vehicle |
| KR20140077296A * | 2012-12-14 | 2014-06-24 | 현대자동차주식회사 | Multimodal feedback alarm system for approach obstacle recognition |
| TWI559267B * | 2015-12-04 | 2016-11-21 | | Method of quantifying the reliability of obstacle classification |
Also Published As

| Publication number | Publication Date |
| --- | --- |
| TW201928294A | 2019-07-16 |
Similar Documents

| Publication | Title |
| --- | --- |
| US9478139B2 | Driving safety system and barrier screening method thereof |
| EP3366539B1 | Information processing apparatus and information processing method |
| US11620837B2 | Systems and methods for augmenting upright object detection |
| KR100854766B1 | Method for detecting of parking area by using range sensor |
| JPWO2018212346A1 | Control device, scanning system, control method, and program |
| CN112154455A | Data processing method, equipment and movable platform |
| US11408989B2 | Apparatus and method for determining a speed of a vehicle |
| US20200255006A1 | Method and device of determining kinematics of a target |
| JP2014067169A | Collision prediction device |
| KR20140100787A | Apparatus, method and computer readable recording medium for detecting an error of a lane data for lane maintenance support |
| GB2576206A | Sensor degradation |
| US10866307B2 | Method for analyzing error and existence probability of multi-sensor fusion of obstacle detection |
| WO2021102471A1 | Systems and methods of geometric vehicle collision evaluation |
| CN113741388A | Safety diagnosis system and method based on automatic driving perception failure |
| CN109932721B | Error and detection probability analysis method applied to multi-sensor fusion |
| TWI646306B | Method for analyzing error and existence probability of multi-sensor fusion of obstacle detection |
| US20210366274A1 | Method and device for predicting the trajectory of a traffic participant, and sensor system |
| US6947841B2 | Method for identifying obstacles for a motor vehicle, using at least three distance sensors for identifying the lateral extension of an object |
| TWI541152B | Traffic safety system and its obstacle screening method |
| US20220398879A1 | Method for monitoring at least one sensor |
| US20230168352A1 | Method for assessing a measuring inaccuracy of an environment detection sensor |
| US20230221410A1 | Object sensing device and object sensing method |
| KR102660192B1 | Lane Departure Prevention Apparatus |
| CN110341716B | Vehicle speed calculation method and device, automatic driving system and storage medium |
| US20240010195A1 | Method for ascertaining an approximate object position of a dynamic object, computer program, device, and vehicle |