TWI689433B - Lane tracking method and system for autonomous driving vehicles - Google Patents

Lane tracking method and system for autonomous driving vehicles

Info

Publication number
TWI689433B
Authority
TW
Taiwan
Prior art keywords
lane
vehicle
predetermined
time point
lateral
Prior art date
Application number
TW107141410A
Other languages
Chinese (zh)
Other versions
TW202019744A (en)
Inventor
徐錦衍
余建宏
古昆隴
張統凱
林泓邦
Original Assignee
財團法人車輛研究測試中心
Priority date
Filing date
Publication date
Application filed by 財團法人車輛研究測試中心
Priority to TW107141410A
Application granted
Publication of TWI689433B
Publication of TW202019744A

Landscapes

  • Traffic Control Systems (AREA)

Abstract

In a lane tracking method and system for self-driving vehicles, a processing unit estimates and stores future position data of a vehicle travelling in a lane at a future unit time point, based on a yaw rate and a lateral acceleration that are estimated from the measurements of an inertial measurement unit and that meet a confidence level, and on pre-stored current position data estimated for the vehicle at the current time point. From pre-stored historical reference position data estimated for the vehicle at a historical time point and from the future position data, the processing unit then estimates the vehicle's total longitudinal displacement, total lateral displacement and total azimuth offset from the historical time point to the future unit time point, and, using pre-stored reference lane line data generated from an image of the lane sensed at the historical time point together with a coordinate transformation, estimates future lane line data corresponding to the future unit time point.

Description

Lane tracking method and system for autonomous driving vehicles

The present invention relates to the control of autonomous driving vehicles, and more particularly to a lane tracking method and system for autonomous driving vehicles.

According to the L1 to L5 driving automation levels defined by SAE International, the higher the automation level, the greater the challenges to the robustness and reliability of the active control system of an autonomous driving vehicle.

However, because of the processing time a camera currently needs to go from captured image to output, the update rate of lane detection information is low (for example 10 to 20 Hz), and the lane detection output is generally not synchronized in time with vehicle motion control; for example, the lane line information supplied to the ECU of the vehicle control system is usually information from 100 to 200 ms earlier. More specifically, the update rate of the lane sensing information should preferably be 5 to 10 times the control frequency of the vehicle control system, so that control errors caused by stale lane sensing information can be avoided. On the other hand, if the control frequency has to be kept low because of, for example, a low lane sensing update rate, the resolution and accuracy of the control commands are reduced, and vehicle trajectory tracking performs poorly, particularly on high-curvature road sections or when the lateral offset velocity is large.

In addition, existing roads often have lane lines that are stained or blurred, that show abnormal color contrast due to lighting changes, or that are simply absent (for example at intersections). This can cause the lane detection module to misrecognize the image, make the lane detection function behave abnormally, or make lane line detection impossible altogether.

Therefore, how to develop a lane tracking technique that solves the above problems has become an important issue.

Accordingly, one object of the present invention is to provide a lane tracking method for autonomous driving vehicles that can overcome at least one of the drawbacks of the prior art.

Thus, the present invention provides a lane tracking method for an autonomous driving vehicle, implemented by a processing unit and comprising the following steps: (A) storing, in a storage unit, reference lane line data generated by a lane detection module, mounted on a vehicle travelling in a lane, from an image of the lane sensed at a historical time point, previously estimated lane line data corresponding to the current time point, previously estimated historical reference position data and current position data of a reference point of the vehicle relative to the lane at the historical time point and at the current time point respectively, and historical position data of the position of the reference point at each historical unit time point within a historical period from the historical time point to the current time point, wherein each of the historical reference position data, the historical position data and the current position data contains a longitudinal position value, a lateral position value and an azimuth angle; (B) obtaining an estimated yaw rate and an estimated lateral acceleration of the vehicle at the current time point from the angular velocity and acceleration of the vehicle measured at the current time point by an inertial measurement unit mounted on the vehicle, and obtaining a reference yaw rate and a reference lateral acceleration of the vehicle at the current time point at least from vehicle dynamic data, sensed at the current time point by a dynamic sensing unit mounted on the vehicle, relating to the steering wheel and all of the wheels of the vehicle; (C) upon determining that the similarities of the estimated yaw rate and the estimated lateral acceleration to the reference yaw rate and the reference lateral acceleration respectively both reach a predetermined confidence level, estimating, from the estimated yaw rate and the estimated lateral acceleration, the longitudinal displacement, lateral displacement and azimuth offset of the vehicle over the next unit time; (D) estimating, from the current position data stored in the storage unit and the longitudinal displacement, lateral displacement and azimuth offset estimated in step (C), future position data of the reference point of the vehicle relative to the lane at a future unit time point, and storing the future position data in the storage unit, wherein the future position data contains a longitudinal position value, a lateral position value and an azimuth angle; (E) estimating, from the historical reference position data and the future position data, the total longitudinal displacement, total lateral displacement and total azimuth offset of the vehicle from the historical time point to the future unit time point; and (F) estimating, from the reference lane line data stored in the storage unit, the total longitudinal displacement, the total lateral displacement and the total azimuth offset and by using a coordinate transformation, future lane line data corresponding to the future unit time point, and storing the future lane line data in the storage unit.

Another object of the present invention is therefore to provide a lane tracking system for autonomous driving vehicles that can overcome at least one of the drawbacks of the prior art.

Thus, the present invention provides a lane tracking system for autonomous driving vehicles, comprising a lane detection module, an inertial measurement unit, a dynamic sensing unit, a storage unit and a processing unit.

The lane detection module is mounted on a vehicle travelling in a lane and is used to continuously sense images of the lane at a sensing frequency, so as to generate corresponding lane line data from each sensed image.

The inertial measurement unit is mounted on the vehicle and is used to sense the inertia of the vehicle so as to produce the angular velocity and acceleration of the vehicle.

The dynamic sensing unit is mounted on the vehicle and is used to sense the motion states of the vehicle itself, its steering wheel and all of its wheels so as to produce vehicle dynamic data.

The storage unit stores reference lane line data generated by the lane detection module from an image of the lane sensed at a historical time point, previously estimated historical reference position data and current position data of a reference point of the vehicle relative to the lane at the historical time point and at the current time point respectively, and corresponding historical position data of the position of the reference point at each historical unit time point within a historical period from the historical time point to the current time point, wherein each of the historical reference position data, the historical position data and the current position data contains a longitudinal position value, a lateral position value and an azimuth angle.

The processing unit is electrically connected to the lane detection module, the inertial measurement unit, the dynamic sensing unit and the storage unit, and is used to perform the following operations: obtaining an estimated yaw rate and an estimated lateral acceleration of the vehicle at the current time point from the angular velocity and the acceleration produced by the inertial measurement unit at the current time point, and obtaining a reference yaw rate and a reference lateral acceleration of the vehicle at the current time point at least from the vehicle dynamic data produced by the dynamic sensing unit at the current time point; upon determining that the similarities of the estimated yaw rate and the estimated lateral acceleration to the reference yaw rate and the reference lateral acceleration respectively both reach a predetermined confidence level, estimating, from the estimated yaw rate and the estimated lateral acceleration, the longitudinal displacement, lateral displacement and azimuth offset of the vehicle over the next unit time; estimating, from the current position data stored in the storage unit and the estimated longitudinal displacement, lateral displacement and azimuth offset, future position data of the reference point of the vehicle relative to the lane at a future unit time point, and storing the future position data in the storage unit, wherein the future position data contains a longitudinal position value, a lateral position value and an azimuth angle; estimating, from the historical reference position data and the future position data, the total longitudinal displacement, total lateral displacement and total azimuth offset of the vehicle from the historical time point to the future unit time point; and estimating, from the reference lane line data stored in the storage unit, the total longitudinal displacement, the total lateral displacement and the total azimuth offset and by using a coordinate transformation, future lane line data corresponding to the future unit time point, and storing the future lane line data in the storage unit.

The effects of the present invention are as follows: the vehicle dynamic data is used to further confirm the credibility of the yaw rate and lateral acceleration estimated from the measurements (angular velocity and acceleration) of the inertial measurement unit; the estimated future position data and the reference lane line data are used to estimate future lane line data at the future unit time point, which raises the update rate of the lane line data used for lateral trajectory tracking; and lane line data can be effectively compensated during brief failures of the lane detection function of the lane detection module, thereby improving the accuracy and robustness of lateral control.

100: lane tracking system
1: lane detection module
11: CCD image sensor
12: image processor
2: inertial measurement unit
21: triaxial gyroscope
22: triaxial accelerometer
3: dynamic sensing unit
31: steering wheel angle sensor
32: vehicle speed sensor
33: wheel speed sensor set
4: storage unit
5: processing unit
401-413: steps

Other features and effects of the present invention will become apparent from the following embodiment described with reference to the drawings, in which: Fig. 1 is a schematic diagram illustrating a vehicle travelling in a lane; Fig. 2 is a block diagram illustrating an embodiment of the lane tracking system for autonomous driving vehicles according to the present invention; Fig. 3 is a schematic diagram illustrating the data stored in a storage unit of the embodiment; Fig. 4 is a flowchart illustrating how a processing unit of the embodiment performs the lane tracking method for autonomous driving vehicles according to the present invention; Fig. 5 is a schematic diagram illustrating the data stored in the storage unit after the processing unit has performed step 406; Fig. 6 is a schematic diagram illustrating the data stored in the storage unit after the processing unit has performed step 409; Fig. 7 is a schematic diagram illustrating the data stored in the storage unit after the processing unit has performed step 413; and Fig. 8 is a schematic diagram illustrating the future lane line data at a future unit time point estimated by the embodiment according to the lane tracking method.

Before the present invention is described in detail, it should be noted that in the following description similar elements are denoted by the same reference numerals.

Referring to Figs. 1 and 2, an embodiment of the lane tracking system 100 for autonomous driving vehicles according to the present invention is applied to a vehicle 300 travelling in a lane 200, the vehicle 300 being capable of automated driving. In this embodiment, the two sides of the lane 200 are marked by lane lines drawn, for example, as dashed lines, but this example is not limiting; in other embodiments the lane lines may also be solid lines or double solid lines. The lane tracking system 100 includes a lane detection module 1, an inertial measurement unit 2, a dynamic sensing unit 3, a storage unit 4 and a processing unit 5.

The lane detection module 1 is mounted on the vehicle 300 and includes, for example, a CCD image sensor 11 mounted on the front windshield of the vehicle 300 and an image processor 12 electrically connected to the CCD image sensor 11. The CCD image sensor 11 is used to continuously sense images of the lane 200 at a sensing frequency (for example, 10 per second), and the image processor 12 generates corresponding lane line data from each image sensed by the CCD image sensor 11 using known image processing algorithms. It is worth noting that the update rate of the lane line data provided by the lane detection module 1 itself is also, for example, 10 per second; in other words, each item of lane line data is produced and output by the image processor 12 only 100 ms after the time point at which the corresponding image of the lane 200 was sensed by the CCD image sensor 11. In this embodiment, the generated lane line data includes, for example, a left lane line equation (Equation 1 below) and a right lane line equation (Equation 2 below):

y_L = f_L(x) = A_L·x³ + B_L·x² + C_L·x + D_L    (Equation 1)

y_R = f_R(x) = A_R·x³ + B_R·x² + C_R·x + D_R    (Equation 2)

where y_L is the lateral position value of the left lane line at longitudinal position value x, and y_R is the lateral position value of the right lane line at longitudinal position value x.
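Purely as an illustration of this data format, the following is a minimal sketch of how a cubic lane line of the form in Equations 1 and 2 could be represented and evaluated in code; the class name, method name and coefficient values are illustrative and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class LaneLine:
    """Cubic lane line y = A*x^3 + B*x^2 + C*x + D (the form of Equations 1 and 2)."""
    A: float
    B: float
    C: float
    D: float

    def lateral_at(self, x: float) -> float:
        """Lateral position of the lane line at longitudinal distance x (metres)."""
        return ((self.A * x + self.B) * x + self.C) * x + self.D

# Example: evaluate both lane lines 20 m ahead of the vehicle.
left = LaneLine(A=1e-6, B=2e-4, C=0.01, D=1.8)    # illustrative coefficients
right = LaneLine(A=1e-6, B=2e-4, C=0.01, D=-1.7)
lane_width_20m = left.lateral_at(20.0) - right.lateral_at(20.0)
lane_center_20m = 0.5 * (left.lateral_at(20.0) + right.lateral_at(20.0))
```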

The inertial measurement unit 2 is mounted on the vehicle 300 and includes, for example, a triaxial gyroscope and a triaxial accelerometer, although this example is not limiting. It is used to measure the angular velocity and acceleration of the vehicle 300 in three-dimensional space so as to produce a measurement result containing the measured angular velocity and acceleration.

The dynamic sensing unit 3 is mounted on the vehicle 300 and is used to sense the motion states of the vehicle 300 itself, its steering wheel and all of its wheels so as to produce vehicle dynamic data. In this embodiment, the dynamic sensing unit 3 includes, for example, a steering wheel angle sensor 31 for sensing the steering angle of the steering wheel of the vehicle 300, a vehicle speed sensor 32 for sensing the longitudinal velocity of the vehicle 300, and a wheel speed sensor set 33 for sensing the rotational speeds of all of the wheels of the vehicle 300, although this example is not limiting. The vehicle dynamic data produced by the dynamic sensing unit 3 therefore includes, for example, at least the steering wheel angle, the vehicle speed, the rear right wheel speed and the rear left wheel speed.

It is worth noting that, ideally, the inertial measurement unit 2 and the dynamic sensing unit 3 can be designed to have the same output update rate, for example the reciprocal of the unit time (10 ms), that is 100 per second, which is ten times the output update rate of the lane detection module 1 (for example, 10 per second).

Referring to Fig. 3, in this embodiment the storage unit 4 stores reference lane line data generated by the lane detection module 1 from an image of the lane 200 sensed at a historical time point t0-N (the data being produced, for example, 100 ms after t0-N), previously estimated lane line data corresponding to the current time point t0, previously estimated historical reference position data and current position data of a reference point 301 of the vehicle 300 (for example, a center of gravity of the vehicle 300, as shown in Fig. 8) relative to the lane 200 at the historical time point t0-N and at the current time point t0 respectively, and corresponding historical position data of the position of the reference point at each historical unit time point t0-1, ..., t0-(N-1) within a historical period from the historical time point t0-N to the current time point t0. In this embodiment, each of the reference lane line data and the estimated lane line data includes, for example, a left lane line equation and a right lane line equation, and each of the historical reference position data, the historical position data and the current position data contains, for example, a longitudinal position value, a lateral position value and an azimuth angle. By way of example, the unit time is 10 ms and 10 ≤ N ≤ 10×P, where P is a predetermined positive integer, although this example is not limiting.

The processing unit 5 is electrically connected to the lane detection module 1, the inertial measurement unit 2, the dynamic sensing unit 3 and the storage unit 4 so as to receive all lane line data from the lane detection module 1, the measurement result from the inertial measurement unit 2 and the vehicle dynamic data from the dynamic sensing unit 3. It is worth noting that all of the data stored in the storage unit 4 is obtained by the processing unit 5 repeatedly performing the lane tracking method for autonomous driving vehicles according to the present invention, and is stored in the storage unit 4 by the processing unit 5.

Referring to Figs. 2 to 7, the following describes in detail, by way of example, how the processing unit 5 estimates the future lane line data of the vehicle 300 at a future unit time point t0+1 by performing the lane tracking method at the current time point t0; the lane tracking method includes the following steps 401 to 413 (Fig. 4).

In step 401, the processing unit 5 obtains an estimated yaw rate and an estimated lateral acceleration of the vehicle 300 at the current time point t0 from the measurement result (the angular velocity and the acceleration) produced (output) by the inertial measurement unit 2 at the current time point t0. More specifically, the processing unit 5 here acts as a Kalman filter: it first removes the noise from the angular velocity and the acceleration by Kalman filtering, and then obtains the estimated yaw rate and the estimated lateral acceleration from the filtered angular velocity and acceleration by Kalman estimation. Since the details of the Kalman filter are well known to those skilled in the art, they are not repeated here.
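The patent does not specify the filter design, so the following is only a rough sketch of the idea behind step 401, assuming a simple one-dimensional Kalman filter applied independently to the yaw rate and lateral acceleration channels; the noise variances, the class and function names, and the choice of IMU axes are assumptions made for this illustration.

```python
class Scalar1DKalman:
    """Minimal 1-D Kalman filter used here only to smooth a noisy scalar signal."""

    def __init__(self, process_var: float, measurement_var: float):
        self.q = process_var        # process noise variance
        self.r = measurement_var    # measurement noise variance
        self.x = 0.0                # filtered estimate
        self.p = 1.0                # estimate variance

    def update(self, z: float) -> float:
        # Predict (constant-value model): estimate unchanged, uncertainty grows.
        self.p += self.q
        # Correct with the new measurement z.
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# For this sketch, the IMU z-axis angular rate is treated as the (noisy) yaw rate
# and the y-axis acceleration as the (noisy) lateral acceleration.
yaw_filter = Scalar1DKalman(process_var=1e-4, measurement_var=1e-2)
ay_filter = Scalar1DKalman(process_var=1e-3, measurement_var=1e-1)

def estimate_from_imu(gyro_z: float, accel_y: float) -> tuple[float, float]:
    """Return (estimated yaw rate, estimated lateral acceleration) for one IMU sample."""
    return yaw_filter.update(gyro_z), ay_filter.update(accel_y)
```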

On the other hand, in step 402, the processing unit 5 obtains a reference yaw rate and a reference lateral acceleration of the vehicle 300 at the current time point t0 at least from the vehicle dynamic data produced at the current time point t0. More specifically, the processing unit 5 obtains the reference yaw rate (denoted γ_ref) and the reference lateral acceleration (denoted a_y,ref) from the steering wheel angle (denoted δ_sw) and the vehicle speed (denoted V_x) contained in the vehicle dynamic data, together with the steering ratio of the vehicle 300 (the ratio of the steering wheel angle to the wheel steering angle, denoted N), the understeer coefficient (denoted K_us) and the wheelbase (the distance from the front axle to the rear axle, denoted L), and the reference yaw rate γ_ref and the reference lateral acceleration a_y,ref can be calculated by Equations 3 and 4 below, respectively:

γ_ref = V_x·δ_f / (L + K_us·V_x²)    (Equation 3)

a_y,ref = V_x²·δ_f / (L + K_us·V_x²)    (Equation 4)

where δ_f is the front wheel steering angle and δ_f = δ_sw / N. Alternatively, the processing unit 5 can obtain the reference yaw rate γ_ref and the reference lateral acceleration a_y,ref from the vehicle speed V_x, the rear right wheel speed (denoted v_rr) and the rear left wheel speed (denoted v_rl) contained in the vehicle dynamic data, together with the rear track width of the vehicle 300 (denoted S_r), and the reference yaw rate γ_ref and the reference lateral acceleration a_y,ref can be calculated by Equations 5 and 6 below, respectively:

γ_ref = (v_rr − v_rl) / S_r    (Equation 5)

a_y,ref = V_x·(v_rr − v_rl) / S_r    (Equation 6)

Incidentally, if the vehicle dynamic data also contains the front right wheel speed (denoted v_fr) and the front left wheel speed (denoted v_fl), the processing unit 5 can further obtain the reference yaw rate γ_ref and the reference lateral acceleration a_y,ref from the front right wheel speed v_fr, the front left wheel speed v_fl, the front track width of the vehicle 300 (denoted S_f) and the front wheel steering angle δ_f, and the reference yaw rate γ_ref and the reference lateral acceleration a_y,ref can be calculated by Equations 7 and 8 below, respectively:

γ_ref = (v_fr − v_fl) / (S_f·cos δ_f)    (Equation 7)

a_y,ref = V_x·(v_fr − v_fl) / (S_f·cos δ_f)    (Equation 8)
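As an illustration of step 402, the following sketch computes reference values from the steering-based and rear-wheel-speed-based relations; the vehicle parameters are placeholders, and the formula forms follow the reconstruction of Equations 3 to 6 given above, which is itself an assumption since the original formula images are not reproduced in this text.

```python
# Illustrative vehicle parameters (placeholders, not values from the patent).
STEERING_RATIO_N = 16.0   # steering wheel angle / front wheel steering angle
UNDERSTEER_KUS = 0.0025   # understeer coefficient (s^2/m)
WHEELBASE_L = 2.7         # m
REAR_TRACK_SR = 1.6       # m

def reference_from_steering(delta_sw: float, vx: float) -> tuple[float, float]:
    """Reference yaw rate and lateral acceleration from steering wheel angle and
    vehicle speed (the forms of Equations 3 and 4)."""
    delta_f = delta_sw / STEERING_RATIO_N            # front wheel steering angle (rad)
    yaw_ref = vx * delta_f / (WHEELBASE_L + UNDERSTEER_KUS * vx ** 2)
    ay_ref = vx * yaw_ref                            # a_y = V_x * yaw rate
    return yaw_ref, ay_ref

def reference_from_rear_wheels(vx: float, v_rr: float, v_rl: float) -> tuple[float, float]:
    """Reference yaw rate and lateral acceleration from the rear wheel speed
    difference (the forms of Equations 5 and 6)."""
    yaw_ref = (v_rr - v_rl) / REAR_TRACK_SR
    ay_ref = vx * yaw_ref
    return yaw_ref, ay_ref

# Example: 0.05 rad at the steering wheel while driving at 20 m/s.
print(reference_from_steering(delta_sw=0.05, vx=20.0))
print(reference_from_rear_wheels(vx=20.0, v_rr=20.05, v_rl=19.95))
```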

After steps 401 and 402, the processing unit 5 uses, for example, a two-sample t-test to determine whether the similarities of the estimated yaw rate and the estimated lateral acceleration to the reference yaw rate and the reference lateral acceleration respectively both reach a predetermined confidence level. In this embodiment, the predetermined confidence level is, for example, 95%. If the determination is affirmative, this means that the measurement result of the inertial measurement unit 2 has passed the credibility check and is usable data, and the flow proceeds to step 405. Conversely, if the determination is negative, this means that the processing unit 5 has found the measurement result of the inertial measurement unit 2 to be unusable data; in this case, the processing unit 5 can output a warning signal indicating that the inertial measurement unit 2 is abnormal to an external control system (not shown) (step 404), so that the external control system can carry out related follow-up processing.
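One possible way to carry out such a check is sketched below using SciPy's two-sample t-test over short windows of recent samples; the window length, the use of a p-value against the 95% level, and the function names are assumptions made for this illustration only.

```python
from scipy import stats

CONFIDENCE = 0.95  # predetermined confidence level of 95%

def measurements_agree(estimated: list[float], reference: list[float]) -> bool:
    """Two-sample t-test: return True when the estimated and reference samples are
    statistically indistinguishable at the chosen confidence level."""
    _, p_value = stats.ttest_ind(estimated, reference, equal_var=False)
    # A p-value above (1 - confidence) means we cannot reject "same mean",
    # so the IMU-based estimate is accepted as credible.
    return p_value > (1.0 - CONFIDENCE)

# Example with short buffers of recent yaw rate samples (rad/s).
yaw_est_window = [0.101, 0.099, 0.102, 0.098, 0.100]
yaw_ref_window = [0.100, 0.097, 0.103, 0.099, 0.101]
if not measurements_agree(yaw_est_window, yaw_ref_window):
    print("IMU yaw rate failed the credibility check")
```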

In step 405, the processing unit 5 estimates, from the estimated yaw rate and the estimated lateral acceleration, the longitudinal displacement (denoted Δs_x), the lateral displacement (denoted Δs_y) and the azimuth offset (denoted Δψ) of the vehicle 300 over the next unit time (for example, 10 ms). Then, the processing unit 5 estimates, from the current position data stored in the storage unit 4 and the estimated longitudinal displacement Δs_x, lateral displacement Δs_y and azimuth offset Δψ, future position data of the reference point 301 of the vehicle 300 relative to the lane at a future unit time point t0+1, and stores the future position data in the storage unit 4 (step 406), wherein the future position data contains a longitudinal position value, a lateral position value and an azimuth angle, as shown in Fig. 5.

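The patent does not give the kinematic relations used in steps 405 and 406, so the following is only a rough sketch under a constant-speed, constant-yaw-rate assumption over one 10 ms step; the formulas, names and parameter values are illustrative and should not be read as the patented computation.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float    # longitudinal position value
    y: float    # lateral position value
    psi: float  # azimuth angle (rad)

def displacement_over_step(vx: float, yaw_rate: float, ay: float, dt: float = 0.01):
    """Rough per-step displacement assuming constant speed and yaw rate over dt."""
    d_psi = yaw_rate * dt
    dx = vx * dt
    dy = 0.5 * ay * dt ** 2          # lateral drift integrated from the lateral acceleration
    return dx, dy, d_psi

def predict_next_pose(current: Pose, dx: float, dy: float, d_psi: float) -> Pose:
    """Step 406 idea: dead-reckon the pose at t0+1 from the pose at t0, expressing the
    per-step displacement (given in the vehicle frame) in the lane frame."""
    c, s = math.cos(current.psi), math.sin(current.psi)
    return Pose(
        x=current.x + c * dx - s * dy,
        y=current.y + s * dx + c * dy,
        psi=current.psi + d_psi,
    )

# Example: 20 m/s with a gentle left turn.
dx, dy, dpsi = displacement_over_step(vx=20.0, yaw_rate=0.05, ay=1.0)
future = predict_next_pose(Pose(x=0.0, y=0.2, psi=0.01), dx, dy, dpsi)
```
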
It should be noted that, before performing the later step 412, the processing unit 5 first checks whether lane line data newly produced by the lane detection module 1 has been received at the current time point t0 (step 407). If the processing unit 5 has not received any lane line data from the lane detection module 1 at the current time point t0, the flow proceeds to step 412. Conversely, if the processing unit 5 receives, at the current time point t0, lane line data newly produced by the lane detection module 1 (more specifically, lane line data produced by the image processor 12 at the current time point t0 from an image of the lane 200 sensed by the CCD image sensor 11 at a historical unit time point in the historical period, for example t0-10), the processing unit 5 determines whether the received lane line data is available (step 408) based on the estimated lane line data stored in the storage unit 4 and on predetermined reference conditions related to the image sensing (resolution) specification of the CCD image sensor 11 of the lane detection module 1, thereby confirming whether the lane detection function of the lane detection module 1 is normal.

In this embodiment, the image sensing specification defines, for example, a predetermined shortest effective longitudinal distance and a predetermined longest effective longitudinal distance, and the predetermined reference conditions include, for example, a predetermined first width difference threshold and a predetermined second width difference threshold related to the width of the lane 200 at the predetermined shortest effective longitudinal distance and at the predetermined longest effective longitudinal distance respectively, and a predetermined first lateral offset threshold and a predetermined second lateral offset threshold related to the centerline of the lane 200 at the predetermined shortest effective longitudinal distance and at the predetermined longest effective longitudinal distance respectively. For example, the predetermined shortest effective longitudinal distance and the predetermined longest effective longitudinal distance are 15 meters and 25 meters respectively, the predetermined first width difference threshold and the predetermined second width difference threshold are both 0.5 meters, and the predetermined first lateral offset threshold and the predetermined second lateral offset threshold are both 0.1 meters, although this example is not limiting.

Thus, in step 408, the processing unit 5 obtains, from the estimated lane line data, a first width value representing the width of the lane 200 and a first position value representing the lateral position of the centerline of the lane 200 at the predetermined shortest effective longitudinal distance, and a second width value representing the width of the lane 200 and a second position value representing the lateral position of the centerline of the lane 200 at the predetermined longest effective longitudinal distance; and obtains, from the received lane line data, a third width value representing the width of the lane 200 and a third position value representing the lateral position of the centerline of the lane at the predetermined shortest effective longitudinal distance, and a fourth width value representing the width of the lane 200 and a fourth position value representing the lateral position of the centerline of the lane at the predetermined longest effective longitudinal distance. In this embodiment, the processing unit 5 decides whether the lane line data is available by determining whether a first difference between the first width value and the third width value is not greater than the predetermined first width difference threshold, whether a second difference between the second width value and the fourth width value is not greater than the predetermined second width difference threshold, whether a third difference between the first position value and the third position value is not greater than the predetermined first lateral offset threshold, and whether a fourth difference between the second position value and the fourth position value is not greater than the predetermined second lateral offset threshold. When the processing unit 5 determines that the first difference, the second difference, the third difference and the fourth difference are not greater than the predetermined first width difference threshold, the predetermined second width difference threshold, the predetermined first lateral offset threshold and the predetermined second lateral offset threshold respectively, the lane line data is determined by the processing unit 5 to be available, and the lane detection function of the lane detection module 1 is thereby confirmed to be normal. The flow then proceeds to step 409. Otherwise, the flow proceeds to step 410.

For example, the left lane line equation y′_L1 and the right lane line equation y′_R1 included in the estimated lane line data, and the left lane line equation y″_L1 and the right lane line equation y″_R1 included in the received lane line data, are expressed as Equations 9 to 12 below, respectively:

y′_L1 = f′_L1(x) = A′_L1·x³ + B′_L1·x² + C′_L1·x + D′_L1    (Equation 9)

y′_R1 = f′_R1(x) = A′_R1·x³ + B′_R1·x² + C′_R1·x + D′_R1    (Equation 10)

y″_L1 = f″_L1(x) = A″_L1·x³ + B″_L1·x² + C″_L1·x + D″_L1    (Equation 11)

y″_R1 = f″_R1(x) = A″_R1·x³ + B″_R1·x² + C″_R1·x + D″_R1    (Equation 12)

Following the preceding example, substituting x = 15 into Equations 9 to 12 gives the first width value W1, the third width value W3, the first position value Y1 and the third position value Y3, where W1 = f′_L1(15) − f′_R1(15), W3 = f″_L1(15) − f″_R1(15), Y1 = (f′_L1(15) + f′_R1(15))/2 and Y3 = (f″_L1(15) + f″_R1(15))/2; and substituting x = 25 into Equations 9 to 12 gives the second width value W2, the fourth width value W4, the second position value Y2 and the fourth position value Y4, where W2 = f′_L1(25) − f′_R1(25), W4 = f″_L1(25) − f″_R1(25), Y2 = (f′_L1(25) + f′_R1(25))/2 and Y4 = (f″_L1(25) + f″_R1(25))/2. The first difference D1, the second difference D2, the third difference D3 and the fourth difference D4 can then be obtained, where D1 = |W1 − W3|, D2 = |W2 − W4|, D3 = |Y1 − Y3| and D4 = |Y2 − Y4|. It is worth noting that all of the above values are in meters. If D1 ≤ 0.5, D2 ≤ 0.5, D3 ≤ 0.1 and D4 ≤ 0.1, the lane line data is determined to be available; conversely, if any one of these four relations is not satisfied, the lane line data is determined to be unavailable, which means that the lane detection function of the lane detection module 1 may have failed briefly because the captured lane image contains lane lines that are stained or blurred or whose color contrast is abnormal due to lighting changes, or contains no lane lines at all.
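To make the availability check of step 408 concrete, here is a small sketch that evaluates the four differences against the example thresholds (check distances of 15 m and 25 m, a 0.5 m width tolerance and a 0.1 m centerline tolerance); the function and variable names and the example coefficients are illustrative only.

```python
from typing import Callable

X_NEAR, X_FAR = 15.0, 25.0          # predetermined shortest/longest effective distances (m)
WIDTH_TOL, OFFSET_TOL = 0.5, 0.1    # width difference and lateral offset thresholds (m)

LaneFn = Callable[[float], float]   # y = f(x): lateral position at longitudinal distance x

def lane_data_available(est_left: LaneFn, est_right: LaneFn,
                        new_left: LaneFn, new_right: LaneFn) -> bool:
    """Compare the newly received lane lines against the estimated ones at the near
    and far check distances; all four differences must stay within tolerance."""
    for x in (X_NEAR, X_FAR):
        est_width = est_left(x) - est_right(x)
        new_width = new_left(x) - new_right(x)
        est_center = 0.5 * (est_left(x) + est_right(x))
        new_center = 0.5 * (new_left(x) + new_right(x))
        if abs(est_width - new_width) > WIDTH_TOL:
            return False
        if abs(est_center - new_center) > OFFSET_TOL:
            return False
    return True

# Example with simple cubic lane line functions (illustrative coefficients).
est_l = lambda x: 1e-6 * x**3 + 2e-4 * x**2 + 0.010 * x + 1.80
est_r = lambda x: 1e-6 * x**3 + 2e-4 * x**2 + 0.010 * x - 1.70
new_l = lambda x: 1e-6 * x**3 + 2e-4 * x**2 + 0.012 * x + 1.78
new_r = lambda x: 1e-6 * x**3 + 2e-4 * x**2 + 0.012 * x - 1.72
print(lane_data_available(est_l, est_r, new_l, new_r))   # True for these coefficients
```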

If the processing unit 5 determines in step 408 that any one of the first to fourth differences is greater than the corresponding one of the predetermined first width difference threshold, the predetermined second width difference threshold, the predetermined first lateral offset threshold and the predetermined second lateral offset threshold, the lane line data is determined by the processing unit 5 to be unavailable (that is, the lane detection function of the lane detection module 1 is confirmed to be abnormal). In this case, the processing unit 5 further checks whether the accumulated number of times lane line data has so far been determined to be unavailable exceeds a threshold number (for example, 7) (step 410). If the processing unit 5 determines that the accumulated number does not exceed the threshold number, the flow proceeds to step 412. Conversely, if the processing unit 5 determines that the accumulated number exceeds the threshold number, the processing unit 5 outputs a warning signal indicating that the lane detection module 1 is abnormal to the external control system (step 411), so that the external control system can carry out related follow-up control.

In step 409, the processing unit 5 updates the reference lane line data and the historical reference position data stored in the storage unit 4 to the received lane line data and to the historical position data at the historical unit time point t0-10, respectively. In this case, the position data stored in the storage unit 4 is updated so that only the position data from the historical unit time point t0-10 onward is retained, as shown in Fig. 6.

In step 412, the processing unit 5 estimates, from the historical reference position data and the future position data, the total longitudinal displacement S_x, the total lateral displacement S_y and the total azimuth offset ψ of the vehicle 300 from the historical time point t0-N (or t0-10) to the future unit time point t0+1. More specifically, if the processing unit 5 confirms in step 407 that no newly produced lane line data has been received, or confirms in step 407 that newly produced lane line data has been received but determines in step 408 that the lane line data is unavailable and confirms in step 410 that the accumulated number of unavailable determinations does not exceed the threshold number, the processing unit 5 uses the historical reference position data and the future position data shown in Fig. 5, with the historical time point taken as t0-N; whereas if the processing unit 5 confirms in step 407 that newly produced lane line data has been received and has completed step 409, it uses the historical reference position data and the future position data shown in Fig. 6, with the historical time point taken as t0-10.

Finally, in step 413, the processing unit 5 estimates future lane line data corresponding to the future unit time point t0+1 from the reference lane line data stored in the storage unit 4, the total longitudinal displacement S_x, the total lateral displacement S_y and the total azimuth offset ψ, using a coordinate transformation. In this case, the future lane line data is treated as the estimated lane line data corresponding to the future unit time point t0+1. In addition, the processing unit 5 also stores the future lane line data in the storage unit 4 (for example by rewriting, that is, overwriting, the previously stored estimated lane line data, although this is not limiting), as shown in Fig. 7, for use in subsequent processing.

For example, referring to Fig. 8, the left lane line equation y_L1 and the right lane line equation y_R1 included in the estimated lane line data are expressed as Equations 13 and 14 below, respectively:

y_L1 = f_L1(x) = A_L1·x³ + B_L1·x² + C_L1·x + D_L1    (Equation 13)

y_R1 = f_R1(x) = A_R1·x³ + B_R1·x² + C_R1·x + D_R1    (Equation 14)

In this case, the coordinate relationship of the reference point 301 of the vehicle 300 at the future unit time point t0+1 relative to the historical time point t0-N (or t0-10) can be expressed as Equation 15 below (here (x, y) denotes coordinates in the vehicle frame at the historical time point and (x′, y′) the corresponding coordinates in the vehicle frame at the future unit time point):

x′ = cos ψ·(x − S_x) + sin ψ·(y − S_y)
y′ = −sin ψ·(x − S_x) + cos ψ·(y − S_y)    (Equation 15)

Applying the inverse coordinate transformation to Equation 15 gives Equation 16 below:

x = cos ψ·x′ − sin ψ·y′ + S_x
y = sin ψ·x′ + cos ψ·y′ + S_y    (Equation 16)

Thus, from Equation 16, x = g(x′, y′) and y = h(x′, y′). Since y_L1 = f_L1(x) and y_R1 = f_R1(x), the left lane line equation y_L2 and the right lane line equation y_R2 contained in the future lane line data are finally obtained and can be expressed as Equations 17 and 18 below:

y_L2 = f_L1(g(x′, y′)) = A_L2·x³ + B_L2·x² + C_L2·x + D_L2    (Equation 17)

y_R2 = f_R1(g(x′, y′)) = A_R2·x³ + B_R2·x² + C_R2·x + D_R2    (Equation 18)

At this point, the processing unit 5 has completed the lane tracking method. The processing unit 5 can output the estimated future lane line data to the external control system for use as the basis of the lateral control of the vehicle 300. It is worth noting that, when the lane tracking method of the present invention is used and the unit time is 10 ms, the lane tracking system 100 can equivalently raise the update rate of the lane line data to ten times the update rate of the lane line data provided by the lane detection module 1 itself.
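As an illustration of step 413, the following sketch maps sampled points of a lane line into the future vehicle frame and refits a cubic polynomial; the rotation and translation convention follows the reconstruction of Equations 15 and 16 above (itself an assumption), and numpy.polyfit is used here simply as a convenient way to recover coefficients A to D.

```python
import math
import numpy as np

def transform_lane_line(coeffs, s_x, s_y, psi, x_samples=None):
    """Re-express a cubic lane line y = f(x), given in the historical vehicle frame,
    as a cubic in the future vehicle frame that is translated by (s_x, s_y) and
    rotated by psi relative to it.  coeffs = (A, B, C, D)."""
    if x_samples is None:
        x_samples = np.linspace(0.0, 50.0, 26)        # sample 0 to 50 m ahead
    a, b, c, d = coeffs
    y_samples = ((a * x_samples + b) * x_samples + c) * x_samples + d

    # Equation 15 form: map each sampled point into the future frame.
    cos_p, sin_p = math.cos(psi), math.sin(psi)
    x_new = cos_p * (x_samples - s_x) + sin_p * (y_samples - s_y)
    y_new = -sin_p * (x_samples - s_x) + cos_p * (y_samples - s_y)

    # Refit a cubic y' = A'x'^3 + B'x'^2 + C'x' + D' in the future frame.
    A2, B2, C2, D2 = np.polyfit(x_new, y_new, 3)
    return A2, B2, C2, D2

# Example: shift a reference lane line by 1.2 m forward, 0.05 m sideways, 0.01 rad.
future_left = transform_lane_line((1e-6, 2e-4, 0.01, 1.8), s_x=1.2, s_y=0.05, psi=0.01)
```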

In summary, the lane tracking system 100 of the present invention uses the vehicle dynamic data to further confirm the credibility of the yaw rate and lateral acceleration estimated from the measurements (angular velocity and acceleration) of the inertial measurement unit 2, uses the estimated future position data and the reference lane line data to estimate future lane line data at the future unit time point, thereby raising the update rate of the lane line data used for lateral trajectory tracking, and can effectively compensate for lane information during brief failures of the lane detection function of the lane detection module 1 (that is, while newly produced lane line data is determined to be unavailable), thereby improving the accuracy and robustness of lateral control. The objects of the present invention are therefore indeed achieved.

The foregoing is merely an embodiment of the present invention and shall not limit the scope of implementation of the present invention; all simple equivalent changes and modifications made according to the scope of the patent claims and the contents of the patent specification shall still fall within the scope covered by the patent of the present invention.

100: lane tracking system
1: lane detection module
11: CCD image sensor
12: image processor
2: inertial measurement unit
21: triaxial gyroscope
22: triaxial accelerometer
3: dynamic sensing unit
31: steering wheel angle sensor
32: vehicle speed sensor
33: wheel speed sensor set
4: storage unit
5: processing unit

Claims (10)

一種用於自動駕駛車輛的車道追蹤方法,藉由一處理單元來實施,包含以下步驟:(A)將由一設於一行駛在一車道的車輛的車道偵測模組根據在一歷史時間點所感測該車道的影像而產生的參考車道線資料,以及先前估算出該車輛的一參考點相對於該車道分別在該歷史時間點及當前時間點的位置的歷史參考位置資料與當前位置資料且在一自該歷史時間點至該當前時間點的歷史期間內的每個歷史單位時間點的位置的歷史位置資料儲存於一儲存單元,其中該歷史參考位置資料、該歷史位置資料及該當前位置資料其中每一者含有縱向位置值、側向位置值及方位角;(B)根據由一設於該車輛的慣性測量單元在該當前時間點所測量到該車輛的角速度及加速度,獲得該車輛在該當前時間點的估算偏航率及估算側向加速度,並且至少根據由一設於該車輛的動態感測單元在該當前時間點所感測相關於該車輛的方向盤以及所有車輪的車輛動態資料,獲得該車輛在該當前時間點的參考偏航率及參考側向加速度;(C)在判定出該估算偏航率及該估算側向加速度分別與該參考偏航率及該參考側向加速度的相似度均達到一預定信心水準時,根據該估算偏航率及該估算側向加速度,估算該車輛在未來的單位時間內的縱向位移量、側向位移量及方位角偏移量; (D)根據該儲存單元所儲存的該當前位置資料,以及步驟(C)所估算的該縱向位移量、該側向位移量及該方位角偏移量,估算該車輛的該參考點相對於該車道在一未來單位時間點的未來位置資料,並將該未來位置資料儲存於該儲存單元,其中該未來位置資料含有縱向位置值、側向位置值及方位角;(E)根據該歷史參考位置資料及該未來位置資料,估算該車輛自該歷史時間點至該未來單位時間點的總縱向位移量、總側向位移量及總方位角偏移量;及(F)根據該儲存單元所儲存的該參考車道線資料、該總縱向位移量、該總側向位移量及該總方位偏移量並利用座標轉換,估算對應於該未來單位時間點的未來車道線資料,並將該未來車道線資料儲存於該儲存單元。 A lane tracking method for autonomous vehicles, implemented by a processing unit, includes the following steps: (A) The lane detection module of a vehicle driving in a lane will be sensed based on a historical time point The reference lane line data generated by measuring the image of the lane, and the historical reference position data and the current position data of the position of the reference point of the vehicle previously estimated at the historical time point and the current time point with respect to the lane respectively and in A historical position data of the position of each historical unit time point in the historical period from the historical time point to the current time point is stored in a storage unit, wherein the historical reference position data, the historical position data and the current position data Each of them contains longitudinal position value, lateral position value and azimuth angle; (B) according to the angular velocity and acceleration of the vehicle measured by an inertial measurement unit provided at the vehicle at the current time The estimated yaw rate and the estimated lateral acceleration at the current time, and at least according to vehicle dynamic data related to the steering wheel and all wheels of the vehicle sensed by a dynamic sensing unit provided at the vehicle at the current time, Obtain the reference yaw rate and reference lateral acceleration of the vehicle at the current time point; (C) After determining the estimated yaw rate and the estimated lateral acceleration, respectively, the reference yaw rate and the reference lateral acceleration When the similarity reaches a predetermined confidence level, based on the estimated yaw rate and the estimated lateral acceleration, the longitudinal displacement, lateral displacement, and azimuth deviation of the vehicle in the future unit time are estimated; (D) According to the current position data stored in the storage unit, and the longitudinal displacement, the lateral displacement, and the azimuth deviation estimated in step (C), estimate the reference point of the vehicle relative to The future position data of the lane at a future unit time point, and store the future position data in the storage unit, wherein the future position data includes longitudinal position value, lateral position value and azimuth angle; (E) according to the historical reference Position data and the future position data, estimate the total longitudinal displacement, total lateral displacement and total azimuth deviation of the vehicle from the historical time point to the future unit time point; and (F) according to the storage unit The stored reference lane line data, the total longitudinal displacement, the total 
lateral displacement, and the total azimuth offset are converted using coordinates to estimate the future lane line data corresponding to the future unit time point, and the future The lane line data is stored in the storage unit. 如請求項1所述的用於自動駕駛車輛的車道追蹤方法,其中,在步驟(A)中,該處理單元還將先前估算出且對應於當前時間點的估算車道線資料儲存於該儲存單元,在步驟(A)與步驟(E)之間,還包含以下步驟:(G)當在該當前時間點接收到該車道偵測模組根據在該歷史期間中的一歷史單位時間點所感測該車道的影像而新近產生的車道線資料時,根據該儲存單元所儲存的該估算車道線資料、及相關於該車道偵測模組的影像感測規格的預定參考條件,判定所接收的該車道線資料是否為可利用的;及(H)當判定出該車道線資料為可利用的,將該儲存單 元所儲存的該參考車道線資料及該歷史參考位置資料分別更新為該車道線資料及在該歷史單位時間點的該歷史位置資料。 The lane tracking method for an autonomous driving vehicle according to claim 1, wherein in step (A), the processing unit also stores the estimated lane line data previously estimated and corresponding to the current time point in the storage unit Between step (A) and step (E), the following steps are also included: (G) When the lane detection module is received at the current time point, it is sensed according to a historical unit time point in the historical period When the image of the lane is newly generated lane line data, the received lane line data stored in the storage unit and the predetermined reference conditions related to the image sensing specifications of the lane detection module are used to determine the received Whether the lane line data is available; and (H) When it is determined that the lane line data is available, the storage slip The reference lane line data and the historical reference position data stored in the yuan are updated to the lane line data and the historical position data at the time point of the historical unit, respectively. 如請求項2所述的用於自動駕駛車輛的車道追蹤方法,其中,在步驟(G)中:該影像感測規格定義出一預定最短有效縱向距離及一預定最長有效縱向距離;該預定參考條件包含分別在該預定最短有效縱向距離及該預定最長有效縱向距離處相關於該車道的寬度的一預定第一寬度差門檻及一預定第二寬度差門檻,以及分別在該預定最短有效縱向距離及該預定最長有效縱向距離處相關於該車道的中心線的一預定第一側向偏移量門檻及一預定第二側向偏移量門檻;該處理單元根據該估算車道線資料獲得在該預定最短有效縱向距離處代表該車道的寬度的第一寬度值與代表該車道的中心線的側向位置的第一位置值,以及在該預定最長有效縱向距離處代表該車道的寬度值的第二寬度值與代表該車道的中心線的側向位置的第二位置值,並且根據該車道線資料獲得在該預定最短有效縱向距離處代表該車道的寬度的第三寬度值與代表該車道的中心線的側向位置的第三位置值,以及在該預定最長有效縱向距離處代表該車道的寬度的第四寬度值與代表該車道的中心線的側向位置的第四位置值;該處理單元藉由判定該第一寬度值與該第三寬度值 之間的第一差值是否不大於該預定第一寬度差門檻、該第二寬度值與該第四寬度值之間的第二差值是否不大於該預定第二寬度差門檻、該第一位置值與該第三位置值之間的第三差值是否不大於該預定第一側向偏移量門檻、及該第二位置值與該第四位置值之間的第四差值是否不大於該預定第二側向偏移量門檻來決定該車道線資料是否為可利用的;及當該處理單元判定出該第一差值、該第二差值、該第三差值及該第四差值分別不大於該預定第一寬度差門檻、該預定第二寬度差門檻、該預定第一側向偏移量門檻及該預定第二側向偏移量門檻時,該車道線資料被該處理單元判定為可利用的。 The lane tracking method for an automatic driving vehicle according to claim 2, wherein in step (G): the image sensing specification defines a predetermined shortest effective longitudinal distance and a predetermined longest effective longitudinal distance; the predetermined reference The conditions include a predetermined first width difference threshold and a predetermined second width difference threshold corresponding to the width of the lane at the predetermined shortest effective longitudinal distance and the predetermined longest effective longitudinal distance, respectively, and respectively at the predetermined shortest effective longitudinal distance And a predetermined first lateral offset threshold and a predetermined second lateral offset threshold related to the center line of the lane at the predetermined maximum effective longitudinal distance; the processing unit obtains the The first width value representing the width of the lane at the predetermined shortest effective longitudinal distance and the first position value representing the lateral position of the centerline of the lane, and the first value representing the width of the lane at the predetermined longest effective 
4. The lane tracking method for an autonomous driving vehicle according to claim 1, wherein, in step (B), the processing unit first filters noise out of the angular velocity and the acceleration by Kalman filtering, and then obtains the estimated yaw rate and the estimated lateral acceleration from the filtered angular velocity and the filtered acceleration by Kalman estimation.

5. The lane tracking method for an autonomous driving vehicle according to claim 1, wherein, in step (B):
the vehicle dynamics data includes at least a steering wheel angle, a vehicle speed, a rear-right wheel speed and a rear-left wheel speed; and
the processing unit obtains the reference yaw rate and the reference lateral acceleration either from the steering wheel angle and the vehicle speed contained in the vehicle dynamics data together with the steering ratio, the understeer coefficient and the wheelbase of the vehicle, or from the vehicle speed, the rear-right wheel speed and the rear-left wheel speed contained in the vehicle dynamics data together with the rear track width of the vehicle.
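Claims 4 and 5 leave the concrete formulas for the reference yaw rate and reference lateral acceleration unspecified. The sketch below uses two textbook relations that fit the recited inputs: a steady-state bicycle model driven by the steering wheel angle, and the rear wheel speed difference divided by the rear track width. Both relations, the sign conventions, and the approximation a_y = v * yaw_rate are assumptions for illustration only.

```python
def reference_from_steering(steer_wheel_angle, speed, steering_ratio,
                            understeer_coeff, wheelbase):
    """Reference yaw rate and lateral acceleration from a steady-state bicycle model."""
    road_wheel_angle = steer_wheel_angle / steering_ratio
    yaw_rate = speed * road_wheel_angle / (wheelbase * (1.0 + understeer_coeff * speed ** 2))
    lat_accel = speed * yaw_rate
    return yaw_rate, lat_accel

def reference_from_wheel_speeds(speed, rear_left_speed, rear_right_speed, rear_track):
    """Reference yaw rate and lateral acceleration from the rear wheel speed difference."""
    yaw_rate = (rear_right_speed - rear_left_speed) / rear_track
    lat_accel = speed * yaw_rate
    return yaw_rate, lat_accel
```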
6. A lane tracking system for an autonomous driving vehicle, comprising:
a lane detection module, mounted on a vehicle travelling in a lane and configured to continuously sense images of the lane at a sensing frequency so as to generate corresponding lane line data from each sensed image;
an inertial measurement unit, mounted on the vehicle and configured to sense the inertia of the vehicle so as to generate an angular velocity and an acceleration of the vehicle;
a dynamic sensing unit, mounted on the vehicle and configured to sense the motion states of the vehicle itself, of its steering wheel and of all of its wheels so as to generate vehicle dynamics data;
a storage unit storing reference lane line data generated by the lane detection module from an image of the lane sensed at a historical time point, together with historical reference position data, current position data and historical position data previously estimated for a reference point of the vehicle, which respectively indicate the position of the reference point relative to the lane at the historical time point, at a current time point, and at each historical unit time point within a historical period from the historical time point to the current time point, wherein each of the historical reference position data, the historical position data and the current position data contains a longitudinal position value, a lateral position value and an azimuth angle; and
a processing unit electrically connected to the lane detection module, the inertial measurement unit, the dynamic sensing unit and the storage unit, and configured to perform the following operations:
obtaining an estimated yaw rate and an estimated lateral acceleration of the vehicle at the current time point according to the angular velocity and the acceleration generated by the inertial measurement unit at the current time point, and obtaining a reference yaw rate and a reference lateral acceleration of the vehicle at the current time point at least according to the vehicle dynamics data generated by the dynamic sensing unit at the current time point,
upon determining that the similarities of the estimated yaw rate and the estimated lateral acceleration to the reference yaw rate and the reference lateral acceleration, respectively, both reach a predetermined confidence level, estimating a longitudinal displacement, a lateral displacement and an azimuth offset of the vehicle over a future unit of time according to the estimated yaw rate and the estimated lateral acceleration,
estimating future position data of the reference point of the vehicle relative to the lane at a future unit time point according to the current position data stored in the storage unit and the estimated longitudinal displacement, lateral displacement and azimuth offset, and storing the future position data in the storage unit, wherein the future position data contains a longitudinal position value, a lateral position value and an azimuth angle,
estimating a total longitudinal displacement, a total lateral displacement and a total azimuth offset of the vehicle from the historical time point to the future unit time point according to the historical reference position data and the future position data, and
estimating, by coordinate transformation, future lane line data corresponding to the future unit time point according to the reference lane line data stored in the storage unit, the total longitudinal displacement, the total lateral displacement and the total azimuth offset, and storing the future lane line data in the storage unit.
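The last operation of claim 6, like step (F) of claim 1, shifts the stored reference lane line into the vehicle frame at the future unit time point by a coordinate transformation. The sketch below assumes, purely for illustration, that a lane line is represented as a cubic polynomial in the vehicle frame and is re-fitted after translating and rotating sample points by the total displacements and the total azimuth offset; the representation and the refit step are not taken from the patent.

```python
import numpy as np

def shift_lane_line(coeffs, dx_total, dy_total, dpsi_total, x_samples):
    """Re-express a lane line y = c0 + c1*x + c2*x^2 + c3*x^3, given in the
    historical vehicle frame, in the vehicle frame at the future unit time point.

    dx_total, dy_total : total longitudinal / lateral displacement of the vehicle (m)
    dpsi_total         : total azimuth offset (rad)
    x_samples          : longitudinal distances at which the old line is sampled
    """
    c0, c1, c2, c3 = coeffs
    x_old = np.asarray(x_samples, dtype=float)
    y_old = c0 + c1 * x_old + c2 * x_old ** 2 + c3 * x_old ** 3

    # Translate sample points to the displaced vehicle position, then rotate by
    # the accumulated azimuth offset to align with the future vehicle heading.
    x_shift = x_old - dx_total
    y_shift = y_old - dy_total
    cos_p, sin_p = np.cos(dpsi_total), np.sin(dpsi_total)
    x_new = cos_p * x_shift + sin_p * y_shift
    y_new = -sin_p * x_shift + cos_p * y_shift

    # Refit a cubic in the new frame to obtain the future lane line coefficients.
    return np.polyfit(x_new, y_new, 3)[::-1]   # back to (c0, c1, c2, c3) order
```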
7. The lane tracking system for an autonomous driving vehicle according to claim 6, wherein:
the storage unit further stores estimated lane line data that was previously estimated and that corresponds to the current time point;
when the processing unit receives, at the current time point, lane line data newly generated by the lane detection module from an image of the lane sensed at a historical unit time point within the historical period, the processing unit determines whether the received lane line data is usable according to the estimated lane line data stored in the storage unit and a predetermined reference condition related to an image sensing specification of the lane detection module; and
upon determining that the lane line data is usable, the processing unit updates the reference lane line data and the historical reference position data stored in the storage unit to the lane line data and the historical position data at the historical unit time point, respectively, before performing the estimation of the total longitudinal displacement, the total lateral displacement and the total azimuth offset of the vehicle.
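Claims 2 and 7 describe keeping a history of vehicle poses so that a camera result arriving late, having been sensed at an earlier unit time point, can replace the stored reference lane line data together with the matching historical pose once it passes the usability check. The bookkeeping below is a hypothetical sketch of that update; the class and method names are not from the patent.

```python
from collections import deque

class LaneTrackingBuffer:
    """Minimal bookkeeping for the update described in steps (G) and (H)."""

    def __init__(self, depth):
        self.poses = deque(maxlen=depth)   # (timestamp, pose) history
        self.reference_lane_data = None    # lane line data used for prediction
        self.reference_pose = None         # pose at the time that data was sensed

    def record_pose(self, timestamp, pose):
        self.poses.append((timestamp, pose))

    def try_update_reference(self, sensed_timestamp, new_lane_data, is_usable):
        # Adopt the newly received (but older) camera result only if it passed
        # the plausibility check against the currently estimated lane line.
        if not is_usable:
            return False
        for timestamp, pose in self.poses:
            if timestamp == sensed_timestamp:
                self.reference_lane_data = new_lane_data
                self.reference_pose = pose
                return True
        return False
```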
8. The lane tracking system for an autonomous driving vehicle according to claim 7, wherein:
the image sensing specification defines a predetermined shortest effective longitudinal distance and a predetermined longest effective longitudinal distance;
the predetermined reference condition includes a predetermined first width difference threshold and a predetermined second width difference threshold related to the width of the lane at the predetermined shortest effective longitudinal distance and at the predetermined longest effective longitudinal distance, respectively, and a predetermined first lateral offset threshold and a predetermined second lateral offset threshold related to the centerline of the lane at the predetermined shortest effective longitudinal distance and at the predetermined longest effective longitudinal distance, respectively;
the processing unit obtains, from the estimated lane line data, a first width value representing the width of the lane and a first position value representing the lateral position of the centerline of the lane at the predetermined shortest effective longitudinal distance, and a second width value representing the width of the lane and a second position value representing the lateral position of the centerline of the lane at the predetermined longest effective longitudinal distance, and obtains, from the lane line data, a third width value representing the width of the lane and a third position value representing the lateral position of the centerline of the lane at the predetermined shortest effective longitudinal distance, and a fourth width value representing the width of the lane and a fourth position value representing the lateral position of the centerline of the lane at the predetermined longest effective longitudinal distance;
the processing unit decides whether the lane line data is usable by determining whether a first difference between the first width value and the third width value is not greater than the predetermined first width difference threshold, whether a second difference between the second width value and the fourth width value is not greater than the predetermined second width difference threshold, whether a third difference between the first position value and the third position value is not greater than the predetermined first lateral offset threshold, and whether a fourth difference between the second position value and the fourth position value is not greater than the predetermined second lateral offset threshold; and
the lane line data is determined by the processing unit to be usable when the processing unit determines that the first difference, the second difference, the third difference and the fourth difference are not greater than the predetermined first width difference threshold, the predetermined second width difference threshold, the predetermined first lateral offset threshold and the predetermined second lateral offset threshold, respectively.

9. The lane tracking system for an autonomous driving vehicle according to claim 6, wherein the processing unit first filters noise out of the angular velocity and the acceleration by Kalman filtering, and then obtains the estimated yaw rate and the estimated lateral acceleration from the filtered angular velocity and the filtered acceleration by Kalman estimation.

10. The lane tracking system for an autonomous driving vehicle according to claim 6, wherein:
the dynamic sensing unit includes a steering wheel angle sensor for sensing the turning angle of the steering wheel of the vehicle, a vehicle speed sensor for sensing the travelling speed of the vehicle, and a set of wheel speed sensors for sensing the rotational speeds of all of the wheels of the vehicle, so that the vehicle dynamics data generated by the dynamic sensing unit includes at least a steering wheel angle, a vehicle speed, a rear-right wheel speed and a rear-left wheel speed; and
the processing unit obtains the reference yaw rate and the reference lateral acceleration either from the steering wheel angle and the vehicle speed contained in the vehicle dynamics data together with the steering ratio, the understeer coefficient and the wheelbase of the vehicle, or from the vehicle speed, the rear-right wheel speed and the rear-left wheel speed contained in the vehicle dynamics data together with the rear track width of the vehicle.
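Claims 4 and 9 only state that Kalman filtering is applied to the raw IMU signals before Kalman estimation of the yaw rate and lateral acceleration. As a stand-in, the sketch below shows a one-dimensional Kalman filter with a random-walk process model smoothing a noisy yaw-rate signal; the filter structure and the noise variances are assumptions, not values taken from the patent.

```python
class ScalarKalman:
    """One-dimensional Kalman filter used here only to smooth a noisy IMU signal."""

    def __init__(self, q, r, x0=0.0, p0=1.0):
        self.q = q    # process noise variance
        self.r = r    # measurement noise variance
        self.x = x0   # filtered estimate
        self.p = p0   # estimate variance

    def update(self, z):
        # Predict with a random-walk model, then correct with the new measurement z.
        self.p += self.q
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Example: smooth raw gyro yaw-rate samples before using them for prediction.
yaw_filter = ScalarKalman(q=1e-4, r=1e-2)
filtered = [yaw_filter.update(z) for z in (0.010, 0.012, 0.011, 0.013)]
```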
TW107141410A 2018-11-21 2018-11-21 Lane tracking method and system for autonomous driving vehicles TWI689433B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW107141410A TWI689433B (en) 2018-11-21 2018-11-21 Lane tracking method and system for autonomous driving vehicles

Publications (2)

Publication Number Publication Date
TWI689433B true TWI689433B (en) 2020-04-01
TW202019744A TW202019744A (en) 2020-06-01

Family

ID=71134138

Family Applications (1)

Application Number Title Priority Date Filing Date
TW107141410A TWI689433B (en) 2018-11-21 2018-11-21 Lane tracking method and system for autonomous driving vehicles

Country Status (1)

Country Link
TW (1) TWI689433B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101778753A (en) * 2007-08-15 2010-07-14 沃尔沃技术公司 Operating method and system for supporting lane keeping of a vehicle
CN104240536A (en) * 2013-06-20 2014-12-24 福特全球技术公司 Lane monitoring method with electronic horizon
US9469343B2 (en) * 2013-02-07 2016-10-18 Mando Corporation System, method, and computer-readable recording medium for lane keeping control
TW201819223A (en) * 2016-11-28 2018-06-01 財團法人車輛研究測試中心 Automatic tracking and controlling system and method of vehicle lanes for calculating a compensation angle information of a steering wheel

Also Published As

Publication number Publication date
TW202019744A (en) 2020-06-01

Similar Documents

Publication Publication Date Title
US9645250B2 (en) Fail operational vehicle speed estimation through data fusion of 6-DOF IMU, GPS, and radar
CN110031019B (en) Slip detection processing method for automatic driving vehicle
US10703365B1 (en) Lane tracking method and lane tracking system for an autonomous vehicle
JP7073052B2 (en) Systems and methods for measuring the angular position of a vehicle
US8280586B2 (en) Determination of the actual yaw angle and the actual slip angle of a land vehicle
JP7036080B2 (en) Inertial navigation system
US20050201593A1 (en) Vehicle state sensing system and vehicle state sensing method
EP3343173A1 (en) Vehicle position estimation device, vehicle position estimation method
CN110316197B (en) Tilt estimation method, tilt estimation device, and non-transitory computer-readable storage medium storing program
US10668928B2 (en) Method and device for estimating the friction values of a wheel of a vehicle against a substrate
JP6020729B2 (en) Vehicle position / posture angle estimation apparatus and vehicle position / posture angle estimation method
US9764744B2 (en) Vehicle yaw rate estimation system
TW201940370A (en) Vehicle operation based on vehicular measurement data processing
CN111923914A (en) Identification of the shoulder driving of a motor vehicle
CN111965390A (en) Wheel speed sensor fault detection method
CN112433531A (en) Trajectory tracking method and device for automatic driving vehicle and computer equipment
CN111795692A (en) Method and apparatus for parallel tracking and positioning via multi-mode SLAM fusion process
JP5402244B2 (en) Vehicle physical quantity estimation device
JP2016206976A (en) Preceding vehicle track calculation device for driving support control of vehicle
JPH10239334A (en) Arithmetic device of initial correction coefficient
US11731596B2 (en) Method for the traction control of a single-track motor vehicle taking the slip angle of the rear wheel into consideration
JP2008049828A (en) Yaw rate estimating device
TWI689433B (en) Lane tracking method and system for autonomous driving vehicles
CN111284496B (en) Lane tracking method and system for autonomous vehicle
JP7028223B2 (en) Self-position estimator