WO2012039496A1 - Travel path estimation apparatus and program - Google Patents
Travel path estimation apparatus and program
- Publication number
- WO2012039496A1 (PCT/JP2011/071898)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- runway
- track
- lane
- system noise
- parameter
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/77—Determining position or orientation of objects or cameras using statistical methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to a runway estimation apparatus and program, and more particularly, to a runway estimation apparatus and program that estimates runway parameters based on an image captured by an imaging apparatus.
- a vehicle travel path recognition device that detects a lane marker from an input image obtained by imaging a front of a vehicle with a CCD camera and calculates a road model parameter for representing a road shape in front of the vehicle by a Kalman filter based on a detection result of the lane marker.
- the estimated change in the road model parameter is defined as a discrete random walk model driven by fixed white Gaussian noise, assuming that it behaves stochastically.
- the present invention has been made to solve the above-described problems, and an object of the present invention is to provide a runway estimation device and program capable of stably estimating runway parameters.
- the runway estimation apparatus of the present invention includes: acquisition means for acquiring a captured image of the periphery of the host vehicle; extraction means for extracting feature points indicating a lane from the captured image acquired by the acquisition means; setting means for setting, based on the distribution of the feature points extracted by the extraction means, system noise indicating the degree of variation of the runway parameters when estimating runway parameters relating to the position and posture of the host vehicle with respect to the track on which it travels and to the shape and size of the track; and estimation means for estimating the runway parameters by probabilistic signal processing handling discrete-time signals, based on the feature points extracted by the extraction means, past estimation results of the runway parameters, and the system noise set by the setting means.
- the acquisition unit acquires a captured image obtained by imaging the periphery of the host vehicle, and the extraction unit extracts a feature point indicating a lane from the captured image acquired by the acquisition unit. Extraction of feature points indicating lanes is performed by first extracting edge points from the captured image and selecting feature points indicating lanes from the edge points based on the continuity and shape of the edge points.
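As an illustrative sketch (not part of the original disclosure), the horizontal-scan edge extraction described above might look like the following; the function name and the luminance-difference threshold of 30 are assumptions for illustration only:

```python
import numpy as np

def extract_edge_points(gray: np.ndarray, threshold: int = 30):
    """Scan each row of a grayscale image horizontally and collect
    pixels whose luminance changes sharply, as candidate feature points."""
    # Horizontal luminance difference between neighboring pixels.
    diff = np.abs(np.diff(gray.astype(np.int32), axis=1))
    ys, xs = np.nonzero(diff > threshold)
    # Return (x, y) coordinates of the candidate edge points.
    return list(zip(xs.tolist(), ys.tolist()))
```

The selection of lane boundary points from these candidates (by continuity and shape, as the text describes) would be a separate step applied to the returned coordinates.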
- the setting means sets system noise indicating the degree of variation of the runway parameters when estimating, based on the distribution of the feature points extracted by the extraction means, the runway parameters relating to the position and posture of the host vehicle with respect to the runway on which it travels and to the shape and size of the runway; the estimation means then estimates the runway parameters by probabilistic signal processing handling discrete-time signals, based on the extracted feature points, the past runway parameter estimation results, and the set system noise.
- since the system noise corresponding to each estimated runway parameter is set based on the distribution of the feature points indicating the lane extracted from the captured image, that is, the distribution of the observed values, the runway parameters can be estimated stably.
- the runway parameters relating to the position and posture of the host vehicle with respect to the runway may be the lateral position of the host vehicle with respect to the runway, the yaw angle with respect to the center line of the runway, and the pitch angle with respect to the plane of the runway; the runway parameters relating to the shape and size of the runway may be the curvature of the runway and the lane width of the runway.
- when the feature points are distributed only in the far region of the captured image, the setting means reduces each of the system noises corresponding to the curvature of the runway, the lane width of the runway, and the lateral position of the host vehicle with respect to the runway; when the feature points are distributed only in the near region, it reduces the system noise corresponding to the curvature of the runway; when only feature points indicating the left boundary of the lane or only feature points indicating the right boundary are distributed, it reduces each of the system noises corresponding to the lane width of the runway and the pitch angle with respect to the plane of the runway; and when fewer than a predetermined number of feature points exist, it reduces each of the system noises corresponding to all the runway parameters. Thereby, the runway parameters can be estimated stably even under observation conditions in which their estimation accuracy would otherwise fall.
- the runway estimation program causes a computer to function as: acquisition means for acquiring a captured image of the periphery of the host vehicle; extraction means for extracting feature points indicating a lane from the captured image acquired by the acquisition means; setting means for setting, based on the distribution of the feature points extracted by the extraction means, system noise indicating the degree of variation of the runway parameters when estimating runway parameters relating to the position and posture of the host vehicle with respect to the track on which it travels and to the shape and size of the track; and estimation means for estimating the runway parameters by probabilistic signal processing handling discrete-time signals, based on the extracted feature points, the past estimation results of the runway parameters, and the set system noise.
- according to the runway estimation apparatus and program of the present invention, the system noise corresponding to each runway parameter to be estimated is set based on the distribution of the feature points indicating the lane extracted from the captured image, that is, the distribution of the observed values, so the runway parameters can be estimated stably.
- the runway estimation device 10 of the present embodiment includes an imaging device 12 that continuously images a vehicle front area and a computer 16 that executes a process of estimating a runway parameter.
- the imaging device 12 includes an imaging unit (not shown) that captures a target area in front of the vehicle and generates an image signal, an A/D conversion unit (not shown) that converts the analog image signal generated by the imaging unit into a digital signal, and an image memory (not shown) that temporarily stores the A/D-converted image signal.
- the computer 16 includes a CPU 20 that controls the entire runway estimation apparatus 10, a ROM 22 that stores various programs such as a program for a runway estimation process routine described later, a RAM 24 that temporarily stores data as a work area, It includes a memory 26 as storage means in which various information is stored, an input / output port (I / O port) 28, and a bus that interconnects these.
- the imaging device 12 is connected to the I / O port 28.
- the runway estimation device 10 of the present embodiment extracts feature points (lane boundary points) indicating a lane from the captured image captured by the imaging device 12, and estimates the runway parameters using a Kalman filter with these feature points as observed values.
- the runway parameters relating to the position and posture of the host vehicle with respect to the runway on which it runs, and the runway parameters relating to the shape and size of that runway, are estimated. More specifically, the runway parameters relating to the position and posture of the host vehicle with respect to the runway are the lateral position e_k of the host vehicle with respect to the lane indicating the left boundary or the lane indicating the right boundary of the track, the yaw angle θ_k with respect to the center line of the track, and the pitch angle φ_k with respect to the plane of the track; the runway parameters relating to the shape and size of the track are the curvature c_k and the lane width w_k of the track.
- these five parameters are collectively referred to as the runway parameter x_k = (e_k, θ_k, φ_k, c_k, w_k).
- the computer 16 functionally comprises: a feature point extraction unit 30 that acquires the captured image and extracts feature points from it; a lane boundary point selection unit 32 that selects, from the extracted feature points, lane boundary points indicating a lane; a distribution determination unit 34 that determines the distribution of the lane boundary points; a system noise setting unit 36 that sets each system noise based on the distribution of the lane boundary points; and a runway parameter estimation unit 38 that estimates the runway parameters based on the lane boundary points, the past estimation results, and the set system noise.
- the feature point extraction unit 30 scans the captured image, as shown in FIG. 4A, in the horizontal direction and extracts, as feature points, edge points at which the luminance changes from pixel to pixel.
- FIG. 4C shows an example schematically representing the extracted feature points.
- the lane boundary point selection unit 32 selects lane boundary points indicating the lane from the feature points extracted by the feature point extraction unit 30 by discriminating the shape, width, color, and the like of continuously arranged edge points. When a plurality of lanes are present, the lane boundary points indicating the innermost pair of left and right lanes are selected.
- FIG. 5 shows an example schematically showing the selected lane boundary point.
- the feature point extraction unit 30 and the lane boundary point selection unit 32 are examples of the extraction means of the present invention.
- the distribution determination unit 34 determines what distribution the lane boundary points selected by the lane boundary point selection unit 32 have. In the present embodiment, it determines whether the lane boundary points are distributed in both the far and near regions, only in the far region, or only in the near region. It also determines whether the distribution contains both lane boundary points indicating the left boundary and lane boundary points indicating the right boundary, only points indicating the left boundary, or only points indicating the right boundary. It further determines whether the total number of selected lane boundary points is equal to or less than a predetermined number; a distribution in which the total number of lane boundary points is equal to or less than the predetermined number is referred to as a distribution without observed values.
- the determination as to whether or not there is a lane boundary point in the far and near areas will be described.
- the pixel at the upper left corner of the captured image is the origin, the horizontal direction is the x axis, and the vertical direction is the y axis.
- the boundary yc between the regions is set as yc = ya + k(yb − ya), where k is a value with 0 < k < 1, for example 1/3. Alternatively, the y coordinate on the captured image corresponding to a position 20 m from the imaging device 12 may be set as yc.
- the range where the y coordinate is ya to yc is the far region, and the range where the y coordinate is yc to yb is the near region.
- when the minimum y coordinate LF of the lane boundary points indicating the left boundary satisfies LF ≤ yc, it is determined that lane boundary points exist in the left far region; when the maximum y coordinate LN satisfies LN > yc, it is determined that lane boundary points exist in the left near region.
- similarly, the minimum value RF and the maximum value RN of the y coordinates of the lane boundary points indicating the right boundary are compared with yc to determine whether lane boundary points exist in the right far region and the right near region.
- alternatively, y-coordinate thresholds Tf and Tn may be provided for determining whether lane boundary points exist in the far and near regions: if LF (or RF) ≤ Tf, it may be determined that lane boundary points exist in the far region, and if LN (or RN) ≥ Tn, that they exist in the near region.
- the distribution of the lane boundary points is thereby classified into the patterns shown in FIGS. 7A to 7P.
- FIG. 7P shows a distribution without observed values.
- the system noise setting unit 36 sets the system noise corresponding to each estimated runway parameter x_k based on the distribution determined by the distribution determination unit 34.
- the system noise indicates the degree to which a runway parameter is allowed to change when the current value is estimated by updating the previous estimation result with the current observation.
- when the lane boundary points are distributed in both the far and near regions and on both the left and right sides, all the runway parameters x_k can be estimated stably; under other distributions, the estimation accuracy of some runway parameters, such as the lateral position e_k, decreases.
- FIG. 8 shows an example of a system noise setting method.
- for example, the pitch angle φ_k and the lane width w_k can be estimated stably only when lane boundary points indicating both the left and right boundaries are present, and the curvature c_k can be estimated stably only when the lane boundary points are distributed from the near region to the far region.
- the distribution pattern of the lane boundary points determined by the distribution determination unit 34 is classified into one of the following distribution types: "far and near, left and right", in which lane boundary points exist in all regions; "far only", in which they exist only in the far region; "near only", in which they exist only in the near region; "left and right", in which both left-boundary and right-boundary lane boundary points exist; "one side only", in which only left-boundary or only right-boundary lane boundary points exist; and "no observed values", in which the total number of lane boundary points is equal to or less than the predetermined number.
- the letters below the distribution type names in FIG. 8 correspond to the distribution patterns in FIGS. 7A to 7P; "left and right" does not include the patterns of FIGS.
- the patterns of FIGS. 7A, 7F, and 7J correspond to both "far only" and "one side only".
- the patterns of FIGS. 7L and 7O correspond to both "near only" and "one side only".
- a system noise setting method corresponding to each runway parameter is determined for each distribution type. For example, in the case of "far only", the system noise corresponding to the lateral position e_k, the curvature c_k, and the lane width w_k is set small.
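The per-distribution-type setting can be viewed as a simple lookup table, as the following sketch illustrates. The base noise magnitudes and the shrink factor here are placeholder values, not values from the patent; the mapping of which parameters are shrunk follows the reductions described in the text (far only → lateral position, curvature, lane width; near only → curvature; one side only → lane width, pitch angle; no observations → all):

```python
import numpy as np

PARAMS = ["e", "theta", "phi", "c", "w"]  # lateral pos., yaw, pitch, curvature, lane width
BASE_NOISE = {"e": 0.1, "theta": 0.01, "phi": 0.01, "c": 1e-4, "w": 0.05}  # illustrative
REDUCED = {  # parameters whose system noise is made small, per distribution type
    "far_only":  {"e", "c", "w"},
    "near_only": {"c"},
    "one_side":  {"w", "phi"},
    "no_obs":    set(PARAMS),
}

def system_noise_cov(dist_types, shrink=0.01):
    """Diagonal system-noise covariance; entries for parameters that cannot
    be estimated reliably under the observed distribution are shrunk."""
    reduced = set().union(*(REDUCED.get(t, set()) for t in dist_types))
    diag = [BASE_NOISE[p] * (shrink if p in reduced else 1.0) for p in PARAMS]
    return np.diag(diag)
```

A pattern may belong to two types at once (e.g. "far only" and "one side only"), in which case the reduced parameter sets are united.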
- the distribution determination unit 34 and the system noise setting unit 36 are examples of the setting unit of the present invention.
- the runway parameter estimation unit 38 estimates the runway parameter x_k using a Kalman filter according to equations (1) to (5), in which:
- x_k is the internal state (runway parameter) at time k
- f_k is the state transition function
- h_k is the observation function
- F_k is the state transition matrix at time k
- G_k is the driving matrix at time k
- H_k is the observation matrix
- P_k|k is the estimated error covariance matrix at time k
- P_k|k−1 is the prediction error covariance matrix at time k
- Σ_wk is the system noise covariance matrix at time k
- Σ_vk is the observation noise covariance matrix at time k.
- the system noise set by the system noise setting unit 36 is Σ_wk in equation (5). The coordinates of the lane boundary points are then given as the observed value y_k, and the runway parameter x_k is estimated.
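As a generic sketch of one Kalman predict/update cycle (a standard linear formulation, not the patent's exact equations (1) to (5)), the role of Σ_wk and Σ_vk can be shown as follows; all variable names mirror the definitions above:

```python
import numpy as np

def kalman_step(x, P, y, F, G, H, Q_w, R_v):
    """One predict/update cycle of a linear Kalman filter. Q_w plays the
    role of the system-noise covariance Sigma_wk set by the system noise
    setting unit; R_v that of the observation-noise covariance Sigma_vk."""
    # Prediction of the state and its error covariance
    x_pred = F @ x
    P_pred = F @ P @ F.T + G @ Q_w @ G.T
    # Update with the lane-boundary-point observation y
    S = H @ P_pred @ H.T + R_v
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

The effect of reducing Σ_wk is visible here: a small Q_w keeps P_pred small, so the gain K is small and the new estimate stays close to the previous one rather than following the observation.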
- This routine is performed by the CPU 20 executing the runway estimation program stored in the ROM 22.
- in step 100, a captured image captured by the imaging device 12 is acquired, and in step 102, edge points, which are points of luminance change between adjacent pixels of the captured image, are extracted as feature points.
- in step 104, lane boundary points indicating the lane are selected by discriminating the shape, width, color, and the like of continuously arranged edge points among the feature points extracted in step 102.
- in step 106, the distribution of the lane boundary points selected in step 104 is determined. First, it is determined whether the total number of lane boundary points is equal to or less than the predetermined number; if so, the distribution is judged to be the no-observed-values distribution of the pattern of FIG. 7P. Next, if the minimum y coordinate LF of the lane boundary points indicating the left boundary satisfies LF ≤ yc, it is determined that lane boundary points exist in the left far region, and if the maximum y coordinate LN satisfies LN > yc, it is determined that lane boundary points exist in the left near region.
- similarly, the minimum value RF and the maximum value RN of the y coordinates of the lane boundary points indicating the right boundary are compared with yc to determine whether lane boundary points exist in the right far region and the right near region.
- in step 108, the system noise corresponding to each estimated runway parameter x_k is set: the system noise setting corresponding to the determined distribution pattern of the lane boundary points is read and set as Σ_wk in equation (5) above. For example, when the distribution type corresponds to "far only", the system noise corresponding to the lateral position e_k, the curvature c_k, and the lane width w_k is set small; when it corresponds to both "far only" and "one side only", the system noise corresponding to the pitch angle φ_k and the lane width w_k is also set small.
- in step 110, the coordinates of the lane boundary points selected in step 104 are given as the observed value y_k, the runway parameter x_k is estimated according to equations (1) to (5) above, and the estimation result is output.
- the output estimation result can be displayed on a display device (not shown) or used as input data for a vehicle motion control system that controls vehicle motion.
- the runway estimation apparatus of the present embodiment determines, based on the distribution of the lane boundary points, that is, the distribution of the observed values, whether the observation situation is one in which the estimation accuracy of each runway parameter decreases; since the system noise setting corresponding to any runway parameter whose estimation accuracy would decrease is reduced so as to suppress fluctuation of that parameter, the runway parameters can be estimated stably.
- the runway parameter is estimated using the Kalman filter.
- a filter based on other stochastic (statistical) signal processing that handles discrete time signals may be used.
- a particle filter can be used.
- when a particle filter is used: (1) the probability of a runway parameter value is represented by the size (weight) of a particle; (2) the runway parameters at the next time are predicted; (3) the runway parameters are diffused stochastically, over a wide range when the change is large and over a narrow range when the change is small, and this diffusion width corresponds to the system noise of the present invention; (4) the likelihood of each runway parameter value is weighted using the observed values (lane boundary points); and (5) the probability distribution of the runway parameters updated by the observed values is calculated.
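The particle-filter cycle above can be sketched as follows. This is an illustrative one-dimensional reconstruction under assumed details (Gaussian likelihood, fixed observation noise, a fixed random seed); in the apparatus the state would be the five-dimensional runway parameter and the diffusion width would vary per parameter as the system noise does:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def particle_filter_step(particles, weights, y, diffusion_std, obs_std=0.5):
    """One cycle: (2)-(3) diffuse the particles with a spread playing the
    role of the system noise, (4) reweight by likelihood against the
    observation y, (5) resample to obtain the updated distribution."""
    # (2)+(3) predict by stochastic diffusion
    particles = particles + rng.normal(0.0, diffusion_std, size=particles.shape)
    # (4) weight each particle by its (assumed Gaussian) likelihood given y
    weights = weights * np.exp(-0.5 * ((particles - y) / obs_std) ** 2)
    weights = weights / weights.sum()
    # (5) resample according to the weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

A narrow `diffusion_std` holds the estimate near its previous value, which is the same stabilizing effect that reducing Σ_wk has in the Kalman-filter embodiment.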
- in the embodiment described above, the distribution of the lane boundary points is determined according to whether they exist in the far region or the near region and on the left side or the right side; however, the distribution may be determined by dividing the image into more detailed regions, or only the far region or only the near region may be considered. Any criterion for the distribution of the lane boundary points serving as the observed values can be used.
- the program of the present invention may be provided by being stored in a recording medium or provided via a wired or wireless communication means. Further, the present invention is not limited to implementation by software configuration, and may be implemented by hardware configuration or a combination of hardware configuration and software configuration.
Description
12 imaging device
16 computer
30 feature point extraction unit
32 lane boundary point selection unit
34 distribution determination unit
36 system noise setting unit
38 runway parameter estimation unit
Claims (5)
- acquisition means for acquiring a captured image of the periphery of the host vehicle;
extraction means for extracting feature points indicating a lane from the captured image acquired by the acquisition means;
setting means for setting, based on the distribution of the feature points extracted by the extraction means, system noise indicating the degree of variation of runway parameters relating to the position and posture of the host vehicle with respect to the runway on which it travels and to the shape and size of the runway, when estimating those runway parameters; and
estimation means for estimating the runway parameters by probabilistic signal processing handling discrete-time signals, based on the feature points extracted by the extraction means, past estimation results of the runway parameters, and the system noise set by the setting means;
a runway estimation device comprising the above. - The runway estimation device according to claim 1, wherein the runway parameters relating to the position and posture of the host vehicle with respect to the runway are the lateral position of the host vehicle with respect to the runway, the yaw angle with respect to the center line of the runway, and the pitch angle with respect to the plane of the runway, and the runway parameters relating to the shape and size of the runway are the curvature of the runway and the lane width of the runway.
- The runway estimation device according to claim 2, wherein, when the feature points are distributed only in the far region of the captured image, the setting means reduces each of the system noises corresponding to the curvature of the runway, the lane width of the runway, and the lateral position of the host vehicle with respect to the runway; when the feature points are distributed only in the near region of the captured image, it reduces the system noise corresponding to the curvature of the runway; when only feature points indicating the left boundary of the lane or only feature points indicating the right boundary are distributed, it reduces each of the system noises corresponding to the lane width of the runway and the pitch angle with respect to the plane of the runway; and when fewer than a predetermined number of feature points exist, it reduces each of the system noises corresponding to all the runway parameters.
- A runway estimation program for causing a computer to function as:
acquisition means for acquiring a captured image of the periphery of the host vehicle;
extraction means for extracting feature points indicating a lane from the captured image acquired by the acquisition means;
setting means for setting, based on the distribution of the feature points extracted by the extraction means, system noise indicating the degree of variation of runway parameters relating to the position and posture of the host vehicle with respect to the runway on which it travels and to the shape and size of the runway, when estimating those runway parameters; and
estimation means for estimating the runway parameters by probabilistic signal processing handling discrete-time signals, based on the feature points extracted by the extraction means, past estimation results of the runway parameters, and the system noise set by the setting means. - A runway estimation program for causing a computer to function as each of the means constituting the runway estimation device according to any one of claims 1 to 3.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112013006124-3A BR112013006124B1 (pt) | 2010-09-24 | 2011-09-26 | aparelho de estimativa de percurso |
CN201180042149.0A CN103098111B (zh) | 2010-09-24 | 2011-09-26 | 行驶道路推定装置及程序 |
EP11826938.0A EP2620930B1 (en) | 2010-09-24 | 2011-09-26 | Track estimation device and program |
US13/820,030 US8948455B2 (en) | 2010-09-24 | 2011-09-26 | Travel path estimation apparatus and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-214025 | 2010-09-24 | ||
JP2010214025A JP5258859B2 (ja) | 2010-09-24 | 2010-09-24 | 走路推定装置及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012039496A1 true WO2012039496A1 (ja) | 2012-03-29 |
Family
ID=45873975
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/071898 WO2012039496A1 (ja) | 2010-09-24 | 2011-09-26 | 走路推定装置及びプログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US8948455B2 (ja) |
EP (1) | EP2620930B1 (ja) |
JP (1) | JP5258859B2 (ja) |
CN (1) | CN103098111B (ja) |
BR (1) | BR112013006124B1 (ja) |
WO (1) | WO2012039496A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014091877A1 (ja) * | 2012-12-12 | 2014-06-19 | 日産自動車株式会社 | 移動物体位置姿勢角推定装置及び移動物体位置姿勢角推定方法 |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4378186B2 (ja) | 2004-02-06 | 2009-12-02 | キヤノン株式会社 | 有機el素子アレイ |
JP5970811B2 (ja) | 2011-12-28 | 2016-08-17 | セイコーエプソン株式会社 | 発光素子、発光装置および電子機器 |
RU2647688C2 (ru) * | 2013-09-27 | 2018-03-16 | Ниссан Мотор Ко., Лтд. | Система представления информации |
JP6046666B2 (ja) * | 2014-06-24 | 2016-12-21 | トヨタ自動車株式会社 | 走路境界推定装置及び走路境界推定方法 |
KR20160059376A (ko) * | 2014-11-18 | 2016-05-26 | 엘지전자 주식회사 | 전자 기기 및 그 제어방법 |
JP6363518B2 (ja) * | 2015-01-21 | 2018-07-25 | 株式会社デンソー | 区画線認識装置 |
CN108885831B (zh) * | 2016-03-24 | 2020-04-14 | 日产自动车株式会社 | 行进路检测方法及行进路检测装置 |
CN106092121B (zh) * | 2016-05-27 | 2017-11-24 | 百度在线网络技术(北京)有限公司 | 车辆导航方法和装置 |
JP6293213B2 (ja) * | 2016-08-01 | 2018-03-14 | 三菱電機株式会社 | 車線区画線検知補正装置、車線区画線検知補正方法、及び自動運転システム |
JP6637399B2 (ja) * | 2016-09-30 | 2020-01-29 | 株式会社デンソー | 領域認識装置及び領域認識方法 |
JP2019011971A (ja) * | 2017-06-29 | 2019-01-24 | 株式会社東芝 | 推定システムおよび自動車 |
US10586118B2 (en) * | 2018-01-13 | 2020-03-10 | Toyota Jidosha Kabushiki Kaisha | Localizing traffic situation using multi-vehicle collaboration |
US10916135B2 (en) | 2018-01-13 | 2021-02-09 | Toyota Jidosha Kabushiki Kaisha | Similarity learning and association between observations of multiple connected vehicles |
US10963706B2 (en) | 2018-01-13 | 2021-03-30 | Toyota Jidosha Kabushiki Kaisha | Distributable representation learning for associating observations from multiple vehicles |
CN112733874B (zh) * | 2020-10-23 | 2023-04-07 | 招商局重庆交通科研设计院有限公司 | 基于知识图谱推理的可疑车辆判别方法 |
KR102371849B1 (ko) * | 2020-10-29 | 2022-03-11 | 주식회사 드림티엔에스 | 차선추출 방법 |
CN112465831B (zh) * | 2020-11-16 | 2023-10-20 | 北京中科慧眼科技有限公司 | 基于双目立体相机的弯道场景感知方法、系统和装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08261756A (ja) * | 1994-11-10 | 1996-10-11 | Toyota Central Res & Dev Lab Inc | 走行レーン認識装置 |
JPH1123291A (ja) * | 1997-07-04 | 1999-01-29 | Nissan Motor Co Ltd | 車両用画像処理装置 |
JP2000036037A (ja) * | 1998-07-16 | 2000-02-02 | Toyota Central Res & Dev Lab Inc | 走行路認識装置 |
JP2002109695A (ja) | 2000-10-02 | 2002-04-12 | Nissan Motor Co Ltd | 車両の走行路認識装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3538476B2 (ja) * | 1995-05-12 | 2004-06-14 | 本田技研工業株式会社 | 車両の走行路区分線などの認識装置 |
DE69736764T2 (de) * | 1996-08-28 | 2007-01-25 | Matsushita Electric Industrial Co., Ltd., Kadoma | Lokales Positionierungsgerät und Verfahren dafür |
JP3695315B2 (ja) * | 2000-11-14 | 2005-09-14 | 日産自動車株式会社 | 車両用表示装置 |
JP4086759B2 (ja) * | 2002-11-05 | 2008-05-14 | ダイハツ工業株式会社 | 道路モデルの推定装置及びその推定方法 |
US7660436B2 (en) * | 2003-06-13 | 2010-02-09 | Sarnoff Corporation | Stereo-vision based imminent collision detection |
JP3864945B2 (ja) | 2003-09-24 | 2007-01-10 | アイシン精機株式会社 | 路面走行レーン検出装置 |
JP4390631B2 (ja) * | 2004-06-02 | 2009-12-24 | トヨタ自動車株式会社 | 境界線検出装置 |
JP4659631B2 (ja) * | 2005-04-26 | 2011-03-30 | 富士重工業株式会社 | 車線認識装置 |
CN100535954C (zh) * | 2005-08-30 | 2009-09-02 | 珠海金联安软件有限公司 | 多功能行车、停车引导报警系统 |
CN100403332C (zh) * | 2006-11-02 | 2008-07-16 | 东南大学 | 用于车道偏离报警的车道线鲁棒识别方法 |
JP2009154647A (ja) * | 2007-12-26 | 2009-07-16 | Aisin Aw Co Ltd | マルチ画面表示装置及びそのプログラム |
JP5058002B2 (ja) * | 2008-01-21 | 2012-10-24 | 株式会社豊田中央研究所 | 物体検出装置 |
JP2010191661A (ja) * | 2009-02-18 | 2010-09-02 | Nissan Motor Co Ltd | 走行路認識装置、自動車及び走行路認識方法 |
-
2010
- 2010-09-24 JP JP2010214025A patent/JP5258859B2/ja active Active
-
2011
- 2011-09-26 BR BR112013006124-3A patent/BR112013006124B1/pt not_active IP Right Cessation
- 2011-09-26 WO PCT/JP2011/071898 patent/WO2012039496A1/ja active Application Filing
- 2011-09-26 EP EP11826938.0A patent/EP2620930B1/en active Active
- 2011-09-26 US US13/820,030 patent/US8948455B2/en active Active
- 2011-09-26 CN CN201180042149.0A patent/CN103098111B/zh active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2620930A4 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014091877A1 (ja) * | 2012-12-12 | 2014-06-19 | 日産自動車株式会社 | 移動物体位置姿勢角推定装置及び移動物体位置姿勢角推定方法 |
CN104854637A (zh) * | 2012-12-12 | 2015-08-19 | 日产自动车株式会社 | 移动物体位置姿态角推定装置及移动物体位置姿态角推定方法 |
JP5962771B2 (ja) * | 2012-12-12 | 2016-08-03 | 日産自動車株式会社 | 移動物体位置姿勢角推定装置及び移動物体位置姿勢角推定方法 |
CN104854637B (zh) * | 2012-12-12 | 2016-08-24 | 日产自动车株式会社 | 移动物体位置姿态角推定装置及移动物体位置姿态角推定方法 |
JPWO2014091877A1 (ja) * | 2012-12-12 | 2017-01-05 | 日産自動車株式会社 | 移動物体位置姿勢角推定装置及び移動物体位置姿勢角推定方法 |
US9740942B2 (en) | 2012-12-12 | 2017-08-22 | Nissan Motor Co., Ltd. | Moving object location/attitude angle estimation device and moving object location/attitude angle estimation method |
Also Published As
Publication number | Publication date |
---|---|
BR112013006124A2 (pt) | 2016-05-31 |
JP2012068961A (ja) | 2012-04-05 |
EP2620930B1 (en) | 2016-08-31 |
CN103098111B (zh) | 2015-12-16 |
EP2620930A4 (en) | 2015-09-16 |
US20130177211A1 (en) | 2013-07-11 |
JP5258859B2 (ja) | 2013-08-07 |
US8948455B2 (en) | 2015-02-03 |
EP2620930A1 (en) | 2013-07-31 |
BR112013006124B1 (pt) | 2021-03-09 |
CN103098111A (zh) | 2013-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012039496A1 (ja) | 走路推定装置及びプログラム | |
US7209832B2 (en) | Lane recognition image processing apparatus | |
US9818301B2 (en) | Lane correction system, lane correction apparatus and method of correcting lane | |
EP3070430B1 (en) | Moving body position estimation device and moving body position estimation method | |
US7346190B2 (en) | Traffic line recognition device | |
JP4416039B2 (ja) | 縞模様検知システム、縞模様検知方法および縞模様検知用プログラム | |
US20150367781A1 (en) | Lane boundary estimation device and lane boundary estimation method | |
KR101483742B1 (ko) | 지능형 차량의 차선 검출방법 | |
JP2011243161A (ja) | 車線境界検出装置、車線境界検出プログラム | |
JP2011022995A (ja) | 消失点推定装置およびプログラム | |
US11462052B2 (en) | Image processing device, image processing method, and recording medium | |
JP2007011490A (ja) | 道路パラメータ検出装置および道路パラメータ検出方法 | |
Cerri et al. | Free space detection on highways using time correlation between stabilized sub-pixel precision IPM images | |
US11354794B2 (en) | Deposit detection device and deposit detection method | |
US20210090260A1 (en) | Deposit detection device and deposit detection method | |
US10853947B2 (en) | Device, method, and program for detecting boundary lines included in captured images | |
JP6492603B2 (ja) | 画像処理装置、システム、画像処理方法、およびプログラム | |
JP5760523B2 (ja) | 走路推定装置及びプログラム | |
JP2005157733A (ja) | 車線認識装置および車線認識方法 | |
CN113345087A (zh) | 基于单目视觉的地表模型拟合方法和装置 | |
JP5472137B2 (ja) | 境界検出装置、および境界検出プログラム | |
JP2008257399A (ja) | 画像処理装置 | |
KR102629639B1 (ko) | 차량용 듀얼 카메라 장착 위치 결정 장치 및 방법 | |
KR20230094853A (ko) | 듀얼 카메라를 이용하여 거리를 계산하는 고속도로 주행지원 시스템의 성능 평가 장치와 그 방법 | |
JP5614100B2 (ja) | 画像処理装置及び移動体位置推定方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180042149.0 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11826938 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13820030 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2011826938 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011826938 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112013006124 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112013006124 Country of ref document: BR Kind code of ref document: A2 Effective date: 20130314 |