WO2014050285A1 - Stereoscopic camera device - Google Patents

Stereoscopic camera device

Info

Publication number
WO2014050285A1
Authority
WO
WIPO (PCT)
Prior art keywords
processing unit
vehicle
stereo camera
image
camera device
Prior art date
Application number
PCT/JP2013/070273
Other languages
English (en)
Japanese (ja)
Inventor
寛人 三苫
春樹 的野
Original Assignee
日立オートモティブシステムズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社 filed Critical 日立オートモティブシステムズ株式会社
Publication of WO2014050285A1

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle

Definitions

  • The present invention relates to a stereo camera device that detects objects using images captured by a plurality of imaging elements.
  • Patent Document 1 discloses a traveling-guidance obstacle detection device. Based on a pair of images, the device calculates a real-space distance for each pixel, extracts a three-dimensional-object region on the screen from the distance and luminance value of each pixel, and calculates the real-space size of the object from the region and the distance; it judges the object to be a traveling-guidance obstacle if that size is within a preset threshold and the vertical luminance change in the region has a certain pattern. By selecting candidates by their calculated real-space size and checking whether the vertical luminance change has a certain pattern, the device can accurately detect a traveling-guidance obstacle in front of the host vehicle, such as a lane separator or a pylon, without misrecognizing other vehicles.
  • Patent Document 2 discloses an obstacle detection system capable of detecting obstacles with high accuracy and at high speed. Candidate regions are specified by binarizing the images captured by two far-infrared imaging devices, the distance to a representative point of each identified candidate region is estimated by the principle of triangulation, and an obstacle is detected by matching the candidate region against a template whose size is determined from the estimated distance. Because the candidate regions are specified by binarization, the distance to each candidate region is obtained by stereo vision, and matching uses a template scaled to that distance, obstacles can be detected quickly and accurately.
  • In the device of Patent Document 1, a real-space distance is calculated for each pixel based on a pair of images, but the unit of distance calculation is a block of 4 × 4 pixels. In a nearby region the target is larger than the block and the image is sharp, so the target region can be extracted. In a distant region, however, the target is smaller than the block and the image is blurred; the distance accuracy and the separation from other objects deteriorate, the blocks cannot be grouped, and the target region cannot be extracted.
  • In the system of Patent Document 2, the candidate region is specified by binarizing the images captured by the two far-infrared imaging devices. If a single person stands in a nearby region, the region can be specified; but if several people stand close to one another, even nearby, binarization merges them into a single region, pattern matching is performed on that merged region, and the people cannot be recognized correctly.
  • The present invention has been made in view of such circumstances, and its object is to provide a stereo camera device that can detect a recognition target more accurately, whether the target is far away or targets are in close proximity to one another.
  • To this end, a stereo camera device of the present invention includes: a first imaging unit; a second imaging unit; a target region identification processing unit that identifies, from a first image acquired by the first imaging unit, the region in which a recognition target exists and the type of the recognition target; a distance calculation processing unit that calculates the distance to the recognition target from the image of that region and a second image acquired by the second imaging unit; a feature calculation processing unit that calculates feature quantities of the recognition target in real space from the distance calculated by the distance calculation processing unit and the region identified by the target region identification processing unit; and a target feature verification processing unit that verifies the type identified by the target region identification processing unit against the feature quantities calculated by the feature calculation processing unit.
  • FIG. 1 shows an overall view of a stereo camera device 100 according to the present embodiment.
  • The stereo camera device 100 has two imaging units: a camera 101 as the first imaging unit (left imaging unit) and a camera 102 as the second imaging unit (right imaging unit). An image acquisition unit 103 and an image acquisition unit 104 acquire the images (the first image and the second image) captured by the camera 101 and the camera 102, respectively.
  • The vehicle region identification processing unit 105 identifies the vehicle region in one of the acquired images and recognizes it as a vehicle; the vehicle distance calculation processing unit 106 then calculates the real-space distance to the vehicle by stereo matching between the image of the identified vehicle region and the image obtained by the image acquisition unit 104.
  • The vehicle feature calculation processing unit 107 calculates the real-space size of the vehicle from the region on the image obtained by the vehicle region identification processing unit 105 and the distance obtained by the vehicle distance calculation processing unit 106.
  • The vehicle feature verification processing unit 108 verifies, from the calculated real-space size, that what the vehicle region identification processing unit 105 recognized really is a vehicle. If the verification succeeds, the control target selection processing unit 109 decides whether the vehicle should become a control target; if it should, the information is transmitted to the vehicle control device 110.
  • The camera 101 and the camera 102 are mounted in the vehicle interior at a fixed interval so as to photograph the area ahead of the vehicle.
  • These cameras contain imaging elements such as CCD or CMOS sensors, are synchronized with each other, and sample images at the same timing.
  • The image acquisition unit 103 and the image acquisition unit 104 acquire the images captured by the camera 101 and the camera 102, respectively, and convert them to digital values so that subsequent processing units can handle them.
  • The images captured by the camera 101 and the camera 102 are also corrected to eliminate differences in imaging environment and imaging characteristics between the two cameras, and the corrected images are passed to the next stage.
  • The vehicle region identification processing unit 105 identifies the region 202 of the preceding vehicle 201 shown in FIG. 2 in the original image 200 captured by the camera 101 and recognizes it as a vehicle.
  • To do so, the angle of each edge is calculated from the original image 200 using, for example, a Sobel filter, and only edges whose angle is within ±15 [deg] of vertical are extracted, producing the vertical edge image 203 that contains only near-vertical edges (a sketch of this step follows below).
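As a concrete illustration of the edge-angle filtering just described, here is a minimal sketch, not the patent's implementation; it assumes OpenCV and NumPy, and the function name, magnitude threshold, and default tolerance are illustrative choices.

```python
import cv2
import numpy as np

def extract_vertical_edges(gray, angle_tol_deg=15.0, mag_thresh=50.0):
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)  # horizontal gradient
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)  # vertical gradient
    mag = np.hypot(gx, gy)
    # An edge is perpendicular to its gradient: a vertical edge has a mostly
    # horizontal gradient, i.e. |gy| small relative to |gx|.
    angle_from_vertical = np.degrees(np.arctan2(np.abs(gy), np.abs(gx)))
    mask = (mag > mag_thresh) & (angle_from_vertical <= angle_tol_deg)
    return mask.astype(np.uint8) * 255  # vertical-edge image (cf. image 203)
```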
  • Next, a vehicle template prepared in advance as a learning value is scaled to the width [pix] on the image (the number of pixels between the vertical edge 205 and the vertical edge 206).
  • The template is aligned with the vertical edge 207 and the vertical edge 208, pattern matching is performed while shifting the vertical position pixel by pixel from the top to the bottom of the original image 200, and the vertical position at which the degree of correlation becomes maximal is recorded together with that degree of correlation.
  • The pattern matching method is not particularly limited.
  • If the maximum degree of correlation exceeds a threshold, the region is recognized as a vehicle and its position is taken as the vehicle region. Performing this for all pairs of vertical edges identifies the vehicle region 202 in the original image 200 and recognizes it as a vehicle (see the sketch below).
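A hedged sketch of this matching loop under the same assumptions as above; the patent leaves the matching method open, so normalized correlation via cv2.matchTemplate stands in for it, and match_vehicle, its threshold, and the template handling are illustrative.

```python
import cv2

def match_vehicle(gray, template, x_left, x_right, corr_thresh=0.7):
    width = x_right - x_left                      # width [pix] between edges
    scale = width / template.shape[1]
    tmpl = cv2.resize(template, (width, int(template.shape[0] * scale)))
    strip = gray[:, x_left:x_right]               # region between the edge pair
    if strip.shape[0] < tmpl.shape[0]:
        return None
    # Slide the template vertically and keep the best normalized correlation.
    res = cv2.matchTemplate(strip, tmpl, cv2.TM_CCOEFF_NORMED)
    _, max_corr, _, max_loc = cv2.minMaxLoc(res)
    if max_corr < corr_thresh:                    # not vehicle-like enough
        return None
    y = max_loc[1]                                # vertical position of best match
    return (x_left, y, width, tmpl.shape[0], max_corr)
```

Running this for every pair of vertical edges mirrors the exhaustive pairing described in the text.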
  • This embodiment uses the combination of vertical edges and vehicle pattern matching, but any other technique can be used as long as the region and type can be recognized from the original image 200.
  • For example, a technique may be used in which templates of a person, a motorcycle, a sign, and the like are exhaustively scanned over the original image by pattern matching to specify the region and type.
  • Here the target is simply recognized as a vehicle, but it may instead be classified into broad categories such as light vehicle, ordinary vehicle, or truck, or classified even more finely.
  • The recognized type is used later by the vehicle feature verification processing unit 108.
  • In this embodiment the image acquired by the image acquisition unit 103 is processed, but the image acquired by the image acquisition unit 104 may be processed instead.
  • Alternatively, the vehicle region may be identified in both images and the result with the higher degree of correlation used. By specifying the region through pattern matching in this way, even two people standing so close together that binarization cannot separate them can be separated and identified.
  • The vehicle distance calculation processing unit 106 calculates the real-space distance to the vehicle using the image of the vehicle region obtained by the vehicle region identification processing unit 105 and the image obtained by the image acquisition unit 104.
  • FIG. 3 shows a general distance measurement principle using a stereo camera.
  • In the figure, D is the distance from the plane of the lens 302 and the lens 303 to the measurement point 301; f is the distance (focal length) from the lenses to the imaging surface 304 and the imaging surface 305; b is the distance (base line length) between the centers of the lens 302 and the lens 303; and d is the difference (parallax) between the position at which the measurement point 301 is imaged on the imaging surface 304 through the lens 302 and the position at which it is imaged on the imaging surface 305 through the lens 303. From the similarity of triangles, these quantities satisfy

    D = b × f / d    (1)
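A worked example of formula (1) in code; the baseline, the focal length expressed in pixel units, and the parallax below are assumed values chosen only for illustration.

```python
baseline_m = 0.35          # b: distance between lens centers (assumed)
focal_px = 1400.0          # f: focal length in pixel units (assumed)
parallax_px = 12.25        # d: disparity between left and right images

distance_m = baseline_m * focal_px / parallax_px   # formula (1)
print(f"D = {distance_m:.1f} m")                   # -> D = 40.0 m
```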
  • Applying the vehicle region identification processing unit 105 to the left image 403 yields the vehicle region 401.
  • A region at the same height on the image as the vehicle region 401 is then searched for in the right image 404; the region 402 of the same target vehicle is found, and the parallax d is calculated from the horizontal displacement between the vehicle region 401 and the vehicle region 402.
  • If the maximum degree of coincidence found in this search is lower than a predetermined threshold, the result recognized by the vehicle region identification processing unit 105 is judged to be a misrecognition and is excluded from subsequent processing.
  • The degree of coincidence can be calculated using, for example, the sum of absolute values of luminance differences between pixels (Sum of Absolute Differences: SAD). Since the parallax d is thus obtained, the real-space distance D to the vehicle is calculated using formula (1). Because the parallax is obtained over a region of the size of the vehicle image, a matching degree over the whole region is obtained, rather than the purely local one of, say, a block-wise parallax calculation in units of 4 × 4 pixels.
  • Furthermore, by fitting, for example, a quadratic function to the SAD results around the parallax with the highest degree of coincidence and taking its vertex, the parallax can be obtained in sub-pixel rather than pixel units, and the distance can be calculated with higher accuracy (see the sketch below).
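A minimal sketch of such a region-level SAD search with quadratic sub-pixel refinement, assuming rectified images (matching regions lie on the same rows) and a left-camera reference; the function name, search range, and region format are illustrative.

```python
import numpy as np

def region_parallax(left, right, region, max_disp=128):
    x, y, w, h = region                       # vehicle region in the left image
    patch = left[y:y+h, x:x+w].astype(np.float32)
    sad = np.full(max_disp, np.inf)
    for d in range(max_disp):
        if x - d < 0:
            break
        cand = right[y:y+h, x-d:x-d+w].astype(np.float32)
        sad[d] = np.abs(patch - cand).sum()   # sum of absolute differences
    best = int(np.argmin(sad))                # lowest SAD = highest coincidence
    if 0 < best < max_disp - 1 and np.isfinite(sad[best - 1]) and np.isfinite(sad[best + 1]):
        # Quadratic fit through the three SAD values around the minimum;
        # the vertex gives a sub-pixel offset in (-0.5, 0.5).
        denom = sad[best - 1] - 2.0 * sad[best] + sad[best + 1]
        if denom > 0:
            return best + 0.5 * (sad[best - 1] - sad[best + 1]) / denom
    return float(best)
```

Matching the whole vehicle-sized region, rather than 4 × 4 blocks, is what gives the global degree of coincidence described above.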
  • The vehicle feature calculation processing unit 107 calculates a vehicle feature quantity from the distance to the vehicle obtained by the vehicle distance calculation processing unit 106 and the vehicle region obtained by the vehicle region identification processing unit 105.
  • As shown in FIG. 5, the width W of the vehicle 501 in real space is imaged through the lens 502 onto the imaging surface 503 with a width of w. The vehicle width w on the imaging surface is known from the vehicle region obtained by the vehicle region identification processing unit 105, so using the distance D to the vehicle obtained by the vehicle distance calculation processing unit 106, the width W of the vehicle in real space is calculated as

    W = w × D / f    (2)

  • Here the width of the vehicle is calculated, but other feature quantities, such as the height of the vehicle in real space, can be calculated in the same way.
  • If the road surface gradient in real space is known from the white lines of the road, the height above the road surface in real space can also be calculated.
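Formula (2) in code form; the pixel width, distance, and focal length are assumed values chosen so that the result is a plausible car width.

```python
def real_space_width_m(width_px, distance_m, focal_px):
    return width_px * distance_m / focal_px        # W = w * D / f, formula (2)

print(real_space_width_m(63.0, 40.0, 1400.0))      # -> 1.8 (metres)
```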
  • The vehicle feature verification processing unit 108 performs verification using the type obtained by the vehicle region identification processing unit 105 and the real-space feature quantity calculated by the vehicle feature calculation processing unit 107. Specifically, since in this embodiment the recognized type is simply "vehicle", it is checked whether the real-space width calculated by the vehicle feature calculation processing unit 107 falls within the range of widths a vehicle can take, for example 1.4 m to 2.5 m. If it does, the region is sent unchanged as a control target candidate to the control target selection processing unit 109; if it does not, it is not output.
  • In this way a misrecognition can be caught and a more reliable output produced.
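A sketch of this verification rule; the 1.4 m to 2.5 m range is the example given above, while the function name and the assert-based usage are illustrative.

```python
def verify_vehicle(width_m, width_range=(1.4, 2.5)):
    # Keep a candidate recognized as "vehicle" only if its real-space width
    # is plausible for a vehicle.
    lo, hi = width_range
    return lo <= width_m <= hi

assert verify_vehicle(1.8)        # plausible car width -> control candidate
assert not verify_vehicle(5.0)    # too wide -> misrecognition, not output
```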
  • When a plurality of overlapping candidates remain for the same vehicle, the one with the highest SAD degree of coincidence and the largest area is selected.
  • First, the maximum degree of coincidence is found, and candidates whose degree of coincidence differs from that maximum by a certain threshold or more are removed.
  • The control target candidate 603, which extends beyond the vehicle, includes background and therefore has a lower degree of coincidence, so it is excluded from the candidates.
  • The areas of the control target candidates 601 and 602 that remain are then compared, and the largest one is kept (see the sketch below).
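A sketch of this duplicate-removal rule; candidates are modeled as (region, SAD) pairs, where for SAD a lower value means a higher degree of coincidence, and the margin value is an assumed placeholder.

```python
def select_candidate(cands, sad_margin=1000.0):
    # Drop candidates whose SAD is more than sad_margin above the best
    # (i.e. whose degree of coincidence falls too far below the maximum).
    best_sad = min(sad for _, sad in cands)
    kept = [(r, sad) for r, sad in cands if sad - best_sad <= sad_margin]
    # Region r = (x, y, w, h); the largest area wins among the survivors.
    return max(kept, key=lambda item: item[0][2] * item[0][3])[0]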
  • In this embodiment the recognition type is only "vehicle" and the verified feature is only the width.
  • If the recognition type distinguishes broad vehicle categories such as light vehicle, ordinary vehicle, or truck, or even the specific model, the width range used in verification can be narrowed further and the verification becomes more reliable.
  • Moreover, not only vehicles but also people, signs, and the like can be recognized from an image, and if their sizes are specified, they can be verified in the same way.
  • The control target selection processing unit 109 selects which of the highly reliable vehicle regions verified by the vehicle feature verification processing unit 108 is suitable as a control target. Specifically, the traveling path of the host vehicle is predicted using the steering angle, the yaw rate, and the vehicle speed, and the vehicle closest to the predicted path is output as the control target (a sketch follows below).
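A hedged sketch of this selection step; the patent names only the inputs (steering angle, yaw rate, vehicle speed), so the constant-curvature path model, the coordinate and sign conventions, and the function names below are assumptions.

```python
import math

def lateral_offset_from_path(x_m, z_m, yaw_rate, speed_mps):
    # Host vehicle at the origin, z forward, x lateral. With constant yaw
    # rate and speed, the predicted path is a circular arc of radius v/omega;
    # the sign convention for the turn center is an assumption.
    if abs(yaw_rate) < 1e-6:
        return abs(x_m)                     # straight-ahead path
    radius = speed_mps / yaw_rate
    # Distance of point (x, z) from the circle centred at (radius, 0).
    return abs(math.hypot(x_m - radius, z_m) - abs(radius))

def select_control_target(vehicles, yaw_rate, speed_mps):
    # vehicles: list of (x_m, z_m) positions of verified candidates.
    return min(vehicles,
               key=lambda p: lateral_offset_from_path(p[0], p[1],
                                                      yaw_rate, speed_mps))
```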
  • The vehicle control device 110 determines whether there is a risk of collision from the control target information sent from the stereo camera device 100 via a CAN bus or the like, for example the distance in real space and the relative speed. If there is a risk, it first sounds an alarm to prompt the driver to avoid the collision, and when the collision finally becomes unavoidable, it avoids or mitigates the collision by automatic braking.
  • As described above, the recognition target can be found even at long range; by applying stereo vision to the identified region, the distance can be measured with high accuracy; and by recognizing the size in real space and verifying it against the recognized type, the recognition target can be detected accurately.
  • Regions larger than that of the actual recognition target can be excluded, and regions smaller than that of the actual recognition target can be excluded as well.
  • The recognition target can therefore be detected accurately whether it is at a long distance or close by.
  • In rainy weather or the like, the image quality deteriorates: the left and right images of the vehicle traveling ahead may partially differ, or the vehicle's edges may blur, so that the vehicle can no longer be detected correctly.
  • A second embodiment, a stereo camera device built on the stereo camera device of the first embodiment that can detect a vehicle robustly even in such an environment, is described next.
  • In the second embodiment, the vehicle region identification processing unit 105 and the vehicle feature verification processing unit 108 change the content of their processing in order to detect vehicles more robustly.
  • In the first embodiment, the vehicle region identification processing unit 105 keeps only the position at which the degree of correlation is maximal. Here, not only the maximum but every position whose degree of correlation exceeds the threshold is retained, together with its degree of correlation and its region on the image, and passed to subsequent processing.
  • The vehicle feature verification processing unit 108 then performs the verification processing in the same manner as in the first embodiment.
  • Because the vehicle region identification processing unit 105 now leaves a plurality of region candidates, as shown in FIG. 7, region candidates may remain that have the same width and center position on the image as the control target candidate 701 but a different height on the image (for ease of viewing, the horizontal position is drawn slightly shifted in the figure, but the same horizontal position is meant).
  • Therefore, when region candidates have the same width and center position on the image, the one with the higher degree of matching is kept.
  • The control target candidate 702, which includes a large amount of background, is thereby removed, and only the correct control target candidate 701 remains.
  • In addition, the control target candidate region output by the vehicle feature verification processing unit 108 is used by the vehicle region identification processing unit 105 in the next processing frame. Specifically, if no vehicle region obtained by the vehicle region identification processing unit 105 lies around the control target candidate region obtained in the previous frame, pattern matching is performed around that region with a threshold lower than the normal one; if the result exceeds this lowered threshold and is recognized as a vehicle, the region is added as a vehicle region in the current frame.
  • "Around the control target candidate region obtained in the previous frame" means, for example, that the center position [pix] (the exact midpoint between the left end and the right end of the candidate region obtained in the previous frame) lies within ±10 [pix] of the previous center position, and that the width [pix] (the difference between the left end and the right end of the previous candidate) lies within ±10 [pix] of the previous width (a sketch of this test follows below).
  • Here these tolerances are simply fixed values, but they may be varied according to the distance, the relative speed, and the rate of change of the lateral position, or according to the yaw rate if the vehicle has a yaw rate sensor.
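A sketch of this neighborhood test; the ±10 [pix] tolerances come from the text above, while the region representation (left edge and width in pixels) is an illustrative choice.

```python
def near_previous_candidate(region, prev_region, tol_px=10):
    # region / prev_region: (left edge [pix], width [pix]).
    x, w = region
    px, pw = prev_region
    center, prev_center = x + w / 2.0, px + pw / 2.0
    # Centre within +/-10 pix of the previous centre, width within +/-10 pix
    # of the previous width.
    return abs(center - prev_center) <= tol_px and abs(w - pw) <= tol_px
```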
  • This makes detection possible even when raindrops make edges hard to detect.
  • Only the previous frame has been described here, but earlier frames can be used in the same manner.
  • In this way, the recognition target can be detected robustly.
  • DESCRIPTION OF SYMBOLS: 100: Stereo camera device, 101: Camera, 102: Camera, 103: Image acquisition unit, 104: Image acquisition unit, 105: Vehicle region identification processing unit, 106: Vehicle distance calculation processing unit, 107: Vehicle feature calculation processing unit, 108: Vehicle feature verification processing unit, 109: Control target selection processing unit, 110: Vehicle control device

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The purpose of the invention is to provide a stereoscopic camera device capable of detecting objects to be recognized with better accuracy, even when those objects are far away or in close proximity to one another. The stereoscopic camera device comprises: a first imaging unit; a second imaging unit; an object region identification processing unit for identifying a region containing an object and the type of the object on the basis of a first image acquired from the first imaging unit; a distance calculation processing unit for calculating the distance to the object on the basis of an image of the object-containing region from the object region identification processing unit and of a second image acquired from the second imaging unit; a feature calculation processing unit for calculating feature values of the object in real space on the basis of the distance calculated by the distance calculation processing unit and of the object-containing region identified by the object region identification processing unit; and an object feature verification processing unit for verifying the type of the object identified by the object region identification processing unit on the basis of the feature values calculated by the feature calculation processing unit.
PCT/JP2013/070273 2012-09-27 2013-07-26 Stereoscopic camera device WO2014050285A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-213388 2012-09-27
JP2012213388A JP2014067320A (ja) 2012-09-27 Stereo camera device

Publications (1)

Publication Number Publication Date
WO2014050285A1 (fr) 2014-04-03

Family

ID=50387699

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/070273 WO2014050285A1 (fr) 2012-09-27 2013-07-26 Stereoscopic camera device

Country Status (2)

Country Link
JP (1) JP2014067320A (fr)
WO (1) WO2014050285A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6607764B2 (ja) * 2015-11-04 2019-11-20 日立オートモティブシステムズ株式会社 Imaging device
CN112188059B (zh) * 2020-09-30 2022-07-15 深圳市商汤科技有限公司 Wearable device, intelligent guidance method and apparatus, and guidance system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000163564A (ja) * 1998-12-01 2000-06-16 Fujitsu Ltd Eye tracking device and blink detection device
JP2012133759A (ja) * 2010-11-29 2012-07-12 Canon Inc Object tracking device capable of detecting an intruding object, object tracking method, and storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09145362A (ja) * 1995-11-29 1997-06-06 Ikegami Tsushinki Co Ltd Method for measuring the height of an object from stereo images
JP2001351193A (ja) * 2000-06-09 2001-12-21 Nissan Motor Co Ltd Pedestrian detection device
JP2002323301A (ja) * 2001-04-26 2002-11-08 Fujitsu Ltd Object position measuring device
JP2004112144A (ja) * 2002-09-17 2004-04-08 Nissan Motor Co Ltd Forward vehicle tracking system and forward vehicle tracking method
WO2004102222A1 (fr) * 2003-05-13 2004-11-25 Fujitsu Limited Object detector, object detection method and program, distance sensor
JP2005049963A (ja) * 2003-07-30 2005-02-24 Olympus Corp Safe movement support device
JP2008123462A (ja) * 2006-11-16 2008-05-29 Hitachi Ltd Object detection device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114762019A (zh) * 2019-12-17 2022-07-15 日立安斯泰莫株式会社 Camera system

Also Published As

Publication number Publication date
JP2014067320A (ja) 2014-04-17

Similar Documents

Publication Publication Date Title
US9697421B2 (en) Stereoscopic camera apparatus
JP6416293B2 (ja) Method for tracking a target vehicle approaching a motor vehicle by means of a camera system of the motor vehicle, camera system, and motor vehicle
KR101243108B1 (ko) Apparatus and method for displaying a rear image of a vehicle
EP2928178B1 (fr) On-board control device
JP4930046B2 (ja) Road surface discrimination method and road surface discrimination device
US20100110193A1 (en) Lane recognition device, vehicle, lane recognition method, and lane recognition program
CN111727437A (zh) Multispectral system for providing pre-collision alerts
JP5561064B2 (ja) Object recognition device for vehicles
US20200074212A1 (en) Information processing device, imaging device, equipment control system, mobile object, information processing method, and computer-readable recording medium
WO2014002692A1 (fr) Stereo camera
US20170227634A1 (en) Object recognition apparatus using a plurality of object detecting means
US8160300B2 (en) Pedestrian detecting apparatus
JP2016099650A (ja) Travel path recognition device and travel support system using the same
CN109196304B (zh) Object distance detection device
JP3961584B2 (ja) Lane marking detection device
WO2017047282A1 (fr) Image processing device, object recognition device, device control system, image processing method, and program
JP2006318059A (ja) Image processing device, image processing method, and image processing program
JP6572696B2 (ja) Image processing device, object recognition device, equipment control system, image processing method, and program
KR20170055738A (ко) Apparatus and method for determining the driving lane from images
WO2011016257A1 (fr) Distance calculation device for a vehicle
WO2014050285A1 (fr) Stereoscopic camera device
KR20160088986A (ко) Lane detection method using vanishing-point-based binocular disparity
JP2007018451A (ja) Road lane marking detection device
JPH07244717A (ja) Travel environment recognition device for vehicles
JP2007310591A (ja) Image processing device and parking lot determination method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13842431

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13842431

Country of ref document: EP

Kind code of ref document: A1