WO2011039989A1 - Vehicle surroundings monitoring device (車両周囲監視装置) - Google Patents
Vehicle surroundings monitoring device
- Publication number
- WO2011039989A1 (PCT/JP2010/005801)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- vehicle
- information
- unit
- distance information
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/168—Driving aids for parking, e.g. acoustic or visual feedback on parking space
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30264—Parking
Definitions
- The present invention relates to a vehicle periphery monitoring device that calculates the distance to obstacles around the host vehicle using an in-vehicle camera.
- Conventionally, a camera is mounted on a vehicle, and the distance to an obstacle at risk of collision is detected from images captured around the vehicle.
- However, the behavior of the vehicle causes changes in the posture of the camera, such as pitching (vertical rocking).
- As in Patent Documents 1 and 2, a known method calculates and corrects the amount of posture change by image processing, using road boundary lines or the vanishing point in the image captured by the camera.
- The methods of Patent Documents 1 and 2 use the heights of the left and right road boundary lines, the width of the traveling road, and vanishing-point information calculated from the left and right boundary lines.
- Such boundary-line information cannot be used when, for example, the boundary lines barely enter the imaging range of a rear camera, so the influence of pitch cannot be eliminated.
- The method of Patent Document 3, which corrects for pitch from changes in the height of an object without using boundary lines, requires in principle a means of accurately measuring the distance to the object.
- Patent Document 3 specifically describes a stereo camera device.
- A stereo-type distance calculation device with two cameras has the problems that the system configuration is complicated and large, and that high calibration accuracy between the cameras is required, so the system cost and the installation cost in the vehicle increase. Further, when a moving object is present among the objects, it cannot be determined whether an image shift is caused by pitching of the camera posture or by movement of the object, so the distance cannot be determined accurately.
- An object of the present invention is to provide a vehicle surroundings monitoring apparatus that can reduce distance errors caused by camera-posture changes such as pitching and can accurately display the distance to an object.
- To this end, the vehicle surroundings monitoring device of the present invention comprises: an imaging unit that captures the situation around the host vehicle; a vehicle information acquisition unit that detects the vehicle speed; a camera posture estimation unit that calculates the posture of the camera used in the imaging unit, based on the vehicle speed obtained by the vehicle information acquisition unit; a distance calculation unit that calculates the distance to an object from the image captured by the imaging unit and the amount of change in camera posture estimated by the camera posture estimation unit; a distance information storage unit that stores the distance information calculated by the distance calculation unit; a distance information update determination unit that determines, based on the amount of change in camera posture estimated by the camera posture estimation unit, whether to update the distance information using the distance information calculated by the distance calculation unit or the distance information stored in the distance information storage unit; and an output information generation unit that generates output information according to the determination of the distance information update determination unit.
- Because the vehicle surroundings monitoring apparatus of the present invention has a distance information update determination unit that decides, based on the amount of change in camera posture estimated by the camera posture estimation unit, whether to use the distance information calculated by the distance calculation unit or the distance information stored in the distance information storage unit, it can estimate the influence of camera-posture changes such as pitch on the distance data and accurately output the distance to the object.
- FIG. 1 is a block diagram showing the configuration of the vehicle periphery monitoring apparatus in one embodiment of the present invention. FIG. 2 shows the camera installation on the vehicle. FIG. 3 shows the relationship between the camera and the real world. FIG. 4 is a flowchart explaining the operation of the apparatus.
- FIG. 5 shows the posture change of the camera. FIG. 6 shows the relationship between the acceleration of the vehicle and the pitch-angle change of the camera posture.
- FIG. 7 shows the driving environment of the vehicle. FIG. 8 shows the distance-data display.
- FIG. 9 shows the distance-data display using past stored information. FIG. 10 shows the case where a new object approaches while moving.
- FIG. 1 is a block diagram showing the configuration of the vehicle surroundings monitoring apparatus.
- The imaging unit 1 captures the situation around the host vehicle, and the vehicle information acquisition unit 2 acquires the vehicle speed and other information.
- The imaging means 1 uses a CCD or CMOS camera as its device. It is installed, for example, as a rear camera near the license plate or emblem at the rear of the vehicle, or at the top of the rear, to capture images behind the host vehicle, or as a side camera that captures images to the side of the host vehicle.
- The vehicle information acquisition means 2 acquires, for example, a vehicle speed signal from the vehicle, via an A/D conversion I/F in the case of an analog signal, or via an I/F that acquires a CAN signal in the case of CAN information.
- Illuminance-sensor information and a steering-angle value may also be acquired as vehicle information.
- The signal processing means 3 processes the image data captured by the imaging means 1 together with the vehicle information acquired by the vehicle information acquisition means 2, and outputs the distance information to the object to the display means 9.
- The signal processing means 3 includes a camera posture estimation unit 4, a distance information update determination unit 5, a distance information storage unit 6, a distance calculation unit 7, and an output information generation unit 8.
- The camera posture estimation unit 4 receives the vehicle speed acquired by the vehicle information acquisition unit 2 and estimates the posture of the camera from the behavior change of the vehicle.
- The distance information update determination unit 5 determines, based on the posture state estimated by the camera posture estimation unit 4, whether to calculate distance information from the image captured by the imaging unit 1 or to use the distance information stored in the distance information storage unit 6.
- The distance information storage unit 6 stores the distance information calculated by the distance calculation unit 7 when the distance information update determination unit 5 determines that the distance information should be calculated from the image captured by the imaging unit 1.
- The output information generation unit 8 generates output data using either the distance information calculated by the distance calculation unit 7 or the distance information read from the distance information storage unit 6.
- FIG. 2 shows an installation example of the camera on the vehicle; the camera is installed at a height h20 from the road surface.
- For the camera posture, the rearward direction along the vehicle front-rear axis, parallel to the ground, is taken as the positive Z axis (the optical-axis direction of the camera); the vehicle-width direction, parallel to the ground and perpendicular to the Z axis, is the X axis; and the downward direction perpendicular to the Z axis is the Y axis.
- For a camera installed in the vehicle in this way, if the installation conditions (height h from the road surface and optical-axis direction) and the camera characteristics (pixel spacing iu, iv of the image sensor, focal length f, and lens-distortion correction data) are known, the following relationship holds between the coordinates (u, v) of the imaging screen and the coordinates (x, y, z) of each point on the road surface, owing to the geometric constraints of the optical system, and the distance can be calculated.
- Distance data for each coordinate of the imaging screen can therefore be held in advance in the camera's memory.
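Under the standard pinhole-camera assumption implied here (optical axis parallel to the ground, camera height h, pixel pitch iu/iv, focal length f), the road-surface relations reduce to u = (f/iu)(x/z) and v = (f/iv)(h/z). A minimal sketch; all function names and numeric values are illustrative, not taken from the patent:

```python
# Hedged sketch: distance to a road-surface point from its image coordinates,
# assuming the pinhole relations u = f/iu * x/z, v = f/iv * h/z.
# ground_distance, lateral_offset, and the example numbers are illustrative.

def ground_distance(v_px: float, f: float, iv: float, h: float) -> float:
    """Depth z along the optical axis of a road point imaged at row v_px
    (pixels below the image center), for a camera at height h whose
    optical axis is parallel to the ground."""
    if v_px <= 0:
        raise ValueError("road point must lie below the image center (v > 0)")
    return f * h / (iv * v_px)

def lateral_offset(u_px: float, z: float, f: float, iu: float) -> float:
    """Lateral position x of the road point at depth z imaged at column u_px."""
    return u_px * iu * z / f

# Example: f = 4 mm, pixel pitch 0.006 mm, camera height 1.0 m
z = ground_distance(v_px=60.0, f=4.0, iv=0.006, h=1.0)   # about 11.1 m
```

Tabulating `ground_distance` over every image row is one way to realize the per-coordinate distance table mentioned above.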
- FIG. 4 shows a flowchart of the signal processing in the vehicle surroundings monitoring apparatus of this embodiment.
- In step S40, the signal processing means 3 reads new image data from the imaging means 1 and stores it in memory together with the acquisition time.
- In step S41, the vehicle information acquisition means 2 acquires the vehicle speed and sequentially stores it in memory together with the acquisition time.
- In step S42, the change in vehicle speed per unit time is obtained from the time elapsed since the previous distance calculation, and the acceleration is calculated.
- In step S43, the camera posture estimation unit 4 estimates the change in vehicle behavior, that is, the posture of the camera, from the acceleration calculated in step S42.
- FIG. 5 shows the coordinate axes of the camera installed in the vehicle.
- As before, the rearward direction of the vehicle is positive, and the direction parallel to the ground is the Z axis (camera optical-axis direction); the X and Y axes are defined with respect to this Z-axis direction.
- When the vehicle accelerates or decelerates, the change in vehicle behavior becomes large.
- In one case, the camera posture changes by θ1 around the X axis: the Z axis in the traveling direction tilts to the z′ direction (pitch in the + direction) and the vertical Y axis tilts to y′.
- In the other case, the posture changes by θ2 around the X axis: the Z axis tilts in the z″ direction (pitch in the − direction) and the vertical Y axis tilts in the y″ direction.
- As the value of the vehicle behavior change amount, data in which the vehicle acceleration and the pitch-angle fluctuation amount are associated in advance, as shown in FIG. 6, may be prepared.
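The association in FIG. 6 between acceleration and pitch-angle fluctuation could be held as a lookup table and queried by interpolation; the table values below are invented for illustration and do not come from the patent:

```python
# Hedged sketch of FIG. 6's idea: a pre-measured table associating vehicle
# acceleration with pitch-angle change, queried by linear interpolation.
import bisect

ACCEL = [0.0, 0.5, 1.0, 1.5]   # acceleration breakpoints (illustrative)
PITCH = [0.0, 0.4, 1.1, 2.0]   # pitch-angle change in degrees (illustrative)

def pitch_from_accel(a: float) -> float:
    """Estimated pitch-angle change for acceleration a (symmetric, clamped)."""
    a = min(max(abs(a), ACCEL[0]), ACCEL[-1])
    i = bisect.bisect_right(ACCEL, a) - 1
    if i >= len(ACCEL) - 1:
        return PITCH[-1]
    t = (a - ACCEL[i]) / (ACCEL[i + 1] - ACCEL[i])
    return PITCH[i] + t * (PITCH[i + 1] - PITCH[i])
```

In practice such a table would be measured per vehicle, since suspension characteristics determine how acceleration maps to pitch.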
- When the change in vehicle behavior is large, the behavior change per unit time also becomes large, and it becomes difficult to calculate an accurate amount of vehicle-posture change corresponding to the camera image without using a separate posture sensor. Consequently, a distance-measurement method that assumes the posture of the vehicle-mounted camera relative to the ground is known develops large errors.
- In step S44, based on the vehicle behavior change amount estimated in step S43, the distance information update determination unit 5 proceeds to step S45 to calculate the distance from the image read in step S40 if the behavior change is within a threshold. If it is equal to or greater than the threshold, it determines that calculating the distance by image processing would produce a large error, and proceeds to step S47.
- The threshold is chosen from a preset number of patterns (e.g., Lo: acceleration 0.5, Mid: acceleration 1.0, Hi: acceleration 1.5).
- The value may be selected by the user, or set in advance in consideration of the vehicle weight and vehicle characteristics such as the suspension coils and dampers.
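The decision in step S44 might be sketched as follows, using the example threshold patterns from the text (Lo/Mid/Hi); the function name and exact comparison are assumptions:

```python
# Hedged sketch of the update decision in step S44: within the threshold,
# recompute distance from the current image (step S45); otherwise fall back
# to stored distance data (step S47). Threshold values follow the example
# patterns in the text.
THRESHOLDS = {"Lo": 0.5, "Mid": 1.0, "Hi": 1.5}  # acceleration thresholds

def should_recompute(behavior_change: float, level: str = "Mid") -> bool:
    """True  -> calculate distance from the image (step S45);
    False -> reuse stored distance information (step S47)."""
    return abs(behavior_change) < THRESHOLDS[level]
```

A user-selectable `level`, or one fixed per vehicle model, corresponds to the choices described above.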
- In step S45, the distance calculation unit 7 calculates, from the image read in step S40, the distance to the object captured in the image.
- The distance is calculated by extracting, through image processing, the edge where the object touches the road surface, and applying to that edge position the relational expressions (1) and (2) between the imaging-screen coordinates and the road-surface coordinates given by the geometric constraints of the optical system described above, together with the pitch angle estimated in step S43.
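The pitch compensation in step S45 can be sketched as follows, assuming the common approach of adding the estimated pitch angle to the ray angle of the road-contact edge before intersecting the ground plane; the patent does not spell this formula out, so the function and parameter names are illustrative:

```python
# Hedged sketch of step S45's pitch compensation: convert the road-contact
# edge's image row to a ray angle below the optical axis, add the estimated
# camera pitch, and intersect with the ground plane at camera height h.
import math

def distance_with_pitch(v_px: float, f: float, iv: float, h: float,
                        pitch_deg: float) -> float:
    ray = math.atan2(iv * v_px, f)           # ray angle below the optical axis
    angle = ray + math.radians(pitch_deg)    # add estimated camera pitch
    if angle <= 0:
        raise ValueError("ray does not intersect the road surface")
    return h / math.tan(angle)               # ground distance to the edge
```

With zero pitch this reduces to the pinhole relation z = f·h/(iv·v); a positive (nose-up rear camera) pitch shortens the computed distance, which is exactly the error source the update determination guards against.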
- In step S46, the distance information storage unit 6 stores the image data acquired in step S40, the vehicle behavior change amount estimated in step S43, and the distance calculated in step S45, together with time information.
- In step S47, when the distance information update determination unit 5 has determined in step S44 that the change in vehicle behavior is equal to or greater than the threshold, that is, that calculating the distance from the image would produce a large error, it acquires from the distance information storage unit 6 the distance data stored when the behavior change was small, together with the time information at the time of storage.
- In step S48, the output information generation unit 8 generates output information using, as the distance information from the vehicle to the object, either the distance information calculated in step S45 or the distance information acquired in step S47.
- The content of the output information is varied according to whether it is distance data calculated in real time or distance data estimated using past information.
- The output information is illustrated using the case where the vehicle backs up toward an environment with a wall surface and a pillar, as shown in FIG. 7.
- The left figure in FIG. 8 is a display example for the case where the distance information to the object is calculated in real time. Since the distance is calculated from the captured image in real time, the display means 9 superimposes the calculated distance to the object on the displayed image.
- The right figure in FIG. 8 shows an example that further takes into account the error estimated from the behavior change amount at the time of distance calculation.
- Because the error grows as the distance to the object increases, the display means 9 displays the distance with its reliability taken into account.
- FIG. 9 shows, relative to the left figure in FIG. 8, distance information based on past information, taking into account the amount d91 by which the vehicle has moved between the time the distance data was calculated and the current time.
- The image data is shifted by the vehicle movement amount d91, and the distance data for the object in the image is updated (in the example, by 30 cm relative to the state of FIG. 8).
- The travel amount is obtained by integrating, over the elapsed time, the vehicle-speed information acquired from the vehicle information and stored in memory.
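The travel-amount computation and distance update (the 30 cm shift in the example) can be sketched as a trapezoidal integration of the stored speed samples; the names and sample values are illustrative:

```python
# Hedged sketch of the travel-amount update: integrate stored (time, speed)
# samples over the elapsed interval and subtract the travelled distance from
# the stored object distance (the vehicle is backing toward the object).

def travelled(samples):
    """Distance covered, from [(t_seconds, speed_m_per_s), ...] samples,
    via trapezoidal integration."""
    d = 0.0
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        d += 0.5 * (v0 + v1) * (t1 - t0)
    return d

def updated_distance(stored_distance, samples):
    """Stored object distance corrected for vehicle motion since storage."""
    return stored_distance - travelled(samples)

# Example: 1 s at a constant 0.3 m/s -> stored 2.0 m becomes 1.7 m (a 30 cm shift)
new_d = updated_distance(2.0, [(0.0, 0.3), (1.0, 0.3)])
```

This matches the 30 cm example above under the stated (assumed) speed profile.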
- The description above covers the case where the behavior change in the pitch direction is estimated from the change in vehicle speed, and either the distance to the object is calculated or past information is adopted as the distance information.
- Alternatively, the steering angle (calculated from the steering-wheel rotation angle or from the left and right wheel pulses) may be used, taking into account the behavior change in the roll direction of the vehicle.
- As described above, the distance information update determination unit 5 estimates the behavior change of the camera from the change in vehicle speed and, based on that behavior amount, determines whether to calculate the distance to the object in real time from the captured image or to use previously stored distance data; the distance to the object can therefore be calculated accurately.
- By providing a means of estimating the camera posture from vehicle information, the vehicle surroundings monitoring apparatus of the present invention estimates the influence of camera-posture changes such as pitch on the distance data, and thus has the effect of stably outputting the distance to an object; it is useful as a vehicle surroundings monitoring device that calculates the distance to obstacles around the host vehicle.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
u = f/iu × x/z (1)
v = f/iv × h/z (2)
2 Vehicle information acquisition means
4 Camera posture estimation unit
5 Distance information update determination unit
6 Distance information storage unit
7 Distance calculation unit
8 Output information generation unit
9 Display means
Claims (2)
- A vehicle surroundings monitoring device comprising:
an imaging means that captures the situation around the host vehicle;
a vehicle information acquisition means that detects the vehicle speed;
a camera posture estimation unit that calculates the posture of the camera used by the imaging means, based on the vehicle speed obtained by the vehicle information acquisition means;
a distance calculation unit that calculates the distance to an object from the image captured by the imaging means and the amount of change in camera posture estimated by the camera posture estimation unit;
a distance information storage unit that stores the distance information calculated by the distance calculation unit;
a distance information update determination unit that determines, based on the amount of change in camera posture estimated by the camera posture estimation unit, whether to update the distance information using the distance information calculated by the distance calculation unit or the distance information stored in the distance information storage unit; and
an output information generation unit that generates output information according to the determination of the distance information update determination unit. - The vehicle surroundings monitoring device according to claim 1, further comprising a display means that, when the distance information update determination unit updates the distance information using the distance information stored in the distance information storage unit, displays an indication intended to call attention to the area between the object and the host vehicle, so that the driver watches for new obstacles.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080043133.7A CN102549631B (zh) | 2009-09-30 | 2010-09-27 | 车辆周围监视装置 |
JP2011534064A JP5538411B2 (ja) | 2009-09-30 | 2010-09-27 | 車両周囲監視装置 |
EP10820111.2A EP2485203B1 (en) | 2009-09-30 | 2010-09-27 | Vehicle-surroundings monitoring device |
US13/498,958 US9280824B2 (en) | 2009-09-30 | 2010-09-27 | Vehicle-surroundings monitoring device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009226942 | 2009-09-30 | ||
JP2009-226942 | 2009-09-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011039989A1 true WO2011039989A1 (ja) | 2011-04-07 |
Family
ID=43825837
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/005801 WO2011039989A1 (ja) | 2009-09-30 | 2010-09-27 | 車両周囲監視装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9280824B2 (ja) |
EP (1) | EP2485203B1 (ja) |
JP (1) | JP5538411B2 (ja) |
CN (1) | CN102549631B (ja) |
WO (1) | WO2011039989A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101377204B1 (ko) * | 2012-09-03 | 2014-04-01 | 한국과학기술원 | 노면정보 제공 시스템 및 방법과 이를 사용한 차량 |
JP2015067220A (ja) * | 2013-09-30 | 2015-04-13 | 日産自動車株式会社 | 車両用加速抑制装置及び車両用加速抑制方法 |
KR101610521B1 (ko) * | 2014-10-13 | 2016-04-07 | 현대자동차주식회사 | 시선 추적 장치 및 방법 |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014010636A (ja) * | 2012-06-29 | 2014-01-20 | Denso Corp | 電子機器 |
JP6223685B2 (ja) * | 2013-01-30 | 2017-11-01 | 富士通テン株式会社 | 画像処理装置および画像処理方法 |
WO2015030067A1 (ja) | 2013-09-02 | 2015-03-05 | アルプス電気株式会社 | 発電入力装置 |
WO2015060193A1 (ja) * | 2013-10-22 | 2015-04-30 | 日本精機株式会社 | 車両情報投影システム及び投影装置 |
CN103544806B (zh) * | 2013-10-31 | 2016-01-06 | 江苏物联网研究发展中心 | 基于视频绊线规则的重要物资运输车辆监控预警系统 |
KR102175961B1 (ko) * | 2013-11-29 | 2020-11-09 | 현대모비스 주식회사 | 차량 후방 주차 가이드 장치 |
JP6264037B2 (ja) * | 2013-12-27 | 2018-01-24 | トヨタ自動車株式会社 | 車両用情報表示装置及び車両用情報表示方法 |
WO2015125022A2 (en) * | 2014-02-20 | 2015-08-27 | Mobileye Vision Technologies Ltd. | Navigation based on radar-cued visual imaging |
US9437055B2 (en) * | 2014-08-13 | 2016-09-06 | Bendix Commercial Vehicle Systems Llc | Cabin and trailer body movement determination with camera at the back of the cabin |
JP6361382B2 (ja) * | 2014-08-29 | 2018-07-25 | アイシン精機株式会社 | 車両の制御装置 |
JP6511406B2 (ja) * | 2016-02-10 | 2019-05-15 | クラリオン株式会社 | キャリブレーションシステム、キャリブレーション装置 |
IT201600094414A1 (it) * | 2016-09-20 | 2018-03-20 | St Microelectronics Srl | Un procedimento per rilevare un veicolo in sorpasso, relativo sistema di elaborazione, sistema di rilevazione di un veicolo in sorpasso e veicolo |
KR101916779B1 (ko) | 2017-06-29 | 2018-11-09 | 현대오트론 주식회사 | 단일 후방 카메라 기반 장애물 거리 측정 장치 및 방법 |
US10126423B1 (en) * | 2017-08-15 | 2018-11-13 | GM Global Technology Operations LLC | Method and apparatus for stopping distance selection |
CN110786004B (zh) * | 2017-08-25 | 2021-08-31 | 本田技研工业株式会社 | 显示控制装置、显示控制方法及存储介质 |
CN108639040B (zh) * | 2018-06-28 | 2019-08-30 | 中科安达(北京)科技有限公司 | 车辆制动系统复合测距的方法及系统 |
US10380440B1 (en) | 2018-10-23 | 2019-08-13 | Capital One Services, Llc | Method for determining correct scanning distance using augmented reality and machine learning models |
KR102640871B1 (ko) | 2018-10-30 | 2024-02-27 | 삼성전자주식회사 | 증강 현실을 이용한 영상 데이터를 제공하는 전자 장치 및 그 제어 방법 |
JP7128723B2 (ja) * | 2018-11-12 | 2022-08-31 | 本田技研工業株式会社 | 画像管理装置、路面情報管理システム、車両、プログラム、及び、画像管理方法 |
CN111142118B (zh) * | 2020-01-07 | 2022-08-26 | 盟识(上海)科技有限公司 | 一种自卸车倒车检测方法 |
CN113071480B (zh) * | 2021-04-30 | 2022-06-03 | 重庆长安汽车股份有限公司 | 自动泊车障碍物检测方法、泊车方法、系统及车辆 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0777431A (ja) * | 1993-09-08 | 1995-03-20 | Sumitomo Electric Ind Ltd | カメラ姿勢パラメータ算出方法 |
JPH1123291A (ja) * | 1997-07-04 | 1999-01-29 | Nissan Motor Co Ltd | 車両用画像処理装置 |
JPH1151645A (ja) * | 1997-08-05 | 1999-02-26 | Honda Motor Co Ltd | 車両用距離測定装置 |
JP2002117392A (ja) * | 2000-10-06 | 2002-04-19 | Nissan Motor Co Ltd | 車間距離推定装置 |
JP2002259995A (ja) | 2001-03-06 | 2002-09-13 | Nissan Motor Co Ltd | 位置検出装置 |
JP3820874B2 (ja) | 2000-11-22 | 2006-09-13 | 日産自動車株式会社 | 車両用画像処理装置 |
JP3910345B2 (ja) | 1999-07-13 | 2007-04-25 | 本田技研工業株式会社 | 位置検出装置 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5638116A (en) | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
JPH07251651A (ja) * | 1994-03-15 | 1995-10-03 | Nissan Motor Co Ltd | 車間距離制御装置 |
US6389162B2 (en) * | 1996-02-15 | 2002-05-14 | Canon Kabushiki Kaisha | Image processing apparatus and method and medium |
US6531959B1 (en) | 1999-07-13 | 2003-03-11 | Honda Giken Kogyo Kabushiki Kaisha | Position detecting device |
JP3822770B2 (ja) * | 1999-12-10 | 2006-09-20 | 三菱電機株式会社 | 車両用前方監視装置 |
US20020134151A1 (en) * | 2001-02-05 | 2002-09-26 | Matsushita Electric Industrial Co., Ltd. | Apparatus and method for measuring distances |
JP2004012429A (ja) * | 2002-06-11 | 2004-01-15 | Mitsubishi Heavy Ind Ltd | 自己位置/姿勢同定装置、及び自己位置/姿勢同定方法 |
JP2005025692A (ja) * | 2003-07-04 | 2005-01-27 | Suzuki Motor Corp | 車両用情報提供装置 |
US7561202B2 (en) * | 2003-08-21 | 2009-07-14 | Konica Minolta Opto, Inc. | Image device with lens adjustment for various environmental conditions |
JP2006105640A (ja) | 2004-10-01 | 2006-04-20 | Hitachi Ltd | ナビゲーション装置 |
JP4094604B2 (ja) * | 2004-11-30 | 2008-06-04 | 本田技研工業株式会社 | 車両周辺監視装置 |
DE102006010481A1 (de) * | 2006-03-07 | 2007-09-13 | Robert Bosch Gmbh | Verfahren und Anordnung zur Anzeige von Navigationshinweisen |
JP4420011B2 (ja) * | 2006-11-16 | 2010-02-24 | 株式会社日立製作所 | 物体検知装置 |
DE102007002813A1 (de) * | 2007-01-18 | 2008-07-24 | Robert Bosch Gmbh | Verfahren zum Abschätzen einer Entfernung eines Fahrzeugs zu einem Objekt und Fahrerassistenzsystem |
JP5039765B2 (ja) * | 2009-09-17 | 2012-10-03 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
JP5350297B2 (ja) * | 2010-03-17 | 2013-11-27 | クラリオン株式会社 | 車両姿勢角算出装置及びそれを用いた車線逸脱警報システム |
-
2010
- 2010-09-27 US US13/498,958 patent/US9280824B2/en active Active
- 2010-09-27 JP JP2011534064A patent/JP5538411B2/ja not_active Expired - Fee Related
- 2010-09-27 EP EP10820111.2A patent/EP2485203B1/en active Active
- 2010-09-27 WO PCT/JP2010/005801 patent/WO2011039989A1/ja active Application Filing
- 2010-09-27 CN CN201080043133.7A patent/CN102549631B/zh active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0777431A (ja) * | 1993-09-08 | 1995-03-20 | Sumitomo Electric Ind Ltd | カメラ姿勢パラメータ算出方法 |
JPH1123291A (ja) * | 1997-07-04 | 1999-01-29 | Nissan Motor Co Ltd | 車両用画像処理装置 |
JPH1151645A (ja) * | 1997-08-05 | 1999-02-26 | Honda Motor Co Ltd | 車両用距離測定装置 |
JP3910345B2 (ja) | 1999-07-13 | 2007-04-25 | 本田技研工業株式会社 | 位置検出装置 |
JP2002117392A (ja) * | 2000-10-06 | 2002-04-19 | Nissan Motor Co Ltd | 車間距離推定装置 |
JP3820874B2 (ja) | 2000-11-22 | 2006-09-13 | 日産自動車株式会社 | 車両用画像処理装置 |
JP2002259995A (ja) | 2001-03-06 | 2002-09-13 | Nissan Motor Co Ltd | 位置検出装置 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101377204B1 (ko) * | 2012-09-03 | 2014-04-01 | 한국과학기술원 | 노면정보 제공 시스템 및 방법과 이를 사용한 차량 |
JP2015067220A (ja) * | 2013-09-30 | 2015-04-13 | 日産自動車株式会社 | 車両用加速抑制装置及び車両用加速抑制方法 |
KR101610521B1 (ko) * | 2014-10-13 | 2016-04-07 | 현대자동차주식회사 | 시선 추적 장치 및 방법 |
Also Published As
Publication number | Publication date |
---|---|
JP5538411B2 (ja) | 2014-07-02 |
EP2485203A4 (en) | 2016-10-26 |
US20120182426A1 (en) | 2012-07-19 |
EP2485203B1 (en) | 2020-02-19 |
US9280824B2 (en) | 2016-03-08 |
EP2485203A1 (en) | 2012-08-08 |
CN102549631B (zh) | 2014-11-12 |
CN102549631A (zh) | 2012-07-04 |
JPWO2011039989A1 (ja) | 2013-02-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5538411B2 (ja) | 車両周囲監視装置 | |
KR102313026B1 (ko) | 차량 및 차량 후진 시 충돌방지 보조 방법 | |
JP4832321B2 (ja) | カメラ姿勢推定装置、車両、およびカメラ姿勢推定方法 | |
JP5962771B2 (ja) | 移動物体位置姿勢角推定装置及び移動物体位置姿勢角推定方法 | |
JP5752729B2 (ja) | 車間距離算出装置およびその動作制御方法 | |
US9586455B2 (en) | Road surface condition estimating apparatus | |
US7551067B2 (en) | Obstacle detection system | |
JP4670528B2 (ja) | 撮像装置のずれ検出方法、撮像装置のずれ補正方法及び撮像装置 | |
WO2017154389A1 (ja) | 画像処理装置、撮像装置、移動体機器制御システム、画像処理方法、及びプログラム | |
CN103383728B (zh) | 使用环视系统的全速车道感测 | |
JP6241172B2 (ja) | 自車位置推定装置及び自車位置推定方法 | |
WO2022153795A1 (ja) | 信号処理装置、信号処理方法及び信号処理システム | |
JP2018136739A (ja) | キャリブレーション装置 | |
US20200265588A1 (en) | Road surface area detection device | |
JP2011215769A (ja) | 車線位置検出装置および車線位置検出方法 | |
JP5083142B2 (ja) | 車両周辺監視装置 | |
JPH0981757A (ja) | 車両位置検出装置 | |
JP4066942B2 (ja) | 車線逸脱防止装置 | |
JP7411108B2 (ja) | 車両姿勢推定システムおよび車両姿勢推定方法 | |
JP4847303B2 (ja) | 障害物検出方法、障害物検出プログラムおよび障害物検出装置 | |
KR20120025290A (ko) | 목표물 추적 장치 및 방법 | |
JP2021196826A (ja) | 安全支援システム、および車載カメラ画像分析方法 | |
JP2020087210A (ja) | キャリブレーション装置及びキャリブレーション方法 | |
WO2021106510A1 (ja) | 障害物認識装置 | |
JP5724570B2 (ja) | 走行支援装置及び走行支援方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080043133.7 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10820111 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011534064 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13498958 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010820111 Country of ref document: EP |