WO2016047037A1 - Vehicle image processing device (車両用画像処理装置) - Google Patents

Vehicle image processing device (車両用画像処理装置)

Info

Publication number
WO2016047037A1
WO2016047037A1 (PCT/JP2015/004256)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
traveling direction
acquires
acceleration
Prior art date
Application number
PCT/JP2015/004256
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
賢治 小原
Original Assignee
株式会社デンソー (DENSO Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (DENSO Corporation)
Priority to DE112015004341.1T (published as DE112015004341B4)
Priority to CN201580037396.XA (published as CN106664392A)
Publication of WO2016047037A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: Real-time viewing arrangements for viewing an area outside the vehicle, with a predetermined field of view
    • B60R1/27: Real-time viewing arrangements for viewing an area outside the vehicle, providing all-round vision, e.g. using omnidirectional cameras
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181: Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of image processing

Definitions

  • the present disclosure relates to an image processing apparatus for a vehicle that displays an image taken around the vehicle on a display device.
  • There is a known technique in which the situation around a vehicle is photographed by an in-vehicle camera, a plurality of images captured by the camera are combined, and the combined image is displayed on a display device. Specifically, by combining the current image being captured by the in-vehicle camera with a past image based on previously captured images to form a vehicle peripheral image, a portion of the vehicle periphery that has entered a blind spot can be displayed continuously on the display device (for example, Patent Document 1).
  • the past image is moved according to the movement of the vehicle. That is, in order to combine the current image and the past image, it is necessary to acquire the amount of movement of the vehicle. As an amount related to the movement of the vehicle, the vehicle speed can be acquired based on the vehicle speed pulse.
  • the traveling direction of the vehicle cannot be determined based on the vehicle speed pulse.
  • a method of acquiring the traveling direction of the vehicle based on the vehicle shift position is conceivable.
  • When the vehicle is inclined on a road surface gradient, it is conceivable that the vehicle moves backward despite the shift position being drive, or moves forward despite the shift position being reverse.
  • In that case, the actual movement direction of the vehicle differs from the movement direction applied to the past image during image composition, and a deviation occurs between the current image and the past image.
  • An image processing apparatus for a vehicle is mounted on a vehicle that includes an imaging device capturing a predetermined range around the vehicle and a display device displaying the captured image. The apparatus includes: a first acquisition device that acquires, as the traveling state at the time of capture, the vehicle speed based on a vehicle speed pulse; a storage device that stores the captured image as a past image; an image creation device that, based on the traveling state acquired by the first acquisition device, combines a past image stored in the storage device with a current image captured by the imaging device to create a display image for the display device; a second acquisition device that acquires the actual traveling mode of the vehicle; and a direction determination device that determines the traveling direction of the vehicle from the acquired value of the second acquisition device.
  • the image creation device synthesizes the past image and the current image based on the traveling direction determined by the direction determination device in addition to the traveling state acquired by the first acquisition device.
  • That is, the traveling direction is determined based on the actual traveling mode of the vehicle, and image composition is carried out using this traveling direction in addition to the vehicle speed acquired from the vehicle speed pulse.
  • the current image and the past image can be suitably combined.
  • FIG. 1 is a block diagram of an image display system.
  • FIG. 2 is a diagram showing a camera and its shooting range.
  • FIG. 3 is a diagram showing a perspective transformation method.
  • FIG. 4 is a flowchart showing image composition processing.
  • FIG. 5 is a flowchart showing the determination process of the traveling direction based on the acceleration.
  • The present image display system is mounted on a vehicle and includes a front camera 11 and a rear camera 12 as imaging devices, an image processing unit 13 to which the imaging data of these cameras 11 and 12 is input, and an in-vehicle monitor 14 as a display device that displays the display image created by the image processing unit 13 from that data.
  • Cameras 11 and 12 capture predetermined ranges around the vehicle. Specifically, the front camera 11 and the rear camera 12 are attached to the front and rear of the vehicle, respectively, with the front camera 11 capturing a predetermined range ahead of the vehicle and the rear camera 12 a predetermined range behind it.
  • Each of these cameras 11 and 12 is a wide-angle camera capable of photographing in a wide-angle range. For example, the cameras 11 and 12 can photograph at a viewing angle of 180 degrees in the front and rear of the vehicle.
  • Each of the cameras 11 and 12 is a digital imaging system camera that performs imaging using a CCD image sensor or a CMOS image sensor.
  • the front camera 11 is attached to the front side portion of the roof portion
  • the rear camera 12 is attached to the rear side portion of the roof portion.
  • the front camera 11 captures the front range R1
  • the rear camera 12 captures the rear range R2.
  • the front range R1 and the rear range R2 are arranged in the front-rear direction of the vehicle C.
  • both the front camera 11 and the rear camera 12 are arranged on the same line (the center line X1) extending in the front-rear direction of the vehicle C, and photograph the front and rear of the vehicle with the center line X1 as the photographing center.
  • each of the cameras 11 and 12 has its own respective image sensor.
  • the in-vehicle monitor 14 is provided in a position that is visible to the driver while driving, for example, an instrument panel portion in the vehicle interior.
  • Although the shape and size of the in-vehicle monitor 14 may be arbitrary, in the present embodiment a vertically long display area is set on the display screen, since the vehicle peripheral image is displayed in a range that includes both the front and rear regions around the host vehicle.
  • The in-vehicle monitor 14 can also display information other than the vehicle periphery image. When the vehicle periphery image is not displayed, the display screen may show an image of the area ahead of the vehicle, or various information other than the captured image, such as a navigation image based on the position of the vehicle C acquired by the GPS sensor 15.
  • the image processing unit 13 is an image processing apparatus that creates a display image by combining the shooting data of the cameras 11 and 12 and displays the display image on the in-vehicle monitor 14.
  • The image processing unit 13 includes a first conversion circuit 21 that receives the imaging data of the front camera 11 and a second conversion circuit 22 that receives the imaging data of the rear camera 12. Each circuit performs a perspective transformation on its camera's imaging data to create a bird's-eye view image.
  • the bird's-eye view image is a bird's-eye view image in a state in which the shooting range of each camera 11, 12 is looked down vertically from the sky position.
  • a front bird's-eye image and a rear bird's-eye image are created by the conversion circuits 21 and 22, respectively.
  • Each of the cameras 11 and 12 captures the front or rear of the vehicle at every predetermined time, so the conversion circuits 21 and 22 yield a bird's-eye image of the vehicle front and a bird's-eye image of the vehicle rear at each predetermined time.
  • the image data of the bird's-eye view image created by the conversion circuits 21 and 22 is input to the CPU 23.
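As a loose illustration of the perspective transformation described above (not taken from the patent; the homography values and function name are hypothetical), a bird's-eye view can be produced by applying a 3x3 planar homography that maps camera pixels onto the ground plane:

```python
import numpy as np

def warp_point(H, u, v):
    """Map camera pixel (u, v) to ground-plane coordinates (x, y)
    with a 3x3 homography H (projective division included)."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical homography: recenters on the principal point (320, 240)
# and adds a perspective term so lower pixels map closer to the camera.
H = np.array([
    [1.0, 0.0,   -320.0],
    [0.0, 1.0,   -240.0],
    [0.0, 0.005,    1.0],
])

x, y = warp_point(H, 320.0, 240.0)  # the principal point maps to the origin
```

Warping every pixel of the camera frame through such a homography (in practice, resampling the source image through its inverse) yields the kind of bird's-eye image the conversion circuits 21 and 22 produce in hardware.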
  • the image processing unit 13 creates a vehicle peripheral image based on the front bird's-eye image and the rear bird's-eye image. This vehicle peripheral image is created as a bird's-eye view image, similar to the image after conversion by the conversion circuits 21 and 22.
  • Since the range R3 outside the camera field of view lies in the vicinity of the host vehicle (at its sides), that range is filled in with a past image: either an image previously captured by the front camera 11 (front bird's-eye view image) or one previously captured by the rear camera 12 (rear bird's-eye view image).
  • the image processing unit 13 includes an image memory 24 as a storage device that stores the past image.
  • the vehicle C includes a vehicle speed sensor 16 that detects the vehicle speed, and a yaw rate sensor 17 that detects a yaw rate (a change speed of the rotation angle in the turning direction).
  • the detection signals of the vehicle speed sensor 16 and the yaw rate sensor 17 are sequentially input to the CPU 23.
  • The image processing unit 13, as the first acquisition device, calculates and acquires the vehicle speed and yaw rate, which constitute the running state of the vehicle C, from the detection signals (vehicle speed pulse and yaw rate signal) of the sensors 16 and 17. In short, it suffices that the traveling state of the vehicle C at the time of imaging can be grasped; the moving distance and rotation angle of the vehicle C, or its position and orientation, may be calculated instead.
  • At predetermined time intervals, the image processing unit 13 stores the past image of the front bird's-eye view and the past image of the rear bird's-eye view in association with the traveling information (vehicle speed and yaw rate) of the vehicle C at the time each image was captured.
  • the CPU 23 as an image creation device creates a vehicle peripheral image
  • Past image is read from the image memory 24.
  • a vehicle peripheral image is created from the current captured images (front current image and rear current image) by the cameras 11 and 12 and the stored past image.
  • a captured image (rear past image) captured in the past by the rear camera 12 is read from the image memory 24 as an image in the range R3 outside the camera field of view.
  • a vehicle peripheral image is created from the current captured image (front current image and rear current image) by the cameras 11 and 12 and the rear past image.
  • the image processing unit 13 arranges the front current image, the rear current image, and the front past image (or rear past image) side by side in the vehicle front-rear direction on the same plane and connects them. At this time, based on the current traveling state (vehicle speed and yaw rate) of the vehicle C and the traveling state when the past image was captured, the past image is translated and rotated (a Euclidean transformation), and the images are synthesized into a single image.
  • the past image is translated and rotated in accordance with the movement amount of the vehicle C from the time when the past image was taken to the present, and the current image and the past image are synthesized.
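A minimal sketch of that Euclidean (rigid) transformation, assuming a straight-line motion model over one frame interval; the function name and signature are illustrative, not from the patent. Because the speed from the vehicle speed pulse is unsigned, the determined traveling direction supplies the sign:

```python
import math

def past_image_transform(speed_mps, yaw_rate_rps, dt_s, forward):
    """Translation (dx, dy) and rotation dtheta, in vehicle coordinates,
    to apply to the past bird's-eye image so it aligns with the current one.
    speed_mps comes from the vehicle speed pulse (non-negative), so the
    determined traveling direction 'forward' supplies the sign."""
    sign = 1.0 if forward else -1.0
    dtheta = yaw_rate_rps * dt_s      # heading change over the interval
    dist = sign * speed_mps * dt_s    # signed distance travelled
    dx = dist * math.cos(dtheta)
    dy = dist * math.sin(dtheta)
    return dx, dy, dtheta

# Rolling backward at 2 m/s with no yaw for one 33 ms frame:
dx, dy, dth = past_image_transform(2.0, 0.0, 0.033, forward=False)
```

The resulting (dx, dy, dtheta) is exactly the rigid transform applied to the past image before it is stitched to the current images.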
  • the vehicle speed sensor 16 used for calculating the movement amount of the vehicle C consists of a gear-shaped arrangement of magnets on the propeller shaft or drive shaft of the vehicle C and a magnetic sensor (pickup) that detects the resulting change in magnetic field.
  • the rotation of the propeller shaft and drive shaft is output as a vehicle speed pulse.
  • the vehicle speed pulse does not include information on the rotation direction of the propeller shaft or the drive shaft, that is, the traveling direction of the vehicle C.
  • the value detected by the yaw rate sensor 17 also does not include information about the traveling direction of the vehicle C. That is, in order to calculate the movement amount of the vehicle C, it is necessary to acquire the traveling direction of the vehicle C using another input value different from the detection values of the vehicle speed sensor 16 and the yaw rate sensor 17.
  • a method of determining the traveling direction based on the position (shift position) of the shift lever 19 of the vehicle C is conceivable.
  • the transmission mechanism of the vehicle C in the present embodiment is an automatic transmission.
  • When the vehicle C is inclined on a road surface gradient, it may move backward despite the shift position being drive, or forward despite the shift position being reverse. For this reason, acquiring the traveling direction of the vehicle C from the shift position alone may cause problems in image composition.
  • the image processing unit 13 as the second acquisition device acquires the acceleration of the vehicle C detected by the acceleration sensor 18 (acceleration detection device) in addition to the shift position.
  • the acceleration sensor 18 uses its output value when the vehicle is stopped (the gravity component) as an offset: when the vehicle C is accelerating forward it outputs the offset plus a positive value (forward acceleration), and when accelerating backward it outputs the offset plus a negative value (backward acceleration). The image processing unit 13 therefore acquires, as the acceleration of the vehicle C, the value obtained by subtracting the offset (gravitational acceleration) from the output value of the acceleration sensor 18.
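That offset subtraction is a one-liner; a sketch with hypothetical names (the patent gives no code):

```python
def vehicle_acceleration(sensor_output, stopped_offset):
    """Longitudinal acceleration of the vehicle: the raw accelerometer
    output minus the offset (gravity component) sampled while stopped.
    Positive -> accelerating forward, negative -> accelerating backward."""
    return sensor_output - stopped_offset

a = vehicle_acceleration(10.2, 9.8)  # reading slightly above the stopped offset
```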
  • the image processing unit 13, as the direction determination device, determines the traveling direction of the vehicle C based on the acceleration acquired by the second acquisition device, and image composition is then performed based on that traveling direction.
  • the sampling cycle of the acceleration sensor 18 is 50 ms, which is about the same as the display image update cycle (33 ms).
  • FIG. 4 is a flowchart showing the image composition processing in the present embodiment. This process is periodically performed by the image processing unit 13.
  • In step S01, it is determined whether or not a vehicle speed pulse is input from the vehicle speed sensor 16. If no vehicle speed pulse is input (S01: NO), the process ends without performing image composition. If a vehicle speed pulse is input (S01: YES), it is determined in step S02 whether the inclination of the vehicle C (road gradient) is a steep slope of at least a predetermined magnitude (-10% or less, or +10% or more).
  • the inclination of the vehicle C can be calculated based on the gravitational acceleration acquired as an offset (DC component) of the acceleration sensor 18 as the inclination detecting device.
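One way to compute that inclination from the DC component, assuming the offset is the longitudinal gravity component g·sin(θ) and expressing the road gradient as the flowchart does (percent, steep at ±10% or more); names and everything beyond the ±10% figure are illustrative:

```python
import math

G = 9.80665  # standard gravity, m/s^2

def road_grade_percent(dc_offset_mps2):
    """Road gradient in percent from the accelerometer's DC offset:
    offset = G * sin(theta), gradient = 100 * tan(theta)."""
    theta = math.asin(max(-1.0, min(1.0, dc_offset_mps2 / G)))
    return 100.0 * math.tan(theta)

def is_steep(dc_offset_mps2):
    """Steep-slope test of step S02: gradient <= -10% or >= +10%."""
    return abs(road_grade_percent(dc_offset_mps2)) >= 10.0
```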
  • When the inclination of the vehicle C is a steep slope of at least the predetermined magnitude (S02: YES), it is determined in step S03 whether the vehicle speed acquired from the vehicle speed pulse belongs to a predetermined range (10 km/h or less).
  • If the vehicle speed belongs to the predetermined range (S03: YES), the variance of the acceleration detected by the acceleration sensor 18 is acquired in step S04, and it is determined whether the variance is equal to or less than a predetermined value. If it is, the traveling direction is determined based on the acceleration in step S05. Details of this determination process are described later.
  • Otherwise, the traveling direction of the vehicle C is acquired based on the shift position in step S06: when the shift position is drive, the vehicle C is determined to be moving forward, and when it is reverse, moving backward.
  • In step S07, image composition is performed based on the vehicle speed acquired from the vehicle speed sensor 16, the yaw rate acquired from the yaw rate sensor 17, and the traveling direction of the vehicle C acquired in step S05 or S06. The process then ends.
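The branching of FIG. 4 reduces to choosing which input decides the traveling direction. A compact sketch (the step numbers in the comments refer to the flowchart; the function itself is not from the patent):

```python
def choose_direction_source(has_speed_pulse, steep_slope, low_speed, low_variance):
    """Return which input determines the traveling direction,
    or None when no image composition is performed at all."""
    if not has_speed_pulse:                         # S01: NO -> end
        return None
    if steep_slope and low_speed and low_variance:  # S02-S04 all YES
        return "acceleration"                       # -> S05
    return "shift_position"                         # -> S06

source = choose_direction_source(True, True, True, True)  # -> "acceleration"
```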
  • FIG. 5 is a flowchart showing the determination of the traveling direction based on the detection value of the acceleration sensor 18.
  • In step S11, it is determined whether or not a predetermined time has elapsed since the vehicle C started. If it has not (S11: NO), it is determined in step S12 whether the acceleration detected by the acceleration sensor 18 is a positive value. If it is positive (S12: YES), it is determined in step S13 that the vehicle C is moving forward; if it is negative (S12: NO), it is determined in step S14 that the vehicle C is moving backward. That is, when the vehicle starts from a stopped state, a positive acceleration means forward movement and a negative acceleration means backward movement.
  • If the predetermined time has elapsed since the vehicle C started, it is determined in step S15 whether the vehicle speed has increased. If the vehicle speed is increasing (S15: YES), it is determined in step S16 whether the acceleration is positive: if positive (S16: YES), step S17 determines that the vehicle C is moving forward; if negative (S16: NO), step S18 determines that it is moving backward. If the vehicle speed is decreasing (S15: NO), it is determined in step S19 whether the acceleration is positive: if positive (S19: YES), step S20 determines that the vehicle C is moving backward; if negative (S19: NO), step S21 determines that it is moving forward.
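The whole of FIG. 5 collapses to a sign table: the acceleration sign maps directly to the direction while the speed is rising (or right after starting), and inverts while the speed is falling. A sketch, with hypothetical boolean inputs standing in for steps S11, S15, and the sign checks:

```python
def traveling_direction(just_started, speed_increasing, accel_positive):
    """'forward' or 'backward' per FIG. 5. The vehicle speed from the
    pulse is an absolute value, so its trend plus the acceleration sign
    disambiguates the direction."""
    if just_started:          # S11: predetermined time not yet elapsed
        return "forward" if accel_positive else "backward"  # S12-S14
    if speed_increasing:      # S15: YES
        return "forward" if accel_positive else "backward"  # S16-S18
    # Speed decreasing (S15: NO): e.g. braking while rolling backward
    # shows decreasing |speed| with positive acceleration, so signs invert.
    return "backward" if accel_positive else "forward"      # S19-S21
```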
  • This inversion is needed because the vehicle speed obtained from the vehicle speed pulse is an absolute value. For example, when the vehicle C rolls backward down a slope, the measured vehicle speed increases while the acceleration becomes a negative value; when the driver then operates the brake pedal, the vehicle speed decreases while the acceleration becomes a positive value. By reading the sign of the acceleration together with the trend of the absolute vehicle speed in this way, the traveling direction can be correctly determined while the vehicle C continues to move forward or backward on the slope.
  • As described above, the traveling direction is determined based on the actual traveling mode of the vehicle C, and image composition is carried out using this traveling direction in addition to the vehicle speed acquired from the vehicle speed pulse.
  • the current image and the past image can be suitably combined.
  • both the traveling direction determined based on the actual traveling mode and the traveling direction acquired based on the shift position are used.
  • the current image and the past image can be combined more suitably.
  • In situations other than the above, image composition is performed based on the traveling direction acquired from the shift position. With this configuration, image composition always uses the more reliable determination of the traveling direction.
  • When the inclination of the vehicle C, that is, the road gradient, is steep, image composition is performed based on the actual traveling direction determined by the image processing unit 13 as the direction determination device.
  • image synthesis using the traveling direction acquired based on the acceleration sensor 18 is performed in a situation where the reliability of the traveling direction acquired based on the shift position is low. This makes it possible to perform more suitable image composition.
  • the image composition is performed based on the traveling direction determined by the image processing unit 13 as the direction determining device.
  • the image processing unit 13 of the above embodiment determines the actual traveling direction of the vehicle C using the detection value of the acceleration sensor 18. As a modification, the image processing unit may acquire the position of the vehicle C from the GPS signals received by the GPS sensor 15 (FIG. 1) as a position acquisition device, and determine the actual traveling direction of the vehicle C from the change in that position.
  • Alternatively, the image processing unit 13 may determine the actual traveling direction of the vehicle C by self-position estimation with a Kalman filter, using the detection values of the GPS sensor 15, the vehicle speed sensor 16, the yaw rate sensor 17, and the acceleration sensor 18 as inputs.
  • a configuration in which image composition is performed with priority given to the traveling direction acquired based on the shift position is preferable.
  • the determination of the traveling direction based on the acceleration may be performed regardless of the inclination of the vehicle C and the vehicle speed.
  • each section is expressed as S01, for example.
  • each section can be divided into a plurality of subsections, while a plurality of sections can be combined into one section.
  • each section configured in this manner can be referred to as a device, module, or means.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
PCT/JP2015/004256 2014-09-24 2015-08-25 Vehicle image processing device WO2016047037A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112015004341.1T DE112015004341B4 (de) 2014-09-24 2015-08-25 Fahrzeug-bildverarbeitungsvorrichtung
CN201580037396.XA CN106664392A (zh) 2014-09-24 2015-08-25 车辆用图像处理装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-193542 2014-09-24
JP2014193542A JP6361415B2 (ja) 2014-09-24 2014-09-24 Vehicle image processing device

Publications (1)

Publication Number Publication Date
WO2016047037A1 true WO2016047037A1 (ja) 2016-03-31

Family

ID=55580591

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/004256 WO2016047037A1 (ja) 2014-09-24 2015-08-25 Vehicle image processing device

Country Status (4)

Country Link
JP (1) JP6361415B2
CN (1) CN106664392A
DE (1) DE112015004341B4
WO (1) WO2016047037A1

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7130994B2 (ja) * 2018-03-12 2022-09-06 オムロン株式会社 (OMRON Corporation) In-vehicle device, reverse travel determination method, and reverse travel determination program
US10567724B2 * 2018-04-10 2020-02-18 GM Global Technology Operations LLC Dynamic demosaicing of camera pixels
JP7184591B2 (ja) * 2018-10-15 2022-12-06 三菱重工業株式会社 (Mitsubishi Heavy Industries, Ltd.) Vehicle image processing device, vehicle image processing method, program, and storage medium
CN115742962B (zh) * 2022-11-25 2024-05-14 重庆长安汽车股份有限公司 (Chongqing Changan Automobile Co., Ltd.) Panoramic image control method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006321394A (ja) * 2005-05-19 2006-11-30 Aisin Aw Co Ltd Parking assistance device
JP2006327499A (ja) * 2005-05-27 2006-12-07 Aisin Aw Co Ltd Parking assistance method and parking assistance device
JP2007099261A (ja) * 2005-09-12 2007-04-19 Aisin Aw Co Ltd Parking assistance method and parking assistance device
JP2007114020A (ja) * 2005-10-19 2007-05-10 Aisin Aw Co Ltd Vehicle travel distance detection method and device, and vehicle current position detection method and device
JP2011152865A (ja) * 2010-01-27 2011-08-11 Kyocera Corp In-vehicle imaging device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3722487B1 (ja) * 2004-05-19 2005-11-30 本田技研工業株式会社 (Honda Motor Co., Ltd.) Vehicle travel lane marking recognition device
JP4321543B2 (ja) 2006-04-12 2009-08-26 トヨタ自動車株式会社 (Toyota Motor Corporation) Vehicle periphery monitoring device
JP5500369B2 (ja) * 2009-08-03 2014-05-21 アイシン精機株式会社 (Aisin Seiki Co., Ltd.) Vehicle periphery image generation device
JP5858650B2 (ja) * 2011-06-08 2016-02-10 富士通テン株式会社 (Fujitsu Ten Ltd.) Image generation device, image display system, and image generation method
JP5790335B2 (ja) 2011-09-01 2015-10-07 株式会社デンソー (DENSO Corporation) Vehicle periphery image display control device


Also Published As

Publication number Publication date
DE112015004341B4 (de) 2022-03-10
CN106664392A (zh) 2017-05-10
JP2016066855A (ja) 2016-04-28
DE112015004341T5 (de) 2017-06-08
JP6361415B2 (ja) 2018-07-25

Similar Documents

Publication Publication Date Title
CN108496178B (zh) System and method for estimating a future path
CN105075247B (zh) Vehicle control device and storage medium
JP4863791B2 (ja) Vehicle periphery image generation device and image switching method
JP6477562B2 (ja) Information processing device
JP2002373327A (ja) Vehicle periphery image processing device and recording medium
CN106463062A (zh) Vehicle periphery image generation device and method
JP6375633B2 (ja) Vehicle periphery image display device and vehicle periphery image display method
JP6597415B2 (ja) Information processing device and program
JP7000383B2 (ja) Image processing device and image processing method
JP7426174B2 (ja) Vehicle surroundings image display system and vehicle surroundings image display method
WO2016047037A1 (ja) Vehicle image processing device
WO2015133072A1 (ja) Vehicle periphery image display device and vehicle periphery image display method
JP2020086956A (ja) Imaging abnormality diagnosis device
JP4932293B2 (ja) Obstacle recognition device
JP2023174682A (ja) Vehicle video processing device and vehicle video processing method
JP4857159B2 (ja) Vehicle driving support device
CN111788088A (zh) Vehicle display control device and display control method
JP7573377B2 (ja) Parking assistance device and parking assistance method
JP5943207B2 (ja) Video display device for supporting vehicle parking operation
JP2001341600A (ja) Parking assistance device
JP5132796B2 (ja) Vehicle periphery image generation device and image switching method
JP4557712B2 (ja) Driving support system
CN108353147B (zh) Display control device and non-transitory computer-readable storage medium storing a display control program
JP2018203090A (ja) Parking assistance device, parking assistance system, parking assistance method and program
JP2013015882A (ja) Vehicle image recognition device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15843645

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112015004341

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15843645

Country of ref document: EP

Kind code of ref document: A1