WO2011013813A1 - In-vehicle device and image processing program - Google Patents

In-vehicle device and image processing program

Info

Publication number
WO2011013813A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
steering angle
image
host vehicle
imaging signal
Prior art date
Application number
PCT/JP2010/062927
Other languages
English (en)
Japanese (ja)
Inventor
秀一 小原
Original Assignee
Clarion Co., Ltd. (クラリオン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Clarion Co., Ltd. (クラリオン株式会社)
Priority to JP2011524862A (published as JPWO2011013813A1)
Publication of WO2011013813A1

Links

Images

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Such arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: Such arrangements with a predetermined field of view
    • B60R1/26: Such arrangements with a predetermined field of view to the rear of the vehicle
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • G08G1/165: Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • B60R2300/00: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30: Such details characterised by the type of image processing
    • B60R2300/304: Such details using merged images, e.g. merging a camera image with stored images
    • B60R2300/305: Such details merging a camera image with lines or icons
    • B60R2300/80: Such details characterised by the intended use of the viewing arrangement
    • B60R2300/806: Such details for aiding parking

Definitions

  • The present invention relates to an in-vehicle device and an image processing program.
  • Patent Literature 1 describes a driving assistance display device that superimposes, on the image captured by a wide-angle camera, guide lines including a line indicating the vehicle width and a line indicating distance along the road surface from the rear end of the vehicle.
  • In addition to the vehicle-width display of Patent Literature 1, some in-vehicle devices also display a guide indicating the magnitude of the host vehicle's steering angle as acquired by a steering angle sensor.
  • The in-vehicle device includes: an imaging unit that images the periphery of the host vehicle and outputs an imaging signal; a video display unit that displays video based on the imaging signal on a display device; a steering angle detection unit that detects the steering angle of the host vehicle based on the imaging signal; and an image display unit that displays, on the display device and based on the detected steering angle, an image representing the vehicle width and steering angle of the host vehicle.
  • Preferably, the steering angle detection unit detects the steering angle from the horizontal deviation between the latest image acquired from the imaging signal and the immediately preceding image.
  • Preferably, the image display unit displays on the display device a straight line representing the vehicle width of the host vehicle and a polygon representing the steering angle.
  • Preferably, the image display unit displays the straight line in a color that differs from the color of the region of the video behind the line by at least a predetermined amount in at least one of hue, saturation, and brightness.
  • Preferably, the image display unit displays on the display device, superimposed on the video, both the image representing the vehicle width and steering angle of the host vehicle and an image representing the ground height of the host vehicle's bumper.
  • The in-vehicle device according to any one of the first to fifth aspects preferably further includes a speed detection unit that detects the speed of the host vehicle based on the imaging signal.
  • When the detected speed exceeds a predetermined speed, the image display unit does not display the image representing the vehicle width and steering angle on the display device.
  • The image processing program causes a computer to execute: an imaging step in which an imaging signal obtained by imaging the periphery of the host vehicle is input; a video display step of displaying video based on the imaging signal on a display device; a steering angle detection step of detecting the steering angle of the host vehicle based on the imaging signal; and an image display step of displaying, superimposed on the video, an image representing the vehicle width and steering angle of the host vehicle based on the detected steering angle.
  • According to the present invention, the vehicle width of the host vehicle and the magnitude of its steering angle can easily be grasped from the screen.
  • FIG. 1 is a block diagram showing the overall configuration of the in-vehicle device in the present embodiment.
  • The in-vehicle device 1 is mounted on the vehicle but is not connected to it except for a power supply line that provides operating power. It therefore cannot receive the various signals output by the vehicle, such as vehicle speed pulses, and has no network connection such as a CAN (Controller Area Network).
  • Hereinafter, the vehicle on which the in-vehicle device 1 is mounted is referred to as the host vehicle.
  • the in-vehicle device 1 includes a control circuit 11 that controls each unit.
  • the control circuit 11 is connected to a ROM 12, a RAM 13, a rear camera 14, an input device 15, a liquid crystal monitor 16, and a speaker 17.
  • the control circuit 11 is composed of a microprocessor and its peripheral circuits.
  • the control circuit 11 is a computer that executes a predetermined control program stored in the ROM 12 using the RAM 13 as a work area and performs various controls.
  • The rear camera 14 is installed at the rear of the host vehicle. It images the area behind and below the host vehicle at, for example, 30 frames per second, and outputs an NTSC or similar imaging signal to the control circuit 11.
  • The control circuit 11 converts this imaging signal into image data by a known technique.
  • The input device 15 is, for example, a touch panel or a remote controller.
  • The liquid crystal monitor 16 displays images and other content on its screen based on the image data output from the control circuit 11.
  • The speaker 17 emits sound to the occupants of the host vehicle based on the audio signal output from the control circuit 11.
  • The control circuit 11 includes a video display unit 11a, a detection unit 11b, and an image display unit 11c. Each of these functional units is realized in software when the control circuit 11 executes the control program stored in the ROM 12.
  • The video display unit 11a displays the video captured by the rear camera 14 on the liquid crystal monitor 16. Since the rear camera 14 captures 30 frames per second, the video display unit 11a likewise updates the liquid crystal monitor 16 at 30 frames per second, by outputting image data based on the imaging signal from the rear camera 14 to the liquid crystal monitor 16.
  • The detection unit 11b analyzes the image data based on the imaging signal from the rear camera 14 and detects the current steering angle and vehicle speed of the host vehicle. The detection processing is described in detail later.
  • Based on the steering angle detected by the detection unit 11b, the image display unit 11c displays, on the liquid crystal monitor 16, a straight line representing the vehicle width of the host vehicle and a polygon representing its steering angle. These lines and polygons are superimposed on the video displayed by the video display unit 11a.
  • FIG. 2 is a diagram showing an installation location and an imaging range of the rear camera 14.
  • FIG. 2A is a top view of the host vehicle 2.
  • FIG. 2B is a side view of the host vehicle 2.
  • The rear camera 14 is installed at the rear of the host vehicle 2 and images the range indicated by symbol A in the figure, that is, the area behind and below the host vehicle 2.
  • The rear camera 14 may be installed anywhere that allows the periphery of the host vehicle 2 to be imaged. Information such as the vehicle width and the camera's mounting position and angle is given to the in-vehicle device 1 in advance by a known calibration method or the like.
  • Next, the display screen of the liquid crystal monitor 16 will be described.
  • On the screen, the video captured by the rear camera 14 is shown by the video display unit 11a, and straight lines and a polygon are superimposed on it by the image display unit 11c.
  • The display produced by the image display unit 11c is described below.
  • FIG. 3 is a diagram showing a display example of the liquid crystal monitor 16 when the host vehicle 2 is traveling straight backward.
  • FIG. 3A shows a display screen SC1 of the liquid crystal monitor 16 when the host vehicle 2 goes straight backward.
  • the background of the screen SC1 is an image captured by the rear camera 14.
  • The image display unit 11c displays two straight lines LL and LR representing the vehicle width of the host vehicle 2, superimposed on the video.
  • FIG. 3B projects the straight lines LL and LR displayed on the screen SC1 onto a top view of the host vehicle 2. No such lines exist on the actual road surface. As FIG. 3B shows, the lines LL and LR indicate where the left and right ends of the vehicle body will pass if the host vehicle 2 reverses at the current steering angle. By checking the positions of the lines LL and LR on the liquid crystal monitor 16, the driver can grasp the positional relationship between obstacles on the road and the host vehicle 2.
  • The video display unit 11a displays the video captured by the rear camera 14 mirror-reversed left to right. That is, the direction indicated by reference sign D1 in FIG. 3A is the right direction as seen by the driver of the host vehicle 2. This aligns the left-right sense of the liquid crystal monitor 16 with that of the rearview mirror.
  • The image display unit 11c chooses the colors of the straight lines LL and LR according to the color of the video behind them. For example, when a concrete road surface fills the screen, gray lines would be hard to see. The image display unit 11c therefore draws the lines LL and LR in the complementary color of the background. Instead of the exact complement, colors differing in hue by at least a predetermined amount may be used, or colors differing by at least a predetermined amount in saturation or brightness.
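The complementary-color rule above can be sketched as follows. This is an illustrative reconstruction rather than the patent's implementation: the round trip through HSV and the 180-degree hue shift are one common way to compute a complement, and the function name is hypothetical.

```python
import colorsys

def complementary_color(r, g, b):
    """Return the complement (hue shifted by 180 degrees) of an 8-bit RGB
    color, e.g. the average road-surface color behind a guide line."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    h = (h + 0.5) % 1.0  # 180-degree hue shift
    rc, gc, bc = colorsys.hsv_to_rgb(h, s, v)
    return round(rc * 255), round(gc * 255), round(bc * 255)
```

Note that a pure hue shift leaves an achromatic background (such as gray concrete) unchanged, which is exactly the case where the text falls back to saturation or brightness offsets instead.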
  • FIG. 4 is a diagram showing a display example of the liquid crystal monitor 16 when the host vehicle 2 travels while turning the steering wheel backward.
  • FIG. 4A shows a display screen SC2 of the liquid crystal monitor 16
  • FIG. 4B shows an overhead view of the host vehicle 2.
  • When the steering angle of the host vehicle 2 is, for example, the angle A1 shown in FIG. 4B, the left and right ends of the host vehicle 2 no longer follow the lines LL and LR parallel to the vehicle as in FIG. 3. Instead, they follow lines angled from LL and LR according to the steering angle, such as the lines LL′ and LR′ shown in FIG. 4A.
  • The image display unit 11c therefore displays the straight lines LL′ and LR′ angled according to the steering angle.
  • The image display unit 11c further draws, at whichever of the two lines LL′ and LR′ swings toward the inside of the turn, a triangle corresponding to the steering angle. Specifically, as shown in FIG. 4A, the triangle S is drawn with a line pattern between the position where the line LL would have been and the line LL′. By looking at the triangle S on the liquid crystal monitor 16 while reversing, the driver can grasp how sharply the host vehicle 2 is turning.
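As an illustration only, the triangle S can be described by three vertices in a top-view coordinate frame: the base of the guide line, the end of the straight-ahead line LL, and the end of the steered line LL′. The coordinate convention, sign convention, and function name below are hypothetical.

```python
import math

def steering_triangle(base, length, angle_deg):
    """Vertices of the triangle spanned by the straight-ahead guide line
    and the guide line rotated by the detected steering angle."""
    x0, y0 = base
    straight_end = (x0, y0 + length)            # end of line LL
    a = math.radians(angle_deg)
    steered_end = (x0 + length * math.sin(a),   # end of line LL'
                   y0 + length * math.cos(a))
    return [(x0, y0), straight_end, steered_end]
```

At a steering angle of zero the triangle degenerates to the line LL itself, matching the straight-reversing display of FIG. 3.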
  • Next, the detection of the steering angle and the vehicle speed will be described. The detection unit 11b performs this detection using two consecutive frames captured by the rear camera 14: it compares the image data based on the latest imaging signal output by the rear camera 14 with the immediately preceding image data.
  • FIG. 5 is a diagram for explaining the detection processing of the steering angle and the vehicle speed by the detection unit 11b.
  • A frame P1, shown in FIG. 5A, is obtained from the imaging signal of the rear camera 14 at a time t1.
  • The detection unit 11b extracts image data in a predetermined range from the frame P1 and stores it in the RAM 13.
  • The extracted data is referred to below as the partial image data.
  • Here, image data in a certain range (range 51) near the center of the frame P1 is stored as the partial image data.
  • When the next frame P2 is obtained, the detection unit 11b executes a block matching process between the frame P2 and the partial image data stored in the RAM 13. As shown in FIG. 5B, the search range of the block matching process is a range 52 larger than the range 51 of FIG. 5A.
  • the block matching process is performed according to the following procedure.
  • First, the range 53, which has the same size as the range 51 and starts at the upper left corner of the search range 52, is compared with the partial image data stored in the RAM 13 pixel by pixel: the difference in luminance value is calculated for each pixel, and the cumulative sum of the absolute differences is obtained.
  • The range 53 is then shifted one pixel to the right and the cumulative sum is computed again. In this way, the entire search range 52 is compared with the partial image data.
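The two steps above are a plain SAD (sum of absolute differences) search. A minimal sketch in pure Python, with luminance values held in nested lists; frame contents and sizes are illustrative:

```python
def sad(block_a, block_b):
    """Cumulative sum of absolute luminance differences between two
    equal-size blocks, computed pixel by pixel."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def block_match(template, search, bh, bw):
    """Slide a bh x bw window over the search range one pixel at a time,
    left to right and top to bottom, and return (min_sad, x, y)."""
    best = None
    for y in range(len(search) - bh + 1):
        for x in range(len(search[0]) - bw + 1):
            window = [row[x:x + bw] for row in search[y:y + bh]]
            score = sad(template, window)
            if best is None or score < best[0]:
                best = (score, x, y)
    return best
```

The returned minimum cumulative sum is what is later compared against the threshold; x and y give the offset of the best match within the search range.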
  • Next, the detection unit 11b compares the minimum cumulative sum with a predetermined threshold. If the minimum is at or below the threshold, the matching is considered successful, and the steering angle and vehicle speed are obtained as shown in FIG. 5C, where the position of the successful match is the range 53′ and the position of the partial image data is the range 51.
  • The detection unit 11b calculates the vertical and horizontal shift between the range 53′ and the range 51 on the screen; the vertical shift is denoted dy and the horizontal shift dx.
  • The detection unit 11b converts the on-screen shifts dx and dy into real-space distances and calculates the actual steering angle and vehicle speed. Since the interval between the time t1 at which frame P1 was acquired and the time t2 at which frame P2 was acquired is known, the detection unit can, taking the mounting position of the rear camera 14 into account, calculate the steering angle from dx and the vehicle speed from dy. The larger the steering angle, the larger dx; the higher the vehicle speed, the larger dy.
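A minimal sketch of this conversion, assuming a flat road and a single metres-per-pixel scale factor in place of the full camera calibration the text refers to. The atan2 formula is an illustrative simplification, not the patent's exact computation; only the stated monotonic relations (larger dx, larger angle; larger dy, higher speed) are taken from the text.

```python
import math

def motion_from_shift(dx_px, dy_px, metres_per_px, dt):
    """Convert the pixel shift found by block matching into an
    approximate speed and steering-related angle. dt is the frame
    interval t2 - t1 (1/30 s at 30 frames per second)."""
    dx_m = dx_px * metres_per_px
    dy_m = dy_px * metres_per_px
    speed = dy_m / dt                             # larger dy -> higher speed
    angle = math.degrees(math.atan2(dx_m, dy_m))  # larger dx -> larger angle
    return speed, angle
```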
  • If the minimum cumulative sum exceeds the threshold, the detection unit 11b considers the matching to have failed.
  • Matching fails, for example, when the host vehicle 2 is moving so fast that the imaging range of the rear camera 14 shifts by more than the search range 52 between times t1 and t2. In such a case, the image display unit 11c stops the guide display on the liquid crystal monitor 16.
  • the image display unit 11c performs the display as described above based on the steering angle detected by the detection unit 11b as described above.
  • However, the image display unit 11c does not perform this display when the host vehicle 2 is stationary, that is, when the vehicle speed detected by the detection unit 11b is at or below a predetermined threshold. When the vehicle is stationary, the imaging signal from the rear camera 14 does not change, so the detection unit 11b cannot detect the steering angle. At the same time, if the display were toggled at every brief stop, it would switch distractingly during stop-and-go maneuvers. The image display unit 11c therefore suppresses the display only when the host vehicle 2 has been continuously stationary for a predetermined time (for example, 5 seconds).
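The debouncing just described amounts to a small state machine: keep the overlay while moving, and hide it only once the vehicle has been continuously stationary for the hold time. A sketch, with class and threshold names chosen for illustration:

```python
class GuideDisplayGate:
    """Decide whether the guide overlay should be shown, suppressing it
    only after hold_s seconds of continuous standstill (5 s in the text)."""

    def __init__(self, hold_s=5.0, speed_threshold=0.05):
        self.hold_s = hold_s
        self.speed_threshold = speed_threshold
        self.stationary_since = None

    def update(self, speed, now):
        if speed > self.speed_threshold:
            self.stationary_since = None  # moving: always display
            return True
        if self.stationary_since is None:
            self.stationary_since = now   # standstill just began
        # keep displaying until the standstill has lasted hold_s
        return (now - self.stationary_since) < self.hold_s
```

Brief stops shorter than the hold time never hide the overlay, so the display does not flicker during repeated creep-and-stop reversing.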
  • FIG. 6 is a flowchart of detection processing by the detection unit 11b.
  • This process is realized in software by the control circuit 11 executing a control program (image processing program) stored in the ROM 12.
  • First, the control circuit 11 creates image data based on the imaging signal from the rear camera 14 and stores it in the RAM 13 (step S10).
  • Next, the video display unit 11a displays video based on the image data stored in the RAM 13 on the liquid crystal monitor 16.
  • In step S30, the detection unit 11b executes the block matching process between the partial image data stored in the RAM 13 and the image data stored in step S10. This yields the minimum cumulative sum described above.
  • In step S40, it is determined whether the cumulative sum calculated in step S30 is at or below a predetermined threshold. If it exceeds the threshold, the process proceeds to step S80; otherwise it proceeds to step S50.
  • In step S50, the detection unit 11b detects the steering angle and vehicle speed of the host vehicle 2 based on the result of the block matching in step S30.
  • In step S60, it is determined whether a predetermined time has passed with the host vehicle 2 stationary. If so, the process proceeds to step S80. If the host vehicle is currently moving, or has been stationary for less than the predetermined time, the process proceeds to step S70.
  • In step S70, the image display unit 11c displays on the liquid crystal monitor 16 the two straight lines representing the vehicle width of the host vehicle 2 and the triangle representing its steering angle.
  • In step S80, partial image data for the next cycle is extracted from a predetermined range of the latest frame P2, as in the range 51 shown in FIG. 5A.
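The control flow of steps S40 to S80 can be condensed into a few lines. The return strings and parameter names are illustrative, and the actual detection work of step S50 is elided:

```python
def detection_cycle(min_sad, sad_threshold, stationary_elapsed, hold_s):
    """One pass through the branch logic of FIG. 6.
    stationary_elapsed is None while the vehicle is moving, otherwise
    the number of seconds it has been stationary."""
    if min_sad > sad_threshold:
        return "S80: matching failed, refresh partial image"   # S40 -> S80
    # S50 would detect the steering angle and vehicle speed here
    if stationary_elapsed is not None and stationary_elapsed >= hold_s:
        return "S80: stationary, guides hidden"                # S60 -> S80
    return "S70: draw vehicle-width lines and steering triangle"
```

Either branch ends at S80, where the partial image data is re-extracted for the next frame pair.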
  • As described above, the image display unit 11c displays, superimposed on the video shown by the video display unit 11a, a straight line representing the vehicle width of the host vehicle and a triangle representing the magnitude of the steering angle. The movement trajectory of the host vehicle 2, taking the steering angle into account, and the magnitude of that angle can thus easily be grasped from the liquid crystal monitor 16.
  • The detection unit 11b detects the steering angle and vehicle speed of the host vehicle 2 using only the output of the rear camera 14. No signals from the host vehicle 2 and no additional sensors are needed, which reduces the cost of the in-vehicle device 1.
  • The image display unit 11c displays the straight lines representing the vehicle width in the complementary color of the video displayed on the liquid crystal monitor 16 by the video display unit 11a.
  • The in-vehicle device of the present embodiment has the same configuration as the in-vehicle device of the first embodiment. Members, circuits, and functional units common to the first embodiment carry the same reference numerals, and their description is omitted.
  • In the present embodiment, the image display unit 11c of the in-vehicle device 1 adds, to the straight lines representing the vehicle width of the host vehicle 2, a display representing the ground height of the vehicle's bumper. The ground height of the bumper, that is, the height from the road surface to the upper end of the bumper, is given to the in-vehicle device 1 in advance by a calibration process or the like.
  • FIG. 7 is a diagram illustrating an example of a screen display of the in-vehicle device 1 according to the second embodiment.
  • On the screen SC3, two straight lines LL″ and LR″ representing the vehicle width of the host vehicle 2 are displayed.
  • In addition, surfaces WL and WR representing the ground height of the bumper are displayed. These surfaces are rendered three-dimensionally so as to rise from the road surface to the height of the bumper.
  • The image display unit 11c thus displays on the liquid crystal monitor 16 the straight lines representing the vehicle width of the host vehicle 2, the triangle representing the magnitude of the steering angle, and the surfaces representing the ground height of the bumper. This makes it possible to grasp not only the vehicle width and steering angle but also the height relationship between the bumper and objects behind the vehicle.
  • The detection unit 11b may detect the steering angle and vehicle speed of the host vehicle 2 by methods other than block matching.
  • For example, feature points may be extracted from the image data by a known technique, and the steering angle and vehicle speed may be detected from the amount of movement of the feature points between frames.
  • The detection unit 11b may also detect the steering angle and vehicle speed from three or more consecutive frames.
  • The image display unit 11c may determine the colors of the straight lines representing the vehicle width of the host vehicle 2 and of the triangle representing the magnitude of the steering angle based on factors other than the background color. For example, when the in-vehicle device 1 is a navigation device, the colors may be chosen from the current location and map data; they may also be chosen from the state of the host vehicle 2, the output of an illuminance sensor, or the time of day. Decorations other than color, such as line thickness or the triangle's fill pattern, may likewise be determined from these factors.
  • The driver may also be notified by voice or the like based on the detection result of the detection unit 11b or on the content displayed by the image display unit 11c; for example, movement to the left or right, turning, and forward or backward motion may be announced.
  • The range 51 may be divided into a plurality of regions, and the block matching process may be performed for each region.
  • For example, the range 51 is divided equally into 4 regions horizontally and 3 vertically, forming 12 small regions.
  • For each small region, block matching is performed using as the search range an area obtained by enlarging that region in all directions.
  • The steering angle and vehicle speed of the host vehicle 2 may then be detected from the twelve resulting values of dx and dy.
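The subdivision can be sketched as follows. How the twelve resulting (dx, dy) pairs are combined is not specified in the text; taking a median or similar robust aggregate is one plausible choice, and is left to the caller here.

```python
def split_into_regions(frame, rows=3, cols=4):
    """Split an extracted range into rows x cols equal sub-blocks
    (3 x 4 = 12 in the text). frame is a list of equal-length pixel
    rows whose dimensions are divisible by rows and cols."""
    rh = len(frame) // rows
    rw = len(frame[0]) // cols
    return [[row[c * rw:(c + 1) * rw] for row in frame[r * rh:(r + 1) * rh]]
            for r in range(rows) for c in range(cols)]
```

Each sub-block would then be matched against its own enlarged search range, yielding one (dx, dy) pair per region.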
  • The parts of the in-vehicle device 1 other than the rear camera 14, that is, the control circuit 11, ROM 12, RAM 13, input device 15, liquid crystal monitor 16, and speaker 17, may be replaced by a personal computer.
  • In that case, the control program described above can be supplied via a recording medium such as a CD-ROM, USB flash memory, or memory card, or as a data signal over the Internet or the like.
  • FIG. 8 shows this situation.
  • The personal computer 100 receives the program via the CD-ROM 104.
  • The personal computer 100 can also connect to the communication line 101.
  • The computer 102 is a server computer that provides the program and stores it on a recording medium such as the hard disk 103.
  • The communication line 101 is the Internet, a personal-computer network, a dedicated communication line, or the like.
  • The computer 102 reads the program from the hard disk 103 and transmits it to the personal computer 100 via the communication line 101, that is, as a data signal carried on a carrier wave.
  • The program can thus be supplied as a computer-readable computer program product in various forms, such as a recording medium or a data signal (carrier wave). If the in-vehicle device 1 of the above embodiment has a similar configuration, its control program can be supplied in the same way.
  • The present invention is not limited to the above-described embodiments; other forms conceivable within the scope of the technical idea of the invention are also included in its scope.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

An in-vehicle device is provided with an imaging unit, a video display unit, a steering angle detection unit, and an image display unit. The imaging unit captures an image of the surroundings of a host vehicle and outputs an imaging signal. The video display unit displays, on a display device, video based on the imaging signal. The steering angle detection unit detects a steering angle of the host vehicle based on the imaging signal. Based on the steering angle, the image display unit displays on the display device an image indicating the width of the host vehicle and its steering angle, the image being superimposed on the video.
PCT/JP2010/062927 2009-07-30 2010-07-30 In-vehicle device and image processing program WO2011013813A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011524862A JPWO2011013813A1 (ja) 2009-07-30 2010-07-30 In-vehicle device and image processing program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009178123 2009-07-30
JP2009-178123 2009-07-30

Publications (1)

Publication Number Publication Date
WO2011013813A1 (fr) 2011-02-03

Family

ID=43529464

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/062927 WO2011013813A1 (fr) 2009-07-30 2010-07-30 In-vehicle device and image processing program

Country Status (2)

Country Link
JP (1) JPWO2011013813A1 (fr)
WO (1) WO2011013813A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013088995A1 (fr) * 2011-12-13 2013-06-20 住友建機株式会社 Peripheral image display device and peripheral image display method for construction machine
WO2013157250A1 (fr) * 2012-04-20 2013-10-24 京セラ株式会社 Image processing device and driving assistance method
JP2014146149A (ja) * 2013-01-29 2014-08-14 Kyocera Corp Image processing device and driving support method
JP2016065449A (ja) * 2015-12-01 2016-04-28 住友建機株式会社 Excavator
EP3358840A4 (fr) * 2015-09-30 2018-09-12 Aisin Seiki Kabushiki Kaisha Image processing device for vehicles
JP2019206912A (ja) * 2019-08-19 2019-12-05 住友建機株式会社 Excavator and excavator display device
US20210248756A1 (en) * 2018-05-10 2021-08-12 Sony Corporation Image processing apparatus, vehicle-mounted apparatus, image processing method, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000272445A (ja) * 1999-01-19 2000-10-03 Toyota Autom Loom Works Ltd Steering assistance device for reversing a vehicle
JP2004147083A (ja) * 2002-10-24 2004-05-20 Matsushita Electric Ind Co Ltd Driving assistance device
JP2005329870A (ja) * 2004-05-21 2005-12-02 Clarion Co Ltd Auxiliary information presentation device
JP2008137425A (ja) * 2006-11-30 2008-06-19 Aisin Aw Co Ltd Parking assistance method and parking assistance device
JP2009017462A (ja) * 2007-07-09 2009-01-22 Sanyo Electric Co Ltd Driving assistance system and vehicle


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013088995A1 (fr) * 2011-12-13 2013-06-20 住友建機株式会社 Peripheral image display device and peripheral image display method for construction machine
JP2013124467A (ja) * 2011-12-13 2013-06-24 Sumitomo (Shi) Construction Machinery Co Ltd Surrounding image display device and surrounding image display method for construction machine
US20140267731A1 (en) * 2011-12-13 2014-09-18 Sumitomo(S.H.I.) Construction Machinery Co., Ltd. Peripheral image display device and method of displaying peripheral image for construction machine
US10017113B2 (en) 2011-12-13 2018-07-10 Sumitomo(S.H.I.) Construction Machinery Co., Ltd. Peripheral image display device and method of displaying peripheral image for construction machine
US10293751B2 (en) 2011-12-13 2019-05-21 Sumitomo (S.H.I.) Construction Machinery Co., Ltd. Peripheral image display device and method of displaying peripheral image for construction machine
WO2013157250A1 (fr) * 2012-04-20 2013-10-24 京セラ株式会社 Image processing device and driving assistance method
US9789818B2 (en) 2012-04-20 2017-10-17 Kyocera Corporation Image processing apparatus, imaging apparatus and drive assisting method
JP2014146149A (ja) * 2013-01-29 2014-08-14 Kyocera Corp Image processing device and driving support method
EP3358840A4 (fr) * 2015-09-30 2018-09-12 Aisin Seiki Kabushiki Kaisha Image processing device for vehicles
JP2016065449A (ja) * 2015-12-01 2016-04-28 住友建機株式会社 Excavator
US20210248756A1 (en) * 2018-05-10 2021-08-12 Sony Corporation Image processing apparatus, vehicle-mounted apparatus, image processing method, and program
JP2019206912A (ja) * 2019-08-19 2019-12-05 住友建機株式会社 Excavator and excavator display device

Also Published As

Publication number Publication date
JPWO2011013813A1 (ja) 2013-01-10

Similar Documents

Publication Publication Date Title
JP5454934B2 (ja) Driving assistance device
JP4639753B2 (ja) Driving assistance device
JP5143235B2 (ja) Control device and vehicle surroundings monitoring device
JP5321267B2 (ja) Vehicle image display device and bird's-eye-view image display method
EP1916846B1 (fr) Device and method for monitoring the environment of a vehicle
JP5068779B2 (ja) Vehicle surroundings bird's-eye-view image display device and method
JP4832321B2 (ja) Camera attitude estimation device, vehicle, and camera attitude estimation method
WO2011013813A1 (fr) In-vehicle device and image processing program
WO2016132849A1 (fr) Driving assistance device and driving assistance method
JP6379779B2 (ja) Vehicle display device
JP4687411B2 (ja) Vehicle periphery image processing device and program
WO2019008764A1 (fr) Parking assistance method and device
JP2005311868A (ja) Vehicle periphery visual recognition device
JP2010184607A (ja) Vehicle periphery display device
JP5471141B2 (ja) Parking assistance device and parking assistance method
JP5549235B2 (ja) Driving assistance device
JP5636493B2 (ja) Image processing device and image display device
JP2010089716A (ja) Parking assistance device and parking assistance method
JP2007124097A (ja) Vehicle periphery visual recognition device
JP2006171950A (ja) Head-up display control device and program
JP2010183234A (ja) Drawing device
JP5273068B2 (ja) Vehicle periphery monitoring device
JPH0981757A (ja) Vehicle position detection device
JP2011211432A (ja) In-vehicle imaging device
JP4677820B2 (ja) Predicted course display device and predicted course display method

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10804559

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011524862

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the EP bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08/05/2012)

122 Ep: pct application non-entry in european phase

Ref document number: 10804559

Country of ref document: EP

Kind code of ref document: A1