WO2011013813A1 - In-vehicle device and image processing program - Google Patents

In-vehicle device and image processing program Download PDF

Info

Publication number
WO2011013813A1
WO2011013813A1 · PCT/JP2010/062927 · JP2010062927W · JPWO2011013813A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
steering angle
image
host vehicle
imaging signal
Prior art date
Application number
PCT/JP2010/062927
Other languages
French (fr)
Japanese (ja)
Inventor
秀一 小原
Original Assignee
クラリオン株式会社 (Clarion Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by クラリオン株式会社 (Clarion Co., Ltd.)
Priority to JP2011524862A (published as JPWO2011013813A1)
Publication of WO2011013813A1


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/304 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
    • B60R2300/305 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with lines or icons
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/806 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking

Definitions

  • The present invention relates to an in-vehicle device and an image processing program.
  • Patent Literature 1 describes a driving assistance display device that superimposes, on the captured image of a wide-angle camera, a guideline consisting of a guide line indicating the vehicle width and a guide line indicating the distance along the road surface from the rear end of the vehicle.
  • In addition to a vehicle width display as in Patent Literature 1, there are also in-vehicle devices that display a guide indicating the magnitude of the steering angle of the host vehicle acquired by a steering angle sensor.
  • The in-vehicle device includes an imaging unit that images the periphery of the host vehicle and outputs an imaging signal, a video display unit that displays video based on the imaging signal on a display device, a steering angle detection unit that detects the steering angle of the host vehicle based on the imaging signal, and an image display unit that displays, based on the steering angle, an image representing the vehicle width and steering angle of the host vehicle on the display device.
  • The steering angle detection unit preferably detects the steering angle based on the horizontal shift between the latest image acquired from the imaging signal and the image immediately preceding it.
  • The image display unit preferably displays, as the image, a straight line representing the vehicle width of the host vehicle and a polygon representing the steering angle.
  • The image display unit preferably displays the straight line in a color that differs from the color of the region forming the line's background in the video by at least a predetermined amount in at least one of hue, saturation, and brightness.
  • The image display unit preferably displays on the display device, superimposed on the video, an image representing the ground height of the bumper of the host vehicle together with the image representing the vehicle width and steering angle of the host vehicle.
  • The in-vehicle device according to any one of the first to fifth aspects preferably further includes a speed detection unit that detects the speed of the host vehicle based on the imaging signal.
  • When the speed exceeds a predetermined speed, the image display unit preferably does not display the image representing the vehicle width and steering angle of the host vehicle on the display device.
  • The image processing program causes a computer to execute an imaging step in which an imaging signal obtained by imaging the periphery of the host vehicle is input, a video display step of displaying video based on the imaging signal on a display device, a steering angle detection step of detecting the steering angle of the host vehicle based on the imaging signal, and an image display step of displaying, based on the steering angle, an image representing the vehicle width and steering angle of the host vehicle superimposed on the video on the display device.
  • The vehicle width of the host vehicle and the magnitude of its steering angle can thus be easily grasped from the screen.
  • FIG. 1 is a block diagram showing the overall configuration of the in-vehicle device in the present embodiment.
  • the in-vehicle device 1 is mounted on the vehicle, but is not connected to the vehicle except for a power supply line for receiving driving power. Accordingly, various signals output from the vehicle, such as vehicle speed pulses, cannot be received. Further, for example, a network connection such as a CAN (Controller Area Network) is not provided.
  • In the following description, a vehicle on which the in-vehicle device 1 is mounted is referred to as the host vehicle.
  • the in-vehicle device 1 includes a control circuit 11 that controls each unit.
  • the control circuit 11 is connected to a ROM 12, a RAM 13, a rear camera 14, an input device 15, a liquid crystal monitor 16, and a speaker 17.
  • the control circuit 11 is composed of a microprocessor and its peripheral circuits.
  • the control circuit 11 is a computer that executes a predetermined control program stored in the ROM 12 using the RAM 13 as a work area and performs various controls.
  • the rear camera 14 is a camera installed behind the host vehicle. The rear camera 14 images the rear lower part of the host vehicle at a speed of, for example, 30 frames per second, and outputs an imaging signal such as NTSC to the control circuit 11.
  • the control circuit 11 converts this image pickup signal into image data by a known technique.
  • the input device 15 is an input device such as a touch panel or a remote controller.
  • the liquid crystal monitor 16 displays an image or the like on the screen based on the image data output from the control circuit 11.
  • the speaker 17 emits a sound to the passenger of the own vehicle based on the audio signal output from the control circuit 11.
  • the control circuit 11 includes a video display unit 11a, a detection unit 11b, and an image display unit 11c. Each of these functional units is realized by software when the control circuit 11 executes a control program stored in the ROM 12.
  • The video display unit 11a displays the video captured by the rear camera 14 on the liquid crystal monitor 16. As described above, since the rear camera 14 captures images at 30 frames per second, the video display unit 11a also updates the liquid crystal monitor 16 at 30 frames per second. Specifically, by outputting image data based on the imaging signal from the rear camera 14 to the liquid crystal monitor 16, the video captured by the rear camera 14 is displayed on the liquid crystal monitor 16.
  • the detection unit 11b analyzes the image data based on the imaging signal from the rear camera 14 and detects the current steering angle and vehicle speed of the host vehicle. Specific contents of the detection processing by the detection unit 11b will be described in detail later.
  • Based on the steering angle of the host vehicle detected by the detection unit 11b, the image display unit 11c displays on the liquid crystal monitor 16 a straight line representing the vehicle width of the host vehicle and a polygon representing the steering angle of the host vehicle. These straight lines and polygons are displayed superimposed on the video displayed by the video display unit 11a.
  • FIG. 2 is a diagram showing an installation location and an imaging range of the rear camera 14.
  • FIG. 2A is a top view of the host vehicle 2, and FIG. 2B is a side view of the host vehicle 2.
  • the rear camera 14 is installed behind the host vehicle 2 and images the range indicated by the symbol A in the drawing, that is, the lower rear part of the host vehicle 2.
  • the location where the rear camera 14 is installed may be any location as long as the periphery of the host vehicle 2 can be imaged. Information such as the vehicle width, the location of the rear camera 14 and the angle is given to the in-vehicle device 1 in advance by a known calibration method or the like.
  • the display screen of the liquid crystal monitor 16 will be described.
  • An image captured by the rear camera 14 is displayed by the video display unit 11a, and straight lines and polygons are displayed over it by the image display unit 11c.
  • Next, the display by the image display unit 11c will be described.
  • FIG. 3 is a diagram showing a display example of the liquid crystal monitor 16 when the host vehicle 2 is traveling straight backward.
  • FIG. 3A shows a display screen SC1 of the liquid crystal monitor 16 when the host vehicle 2 goes straight backward.
  • the background of the screen SC1 is an image captured by the rear camera 14.
  • the image display unit 11c displays two straight lines LL and LR that represent the vehicle width of the host vehicle 2 so as to overlap the video.
  • FIG. 3B is a diagram in which the straight lines LL and LR displayed on the screen SC1 are projected onto the top view of the host vehicle 2. There are no straight lines LL, LR on the actual road surface. As shown in FIG. 3B, the straight lines LL and LR displayed on the screen SC1 indicate the locations that the left and right ends of the vehicle body will reach when the host vehicle 2 moves backward at the current steering angle. By confirming the positions of the straight lines LL and LR on the display screen of the liquid crystal monitor 16, the driver can grasp the positional relationship between obstacles on the road and the host vehicle 2.
  • The video display unit 11a displays the video captured by the rear camera 14 with left and right reversed. That is, the direction indicated by reference sign D1 in FIG. 3A is the right direction as seen by the driver of the host vehicle 2. This is done so that the left and right directions on the liquid crystal monitor 16 match those of the rearview mirror of the host vehicle 2.
  • The image display unit 11c determines the colors of the straight lines LL and LR according to the color of the video that forms the background of the screen. For example, when a concrete road surface is shown on the liquid crystal monitor 16, visibility would be poor if the straight lines LL and LR were drawn in gray. Therefore, the image display unit 11c draws the straight lines LL and LR in a color complementary to the background color. Note that instead of complementary colors, the straight lines LL and LR may be drawn in colors whose hue differs from the background by a predetermined amount or more. Further, instead of the hue, the straight lines LL and LR may be drawn in colors that differ by a predetermined amount or more in saturation or brightness.
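One way to realize this color selection can be sketched as follows. The patent does not specify an algorithm, so the function below is illustrative: it assumes the background is sampled as an RGB triple and uses Python's standard `colorsys` module.

```python
import colorsys

def contrasting_color(bg_rgb):
    """Pick a line color that stands out against a background sample.

    bg_rgb: average background color as (r, g, b), each 0-255.
    Rotating the hue by 0.5 (half the color wheel) yields the
    complementary color; saturation and lightness are also pushed away
    from the background so the line stays visible on low-saturation
    surfaces such as gray concrete.
    """
    r, g, b = (c / 255.0 for c in bg_rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    h2 = (h + 0.5) % 1.0            # complementary hue
    s2 = max(s, 0.8)                # ensure a vivid line color
    l2 = 0.25 if l > 0.5 else 0.75  # dark line on light background, and vice versa
    r2, g2, b2 = colorsys.hls_to_rgb(h2, l2, s2)
    return tuple(int(round(c * 255)) for c in (r2, g2, b2))
```

For a gray concrete background the hue carries no information, so the saturation and lightness adjustments do most of the work.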
  • FIG. 4 is a diagram showing a display example of the liquid crystal monitor 16 when the host vehicle 2 travels while turning the steering wheel backward.
  • FIG. 4A shows a display screen SC2 of the liquid crystal monitor 16, and FIG. 4B shows an overhead view of the host vehicle 2.
  • When the steering angle of the host vehicle 2 is, for example, the angle A1 shown in FIG. 4B, the left and right ends of the host vehicle 2 no longer follow the straight lines LL and LR parallel to the host vehicle 2 shown in FIG. 3. Instead, the left and right ends of the host vehicle 2 follow lines positioned at an angle corresponding to the steering angle with respect to the straight lines LL and LR, such as the straight lines LL′ and LR′ shown in FIG. 4A.
  • Therefore, the image display unit 11c displays the straight lines LL′ and LR′ at an angle corresponding to the steering angle.
  • The image display unit 11c further draws a triangle corresponding to the steering angle on whichever of the two straight lines LL′ and LR′ moves toward the inside of the host vehicle 2's turn. Specifically, as shown in FIG. 4A, the triangle S is drawn with a line pattern between the position where the straight line LL would have been drawn and the straight line LL′. By viewing the triangle S displayed on the liquid crystal monitor 16 while reversing, the driver can grasp how sharply the host vehicle 2 is turning.
  • The detection unit 11b performs this detection process based on two consecutive frames captured by the rear camera 14. Specifically, it detects the steering angle and vehicle speed of the host vehicle 2 by comparing the image data based on the latest imaging signal output by the rear camera 14 with the image data immediately preceding it.
  • FIG. 5 is a diagram for explaining the detection processing of the steering angle and the vehicle speed by the detection unit 11b.
  • a frame P1 shown in FIG. 5A is obtained by an imaging signal from the rear camera 14 at a certain time t1.
  • the detection unit 11b extracts a predetermined range of image data from the frame P1 and stores it in the RAM 13.
  • The extracted image data is hereinafter referred to as partial image data.
  • a certain range of image data near the center of the frame P1 is stored as partial image data.
  • When the next frame P2 is obtained, the detection unit 11b executes a block matching process between the frame P2 and the partial image data stored in the RAM 13. As shown in FIG. 5B, the search range of the block matching process is a range 52 that is larger than the range 51 shown in FIG. 5A.
  • the block matching process is performed according to the following procedure.
  • First, the range 53, which has the same size as the range 51 and is located at the upper left corner of the search range 52, is compared with the partial image data stored in the RAM 13 pixel by pixel. Specifically, the difference in luminance value is calculated for each pixel, and the cumulative sum of the absolute values of the differences is obtained.
  • Next, the range 53 is shifted to the right by one pixel, and the cumulative value is obtained again. In this way, the entire search range 52 is compared with the partial image data.
  • The detection unit 11b then compares the minimum cumulative value with a predetermined threshold value. When the minimum cumulative value is equal to or less than the predetermined threshold value, the detection unit 11b considers the matching process to have succeeded, and obtains the steering angle and vehicle speed by the method shown in FIG. 5C. In FIG. 5C, the position where the matching succeeded is the range 53′, and the position of the partial image data is the range 51.
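The exhaustive search described above, followed by the threshold test, can be sketched as follows. The function name, use of NumPy, and array shapes are illustrative assumptions; the SAD (sum of absolute luminance differences) criterion and the success threshold follow the text.

```python
import numpy as np

def block_match(template, search, threshold):
    """Exhaustive block matching by sum of absolute luminance differences.

    template: the stored partial image data (range 51), a 2-D array.
    search:   the larger search range (range 52), a 2-D array.
    Returns (row, col, sad) of the best match, or None when the minimum
    cumulative value exceeds `threshold` (matching considered failed).
    """
    tpl = template.astype(np.int64)
    th, tw = tpl.shape
    sh, sw = search.shape
    best = (None, None, None)
    # Slide a window the size of range 51 one pixel at a time over the
    # whole search range, accumulating absolute luminance differences.
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            sad = int(np.abs(search[y:y+th, x:x+tw].astype(np.int64) - tpl).sum())
            if best[2] is None or sad < best[2]:
                best = (y, x, sad)
    return best if best[2] is not None and best[2] <= threshold else None
```

A production implementation would typically restrict the stride or use an early-exit accumulation, but the brute-force form above matches the procedure as described.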
  • The detection unit 11b calculates the vertical and horizontal shifts on the screen between the range 53′ and the range 51. The vertical shift amount is denoted dy, and the horizontal shift amount is denoted dx.
  • The detection unit 11b converts the on-screen shift amounts dx and dy into real-space distances and calculates the actual steering angle and vehicle speed. Specifically, since the interval between the time t1 at which the frame P1 was acquired and the time t2 at which the frame P2 was acquired is known, the steering angle can be calculated from dx and the vehicle speed from dy by a calculation that takes the position information of the rear camera 14 into account. The larger the steering angle, the larger dx; the higher the vehicle speed, the larger dy.
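A simplified version of this conversion can be sketched as follows. The patent does not give the exact camera model, so the constant `meters_per_pixel` ground-plane scale and the angle formula below are illustrative assumptions standing in for the calibration-based calculation described in the text.

```python
import math

def shift_to_motion(dx_px, dy_px, meters_per_pixel, frame_interval_s):
    """Convert an on-screen shift (dx, dy) between two frames into an
    approximate steering angle and vehicle speed.

    meters_per_pixel is a ground-plane scale assumed constant here; a
    real system would derive it from the rear camera's calibrated
    position and angle. Returns (steering_angle_deg, speed_m_per_s).
    """
    dx_m = dx_px * meters_per_pixel
    dy_m = dy_px * meters_per_pixel
    if dx_m == 0.0 and dy_m == 0.0:
        return 0.0, 0.0  # no motion detected between the frames
    # Speed: real-space displacement divided by the known frame interval.
    speed = math.hypot(dx_m, dy_m) / frame_interval_s
    # The larger dx is relative to dy, the sharper the turn
    # (dx grows with steering angle, dy with speed, as in the text).
    angle = math.degrees(math.atan2(dx_m, dy_m))
    return angle, speed
```

With a 30 frames-per-second camera, `frame_interval_s` would be 1/30 when consecutive frames are compared.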
  • When the minimum cumulative value exceeds the predetermined threshold value, the detection unit 11b considers the matching process to have failed.
  • The matching process fails, for example, when the host vehicle 2 is moving so fast that the imaging range of the rear camera 14 changes by more than the search range 52 between time t1 and time t2. In such a case, the image display unit 11c stops the display on the liquid crystal monitor 16.
  • The image display unit 11c performs the display described above based on the steering angle detected by the detection unit 11b.
  • The image display unit 11c does not perform the display described above when the host vehicle 2 is stationary, that is, when the vehicle speed detected by the detection unit 11b is equal to or less than a predetermined threshold value. This is because, when the host vehicle 2 is stationary, the imaging signal from the rear camera 14 does not change and the detection unit 11b cannot detect the steering angle. However, if the display were switched every time the vehicle stopped, driving that frequently alternates between moving and stopping would make the display flicker annoyingly. Therefore, the image display unit 11c suppresses the display only when the host vehicle 2 has been continuously stationary for a predetermined time (for example, 5 seconds).
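The hold time described above can be sketched as a small state holder. The class and parameter names are illustrative, and the clock value is assumed to be supplied by the caller.

```python
class GuideDisplayGate:
    """Suppress the guide overlay only after the vehicle has been
    stationary for `hold_s` continuous seconds (5 s in the embodiment
    described), so that frequent stop-and-go driving does not make the
    display flicker on and off."""

    def __init__(self, speed_threshold=0.1, hold_s=5.0):
        self.speed_threshold = speed_threshold  # at or below this = stationary
        self.hold_s = hold_s
        self.stationary_since = None            # time the current stop began

    def should_display(self, speed, now_s):
        if speed > self.speed_threshold:
            self.stationary_since = None        # moving: reset the timer
            return True
        if self.stationary_since is None:
            self.stationary_since = now_s       # just stopped: start timing
        # Keep displaying until the stop has lasted the full hold time.
        return (now_s - self.stationary_since) < self.hold_s
```

The gate is queried once per processed frame with the detected speed and the current time.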
  • FIG. 6 is a flowchart of detection processing by the detection unit 11b.
  • This process is realized in software by the control circuit 11 executing a control program (image processing program) stored in the ROM 12.
  • In step S10, the control circuit 11 creates image data based on the imaging signal from the rear camera 14 and stores it in the RAM 13.
  • Next, the video display unit 11a displays video based on the image data stored in the RAM 13 on the liquid crystal monitor 16.
  • In step S30, the detection unit 11b executes the block matching process between the partial image data stored in the RAM 13 and the image data stored in the RAM 13 in step S10. This process calculates the minimum cumulative value described above.
  • In step S40, it is determined whether or not the minimum cumulative value calculated in step S30 is equal to or less than the predetermined threshold value. If the cumulative value exceeds the threshold, the process proceeds to step S80. If it is equal to or less than the threshold, the process proceeds to step S50.
  • In step S50, the detection unit 11b detects the steering angle and vehicle speed of the host vehicle 2 based on the result of the block matching process in step S30.
  • In step S60, it is determined whether or not the predetermined time has elapsed with the host vehicle 2 stationary. If it has, the process proceeds to step S80. Otherwise, that is, if the host vehicle is not currently stationary or has been stationary for less than the predetermined time, the process proceeds to step S70.
  • In step S70, the image display unit 11c displays on the liquid crystal monitor 16 two straight lines representing the vehicle width of the host vehicle 2 and a triangle representing the steering angle of the host vehicle 2.
  • In step S80, partial image data is extracted from a predetermined range of the frame P2, which is the latest frame, like the range 51 shown in FIG. 5A.
  • As described above, the image display unit 11c displays on the liquid crystal monitor 16, superimposed on the video displayed by the video display unit 11a, a straight line representing the vehicle width of the host vehicle and a triangle representing the magnitude of the steering angle. This makes it easy to grasp, from the display screen of the liquid crystal monitor 16, both the movement trajectory of the host vehicle 2 according to its steering angle and the magnitude of that steering angle.
  • The detection unit 11b detects the steering angle and vehicle speed of the host vehicle 2 using only the output of the rear camera 14. This eliminates the need to obtain outputs from the host vehicle 2 or from various other sensors, reducing the cost of the in-vehicle device 1.
  • The image display unit 11c displays the straight lines representing the vehicle width of the host vehicle in a color complementary to the color of the video displayed on the liquid crystal monitor 16 by the video display unit 11a.
  • the in-vehicle device of the present embodiment has the same configuration as the in-vehicle device according to the first embodiment. Note that members, circuits, and functional units similar to those in the first embodiment are denoted by the same reference numerals, and description thereof is omitted.
  • In the second embodiment, the image display unit 11c of the in-vehicle device 1 adds, to the straight lines representing the vehicle width of the host vehicle 2, a display representing the ground height of the bumper of the host vehicle 2. The ground height of the bumper of the host vehicle 2, that is, the height from the road surface to the upper end of the bumper, is assumed to be given to the in-vehicle device 1 in advance by a calibration process or the like executed beforehand.
  • FIG. 7 is a diagram illustrating an example of a screen display of the in-vehicle device 1 according to the second embodiment.
  • On the screen SC3, two straight lines LL″ and LR″ representing the vehicle width of the host vehicle 2 are displayed.
  • In addition, surfaces WL and WR representing the ground height of the bumper are displayed. These surfaces WL and WR are displayed three-dimensionally so as to extend from the road surface up to the ground height of the bumper.
  • The image display unit 11c displays on the liquid crystal monitor 16 a straight line representing the vehicle width of the host vehicle 2, a triangle representing the magnitude of the steering angle, and surfaces representing the ground height of the bumper of the host vehicle. This makes it possible to grasp, from the display screen, not only the vehicle width and the magnitude of the steering angle but also the ground height of the bumper of the host vehicle 2.
  • the detection unit 11b may detect the steering angle and the vehicle speed of the host vehicle 2 by a method other than block matching.
  • For example, the steering angle and vehicle speed of the host vehicle 2 may be detected by extracting feature points from the image data using a known technique and calculating the amount by which the feature points move between frames.
  • the detection unit 11b may detect the steering angle and the vehicle speed of the host vehicle 2 from three or more consecutive frames.
  • The image display unit 11c may determine the colors of the straight lines representing the vehicle width of the host vehicle 2 and of the triangle representing the magnitude of the steering angle based on factors other than the background color. For example, when the in-vehicle device 1 is a navigation device, the colors may be determined based on the current location and map data. The colors may also be determined by the state of the host vehicle 2, the output of an illuminance sensor, or the time of day. Furthermore, decorations other than color, such as the thickness of the straight lines or the pattern of the triangle, may be determined based on these factors.
  • the driver may be notified by voice or the like based on the detection result by the detection unit 11b or the display content on the screen by the image display unit 11c. For example, movement to the left and right, turning to the left and right, forward and backward, etc. may be notified by voice or the like.
  • the range 51 may be divided into a plurality of areas, and the block matching process may be performed for each area.
  • For example, the range 51 is divided equally into 4 columns horizontally and 3 rows vertically to form 12 small regions.
  • For each small region, a block matching process is performed using as the search range an area obtained by enlarging that small region in all directions.
  • The steering angle and vehicle speed of the host vehicle 2 may then be detected using the twelve pairs of dx and dy thus obtained.
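Combining the twelve per-region shifts can be sketched as follows. The median aggregation shown here is an illustrative choice; the patent does not specify how the twelve dx and dy values are combined.

```python
import statistics

def combine_region_shifts(shifts):
    """Aggregate per-region (dx, dy) shifts into a single estimate.

    shifts: one (dx, dy) pair per small region, e.g. the twelve 4x3
    sub-blocks of range 51. Taking the median rather than the mean
    keeps a few bad matches (say, a region disturbed by a moving
    object) from skewing the overall steering-angle and speed estimate.
    """
    dxs = [dx for dx, _ in shifts]
    dys = [dy for _, dy in shifts]
    return statistics.median(dxs), statistics.median(dys)
```

The combined (dx, dy) pair would then be converted to a steering angle and vehicle speed exactly as in the single-block case.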
  • the parts excluding the rear camera 14 of the in-vehicle device 1 of the above embodiment, that is, the control circuit 11, the ROM 12, the RAM 13, the input device 15, the liquid crystal monitor 16, and the speaker 17 may be replaced with a personal computer.
  • The control program described above can be provided through a recording medium such as a CD-ROM, USB memory (flash memory), or memory card, or through a data signal such as one delivered over the Internet.
  • FIG. 8 shows this arrangement.
  • the personal computer 100 is provided with a program via the CD-ROM 104.
  • the personal computer 100 has a connection function with the communication line 101.
  • the computer 102 is a server computer that provides the program, and stores the program in a recording medium such as the hard disk 103.
  • the communication line 101 is a communication line such as the Internet or personal computer communication, or a dedicated communication line.
  • The computer 102 reads the program from the hard disk 103 and transmits it to the personal computer 100 via the communication line 101. That is, the program is transmitted as a data signal on a carrier wave through the communication line 101.
  • In this way, the program can be supplied as a computer-readable computer program product in various forms, such as a recording medium or a data signal (carrier wave). If the in-vehicle device 1 of the above embodiment adopts a similar configuration, its control program can be supplied in the same way.
  • The present invention is not limited to the above-described embodiments, and other forms conceivable within the scope of the technical idea of the present invention are also included within the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Disclosed is an in-vehicle device which is provided with an image pickup unit, a video display unit, a steering angle detection unit, and an image display unit. The image pickup unit takes an image of the vicinity of a vehicle itself and outputs an image pickup signal. The video display unit displays, on a display device, a video based on the image pickup signal. The steering angle detection unit detects a steering angle of the vehicle itself on the basis of the image pickup signal. On the basis of the steering angle, the image display unit displays an image that indicates the width of the vehicle itself and the steering angle thereof on the display device with the image superimposed on the video.

Description

車載装置および画像処理プログラムIn-vehicle device and image processing program
 本発明は、車載装置および画像処理プログラムに関する。 The present invention relates to an in-vehicle device and an image processing program.
 乗用車の運転を支援するため、自車両の車幅を表すガイドラインを表示する車載装置が知られている。例えば特許文献1には、広角カメラの撮像画像に、車幅を示すガイド線と車両後端からの路面上での距離を示すガイド線とからなるガイドラインを重畳表示する運転支援表示装置が記載されている。特許文献1のような車幅表示に加えて、舵角センサにより取得した自車両の操舵角の大きさを表すガイドを表示する車載装置も存在する。 2. Description of the Related Art An in-vehicle device that displays a guideline indicating a vehicle width of a host vehicle is known in order to support driving of a passenger car. For example, Patent Literature 1 describes a driving assistance display device that superimposes and displays a guide line including a guide line indicating a vehicle width and a guide line indicating a distance on the road surface from the rear end of a vehicle on a captured image of a wide-angle camera. ing. In addition to the vehicle width display as in Patent Document 1, there is also an in-vehicle device that displays a guide indicating the magnitude of the steering angle of the host vehicle acquired by the steering angle sensor.
日本国特開2003-104145号公報Japanese Unexamined Patent Publication No. 2003-104145
 自車両の移動支援のためのガイドが複数重なって表示されると画面が煩雑になり、視認性が悪くなるという問題があった。 When a plurality of guides for assisting the movement of the own vehicle are displayed, the screen becomes complicated and the visibility is deteriorated.
 According to a first aspect of the present invention, an in-vehicle device comprises: an imaging unit that images the surroundings of a host vehicle and outputs an imaging signal; a video display unit that displays video based on the imaging signal on a display device; a steering angle detection unit that detects a steering angle of the host vehicle based on the imaging signal; and an image display unit that, based on the steering angle, displays an image representing the vehicle width and the steering angle of the host vehicle on the display device, superimposed on the video.
 According to a second aspect of the present invention, in the in-vehicle device of the first aspect, the steering angle detection unit preferably detects the steering angle based on the amount of horizontal displacement between the latest image acquired based on the imaging signal and the image immediately preceding that latest image.
 According to a third aspect of the present invention, in the in-vehicle device of the first or second aspect, the image display unit preferably displays, as the image, a straight line representing the vehicle width of the host vehicle and a polygon representing the steering angle.
 According to a fourth aspect of the present invention, in the in-vehicle device of the third aspect, the image display unit preferably displays the straight line in a color that differs, in at least one of hue, saturation, and brightness, by a predetermined amount or more from the color of the region of the video forming the background of the straight line.
 According to a fifth aspect of the present invention, in the in-vehicle device of any one of the first through fourth aspects, the image display unit preferably displays, together with the image representing the vehicle width and the steering angle of the host vehicle, an image representing the ground height of the bumper of the host vehicle, superimposed on the video on the display device.
 According to a sixth aspect of the present invention, the in-vehicle device of any one of the first through fifth aspects preferably further comprises a speed detection unit that detects the speed of the host vehicle based on the imaging signal.
 According to a seventh aspect of the present invention, in the in-vehicle device of the sixth aspect, the image display unit preferably does not display the image representing the vehicle width and the steering angle of the host vehicle on the display device when the speed exceeds a predetermined speed.
 According to an eighth aspect of the present invention, an image processing program causes a computer to execute: an imaging step in which an imaging signal obtained by imaging the surroundings of a host vehicle is input; a video display step of displaying video based on the imaging signal on a display device; a steering angle detection step of detecting a steering angle of the host vehicle based on the imaging signal; and an image display step of displaying, based on the steering angle, an image representing the vehicle width and the steering angle of the host vehicle on the display device, superimposed on the video.
 According to the present invention, the vehicle width of the host vehicle and the magnitude of its steering angle can be easily grasped from the screen.
FIG. 1 is a block diagram showing the overall configuration of the in-vehicle device in the first embodiment.
FIG. 2 shows the installation location and imaging range of the rear camera 14.
FIG. 3 shows a display example of the liquid crystal monitor 16 when the host vehicle 2 is traveling straight backward.
FIG. 4 shows a display example of the liquid crystal monitor 16 when the host vehicle 2 travels backward while the steering wheel is being turned.
FIG. 5 is a diagram for explaining the steering angle and vehicle speed detection processing performed by the detection unit 11b.
FIG. 6 is a flowchart of the detection processing performed by the detection unit 11b.
FIG. 7 shows an example of the screen display of the in-vehicle device 1 according to the second embodiment.
FIG. 8 shows how the control program is provided.
(First Embodiment)
 FIG. 1 is a block diagram showing the overall configuration of the in-vehicle device in the present embodiment. The in-vehicle device 1 is mounted on a vehicle, but is not connected to the vehicle except for a power supply line for receiving operating power. It therefore cannot receive the various signals the vehicle outputs, such as vehicle speed pulses. Nor is it provided with a network connection such as a CAN (Controller Area Network). Hereinafter, the vehicle on which the in-vehicle device 1 is mounted is referred to as the host vehicle.
 The in-vehicle device 1 includes a control circuit 11 that controls each unit. A ROM 12, a RAM 13, a rear camera 14, an input device 15, a liquid crystal monitor 16, and a speaker 17 are each connected to the control circuit 11.
 The control circuit 11 consists of a microprocessor and its peripheral circuits. It is a computer that performs various kinds of control by executing a predetermined control program stored in the ROM 12, using the RAM 13 as a work area. The rear camera 14 is installed at the rear of the host vehicle. It images the lower rear area of the host vehicle at, for example, 30 frames per second and outputs an imaging signal such as NTSC to the control circuit 11, which converts the signal into image data by a known technique.
 The input device 15 is, for example, a touch panel or a remote controller. The liquid crystal monitor 16 displays images and the like on its screen based on image data output from the control circuit 11. The speaker 17 emits sound to the occupants of the host vehicle based on an audio signal output from the control circuit 11.
 The control circuit 11 includes a video display unit 11a, a detection unit 11b, and an image display unit 11c. Each of these functional units is realized in software by the control circuit 11 executing the control program stored in the ROM 12.
 The video display unit 11a displays the video captured by the rear camera 14 on the liquid crystal monitor 16. Since the rear camera 14 captures images at 30 frames per second as described above, the video display unit 11a likewise updates the liquid crystal monitor 16 at 30 frames per second. Specifically, it outputs image data based on the imaging signal from the rear camera 14 to the liquid crystal monitor 16, whereby the video captured by the rear camera 14 is displayed on the liquid crystal monitor 16.
 The detection unit 11b analyzes the image data based on the imaging signal from the rear camera 14 and detects the current steering angle and vehicle speed of the host vehicle. The details of the detection processing performed by the detection unit 11b are described later.
 Based on the steering angle of the host vehicle detected by the detection unit 11b, the image display unit 11c displays on the liquid crystal monitor 16 a straight line representing the vehicle width of the host vehicle and a polygon representing the steering angle of the host vehicle. The straight line and polygon are displayed superimposed on the video displayed by the video display unit 11a.
 FIG. 2 shows the installation location and imaging range of the rear camera 14. FIG. 2(a) is a top view of the host vehicle 2, and FIG. 2(b) is a side view of the host vehicle 2. As shown in FIGS. 2(a) and 2(b), the rear camera 14 is installed at the rear of the host vehicle 2 and images the range indicated by symbol A in the figures, that is, the lower rear area of the host vehicle 2. The rear camera 14 may be installed anywhere from which the surroundings of the host vehicle 2 can be imaged. Information such as the vehicle width and the installation location and angle of the rear camera 14 is given to the in-vehicle device 1 in advance by a known calibration method or the like.
 Next, the display screen of the liquid crystal monitor 16 is described. The liquid crystal monitor 16 shows the video captured by the rear camera 14 and displayed by the video display unit 11a, with the straight lines and polygon displayed over it by the image display unit 11c. The display produced by the image display unit 11c is described below.
 FIG. 3 shows a display example of the liquid crystal monitor 16 when the host vehicle 2 is traveling straight backward. FIG. 3(a) shows the display screen SC1 in this case. The background of the screen SC1 is the video captured by the rear camera 14. The image display unit 11c displays two straight lines LL and LR representing the vehicle width of the host vehicle 2, superimposed on this video.
 FIG. 3(b) projects the straight lines LL and LR displayed on the screen SC1 onto a top view of the host vehicle 2. The lines LL and LR do not exist on the actual road surface. As shown in FIG. 3(b), the lines LL and LR displayed on the screen SC1 indicate that, if the host vehicle 2 moves backward at the current steering angle, the left and right ends of the vehicle body will reach the locations indicated by LL and LR, respectively. By checking the positions of the lines LL and LR on the display screen of the liquid crystal monitor 16, the driver can grasp the positional relationship between the host vehicle 2 and obstacles or the like on the road.
 Note that the video display unit 11a displays the video captured by the rear camera 14 mirrored left to right. That is, the direction indicated by symbol D1 in FIG. 3(a) is the right-hand direction as seen by the driver of the host vehicle 2. This aligns the left-right orientation of the liquid crystal monitor 16 with that of the rearview mirror of the host vehicle 2.
 The image display unit 11c determines the color of the lines LL and LR according to the color of the video forming the background of the screen. For example, when a concrete road surface appears on the liquid crystal monitor 16, drawing the lines LL and LR in gray gives poor visibility. The image display unit 11c therefore draws the lines LL and LR in the complementary color of the background color. Instead of the complementary color, the lines LL and LR may be drawn in a color whose hue differs from the background by a predetermined amount or more, or in a color whose saturation or brightness, rather than hue, differs by a predetermined amount or more.
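The complementary-color choice described above can be illustrated with a short sketch. This is not the patented implementation; the function names and the use of an average color over the background region are assumptions introduced for illustration only.

```python
def complement(rgb):
    """Return the RGB complement of a color (0-255 per channel)."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

def guideline_color(background_pixels):
    """Pick a guide-line color that contrasts with the background region.

    background_pixels: iterable of (r, g, b) tuples sampled from the
    area of the video that the guide line will overlap.
    """
    n = 0
    sr = sg = sb = 0
    for r, g, b in background_pixels:
        sr += r
        sg += g
        sb += b
        n += 1
    avg = (sr // n, sg // n, sb // n)
    return complement(avg)
```

Note that for a near-gray background the pure complement is itself near-gray, which is one reason the text also allows choosing a color that differs by a set amount in saturation or brightness instead.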
 FIG. 4 shows a display example of the liquid crystal monitor 16 when the host vehicle 2 travels backward while the steering wheel is being turned. FIG. 4(a) shows the display screen SC2, and FIG. 4(b) shows an overhead view of the host vehicle 2. When the steering angle of the host vehicle 2 is, for example, the angle A1 shown in FIG. 4(b), the left and right ends of the host vehicle 2 no longer follow the straight lines LL and LR parallel to the vehicle as in FIG. 3. Instead, they follow the straight lines LL' and LR' shown in FIG. 4(b), angled relative to LL and LR according to the steering angle. As shown in FIG. 4(a), the image display unit 11c displays the lines LL' and LR' angled according to the steering angle.
 The image display unit 11c further draws a triangle corresponding to the steering angle against whichever of the two lines LL' and LR' swings toward the inside of the host vehicle 2. Specifically, as shown in FIG. 4(a), a triangle S is drawn with a line pattern between the position where the line LL had been drawn and the line LL'. By looking at the triangle S displayed on the liquid crystal monitor 16 while moving the host vehicle 2 backward, the driver can grasp how sharply the host vehicle 2 is turning.
 Next, the detection of the steering angle and vehicle speed of the host vehicle 2 by the detection unit 11b is described in detail. The detection unit 11b performs this detection based on two consecutive frames captured by the rear camera 14. Specifically, it detects the steering angle and vehicle speed of the host vehicle 2 by comparing the image data based on the latest imaging signal output by the rear camera 14 with the immediately preceding image data.
 FIG. 5 illustrates the steering angle and vehicle speed detection processing performed by the detection unit 11b. Suppose that at a certain time t1, the frame P1 shown in FIG. 5(a) is obtained from the imaging signal of the rear camera 14. The detection unit 11b extracts image data for a predetermined range from the frame P1 and stores it in the RAM 13. This extracted portion of the image data is hereinafter called the partial image data. For example, image data for a fixed range near the center of the frame P1, such as the range 51 shown in FIG. 5(a), is stored as the partial image data.
 Next, suppose that the frame P2 following the frame P1 is obtained at time t2. The detection unit 11b performs block matching between the frame P2 and the partial image data stored in the RAM 13. As shown in FIG. 5(b), the search range for the block matching is a range 52 larger than the range 51 shown in FIG. 5(a).
 The block matching proceeds as follows. First, starting from the upper left corner of the search range 52, a range 53 of the same size as the range 51 is compared pixel by pixel with the partial image data stored in the RAM 13. Specifically, the difference in luminance value is calculated for each pixel, and the sum of the absolute values of these differences is obtained. The range 53 is then shifted one pixel to the right and the sum is computed again. In this way, the partial image data is compared against every position in the search range 52.
 Once the cumulative value has been calculated for every position in the search range 52, the detection unit 11b compares the minimum cumulative value with a predetermined threshold. If the minimum cumulative value is at or below the threshold, the detection unit 11b regards the matching as successful and obtains the steering angle and vehicle speed by the method shown in FIG. 5(c). In FIG. 5(c), the position of the successful match is denoted as the range 53' and the position of the partial image data as the range 51.
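The search procedure described above amounts to a standard sum-of-absolute-differences (SAD) block match. The following is an illustrative sketch under stated assumptions, not the patented code: the arrays are taken to be 2-D luminance images, and the names and NumPy representation are invented for illustration.

```python
import numpy as np

def block_match(template, search, threshold):
    """Find the position in `search` whose window best matches `template`
    by minimizing the sum of absolute luminance differences (SAD).

    template: 2-D array (the stored partial image data, range 51)
    search:   2-D array (the search range 52 from the next frame)
    Returns (dy, dx) of the best-matching window, or None when the
    minimum SAD exceeds `threshold` (matching deemed to have failed).
    """
    th, tw = template.shape
    sh, sw = search.shape
    best_pos = None
    best_sad = None
    # Slide the window over every position in the search range.
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            window = search[y:y + th, x:x + tw]
            sad = np.abs(window.astype(int) - template.astype(int)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_pos = sad, (y, x)
    if best_sad > threshold:
        return None  # e.g. the vehicle moved too fast between frames
    return best_pos
```

In practice a single-pass SAD like this is O(search area × template area); the text's description matches this brute-force form, and any optimization would be an implementation choice.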
 The detection unit 11b calculates the vertical and horizontal on-screen displacement between the range 53' and the range 51. In FIG. 5(c), the vertical displacement is dy and the horizontal displacement is dx. The detection unit 11b converts the on-screen displacements dx and dy into real-space distances and calculates the actual steering angle and vehicle speed. Specifically, since the interval between the time t1 at which the frame P1 was acquired and the time t2 at which the frame P2 was acquired is known, the detection unit can calculate the steering angle from dx and the vehicle speed from dy by a computation that takes the position information of the rear camera 14 into account. The larger the steering angle, the larger dx becomes; the higher the vehicle speed, the larger dy becomes.
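Under the simplifying assumption that calibration supplies a single fixed ground-plane scale (meters per pixel), the conversion from the on-screen shifts dx, dy to motion estimates can be sketched as follows. The fixed-scale model and all names here are assumptions; a real implementation would use the rear camera's full position and angle information, as the text states.

```python
import math

def shifts_to_motion(dx, dy, meters_per_pixel, dt):
    """Convert the on-screen shifts dx, dy (FIG. 5(c)) into rough
    motion estimates, assuming a fixed ground-plane scale.

    dx, dy:           horizontal / vertical shift in pixels
    meters_per_pixel: ground-plane scale obtained from calibration
    dt:               interval t2 - t1 between the frames, in seconds
    Returns (steering_angle_rad, speed_m_per_s).
    """
    lateral = dx * meters_per_pixel   # sideways ground displacement
    forward = dy * meters_per_pixel   # fore-aft ground displacement
    # As stated in the text: the steering angle grows with dx and the
    # vehicle speed grows with dy.
    angle = math.atan2(lateral, abs(forward)) if forward else 0.0
    speed = abs(forward) / dt
    return angle, speed
```

A pure straight-back motion (dx = 0) yields a zero steering angle, and equal lateral and fore-aft displacement yields a 45-degree angle, consistent with the monotonic relationship the text describes.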
 On the other hand, if the minimum cumulative value exceeds the predetermined threshold, the detection unit 11b regards the matching as having failed. A matching failure is caused, for example, by the host vehicle 2 moving so fast that the imaging range of the rear camera 14 changes between time t1 and time t2 by more than the search range 52. In such a case, the image display unit 11c stops the display on the liquid crystal monitor 16.
 The image display unit 11c performs the display described above based on the steering angle thus detected by the detection unit 11b. It does not perform that display while the host vehicle 2 is stationary, that is, while the vehicle speed detected by the detection unit 11b is at or below a predetermined threshold. This is because the imaging signal from the rear camera 14 does not change while the host vehicle 2 is stationary, so the detection unit 11b cannot detect the steering angle. However, during maneuvers that repeatedly alternate between moving and stopping, the display by the image display unit 11c would switch on and off frequently and become distracting. The image display unit 11c therefore suppresses the display only after the host vehicle 2 has remained stationary continuously for a predetermined time (for example, 5 seconds).
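The hide-only-after-continuous-stillness behavior described above can be sketched as a small state machine. The class and parameter names, and the timestamp-based interface, are assumptions introduced for illustration.

```python
class StillnessGate:
    """Hide the guide display only after the vehicle has been
    continuously stationary for `hold_s` seconds (e.g. 5 s), so that
    stop-and-go maneuvering does not make the display flicker.
    """
    def __init__(self, hold_s=5.0, speed_threshold=0.05):
        self.hold_s = hold_s
        self.speed_threshold = speed_threshold
        self.still_since = None  # time at which the vehicle last stopped

    def should_display(self, speed, now):
        """speed: detected vehicle speed; now: current time in seconds."""
        if speed > self.speed_threshold:
            self.still_since = None    # moving: always display
            return True
        if self.still_since is None:
            self.still_since = now     # just stopped: keep displaying
        return (now - self.still_since) < self.hold_s
```

Called once per frame with the speed from the detection step, this keeps the guides visible through brief stops and removes them only after the sustained stillness the text specifies.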
 FIG. 6 is a flowchart of the detection processing performed by the detection unit 11b. This processing is realized in software by the control circuit 11 executing the control program (image processing program) stored in the ROM 12. First, in step S10, the control circuit 11 creates image data based on the imaging signal from the rear camera 14 and stores it in the RAM 13. In step S20, the video display unit 11a displays video based on the image data stored in the RAM 13 on the liquid crystal monitor 16. In step S30, the detection unit 11b performs block matching between the partial image data stored in the RAM 13 and the image data stored in the RAM 13 in step S10. This processing yields the minimum cumulative value described above.
 In step S40, it is determined whether the cumulative value calculated in step S30 is at or below the predetermined threshold. If the cumulative value exceeds the threshold, the processing proceeds to step S80. Otherwise, the processing proceeds to step S50, in which the detection unit 11b detects the steering angle and vehicle speed of the host vehicle 2 based on the result of the block matching in step S30.
 In step S60, it is determined whether the predetermined time has elapsed with the host vehicle 2 remaining stationary. If so, the processing proceeds to step S80. Otherwise, that is, if the host vehicle is currently moving or has been stationary for less than the predetermined time, the processing proceeds to step S70, in which the image display unit 11c displays on the liquid crystal monitor 16 the two straight lines representing the vehicle width of the host vehicle 2 and the triangle representing its steering angle. In step S80, partial image data is extracted from the predetermined range of the latest frame P2, such as the range 51 shown in FIG. 5(a), and stored in the RAM 13.
 The in-vehicle device according to the first embodiment described above provides the following advantageous effects.
(1) The image display unit 11c displays on the liquid crystal monitor 16, superimposed on the video displayed by the video display unit 11a, a straight line representing the vehicle width of the host vehicle and a triangle representing the magnitude of the steering angle. The movement trajectory of the host vehicle 2, taking the steering angle into account, and the magnitude of the steering angle of the host vehicle 2 can thereby be easily grasped from the display screen of the liquid crystal monitor 16.
(2) The detection unit 11b detects the steering angle and vehicle speed of the host vehicle 2 using only the output of the rear camera 14. This eliminates the need to obtain signals from the host vehicle 2 or the outputs of various sensors, reducing the cost of the in-vehicle device 1.
(3) The image display unit 11c displays the straight lines representing the vehicle width of the host vehicle in the complementary color of the video displayed on the liquid crystal monitor 16 by the video display unit 11a. The visibility of the display screen of the liquid crystal monitor 16 therefore does not deteriorate, whatever the color of the surroundings of the host vehicle 2.
(Second Embodiment)
 The in-vehicle device of this embodiment has the same configuration as the in-vehicle device according to the first embodiment. Members, circuits, and functional units similar to those of the first embodiment are given the same reference numerals, and their description is omitted.
 The image display unit 11c of the in-vehicle device 1 according to this embodiment displays, on the straight lines representing the vehicle width of the host vehicle 2, a representation of the ground height of the bumper of the host vehicle 2. The ground height of the bumper of the host vehicle 2, that is, the height from the road surface to the upper end of the bumper, is assumed to have been given to the in-vehicle device 1 in advance by a calibration process or the like performed beforehand.
 FIG. 7 shows an example of the screen display of the in-vehicle device 1 according to the second embodiment. On the screen SC3, two straight lines LL'' and LR'' representing the vehicle width of the host vehicle 2 are displayed. On each of these lines, a plane WL or WR representing the ground height of the bumper is displayed. The planes WL and WR are rendered three-dimensionally so as to span from the road surface up to the ground height of the bumper.
 The in-vehicle device according to the second embodiment described above provides the following advantageous effect in addition to those of the first embodiment.
(1) The image display unit 11c displays on the liquid crystal monitor 16 the straight lines representing the vehicle width of the host vehicle 2, the triangle representing the magnitude of the steering angle, and the planes representing the ground height of the bumper of the host vehicle. The distance relationship between the host vehicle 2 and obstacles and the like can thereby be grasped more reliably.
 The following modifications are also within the scope of the present invention, and one or more of them may be combined with the embodiments described above.
(Modification 1)
 The detection unit 11b may detect the steering angle and vehicle speed of the host vehicle 2 by a method other than block matching. For example, it may detect them by extracting feature points from the image data using a known technique and calculating how far those feature points move between frames.
(Modification 2)
 The detection unit 11b may detect the steering angle and vehicle speed of the host vehicle 2 from three or more consecutive frames.
(Modification 3)
 The image display unit 11c may determine the color of the straight lines representing the vehicle width of the host vehicle 2 and of the triangle representing the magnitude of the steering angle from factors other than the background color. For example, when the in-vehicle device 1 is a navigation device, the color may be determined based on the current location and map data. The color may also be determined by the state of the host vehicle 2, the output of an illuminance sensor, the time of day, or the like. Furthermore, decorations other than color, such as the thickness of the lines or the pattern of the triangle, may be determined from these factors.
(Modification 4)
 The driver may be notified by voice or the like based on the detection result of the detection unit 11b or the content displayed on the screen by the image display unit 11c. For example, movement to the left or right, turns to the left or right, and forward or backward motion may be announced by voice or the like.
(Modification 5)
 When performing the block matching shown in FIG. 5, the range 51 may be divided into a plurality of regions and block matching performed for each region. For example, the range 51 may be divided equally into four columns and three rows, giving twelve small regions. For each small region, block matching is performed using, as the search range, a region obtained by enlarging that small region in all directions. The steering angle and vehicle speed of the host vehicle 2 may then be detected using the twelve resulting pairs of dx and dy.
(Modification 6)
 The portion of the in-vehicle device 1 of the above embodiments other than the rear camera 14, that is, the control circuit 11, ROM 12, RAM 13, input device 15, liquid crystal monitor 16, and speaker 17, may be replaced with a personal computer. In this case, the control program described above can be provided through a recording medium such as a CD-ROM, USB memory (flash memory), or memory card, or through a data signal such as the Internet. FIG. 8 illustrates this. The personal computer 100 receives the program via the CD-ROM 104. The personal computer 100 also has a function for connecting to a communication line 101. The computer 102 is a server computer that provides the program, and stores the program on a recording medium such as a hard disk 103. The communication line 101 is the Internet, a communication line for personal computer communication, a dedicated communication line, or the like. The computer 102 reads the program from the hard disk 103 and transmits it to the personal computer 100 via the communication line 101; that is, the program is transmitted as a data signal on a carrier wave over the communication line 101. The program can thus be supplied as a computer-readable computer program product in various forms, such as a recording medium or a data signal (carrier wave). If the in-vehicle device 1 of the above embodiments takes a similar configuration, its control program can be supplied in the same way.
As long as the characteristics of the present invention are not impaired, the present invention is not limited to the above-described embodiments; other forms conceivable within the scope of the technical idea of the present invention are also included within the scope of the present invention.
The disclosure of the following priority application is hereby incorporated herein by reference:
Japanese Patent Application No. 2009-178123 (filed July 30, 2009)

Claims (8)

  1.  An in-vehicle device comprising:
      an imaging unit that images the surroundings of a host vehicle and outputs an imaging signal;
      a video display unit that displays video based on the imaging signal on a display device;
      a steering angle detection unit that detects a steering angle of the host vehicle based on the imaging signal; and
      an image display unit that, based on the steering angle, displays an image representing the vehicle width of the host vehicle and the steering angle on the display device, superimposed on the video.
  2.  The in-vehicle device according to claim 1, wherein
      the steering angle detection unit detects the steering angle based on a horizontal shift amount between the latest image acquired based on the imaging signal and the image acquired immediately before the latest image.
  3.  The in-vehicle device according to claim 1 or 2, wherein
      the image display unit displays, as the image, a straight line representing the vehicle width of the host vehicle and a polygon representing the steering angle on the display device.
  4.  The in-vehicle device according to claim 3, wherein
      the image display unit displays the straight line in a color that differs, in at least one of hue, saturation, and brightness, by a predetermined amount or more from the color of the region of the video forming the background of the straight line.
  5.  The in-vehicle device according to any one of claims 1 to 4, wherein
      the image display unit displays, together with the image representing the vehicle width of the host vehicle and the steering angle, an image representing the ground clearance of a bumper of the host vehicle on the display device, superimposed on the video.
  6.  The in-vehicle device according to any one of claims 1 to 5, further comprising
      a speed detection unit that detects the speed of the host vehicle based on the imaging signal.
  7.  The in-vehicle device according to claim 6, wherein
      the image display unit does not display the image representing the vehicle width of the host vehicle and the steering angle on the display device when the speed exceeds a predetermined speed.
  8.  An image processing program that causes a computer to execute:
      an imaging step of receiving an imaging signal obtained by imaging the surroundings of a host vehicle;
      a video display step of displaying video based on the imaging signal on a display device;
      a steering angle detection step of detecting a steering angle of the host vehicle based on the imaging signal; and
      an image display step of, based on the steering angle, displaying an image representing the vehicle width of the host vehicle and the steering angle on the display device, superimposed on the video.
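As a rough illustration of the detection scheme recited in claim 2, mapping the horizontal shift between the latest image and the immediately preceding image to a steering angle, consider the sketch below. The function name, the per-pixel gain, and the clamp limit are invented calibration constants for illustration only, not values taken from this publication.

```python
def estimate_steering_angle(dx_values, gain_deg_per_px=0.5, max_angle_deg=35.0):
    """Map the median horizontal shift (in pixels) between two consecutive
    frames to a steering angle estimate in degrees, clamped to the
    physical steering range. gain_deg_per_px would in practice come from
    camera calibration."""
    dx_values = sorted(dx_values)
    median_dx = dx_values[len(dx_values) // 2]  # median resists outlier blocks
    angle = gain_deg_per_px * median_dx
    return max(-max_angle_deg, min(max_angle_deg, angle))
```

Using the median of the per-region shifts, rather than the mean, keeps a few mismatched blocks (for example, over a moving pedestrian) from skewing the estimate.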
PCT/JP2010/062927 2009-07-30 2010-07-30 In-vehicle device and image processing program WO2011013813A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011524862A JPWO2011013813A1 (en) 2009-07-30 2010-07-30 In-vehicle device and image processing program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009178123 2009-07-30
JP2009-178123 2009-07-30

Publications (1)

Publication Number Publication Date
WO2011013813A1 true WO2011013813A1 (en) 2011-02-03

Family

ID=43529464

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/062927 WO2011013813A1 (en) 2009-07-30 2010-07-30 In-vehicle device and image processing program

Country Status (2)

Country Link
JP (1) JPWO2011013813A1 (en)
WO (1) WO2011013813A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000272445A (en) * 1999-01-19 2000-10-03 Toyota Autom Loom Works Ltd Steering support device in reversing vehicle
JP2004147083A (en) * 2002-10-24 2004-05-20 Matsushita Electric Ind Co Ltd Driving support apparatus
JP2005329870A (en) * 2004-05-21 2005-12-02 Clarion Co Ltd Auxiliary information presentation device
JP2008137425A (en) * 2006-11-30 2008-06-19 Aisin Aw Co Ltd Parking assisting method and parking assisting device
JP2009017462A (en) * 2007-07-09 2009-01-22 Sanyo Electric Co Ltd Driving support system and vehicle

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013088995A1 (en) * 2011-12-13 2013-06-20 住友建機株式会社 Peripheral image display device and peripheral image display method for construction machinery
JP2013124467A (en) * 2011-12-13 2013-06-24 Sumitomo (Shi) Construction Machinery Co Ltd Surrounding image display device and surrounding image display method of construction machine
US20140267731A1 (en) * 2011-12-13 2014-09-18 Sumitomo(S.H.I.) Construction Machinery Co., Ltd. Peripheral image display device and method of displaying peripheral image for construction machine
US10017113B2 (en) 2011-12-13 2018-07-10 Sumitomo(S.H.I.) Construction Machinery Co., Ltd. Peripheral image display device and method of displaying peripheral image for construction machine
US10293751B2 (en) 2011-12-13 2019-05-21 Sumitomo (S.H.I.) Construction Machinery Co., Ltd. Peripheral image display device and method of displaying peripheral image for construction machine
WO2013157250A1 (en) * 2012-04-20 2013-10-24 京セラ株式会社 Image processing device and driving assisting method
US9789818B2 (en) 2012-04-20 2017-10-17 Kyocera Corporation Image processing apparatus, imaging apparatus and drive assisting method
JP2014146149A (en) * 2013-01-29 2014-08-14 Kyocera Corp Image processing apparatus and travel support method
EP3358840A4 (en) * 2015-09-30 2018-09-12 Aisin Seiki Kabushiki Kaisha Image processing device for vehicles
JP2016065449A (en) * 2015-12-01 2016-04-28 住友建機株式会社 Shovel
US20210248756A1 (en) * 2018-05-10 2021-08-12 Sony Corporation Image processing apparatus, vehicle-mounted apparatus, image processing method, and program
JP2019206912A (en) * 2019-08-19 2019-12-05 住友建機株式会社 Shovel and display device of shovel

Also Published As

Publication number Publication date
JPWO2011013813A1 (en) 2013-01-10

Similar Documents

Publication Publication Date Title
JP5454934B2 (en) Driving assistance device
JP4639753B2 (en) Driving assistance device
CN102616182B (en) Parking assistance system and method
JP5321267B2 (en) Vehicular image display device and overhead image display method
EP1916846B1 (en) Device and method for monitoring vehicle surroundings
JP5068779B2 (en) Vehicle surroundings overhead image display apparatus and method
JP4832321B2 (en) Camera posture estimation apparatus, vehicle, and camera posture estimation method
JP5212748B2 (en) Parking assistance device
WO2011013813A1 (en) In-vehicle device and image processing program
US20110228980A1 (en) Control apparatus and vehicle surrounding monitoring apparatus
WO2016132849A1 (en) Driving assistance system and driving assistance method
JP6379779B2 (en) Vehicle display device
JP4687411B2 (en) Vehicle peripheral image processing apparatus and program
WO2019008764A1 (en) Parking assistance method and parking assistance device
JP2005311868A (en) Vehicle periphery visually recognizing apparatus
JP2010184607A (en) Vehicle periphery displaying device
JP5471141B2 (en) Parking assistance device and parking assistance method
JP5549235B2 (en) Driving assistance device
JP5636493B2 (en) Image processing apparatus and image display apparatus
JP2010089716A (en) Parking assist apparatus and parking assist method
JP2007124097A (en) Apparatus for visually recognizing surrounding of vehicle
JP2006171950A (en) Display controller for head-up display, and program
JP2010183234A (en) Drawing apparatus
JP5273068B2 (en) Vehicle periphery monitoring device
JPH0981757A (en) Vehicle position detecting device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10804559

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011524862

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08/05/2012)

122 Ep: pct application non-entry in european phase

Ref document number: 10804559

Country of ref document: EP

Kind code of ref document: A1