JP2009040107A - Image display control device and image display control system - Google Patents


Info

Publication number
JP2009040107A
Authority
JP
Japan
Prior art keywords
object
image display
display control
image
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2007204383A
Other languages
Japanese (ja)
Inventor
Masanari Takagi
雅成 高木
Original Assignee
Denso Corp
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp (株式会社デンソー)
Priority to JP2007204383A priority Critical patent/JP2009040107A/en
Publication of JP2009040107A publication Critical patent/JP2009040107A/en
Pending legal-status Critical Current

Abstract

PROBLEM TO BE SOLVED: To provide an image display control device and an image display control system capable of highlighting that properly corresponds to the actual risk posed by obstacles around a vehicle.

SOLUTION: The image display control device 1 performs travel-state recognition processing in step 100, recognizing the travel state of the vehicle from signals provided by a vehicle speed sensor 9 and a steering angle sensor 11. In step 110 it performs travel-environment recognition processing, recognizing the surrounding environment and obstacles of the vehicle from images photographed by a surroundings-monitoring camera 5 and obstacle information provided by a radar 7. In step 120 it determines whether the travel-environment recognition processing has recognized an object (such as an obstacle); if so, highlighting processing is performed in step 130, which determines the risk level of each recognized object and highlights the object according to that risk level.

COPYRIGHT: (C)2009,JPO&INPIT

Description

  The present invention relates to an image display control device and an image display control system that are mounted on, for example, an automobile and can display surrounding images such as the front of the host vehicle.

  Conventionally, to help drive a vehicle safely, image display control apparatuses have been proposed that use a camera, a radar, or the like to detect pedestrians, other vehicles, and similar objects requiring attention, and display an image of such objects (obstacles) around the vehicle on a display or the like.

  For example, Patent Document 1 proposes a technique in which the on-screen position of a moving body is calculated from information provided by a moving-body detecting means such as a radar, highlighting is added to the image area where the moving body appears, and the blinking speed of the highlight is changed according to the degree of danger of the moving body.

  Further, Patent Document 2 proposes a technique in which, when at least one obstacle is estimated to be present within the range of the vehicle's estimated travel locus based on an image captured by a camera, a highlighted image is displayed for the obstacle estimated to be at the position closest to the vehicle in the traveling direction.

Furthermore, Patent Document 3 proposes a technique in which obstacles near the host vehicle are identified from their distance to the host vehicle, and, according to the result, obstacles at positions closer to the vehicle are highlighted.
JP-A-7-223487 (Patent Document 1)
JP-A-10-117340 (Patent Document 2)
JP-A-2001-76298 (Patent Document 3)

  However, in the technique of Patent Document 1, the danger level that controls the blinking speed of the highlighting is set only from the obstacle's distance to the host vehicle and its approach speed toward the host vehicle, so in some cases the display could not reflect the actual degree of danger.

For example, a road sign standing at the edge of the road and a person standing there are judged to pose the same risk.
Further, in the techniques of Patent Documents 2 and 3, objects closer to the host vehicle are always emphasized, so the display may not correspond to the actual risk level.

For example, the same risk is determined whether another vehicle is approaching or moving away from the host vehicle.
The present invention has been made in view of these problems, and its object is to provide an image display control device and an image display control system capable of highlighting appropriate to the actual risk level of obstacles around a vehicle.

  (1) The invention of claim 1 is an image display control device that detects an object existing around the host vehicle, sets a risk level of the object with respect to the host vehicle, and highlights an image showing the object on a screen according to that risk level. The device detects the speed of the host vehicle, the distance to the object, and the type of the object, sets the object's risk level according to these three quantities, and controls the highlighting of the image showing the object according to the setting result.

  In the present invention, the risk level of the object is set according to the speed of the host vehicle, the distance to the object, and the type of the object, and the highlighting of the image showing the object is controlled accordingly. Appropriate highlighting matching the true risk level is therefore possible.

(2) The invention of claim 2 is characterized in that the image showing the object is at least one of an image of the object itself and a mark designating the object.
In the present invention, when highlighting is performed, the image of the object itself can be emphasized by blinking it or changing its brightness. Alternatively, instead of the image itself, a mark designating the image of a pedestrian or vehicle, for example a frame drawn so as to surround the image, can be emphasized.

  (3) In the invention of claim 3, among the objects existing within the predicted travel locus of the host vehicle and in the vicinity of that locus, the risk level is set for at least the objects existing within the predicted travel locus.

  The predicted travel locus of the vehicle (for example, the lane in which it is scheduled to travel) is where the vehicle is most likely to actually travel, and in the region alongside the locus (an area of predetermined width extending from the lane) an obstacle is likely to collide with the vehicle; the risk level of objects within the locus or this nearby region is therefore raised. It is also desirable to rate objects inside the locus as riskier than those merely near it.
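The membership test described above — object in the locus, near the locus, or outside — can be sketched in Python. The bicycle-model arc, the function name, and all widths and thresholds below are illustrative assumptions; the patent only states that the locus is derived from the vehicle speed and steering angle.

```python
import math

def path_zone(obj_x, obj_y, steering_angle_rad,
              wheelbase=2.7, lane_half_width=1.75, margin=1.0):
    # Vehicle-centred coordinates: x forward (m), y to the left (m).
    if abs(steering_angle_rad) < 1e-4:
        # Nearly straight: the predicted locus is the x-axis,
        # so the lateral offset is simply |y|.
        lateral = abs(obj_y)
    else:
        # Bicycle-model arc: turning radius R = wheelbase / tan(delta),
        # with the arc centre at (0, R).
        radius = wheelbase / math.tan(steering_angle_rad)
        lateral = abs(math.hypot(obj_x, obj_y - radius) - abs(radius))
    if lateral <= lane_half_width:
        return "in_path"    # raise the risk level the most
    if lateral <= lane_half_width + margin:
        return "near_path"  # raise the risk level somewhat
    return "outside"
```

With the assumed half-width of 1.75 m, an object 2 m to the side on a straight path falls in the "near_path" band, matching the two-tier weighting the text describes.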

(4) The invention of claim 4 is characterized in that the degree of danger is set according to the moving direction of the object, and the highlighting is adjusted accordingly.
When the object approaches the vehicle or moves toward the predicted travel locus, the likelihood of a collision is high, so the object's risk level is set high.

(5) The invention of claim 5 is characterized in that the highlighting is adjusted according to a blinking speed of an image showing the object.
In the present invention, as a highlighting method, the blinking speed of an image (for example, a frame or the object itself) showing the object is changed. For example, the greater the degree of danger, the faster the blinking speed.

(6) The invention of claim 6 is characterized in that an image showing the object is highlighted by vibrating it on the screen.
In the present invention, as a highlighting method, the image showing the object (for example, the image of the pedestrian itself, or its frame) is vibrated on the screen. For example, the greater the degree of danger, the higher the vibration frequency and the larger the amplitude.

  (7) In the invention of claim 7, when there are a plurality of the objects (for example, A, B, and C), the images showing the objects are highlighted in order from the highest degree of risk.

For example, when the risk ranking is A, B, C, the images blink in that order (A → B → C), so the driver can easily recognize the highest-risk objects.
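The sequential A → B → C highlighting can be sketched as a simple time-slot scheduler. The slot length and the `blink_schedule` interface are assumptions for illustration; the patent does not specify how the cycle is timed.

```python
def blink_schedule(objects, slot_ms=500):
    # Sort by risk, highest first; one object blinks per time slot,
    # cycling back to the highest-risk object after the last.
    ordered = sorted(objects, key=lambda o: o[1], reverse=True)
    def active(elapsed_ms):
        return ordered[(elapsed_ms // slot_ms) % len(ordered)][0]
    return active

active = blink_schedule([("A", 0.9), ("B", 0.6), ("C", 0.3)])
# active(0) -> "A", active(500) -> "B", active(1000) -> "C", then "A" again
```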
(8) The invention of claim 8 is characterized in that a driver's line of sight is detected, and an image showing the object in a direction not seen by the driver is highlighted.

  In the present invention, since an image of an object in a direction that the driver is not looking at is highlighted, even in a situation where the driver tends to overlook, it is possible to effectively notify the dangerous object. Thereby, safety is greatly improved.

(9) The invention of claim 9 is characterized in that the highlighting is strengthened for an object having a higher degree of risk.
In the present invention, since the degree of emphasis is increased as the degree of danger increases, the degree of danger can be easily recognized.

  (10) The invention of claim 10 (an image display control system) comprises the image display control device according to any one of claims 1 to 9, together with at least an imaging device (for example, a surroundings-monitoring camera) and an image display device (for example, a display).

  The present invention shows a system configuration using an image display control device. In addition to the imaging device, a radar, various sensors, or the like can be used.

Embodiments of the present invention will be described below with reference to the drawings.
[First Embodiment]
The image display control device and image display control system according to the present embodiment detect objects (obstacles) around a vehicle and control the display of images of those obstacles on a display or the like according to their degree of risk.

a) First, the configuration of the image display control apparatus used in the present embodiment will be described.
As shown in FIG. 1, the image display control apparatus 1 according to this embodiment performs control such as image highlighting in an image display control system 3 mounted on a vehicle, and consists of an electronic control unit serving as the processing section.

  The image display control device 1 is connected to a surroundings-monitoring camera 5 (see FIG. 2) that photographs the area ahead of the vehicle, a radar 7 that detects obstacles ahead of the vehicle, a vehicle speed sensor 9 that detects the vehicle speed, a steering angle sensor 11 that detects the steering angle, a navigation device 13, a display 15 serving as the output unit that displays images, and the like.

  As will be described in detail later, the image display control device 1 performs processing such as travel-environment recognition, travel-state recognition, and object highlighting based on information obtained from the surroundings-monitoring camera 5, the radar 7, the vehicle speed sensor 9, the steering angle sensor 11, the navigation device 13, and the like, and displays images of obstacles and the like on the display 15 based on the results.

  For example, based on the signal from the surroundings-monitoring camera 5, the image display control device 1 can recognize the type of an object and obtain its moving direction and moving speed. Further, based on the signal from the radar 7, it can determine the distance to the object, the object's moving speed, the relative speed with respect to the object, and the like.

As will be described later, the risk level of the object can be set on the basis of the information, and highlighting can be performed according to the risk level.
b) Next, processing contents performed in the image display control apparatus 1 will be described.

(1) First, main processing will be described based on the flowchart of FIG.
As shown in FIG. 3, the image display control device 1 performs a travel-state recognition process in step (S) 100.

The travel state recognition process is a process for recognizing the travel state of the vehicle based on signals obtained from the vehicle speed sensor 9 and the steering angle sensor 11.
In the following step 110, a driving environment recognition process is performed.

  This travel-environment recognition process recognizes the environment and obstacles around the vehicle based on the images taken by the surroundings-monitoring camera 5 and the surrounding information (including obstacles) obtained by the radar 7. By also using navigation information, road conditions can be grasped more accurately.

For example, a pedestrian can be recognized as follows.
As shown in FIG. 4(a), a pedestrian classifier is prepared in advance (for example, in the image display control device 1) by learning from image data of pedestrians and non-pedestrians, and this classifier is used to determine whether a detected obstacle is a pedestrian.

  Specifically, as shown in FIG. 4(b), the pedestrian classifier compares unknown data, such as shape features of the obstacle extracted from the image data and the distance to the obstacle measured by the radar 7, against the learning data to identify whether the obstacle is a pedestrian.
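The comparison of unknown data against learning data can be sketched with a k-nearest-neighbours vote. The patent does not fix the learning algorithm, so k-NN, the two-element feature vector (a shape score plus radar distance), and all sample values below are assumptions.

```python
import math

def knn_classify(feature, learning_data, k=3):
    # Rank labelled samples by Euclidean distance to the unknown
    # feature vector, then take a majority vote over the k nearest.
    nearest = sorted(learning_data, key=lambda s: math.dist(feature, s[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Hypothetical learning data: (shape score, distance in m) -> label.
learning = [([0.90, 2.1], "pedestrian"), ([0.80, 1.9], "pedestrian"),
            ([0.85, 2.0], "pedestrian"),
            ([0.20, 4.5], "vehicle"), ([0.30, 4.8], "vehicle")]
```

As the text notes, the same scheme extends to vehicles and road signs by training further classifiers of the same kind.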

Vehicles and road signs can also be recognized by using a similar discriminator.
In the following step 120, it is determined whether or not an object (such as an obstacle) has been recognized by the above-described traveling environment recognition process. If an affirmative determination is made here, the process proceeds to step 130, whereas if a negative determination is made, the present process is temporarily terminated.

In step 130, a highlighting process is performed, and the process is temporarily terminated.
As will be described later, the highlighting process determines the risk level of each object recognized by the travel-environment recognition process and highlights the object according to that risk level (for example, by image correction, by vibrating the detection area, and so on).
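The main flow of steps 100 to 130 can be sketched as one processing pass. The `sensors`/`recognizer`/`highlighter` interfaces are placeholders invented for this sketch, not names from the patent.

```python
def main_process(sensors, recognizer, highlighter):
    # S100: travel-state recognition from vehicle speed and steering angle.
    state = recognizer.travel_state(sensors.vehicle_speed(),
                                    sensors.steering_angle())
    # S110: travel-environment recognition from camera image and radar.
    objects = recognizer.travel_environment(sensors.camera_image(),
                                            sensors.radar_targets())
    # S120: if no object was recognized, end this pass.
    if not objects:
        return
    # S130: set a risk level per object and highlight it accordingly.
    for obj in objects:
        highlighter.emphasize(obj, highlighter.risk(obj, state))
```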

(2) Next, the highlighting process will be described based on the flowchart of FIG.
In step 200 of FIG. 5, it is determined whether or not the recognized object is a pedestrian. If it is a pedestrian, the process proceeds to step 210; otherwise, it proceeds to step 230.

  In step 210, the risk level of the pedestrian is set using the speed V of the host vehicle and the distance L to the pedestrian, taking the pedestrian's moving direction into account. To factor in the moving direction, the predicted travel range of the host vehicle and its nearby region are obtained from the vehicle speed and steering angle, and it is considered whether the pedestrian is approaching that predicted travel range.

  Then, for example, as shown in FIG. 6, when the degree of risk is expressed by the blinking speed of an image indicating the pedestrian (for example, a highlight frame surrounding the pedestrian's image on the display 15), the blinking speed V′ of the highlight frame is first determined according to the set risk level.

For example, the blinking speed V′ of the frame indicating the pedestrian is determined by arithmetic expression (1) or (2) shown in FIG. 7. These expressions use the speed V of the host vehicle and the distance L to the pedestrian, together with a weight α_ped for the vehicle speed, a weight β_ped for the distance, and a weight γ_ped for the pedestrian's moving direction.

Here, constants can be adopted for the vehicle-speed weight α_ped and the distance weight β_ped, for example. The weight γ_ped for the pedestrian's moving direction is set large when the pedestrian is moving toward the host vehicle or its predicted travel locus, because the degree of danger increases in that case.

  Note that, as shown in FIG. 8 for example, the pedestrian's moving direction can be obtained from the movement of feature points between the image data at time t and at time t+1, as the pedestrian's image changes from t to t+1. The predicted travel locus can be obtained from the vehicle speed and the steering angle, and its accuracy may be improved by also taking navigation information into account.
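The FIG. 8 idea — direction from feature-point movement between frames t and t+1 — reduces to averaging point displacements. Assuming the feature points are already matched (point i in one frame corresponds to point i in the other, which real trackers must establish first), a minimal sketch is:

```python
def moving_direction(points_t, points_t1):
    # Mean displacement of matched feature points between two frames;
    # each argument is a list of (x, y) image coordinates.
    n = len(points_t)
    dx = sum(b[0] - a[0] for a, b in zip(points_t, points_t1)) / n
    dy = sum(b[1] - a[1] for a, b in zip(points_t, points_t1)) / n
    return dx, dy

# Points shifted 3 px right and 1 px down between frames:
# moving_direction([(10, 5), (12, 8)], [(13, 6), (15, 9)]) -> (3.0, 1.0)
```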

  Therefore, using expressions (1) and (2), the blinking speed is increased when the host vehicle is fast, decreased when the distance is long, and increased when the pedestrian is moving toward the vehicle's direction of travel. In other words, the higher the risk level of the pedestrian, the faster the frame indicating the pedestrian blinks.
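One possible shape for expressions (1) and (2) consistent with that behaviour is sketched below. The actual formulas appear only in FIG. 7, so the linear form, the constants, and the multiplicative use of γ_ped here are all assumptions.

```python
def blink_speed(v_own, distance, approaching,
                alpha=0.05, beta=2.0, gamma=1.5, base=1.0):
    # Faster own vehicle -> faster blinking; longer distance ->
    # slower blinking (assumed V' = base + alpha*V + beta/L).
    v_prime = base + alpha * v_own + beta / max(distance, 1.0)
    if approaching:
        # gamma_ped: boost when the pedestrian moves toward the
        # host vehicle or its predicted travel locus.
        v_prime *= gamma
    return v_prime  # e.g. blinks per second
```

With the assumed constants, a pedestrian 5 m away at 20 m/s own speed yields about 2.4 blinks per second, rising by half again if the pedestrian is approaching.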

In the subsequent step 220, the pattern to be highlighted and the degree of emphasis are determined, a signal for highlighting is output to the display 15, and the process is temporarily terminated.
For example, when a frame is selected as the highlighting pattern and the degree of emphasis is expressed by its blinking speed, the frame is blinked at the speed set in step 210.

Here, the case where the blinking speed of the frame is changed according to the degree of risk is taken as an example, but the degree of highlighting may be increased with the degree of danger by other methods as well.
For example, an object such as a pedestrian may be emphasized by enclosing it with a frame, by contrast correction, brightness correction, sharpness correction (outline emphasis), or hue correction, or by vibrating the detection area.

  On the other hand, in step 230, reached when the object is determined not to be a pedestrian in step 200, it is determined whether the recognized object is a vehicle (another vehicle). If it is another vehicle, the process proceeds to step 240; otherwise, to step 260.

  In step 240, the blinking speed V′ of the highlight frame used when displaying the image of the other vehicle on the display 15 is determined using, for example, the speed V of the host vehicle and the distance L to the other vehicle, taking the other vehicle's moving direction into account.

For example, the blinking speed V′ of the frame indicating the other vehicle is determined by expression (3) or (4) shown in FIG. 7. These expressions use the speed V of the host vehicle and the distance L to the other vehicle, together with a vehicle-speed weight α_veh, a distance weight β_veh, and a weight γ_veh for the other vehicle's moving direction.

Here, constants can be adopted for the vehicle-speed weight α_veh and the distance weight β_veh, for example. The weight γ_veh for the other vehicle's moving direction is set large when the other vehicle is moving toward the host vehicle or its predicted travel locus, because the degree of danger increases in that case.

The moving direction of the other vehicle can be obtained in the same way as for the pedestrian described above.
Therefore, using expressions (3) and (4), the blinking speed is increased when the host vehicle is fast, decreased when the distance is long, and increased when the other vehicle is moving toward the host vehicle's direction of travel. That is, the higher the degree of danger of the other vehicle, the faster the frame indicating it blinks.

  In the following step 250, based on the blinking speed set in step 240, the frame is blinked in the same manner as in step 220, and the process is temporarily terminated.

  On the other hand, in step 260, reached when the object is determined not to be a vehicle in step 230, it is determined whether the recognized object is a sign (road sign). If it is a sign, the process proceeds to step 270; otherwise, to step 290.

  In step 270, the blinking speed V′ of the highlight frame used when displaying the sign's image on the display 15 is determined using, for example, the speed V of the host vehicle and the distance L to the sign, taking the type of sign into account.

For example, the blinking speed V′ of the frame indicating the sign is determined by expression (5) or (6) shown in FIG. 7. These expressions use the speed V of the host vehicle and the distance L to the sign, together with a vehicle-speed weight α_sign, a distance weight β_sign, and a weight γ_sign for the sign type.

Here, constants can be adopted for the vehicle-speed weight α_sign and the distance weight β_sign, for example. The sign-type weight γ_sign is set large when the content of the sign is highly important (for example, a stop sign is more important than a curve-ahead sign), because the degree of danger increases in that case.

  Therefore, using expressions (5) and (6), the blinking speed is increased when the vehicle speed is high, decreased when the distance is long, and increased when the sign is highly important. In other words, the greater the danger indicated by the sign, the faster the frame indicating the sign blinks.
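The sign-type weight γ_sign lends itself to a lookup table. As with the pedestrian case, expressions (5) and (6) are only shown in FIG. 7, so the linear form, the table values, and the sign-type names below are illustrative assumptions.

```python
SIGN_WEIGHT = {                 # illustrative importance weights
    "stop": 1.8,
    "pedestrian_crossing": 1.5,
    "curve_ahead": 1.1,
    "speed_limit": 1.0,
}

def sign_blink_speed(v_own, distance, sign_type,
                     alpha=0.05, beta=2.0, base=1.0):
    # gamma_sign scales the whole speed term, so a more important
    # sign (e.g. a stop sign) blinks faster at the same V and L.
    gamma = SIGN_WEIGHT.get(sign_type, 1.0)
    return gamma * (base + alpha * v_own + beta / max(distance, 1.0))
```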

  In the following step 280, based on the blinking speed set in step 270, the frame is blinked in the same manner as in step 220, and the process is temporarily terminated.

  On the other hand, in step 290, reached when the object is determined in step 260 not to be a sign (that is, an object that is neither a pedestrian, a vehicle, nor a sign), the blinking speed V′ of the frame for the object is determined using, for example, the speed V of the host vehicle and the distance L to the object.

Therefore, the blinking speed can be increased when the host vehicle speed is high and decreased when the distance is long; alternatively, a predetermined fixed blinking speed may be used.
In the subsequent step 300, based on the blinking speed set in step 290, the frame is blinked in the same manner as in step 220, and the process is temporarily terminated.

  c) As described above, in this embodiment the risk level of an object is set according to the speed of the host vehicle, the distance to the object, and the type of the object, and the highlighting of the image indicating the object (for example, a frame surrounding it) is controlled by changing the blinking speed according to the setting result.

  Specifically, the risk level is set for objects existing within and near the predicted travel locus of the host vehicle, taking into account not only the speed of the host vehicle and the distance to the object but also the object's type and moving direction. The higher an object's risk level, the stronger its highlighting.

As a result, even when there are a plurality of objects, the driver can reliably recognize those with a very high degree of danger, so driving safety is greatly improved.
[Second Embodiment]
Next, the second embodiment will be described; descriptions of content identical to the first embodiment are omitted.

In the present embodiment, the driver's line of sight is detected, and highlighting is performed according to the line of sight.
As shown in FIG. 9, the image display control system 21 of the present embodiment includes, as in the first embodiment, an image display control device 23, a surroundings-monitoring camera 25, a radar 27, a vehicle speed sensor 29, a steering angle sensor 31, a navigation device 33, a display 35, and the like, and additionally an in-vehicle monitoring camera 37 that photographs the interior of the vehicle and a highlight-release switch 39 for canceling the highlighting.

The in-vehicle monitoring camera 37 captures the driver's face, and the driver's line of sight is detected from the captured image.
For example, from the position of the eyes in the face image it can be determined whether the face is turned toward the front of the vehicle or to the side, and the line of sight can also be detected from the position of the pupils within the eye image.

Once the line of sight is detected, the driver's attention can be drawn by highlighting the image of an object lying in a direction the driver is not looking.
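The gaze-based decision can be sketched as a bearing comparison. The 30-degree field of view, the bearing representation, and the function name are assumptions; the patent only says that objects in a direction the driver is not looking are highlighted.

```python
def needs_highlight(object_bearing_deg, gaze_bearing_deg, fov_deg=30.0):
    # Angular difference wrapped into [-180, 180); highlight only
    # objects outside the assumed field of view around the gaze.
    diff = abs((object_bearing_deg - gaze_bearing_deg + 180.0) % 360.0 - 180.0)
    return diff > fov_deg / 2.0

# Driver looks straight ahead (0 deg); an object 40 deg to the side
# is outside the assumed field of view, one at 5 deg is not:
# needs_highlight(40, 0) -> True ; needs_highlight(5, 0) -> False
```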
The present invention is not limited to the above embodiments and can be implemented in various forms.

  (1) For example, the highlighting method is not particularly limited as long as it draws the attention of the driver or the like. An image showing an object may be highlighted by vibrating it on the screen, and when there are a plurality of objects, their images may be highlighted sequentially in descending order of risk.

(2) The present invention can also be applied to a program that executes the highlighting processing and the like according to the algorithm described above, and to a recording medium storing such a program.
Examples of the recording medium include an electronic control unit configured as a microcomputer, a microchip, a flexible disk, a hard disk, a DVD, an optical disk, and various other media; any medium storing a program capable of executing the processing of the image display control device described above may be used.

  The program is not limited to one stored on a recording medium; it may also be a program transmitted and received over a communication line such as the Internet.

FIG. 1 is a block diagram showing the whole image display control system in the first embodiment.
FIG. 2 is an explanatory diagram showing the field of view of the surroundings-monitoring camera mounted on the vehicle.
FIG. 3 is a main flowchart showing the main processing of the first embodiment.
FIG. 4 is an explanatory diagram showing the pedestrian classifier.
FIG. 5 is a flowchart showing the highlighting processing of the first embodiment.
FIG. 6 is an explanatory diagram showing the screen displayed on the display.
FIG. 7 is an explanatory diagram showing the arithmetic expressions for the blinking speed.
FIG. 8 is an explanatory diagram showing the moving-direction detection method.
FIG. 9 is a block diagram showing the whole image display control system in the second embodiment.

Explanation of symbols

DESCRIPTION OF SYMBOLS: 1, 23 ... image display control device; 3, 21 ... image display control system; 5, 25 ... surroundings-monitoring camera; 7, 27 ... radar; 15, 35 ... display; 37 ... in-vehicle monitoring camera

Claims (10)

  1. An image display control device that detects an object existing around a host vehicle, sets a risk level of the object with respect to the host vehicle, and highlights an image showing the object on a screen according to the risk level,
    wherein the speed of the host vehicle, the distance to the object, and the type of the object are detected, the risk level of the object is set according to the speed of the host vehicle, the distance to the object, and the type of the object, and the highlighting of the image showing the object is controlled according to the setting result.
  2.   The image display control apparatus according to claim 1, wherein the image indicating the object is at least one of an image of the object itself and a mark designating the object.
  3.   The image display control device according to claim 1 or 2, wherein, among objects existing within the predicted travel locus of the host vehicle and in the vicinity of the predicted travel locus, the risk level is set for at least the objects existing within the predicted travel locus.
  4.   The image display control apparatus according to claim 1, wherein the degree of risk is set according to a moving direction of the object to adjust the highlighting.
  5.   The image display control device according to claim 1, wherein the highlighting is adjusted according to a blinking speed of an image indicating the object.
  6.   The image display control apparatus according to claim 1, wherein an image showing the object is highlighted by vibrating on a screen.
  7.   The image display control apparatus according to claim 1, wherein when there are a plurality of the objects, images indicating the objects are highlighted in order from the highest risk level.
  8.   The image display control device according to claim 1, wherein the driver's line of sight is detected, and an image showing the object in a direction not seen by the driver is highlighted.
  9.   The image display control apparatus according to claim 1, wherein the highlighting is strengthened for an object having a higher risk level.
  10.   An image display control system comprising the image display control device according to claim 1, and at least an imaging device and an image display device.
JP2007204383A 2007-08-06 2007-08-06 Image display control device and image display control system Pending JP2009040107A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007204383A JP2009040107A (en) 2007-08-06 2007-08-06 Image display control device and image display control system


Publications (1)

Publication Number Publication Date
JP2009040107A true JP2009040107A (en) 2009-02-26

Family

ID=40441360

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007204383A Pending JP2009040107A (en) 2007-08-06 2007-08-06 Image display control device and image display control system

Country Status (1)

Country Link
JP (1) JP2009040107A (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010276557A (en) * 2009-05-29 2010-12-09 Toyota Motor Corp Spectrum measuring device for moving body
WO2011104984A1 (en) 2010-02-24 2011-09-01 アイシン精機株式会社 Vehicle surroundings monitoring device
JP2011192070A (en) * 2010-03-15 2011-09-29 Honda Motor Co Ltd Apparatus for monitoring surroundings of vehicle
JP2011191859A (en) * 2010-03-12 2011-09-29 Honda Motor Co Ltd Apparatus for monitoring surroundings of vehicle
JP2011238161A (en) * 2010-05-13 2011-11-24 Yupiteru Corp Alarm device
JP2012064096A (en) * 2010-09-17 2012-03-29 Nissan Motor Co Ltd Vehicle image display device
JP2012066606A (en) * 2010-09-21 2012-04-05 Denso Corp Display device for vehicle
JP2012068962A (en) * 2010-09-24 2012-04-05 Yupiteru Corp On-vehicle electronic device and program
JP2012138828A (en) * 2010-12-27 2012-07-19 Toyota Motor Corp Image providing device
JP2014090349A (en) * 2012-10-31 2014-05-15 Clarion Co Ltd Image processing system and image processing method
US20140168424A1 (en) * 2011-07-21 2014-06-19 Ziv Attar Imaging device for motion detection of objects in a scene, and method for motion detection of objects in a scene
WO2014129026A1 * 2013-02-21 2014-08-28 Honda Motor Co., Ltd. Driving assistance device and image processing program
JP2015024742A (en) * 2013-07-26 2015-02-05 日産自動車株式会社 Drive support apparatus and drive support method
JP2016035738A (en) * 2014-08-04 2016-03-17 富士重工業株式会社 Running environment risk determination device and running environment risk notification device
JP2016103142A (en) * 2014-11-28 2016-06-02 富士通テン株式会社 Data processing apparatus, image processing method, and program
JP2016176898A (en) * 2015-03-23 2016-10-06 セイコーエプソン株式会社 Electronic component conveyance device and electronic component inspection device
US9514547B2 (en) 2013-07-11 2016-12-06 Denso Corporation Driving support apparatus for improving awareness of unrecognized object
JP2017034543A * 2015-08-04 2017-02-09 Denso Corporation On-vehicle display control device and on-vehicle display control method
JP2017030688A * 2015-08-06 2017-02-09 Hitachi Construction Machinery Co., Ltd. Periphery monitoring device of work machine
KR20180040679A * 2015-09-18 2018-04-20 Nissan Motor Co., Ltd. Vehicle display device and vehicle display method
KR20180044363A * 2015-09-28 2018-05-02 Nissan Motor Co., Ltd. Vehicle display device and vehicle display method
WO2018092919A1 * 2016-11-21 2018-05-24 Kyocera Corporation Image processing device, imaging device, and display system
US10235768B2 (en) 2014-12-10 2019-03-19 Mitsubishi Electric Corporation Image processing device, in-vehicle display system, display device, image processing method, and computer readable medium
WO2020152737A1 * 2019-01-21 2020-07-30 Mitsubishi Electric Corporation Information presentation device, information presentation control method, program, and recording medium

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010276557A (en) * 2009-05-29 2010-12-09 Toyota Motor Corp Spectrum measuring device for moving body
US9073483B2 (en) 2010-02-24 2015-07-07 Aisin Seiki Kabushiki Kaisha Vehicle surroundings monitoring device
WO2011104984A1 2010-02-24 2011-09-01 Aisin Seiki Co., Ltd. Vehicle surroundings monitoring device
JP2011191859A (en) * 2010-03-12 2011-09-29 Honda Motor Co Ltd Apparatus for monitoring surroundings of vehicle
JP2011192070A (en) * 2010-03-15 2011-09-29 Honda Motor Co Ltd Apparatus for monitoring surroundings of vehicle
JP2011238161A (en) * 2010-05-13 2011-11-24 Yupiteru Corp Alarm device
JP2012064096A (en) * 2010-09-17 2012-03-29 Nissan Motor Co Ltd Vehicle image display device
US9106842B2 (en) 2010-09-17 2015-08-11 Nissan Motor Co., Ltd. Vehicle image display apparatus and method
JP2012066606A (en) * 2010-09-21 2012-04-05 Denso Corp Display device for vehicle
JP2012068962A (en) * 2010-09-24 2012-04-05 Yupiteru Corp On-vehicle electronic device and program
JP2012138828A (en) * 2010-12-27 2012-07-19 Toyota Motor Corp Image providing device
CN103269908A * 2010-12-27 2013-08-28 Toyota Motor Corp Image providing device
US20140168424A1 (en) * 2011-07-21 2014-06-19 Ziv Attar Imaging device for motion detection of objects in a scene, and method for motion detection of objects in a scene
JP2014090349A (en) * 2012-10-31 2014-05-15 Clarion Co Ltd Image processing system and image processing method
US9485438B2 (en) 2012-10-31 2016-11-01 Clarion Co., Ltd. Image processing system with image conversion unit that composites overhead view images and image processing method
WO2014129026A1 * 2013-02-21 2014-08-28 Honda Motor Co., Ltd. Driving assistance device and image processing program
CN104885448B * 2013-02-21 2018-04-06 Honda Motor Co., Ltd. Drive assistance device and image processing program
CN104885448A (en) * 2013-02-21 2015-09-02 本田技研工业株式会社 Driving assistance device and image processing program
US9589194B2 (en) 2013-02-21 2017-03-07 Honda Motor Co., Ltd. Driving assistance device and image processing program
US9514547B2 (en) 2013-07-11 2016-12-06 Denso Corporation Driving support apparatus for improving awareness of unrecognized object
JP2015024742A (en) * 2013-07-26 2015-02-05 日産自動車株式会社 Drive support apparatus and drive support method
JP2016035738A (en) * 2014-08-04 2016-03-17 富士重工業株式会社 Running environment risk determination device and running environment risk notification device
US9922554B2 (en) 2014-08-04 2018-03-20 Subaru Corporation Driving environment risk determination apparatus and driving environment risk notification apparatus
JP2016103142A (en) * 2014-11-28 2016-06-02 富士通テン株式会社 Data processing apparatus, image processing method, and program
US10399438B2 (en) 2014-11-28 2019-09-03 Fujitsu Ten Limited Data processing apparatus
US10235768B2 (en) 2014-12-10 2019-03-19 Mitsubishi Electric Corporation Image processing device, in-vehicle display system, display device, image processing method, and computer readable medium
JP2016176898A (en) * 2015-03-23 2016-10-06 セイコーエプソン株式会社 Electronic component conveyance device and electronic component inspection device
CN107852483B * 2015-08-04 2020-02-07 Denso Corporation Device and method for presenting an assistance image to the driver
WO2017022496A1 * 2015-08-04 2017-02-09 Denso Corporation Device for presenting assistance images to driver, and method therefor
CN107852483A * 2015-08-04 2018-03-27 Denso Corporation Device and method for presenting assistance images to the driver
JP2017034543A * 2015-08-04 2017-02-09 Denso Corporation On-vehicle display control device and on-vehicle display control method
US10464484B2 (en) 2015-08-04 2019-11-05 Denso Corporation Apparatus for presenting support images to a driver and method thereof
WO2017022262A1 * 2015-08-06 2017-02-09 Hitachi Construction Machinery Co., Ltd. Surrounding monitoring device for operating machines
JP2017030688A * 2015-08-06 2017-02-09 Hitachi Construction Machinery Co., Ltd. Periphery monitoring device of work machine
KR101960644B1 * 2015-09-18 2019-03-20 Nissan Motor Co., Ltd. Vehicle display device and vehicle display method
US10304228B2 (en) 2015-09-18 2019-05-28 Nissan Motor Co., Ltd. Vehicular display apparatus and vehicular display method
KR20180040679A * 2015-09-18 2018-04-20 Nissan Motor Co., Ltd. Vehicle display device and vehicle display method
KR101975154B1 * 2015-09-28 2019-05-03 Nissan Motor Co., Ltd. Vehicle display device and vehicle display method
US10269161B2 (en) 2015-09-28 2019-04-23 Nissan Motor Co., Ltd. Vehicular display device and vehicular display method
KR20180044363A * 2015-09-28 2018-05-02 Nissan Motor Co., Ltd. Vehicle display device and vehicle display method
WO2018092919A1 * 2016-11-21 2018-05-24 Kyocera Corporation Image processing device, imaging device, and display system
WO2020152737A1 * 2019-01-21 2020-07-30 Mitsubishi Electric Corporation Information presentation device, information presentation control method, program, and recording medium

Similar Documents

Publication Publication Date Title
KR101498976B1 Parking assistance system and parking assistance method for vehicle
US9771022B2 (en) Display apparatus
JP5939357B2 (en) Moving track prediction apparatus and moving track prediction method
DE102017124304A1 (en) Device and method for recognizing and notifying a beginner
US9308917B2 (en) Driver assistance apparatus capable of performing distance detection and vehicle including the same
JP6346614B2 (en) Information display system
US9723243B2 (en) User interface method for terminal for vehicle and apparatus thereof
KR102051142B1 (en) System for managing dangerous driving index for vehicle and method therof
JP5718942B2 (en) Apparatus and method for assisting safe operation of transportation means
JP2016001464A (en) Processor, processing system, processing program, and processing method
JP5706874B2 (en) Vehicle periphery monitoring device
CN103732480B (en) Method and device for assisting a driver in performing lateral guidance of a vehicle on a carriageway
US9283963B2 (en) Method for operating a driver assist system of an automobile providing a recommendation relating to a passing maneuver, and an automobile
US8405491B2 (en) Detection system for assisting a driver when driving a vehicle using a plurality of image capturing devices
US8085140B2 (en) Travel information providing device
JP4277081B2 (en) Driving assistance device
US7222009B2 (en) Driving assist system for vehicle
US9031774B2 (en) Apparatus and method for preventing collision of vehicle
JP2016001170A (en) Processing unit, processing program and processing method
US7190282B2 (en) Nose-view monitoring apparatus
CN102203837B (en) System for monitoring the area around a vehicle
DE112010003874B4 (en) vehicle control
JP6648411B2 (en) Processing device, processing system, processing program and processing method
US20150015712A1 (en) Driving assistance device and driving assistance method
JP4513318B2 (en) Rear side image control apparatus and method