JP2017056909A - Vehicular image display device - Google Patents

Vehicular image display device

Info

Publication number
JP2017056909A
Authority
JP
Japan
Prior art keywords
monitor
image
driver
vehicle
optical flow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
JP2015185679A
Other languages
Japanese (ja)
Inventor
Taro Iwamoto (岩本 太郎)
Original Assignee
Mazda Motor Corp (マツダ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazda Motor Corp (マツダ株式会社)
Priority to JP2015185679A
Publication of JP2017056909A
Legal status: Abandoned

Abstract

PROBLEM TO BE SOLVED: To provide a vehicular image display device that can suppress unnecessary attraction of the driver's gaze to a monitor while the driver is in an unconscious state and can improve the driver's visibility in the traveling direction.

SOLUTION: A vehicular image display device 1 comprises an outside camera 2 capable of continuous imaging and a monitor 4 mounted inside the cabin and capable of displaying a far-infrared captured image taken by the outside camera 2. The device is provided with an optical flow detection unit 5a that can detect a first optical flow F1 visually recognized by the driver through the window glass 12, a viewing-time determination unit 5b that can determine the time at which the driver views the monitor 4, and a monitor image generation unit 5c that generates a monitor image to be displayed on the monitor 4 on the basis of the far-infrared captured image taken by the outside camera 2. When the viewing time is not determined, the monitor image generation unit 5c applies a coordinate conversion to the monitor image so that a second optical flow F2 displayed in the monitor 4 matches the first optical flow F1.

SELECTED DRAWING: Figure 2

Description

  The present invention relates to a vehicular image display device, and more particularly to a vehicular image display device including a monitor that displays an image captured by an imaging means capable of continuously imaging the vehicle periphery.

Conventionally, as represented by navigation devices, various display elements related to driving have been displayed on a monitor (display) attached to a position visible to the driver (for example, the rear end portion of the instrument panel).
It is also known to change the display form of the monitor in consideration of the so-called optical flow that arises in the driver's forward view as the vehicle travels, in order to make it easier for the driver to check the monitor.

In the in-vehicle information display device of Patent Document 1, a display is disposed at a position through which the driver's optical flow passes while the vehicle is traveling, and a plurality of display elements are arranged on the display in an oblique direction along the optical flow.
As a result, the driver moves the line of sight along the optical flow and visually recognizes the plurality of display elements, so the amount of line-of-sight movement required to check the display elements is reduced and the time needed to select a display element is shortened.

In recent years, visual field assistance devices (night vision or night view systems) that help the driver recognize obstacles such as pedestrians during night driving have been put into practical use.
Such a device displays on the monitor an infrared image of the vehicle surroundings captured by an infrared camera, and thereby provides the driver with a clear image of the vehicle surroundings even under backlighting in the morning or evening and in conditions that lack visibility, such as shade, rain, and fog.

JP 2014-201253 A

It is known that a video (image) contains salient areas that attract the human eye.
Visual saliency is a psychophysical characteristic that attracts bottom-up attention and is determined by the spatial arrangement of visual stimuli. Visual saliency increases as the difference from the surroundings in the three elements of color, luminance, and inclination (edge direction) increases, so the human eye is unconsciously attracted to areas that differ from their surroundings.
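
As a rough illustration of this bottom-up saliency notion (not an algorithm from the patent or Patent Document 1), a center-surround difference over luminance, color, and edge-orientation channels can be computed as follows; the channel choice, kernel size, and function name are assumptions made for the sketch.

```python
import cv2
import numpy as np

def simple_saliency(frame_bgr, surround_ksize=31):
    # A region stands out when its colour, luminance, or edge orientation
    # differs from its surroundings; approximate this with centre-surround
    # differences on LAB channels plus a gradient-magnitude channel.
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)

    gx = cv2.Sobel(lab[:, :, 0], cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(lab[:, :, 0], cv2.CV_32F, 0, 1, ksize=3)
    edges = cv2.magnitude(gx, gy)

    channels = [lab[:, :, 0], lab[:, :, 1], lab[:, :, 2], edges]
    saliency = np.zeros(frame_bgr.shape[:2], np.float32)
    for ch in channels:
        surround = cv2.blur(ch, (surround_ksize, surround_ksize))  # local "surroundings"
        saliency += np.abs(ch - surround)                          # centre-surround difference

    return cv2.normalize(saliency, None, 0.0, 1.0, cv2.NORM_MINMAX)
```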

In a vehicle equipped with a visual assistance device that can display an infrared image of the vehicle surroundings on a monitor arranged on the instrument panel, the driver's field of view consists of the actual direct view seen through the windshield (hereinafter referred to as the first visual field) and the view captured by the infrared camera and displayed indirectly via the monitor as an infrared image (hereinafter referred to as the second visual field).
Since the first and second optical flows generated in the first and second visual fields each have their own vanishing point, the directing direction of the second optical flow differs from that of the first optical flow. The second visual field therefore has a higher visual saliency than the first visual field, which occupies most of the driver's field of view, and even when there is no obstacle such as a pedestrian to watch, the driver's gaze may be unconsciously attracted to the monitor so that sufficient attention cannot be paid to the front of the vehicle.

The in-vehicle information display device of Patent Document 1 merely displays a plurality of display elements on a display; it does not display, within the driver's field of view, a second optical flow whose directing direction differs from that of the first optical flow visually recognized through the front window glass.
Moreover, Patent Document 1 aims to reduce the amount of line-of-sight movement when the driver selects a display element, and is therefore directed at the driver's own conscious behavior; it does not suppress unnecessary attraction of the gaze to the display while the driver is in an unconscious state.

  An object of the present invention is to provide a vehicular image display device and the like that can suppress unnecessary attraction of an unconscious driver's gaze to the monitor and can improve the driver's visibility in the traveling direction.

  The vehicular image display device according to claim 1 is a vehicular image display device comprising an imaging means capable of continuously imaging the vehicle periphery and a monitor attached to the vehicle interior and capable of displaying a captured image taken by the imaging means, and is characterized by comprising an optical flow detection means capable of detecting a driver optical flow visually recognized by the driver through the window glass, and a monitor image generation means for generating a monitor image to be displayed on the monitor on the basis of the captured image taken by the imaging means, wherein the monitor image generation means performs a coordinate conversion of the monitor image so that the in-monitor optical flow displayed in the monitor matches the driver optical flow.

In this vehicular image display device, the monitor image is coordinate-converted so that the in-monitor optical flow displayed in the monitor matches the driver optical flow. Attraction of the gaze can therefore be suppressed by reducing the visual saliency of the monitor, and the driver's visibility in the traveling direction can be relatively increased.
In addition, the driver's discomfort and fatigue caused by the difference between the direction of the driver optical flow and the direction of the in-monitor optical flow can be reduced.

The invention according to claim 2 is characterized in that, in the invention of claim 1, the device has a viewing-time determination means capable of determining the time when the driver views the monitor or should view the monitor, and when the viewing-time determination means determines that it is the viewing time, the monitor image generation means generates the monitor image without performing the coordinate conversion.
According to this configuration, when the driver views the monitor or when it is determined that the driver should view the monitor, the visual saliency of the monitor screen is increased to enhance its attractiveness, so the driver's visibility of the monitor can be increased.

The invention according to claim 3 is characterized in that, in the invention of claim 2, the monitor image generation means generates the monitor image as a still image.
According to this configuration, the visual saliency of the monitor screen is increased for the driver in the unconscious state and its attractiveness is enhanced, so the driver's recognition of the monitor can be improved and the gaze time shortened.

The invention according to claim 4 is characterized in that, in the invention of claim 2 or 3, the viewing-time determination means includes an obstacle detection means capable of detecting an obstacle around the vehicle.
According to this configuration, when there is an obstacle the driver should recognize, the driver's line of sight can be guided to the monitor screen by increasing the visual saliency of the monitor screen and enhancing its attractiveness.

The invention according to claim 5 is characterized in that, in the invention of claim 2 or 3, the viewing-time determination means includes a line-of-sight detection means capable of detecting that the driver has directed the line of sight toward the monitor.
According to this configuration, when the driver consciously turns the line of sight toward the monitor, the driver's visibility can be improved and the gaze time can be shortened.

The invention of claim 6 is the invention according to any one of claims 1 to 5, wherein the imaging means is an infrared camera capable of capturing an infrared image in front of the vehicle.
According to this configuration, even when the driver's field of vision is not clear, such as at night or in bad weather, a clear field of view around the vehicle can be provided to the driver.

  According to the vehicle image display device of the present invention, it is possible to suppress unnecessary attraction to the monitor in an unconscious driver and improve the visibility of the driver in the traveling direction.

FIG. 1 is a view of the front of the vehicle interior of a vehicle according to Embodiment 1.
FIG. 2 is a block diagram of the vehicular image display device.
FIG. 3 is a diagram showing the first visual field.
FIG. 4 is a diagram showing the driver's field of view when the second visual field is not coordinate-converted.
FIG. 5 is an enlarged view of the principal part of FIG. 4.
FIG. 6 is a diagram showing the driver's field of view when the second visual field is coordinate-converted.
FIG. 7 is an enlarged view of the principal part of FIG. 6.
FIG. 8 is a flowchart showing the image display process.
FIG. 9 shows a modified example of the monitor.

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
The following description exemplifies a case where the present invention is applied to the vehicle V, and does not limit the present invention, its application, or its use.

Embodiment 1 of the present invention will be described below with reference to FIGS. 1 to 8.
The vehicular image display device 1 relatively controls the driver's visibility through the front window glass 12 by adjusting the visual saliency of the monitor image within the driver's field of view.
As shown in FIGS. 1 and 2, the vehicular image display device 1 is mounted on a vehicle V and includes an outside camera 2 capable of continuously imaging the vehicle periphery, an in-vehicle camera 3 capable of continuously imaging the driver's upper body, a monitor 4 capable of displaying the captured image taken by the outside camera 2, an ECU (Electronic Control Unit) 5, and the like.

  The outside camera 2 is mounted, for example, near the rearview mirror 11 on the lower surface of the front end of the roof panel, and is configured to capture a heat distribution image in front of the vehicle through the front window glass 12 as a continuous moving image. The outside camera 2 is an infrared color camera and is configured so that a near-infrared captured image and a far-infrared captured image can be acquired selectively by means of a switchable wavelength-variable filter.

The near-infrared captured image is close to a visible-light image as seen under daytime sunlight or nighttime headlights, whereas the far-infrared captured image is a heat distribution image in the far-infrared wavelength region that includes detection targets, such as pedestrians at night, that are difficult to see directly.
As the image sensor of the outside camera 2, a thermally reactive CCD or CMOS sensor having sensitivity in the infrared wavelength region is used. A pyroelectric infrared camera may be used to obtain sensitivity at longer distances.

  The in-vehicle camera 3 is mounted, for example, on the upper part of the driver-seat-side rear end wall of the instrument panel 13 and is configured to image the upper body including the driver's face while avoiding the steering wheel 14. The face portion is cut out from the captured upper-body image, the iris of each eye is enlarged and detected, and the driver's line-of-sight direction is identified. Since measuring the line-of-sight direction, for example from the Purkinje image, is a known technique, a detailed description is omitted.

The monitor 4 is attached, for example, to the rear end wall of the central portion of the instrument panel 13 in the vehicle width direction, and is configured to display the far-infrared image captured by the outside camera 2.
The monitor 4 is a color liquid crystal display and can display images (moving images or still images) from a CD player, an HDD player, a television, a navigation system, a night vision system, and so on by operating a changeover switch (not shown).
The monitor 4 is also combined with a touch panel on which a group of operation switches is displayed; each device can be activated by touching the CD player, HDD player, television, air-conditioning, or navigation operation position, or the night vision switch.

Next, the ECU 5 will be described.
The ECU 5 is configured to perform image processing on a far-infrared captured image captured by the vehicle exterior camera 2 when the changeover switch of the night vision system is operated.
As shown in FIG. 2, the ECU 5 includes an optical flow detection unit 5a, a viewing-time determination unit 5b, a monitor image generation unit 5c, and the like.
The ECU 5 is an electronic control unit including a CPU, a ROM, a RAM, and the like, and performs various arithmetic processes by loading an application program stored in the ROM into the RAM and executing it by the CPU.

The optical flow detection unit 5a is configured to detect the driver optical flow visually recognized by the driver through the window glass 12. When the vehicle V is traveling, the optical flow radiates from a single point called the vanishing point P1 (see FIG. 3).
Hereinafter, the field of view visible to the driver through the front window glass 12 is called the first visual field V1, the driver optical flow generated in the first visual field V1 is called the first optical flow F1, the field of view visible to the driver via the monitor screen is called the second visual field V2, and the in-monitor optical flow generated in the second visual field V2 is called the second optical flow F2.
The optical flow detection unit 5a selects a specific target (such as a tree or a building) in two temporally consecutive near-infrared captured images taken by the outside camera 2 and calculates the displacement of the target to extract the vector amount and directing direction of the first optical flow F1; likewise, it selects a specific target in two temporally consecutive far-infrared captured images and calculates its displacement to extract the vector amount and directing direction of the second optical flow F2.
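
The following sketch (using OpenCV; the feature tracker, parameters, and the vanishing point estimation are assumptions for illustration, not the patent's implementation) shows one way such two-frame flow extraction could work: track targets between consecutive frames, keep their displacement vectors, and estimate the point from which the flow radiates.

```python
import cv2
import numpy as np

def detect_flow(prev_gray, curr_gray, max_corners=200):
    # Track feature points between two temporally consecutive frames and
    # return their start points and displacement vectors (amount + direction).
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                   qualityLevel=0.01, minDistance=8)
    if pts0 is None:
        return np.empty((0, 2)), np.empty((0, 2))
    pts1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts0, None)
    ok = status.ravel() == 1
    p0 = pts0.reshape(-1, 2)[ok]
    p1 = pts1.reshape(-1, 2)[ok]
    return p0, p1 - p0

def estimate_vanishing_point(p0, vec):
    # Least-squares intersection of the flow lines x = p0 + t*vec:
    # sum_i (I - d_i d_i^T)(x - p0_i) = 0, with d_i the unit flow direction.
    d = vec / (np.linalg.norm(vec, axis=1, keepdims=True) + 1e-9)
    A = np.eye(2)[None] - d[:, :, None] * d[:, None, :]   # per-line projectors
    b = (A @ p0[:, :, None]).sum(axis=0)
    return np.linalg.solve(A.sum(axis=0), b).ravel()       # radiating point P
```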

The viewing-time determination unit 5b is configured to determine the time when the driver consciously views the monitor 4 or the time when the driver should view the monitor 4.
When the viewing-time determination unit 5b detects that the driver's line-of-sight direction, identified on the basis of the in-vehicle captured image taken by the in-vehicle camera 3, is directed toward the monitor 4, it determines that this is a time when the driver is consciously viewing the monitor 4.
In addition, when there is an obstacle around the vehicle, particularly ahead of the vehicle, the driver needs to recognize the obstacle and drive with it in mind.
Therefore, the viewing-time determination unit 5b has an obstacle detection function that detects obstacles such as pedestrians and signs on the basis of the near-infrared and far-infrared captured images taken by the outside camera 2, and when this function detects an obstacle it determines that this is a time when the driver should view the monitor 4. A plurality of templates corresponding to detection targets such as pedestrians and signs are stored in advance, and the matching rate between an extracted target and a template is used as the criterion for obstacle detection.
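
A minimal sketch of this viewing-time decision, under stated assumptions: the matching-rate criterion is approximated with cv2.matchTemplate, the threshold value and function names are illustrative, and only one infrared frame is matched here although the device uses both the near-infrared and far-infrared captured images.

```python
import cv2

MATCH_THRESHOLD = 0.7   # assumed value for the "matching rate" criterion

def obstacle_detected(ir_image, templates):
    # templates: pre-stored pedestrian / sign templates (same dtype as ir_image)
    for tmpl in templates:
        scores = cv2.matchTemplate(ir_image, tmpl, cv2.TM_CCOEFF_NORMED)
        if scores.max() >= MATCH_THRESHOLD:
            return True
    return False

def is_viewing_time(gaze_on_monitor, ir_image, templates):
    # Viewing time: the driver consciously looks at the monitor, or an
    # obstacle the driver should recognise is detected around the vehicle.
    return gaze_on_monitor or obstacle_detected(ir_image, templates)
```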

The monitor image generation unit 5c is configured to be able to generate a monitor image for display on the monitor 4 based on the far-infrared captured image captured by the camera 2 outside the vehicle.
Prior to the description of the monitor image generation unit 5c, the attractiveness related to the monitor 4 will be described.
As shown in FIG. 4, the field of view that the driver can see while driving is the sum of the first visual field V1, which the driver views directly, and the second visual field V2, which is displayed on the basis of the far-infrared captured image. Because the second visual field V2 displays a far-infrared captured image of a predetermined region of the first visual field V1 taken by the outside camera 2, the first and second visual fields are completely independent coordinate systems.
That is, as shown in FIG. 5, when the vanishing point P1 of the first visual field V1 and the first optical flow F1 radiating from it are imaged by the outside camera 2, the second visual field V2 displays a vanishing point P2 corresponding to P1 and a second optical flow F2 corresponding to F1, so the first optical flow F1 and the second optical flow F2 appear as vectors that intersect at an angle.
The second visual field V2 displayed on the monitor 4 therefore has a higher visual saliency than the first visual field V1, which occupies most of the driver's field of view, and the driver's line of sight is unconsciously attracted to the monitor 4.

Returning to the description of the monitor image generation unit 5c.
When the viewing-time determination unit 5b determines that it is not the viewing time, the monitor image generation unit 5c generates a monitor image in which the far-infrared captured image has been coordinate-converted so that the second optical flow F2 is aligned (made substantially parallel) with the first optical flow F1.
As shown in FIGS. 5 and 7, the monitor image generation unit 5c converts the coordinates so that a point C1 on the second optical flow F2 passing through the lower left corner A of the monitor 4 and a point D1 on the second optical flow F2 passing through the lower right corner B of the monitor 4 are shifted horizontally onto a point C2 on the first optical flow F1 passing through the lower left corner A and a point D2 on the first optical flow F1 passing through the lower right corner B, respectively.
In this coordinate conversion, the vertical (up-down) pixel data of the far-infrared captured image is left unchanged, and only the horizontal (left-right) pixel data undergoes a planar projective transformation according to the conversion rule described above. As shown in FIGS. 6 and 7, the vanishing point P2 of the second optical flow F2 is thereby relocated to the same position as the vanishing point P1 of the first optical flow F1, the intersection angle between the first optical flow F1 and the coordinate-converted second optical flow F2 is eliminated, the first visual field V1 and the second visual field V2 can be presented as if they were the same visual field, and the visual saliency of the second visual field V2 relative to the first visual field V1 can be reduced.
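
Under the stated conversion rule, the mapping is a planar projective transform fixed by four correspondences: A and B stay in place, while C1 and D1 are shifted horizontally onto C2 and D2. The sketch below builds such a transform with OpenCV; the point values are placeholders supplied by the flow detection, and C2 and D2 are assumed to share the vertical coordinates of C1 and D1 so that only horizontal pixel data changes.

```python
import cv2
import numpy as np

def align_monitor_image(far_ir_frame, A, B, C1, C2, D1, D2):
    # Source quadrilateral: monitor corners A, B plus points C1, D1 on flow F2.
    src = np.float32([A, B, C1, D1])
    # Destination: A, B unchanged; C1, D1 moved horizontally onto flow F1
    # (same vertical coordinate, new horizontal coordinate).
    dst = np.float32([A, B, C2, D2])
    H = cv2.getPerspectiveTransform(src, dst)
    h, w = far_ir_frame.shape[:2]
    return cv2.warpPerspective(far_ir_frame, H, (w, h))
```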

The monitor image generation unit 5c is configured to generate the monitor image without performing the coordinate conversion when the viewing-time determination unit 5b determines that it is the viewing time.
In this case, since the first optical flow F1 and the second optical flow F2 appear as vectors that intersect at an angle, the visual saliency of the second visual field V2 relative to the first visual field V1 is increased and the driver's line of sight can be attracted to the monitor 4 (see FIGS. 4 and 5).
As described above, by controlling the degree of agreement (parallelism) between the first optical flow F1 and the second optical flow F2, the visual saliency of the monitor image in the driver's field of view is adjusted and the driver's visibility of the first visual field V1 is relatively controlled.

Next, the image display process will be described based on the flowchart of FIG. 8. Si (i = 1, 2,...) indicates each processing step.
The image display process is started when the driver operates the night vision system (turns the night vision switch ON).
First, in S1, a near-infrared captured image and a far-infrared captured image are acquired from the outside camera 2, an in-vehicle captured image is acquired from the in-vehicle camera 3, and the process proceeds to S2.
In S2, the first optical flow F1 is detected based on two temporally consecutive near-infrared captured images, and the second optical flow F2 is detected based on two temporally consecutive far-infrared captured images.

Next, obstacles such as pedestrians and signs are detected based on the near-infrared and far-infrared captured images (S3), and the driver's line-of-sight direction is detected based on the in-vehicle captured image from the in-vehicle camera 3 (S4).
In S5, it is determined whether or not it is the viewing time, based on the presence or absence of an obstacle and the direction of the driver's line of sight.
If the determination in S5 is that it is the viewing time, the process proceeds to S6, where it is determined whether or not the driver is directing the line of sight toward the monitor 4.
If the determination in S6 is that the driver is directing the line of sight toward the monitor 4, still image processing is performed (S7). When the driver is consciously directing the line of sight to the monitor 4, there is an active request from the driver, such as confirming safety in front of the vehicle (in the traveling direction) or checking a sign; updating of the displayed image of the second visual field V2 (moving image display) is therefore stopped to improve the driver's recognition of the second visual field V2 and thereby shorten the driver's gaze time on the monitor 4.
After the still image processing, the still image of the far-infrared image captured by the outside camera 2 is displayed on the monitor 4 for a predetermined time (S8), and the process returns.

If the determination in S6 is that the driver is not directing the line of sight toward the monitor 4, the process proceeds to S9, where it is determined whether there is a pedestrian in front of the vehicle.
If the determination in S9 is that there is a pedestrian in front of the vehicle, moving image processing is performed (S10) and the process proceeds to S8. When there is a pedestrian in front of the vehicle, the driver's line of sight needs to be guided to the monitor 4; the coordinate conversion of the far-infrared captured image taken by the outside camera 2 is therefore prohibited, the visual saliency of the second visual field V2 is enhanced, and the driver's attention is actively guided toward the monitor 4. Since a moving image without the coordinate conversion is displayed, the driver can correctly grasp the pedestrian's behavior and position.
After the moving image processing, the moving image of the far-infrared captured image without coordinate conversion is displayed on the monitor 4 (S8), and the process returns.

If the determination in S9 is that there is no pedestrian in front of the vehicle, the process proceeds to S7.
When there is no pedestrian in front of the vehicle, the obstacle is a sign or the like, so still image processing is performed, giving priority to the driver's recognition of the second visual field V2.

If the determination in S5 is that it is not the viewing time, coordinate conversion processing is performed (S11).
When it is not the viewing time, the driver's visibility of the first visual field V1 needs to be increased, so the far-infrared image captured by the outside camera 2 is coordinate-converted, the visual saliency of the second visual field V2 is reduced, and the driver's visibility of the first visual field V1 is relatively increased.
The moving image of the coordinate-converted far-infrared captured image is displayed on the monitor 4 (S8), and the process returns.
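
The decision flow of S1 to S11 can be summarized by the following sketch. The camera, monitor, and judgement objects and their method names are assumptions introduced only to make the control flow explicit; the patent defines the logic, not an API.

```python
def image_display_step(outside_cam, inside_cam, monitor, flow_unit, judge):
    near_ir, far_ir = outside_cam.capture()                        # S1
    cabin = inside_cam.capture()
    f1, f2 = flow_unit.detect(near_ir, far_ir)                     # S2
    obstacle, pedestrian = judge.detect_obstacles(near_ir, far_ir) # S3
    gaze_on_monitor = judge.gaze_on_monitor(cabin)                 # S4

    if judge.is_viewing_time(obstacle, gaze_on_monitor):           # S5: viewing time
        if gaze_on_monitor:                                        # S6
            image = judge.still_image(far_ir)                      # S7: freeze the frame
        elif pedestrian:                                           # S9
            image = far_ir                                         # S10: moving image, no conversion
        else:
            image = judge.still_image(far_ir)                      # S9-No: sign etc., go to S7
    else:
        image = judge.coordinate_convert(far_ir, f1, f2)           # S11: align F2 with F1

    monitor.show(image)                                            # S8: display and return
```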

Next, functions and effects of the vehicle image display apparatus 1 will be described.
According to the vehicular image display device 1, the monitor image generation unit 5c performs a coordinate conversion of the monitor image so that the second optical flow F2 displayed in the monitor 4 is aligned with the first optical flow F1. The visual saliency of the second visual field V2 for a driver in the unconscious state can therefore be reduced and attraction of the gaze suppressed, and the driver's visibility in the traveling direction can be relatively increased. In addition, driver discomfort and fatigue due to the difference between the direction of the first optical flow F1 and the direction of the second optical flow F2 can be reduced.

  Since the device has the viewing-time determination unit 5b, which can determine the time when the driver views or should view the monitor 4, and the monitor image generation unit 5c generates the monitor image without performing the coordinate conversion when the viewing-time determination unit 5b determines that it is the viewing time, the visual saliency of the second visual field V2 is increased and its attractiveness enhanced when the driver consciously views the monitor 4 or when it is determined that the driver should view the monitor 4, so the driver's visibility of the monitor 4 can be increased.

  Since the monitor image generation unit 5c generates the monitor image as a still image, the visual saliency of the second visual field V2 for the driver in the unconscious state is increased and its attractiveness enhanced, thereby improving the driver's recognition and shortening the gaze time.

  Since the viewing-time determination unit 5b has an obstacle detection function (outside camera 2, ECU 5) that can detect obstacles around the vehicle, when there is an obstacle the driver should recognize, the driver's line of sight can be guided to the second visual field V2 by increasing the visual saliency of the second visual field V2 and enhancing its attractiveness.

  Since the viewing-time determination unit 5b has a line-of-sight detection function (in-vehicle camera 3, ECU 5) that can detect that the driver has directed the line of sight toward the monitor 4, the driver's visibility is increased and the gaze time can be shortened when the driver consciously directs the line of sight toward the monitor 4.

  Since the outside camera 2 is an infrared camera that can capture an infrared image in front of the vehicle, a clear image of the vehicle surroundings can be provided to the driver even when the driver's view is not clear, such as at night or in bad weather.

Next, a modified example in which the embodiment is partially changed will be described.
1) In the above embodiment, an example was described in which the first and second optical flows are calculated using both infrared images captured by the outside camera. However, the vanishing point position and the directing direction of the optical flow corresponding to the vehicle speed and the steering angle may be mapped in advance, and the first and second optical flows may be obtained from the map according to the driving state (a sketch of such a map follows these modified examples).

2) In the above embodiment, an example was described in which the first optical flow is calculated using the near-infrared image captured by the outside camera. However, a visible-light camera may be provided, the first optical flow may be calculated using the visible-light captured image, and the visible-light captured image and the far-infrared captured image may be superimposed and displayed on the monitor.

3) In the above embodiment, an example was described of a dedicated monitor mounted on the rear end wall of the central portion of the instrument panel in the vehicle width direction. However, the monitor may be offset from the center of the instrument panel toward one side in the vehicle width direction, or it may be combined with the meter panel. As shown in FIG. 9, a monitor capable of displaying the captured image may be disposed adjacent to the meter display unit. It is also possible to form the meter panel with a color liquid crystal display and display the second visual field only during night vision operation.
The monitor may also be configured as a head-up display (HUD) that projects onto the windshield.

4) In the above embodiment, an example was described in which an obstacle is detected during night vision operation. However, obstacles may be detected while driving regardless of whether night vision is operating.
In this case, even when night vision is not operating, an image without the coordinate conversion is displayed if an obstacle is detected. When no obstacle is detected, the display elements can be arranged along the first optical flow.

5) In addition, those skilled in the art can implement the present invention in various modified forms without departing from the spirit of the present invention, and the present invention includes such modified forms.
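
As a minimal sketch of the map-based alternative mentioned in modified example 1), assuming a simple lookup table binned by vehicle speed and steering angle (the bin sizes and map contents are illustrative, not from the patent):

```python
import numpy as np

SPEED_BINS = np.arange(0, 140, 10)        # km/h, assumed binning
STEER_BINS = np.arange(-540, 570, 30)     # steering wheel angle in degrees, assumed

# flow_map[i, j] -> (vanishing_x, vanishing_y, flow_direction_deg), filled offline
flow_map = np.zeros((len(SPEED_BINS), len(STEER_BINS), 3), np.float32)

def lookup_flow(speed_kmh, steer_deg):
    # Look up the pre-mapped vanishing point position and flow direction
    # for the current driving state instead of computing them from frames.
    i = int(np.clip(np.digitize(speed_kmh, SPEED_BINS) - 1, 0, len(SPEED_BINS) - 1))
    j = int(np.clip(np.digitize(steer_deg, STEER_BINS) - 1, 0, len(STEER_BINS) - 1))
    vx, vy, direction = flow_map[i, j]
    return (vx, vy), direction
```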

1 Image display device 2 Outside camera 3 In-vehicle camera 4 Monitor 5 ECU
5a Optical flow detection unit 5b Viewing-time determination unit 5c Monitor image generation unit 12 Front window glass F1 First optical flow F2 Second optical flow

Claims (6)

  1. A vehicular image display device comprising an imaging means capable of continuously imaging the vehicle periphery and a monitor attached to the vehicle interior and capable of displaying a captured image taken by the imaging means, the device comprising:
    an optical flow detection means capable of detecting a driver optical flow visually recognized by the driver through the window glass; and
    a monitor image generation means for generating a monitor image to be displayed on the monitor on the basis of the captured image taken by the imaging means,
    wherein the monitor image generation means performs a coordinate conversion of the monitor image so that an in-monitor optical flow displayed in the monitor matches the driver optical flow.
  2. The vehicular image display device according to claim 1, comprising a viewing-time determination means capable of determining the time when the driver views or should view the monitor,
    wherein, when the viewing-time determination means determines that it is the viewing time, the monitor image generation means generates the monitor image without performing the coordinate conversion.
  3. The vehicular image display device according to claim 2, wherein the monitor image generation means generates the monitor image as a still image.
  4. The vehicular image display device according to claim 2, wherein the viewing-time determination means includes an obstacle detection means capable of detecting an obstacle around the vehicle.
  5. The vehicular image display device according to claim 2, wherein the viewing-time determination means includes a line-of-sight detection means capable of detecting that the driver has directed the line of sight toward the monitor.
  6. The vehicular image display device according to claim 1, wherein the imaging means is an infrared camera capable of capturing an infrared image in front of the vehicle.


JP2015185679A 2015-09-18 2015-09-18 Vehicular image display device Abandoned JP2017056909A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015185679A JP2017056909A (en) 2015-09-18 2015-09-18 Vehicular image display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2015185679A JP2017056909A (en) 2015-09-18 2015-09-18 Vehicular image display device

Publications (1)

Publication Number Publication Date
JP2017056909A true JP2017056909A (en) 2017-03-23

Family

ID=58391336

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015185679A Abandoned JP2017056909A (en) 2015-09-18 2015-09-18 Vehicular image display device

Country Status (1)

Country Link
JP (1) JP2017056909A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107194946A (en) * 2017-05-11 2017-09-22 昆明物理研究所 A kind of infrared obvious object detection method based on FPGA
WO2019039406A1 (en) * 2017-08-25 2019-02-28 マツダ株式会社 Passenger compartment structure
JP2019038404A (en) * 2017-08-25 2019-03-14 マツダ株式会社 In-cabin structure
JP2019038405A (en) * 2017-08-25 2019-03-14 マツダ株式会社 In-cabin structure
WO2020031915A1 (en) * 2018-08-06 2020-02-13 株式会社小糸製作所 Vehicle display system and vehicle

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011157066A (en) * 2004-08-02 2011-08-18 Nissan Motor Co Ltd Operation feeling adjusting device
JP2012066606A (en) * 2010-09-21 2012-04-05 Denso Corp Display device for vehicle



Legal Events

Date Code Title Description
2017-03-23 A621 Written request for application examination (Free format text: JAPANESE INTERMEDIATE CODE: A621)
2017-12-26 A131 Notification of reasons for refusal (Free format text: JAPANESE INTERMEDIATE CODE: A131)
2017-12-26 A762 Written abandonment of application (Free format text: JAPANESE INTERMEDIATE CODE: A762)