JP2005335410A - Image display device - Google Patents

Image display device Download PDF

Info

Publication number
JP2005335410A
JP2005335410A (application JP2004152709A)
Authority
JP
Japan
Prior art keywords
image
image display
viewpoint
display
position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2004152709A
Other languages
Japanese (ja)
Other versions
JP4323377B2 (en)
Inventor
Hidekazu Iwaki
Akio Kosaka
Takashi Miyoshi
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp
Priority to JP2004152709A
Publication of JP2005335410A
Application granted
Publication of JP4323377B2
Status: Expired - Fee Related
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To provide an image display device that displays an image generated from a virtual viewpoint while taking the user's convenience into account.

SOLUTION: The image display device comprises: a room mirror that is arranged at a predetermined position in the cabin, whose reflecting surface is a half mirror, and which has, behind the half mirror, a display that presents an image observable from the front side of the half mirror; viewpoint relative position detection means for detecting viewpoint coordinates in a coordinate system defined on the room mirror; and display form control means for controlling the image display form on the display based on the viewpoint coordinates detected by the viewpoint relative position detection means.

COPYRIGHT: (C)2006, JPO&NCIPI

Description

The present invention relates to an apparatus and method that, rather than displaying a plurality of images captured by one or several cameras independently of one another, display a composite image from which the situation over the entire area covered by those cameras can be grasped intuitively. The invention is suitable, for example, for a monitoring device in a store, or for a vehicle surroundings monitoring device that assists safety confirmation while driving.

In recent years, image generation apparatuses that display images captured by a plurality of cameras in an easy-to-view manner have been disclosed (for example, Patent Document 1). Patent Document 1 discloses an image generation apparatus that combines images of an area (for example, the vicinity of a vehicle) captured by a plurality of cameras into a single continuous image and displays the combined image.

Further, in Patent Document 1, a space model that is set appropriately in advance, or that is set according to the distance to an obstacle around the vehicle detected by obstacle detection means, is created by space model creation means. An image of the vehicle surroundings input by image input means from a camera installed on the vehicle is mapped onto the space model by mapping means. Subsequently, a single image viewed from the viewpoint determined by viewpoint conversion means is synthesized from the mapped image and displayed on display means.

In this way, with such a device installed in a vehicle, what kinds of objects exist around the entire periphery of the vehicle is synthesized into a single image that is as easy to understand as possible and presented to the driver. Moreover, the viewpoint conversion means makes it possible to display an image from whatever viewpoint the driver desires.
Japanese Patent No. 3286306
JP 05-265547 A
JP 06-266828 A
JP 11-78693 A
JP 05-37965 A
JP 06-28452 A
JP 2000-47139 A
JP 2000-276613 A
JP 2000-348165 A
JP 07-248216 A
JP 2004-61907 A
Fumiaki Tomita, "High-function 3D visual system," Joho Shori (Information Processing), IPSJ, Vol. 42, No. 4

However, Patent Document 1 focuses on how to synthesize images of an area (for example, the vicinity of a vehicle) captured by a plurality of cameras into a single continuous image, map the synthesized image onto a virtual three-dimensional space model, and create from the mapped data an image whose viewpoint has been virtually changed in three dimensions (a virtual viewpoint image). It makes no concrete proposal for improving the convenience of the user interface with regard to the display method and display form.

In view of the above problems, the present invention provides an image display device that displays a virtual viewpoint image while taking user convenience into account.

According to the first aspect of the present invention, the above problem is solved by providing an image display device comprising: a room mirror that is arranged at a predetermined position in the vehicle interior, whose reflecting surface is a half mirror, and which is provided, behind the half mirror, with an image display that presents an image observable from the front side of the half mirror; viewpoint relative position detection means for detecting viewpoint coordinates in a coordinate system defined on the room mirror; and display form control means for controlling the image display form on the image display based on the viewpoint coordinates detected by the viewpoint relative position detection means.

According to the invention described in claim 2, the above problem is solved by providing the image display device according to claim 1, wherein the viewpoint relative position detection means is installed in the room mirror.

According to the invention described in claim 3, the above problem is solved by providing an image display device comprising: a room mirror that is arranged at a predetermined position in the vehicle interior, whose reflecting surface is a half mirror, and which is provided, behind the half mirror, with an image display that presents an image observable from the front side of the half mirror; position and orientation detection means for detecting the position and orientation of the room mirror; viewpoint position detection means for detecting the viewpoint position of the observer; and display form control means for controlling the image display form on the image display based on the position and orientation of the room mirror detected by the position and orientation detection means and the viewpoint position of the observer detected by the viewpoint position detection means.

According to the invention described in claim 4, the above problem is solved by providing the image display device according to claim 1 or 3, further comprising line-of-sight direction detection means, wherein an image is displayed only when the line-of-sight direction intersects the mirror surface.

According to the invention described in claim 5, the above problem is solved by providing the image display device according to claim 3, wherein the image display can display image information output by imaging means that images the outside of the vehicle, and the display form control means comprises blind spot area recognition means for recognizing the observation area that becomes the observer's blind spot based on the position and orientation of the room mirror detected by the position and orientation detection means and the viewpoint position of the observer detected by the viewpoint position detection means, and controls the image display form so that an image of the outside of the vehicle corresponding to the observation area recognized as the blind spot by the blind spot area recognition means is displayed on the image display.

According to the sixth aspect of the present invention, the above problem is solved by providing the image display device according to claim 1, wherein the display form control means superimposes a warning display on the display image on the image display based on the output of separately provided object recognition means for recognizing an object that requires attention or warrants an alert.

According to the seventh aspect of the present invention, the above problem is solved by providing the image display device according to claim 1, wherein the display form control means superimposes a warning display on the display image on the image display based on the output of separately provided in-vehicle state recognition means for recognizing a dangerous state or a state of the vehicle that requires attention.

According to the eighth aspect of the present invention, the above problem is solved by providing the image display device according to claim 1, wherein the display form control means displays on the image display an image of the rear view through the rear window, based on the output of separately provided rear visual field situation recognition means for recognizing how dirty the rear window is.

According to the ninth aspect of the present invention, the above problem is solved by providing the image display device according to claim 1, wherein the half mirror is configured so that its reflectance changes according to a drive signal supplied from the outside.

According to the invention described in claim 10, the above problem is solved by providing the image display device according to claim 9, further comprising half mirror driving means for supplying the drive signal to the half mirror, wherein the half mirror driving means applies a drive signal so that the half mirror functions as a total reflection mirror when the image display is in a non-light-emitting state, applies a drive signal so that the reflected image formed by the half mirror and the display image on the image display are observed overlapping each other when the luminance of the image display is roughly equal to the external luminance, and applies a drive signal so that only the display image on the image display is observed when the luminance of the image display exceeds the external luminance.

ADVANTAGE OF THE INVENTION: According to the present invention, a technique can be realized that further improves the convenience of the user interface relating to the display of virtual viewpoint images.

<First Embodiment>
In the present embodiment, the display means is a display combined with a room mirror, and the virtual viewpoint image displayed corresponds to the image that the room mirror would reflect. Such embodiments are described below in order, with reference to the drawings.

FIG. 1 shows the image display device 10000 in the present embodiment. In the figure, the image display device 10000 comprises one or more cameras 101, a camera parameter table 103, space reconstruction means 104, a spatial data buffer 105, viewpoint conversion means 106, display form control means 10001, a room mirror 10002, viewpoint relative position detection means 10003, and an image display 10004, and is mounted on a vehicle, for example.

The cameras 101 are installed so as to be suitable for grasping the state of the area to be monitored. The camera 101 is, for example, one of a plurality of television cameras that capture the space to be monitored outside the vehicle, such as the situation around the vehicle; cameras may also be installed inside the vehicle. In general, it is preferable to use cameras 101 with a wide angle of view so that a large field of view can be obtained. The number of cameras 101 and their installation may follow a known arrangement such as that disclosed in Patent Document 1. Although the illustrated example uses a plurality of cameras, a single camera may instead be moved sequentially among the installation positions so as to obtain the same imaging data as when a plurality of cameras are provided. The same applies to each embodiment described below.

The camera parameter table 103 stores camera parameters representing the characteristics of the cameras 101. The camera parameters are as follows. The image display device 10000 is provided with calibration means (not shown) and performs camera calibration. Camera calibration is the process of determining and correcting, for a camera 101 placed in three-dimensional space, the camera parameters that represent its characteristics, such as the camera mounting position and mounting angle in that three-dimensional space, the camera lens distortion correction value, and the camera lens focal length. The calibration means and the camera parameter table 103 are also described in detail in Patent Document 1, for example.
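The kind of per-camera record the camera parameter table might hold after calibration can be sketched as follows. This is an illustration only: the field names, units, and table layout are assumptions, since the text specifies only that the mounting position, mounting angle, lens distortion correction value, and focal length are stored.

```python
from dataclasses import dataclass

@dataclass
class CameraParams:
    # All field names and units are hypothetical illustrations of the
    # parameters named in the text, as determined by camera calibration.
    position: tuple        # camera mounting position (x, y, z) in vehicle coordinates [m]
    angles: tuple          # camera mounting angles (yaw, pitch, roll) [rad]
    focal_length: float    # lens focal length, in pixel units
    distortion: float      # lens distortion correction value (e.g. first radial coefficient)

# A camera parameter table could then simply map camera IDs to such records.
camera_parameter_table = {
    "rear": CameraParams((0.0, -2.0, 1.0), (3.1416, -0.3, 0.0), 800.0, -0.02),
}
```

A lookup such as `camera_parameter_table["rear"].focal_length` would then supply the space reconstruction step with the calibrated characteristics of each camera.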

The space reconstruction means 104 creates spatial data in which the input images from the cameras 101 are mapped onto a three-dimensional space model based on the camera parameters. That is, the space reconstruction means 104 creates spatial data that associates each pixel constituting an input image from a camera 101 with a point in three-dimensional space, based on the camera parameters calculated by the calibration means (not shown).

In other words, the space reconstruction means 104 calculates where in three-dimensional space each object included in an image captured by a camera 101 exists, and stores the resulting spatial data in the spatial data buffer 105. The space model may be predetermined, generated on the spot from a plurality of input images, or generated based on the output of a separately provided sensor.
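One common way to realize this pixel-to-space mapping, for the simplest space model (a flat road surface), is to back-project each pixel's viewing ray through a pinhole camera model and intersect it with the road plane. The sketch below is an illustration under stated assumptions, not the patent's implementation: it assumes a pinhole camera with focal length `f` (in pixels) and principal point `(cx, cy)`, mounted at `cam_pos` in vehicle coordinates and pitched downward, with the road as the plane z = 0.

```python
import math

def pixel_to_road_point(u, v, f, cx, cy, cam_pos, pitch):
    """Back-project image pixel (u, v) onto the road plane z = 0.

    Pinhole model: the camera looks along its +y axis with z up, sits at
    cam_pos = (x, y, z) in vehicle coordinates, and is pitched down by
    `pitch` radians about its x-axis. Returns the (x, y) road point, or
    None when the pixel's ray never reaches the road (e.g. sky pixels,
    which the text notes need not be mapped at all).
    """
    # Ray direction in camera coordinates.
    dx = (u - cx) / f
    dz = -(v - cy) / f
    dy = 1.0
    # Rotate the ray by the camera pitch (about the x-axis).
    ry = dy * math.cos(pitch) + dz * math.sin(pitch)
    rz = -dy * math.sin(pitch) + dz * math.cos(pitch)
    if rz >= 0:               # ray does not descend toward the road
        return None
    t = -cam_pos[2] / rz      # scale so the ray reaches z = 0
    return (cam_pos[0] + t * dx, cam_pos[1] + t * ry)
```

Running this for every (possibly thinned-out) pixel of every camera image, and recording which pixel landed where, yields spatial data of the kind stored in the spatial data buffer 105.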

For example, as described in Patent Document 1, the space model may be a model composed of five planes, a bowl-shaped model, a model combining a plane and a curved surface, or a model that introduces additional surfaces, or a combination of these. The space model is not limited to these; any model composed of planes, curved surfaces, or a combination of the two may be used. A space model may also be created based on a stereo image obtained from a stereo sensor or the like that acquires a distance image computed by triangulation (for example, Patent Documents 2 and 3).

Note that it is not necessary to construct the spatial data from all of the pixels constituting an input image from a camera 101. For example, if the input image includes a region above the horizon, the pixels in that region need not be mapped onto the road surface; likewise, pixels representing the vehicle body need not be mapped. When the input image has a high resolution, the processing speed may be increased by skipping every few pixels when mapping to spatial data. The space reconstruction means 104 is described in detail in Patent Document 1, for example.

The spatial data buffer 105 temporarily stores the spatial data created by the space reconstruction means 104. The spatial data buffer 105 is also described in detail in Patent Document 1, for example.

The viewpoint conversion means 106 creates an image viewed from an arbitrary viewpoint by referring to the spatial data. That is, referring to the spatial data created by the space reconstruction means 104, it creates an image as if a camera were placed at an arbitrary viewpoint. The viewpoint conversion means 106 can also have the configuration detailed in Patent Document 1, for example.

The room mirror 10002 is installed at a predetermined position in the vehicle interior (for example, near the top of the windshield). The reflecting surface of the room mirror 10002 is a half mirror, and an image display 10004, which displays an image so that it can be observed from the front side of the half mirror, is provided behind the half mirror. The room mirror itself can be, for example, a known mirror similar to that disclosed in Patent Document 4.

The viewpoint relative position detection means 10003 detects the user's viewpoint; that is, it detects the viewpoint coordinates in a coordinate system defined on the room mirror 10002. The viewpoint relative position detection means 10003 is installed in the room mirror.

The display form control means 10001 controls the display form in order to display the virtual viewpoint image generated by the viewpoint conversion means 106 on the image display 10004 (for example, a liquid crystal display).

FIG. 2 shows the display flow for the virtual viewpoint image in the present embodiment. First, the space reconstruction means 104 creates spatial data by calculating the correspondence between each pixel constituting the images obtained from the cameras 101 and a point on the three-dimensional coordinate system, or on a preset three-dimensional model (S1). This calculation is executed for all pixels of the image obtained from each camera 101, or for a thinned-out subset of them. For this processing itself, a known method such as that disclosed in Patent Document 1 can be applied.

Next, the user's viewpoint is detected by the viewpoint relative position detection means 10003 (S2). This will be described later.

Next, under the control of the viewpoint conversion means 106, the viewpoint information detected in S2 is set as the virtual viewpoint, and an image from the set viewpoint (that is, a virtual viewpoint image) is reproduced from the spatial data described above (S3). For this processing itself, a known method such as that disclosed in Patent Document 1 can be applied.

Then, under the control of the display form control means 10001, the virtual viewpoint image is output to and displayed on the image display 10004 (S4).
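The flow S1 through S4 can be sketched as one update cycle. All of the callables below are placeholders standing in for the means described in the text, and their names are assumptions made for illustration.

```python
def display_virtual_viewpoint_image(cameras, param_table, detect_viewpoint,
                                    reconstruct, render, mirror_display):
    """One update cycle of the display flow in FIG. 2 (illustrative).

    `reconstruct` stands for the space reconstruction means (S1),
    `detect_viewpoint` for the viewpoint relative position detection
    means (S2), `render` for the viewpoint conversion means (S3), and
    `mirror_display` for the image display behind the half mirror (S4).
    """
    # S1: map each camera image onto the three-dimensional space model.
    spatial_data = reconstruct(cameras, param_table)
    # S2: detect the user's viewpoint in the mirror coordinate system.
    viewpoint = detect_viewpoint()
    # S3: re-render the spatial data from that viewpoint.
    image = render(spatial_data, viewpoint)
    # S4: show the virtual viewpoint image on the in-mirror display.
    mirror_display(image)
    return image
```

In a running system this cycle would repeat continuously, so that the displayed image tracks both the camera feeds and the user's head movement.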
FIG. 3 illustrates the positional relationship between the viewpoint position and the room mirror in the present embodiment. FIG. 3A is a view from the side of the vehicle, and FIG. 3B is a view from the front of the vehicle.

To generate the image to be displayed on the image display 10004, the viewpoint conversion means 106 requires a virtual viewpoint, and obtaining that virtual viewpoint requires the viewpoint position and the position and orientation of the room mirror. In this embodiment, therefore, the position and orientation of the room mirror are fixed by attaching the coordinate system to the room mirror. This is described in detail below.

First, the viewpoint in this embodiment expresses the position of the user's eyes as a single point in a predetermined space. For example, the dominant eye (for example, the right eye) may be taken as the viewpoint, the midpoint between both eyes may be taken as the viewpoint, the center of gravity of the head may be used (in which case the offset from the center of the head to the eyeball is ignored), or a viewpoint that considers the center of gravity of the head together with the orientation of the face (in which case eyeball movement is ignored) may be used.

In FIG. 3, the horizontal direction of the room mirror 10002 is defined as the Xa direction, the vertical direction as the Ya direction, and the direction orthogonal to both as the Za direction, and this coordinate system is fixed to the room mirror. The room mirror 10002 is also equipped with, for example, a stereo sensor. The center of the half mirror surface of the room mirror 10002 can be taken as the origin of the Xa-Ya-Za coordinate system; alternatively, the position of the stereo sensor mounted on the room mirror may be used as the origin.

The stereo sensor then images the user; the center of gravity of the user's head (hereinafter, the viewpoint) is detected from the captured image by image processing, and the distance between the stereo sensor and the user is obtained. Since the pixel coordinates of the image captured by the stereo sensor are associated in advance with the Xa-Ya-Za coordinate system, the detected viewpoint can be placed in the Xa-Ya-Za coordinate system. The viewpoint is thereby located relative to the Xa-Ya-Za coordinate system of the room mirror 10002.

That is, since the viewpoint coordinates in the Xa-Ya-Za coordinate system are determined, the vector from the viewpoint to the origin can be regarded as the line-of-sight vector when the user observes the half mirror. Therefore, by treating the display surface of the image display (the Xa-Ya plane) as the image plane and performing a central projection with the viewpoint as the pinhole, the spatial data can be rendered into an image that is aligned with the reflected image the user sees in the room mirror 10002.

In addition, since the size of the room mirror and the position of the viewpoint relative to the room mirror are known, the angle of view of the image displayed on the room mirror can be determined.
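Given the viewpoint in the mirror's Xa-Ya-Za frame and the mirror's width and height, the line-of-sight vector and the angles of view follow directly. The sketch below is a simplified illustration (the function and symbol names are assumptions): it approximates the mirror as subtending a symmetric angle at the viewpoint, which is exact only when the viewpoint lies near the mirror normal.

```python
import math

def gaze_and_fov(viewpoint, mirror_width, mirror_height):
    """Line-of-sight vector and angles of view for a viewpoint given in
    the mirror's Xa-Ya-Za frame (origin at the half-mirror centre).

    The gaze vector runs from the viewpoint to the mirror origin; each
    angle of view is the angle the corresponding mirror edge pair
    subtends at the viewpoint (small-offset approximation).
    """
    gaze = tuple(-c for c in viewpoint)               # viewpoint -> origin
    dist = math.sqrt(sum(c * c for c in viewpoint))   # viewpoint-to-mirror distance
    h_fov = 2.0 * math.atan((mirror_width / 2.0) / dist)
    v_fov = 2.0 * math.atan((mirror_height / 2.0) / dist)
    return gaze, h_fov, v_fov
```

For example, a viewpoint 0.5 m in front of a 0.25 m wide mirror yields a horizontal angle of view of about 28 degrees, which is the field of view the viewpoint conversion means would render.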
From the above, the viewpoint position, the observation direction, and the angle of view subtended by the room mirror, all of which are required to display the virtual viewpoint image on the room mirror 10002, can be set. Although a stereo sensor and image processing are used above to detect the viewpoint position, the present invention is not limited to this; any known technique may be used, such as the viewpoint position and direction sensor described in Patent Document 8, the method described in Patent Document 10, or having the user wear glasses fitted with a sensor so that the sensor detects the viewpoint position.

Further, general techniques for changing the display according to the viewpoint are disclosed in, for example, Patent Documents 5 to 9, and these techniques may be used in the present embodiment. An outline of these general techniques is given below.
Patent Document 5 discloses recognizing the three-dimensional shape of an object from the parallax of a stereo camera and storing it in a shape data memory, then supplying the image data from the image memory and the surface position coordinates of the object from the shape data memory to an arbitrary viewpoint image generation device, whose computed output is sent to a multi-view three-dimensional display device, thereby reducing the amount of information required for display and simplifying the device.

Patent Document 6 discloses an image display in which an observation viewpoint sensor that detects the observer's viewpoint is provided, facing the observer, on a display that shows a three-dimensional image in perspective through a viewpoint coordinate system; a computer moves the viewpoint coordinate system of the displayed image to match the observation viewpoint detected by the sensor, so that the perspective viewpoint of the displayed three-dimensional image follows the observer's observation viewpoint and always coincides with it.

Patent Document 7 discloses a stereoscopic image display apparatus having a display unit and a viewpoint position detection mechanism that detects the observer's position information, in which a composite parallax image is constructed from m original parallax images (m being an integer of 3 or more), and the content of the composite parallax image is changed according to the viewpoint position information output from the viewpoint position detection mechanism in response to the observer's viewpoint position.

In Patent Document 8, a viewpoint position and direction calculation unit calculates the user's viewpoint, and a virtual object coordinate conversion unit performs coordinate conversion of a virtual image based on the position of an optical see-through display unit and the user's viewpoint. The coordinate-converted virtual image is then supplied to the optical see-through display unit and displayed together with the real-space image.

In Patent Document 9, based on a virtual viewpoint position input by a viewpoint input device, a frame selection device selects the image frame to be presented from a plurality of video image data, and an image display device displays the scene matching the virtual viewpoint position as the presentation image.

From the above, a virtual viewpoint image of the scenery in the direction reflected at the mirror's reflection angle is displayed as the image shown on the room mirror 10002. In addition, the virtual viewpoint image displayed on the room mirror 10002 is reversed left and right so as to represent an image reflected in a mirror.
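This left-right reversal can be applied per scanline. A minimal sketch, assuming a row-major image (a list of pixel rows):

```python
def mirror_flip(image):
    """Reverse each row of a row-major image so that the virtual
    viewpoint image reads like a reflection in the room mirror."""
    return [row[::-1] for row in image]
```

For example, `mirror_flip([[1, 2, 3]])` returns `[[3, 2, 1]]`; in practice the flip could equally be folded into the central projection by negating the image-plane x-axis.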
As described above, the reflected image of the real world in the half mirror and the video on the image display can be observed by the user in a naturally aligned form.

<Second Embodiment>
This embodiment is a modification of the first embodiment. In the first embodiment, since the coordinate system is attached to the room mirror, information on the position and orientation of the room mirror is unnecessary. In this embodiment, by contrast, the coordinate system is attached to the host vehicle, and the case where the position and orientation of the room mirror are taken into consideration is described.

FIG. 4 shows the image display device 10000 in the present embodiment. In the figure, the image display device 10000 comprises a plurality of cameras 101, a camera parameter table 103, space reconstruction means 104, a spatial data buffer 105, viewpoint conversion means 106, display form control means 10001, a room mirror 10002, an image display 10004, viewpoint position detection means 10005, and position and orientation detection means 10006. Except for the viewpoint position detection means 10005 and the position and orientation detection means 10006, the configuration is the same as in FIG. 1.

The viewpoint position detection means 10005 detects the user's viewpoint; it may use a stereo sensor and image processing as in the first embodiment, may have the user wear glasses fitted with a sensor so that the sensor detects the viewpoint position, or may use other existing methods. The detected viewpoint position is assumed to be expressed in a coordinate system fixed to the vehicle. The origin of this vehicle-fixed coordinate system may be placed anywhere, as long as it at least moves together with the vehicle; it may, for example, be defined on the room mirror.

The position and orientation detection means 10006 detects the position and orientation of the room mirror in the coordinate system fixed to the vehicle. Specifically, for example, an inclination sensor or the like may be attached to the room mirror.

From the above, since the viewpoint position coordinates and the position of the room mirror are both acquired in the coordinate system defined on the host vehicle, the position of the viewpoint relative to the room mirror can be obtained from them. That is, it is possible to convert to a new coordinate system (the Xb-Yb-Zb coordinate system) whose origin is a predetermined part of the room mirror, and the viewpoint position can also be expressed in this coordinate system. When generating this coordinate system, based on the information about the position and orientation of the room mirror detected by the position and orientation detection means 10006, the horizontal direction of the room mirror 10002 is taken as the Xb direction, the vertical direction as the Yb direction, and the direction orthogonal to both as the Zb direction. The origin of the Xb-Yb-Zb coordinate system is a predetermined position on the room mirror, for example its center.

As another method for calculating the position of the viewpoint relative to the room mirror, the following is also conceivable. As above, the viewpoint position coordinates and the position of the room mirror in the coordinate system defined on the host vehicle are acquired. Therefore, the vector from the viewpoint position to the room mirror can be obtained by subtracting the position vector of the viewpoint coordinates from the position vector of the room mirror. Note that this vector can also be regarded as the line of sight.
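The conversion from the vehicle-fixed frame to the mirror's Xb-Yb-Zb frame can be sketched as a translation followed by undoing the mirror's attitude. This is an illustration under simplifying assumptions: the pose is reduced to two angles (yaw and pitch), whereas a real attitude sensor would also report roll, and all names are hypothetical.

```python
import math

def to_mirror_frame(viewpoint, mirror_pos, yaw, pitch):
    """Express a vehicle-frame viewpoint in the mirror's Xb-Yb-Zb frame.

    `mirror_pos` is the mirror centre in vehicle coordinates (the chosen
    origin of Xb-Yb-Zb); `yaw` and `pitch` describe the mirror attitude
    reported by the position and orientation detection means.
    """
    # Translate so that the mirror centre becomes the origin.
    p = [v - m for v, m in zip(viewpoint, mirror_pos)]
    # Undo yaw (rotation about the vehicle z-axis).
    x = p[0] * math.cos(-yaw) - p[1] * math.sin(-yaw)
    y = p[0] * math.sin(-yaw) + p[1] * math.cos(-yaw)
    z = p[2]
    # Undo pitch (rotation about the mirror x-axis).
    y, z = (y * math.cos(-pitch) - z * math.sin(-pitch),
            y * math.sin(-pitch) + z * math.cos(-pitch))
    return (x, y, z)
```

The resulting coordinates can then be used exactly as the Xa-Ya-Za viewpoint was used in the first embodiment.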

FIG. 5 illustrates the positional relationship between the viewpoint position and the room mirror in the present embodiment. FIG. 5A is a view from the side of the vehicle, and FIG. 5B is a view from the front of the vehicle. As described above, these drawings illustrate an example in which a new coordinate system is attached to the room mirror.

As described above, since the viewpoint coordinates in the vehicle-fixed coordinate system and the information indicating the position and orientation of the room mirror 10002 are acquired, an image aligned with the reflected image of the room mirror 10002 as seen by the user can be generated, just as in the first embodiment.

In the first embodiment, the Xa-Ya-Za coordinate system is fixed to the room mirror and the viewpoint position is calculated in that coordinate system. Because this method obtains the viewpoint position relative to the room mirror directly, it requires viewpoint detection means that move in unison with the room mirror, so that when retrofitted to a finished vehicle, the system becomes complicated and the installation work burdensome. In the second embodiment described above, by contrast, the viewpoint coordinates and the position and orientation of the room mirror are detected in a coordinate system fixed to the host vehicle, with a predetermined location that moves with the vehicle as the origin. The viewpoint detection means therefore need not move in unison with the room mirror, and the system is simplified. This approach is thus well suited to retrofitting to a finished vehicle, which improves its versatility.

<Third Embodiment>
In the first and second embodiments, the viewpoint position is detected, and the direction from the viewpoint position toward the origin of the mirror coordinate system is assumed to be the line of sight. However, since the viewpoint position is merely a point on the user's body, the user's line of sight is not necessarily directed at the room mirror. In the present embodiment, therefore, an image display device that considers not only the viewpoint position but also the direction of the user's line of sight is described.

  FIG. 6 shows an image display device 10000 in the present embodiment. In the figure, the image display device 10000 comprises a plurality of cameras 101, a camera parameter table 103, a spatial reconstruction unit 104, a spatial data buffer 105, a viewpoint conversion unit 106, a display form control unit 10001, a room mirror 10002, an image display 10004, a viewpoint position detection unit 10005, a position and orientation detection unit 10006, and a line-of-sight direction detection unit 10007. Except for the line-of-sight direction detection means 10007, the configuration is the same as in the second embodiment.

  The line-of-sight direction detection means 10007 detects the direction in which the user's line of sight is pointed. The line of sight here may be the optical axis of the dominant eye, an axis midway between the left and right optical axes, or the orientation of the face (in which case eye movement is ignored as an error); it may also be treated as the region around the line of sight that overlaps the human gaze area. The line of sight can be obtained, for example, from a face image of the user, using the orientation of the user's face and eyes. Information representing the line of sight can also be obtained with a known in-vehicle gaze direction measuring device such as the one disclosed in Patent Document 10.

  From the above embodiments, the viewpoint position, the position of the room mirror, and the size of the room mirror are all known. Therefore, by extending a straight line in the line-of-sight direction from the viewpoint position, it can be determined, taking the position and size of the room mirror into account, whether or not that line intersects the mirror surface. Based on this determination, control is performed so that an image is displayed on the room mirror when the line of sight intersects it, and no image (or a screen saver, wallpaper, or the like) is displayed when it does not.
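The intersection test just described can be sketched as a ray-rectangle check. The representation below (mirror given by its center, unit normal, and two in-plane unit axes with half-extents) is an illustrative modeling choice, not the patent's concrete implementation:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gaze_hits_mirror(eye, gaze, center, normal, u, v, half_w, half_h):
    """Return True if the gaze ray from `eye` along direction `gaze`
    intersects the rectangular mirror centred at `center`, with unit
    normal `normal`, in-plane unit axes `u` and `v`, and half-extents
    (half_w, half_h)."""
    denom = dot(normal, gaze)
    if abs(denom) < 1e-9:          # gaze parallel to the mirror plane
        return False
    t = dot(normal, [c - e for c, e in zip(center, eye)]) / denom
    if t <= 0:                     # mirror is behind the viewer
        return False
    hit = [e + t * g for e, g in zip(eye, gaze)]
    rel = [h - c for h, c in zip(hit, center)]
    # Inside the rectangle iff both in-plane offsets are within the extents.
    return abs(dot(rel, u)) <= half_w and abs(dot(rel, v)) <= half_h
```

The display form control means would then show the virtual viewpoint image only while this function returns True, and a screen saver or nothing otherwise.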

As described above, since an image is displayed on the room mirror only while the user is looking at it, the display is both economical and efficient.
<Fourth Embodiment>
This embodiment is a modification of the second embodiment. When a virtual viewpoint image is displayed on the display of the room mirror, the scenery that would normally be hidden by parts of the vehicle is displayed in the portions that form the driver's blind spots. By controlling the display in this way, an image as if the vehicle body were translucent is observed in the rearview mirror, and this state is maintained following the movement of the line of sight.

  FIG. 7 shows an image display device 10000 in the present embodiment. In the figure, the image display device 10000 comprises a plurality of cameras 101, a camera parameter table 103, a spatial reconstruction unit 104, a spatial data buffer 105, a viewpoint conversion unit 106, a display form control unit 10001, a room mirror 10002, an image display 10004, a viewpoint position detection unit 10005, a position and orientation detection unit 10006, and a blind spot area recognition unit 10008. Except for the blind spot area recognition means 10008, the configuration is the same as in the second embodiment.

The blind spot area recognizing means 10008 recognizes the parts (blind spot areas) that become blind spots for the driver, based on the driver's viewpoint detected by the viewpoint position detecting means 10005.
FIG. 8 shows a display form of an image displayed on the room mirror 10002 in the present embodiment. FIG. 8A shows the reflected image produced by the half mirror when the image display 10004 is not used, that is, the scenery in the direction of the vehicle's rear window. As shown in FIG. 8A, the reflected image contains the driver's seat, the passenger seat, the rear seats, and the rear window frame; these form blind spots for the driver looking into the rearview mirror, so the scenery beyond them cannot be seen.

  FIG. 8B shows a state in which, using the present invention, the scenery in the rear window direction is displayed on the image display 10004 based on the image information acquired by the cameras 101 so as to appear in the half mirror, together with a virtual viewpoint image corresponding to the scenery that lies in the driver's blind spots and cannot otherwise be seen. In FIG. 8B, the display fills in the field of view that was lost to the blind spots caused by the driver's seat, passenger seat, rear seats, and rear window frame shown in FIG. 8A.

  In addition, in order to indicate what corresponds to a blind spot, the display form distinguishes the parts corresponding to scenery hidden in a blind spot from the parts that are not. For example, as shown in FIG. 8B, the parts causing the blind spot may be rendered as a wire frame or made translucent, with the image of the blind spot superimposed on them; the two kinds of parts may also be distinguished by different colors or by highlighting.

  Now, FIG. 8B will be described in more detail. As explained above, since the viewpoint position and the position and orientation of the room mirror 10002 are known, a virtual reflected image (virtual viewpoint image) corresponding to them can be generated and shown on the image display 10004, which realizes the display form of FIG. 8A. The virtual viewpoint image generated here does not take into account information about the host vehicle (vehicle-mounted articles, seats, front pillars, rear pillars, and so on), and it is horizontally mirrored when shown on the display means.

  Next, in order to realize the display form of FIG. 8B, information specifying the parts that become the driver's blind spots is necessary. The blind spot area recognizing unit 10008 therefore computes the blind spot area as seen from the driver's viewpoint, using the driver's viewpoint position detected by the viewpoint position detecting unit 10005 and CAD (Computer Aided Design) data of the vehicle. The information on the blind spot area and the virtual viewpoint information acquired as described above (which may include information indicating the direction in which the viewpoint is oriented) are then transmitted to the display form control unit 10001.

  In the virtual viewpoint image generated without considering information about the host vehicle, the display form control means 10001 distinguishes the portion corresponding to the blind spot area obtained from the CAD data, i.e., the scenery that would normally be hidden, from the scenery that is not in a blind spot. For example, the hidden scenery may be displayed as a wire frame, distinguished by a color different from the other parts, or highlighted.

  As a method of identifying the blind spot area without using CAD data, a virtual viewpoint image generated without considering information about the host vehicle and one generated considering that information can both be computed, and the blind spot area obtained as the difference between the two virtual viewpoint images. Based on this blind spot information, the parts corresponding to the hidden scenery are then displayed, for example, as a wire frame.
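This difference-based method can be sketched as follows, assuming for illustration that the two virtual viewpoint images are available as grayscale pixel arrays and that a simple per-pixel threshold (a hypothetical parameter) separates "different" from "same":

```python
def blind_spot_mask(img_without_vehicle, img_with_vehicle, threshold=10):
    """Return a boolean mask marking the blind spot area.

    Both inputs are 2-D lists of grayscale pixel values rendered from the
    same virtual viewpoint; pixels that differ between the two renderings
    are covered by the vehicle body in one of them, i.e. they belong to
    the driver's blind spot."""
    return [[abs(a - b) > threshold
             for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_without_vehicle, img_with_vehicle)]
```

The display form control means could then overlay the wire-frame or translucent rendering only where this mask is True.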

  As described above, by controlling the display in this way, an image as if the vehicle body were translucent is observed in the rearview mirror, and this state is maintained following the movement of the line of sight. No conventional example achieves this: Patent Document 4, for example, discloses a room mirror equipped with a display, but it does not realize the present embodiment.

<Fifth Embodiment>
In this embodiment, when the image display device of any of the above embodiments recognizes an object outside the vehicle that requires attention or warrants an alert, a warning to that effect is displayed on the display.

  FIG. 9 shows an image display device 10000 in the present embodiment. In the figure, the image display device 10000 comprises a plurality of cameras 101, a camera parameter table 103, a spatial reconstruction unit 104, a spatial data buffer 105, a viewpoint conversion unit 106, a display form control unit 10001, a room mirror 10002, a viewpoint relative position detection means 10003, an image display 10004, and an attention object recognition means 10009. Except for the attention object recognition means 10009, the configuration is the same as in the first embodiment.

  The attention object recognition unit 10009 is, for example, a stereo sensor installed on the outside of the vehicle. The stereo sensor measures the distance to a target object, and by sampling this distance at fixed time intervals, the movement of the object over time can be tracked and the possibility of a collision determined.
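One way to sketch this determination is a time-to-collision estimate from two consecutive distance samples. The threshold values below are illustrative assumptions, not taken from the patent:

```python
def time_to_collision(d_prev, d_curr, dt):
    """Estimate time-to-collision (seconds) from two sampled distances
    `d_prev` and `d_curr` (metres) taken `dt` seconds apart.
    Returns None when the object is not approaching."""
    closing_speed = (d_prev - d_curr) / dt   # positive when approaching
    if closing_speed <= 0:
        return None
    return d_curr / closing_speed

def collision_warning(d_prev, d_curr, dt, ttc_threshold=2.0):
    """Trigger a warning when the estimated time-to-collision falls
    below an (illustrative) threshold."""
    ttc = time_to_collision(d_prev, d_curr, dt)
    return ttc is not None and ttc < ttc_threshold
```

When `collision_warning` fires, the display form control unit 10001 would superimpose the warning mark on the mirror display as described next.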

  When, for example, an oncoming vehicle detected by the attention object recognition unit 10009 approaches at an abnormal speed, the display form control unit 10001 superimposes a warning on the display image of the image display 10004. The warning may be a (specific) color, a mark, a balloon, or the like, or an audible alert.

<Sixth Embodiment>
In the present embodiment, when the image display device of any of the embodiments described above recognizes a dangerous state or a state requiring attention inside the vehicle, a warning to that effect is displayed on the display.

  FIG. 10 shows an image display device 10000 in the present embodiment. In the figure, the image display device 10000 comprises a plurality of cameras 101, a camera parameter table 103, a spatial reconstruction unit 104, a spatial data buffer 105, a viewpoint conversion unit 106, a display form control unit 10001, a room mirror 10002, a viewpoint relative position detection means 10003, an image display 10004, and an in-vehicle state recognition means 10010. Except for the in-vehicle state recognition means 10010, the configuration is the same as in the first embodiment.

  The in-vehicle state recognition means 10010 may be any means for recognizing the state inside the vehicle. For example, by installing a pressure sensor in a seat, the state of the person sitting on it can be estimated: when the person stops moving, it can be estimated that he or she is asleep, and when the center of gravity shifts toward the outside of the vehicle, it can be estimated that part of the body is out of the vehicle. The occupant's state can also be estimated by photographing the occupant with a camera installed in the vehicle and detecting predetermined motion patterns. In addition, a temperature sensor may be installed in the vehicle to detect an abnormal in-vehicle temperature.
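A toy sketch of this kind of estimation, assuming the pressure sensor reports a lateral centre-of-gravity position over time; the seat width and stillness thresholds are illustrative assumptions:

```python
def estimate_occupant_state(cog_history, seat_half_width=0.25, still_eps=0.01):
    """Estimate the occupant state from recent centre-of-gravity
    x-positions (metres, seat centre at 0) reported by a seat
    pressure sensor. Thresholds are illustrative only."""
    latest = cog_history[-1]
    if abs(latest) > seat_half_width:
        # Centre of gravity beyond the seat edge.
        return "body leaning out of seat"
    movement = max(cog_history) - min(cog_history)
    if movement < still_eps:
        # Practically no movement over the observation window.
        return "no movement - occupant may be asleep"
    return "normal"
```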

The display form control unit 10001 estimates the passenger's state from the detection result of the in-vehicle state recognition unit 10010; if the estimated state is dangerous for the passenger, a warning to that effect can be superimposed on the image display 10004.
As described above, when the warning is displayed the driver can promptly stop the vehicle, so accidents and the like can be prevented in advance.

<Seventh Embodiment>
In the present embodiment, when the rear window is dirty and it is difficult to confirm the area behind the vehicle visually or via the reflected image of the rearview mirror, a rearward virtual viewpoint image is displayed on the rearview mirror's display.

  FIG. 11 shows an image display device 10000 in the present embodiment. In the figure, the image display device 10000 comprises a plurality of cameras 101, a camera parameter table 103, a spatial reconstruction unit 104, a spatial data buffer 105, a viewpoint conversion unit 106, a display form control unit 10001, a room mirror 10002, a viewpoint relative position detection means 10003, an image display 10004, and a rear visual field situation recognition means 10011. Except for the rear visual field situation recognition means 10011, the configuration is the same as in the first embodiment.

  The rear visual field situation recognition means 10011 detects the state of the rear window. For example, a camera is installed beside the rear window and photographs it as needed; the difference between an image of the rear window taken before it was soiled and the image taken at any given time is computed, and if the difference exceeds a predetermined threshold, the rear window is determined to be dirty. Alternatively, light may be shone on the rear window from inside the vehicle: dirt reflects the light, so the reflected light can be detected with a sensor.
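The image-difference check can be sketched as follows, assuming grayscale images as 2-D pixel arrays; the per-pixel tolerance and the "dirty" ratio are illustrative parameters:

```python
def rear_window_dirty(clean_img, current_img, pixel_eps=15, dirty_ratio=0.1):
    """Compare the current rear-window image against a reference image
    taken while the window was clean, and flag the window as dirty when
    the fraction of changed pixels exceeds `dirty_ratio`."""
    total = 0
    changed = 0
    for row_c, row_n in zip(clean_img, current_img):
        for c, n in zip(row_c, row_n):
            total += 1
            if abs(c - n) > pixel_eps:
                changed += 1
    return changed / total > dirty_ratio
```

When this returns True, the device switches the mirror to the rearward camera or virtual viewpoint image as described below.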

  In this way, when the rear window is dirty and the scenery behind it is difficult to see, a rearward image (for example, a virtual viewpoint image of the outside of the vehicle, or a rear-view image obtained by a camera installed at the rear of the vehicle) is displayed on the room mirror 10002.

As described above, since the area behind the vehicle can be recognized even when the rear window is dirty and the scenery through it is difficult to see, safe driving is possible.
<Eighth Embodiment>
In this embodiment, an optical element whose reflectivity changes according to a drive signal supplied from the outside is used as the half mirror constituting the reflection surface of the rearview mirror 10002. As required, this half mirror is made to function as a total reflection mirror, to function as a semi-transmissive mirror, or to become almost completely transparent.

  As an optical element whose transparency (reflectance) changes according to an externally supplied drive signal (applied voltage), the element described in Patent Document 11, for example, can be applied. FIG. 12 shows the states of the field of view observed in the room mirror 10002 in the present embodiment.

  FIG. 12A shows the state in which a drive signal is applied so that the half mirror functions as a total reflection mirror (reflectance 100%) while the image display 10004 is not emitting light. In this state, the rear field of view (along the front-rear direction of the host vehicle) is observed as a reflected image in the room mirror 10002, just as with a conventional mirror of this type.

  FIG. 12B shows the state in which, in an environment where the luminance of the image display 10004 is roughly equal to the external luminance, a drive signal is applied so that the reflected image from the half mirror and the display image of the image display 10004 are observed overlapping each other. In this case the reflectance of the half mirror is approximately 50%.

  FIG. 12C shows the state in which a drive signal is applied so that, when the luminance of the image display 10004 exceeds the external luminance, only the display image of the image display 10004 is observed. In this case the reflectance of the half mirror is approximately 0%.

  FIG. 13 shows, in conceptual cross section, the configuration of a room mirror 10002 provided with a half mirror driven as described with reference to FIG. 12, together with the configuration of its driving means. In the rearview mirror 10002, a liquid crystal display panel 202 functioning as the image display is disposed behind a variable-reflectivity mirror 201 serving as the half mirror, and light from a light source 203 is guided by a light guide plate 204 to perform emissive display.

  Further, an illuminance sensor 205 is provided at a suitable position near the upper edge of the room mirror 10002 to detect the external luminance. A voltage serving as the drive signal is applied to the variable-reflectivity mirror 201 from a mirror driver 206 acting as the half mirror drive means, and the variable-reflectivity mirror 201 is driven as described with reference to FIG. 12.

  A drive signal is also supplied from the liquid crystal driver 207 to the liquid crystal display panel 202 to perform the desired display, and the necessary driving power is supplied from the backlight drive controller 208 to the light source 203. The mirror driver 206, the liquid crystal driver 207, and the backlight drive controller 208 are collectively controlled by a display adjustment device 209 so that they operate appropriately with respect to one another.

  A signal representing the external luminance detected by the illuminance sensor 205 is supplied to the display adjustment device 209. Taking this luminance into account, the device controls the relationship between reflection and display as described with reference to FIG. 12, switching according to the relative brightness of the surroundings and of the liquid crystal display panel 202 (image display 10004). The display adjustment device 209 may be configured integrally with the display form control means 10001 described above.
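The three regimes of FIG. 12 suggest a simple selection rule for the drive signal. The following sketch uses illustrative reflectance values (1.0 / 0.5 / 0.0) corresponding to FIGS. 12A-12C; the mapping from reflectance to applied voltage depends on the particular optical element and is not shown:

```python
def select_reflectance(display_on, display_luminance, ambient_luminance):
    """Pick the half-mirror reflectance for the three regimes of FIG. 12.
    Luminances are in the same (arbitrary) units; values are illustrative."""
    if not display_on:
        return 1.0   # total-reflection mirror (FIG. 12A)
    if display_luminance > ambient_luminance:
        return 0.0   # display image observed exclusively (FIG. 12C)
    return 0.5       # reflection and display overlapped (FIG. 12B)
```

The display adjustment device 209 would evaluate such a rule on each illuminance-sensor reading and hand the result to the mirror driver 206.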

Since the first to eighth embodiments described above can be combined with one another wherever possible, various embodiments exist depending on the application.
FIG. 14 is a block diagram showing the hardware environment of the image display device 10000 in the first to eighth embodiments. In the figure, the image display device 10000 comprises at least a control device 10080 such as a central processing unit (CPU); a storage device 10081 such as a read-only memory (ROM), a random access memory (RAM), or a mass storage device; an output I/F (interface; the same abbreviation is used below) 10082; an input I/F 10083; a communication I/F 10084; and a bus 10085 connecting them. Various devices, such as the output means 107, are connected to the output I/F, the input I/F, or the communication I/F.

Devices connected to the input I/F include, for example, the cameras 101, in-vehicle cameras, stereo sensors and other sensors, input devices such as a keyboard and a mouse, readers for portable storage media such as CD-ROMs and DVDs, and other peripheral devices.
Devices connected to the communication I/F 10084 include, for example, a car navigation system and communication devices connected to the Internet or to GPS. The communication medium may be a communication network such as the Internet, a LAN, a WAN, a dedicated line, a wired line, or a wireless line.

  Various types of storage device, such as a hard disk or a magnetic disk, can be used as the storage device 10081. It stores the programs for the flows described in the above embodiments, various tables (for example, tables storing various setting values), CAD data, and the like. These programs are read by the control device 10080, which executes each step of the flows.

  A program may be provided by a program provider via the Internet through the communication I/F 10084 and stored in the storage device 10081, or it may be stored on a commercially available portable storage medium, set in the reader, and executed by the control device. Various types of portable storage media can be used, such as CD-ROMs, DVDs, flexible disks, optical disks, magneto-optical disks, and IC cards; programs stored on such media are read by the reader.

  A keyboard, a mouse, an electronic camera, a microphone, a scanner, a sensor, a tablet, or the like can be used as the input device, and other peripheral devices can also be connected. As the output device, the display 10004, a speaker, or the like can be used.

  In each of the above embodiments, the plurality of imaging devices may be arranged to form a so-called trinocular stereo camera or a quadrocular stereo camera. It is known that using a trinocular or quadrocular stereo camera yields more reliable and stable results in three-dimensional reconstruction and similar processing (see, for example, Non-Patent Document 1). In particular, it is known that arranging the cameras so as to have baselines in two directions allows three-dimensional reconstruction in more complex scenes, while arranging a plurality of cameras along a single baseline direction realizes a so-called multi-baseline stereo camera, which enables more accurate stereo measurement.

FIG. 1 is a diagram showing the image display device in the first embodiment.
FIG. 2 is a diagram showing the display flow of the virtual viewpoint image in the first embodiment.
FIG. 3 is a diagram for explaining the positional relationship between the viewpoint position and the room mirror in the first embodiment.
FIG. 4 is a diagram showing the image display device in the second embodiment.
FIG. 5 is a diagram for explaining the positional relationship between the viewpoint position and the room mirror in the second embodiment.
FIG. 6 is a diagram showing the image display device in the third embodiment.
FIG. 7 is a diagram showing the image display device in the fourth embodiment.
FIG. 8 is a diagram showing the display form of the image displayed on the room mirror in the fourth embodiment.
FIG. 9 is a diagram showing the image display device in the fifth embodiment.
FIG. 10 is a diagram showing the image display device in the sixth embodiment.
FIG. 11 is a diagram showing the image display device in the seventh embodiment.
FIG. 12 is a diagram showing the states of the field of view observed in the room mirror in the eighth embodiment.
FIG. 13 is a diagram showing the configuration of the room mirror functioning as in FIG. 12 and the configuration of its driving means.
FIG. 14 is a block diagram of the hardware environment of the image display device in the first to eighth embodiments.

Explanation of symbols

101 Camera
103 Camera parameter table
104 Spatial reconstruction means
105 Spatial data buffer
106 Viewpoint conversion means
107 Display means
201 Variable-reflectivity mirror
202 Liquid crystal display panel
203 Light source
204 Light guide plate
205 Illuminance sensor
206 Mirror driver
207 Liquid crystal driver
208 Backlight drive controller
209 Display adjustment device
10000 Image display device
10001 Display form control means
10002 Room mirror
10003 Viewpoint relative position detection means
10004 Image display
10005 Viewpoint position detection means
10006 Position and orientation detection means
10007 Line-of-sight direction detection means
10008 Blind spot area recognition means
10009 Attention object recognition means
10010 In-vehicle state recognition means
10011 Rear visual field situation recognition means
10080 Control device
10081 Storage device
10082 Output I/F
10083 Input I / F
10084 Communication I / F
10085 Bus

Claims (10)

  1. A room mirror arranged at a predetermined position in the passenger compartment, whose reflection surface is constituted by a half mirror, and which is provided with an image display, arranged behind the half mirror, that displays an image observable from the front side of the half mirror;
    Viewpoint relative position detecting means for detecting viewpoint coordinates in a coordinate system defined on the room mirror;
    Display form control means for controlling an image display form on the image display based on the viewpoint coordinates detected by the viewpoint relative position detection means;
    An image display device comprising:
  2.   The image display apparatus according to claim 1, wherein the viewpoint relative position detection unit is installed in the room mirror.
  3. A room mirror arranged at a predetermined position in the passenger compartment, whose reflection surface is constituted by a half mirror, and which is provided with an image display, arranged behind the half mirror, that displays an image observable from the front side of the half mirror;
    Position and orientation detection means for detecting the position and orientation of the room mirror;
    Viewpoint position detecting means for detecting the viewpoint position of the observer;
    Display form control means for controlling the image display form on the image display based on the position and orientation of the room mirror detected by the position and orientation detection means and the viewpoint position of the observer detected by the viewpoint position detection means; ,
    An image display device comprising:
  4.   The image display device according to claim 1, further comprising a line-of-sight direction detection unit, wherein an image is displayed only when the line-of-sight direction intersects the mirror surface.
  5.   The image display device according to claim 3, wherein the image display is capable of displaying image information output by imaging means that captures the outside of the vehicle; the device further comprises blind spot area recognition means for recognizing an observation area that becomes the observer's blind spot, based on the position and orientation of the room mirror detected by the position and orientation detection means and the viewpoint position of the observer detected by the viewpoint position detection means; and the display form control means controls the image display form so as to display, on the image display, the image of the outside of the vehicle corresponding to the observation area recognized as the blind spot by the blind spot area recognition means.
  6.   The image display device according to claim 1, wherein the display form control means is configured to be able to superimpose a warning display on the display image of the image display, based on the output of separately provided attention object recognition means for recognizing an object requiring attention or warranting an alert.
  7.   The image display device according to claim 1, wherein the display form control means is configured to be able to superimpose a warning display on the display image of the image display, based on the output of separately provided in-vehicle state recognition means for recognizing a dangerous state or a state requiring attention inside the vehicle.
  8.   The image display device according to claim 1, wherein the display form control means is configured to be able to display, on the image display, an image of the rear view through the rear window, based on the output of separately provided rear view condition recognition means for recognizing the dirt condition of the rear window.
  9.   The image display device according to claim 1, wherein the half mirror has a reflectivity that changes according to a drive signal supplied from outside.
  10.  The image display device according to claim 9, further comprising half mirror drive means for supplying the drive signal to the half mirror, wherein the half mirror drive means applies the drive signal so that the half mirror functions as a total reflection mirror when the image display is in a non-light-emitting state, applies the drive signal so that the reflected image from the half mirror and the display image of the image display are observed overlapping each other when the luminance of the image display is roughly equal to the external luminance, and applies the drive signal so that only the display image of the image display is observed when the luminance of the image display exceeds the external luminance.

JP2004152709A 2004-05-24 2004-05-24 Image display device Expired - Fee Related JP4323377B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004152709A JP4323377B2 (en) 2004-05-24 2004-05-24 Image display device


Publications (2)

Publication Number Publication Date
JP2005335410A true JP2005335410A (en) 2005-12-08
JP4323377B2 JP4323377B2 (en) 2009-09-02

Family

ID=35489465

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004152709A Expired - Fee Related JP4323377B2 (en) 2004-05-24 2004-05-24 Image display device

Country Status (1)

Country Link
JP (1) JP4323377B2 (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006293909A (en) * 2005-04-14 2006-10-26 Denso Corp Driver sight line direction detecting device
WO2008087706A1 (en) * 2007-01-16 2008-07-24 Pioneer Corporation Display device for vehicle, display method for vehicle, and display program for vehicle
JP2009020570A (en) * 2007-07-10 2009-01-29 Denso Corp Vehicle traveling support apparatus
KR100892503B1 (en) * 2008-03-19 2009-04-10 현대자동차주식회사 Room mirror for automobile
JP2009078597A (en) * 2007-09-25 2009-04-16 Denso Corp Rear side confirmation system
JP2009100180A (en) * 2007-10-16 2009-05-07 Denso Corp Vehicular rear monitoring device
Cited By (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006293909A (en) * 2005-04-14 2006-10-26 Denso Corp Driver sight line direction detecting device
WO2008087706A1 (en) * 2007-01-16 2008-07-24 Pioneer Corporation Display device for vehicle, display method for vehicle, and display program for vehicle
US8035575B2 (en) 2007-03-26 2011-10-11 Aisin Aw Co., Ltd. Driving support method and driving support apparatus
JP2009020570A (en) * 2007-07-10 2009-01-29 Denso Corp Vehicle traveling support apparatus
JP2009078597A (en) * 2007-09-25 2009-04-16 Denso Corp Rear side confirmation system
JP2009100180A (en) * 2007-10-16 2009-05-07 Denso Corp Vehicular rear monitoring device
JP5421788B2 (en) * 2008-02-20 2014-02-19 Clarion Co., Ltd. Vehicle periphery image display system
WO2009104675A1 (en) * 2008-02-20 2009-08-27 Clarion Co., Ltd. Vehicle peripheral image display system
US8624977B2 (en) 2008-02-20 2014-01-07 Clarion Co., Ltd. Vehicle peripheral image displaying system
KR100892503B1 (en) * 2008-03-19 2009-04-10 Hyundai Motor Company Room mirror for automobile
JP2009241689A (en) * 2008-03-31 2009-10-22 Equos Research Co Ltd Display device
JP2010109684A (en) * 2008-10-30 2010-05-13 Clarion Co Ltd Vehicle surrounding image display system
JP2010183170A (en) * 2009-02-03 2010-08-19 Denso Corp Display apparatus for vehicle
US8717196B2 (en) 2009-02-03 2014-05-06 Denso Corporation Display apparatus for vehicle
JP2011023805A (en) * 2009-07-13 2011-02-03 Clarion Co Ltd Blind-spot image display system for vehicle, and blind-spot image display method for vehicle
CN102474596A (en) * 2009-07-13 2012-05-23 歌乐牌株式会社 Blind-spot image display system for vehicle, and blind-spot image display method for vehicle
US8886023B2 (en) 2009-07-13 2014-11-11 Clarion Co., Ltd. Blind-spot image display system for vehicle, and blind-spot image display method for vehicle
CN102474596B (en) * 2009-07-13 2014-08-27 歌乐牌株式会社 Blind-spot image display system for vehicle, and blind-spot image display method for vehicle
WO2011007683A1 (en) * 2009-07-13 2011-01-20 Clarion Co., Ltd. Blind-spot image display system for vehicle, and blind-spot image display method for vehicle
US20120242834A1 (en) * 2009-12-07 2012-09-27 Clarion Co., Ltd. Vehicle periphery monitoring system
CN102714710A (en) * 2009-12-07 2012-10-03 歌乐牌株式会社 Vehicle periphery image display system
US20120249789A1 (en) * 2009-12-07 2012-10-04 Clarion Co., Ltd. Vehicle peripheral image display system
WO2011070640A1 (en) * 2009-12-07 2011-06-16 Clarion Co., Ltd. Vehicle periphery image display system
WO2011070641A1 (en) * 2009-12-07 2011-06-16 Clarion Co., Ltd. Vehicle periphery monitoring system
US9204108B2 (en) 2009-12-07 2015-12-01 Clarion Co., Ltd. Vehicle periphery monitoring system
JP2012136192A (en) * 2010-12-27 2012-07-19 Ypk:Kk In-vehicle electronic device
EP2660104A4 (en) * 2010-12-30 2014-06-11 Wise Automotive Corp Apparatus and method for displaying a blind spot
EP2660104A2 (en) * 2010-12-30 2013-11-06 Wise Automotive Corporation Apparatus and method for displaying a blind spot
CN103237685A (en) * 2010-12-30 2013-08-07 明智汽车公司 Apparatus and method for displaying a blind spot
KR101417383B1 (en) * 2012-10-31 2014-07-08 Hyundai Motor Company Apparatus and method for image control of room-mirror
JP2014093768A (en) * 2012-10-31 2014-05-19 Hyundai Motor Company Apparatus and method for controlling image on room mirror
JP2014198531A (en) * 2013-03-29 2014-10-23 Aisin Seiki Kabushiki Kaisha Image display controller, image display system, and display unit
US10112539B2 (en) 2013-03-29 2018-10-30 Aisin Seiki Kabushiki Kaisha Image display control apparatus, image display system and display unit for displaying rear-view image based on eye point of a driver or angle of a display device
US20160288717A1 (en) * 2013-03-29 2016-10-06 Aisin Seiki Kabushiki Kaisha Image display control apparatus, image display system and display unit
CN105073499A (en) * 2013-03-29 2015-11-18 爱信精机株式会社 Image display control device, image display system, and display unit
WO2014156788A1 (en) * 2013-03-29 2014-10-02 Aisin Seiki Kabushiki Kaisha Image display control device, image display system, and display unit
WO2015162910A1 (en) * 2014-04-24 2015-10-29 Panasonic Intellectual Property Management Co., Ltd. Vehicle-mounted display device, method for controlling vehicle-mounted display device, and program
WO2015162895A1 (en) * 2014-04-25 2015-10-29 Panasonic Intellectual Property Management Co., Ltd. Image processing device, method for controlling image processing device, program, and display device
JP5938703B2 (en) * 2014-04-25 2016-06-22 Panasonic Intellectual Property Management Co., Ltd. Image processing apparatus, image processing apparatus control method, program, and display apparatus
JP2016167859A (en) * 2014-04-25 2016-09-15 Panasonic Intellectual Property Management Co., Ltd. Image processing device, control method for image processing device, program and display device
EP3170699A4 (en) * 2014-07-14 2017-05-24 Panasonic Intellectual Property Management Co., Ltd. Electron mirror device
US10325550B2 (en) 2014-07-31 2019-06-18 Panasonic Intellectual Property Management Co., Ltd. Electronic mirror device
EP3176036A4 (en) * 2014-07-31 2017-07-12 Panasonic Intellectual Property Management Co., Ltd. Electronic mirror device
JPWO2016017114A1 (en) * 2014-07-31 2017-05-25 Panasonic Intellectual Property Management Co., Ltd. Electronic mirror device
CN106573575A (en) * 2014-07-31 2017-04-19 Panasonic Intellectual Property Management Co., Ltd. Electronic mirror device
JP2016040140A (en) * 2014-08-12 2016-03-24 Sony Corporation Display device for vehicle and display control method, and rear side monitoring system
JP2016055684A (en) * 2014-09-05 2016-04-21 Aisin Seiki Kabushiki Kaisha Image display control device and image display system
JPWO2016047367A1 (en) * 2014-09-25 2017-08-10 JVC Kenwood Corporation Mirror device with display function and direction changing method of mirror device with display function
EP3206394A4 (en) * 2014-10-07 2017-09-27 Panasonic Intellectual Property Management Co., Ltd. Electronic mirror device
JP2016107740A (en) * 2014-12-04 2016-06-20 Toyota Jidosha Kabushiki Kaisha Anti-glare device
US9744832B2 (en) 2014-12-04 2017-08-29 Toyota Jidosha Kabushiki Kaisha Anti-glare apparatus having light emitted from inside a moving vehicle to a light reactive member
JP2016141303A (en) * 2015-02-03 2016-08-08 Denso Corporation Visual field support device
JP2017034453A (en) * 2015-07-31 2017-02-09 Fujitsu Ten Limited Image processing apparatus, image display system, and image processing method
US10503989B2 (en) 2015-09-28 2019-12-10 Kyocera Corporation Image processing apparatus, imaging apparatus, camera monitor system, and image processing method
WO2017183513A1 (en) * 2016-04-20 2017-10-26 Aisin Seiki Kabushiki Kaisha Mirror display device
EP3562708A4 (en) * 2016-12-27 2019-11-06 Gentex Corp Rear vision system with eye-tracking
US10525890B2 (en) 2017-12-26 2020-01-07 Gentex Corporation Rear vision system with eye-tracking
WO2019187283A1 (en) * 2018-03-28 2019-10-03 Panasonic Intellectual Property Management Co., Ltd. Image processing device, image display system, and image processing method

Also Published As

Publication number Publication date
JP4323377B2 (en) 2009-09-02

Similar Documents

Publication Publication Date Title
US7719621B2 (en) Image display device and method having image control unit preventing light source unit from outputting an image when observer is outside of predefined normal viewing area
US10029700B2 (en) Infotainment system with head-up display for symbol projection
JP2007142735A (en) Periphery monitoring system
US8179435B2 (en) Vehicle surroundings image providing system and method
JP2010231276A (en) Method and apparatus for processing image
JP2009196630A (en) Display device
EP1179958B1 (en) Image processing device and monitoring system
JP5057936B2 (en) Bird's-eye image generation apparatus and method
US8451111B2 (en) Image display apparatus and method for displaying an image
US9762880B2 (en) Vehicle vision system with customized display
EP1916846B1 (en) Device and method for monitoring vehicle surroundings
EP3010761B1 (en) Vehicle vision system
US8886023B2 (en) Blind-spot image display system for vehicle, and blind-spot image display method for vehicle
JP5397373B2 (en) Vehicle image processing device and vehicle image processing method
US8502860B2 (en) Electronic control system, electronic control unit and associated methodology of adapting 3D panoramic views of vehicle surroundings by predicting driver intent
US20140336876A1 (en) Vehicle vision system
US20030151563A1 (en) Reduction of blind spots by using display screens
WO2012172923A1 (en) Vehicle periphery monitoring device
EP2365701A1 (en) Display device, terminal device, and display method
JP4475308B2 (en) Display device
JP5436086B2 (en) Vehicle periphery image display device and vehicle periphery image display method
US20110187844A1 (en) Image irradiation system and image irradiation method
DE102013220669A1 (en) Dynamic rearview indicator features
JP2010251939A (en) Vehicle circumference image display system
JP2010070066A (en) Head-up display

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20060526

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20090512

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20090604

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120612

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees