CN111213114B - Information display device and space sensing device thereof - Google Patents

Information display device and space sensing device thereof

Publication number
CN111213114B
CN111213114B (application CN201880067362.9A)
Authority
CN
China
Prior art keywords
information
display device
driver
image
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880067362.9A
Other languages
Chinese (zh)
Other versions
CN111213114A (en)
Inventor
平田浩二
藤田浩司
高木荣治
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Maxell Ltd
Original Assignee
Maxell Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Maxell Ltd filed Critical Maxell Ltd
Priority to CN202311027382.2A priority Critical patent/CN116788039A/en
Publication of CN111213114A publication Critical patent/CN111213114A/en
Application granted granted Critical
Publication of CN111213114B publication Critical patent/CN111213114B/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Arrangement of adaptations of instruments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04 Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • B60K2360/149
    • B60K2360/176
    • B60K2360/31
    • B60K2360/66
    • B60K35/10
    • B60K35/23
    • B60K35/28
    • B60K35/654
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0181 Adaptation to the pilot/driver
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04108 Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction

Abstract

Provided are an information display device and a space sensing device for it, with which a large number of display images can be selected and changed with little viewpoint movement and an interactive function for the driver can be realized. An information display device for displaying information on a vehicle comprises: an image display unit that displays image information in an image display area in front of the driver's seat of the vehicle; and a space sensing unit that detects the position of an instruction given by the driver in the space region between the driver's seat and the displayed image display area, the information display device further including a unit that displays the driver's instruction in the front image display area in accordance with the position information detected by the space sensing unit.

Description

Information display device and space sensing device thereof
Technical Field
The present invention relates to an information display device that projects information including images onto the windshield of a passenger-carrying vehicle such as an automobile, a train, or an airplane, and more particularly to an information display device having functions based on interaction by the driver or other users, and to a space sensing device used in such an information display device.
Background
For example, patent document 1 below discloses a so-called head-up display (HUD) device that projects image light onto the windshield of an automobile to form a virtual image, displaying traffic information such as route and congestion information, and vehicle information such as the remaining fuel amount and the cooling water temperature.
Prior art literature
Patent literature
Patent document 1: Japanese Patent Application Laid-Open No. 2015-194707
Disclosure of Invention
For such an information display device, there is a demand for miniaturization, so that the HUD device main body can be placed between the steering wheel in front of the driver's seat and the windshield, and also for a so-called interactive function that allows dialogue-like operation while viewing the projected screen.
However, while the conventional technology above discloses techniques for miniaturization, it does not describe an interactive function. In particular, it discloses no technique for interactively operating on a screen projected into the space between the steering wheel in front of the driver's seat and the windshield, which is the specific problem for HUD devices.
An object of the present invention is to provide an information display device that projects information including images onto the windshield of a vehicle and realizes functions based on driver interaction, as well as a space sensing device for such an information display device.
To achieve this object, one example of the present invention is an information display device for displaying information on a vehicle, comprising: an image display unit that displays image information in an image display area in front of the driver's seat of the vehicle; and a space sensing unit that detects the position of an instruction given by the driver in the space region between the driver's seat and the displayed image display area, the information display device further including a unit that displays the driver's instruction in the front image display area in accordance with the position information detected by the space sensing unit.
In the present invention, the information display device may include a virtual image optical system that displays a virtual image in front of the vehicle by reflecting light emitted from the device at a windshield of the vehicle, or the information display device may include a real image optical system that displays a real image by scanning light emitted from the device over the windshield of the vehicle, or the information display device may include a direct-view type image display device that uses an instrument panel of the vehicle.
The present invention further provides a space sensing device for an information display device, in which a plurality of optical element pairs are arranged along a straight line, each pair comprising a light-emitting element that collimates the light beam from a light source, and a condensing lens element that receives the portion of that light beam reflected back by an obstacle.
According to the present invention, for an information display device that projects information including images onto the windshield of a vehicle, it is possible to provide a device with which a large number of display images, such as vehicle controls, control contents, and navigation information, can be selected and changed with little viewpoint movement and functions based on driver interaction can be realized, together with a space sensing device used in such an information display device.
Drawings
Fig. 1 is a schematic configuration diagram of an information display device having an interactive function and a peripheral device of the information display device according to an embodiment of the present invention.
Fig. 2 is a schematic cross-sectional configuration diagram showing the information display device, the windshield, and the viewpoint position of the driver in the embodiment.
Fig. 3 is a schematic explanatory diagram of an image display position in the embodiment.
Fig. 4 is a schematic explanatory diagram of another image display position in the embodiment.
Fig. 5 is a schematic diagram for explaining a structure for realizing the interactive function in the embodiment.
Fig. 6 is a schematic diagram for explaining an interactive function in the embodiment.
Fig. 7 is a first explanatory diagram illustrating the principle of the spatial sensing device.
Fig. 8 is a second explanatory diagram illustrating the principle of the spatial sensing device.
Fig. 9 is a diagram illustrating a difference in radius of curvature of the windshield in the embodiment.
Fig. 10 is a reflectance characteristic diagram of an incident angle to glass for different polarized light in the embodiment.
Fig. 11 is a top view of an automobile equipped with the information display device according to the embodiment.
Fig. 12 is a characteristic diagram showing reflection characteristics of a reflecting substance coated, adhered or bonded on a windshield in the embodiment.
Fig. 13 is a schematic configuration diagram of a virtual image optical system of the information display device in the embodiment.
Fig. 14 is a basic structural view of the projection optical device in the embodiment.
Fig. 15 is a schematic structural diagram of a 2-axis driven MEMS element in the embodiment.
Fig. 16 is an explanatory diagram illustrating an outline of beam scanning of the MEMS element in the embodiment.
Fig. 17 is an explanatory diagram of a 1 st scanning state of the laser light scanning on the free-form surface mirror in the embodiment.
Fig. 18 is a light source spectrum of the optical scanning device in the 1 st scanning state in the embodiment.
Fig. 19 is a diagram of the blackbody locus and isotemperature lines.
Fig. 20 is a diagram showing a chromaticity table of light source light of the light scanning device in the 1 st scanning state in the embodiment.
Fig. 21 is an explanatory diagram of a 2 nd scanning state of the laser light scanned on the free-form surface mirror in the embodiment.
Fig. 22 is a light source spectrum of the optical scanning device in the 2 nd scanning state in the embodiment.
Fig. 23 is a chromaticity table of light source light of the light scanning device in the 2 nd scanning state in the embodiment.
(symbol description)
1: a concave mirror; 1a: a concave mirror support; 2. 3: optical elements (lenses, lens groups); 4: an image display device; 6: a projected member (front glass); 8: eye boxes (eyes of observer, line of sight); 10: a backlight source; 20: a display surface; 41: an opening portion; 42: an instrument panel; 43: a steering wheel; 44: a front cover; 45: a vehicle body; 50: sun is carried out; 51. 52, 53: a space sensing device; 531: a conversion device; 60: a light source (first light source, specific light source); 61: a condenser optical element; 65: a condensing lens element; 64: a light receiving section; 80: a first component; 81: a second component; 82: a third component; 91: a scanning mirror; 100: HUD device; 101: an automobile; 201: a pixel; 202: scanning tracks of laser; 210: an observation camera; 220: projection optics; 301: laser of the 1 st scanning part; 302: a scanning track of the 1 st scanning part; 303: a laser beam of the 2 nd scanning unit; 304: a scanning track of the 2 nd scanning part; 1 (a), 2 (a), 3 (a), 1 (b), 2 (b), 3 (b): and an image display area.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
First, fig. 1 is a schematic configuration diagram of an information display device having an interactive function and a peripheral device of the information display device according to an embodiment of the present invention. Here, as an example thereof, an information display device that projects an image onto a windshield of an automobile will be specifically described. Fig. 1 (a) is a cross-sectional perspective view of the information display device, and fig. 1 (B) is a schematic block diagram of the peripheral device.
In fig. 1 (a), 45 denotes the vehicle body and 6 denotes the windshield serving as the projected member; the figure is a schematic longitudinal section of the vehicle body. The HUD device 100 displays various information as a virtual image by reflecting image light off the projected member 6 (in this embodiment, the inner surface of the windshield) so that the virtual image is formed ahead of the host vehicle in the driver's line of sight 8. The projected member 6 may be any member onto which information is projected: not only the windshield but also a separate combiner. In other words, the HUD device 100 of this embodiment forms a virtual image ahead of the host vehicle in the driver's line of sight 8 for the driver to see, and the information displayed as the virtual image includes, for example, vehicle information and foreground information captured by a camera (not shown) such as a monitoring camera or a panoramic camera.
The HUD device 100 is provided with an image display device 4 that projects image light for displaying information. Between the image display device 4 and the concave mirror 1, lenses 2 and 3 are provided as optical elements that correct the distortion and aberration arising when the concave mirror 1 forms a virtual image of the image displayed on the image display device 4.
The HUD device 100 includes the image display device 4 and a control device 40 that controls the backlight source 10. Light from the optical unit comprising the image display device 4, the backlight source 10, and the like is reflected by the concave mirror 1, whose concave shape forms the virtual image optical system described below. The light reflected by the concave mirror 1 is reflected again by the windshield 6 toward the driver's line of sight 8 (the eye box (Eyebox) being the range within which the driver can properly observe the virtual image).
The image display device 4 is, for example, an LCD (Liquid Crystal Display) with a backlight, or alternatively a self-luminous VFD (Vacuum Fluorescent Display).
On the other hand, instead of the image display device 4, an image may be displayed on a screen by a projector, and the image may be formed into a virtual image by the concave mirror 1, and reflected on the windshield 6 as a projected member to be directed toward the driver's line of sight 8. Such a screen may be constituted by a microlens array in which microlenses are arranged two-dimensionally, for example.
More specifically, to reduce distortion of the virtual image, the shape of the concave mirror 1 may be set as follows: in the upper part shown in fig. 1 (a) (the region that reflects light toward the lower part of the windshield 6, at a relatively short distance from the driver's viewpoint), the radius of curvature is made relatively small so that the magnification becomes large, while in the lower part (the region that reflects light toward the upper part of the windshield 6, at a relatively long distance from the driver's viewpoint), the radius of curvature is made relatively large so that the magnification becomes small. Further, by tilting the image display device 4 with respect to the optical axis of the concave mirror, the distortion caused by the difference in virtual image magnification can itself be reduced, allowing still better correction.
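The curvature-versus-magnification relation described above can be checked with the thin-mirror equations. The sketch below is illustrative only: the focal length f = R/2, the object (display) distance d, and all numeric values are assumptions, not figures taken from this patent.

```python
def virtual_image_magnification(radius_mm: float, object_dist_mm: float) -> float:
    """Lateral magnification of a concave mirror (thin-mirror model).

    For an object inside the focal length (d < f), the mirror forms an
    enlarged, upright virtual image with magnification m = f / (f - d).
    """
    f = radius_mm / 2.0  # focal length of a spherical mirror
    if object_dist_mm >= f:
        raise ValueError("object must sit inside the focal length for a virtual image")
    return f / (f - object_dist_mm)

# Smaller radius of curvature -> larger magnification (same object distance):
m_small_r = virtual_image_magnification(250.0, 100.0)  # 125 / 25 = 5.0
m_large_r = virtual_image_magnification(300.0, 100.0)  # 150 / 50 = 3.0
```

This matches the text: the upper mirror region (shorter path to the viewpoint) uses a smaller radius to enlarge more, and the lower region a larger radius to enlarge less.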
Next, in fig. 1 (B), the control device 40 acquires, as foreground information (i.e., information displayed in front of the host vehicle by the virtual image), various pieces of information such as a speed limit of a road corresponding to a current position where the host vehicle is traveling, the number of lanes, a planned travel route of the host vehicle set in the navigation system 60 ', and the like from the navigation system 60'.
The driving support ECU62 is a control device that controls the drive system and the control system in accordance with an obstacle detected as a result of monitoring in the surrounding area monitoring device 63 to realize driving support control, and examples of the driving support control include known technologies such as cruise control, adaptive cruise control, pre-crash safety, and lane keeping assist.
The surrounding area monitoring device 63 is a device for monitoring the surrounding area of the host vehicle, and is, for example, a camera for detecting an object existing in the surrounding area of the host vehicle from an image of the surrounding area of the host vehicle, a probe device for detecting an object existing in the surrounding area of the host vehicle from a result obtained by transmitting and receiving probe waves, and the like.
The control device 40 acquires information from the driving support ECU 62 (e.g., the distance to the preceding vehicle, the azimuth of the preceding vehicle, obstacles, locations where signs exist, etc.) as foreground information. In addition, an ignition (IG) signal and host-vehicle state information are input to the control device 40. The host-vehicle state information is acquired as vehicle information and includes, for example, the remaining fuel amount of the internal combustion engine, the cooling water temperature, and warning information indicating that the vehicle is in a predetermined abnormal state, as well as the operation state of the direction indicators, the running speed of the vehicle, and the shift position information.
The video signal from the control device 40 carries video information corresponding to the state of the automobile and the surrounding environment, and is selectively displayed, as appropriate, on the HUD device 100 as the 1st information display device, which superimposes a virtual image on the distant real scene observed by the observer, on the projection optical device 220 as the 2nd information display device, which superimposes a real image on the close scene, and on the dashboard 42 as the 3rd information display device, thereby reducing the viewpoint movement of the driver as observer while driving. The control device 40 is activated when the ignition signal is input. The above is the overall system configuration of the information display device in this embodiment.
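The three display roles above can be read as a routing decision made by the control device 40. The following dispatcher is hypothetical: the information categories and device name strings are assumptions for illustration, since the patent only defines which device overlays the distant versus close scene.

```python
# Hypothetical dispatcher illustrating the three display roles described above.
DISTANT_INFO = {"route_guidance", "speed_limit", "preceding_vehicle_distance"}
CLOSE_INFO = {"obstacle_warning", "lane_departure"}

def route_to_display(info_kind: str) -> str:
    """Pick the display device for one piece of information."""
    if info_kind in DISTANT_INFO:
        return "HUD_100"        # 1st device: virtual image over the distant scene
    if info_kind in CLOSE_INFO:
        return "projector_220"  # 2nd device: real image over the close scene
    return "dashboard_42"       # 3rd device: vehicle status and everything else
```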
Fig. 2 is a schematic cross-sectional configuration diagram showing the information display device, the windshield, and the viewpoint position of the driver in the present embodiment. Fig. 3 is a schematic explanatory diagram of the image display position in the present embodiment, and is a schematic view of the front windshield 6 from the driver's seat.
As shown in fig. 2 and 3, in the present embodiment, the image display area 1 (a) near the center of the front windshield 6 in front of the steering wheel 43, the image display area 2 (a) at the lower portion of the front windshield 6, and the image display area 3 (a) on the instrument panel 42 are provided.
The information display device according to the present embodiment uses the image display area 1 (a) (see fig. 2 and 3) near the center of the windshield 6 as a reflection surface by the HUD device 100, and provides a virtual image of 40 inches or more to the observer at a virtual image distance of 8m, and overlaps with the real scene observed by the observer while driving, thereby suppressing the movement of the viewpoint. The inventors measured the change in the viewpoint position of the driver during urban-area traveling, found that the viewpoint movement was suppressed by 90% when the maximum value of the virtual image distance was set to 30m by actual measurement, and found that the viewpoint movement was similarly suppressed when the virtual image distance was set to 70m or more during high-speed traveling by experiments. The size of the virtual image required at this time corresponds to 350 inches.
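The two operating points quoted above (a 40-inch virtual image at 8 m, and a 350-inch one at 70 m) are consistent with keeping the apparent angular size of the virtual image constant, since the required diagonal then scales linearly with distance. A quick numerical check:

```python
def required_diagonal_inches(base_diag_in: float, base_dist_m: float,
                             new_dist_m: float) -> float:
    """Diagonal that preserves the same angular size at a new virtual-image distance."""
    return base_diag_in * new_dist_m / base_dist_m

# 40-inch image at 8 m, scaled out to a 70 m virtual-image distance:
required_diagonal_inches(40.0, 8.0, 70.0)  # -> 350.0, matching the text
```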
As described above, for the region on which the observer's viewpoint rests in the distance, the HUD device 100 displays a virtual image in the image display area 1 (a), while for the image display area 2 (a), the projection optical device 220, which scans a light beam of a specific polarization with a MEMS (Micro Electro Mechanical Systems) element, projects a real image onto the lower portion of the windshield so that it is superimposed on the close view actually observed by the driver as observer. The MEMS-based image display is essentially focus-free, which makes projection onto the curved windshield 6 easy.
In addition, in the lower portion of the windshield onto which the image is projected, a material whose reflectance for the specific polarized light differs from that for the other polarization (described in detail later) is incorporated in, coated on, or adhered to the interior glass surface, so that the image light is efficiently reflected and the real image is directed to the observer. Since the HUD device 100 must bring its image to a focus as a virtual image far beyond the windshield, the horizontal display size of the image display area 1 (a) on the windshield 6 is smaller than the horizontal display size of the real-image display by the projection optical device 220.
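The polarization-dependent reflectance plotted in fig. 10 follows the standard Fresnel equations for an air-glass interface. The sketch below assumes a refractive index of 1.5 for the windshield glass (an assumed value, not one from the patent); it shows why s-polarized image light reflects far better than p-polarized light near the Brewster angle, which is why projecting a specific polarization pays off.

```python
import math

def fresnel_reflectance(theta_deg: float, n1: float = 1.0, n2: float = 1.5):
    """Power reflectance (Rs, Rp) for light hitting an n1 -> n2 interface."""
    ti = math.radians(theta_deg)
    tt = math.asin(n1 * math.sin(ti) / n2)  # Snell's law
    rs = (n1 * math.cos(ti) - n2 * math.cos(tt)) / (n1 * math.cos(ti) + n2 * math.cos(tt))
    rp = (n2 * math.cos(ti) - n1 * math.cos(tt)) / (n2 * math.cos(ti) + n1 * math.cos(tt))
    return rs ** 2, rp ** 2

# Brewster angle for n = 1.5 is about 56.3 deg: Rp vanishes while Rs stays ~15%.
brewster_deg = math.degrees(math.atan(1.5))
```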
Further, instead of dividing the image display regions as described above, it has been experimentally confirmed that, as shown in fig. 4, by displaying part or all of the distant image display area 1 (b), in which the virtual image is shown by the HUD device, superimposed on the image display area 2 (b), in which video is superimposed on the close view, a pseudo-three-dimensional display can be achieved. For an even better display, a good effect is obtained when the depth-direction display position of the virtual image and that of the real image partially overlap. Moreover, overlapping the two image display areas gives continuity to the displayed image and yields additional effects such as smooth viewpoint movement.
The display position, display color, and display interval of the virtual image projected near the center of the windshield are selected appropriately by using the observation camera 210, which monitors the observer's state: not only is information such as left and right turns, stopping, and acceleration displayed as images while the automobile is under automatic driving control, but the driver's health condition, drowsiness, and the like are also sensed, and warnings are presented as image displays. Further, rather than being shown at all times, the information is preferably displayed where needed, as needed, by following the driver's eye movement with the observation camera 210.
< Interactive function >
The inventors have confirmed that, in order to reduce viewpoint movement while still obtaining a large amount of information, combining the information display device with, for example, a space sensing device as a switch unit for selecting among multiple videos or video displays makes it possible to realize a display in which a large number of display images, such as vehicle controls, control contents, and navigation information, can be selected and changed with little viewpoint movement. In particular, it was confirmed that realizing a so-called interactive function, which allows dialogue-like operation while viewing the projected screen, is effective for making the information display device more convenient and easier to use. A structure for realizing this interactive function is described below.
< space sensing device >
Reference numeral 51 in fig. 1 (a) denotes a space sensing device corresponding to the instrument panel 42 (image display areas 3 (a) and 3 (b)). Arranged close to the display surface, it serves as a switch unit for selecting among multiple videos or video displays, so that a large number of display images, such as vehicle controls, control contents, and navigation information, can be selected with little viewpoint movement. Reference numerals 52 and 53 denote space sensing devices that, like the device 51, can sense position information in the depth direction from the steering wheel 43 toward the windshield 6 (corresponding to the image display areas 1 (a) and 2 (a) of fig. 3, or 1 (b) and 2 (b) of fig. 4).
The space sensing device 53, corresponding to the image display area 1 (a) or 1 (b) of figs. 3 and 4, is taken as an example. As shown in figs. 5 and 6, with this space sensor, a driver seated in the driver's seat and observing the information shown in the image display area can operate and move the indication means (pointer) on the displayed screen by freely moving a finger (or a rod-shaped object held in the hand: an "obstacle") within the space region indicated by the broken line A in the figures. The position of the pointer is calculated from the detected coordinate position of the finger via the conversion device 531 and the like, and is output to the HUD device 100 as pointer position information (see fig. 1 or 2).
Thus, even while driving, the driver can input position information for a desired point of the screen information, and thereby issue a desired instruction, by moving a finger or the like to the corresponding position within the space region indicated by the broken line A while observing the screen information displayed ahead. In other words, a so-called interactive function, operation in dialogue form while viewing the projected screen, can be realized.
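How the conversion device 531 maps a sensed finger coordinate to a pointer coordinate is not spelled out in the text. A minimal sketch, assuming a simple linear mapping from the sensed region (broken line A) to the image display area; the region and display dimensions are hypothetical:

```python
def finger_to_pointer(finger_x: float, finger_y: float,
                      sense_w: float, sense_h: float,
                      disp_w: int, disp_h: int) -> tuple[int, int]:
    """Linearly map a sensed finger position onto display pixel coordinates.

    Positions outside the sensed region are clamped to its edges.
    """
    fx = min(max(finger_x, 0.0), sense_w)
    fy = min(max(finger_y, 0.0), sense_h)
    return round(fx / sense_w * (disp_w - 1)), round(fy / sense_h * (disp_h - 1))
```

A finger at the center of a 300 mm x 100 mm sensed region would then land at the center pixel of the display area.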
Next, fig. 7 shows the basic principle of the structure and operation of the space sensing device of the present application. When the finger of a user (e.g., the driver) moves from left to right in the figure, the light beam emitted from the first light source 60 is first collimated into a substantially parallel beam by the condenser optical element 61 and is reflected by the moving finger; the reflected beam is condensed by the condensing lens element 65 and reaches the first light receiving unit 64.
At this time, distance information to the finger in the Y-axis direction is obtained from the time difference Δt1 between the emission time of the first light source 60 and the reception time at the first light receiving unit 64, while position information in the X-axis direction is obtained from the emission time of the first light source 60 together with the absolute position coordinates of the first light receiving unit 64. Next, as shown in the figure, when the finger moves further from left to right, the light beam from the third light source 68 is likewise collimated into a substantially parallel beam by the condenser optical element 69, reflected by the moving finger, condensed by the condensing lens element 73, and received by the third light receiving unit 72.
At this time, the distance to the finger in the Y-axis direction is obtained from the time difference Δt2 between the light emission time of the third light source 68 and the light receiving time of the third light receiving unit 72, and the position in the X-axis direction is obtained from the absolute position coordinates of the third light source 68 and the third light receiving unit 72.
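The time-of-flight relation just described (Δt gives the Y distance; the known coordinates of the emitter/receiver pair that detected the reflection give X) can be sketched numerically as follows. The timing and coordinate values are invented for illustration; the patent gives no concrete numbers.

```python
# Hypothetical sketch of the X-Y sensing of fig. 7: Y from round-trip
# time of flight, X from the known position of the detecting pair.
C = 299_792_458.0  # speed of light [m/s]

def y_distance_m(emit_time_s, receive_time_s):
    """Distance to the finger along Y: the light travels out and back,
    so the one-way distance is c * delta_t / 2."""
    return C * (receive_time_s - emit_time_s) / 2.0

def finger_xy(pair_x_m, emit_time_s, receive_time_s):
    """(x, y): x is the absolute X coordinate of the source/receiver
    pair that saw the reflection, y comes from the time difference."""
    return pair_x_m, y_distance_m(emit_time_s, receive_time_s)

# a 2 ns round trip corresponds to roughly 0.3 m
x, y = finger_xy(0.05, 0.0, 2.0e-9)
```

Note that at the few-centimetre ranges involved here the round-trip times are tens of picoseconds, so a practical sensor would measure them with dedicated TOF circuitry rather than this naive arithmetic; the sketch only shows the relation itself.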
Next, a method of acquiring position information in the Z-axis direction will be described with reference to fig. 8. A plurality of the sensing members described in fig. 7 for sensing position in the X-axis and Y-axis directions (shown as 80, 81, 82 in the figure) are arranged in a row along the Z-axis direction (the depth direction from the steering wheel 43 toward the windshield 6). When the finger of the user (e.g., the driver) moves from left to right in the drawing relative to the sensing device 51, the light beam from a given light source 60 of the first member 80 is first collimated into a substantially parallel beam by the condenser optical element 61, is reflected by the moving finger to become a reflected beam, is condensed by a condenser lens element (not shown), and reaches a light receiving unit (not shown). At this time, the time at which the finger passes the first member 80 (absolute time T1) and its XY coordinates become known through the operation of the members described above. If the user's finger moves further from left to right, the time at which it passes the second member 81 (absolute time T2) and the XY coordinates at that time likewise become known. In addition, since the beam of the third member 82 is not blocked by the user's finger, the position of the finger in the Z-axis direction can be specified as shown in fig. 8.
Further, in the embodiment of the present invention, the movement speed and acceleration of the finger along the spatial Z axis can be calculated from the time difference between absolute times T1 and T2 together with the sensing output of the second member 81 (the XY coordinates and absolute time at which the reflected light from the finger reaches the light receiving portion). Similarly, the movement direction and acceleration of the finger are calculated from the sensed information of the first member 80 and the second member 81, so that not only simple position information but also the user's intention (for example, urgency when the acceleration is high) can be reflected in the amount, speed, position, and so on of the information displayed by the system.
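A minimal sketch of how speed and acceleration could follow from the crossing times of the sensing planes (T1, T2, and a third crossing): the plane spacings and times below are made-up illustration values, not taken from the patent.

```python
def z_speed(z1_m, t1_s, z2_m, t2_s):
    """Mean Z-direction speed between two sensing-plane crossings
    (e.g. members 80 and 81 at absolute times T1 and T2)."""
    return (z2_m - z1_m) / (t2_s - t1_s)

def z_acceleration(zs_m, ts_s):
    """Acceleration from three successive crossings: difference of the
    two mean speeds over the interval between the midpoints of the two
    crossing intervals, which is (t3 - t1) / 2."""
    v01 = z_speed(zs_m[0], ts_s[0], zs_m[1], ts_s[1])
    v12 = z_speed(zs_m[1], ts_s[1], zs_m[2], ts_s[2])
    return (v12 - v01) / ((ts_s[2] - ts_s[0]) / 2.0)

v = z_speed(0.00, 0.00, 0.02, 0.10)  # planes 20 mm apart, crossed 0.1 s apart
a = z_acceleration([0.00, 0.02, 0.05], [0.0, 0.10, 0.20])
```

A rising `a` between successive plane crossings is what the text interprets as urgency in the user's gesture.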
The structure of such a spatial sensing device is disclosed in, for example, Japanese patent publication No. 2012-518228. A device of this kind is also marketed under the name "AirBar (registered trademark of Neonode Inc.)" and is known as a structure that makes a PC touch-panel capable simply by being placed on the PC. Since this spatial sensor has a rod-like shape, it can easily be disposed at a desired position even on the dashboard or instrument panel of a cramped vehicle interior.
In the above description, the spatial sensing device 53 corresponding to the image display area 1 (a) of fig. 3 or 1 (b) of fig. 4 was mainly described in detail. However, the same applies to the image display region 2 (a) of fig. 3 or 2 (b) of fig. 4, and further to the image display region 3 (a) of fig. 3 or 3 (b) of fig. 4. It will be clear to a person skilled in the art that the so-called interactive function described above can likewise be realized for these regions: even while driving, the driver can move a finger or the like to a desired position of the screen information within the spatial region indicated by the broken line A while viewing the screen information displayed in front, input the position information of that position, and issue instructions, that is, operate as if in a dialogue while viewing the projected screen.
< other features of the present embodiment >
As shown in fig. 9, the front windshield 6 of a passenger car has different radii of curvature in the vertical direction (Rv) and the horizontal direction (Rh), generally with the relationship Rh > Rv. Therefore, as shown in fig. 7, when the windshield 6 is treated as a reflecting surface, it behaves as a toroidal concave mirror. Accordingly, in the HUD device 100 of the present embodiment, the concave mirror 1 may be given different average radii of curvature in the horizontal and vertical directions so as to correct the virtual image magnification according to the shape of the front windshield 6, that is, to correct the difference between its vertical and horizontal radii of curvature. In this case, if the shape of the concave mirror 1 is a sphere or an aspherical surface rotationally symmetric about the optical axis, expressed as a function of the distance h from the optical axis (formula (2) below), the horizontal and vertical cross-sectional shapes at off-axis points cannot be controlled independently; it is therefore preferable to perform the correction with a free-form surface expressed as a function of the coordinates (x, y) measured from the optical axis of the mirror surface, as in formula (1) below.
[ 1]

z = (c × r²) / (1 + √(1 − (1 + K) × c² × r²)) + Σj Cj × x^m × y^n, where r² = x² + y² … (1)

[ 2]

z = (c × h²) / (1 + √(1 − (1 + K) × c² × h²)) … (2)
Here, z is the sag at coordinates (x, y) with respect to the axis of the defining surface, c is the curvature at the origin on that axis, K is the conic constant, and Cj are the coefficients of the polynomial terms.
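The two surface descriptions can be sketched numerically as follows: formula (2) as a rotationally symmetric conic sag in h, and formula (1) as the same conic base over r² = x² + y² plus an XY polynomial with coefficients Cj. The particular (m, n) exponents and coefficient values below are illustrative, not the patent's actual surface prescription.

```python
import math

def aspheric_sag(h, c, K):
    """Formula (2) sketch: sag of a conic surface, rotationally
    symmetric about the optical axis, at distance h from the axis."""
    return c * h * h / (1.0 + math.sqrt(1.0 - (1.0 + K) * c * c * h * h))

def freeform_sag(x, y, c, K, C):
    """Formula (1) sketch: conic base over r^2 = x^2 + y^2 plus an XY
    polynomial sum(Cj * x**m * y**n); C maps (m, n) -> Cj."""
    base = aspheric_sag(math.hypot(x, y), c, K)
    return base + sum(cj * x ** m * y ** n for (m, n), cj in C.items())

# a sphere of radius 1 m (c = 1, K = 0): near the axis, sag ~ c*h^2/2
s = aspheric_sag(0.01, 1.0, 0.0)
# different x^2 and y^2 coefficients model the Rh != Rv correction,
# which the rotationally symmetric form cannot express
z = freeform_sag(0.01, 0.01, 1.0, 0.0, {(2, 0): 0.1, (0, 2): 0.3})
```

The point of the free-form surface is visible in the last line: the (2, 0) and (0, 2) coefficients give the mirror independently controllable horizontal and vertical curvature contributions.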
Returning to fig. 1, between the image display device 4 and the concave mirror 1, the lens 2 and the lens 3, for example, are disposed as transmissive optical members. By controlling the direction of the light rays emitted toward the concave mirror 1, distortion aberration is corrected in accordance with the shape of the concave mirror 1, and at the same time aberrations of the virtual image, including the astigmatism caused by the difference between the horizontal and vertical radii of curvature of the windshield 6, are corrected.
On the other hand, as shown in figs. 1 and 2, most of the S-polarized component of sunlight 50 is reflected by the windshield 6, so that most of the sunlight entering the vehicle is P-polarized. Therefore, the projection optical device 220, which scans an S-polarized light beam by making it incident on a MEMS element, is used to project an image onto the lower portion of the windshield so as to superimpose it on the close-range view. S-polarized light is used for the video display because the inclination angle of the windshield 6 is as large as 40 degrees or more, and at such angles the reflectance for S-polarized light is high, as shown in fig. 10.
A further reason is that, in the front windshield 6 of an automobile, the horizontal radius of curvature Rh and the vertical radius of curvature Rv differ as shown in fig. 9, and the position of the driver as observer (the position of the steering wheel 43 in fig. 11) is offset from the horizontal center of curvature, so that the center of the image differs as shown in fig. 11.
In contrast, in the projection optical device 220, an image is projected onto the windshield by scanning a laser light source vertically and horizontally by means of the MEMS. In the image display region 2 (a) in the lower portion of the windshield shown in figs. 2 and 3, a member whose reflectance for the S-polarized wave differs from that for the P-polarized wave is incorporated in, coated on, adhered to, or bonded to the cabin-side glass surface, so that the image light is efficiently reflected and the real image is directed toward the viewer. Specifically, as shown in fig. 12, the reflectance of the S-wave laser light over the visible range (380 nm to 780 nm) is preferably on average between about 10% (characteristic (1)) and about 20% (characteristic (2)), and the reflecting surface on the cabin side of the windshield reflects the image light toward the driver as observer.
More specifically, a sheet bearing an optical multilayer film with the above characteristics may be laminated, a plurality of sheets having different refractive indices may be laminated to the same effect, or a concave-convex shape may be provided on the surface of the sheet so that the diffusion characteristics of the windshield are larger in the horizontal direction than in the vertical direction.
Furthermore, the reflectance of the sheet is set high in the ultraviolet region (below 380 nm) and the near-infrared region (above 780 nm), so that the entry of ultraviolet and near-infrared rays into the vehicle is suppressed, realizing a more comfortable environment.
As described above, the present embodiment is an information display device for displaying information on a vehicle and, in addition to the features described above, includes: a 1st information display device that reflects image information off the front windshield of the vehicle to display a virtual image; a 2nd information display device that scans a laser beam on the front windshield using a MEMS element to obtain a real image; and a 3rd information display device using the instrument panel of the vehicle. The 1st information display device includes a virtual image optical system that displays a virtual image in front of the vehicle by reflecting, off the windshield, light emitted from an image display device that displays image information; the 2nd information display device includes a real image optical system that displays a real image on the windshield by scanning laser light with a scanning mirror element; and the 3rd information display device includes a direct-view image display device serving as the instrument panel. The image display position of the 1st information display device is near the center of the windshield, and that of the 2nd information display device is in the lower portion of the windshield.
Thus, by combining the HUD device in which virtual images are superimposed on a distant view, the real image display device in which real images superimposed on a close view are displayed, and the display of the instrument panel, it is possible to provide an information display device that reduces the movement of the viewpoint of the driver and contributes to support of safe driving.
Further, a more specific optical configuration of the HUD device having the virtual image optical system of the information display device will be described below.
Fig. 13 is an overall configuration diagram of the HUD device 100 in the present embodiment. In fig. 13, arranged in order from the downstream side are a concave (free-form surface) mirror 1 that projects the image light forming a virtual image via the windshield 6, a lens group 2 for correcting the distortion and aberration generated at this time, the image display device 4, and a backlight source 10 constituting the backlight. Reference numeral 7 denotes the frame. Further, as one example, an optical element 3 for suppressing the P-wave component of sunlight entering the interior of the HUD device 100 is provided between the lens group 2 and the image display device 4.
First, in the present embodiment, the concave (free-form surface) mirror 1 projecting the image light preferably has a function of reflecting visible light (wavelength approximately 400 to 700 nm) while removing, from sunlight containing a broad spectrum, components such as infrared (IR) and ultraviolet (UV) light that are unnecessary for the information display device and can damage it. In this case, if the visible-light reflectance is 95% or more, a virtual image optical system with high light use efficiency can be realized.
However, when the concave (free-form surface) mirror 1 is directly visible through the windshield, glare from reflected ambient light degrades the perceived quality of the automobile; moreover, strong light such as sunlight or the headlights of oncoming vehicles at night may be reflected by the concave mirror 1 so that part of it returns to the liquid crystal panel, degrading the image quality of the displayed virtual image (e.g., its contrast performance) and damaging the polarizing plate and liquid crystal panel constituting the image display device 4. These problems can be solved by intentionally reducing the reflectance of the concave (free-form surface) mirror 1 to 90% or less, preferably 85% or less.
The concave mirror base 1a, which is the substrate of the concave (free-form surface) mirror 1, is chosen to be highly transparent so that the substrate does not absorb the wavelength components of sunlight that are not reflected. Candidate plastic substrates include (1) ZEONEX (Zeon Corporation, Japan), (2) polycarbonate, and (3) acrylic. ZEONEX, with a water absorption of approximately 0% and a high heat distortion temperature, is optimal, but since it is expensive, polycarbonate, which has a comparable heat distortion temperature and a water absorption of about 0.2%, may be used instead. Acrylic, the most moldable and inexpensive of the three, has the highest water absorption, so it must be provided with a moisture-proof film in addition to the reflective film.
In order to prevent moisture absorption by the substrate of the concave (free-form surface) mirror 1, a moisture-proof film of SiN (silicon nitride) may be formed on the surface opposite the reflective film formed on the reflective surface. Since the SiN moisture-proof film transmits sunlight, thermal deformation can be suppressed without light absorption in the substrate. As a result, shape changes due to moisture absorption can be prevented even in a concave (free-form surface) mirror formed of polycarbonate or acrylic.
Further, although not shown here, a light-transmitting plate having the function of removing IR and UV may be provided in addition to or instead of the concave (free-form surface) mirror 1 having the function of suppressing and removing IR and UV, at the opening 41 formed in the upper portion of the HUD device 100. In this case, in addition to the IR and UV suppressing function, external dust can be prevented from entering the HUD device 100.
In this way, according to the concave mirror 1, the components unnecessary for the HUD device can be removed from the sunlight containing a large amount of spectral components entering the HUD device 100 from the opening 41, and the visible light component can be mainly selectively extracted.
On the other hand, one known factor degrading the image quality of a HUD device is that image light emitted from the image display device 4 toward the concave mirror 1 is reflected at the surface of the intervening optical element 2, returns to the image display device, is reflected again, and is superimposed on the original image light. Therefore, in the present embodiment, in addition to forming an anti-reflection film on the surface of the optical element 2 to suppress reflection, it is preferable to design the lens surface shape of one or both of the light-entrance and light-exit surfaces of the optical element 2 under the constraint that the reflected light is prevented from being strongly condensed onto any part of the image display device 4.
Next, if a liquid crystal panel provided with a polarizing plate that absorbs the light reflected back from the optical element 2 is used as the image display device 4, the degradation in image quality can be reduced. The backlight source 10 of the liquid crystal panel is controlled so that its light efficiently enters the entrance pupil of the concave mirror 1, i.e., the direction of incidence of the light entering the image display device 4 is controlled. As the light source, a long-life solid-state source may be used; an LED (Light Emitting Diode), whose light output varies little with ambient-temperature fluctuations, is preferably provided with an optical unit that reduces the divergence angle of the light and combined with polarization conversion using a PBS (Polarizing Beam Splitter).
Polarizing plates are disposed on the backlight source 10 side (light incident surface) and the optical element 2 side (light emitting surface) of the liquid crystal panel to increase the contrast ratio of the image light. If an iodine-based polarizing plate having a high degree of polarization is used as the polarizing plate provided on the backlight 10 side (light incident surface), a high contrast ratio can be obtained. On the other hand, by using a dye-based polarizing plate on the optical element 2 side (light emitting surface), high reliability can be obtained even when the ambient temperature is high and the disturbance light is incident.
When a liquid crystal panel is used as the image display device 4, a specific polarization is blocked when the driver wears polarized sunglasses, causing the image to fail to appear. To prevent this, it is preferable to dispose a λ/4 plate on the optical-element side of the polarizing plate on the optical element 2 side of the liquid crystal panel, converting the image light, which is aligned in a specific polarization direction, into circularly polarized light.
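The effect of the λ/4 plate can be checked with a small Jones-calculus sketch (standard polarization optics, not code from the patent): linearly polarized image light passed through a quarter-wave plate with its fast axis at 45° emerges with equal orthogonal amplitudes and a 90° phase difference, i.e., circularly polarized, so roughly half of it passes polarized sunglasses at any orientation.

```python
import cmath, math

def qwp_jones(theta_rad):
    """Jones matrix of a quarter-wave plate with fast axis at theta:
    R(theta) . diag(1, i) . R(-theta)."""
    c, s = math.cos(theta_rad), math.sin(theta_rad)
    return [[c * c + 1j * s * s, (1 - 1j) * s * c],
            [(1 - 1j) * s * c, s * s + 1j * c * c]]

def apply(m, v):
    """Multiply a 2x2 Jones matrix by a 2-component Jones vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# horizontal linear polarization in, quarter-wave plate at 45 degrees
out = apply(qwp_jones(math.pi / 4.0), [1.0 + 0j, 0.0 + 0j])
amp_x, amp_y = abs(out[0]), abs(out[1])
phase_diff = cmath.phase(out[0]) - cmath.phase(out[1])
```

Equal amplitudes with a quarter-cycle phase offset are exactly the circular polarization the text relies on.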
Further, a more specific optical configuration of a projection optical device having the real image optical system of the information display device will be described.
Fig. 14 is a basic configuration diagram of the projection optical device 220, which obtains a real image by scanning a laser beam with a MEMS, in the present embodiment. In fig. 14, the projection optical device 220 is a scanning-type image display device: it mounts an optical scanning device that scans, in two dimensions, a laser beam whose light intensity has been modulated according to the image signal, and draws an image by scanning the laser beam over an object (e.g., the windshield). That is, the laser light from the light source units 94 (94a, 94b) is reflected by the scanning mirror 91, which has rotation axes, whereby the laser light can be scanned. Conceptually, each modulated pixel 201 is scanned two-dimensionally over the image plane along the scanning trajectory 202 on the display surface 20.
The two-dimensional deflection action in the scanning mirror 91 in the present embodiment will be described in detail below.
Fig. 15 is a schematic configuration diagram of the scanning mirror 91, a 2-axis-driven MEMS element, in the present embodiment. In the figure, the scanning mirror surface 91a, which deflects the laser light by reflection, is connected to a pair of 1st torsion springs 91b disposed opposite each other on an axis passing through the scanning mirror surface 91a. The torsion springs 91b are coupled to a holding member 91c, which in turn is coupled to 2nd torsion springs 91d. The 2nd torsion springs 91d are coupled to the frame 91e. Although not shown, permanent magnets and coils are disposed at positions substantially symmetrical with respect to the torsion springs 91b and 91d. The coil is formed substantially parallel to the scanning mirror surface 91a and generates a magnetic field substantially parallel to the scanning mirror surface 91a when the mirror surface is at rest. When a current flows through the coil, a Lorentz force substantially perpendicular to the scanning mirror surface 91a arises according to Fleming's left-hand rule.
The scanning mirror surface 91a rotates to the position where this Lorentz force balances the restoring force of the torsion springs 91b and 91d. When an alternating current at the resonance frequency of the scanning mirror surface 91a about the torsion springs 91b is supplied to the coil, the scanning mirror surface 91a resonates. Similarly, when the coil is supplied with an alternating current at the resonance frequency of the combined scanning mirror surface 91a and holding member 91c about the torsion springs 91d, the scanning mirror surface 91a, torsion springs 91b, and holding member 91c resonate together. Thus, resonant operation at different resonance frequencies can be realized for the 2 axes.
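The two distinct resonance frequencies follow from the usual torsion-oscillator relation f = (1/2π)·√(k/J): the inner axis (mirror alone, small inertia) resonates fast, the outer axis (mirror plus holding member, larger inertia) slow. The stiffness and inertia values below are invented for illustration; the patent gives none.

```python
import math

def torsion_resonance_hz(k_nm_per_rad, j_kg_m2):
    """Resonance frequency of a rigid body of moment of inertia J on a
    torsion spring of stiffness k: f = sqrt(k / J) / (2 * pi)."""
    return math.sqrt(k_nm_per_rad / j_kg_m2) / (2.0 * math.pi)

# inner axis: mirror alone (small inertia) -> fast horizontal scan
f_fast = torsion_resonance_hz(2.0e-3, 2.0e-12)
# outer axis: mirror + holding member (larger inertia) -> slow vertical scan
f_slow = torsion_resonance_hz(2.0e-3, 2.0e-9)
```

With these assumed values the two axes land roughly in the kHz and hundred-Hz ranges, the kind of separation a raster scan needs between its line and frame motions.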
In fig. 16, if the rotation angle of the scanning mirror 91, the reflecting surface of the light scanning section, is β/2, the scanning angle, i.e., the angle of the reflected ray, changes by twice that, namely β. With no optical element disposed between the scanning mirror 91 and the display surface 20, the scanning angle β equals the incident angle α at the display surface 20. Therefore, the size of the scanned image at a given projection distance is determined by the rotation angle β/2 of the scanning mirror 91. In order to obtain a large screen at a short distance, in the present embodiment the amplitude is increased by providing an optical system (a concave or convex lens, not shown) between the scanning mirror 91 shown in fig. 14 and the windshield serving as the projection surface.
In this embodiment, in order to superimpose an image on the close-range view seen by the observer, the distance from the observer to the image is short, and therefore the image display area must be larger in the horizontal direction than in the vertical direction. The inventors therefore measured the distance between the driver as observer and the lower portion of the windshield to be 1.2 m and determined the optimum image display width. It was found that, in order to display the left and right turns of the vehicle with an arrow together with the rotation angle of the steering wheel, the horizontal display range must be 30 inches or more, and if a display exceeding 40 inches is possible, even better video display results.
On the other hand, it was found that a display of 10 inches in the vertical direction gives a clear display. It was further confirmed that, to improve visibility, it is desirable to expand the vertical display range to about 20 inches; however, since increasing the vertical amplitude when driving the MEMS requires reducing the horizontal amplitude, a practically sufficient image is obtained with an upper limit of 15 inches.
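The geometry behind these display-size figures can be sketched from the angle-doubling rule of fig. 16 and the 1.2 m viewing distance (the mirror angles themselves are my own arithmetic, not stated in the patent): a mechanical mirror swing of ±β/2 gives an optical swing of ±β, so a horizontal width W at distance d needs tan β = W/(2d).

```python
import math

def scan_width_m(mech_half_angle_deg, distance_m):
    """Full width of the scanned line: a mechanical rotation of +/- beta/2
    doubles to an optical deflection of +/- beta."""
    beta = math.radians(2.0 * mech_half_angle_deg)
    return 2.0 * distance_m * math.tan(beta)

def mech_half_angle_deg(width_m, distance_m):
    """Mechanical half-angle needed for a given width: invert the above."""
    return math.degrees(math.atan(width_m / (2.0 * distance_m))) / 2.0

# a 30-inch-wide display (0.762 m) at the 1.2 m distance measured above
need = mech_half_angle_deg(0.762, 1.2)
w = scan_width_m(need, 1.2)
```

Under this model the 30-inch horizontal target at 1.2 m corresponds to a mechanical half-angle of under 10 degrees, which is why the text resorts to an amplifying lens rather than a larger mirror swing for bigger displays.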
Next, a 1 st scanning state of the laser light scanning on the image plane in the present embodiment will be described.
Fig. 17 shows the 1st scanning state of the laser light from the optical scanning section in the present embodiment. As described above, in the scanning range (amplitude) of the optical scanning unit in the present embodiment, the horizontal amplitude angle is set to 2 times or more the vertical amplitude angle, enlarging the horizontal image display range relative to the vertical. The laser beam 301 on the windshield, one pixel in size, is scanned from left to right in the horizontal direction, then shifted down by one pixel and scanned from right to left. Reference numeral 302 denotes the scanning trajectory of the 1st scanning section. The frame rate of image switching may be 60 Hz when the vehicle travels at 40 km/h, but may be raised to 120 Hz at 100 km/h; the rewriting speed of the display image can thus be increased with the running speed of the vehicle to realize optimal display.
At this time, in the optical scanning section of the present embodiment, as shown in expression (3), the product of the frame frequency F, the horizontal deflection frequency fh, and the vertical deflection frequency fv is approximately a fixed value A. Accordingly, when the frame rate is changed using the vehicle running-speed information from the driving support ECU 62 shown in fig. 1, the horizontal deflection frequency decreases per expression (3), and the deflection angle decreases proportionally with it.
[ 3]
A = F × (fh × fv) … (3)
As a result, the horizontal dimension of the video display range becomes smaller; but since the driver's visual field narrows as the traveling speed increases, an information display device that causes no discomfort in use is obtained.
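Expression (3) can be exercised numerically; the baseline frequencies below are assumptions for illustration (the patent specifies none). With A held fixed, doubling the frame frequency from 60 Hz to 120 Hz halves the horizontal deflection frequency, and with it, per the text, the deflection angle and horizontal display size.

```python
def horizontal_deflection_hz(a_const, frame_hz, fv_hz):
    """Solve expression (3), A = F * (fh * fv), for fh."""
    return a_const / (frame_hz * fv_hz)

# assumed baseline: F = 60 Hz, fh = 18 kHz, fv = 60 Hz
A = 60.0 * 18_000.0 * 60.0

fh_40kmh = horizontal_deflection_hz(A, 60.0, 60.0)    # low-speed frame rate
fh_100kmh = horizontal_deflection_hz(A, 120.0, 60.0)  # high-speed frame rate
```

The halving of fh at the higher frame rate is the mechanism by which the display narrows as the vehicle speeds up, which the text argues matches the driver's narrowing visual field.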
In the 1st scanning state of the present embodiment, monochromatic laser light of 3 colors, red (635 nm), green (532 nm), and blue (460 nm), shown in fig. 18 is used. Fig. 20 shows the chromaticities of the individual monochromatic lights and of their combination converted into coordinates on the chromaticity diagram of fig. 19; since the color purity of the monochromatic light is excellent, sufficient brightness can be obtained while covering the display color range of the NTSC system.
Further, when other colors are mixed in during the emission of each single color, for example, when blue laser light is emitted at 100%, green at 10% of its maximum emission, and red at 5% of its maximum emission, the mixed light has a color equivalent to blue and a brightness of 2 times or more. It was thus also found that, in the scanning section according to this aspect of the application, the brightness of pseudo-monochromatic light can be further improved by mixing in laser light of other colors instead of using pure laser monochromatic light.
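This brightness claim can be illustrated with a simple additive-luminance model; the per-colour maximum-luminance ratios below are assumptions of mine (green lasers appear far brighter per unit power than blue at these wavelengths), not values from the patent.

```python
# assumed maximum luminances per colour, arbitrary units (not patent data)
MAX_LUMINANCE = {"red": 60.0, "green": 200.0, "blue": 15.0}

def mixed_luminance(drive_fractions):
    """Total luminance for per-colour drive fractions (0..1 of maximum
    emission), assuming luminances simply add."""
    return sum(MAX_LUMINANCE[c] * f for c, f in drive_fractions.items())

blue_alone = mixed_luminance({"blue": 1.0})
# blue 100%, green 10% of max, red 5% of max, as in the text
boosted = mixed_luminance({"blue": 1.0, "green": 0.10, "red": 0.05})
```

Under these assumed ratios, even a small admixture of the highly luminous green more than doubles the luminance while the chromaticity stays close to blue, which is the trade the text describes.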
Next, a 2 nd scanning state of the laser light scanning on the image plane in the present embodiment will be described.
Fig. 21 shows the 2nd scanning state of the laser light from the optical scanning section in the present embodiment. The difference from the 1st scanning state is that there are 2 optical scanning units, the 1st and 2nd scanning units in fig. 21. For the scanning range (amplitude) of the 1st scanning unit, the horizontal amplitude angle is set to 2 times or more the vertical amplitude angle, enlarging the horizontal image display range relative to the vertical. The laser beam 301 on the windshield, one pixel in size, is scanned in the horizontal direction along the trajectory shown by the solid line in fig. 21 from left to right, then shifted by one pixel and scanned from right to left. Reference numeral 302 denotes the scanning trajectory of the 1st scanning unit.
On the other hand, for the scanning range (amplitude) of the 2nd scanning unit, the horizontal amplitude angle is likewise set to 2 times or more the vertical amplitude angle, enlarging the horizontal image display range relative to the vertical. The laser beam 303 on the windshield, one pixel in size, is scanned horizontally along the trajectory shown by the broken line in fig. 21 from right to left, then shifted by one pixel and scanned from left to right. (Fig. 21 shows the state where the laser 303 has reached the final pixel of the lowest row.) The 2nd scanning unit may scan from top to bottom or from bottom to top. Reference numeral 304 denotes the scanning trajectory of the 2nd scanning unit. At this time, the display of the 2nd scanning unit is shifted by approximately 1/2 frame relative to the frame image displayed by the 1st scanning unit, displaying the video of the next frame.
As a result, it was confirmed that the frame rate can effectively be doubled. Further, in the 1st scanning section in the 2nd scanning state, monochromatic laser light of 3 colors, red (635 nm), green (532 nm), and blue (460 nm), shown in fig. 22 was used. In the 2nd scanning unit, monochromatic laser light of 3 colors, red (645 nm), green (515 nm), and blue (450 nm), shown in fig. 22 is used, so that flare can be reduced. As shown in fig. 23, the chromaticities of the monochromatic lights and of their combination are excellent: the color purity of the monochromatic laser sources constituting the 2 scanning units is high, and sufficient brightness can be obtained while covering the display color range of the NTSC system.
Further, when the 1st scanning unit (hereinafter (1)) and the 2nd scanning unit (hereinafter (2)) mix other colors into each single-color emission, for example, when the blue laser light of both (1) and (2) is emitted at 100%, green (1) at 5% of maximum emission, green (2) at 10% of maximum emission, and red (1) at 5% of maximum emission, the mixed light has a color equivalent to blue and a brightness of 2 times or more.
As described above, according to the present embodiment, it was found that even when a plurality of scanning units are used in a superimposed manner, the brightness of pseudo-monochromatic light can be further improved by mixing in laser light of other colors instead of pure monochromatic laser light. The present embodiment describes the effect of using 2 scanning units simultaneously, but naturally, if 3 or more scanning units are used simultaneously, the frame rate can be raised correspondingly in a pseudo manner, and by assigning laser light of different wavelengths to each scanning unit and superimposing them, a significant reduction in flare noise can be achieved. As for brightness, the chromaticity of a single color can be boosted without loss, as described above.
Next, a more specific configuration of the display of the instrument panel of the information display device will be described.
Since the instrument panel 42 shown in fig. 1 is disposed within the inner diameter of the steering wheel 43, the viewpoint movement required of the driver as observer to see the image displayed on it is the largest. Therefore, except when the automobile is operating in the automatic driving mode, information of low urgency is displayed there. By sensing the driver's viewpoint with the observation camera 210 and changing the displayed image accordingly, a large amount of video information can be presented to the driver effectively.
A liquid crystal panel is used as the instrument panel in order to reduce the thickness of the device; it may be curved to match the interior trim of the automobile. Further, by raising the display rate to 120 Hz (2 times the frame frequency of 60 Hz) or 240 Hz (4 times), the display content can be switched at high speed, making it possible to display video information from an observation camera outside the vehicle in real time.
The information display device described above has 3 information display positions shown in fig. 3: image display area 1 (a), image display area 2 (a), and image display area 3 (a). A sensor that observes the movement of the viewpoint of the driver as the observer, for example the observation camera 210 shown in fig. 1 and 2, is used, and the images displayed at the 3 information display positions are combined at the optimal position, timing, and display content according to the movement of the observer's viewpoint and the speed of the vehicle, so that an information display device effective for safe-driving support can be provided. For example, control is performed so that the information display position is shifted toward the turning direction according to the observer's viewpoint movement when turning.
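The viewpoint- and speed-dependent positioning described above could be sketched as a simple selection policy. This is an illustrative toy, not the patent's actual control logic: the function name, the gaze and speed thresholds, and the steering bias are all invented for the example; only the area names follow fig. 3.

```python
def choose_display_area(gaze_deg, speed_kmh, steer_deg=0.0):
    """Pick one of the three image display areas of fig. 3.

    gaze_deg:  observer's horizontal gaze angle (negative = left).
    steer_deg: steering direction, used to bias the chosen position
               toward the turning direction, as described in the text.
    All thresholds below are hypothetical values for illustration.
    """
    effective = gaze_deg + 0.5 * steer_deg  # bias toward the turn
    if speed_kmh > 80:
        # High speed: keep information straight ahead of the driver.
        return "image display area 1 (A)"
    if effective < -10:
        return "image display area 2 (A)"   # gaze/turn toward the left
    if effective > 10:
        return "image display area 3 (A)"   # gaze/turn toward the right
    return "image display area 1 (A)"
```

For instance, with the driver looking straight ahead at moderate speed but steering to the right, the policy shifts the display toward the turning direction, mirroring the turning-control example in the text.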
Further, since the 3 information display positions are centered near the straight line that includes the rotation center axis of the steering wheel, the observer's viewpoint moves equally to the left and right in the horizontal direction, which has the effect of suppressing fatigue during driving and of minimizing viewpoint movement.
The information display devices of various embodiments of the present invention have been described above. However, the present invention is not limited to the above-described embodiments and includes various modifications. For example, the above embodiments describe the entire system in detail for easy understanding of the present invention, and are not necessarily limited to those having all of the described configurations. Part of the configuration of one embodiment may be replaced with the configuration of another embodiment, and the configuration of another embodiment may be added to the configuration of one embodiment. Further, other configurations may be added to, deleted from, or substituted for part of the configuration of each embodiment.

Claims (9)

1. An information display device for displaying information on a vehicle, the information display device comprising:
a 1 st display unit that displays image information in an image display area in front of a driver's seat of the vehicle;
a space sensing device that detects positional information of an instruction of the driver and acceleration at the position, in a space region between the driver's seat and the image display region displayed by the 1 st display section; and
a 2 nd display unit configured to display, in the image display area in front, the driver's instruction and the positional information of the instruction detected by the space sensing device in correspondence with the acceleration at the position,
the space sensing device includes a sensing device configured by arranging a plurality of pairs of optical elements in a straight line, each pair of optical elements being configured by a light emitting element that converts a light beam from a light source into a parallel light beam and a condenser lens element that receives a reflected light beam, reflected by an obstacle, from among the light beams emitted from the light emitting element.
2. The information display device according to claim 1, wherein,
the 1 st display unit includes a virtual image optical system that reflects light emitted from the 1 st display unit at a windshield of the vehicle to display a virtual image in front of the vehicle.
3. The information display device according to claim 2, wherein,
the 1 st display unit is disposed on an instrument panel between the windshield and the driver's seat.
4. The information display device according to claim 1, wherein,
the 1 st display unit includes a real image optical system that displays a real image by scanning light emitted from the 1 st display unit over a windshield of the vehicle.
5. The information display device according to claim 4, wherein,
the 1 st display unit is disposed on an instrument panel between the windshield and the driver's seat.
6. The information display device according to claim 1, wherein,
the 1 st display unit includes a direct-view type image display device as an instrument panel of the vehicle.
7. The information display device according to claim 6, wherein,
the 1 st display unit is disposed between the instrument panel and the driver's seat.
8. The information display device according to claim 1, wherein,
the pairs of the plurality of optical elements are arranged in a horizontal direction between the windshield and the driver's seat and in a direction orthogonal to a direction from the driver's seat toward the windshield.
9. The information display device according to claim 1, wherein,
the pairs of the plurality of optical elements are arranged along at least 1 edge of an instrument panel of the vehicle.
CN201880067362.9A 2017-10-24 2018-10-05 Information display device and space sensing device thereof Active CN111213114B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311027382.2A CN116788039A (en) 2017-10-24 2018-10-05 Information display device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017-205299 2017-10-24
JP2017205299A JP6987341B2 (en) 2017-10-24 2017-10-24 Information display device and its spatial sensing device
PCT/JP2018/037459 WO2019082626A1 (en) 2017-10-24 2018-10-05 Information display apparatus and space sensing device for same

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311027382.2A Division CN116788039A (en) 2017-10-24 2018-10-05 Information display device

Publications (2)

Publication Number Publication Date
CN111213114A CN111213114A (en) 2020-05-29
CN111213114B true CN111213114B (en) 2023-08-29

Family ID=66246295

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201880067362.9A Active CN111213114B (en) 2017-10-24 2018-10-05 Information display device and space sensing device thereof
CN202311027382.2A Pending CN116788039A (en) 2017-10-24 2018-10-05 Information display device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311027382.2A Pending CN116788039A (en) 2017-10-24 2018-10-05 Information display device

Country Status (5)

Country Link
US (2) US11878586B2 (en)
JP (2) JP6987341B2 (en)
CN (2) CN111213114B (en)
DE (1) DE112018004671T5 (en)
WO (1) WO2019082626A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6726674B2 (en) * 2015-10-15 2020-07-22 マクセル株式会社 Information display device
WO2019198998A1 (en) * 2018-04-12 2019-10-17 엘지전자 주식회사 Vehicle control device and vehicle including same
DE112020001429T5 (en) * 2019-03-27 2021-12-23 Panasonic Intellectual Property Management Co., Ltd. ELECTRONIC MIRROR SYSTEM, IMAGE DISPLAY METHOD AND MOVING VEHICLE
JP7241005B2 (en) * 2019-11-27 2023-03-16 京セラ株式会社 Head-up display system and moving object
JP7250666B2 (en) * 2019-11-27 2023-04-03 京セラ株式会社 Head-up display, head-up display system and moving object
JP7332449B2 (en) * 2019-11-27 2023-08-23 京セラ株式会社 Head-up display module, head-up display system and moving body
KR20210142932A (en) * 2020-05-19 2021-11-26 현대모비스 주식회사 Head up display apparatus having external light blocking function
CN114241972B (en) * 2021-12-28 2023-08-22 武汉华星光电半导体显示技术有限公司 Display device, control method of display device, and electronic apparatus

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1845837A (en) * 2003-09-02 2006-10-11 松下电器产业株式会社 Input device
JP2016149094A (en) * 2015-02-13 2016-08-18 三菱自動車工業株式会社 Vehicle information processing apparatus

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4961625A (en) * 1987-09-18 1990-10-09 Flight Dynamics, Inc. Automobile head-up display system with reflective aspheric surface
JP2000168352A (en) 1998-12-10 2000-06-20 Honda Access Corp Display device for vehicle
ATE442607T1 (en) * 2004-02-04 2009-09-15 Microvision Inc GRID HEAD-UP DISPLAY AND CORRESPONDING SYSTEMS AND METHODS
JP2010018201A (en) 2008-07-11 2010-01-28 Fujitsu Ten Ltd Driver assistant device, driver assistant method, and driver assistant processing program
US8775023B2 (en) * 2009-02-15 2014-07-08 Neanode Inc. Light-based touch controls on a steering wheel and dashboard
JP2012073658A (en) 2010-09-01 2012-04-12 Shinsedai Kk Computer system
JP5467527B2 (en) 2010-09-06 2014-04-09 株式会社デンソー In-vehicle device controller
JP2013097605A (en) 2011-11-01 2013-05-20 Denso Corp Vehicle driving support device
JP2013250833A (en) * 2012-06-01 2013-12-12 Nec Saitama Ltd Portable electronic instrument, method for controlling the same, and program
KR20140076057A (en) * 2012-12-12 2014-06-20 한국전자통신연구원 Apparatus and method for motion input based on multi-sensor
WO2014096896A1 (en) 2012-12-20 2014-06-26 Renault Trucks A method of selecting display data in a display system of a vehicle
JP2015128918A (en) * 2014-01-06 2015-07-16 株式会社東海理化電機製作所 Operation device
JP2015132905A (en) * 2014-01-10 2015-07-23 アルパイン株式会社 Electronic system, method for controlling detection range, and control program
JP6253417B2 (en) 2014-01-16 2017-12-27 三菱電機株式会社 Vehicle information display control device
WO2015133057A1 (en) * 2014-03-05 2015-09-11 株式会社デンソー Detection device and gesture input device
JP2015194707A (en) 2014-03-27 2015-11-05 パナソニックIpマネジメント株式会社 display device
KR101519290B1 (en) * 2014-04-09 2015-05-11 현대자동차주식회사 Method for Controlling HUD for Vehicle
EP3133436B1 (en) 2014-04-14 2020-03-11 Panasonic Intellectual Property Management Co., Ltd. Heads-up display and moving body equipped with heads-up display
JP6487642B2 (en) 2014-07-01 2019-03-20 国立大学法人 筑波大学 A method of detecting a finger shape, a program thereof, a storage medium of the program, and a system for detecting a shape of a finger.
DE102014116292A1 (en) 2014-11-07 2016-05-12 Visteon Global Technologies, Inc. System for transmitting information in a motor vehicle
CN204687853U (en) * 2015-03-20 2015-10-07 京东方科技集团股份有限公司 A kind of in-vehicle display system and automobile
US10040352B2 (en) * 2016-04-12 2018-08-07 International Business Machines Corporation Vehicle steering control display device
JP6315127B2 (en) 2017-04-10 2018-04-25 船井電機株式会社 Input device, aerial image interaction system, and input method
US10600390B2 (en) * 2018-01-10 2020-03-24 International Business Machines Corporation Displaying a vehicle notification in a location determined based on driver eye gaze direction and other criteria


Also Published As

Publication number Publication date
JP6987341B2 (en) 2021-12-22
US20200247240A1 (en) 2020-08-06
CN111213114A (en) 2020-05-29
JP2019079238A (en) 2019-05-23
WO2019082626A1 (en) 2019-05-02
CN116788039A (en) 2023-09-22
JP7360433B2 (en) 2023-10-12
JP2022020704A (en) 2022-02-01
US20240092172A1 (en) 2024-03-21
DE112018004671T5 (en) 2020-06-04
US11878586B2 (en) 2024-01-23

Similar Documents

Publication Publication Date Title
CN111213114B (en) Information display device and space sensing device thereof
JP7081021B2 (en) Information display device
US20240036313A1 (en) Information display apparatus
JP6726674B2 (en) Information display device
JP6762807B2 (en) Information display device
JP6839764B2 (en) Information display device
WO2017072841A1 (en) Information display device
CN111201151B (en) Information display device
US20140036374A1 (en) Bifocal Head-up Display System
JPH02299934A (en) Head-up-display of vehicle
JP2020115158A (en) Vehicular information display system and information display device
WO2017061016A1 (en) Information display device
JP2021036317A (en) Information display device
JP2019066760A (en) Image display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Kyoto, Japan

Applicant after: MAXELL, Ltd.

Address before: Kyoto, Japan

Applicant before: MAXELL HOLDINGS, Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20211109

Address after: Kyoto, Japan

Applicant after: MAXELL HOLDINGS, Ltd.

Address before: Kyoto, Japan

Applicant before: MAXELL, Ltd.

GR01 Patent grant