CN112868058A - Wireless communication head-up display system, wireless communication apparatus, moving object, and program - Google Patents

Info

Publication number
CN112868058A
Authority
CN
China
Prior art keywords
image
controller
wireless communication
display
user
Prior art date
Legal status
Pending
Application number
CN201980068955.1A
Other languages
Chinese (zh)
Inventor
草深薰
桥本直
Current Assignee
Kyocera Corp
Original Assignee
Kyocera Corp
Application filed by Kyocera Corp
Publication of CN112868058A

Classifications

    • H04N 13/302: Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N 13/305: Autostereoscopic image reproducers using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H04N 13/31: Autostereoscopic image reproducers using parallax barriers
    • H04N 13/363: Image reproducers using image projection screens
    • H04N 13/366: Image reproducers using viewer tracking
    • B60K 35/00: Arrangement of adaptations of instruments
    • B60K 35/10, 35/211, 35/22, 35/23, 35/29, 35/85
    • B60K 2360/149, 2360/1526, 2360/1876, 2360/31, 2360/589
    • G02B 27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B 27/0101: Head-up displays characterised by optical features
    • G02B 27/0179: Display position adjusting means not related to the information to be displayed
    • G02B 2027/0187: Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G02B 30/27: Autostereoscopic 3D systems involving lenticular arrays
    • G02B 30/28: Autostereoscopic 3D systems involving active lenticular arrays
    • G02B 30/31: Autostereoscopic 3D systems involving active parallax barriers
    • G09G 3/002: Control arrangements to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT

Abstract

The wireless communication head-up display system includes a wireless communication device having an imaging element, a 1 st controller, and a 1 st communication module, and a head-up display having a display panel, an optical element, an optical system, a 2 nd communication module, and a 2 nd controller. The imaging element is configured to generate a captured image. The 1 st controller is configured to estimate the position of the user's eyes based on the captured image. The 1 st communication module is configured to transmit the position of the user's eyes. The display panel is configured to display a parallax image. The optical element is configured to define the propagation direction of the image light. The optical system is configured to project the image light whose propagation direction is defined by the optical element toward the user's eyes. The 2 nd communication module is configured to receive the eye position. The 2 nd controller is configured to control the parallax image displayed on the display panel based on the received eye position.

Description

Wireless communication head-up display system, wireless communication apparatus, moving object, and program
Cross reference to related applications
The present application claims priority from Japanese Patent Application No. 2018-207598 filed on November 2, 2018, the entire disclosure of which is incorporated herein by reference.
Technical Field
The present disclosure relates to a wireless communication head-up display system, a wireless communication apparatus, a moving body, and a program.
Background
Conventionally, for three-dimensional display without glasses, there is known a three-dimensional display device including an optical element that directs part of the light emitted from a display panel to the right eye and another part of that light to the left eye (see patent document 1).
Prior art documents
Patent document
Patent document 1: Japanese Unexamined Patent Application Publication No. 2001-166259
Disclosure of Invention
A wireless communication head-up display system of the present disclosure includes a wireless communication device and a head-up display. The wireless communication device comprises an image pickup element, a 1 st controller and a 1 st communication module. The imaging element is configured to generate an imaged image. The 1 st controller is configured to estimate a position of an eye of the user based on the captured image. The 1 st communication module is configured to transmit the position of the user's eye estimated by the 1 st controller. The head-up display includes a display panel, an optical element, an optical system, and a 2 nd controller. The display panel is configured to display a parallax image. The optical element is configured to define a propagation direction of the image light emitted from the display panel. The optical system is configured to project the image light whose propagation direction is defined by the optical element in a direction toward the eyes of the user. The 2 nd communication module is configured to receive the position of the eye from the 1 st communication module. The 2 nd controller is configured to control the parallax image to be displayed on the display panel based on the position of the eye received by the 2 nd communication module.
The wireless communication device of the present disclosure includes an imaging element, a controller, and a communication module. The controller is configured to cause the imaging element to generate a captured image and to estimate the position of the user's eyes based on the captured image. The communication module is configured to transmit position information indicating the position of the user's eyes to a head-up display.
The mobile body of the present disclosure is provided with a wireless communication head-up display system. A wireless communication head-up display system is provided with a wireless communication device and a head-up display. The wireless communication device comprises an image pickup element, a 1 st controller and a 1 st communication module. The imaging element is configured to generate an imaged image. The 1 st controller is configured to estimate a position of an eye of the user based on the captured image. The 1 st communication module is configured to transmit the position of the user's eye estimated by the 1 st controller. The head-up display includes a display panel, an optical element, an optical system, and a 2 nd controller. The display panel is configured to display a parallax image. The optical element is configured to define a propagation direction of the image light emitted from the display panel. The optical system is configured to project the image light whose propagation direction is defined by the optical element in a direction toward the eyes of the user. The 2 nd communication module is configured to receive the position of the eye from the 1 st communication module. The 2 nd controller is configured to control the parallax image to be displayed on the display panel based on the position of the eye received by the 2 nd communication module.
The program of the present disclosure is executed by a wireless communication device including an image pickup element, a controller, and a communication module. The controller controls the image pickup device to generate a picked-up image, and estimates the position of the eyes of the user based on the picked-up image. The controller controls the communication module to transmit position information indicating a position of the user's eyes to a head-up display.
Drawings
Fig. 1 is a diagram showing an example of a wireless communication head-up display system mounted on a mobile body.
Fig. 2 is a diagram showing a schematic configuration of the wireless communication apparatus and the head-up display shown in fig. 1.
Fig. 3 is a view showing an example of the display panel shown in fig. 2 as viewed from the depth direction.
Fig. 4 is a view showing an example of the parallax barrier shown in fig. 2 as viewed from the depth direction.
Fig. 5 is a diagram for explaining a relationship between the virtual image shown in fig. 1 and the eyes of the user.
Fig. 6 is a flowchart showing a 1 st example of the processing flow of the 1 st controller shown in fig. 2.
Fig. 7 is a flowchart showing a 2 nd example of the processing flow of the 1 st controller shown in fig. 2.
Fig. 8 is a flowchart showing a 3 rd example of the processing flow of the 1 st controller shown in fig. 2.
Fig. 9 is a diagram showing a positional relationship between the three-dimensional display device and the eyes of the user when the user directly views the display panel.
Fig. 10 is a schematic configuration diagram of a three-dimensional display device in the case where the optical element is a lenticular lens.
Detailed Description
In order for a user to appropriately visually recognize a virtual image of an image projected by the head-up display, it is desirable that the image light appropriately reaches the position of the eyes of the user.
The present disclosure provides a wireless communication head-up display system, a wireless communication apparatus, a moving object, and a program capable of allowing a user to visually recognize an appropriate virtual image.
Hereinafter, an embodiment of the present disclosure will be described with reference to the drawings. The drawings used in the following description are schematic, and the dimensional ratios and the like in the drawings do not necessarily coincide with actual dimensional ratios and the like.
As shown in fig. 1, a wireless communication Head-Up Display (HUD) system 100 according to an embodiment of the present disclosure includes a wireless communication device 1 and a head-up display 2. The communication HUD system 100 may be mounted on the mobile body 20.
The "mobile body" in the present disclosure includes a vehicle, a ship, and an airplane. The "vehicle" in the present disclosure includes an automobile and an industrial vehicle, but is not limited thereto, and may include a railway vehicle, a living vehicle, and a fixed-wing aircraft that travels on a slide road. The automobile includes a passenger car, a truck, a bus, a two-wheel vehicle, and a trolley bus, but is not limited thereto, and may include other vehicles that run on a road. Industrial vehicles include agricultural and construction oriented industrial vehicles. Industrial vehicles include, but are not limited to, fork lifts and golf carts. Agricultural-oriented industrial vehicles include, but are not limited to, tractors, tillers, transplanters, binders, combine harvesters, and lawn mowers. Construction-oriented industrial vehicles include, but are not limited to, bulldozers, scrapers (scarers), excavators, cranes, dump trucks, and road rollers. The vehicle includes a vehicle driven by human power. In addition, the classification of the vehicle is not limited to the above. For example, an industrial vehicle that can travel on a road may be included in the automobile, or the same vehicle may be included in a plurality of categories. Among the vessels in this disclosure are marine jets, ships, tankers. The aircraft in the present disclosure includes a fixed-wing aircraft and a rotary-wing aircraft.
A general-purpose wireless communication terminal, such as a mobile phone, smartphone, or tablet terminal, can be employed as the wireless communication apparatus 1. The wireless communication apparatus 1 includes an imaging optical system 10, an imaging element 11, a 1 st controller (controller) 12, a motion sensor 13, and a 1 st communication module (communication module) 14.
The wireless communication apparatus 1 is arranged so that both eyes of the user are positioned on the opposite side of the imaging optical system 10 from the imaging element 11. When the communication HUD system 100 is mounted on the mobile body 20, the wireless communication apparatus 1 may be mounted on, for example, the rearview mirror. The wireless communication apparatus 1 may also be mounted on the dashboard of the mobile body 20, in the center panel, or on a support portion of the steering wheel.
The imaging optical system 10 includes one or more lenses. The imaging optical system 10 is arranged such that the optical axis of the imaging optical system 10 is perpendicular to the imaging surface of the imaging element 11. The imaging optical system 10 is configured to form light incident from an object as an object image on an imaging surface of the imaging device 11.
The imaging element 11 may include, for example, a CCD (Charge Coupled Device) imaging element or a CMOS (Complementary Metal Oxide Semiconductor) imaging element. The imaging element 11 is configured to generate a captured image by converting the image formed by the imaging optical system 10 into image information.
The 1 st controller 12 is connected to each component of the wireless communication apparatus 1 and can control each of them. The components controlled by the 1 st controller 12 include the imaging element 11, the motion sensor 13, and the 1 st communication module 14. The 1 st controller 12 is configured as a processor, for example, and may include one or more processors. The processors may include a general-purpose processor that reads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. The dedicated processor may include an ASIC (Application Specific Integrated Circuit). The processor may include a PLD (Programmable Logic Device), and the PLD may include an FPGA (Field-Programmable Gate Array). The 1 st controller 12 may be an SoC (System-on-a-Chip) or an SiP (System In Package) in which one or more processors cooperate.
Details of the 1 st controller 12 will be described later.
The motion sensor 13 is configured to detect a parameter indicating a motion of the wireless communication device 1 including the motion sensor 13. The motion sensor 13 is, for example, an acceleration sensor or an angular acceleration sensor. The parameter indicating the motion includes, for example, acceleration, temporal change in acceleration, angular acceleration, and temporal change in angular acceleration.
The 1 st communication module 14 is configured to be able to communicate with the three-dimensional display device 5. The communication method used between the 1 st communication module 14 and the three-dimensional display device 5 may be a short-range wireless communication standard, a wireless communication standard for connecting to a mobile phone network, or a wired communication standard. Short-range wireless communication standards may include, for example, WiFi (registered trademark), Bluetooth (registered trademark), infrared, and NFC (Near Field Communication). Wireless communication standards for connecting to a mobile phone network may include, for example, LTE (Long Term Evolution), 4th-generation mobile communication systems, and 5th-generation mobile communication systems.
The HUD2 can include one or more reflectors 3, an optical member 4, and a three-dimensional display device 5. The reflector 3 and the optical member 4 are together also referred to as an optical system.
The reflector 3 is configured to reflect the image light emitted from the three-dimensional display device 5 toward a given region of the optical member 4. The given region is a region in which the reflected image light travels toward the user's eyes, and can be determined from the direction of the user's eyes relative to the optical member 4 and the direction of incidence of the image light on the optical member 4. The reflector 3 includes one or more reflecting elements.
Each reflective element may be a mirror. In case the reflecting element is a mirror, the mirror may be a concave mirror, for example. In fig. 1, more than one reflective element is shown as one mirror. However, the present invention is not limited to this, and one or more reflecting elements may be configured by combining one or more mirrors.
The optical member 4 is configured to reflect the image light that has been emitted from the three-dimensional display device 5 and reflected by the one or more reflectors 3 toward the user's left eye (1 st eye) and right eye (2 nd eye). For example, the windshield of the mobile body 20 may double as the optical member 4. The HUD2 thus causes the light emitted from the three-dimensional display device 5 to travel along the optical path L to the user's left and right eyes, and the user can see the light arriving along the optical path L as a virtual image V.
As shown in fig. 2, the three-dimensional display device 5 can include a 2 nd communication module 54, a 2 nd controller 55, an illuminator 51, a display panel 52, and, as an optical element, a parallax barrier 53. In a configuration in which the HUD2 is mounted on the mobile body 20, the three-dimensional display device 5 may be mounted in the instrument cluster of the mobile body 20.
The 2 nd communication module 54 is capable of communicating with the 1 st communication module 14 using the same communication standard as the 1 st communication module 14 of the wireless communication device 1. The 2 nd communication module 54 is configured to receive various kinds of information transmitted from the 1 st communication module 14.
The 2 nd controller 55 is connected to the components of the HUD2 and can control the components. The components controlled by the 2 nd controller 55 include an irradiator 51, a display panel 52, and a 2 nd communication module 54. The 2 nd controller 55 may comprise more than one processor. The 2 nd controller 55 may include a general-purpose processor for reading a specific program to execute a specific function and a specific-purpose processor dedicated to a specific process. The dedicated processor may comprise an ASIC. The processor may comprise a PLD. The PLD may comprise an FPGA. The 2 nd controller 55 may be any one of a SoC and a SiP in which one or more processors cooperate.
The irradiator 51 may be configured to perform surface irradiation on the display panel 52. The illuminator 51 may include a light source, a light guide plate, a diffusion sheet, and the like. The illuminator 51 is configured to emit irradiation light from a light source, and to uniformize the irradiation light in the plane direction of the display panel 52 by a light guide plate, a diffusion sheet, or the like. The illuminator 51 may be configured to emit the homogenized light to the display panel 52.
The display panel 52 can be a display panel such as a transmissive liquid crystal display panel, for example. The display panel 52 is not limited to a transmissive liquid crystal panel, and other display panels such as an organic EL panel can be used. When a self-emission type display panel is used as the display panel 52, the three-dimensional display device 5 may not include the irradiator 51. The display panel 52 will be described as a liquid crystal panel.
As shown in fig. 3, the display panel 52 has a plurality of divided regions in the planar effective region A. The effective region A displays a parallax image. The parallax image includes a left-eye image (1 st image) and a right-eye image (2 nd image) having parallax with respect to the left-eye image, as described later. Each divided region is a region partitioned in the 1 st direction and in the 2 nd direction orthogonal to the 1 st direction. The direction orthogonal to the 1 st and 2 nd directions is referred to as the 3 rd direction. The 1 st direction may be referred to as the horizontal direction, the 2 nd direction as the vertical direction, and the 3 rd direction as the depth direction, although the three directions are not limited to these. In fig. 2 to 4, 9, and 10, the 1 st direction is represented as the x-axis direction, the 2 nd direction as the y-axis direction, and the 3 rd direction as the z-axis direction. In fig. 1 and 5, the inter-eye direction, i.e., the direction of the straight line passing through the user's right and left eyes, is represented as the u-axis direction, the front-back direction of the user as the w-axis direction, and the height direction perpendicular to the u-axis and w-axis directions as the v-axis direction.
One sub-pixel corresponds to each divided region. Therefore, the effective region A includes a plurality of sub-pixels arranged in a lattice along the horizontal and vertical directions.
Each sub-pixel corresponds to one of the colors R (Red), G (Green), and B (Blue), and a set of three sub-pixels R, G, and B constitutes one pixel. One pixel can be referred to as one picture element. The horizontal direction is, for example, the direction in which the plurality of sub-pixels constituting one pixel are arranged. The vertical direction is, for example, the direction in which sub-pixels of the same color are arranged.
The plurality of sub-pixels arranged in the effective region A as described above constitute sub-pixel groups Pg. The sub-pixel groups Pg are arranged repeatedly in the horizontal direction. In the vertical direction, the sub-pixel groups Pg may be arranged at the same horizontal position or offset from one another; for example, they may be arranged repeatedly in the vertical direction adjacent to positions offset by one sub-pixel in the horizontal direction. Each sub-pixel group Pg includes sub-pixels in a given number of rows and columns. Specifically, each sub-pixel group Pg includes (2 × n × b) sub-pixels P1 to P(2 × n × b), arranged consecutively in b rows in the vertical direction and 2 × n columns in the horizontal direction. In the example shown in fig. 3, n = 6 and b = 1: the effective region A contains sub-pixel groups Pg each including twelve sub-pixels P1 to P12, arranged in one row in the vertical direction and twelve columns in the horizontal direction. In fig. 3, some of the sub-pixel groups Pg are labeled.
The sub-pixel group Pg is the minimum unit in which the 2 nd controller 55 controls the display of an image. The sub-pixels included in each sub-pixel group Pg are identified by the identification indices P1 to P(2 × n × b). The sub-pixels having the same identification index in all sub-pixel groups Pg are controlled simultaneously by the 2 nd controller 55. For example, when switching the image displayed on sub-pixel P1 from the left-eye image to the right-eye image, the 2 nd controller 55 switches the image displayed on sub-pixel P1 in all sub-pixel groups Pg from the left-eye image to the right-eye image simultaneously.
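As an illustration of this simultaneous control, the following minimal Python sketch (the data layout and function name are assumptions for illustration, not part of the disclosure) switches the image of the sub-pixel with a given identification index in every sub-pixel group Pg at once:

    def switch_subpixel(groups, index, image):
        # groups: one dict per sub-pixel group Pg, mapping the identification
        # index (1 .. 2*n*b) to the image ('L' or 'R') shown by that sub-pixel.
        for pg in groups:
            pg[index] = image  # the same index is switched in every group

    # Example: switch P1 from the left-eye to the right-eye image in all groups
    # (n = 6, b = 1, as in the example of fig. 3).
    n, b = 6, 1
    groups = [{i: 'L' for i in range(1, 2 * n * b + 1)} for _ in range(4)]
    switch_subpixel(groups, 1, 'R')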
As shown in fig. 2, the parallax barrier 53 is formed by a plane along the effective region A and is disposed at a given distance (gap) g from the effective region A. The parallax barrier 53 may be located on the opposite side of the display panel 52 from the illuminator 51, or on the illuminator 51 side of the display panel 52.
As shown in fig. 4, the parallax barrier 53 is configured to define, for each of a plurality of strip-shaped light-transmitting regions 531 extending in a given direction in the plane, the propagation direction of the image light emitted from the sub-pixels. The parallax barrier 53 has a plurality of dimming regions 532 that attenuate the image light, and adjacent dimming regions 532 delimit a light-transmitting region 531 between them. The light-transmitting regions 531 have a higher light transmittance than the dimming regions 532, and the dimming regions 532 a lower transmittance than the light-transmitting regions 531. The light-transmitting regions 531 and the dimming regions 532 extend in the given direction along the effective region A and are arranged alternately and repeatedly in the direction orthogonal to the given direction. The given direction is, for example, a direction along a diagonal of the sub-pixels; it can be set, for example, to the direction that crosses a sub-pixels in the 1 st direction while crossing b sub-pixels in the 2 nd direction (a and b being mutually prime positive integers). The given direction may also be the 2 nd direction.
By thus defining the propagation direction of the image light emitted from the sub-pixels arranged in the effective region A, the parallax barrier 53 determines the region of the 1 st virtual image V1, corresponding to a region on the effective region A, that is visible to the user's eyes, as shown in fig. 5. Hereinafter, the region within the 1 st virtual image V1 that the user can see with image light propagating to the position of the user's eyes is referred to as the visible region Va. The region visible with image light propagating to the position of the left eye is referred to as the left visible region VaL (1 st visible region), and the region visible with image light propagating to the position of the right eye as the right visible region VaR (2 nd visible region).
The virtual image barrier pitch VBp and the virtual image gap Vg are defined so that the following expressions (1) and (2), using the viewing distance Vd, are satisfied. The virtual image barrier pitch VBp is the arrangement interval, in the direction corresponding to the 1 st direction, of the 2 nd virtual image V2 of the dimming regions 532. The virtual image gap Vg is the distance between the 2 nd virtual image V2 and the 1 st virtual image V1. The viewing distance Vd is the distance between the position of each of the user's right and left eyes, indicated by the position information received from the wireless communication apparatus 1, and the 2 nd virtual image V2 of the parallax barrier 53. The virtual image barrier opening width VBw is the width in the 2 nd virtual image V2 corresponding to the width of the light-transmitting regions 531. The inter-eye distance E is the distance between the right eye and the left eye; it may be set, for example, to 61.1 mm to 64.4 mm, values calculated from research by the National Institute of Advanced Industrial Science and Technology. VHp is the length, in the direction corresponding to the 1 st direction, of the virtual image of a sub-pixel in the 1 st virtual image V1.
E : Vd = (n × VHp) : Vg    (1)
Vd : VBp = (Vd + Vg) : (2 × n × VHp)    (2)
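Solving expressions (1) and (2) gives Vg = Vd × n × VHp / E and VBp = Vd × (2 × n × VHp) / (Vd + Vg). A minimal numeric sketch follows; all parameter values are assumptions chosen only for illustration:

    # Illustrative evaluation of expressions (1) and (2); values are assumed.
    E = 62.0    # inter-eye distance [mm], within the cited 61.1-64.4 mm range
    Vd = 700.0  # viewing distance [mm] (assumed)
    n = 6       # as in the example of fig. 3
    VHp = 0.3   # sub-pixel virtual image length in the 1st direction [mm] (assumed)

    Vg = Vd * n * VHp / E                 # from (1): E : Vd = (n * VHp) : Vg
    VBp = Vd * (2 * n * VHp) / (Vd + Vg)  # from (2): Vd : VBp = (Vd + Vg) : (2 * n * VHp)

    print(f"Vg = {Vg:.2f} mm, VBp = {VBp:.3f} mm")  # Vg ~ 20.32 mm, VBp ~ 3.498 mm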
Part of the image light emitted from the effective region A of the display panel 52 passes through the light-transmitting regions 531 and reaches the optical member 4 via the one or more reflectors 3. The image light that reaches the optical member 4 is reflected by it and reaches the user's eyes. The user's eyes thereby perceive the 1 st virtual image V1, the virtual image of the parallax image displayed in the effective region A, in front of the optical member 4. In the present application, frontward is the direction of the optical member 4 as seen from the user, which is the direction of normal travel of the mobile body 20.
Therefore, the user visually recognizes the parallax image as if viewing the 1 st virtual image V1 through the 2 nd virtual image V2.
As described above, in the communication HUD system 100, the position reached by the image light differs depending on the position of the user's eyes. The communication HUD system 100 is therefore configured to generate a captured image, estimate the eye position, and determine the visible region Va while displaying a three-dimensional image, so that the user can view the three-dimensional image appropriately. The generation of the captured image, the estimation of the eye position, and the determination of the visible region Va are described in detail below.
< example 1 >
(Generation of photographic image)
When the display panel 52 starts displaying the parallax image, the 1 st controller 12 may cause the imaging element 11 to generate captured images, each including images of both eyes of the user, at predetermined time intervals.
(estimation of eye position)
The 1 st controller 12 can estimate the position of the eyes from the captured image generated by the imaging element 11. For example, the 1 st controller 12 may estimate the position of the eyes in real space from the positional relationship between the image of a given object and the image of the eyes included in a single captured image generated by the imaging element 11. In a configuration in which the wireless communication apparatus 1 is mounted inside the mobile body 20, the given object is an object fixed with respect to the mobile body 20, such as the headrest of the driver's seat or the frame of a side window.
For example, the 1 st controller 12 may store in memory, in advance and in association with each position in the captured image, the distance and direction from the position in real space corresponding to that image position to the position of the given object. In such a configuration, the 1 st controller 12 extracts the eyes from the captured image, reads from memory the distance and direction stored in association with the position of the eyes in the captured image, and estimates the position of the eyes in real space based on that distance and direction relative to the given object.
The 1 st controller 12 can also estimate the position of the eyes in real space from the positional relationship between the image of the given object and the image of a part of the user's body, such as the top of the head, the shoulders, or the ears, included in a single captured image generated by the imaging element 11. In such a configuration, the 1 st controller 12 may estimate the position of that part of the body in real space from its positional relationship to the given object, and then estimate the position of the eyes in real space from the relative positional relationship between that part of the body and the eyes.
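A minimal sketch of the table-lookup estimation described above, assuming a calibration table prestored in memory that maps image positions to the distance and direction toward the given object; all names, coordinates, and values are illustrative assumptions:

    import math

    # Hypothetical prestored table: image position (px) -> (distance [mm],
    # direction [rad]) between the corresponding real-space point and the
    # given object, built by prior calibration.
    CALIBRATION = {
        (320, 240): (350.0, 0.10),
        (330, 240): (355.0, 0.12),
    }

    OBJECT_POS = (0.0, 0.0)  # real-space (u, w) position of the given object (assumed)

    def estimate_eye_position(eye_px):
        # Use the calibration entry closest to the detected eye position.
        key = min(CALIBRATION,
                  key=lambda p: (p[0] - eye_px[0]) ** 2 + (p[1] - eye_px[1]) ** 2)
        dist, direction = CALIBRATION[key]
        # Offset from the given object by the stored distance and direction
        # (planar u-w sketch; the v coordinate is omitted for brevity).
        u = OBJECT_POS[0] + dist * math.sin(direction)
        w = OBJECT_POS[1] + dist * math.cos(direction)
        return (u, w)

    print(estimate_eye_position((322, 241)))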
The 1 st controller 12 is configured to generate position information indicating the estimated position of the eyes, and to control the 1 st communication module 14 so as to transmit the position information to the HUD2.
(control of display Panel)
When the 2 nd communication module 54 of the HUD2 receives the position information, the 2 nd controller 55 determines the left visible region VaL and the right visible region VaR based on the position of the user's eyes, using the inter-eye distance E and the characteristics of the three-dimensional display device 5. The characteristics of the three-dimensional display device 5 are the virtual image barrier pitch VBp, the virtual image gap Vg, and the image pitch (2 × n × VHp) of the 1 st virtual image V1 described above. Having determined the left visible region VaL and the right visible region VaR, the 2 nd controller 55 causes part of the effective region A to display the left-eye image and the remaining part to display the right-eye image accordingly. For example, the 2 nd controller 55 causes sub-pixels whose portion included in the left visible region VaL exceeds a given ratio (e.g., 50%) to display the left-eye image, and sub-pixels whose portion included in the right visible region VaR exceeds the given ratio to display the right-eye image. The user's left eye then perceives predominantly the virtual image of the left-eye image, and the right eye predominantly the virtual image of the right-eye image. Since the right-eye image and the left-eye image are parallax images having parallax between them, the user visually recognizes a virtual image of the three-dimensional image.
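The per-sub-pixel assignment can be sketched as follows, assuming the overlap of each sub-pixel's virtual image with the left visible region VaL has already been computed; the 50% threshold follows the example above, everything else is an assumption:

    def assign_parallax_images(overlap_with_VaL, threshold=0.5):
        # overlap_with_VaL: sub-pixel id (1 .. 2*n*b) -> fraction of that
        # sub-pixel's virtual image lying inside the left visible region VaL.
        return {pid: ('L' if ratio > threshold else 'R')
                for pid, ratio in overlap_with_VaL.items()}

    # Example with n = 6, b = 1 (twelve sub-pixels P1..P12, as in fig. 3):
    ratios = {pid: (1.0 if pid <= 6 else 0.0) for pid in range(1, 13)}
    print(assign_parallax_images(ratios))
    # P1..P6 are assigned the left-eye image, P7..P12 the right-eye image.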
< example 2 >
(Generation of photographic image)
The 1 st controller 12 may cause the imaging element 11 to generate captured images at a period based on the parameter indicating the motion of the wireless communication apparatus 1 detected by the motion sensor 13. As described above, the parameter indicating motion includes, for example, acceleration, temporal change in acceleration, angular acceleration, and temporal change in angular acceleration. The 1 st controller 12 may determine whether the parameter indicating motion is smaller than a threshold value: if so, it causes the imaging element 11 to generate captured images at a 1 st period, and if the parameter is equal to or greater than the threshold value, at a 2 nd period shorter than the 1 st period. The threshold value is a value such that, when the parameter indicating motion is smaller than it, the position of the user's eyes is estimated to change less frequently than when the parameter is equal to or greater than it; the threshold value can be set by prior experiment or the like. The 1 st controller 12 may also shorten the imaging period as the parameter indicating motion becomes larger.
The estimation of the positions of the eyes and the control of the display panel in example 2 are the same as those in example 1.
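A minimal sketch of this period selection; the threshold, the period lengths, and the sensor and camera callables are assumptions:

    import time

    THRESHOLD = 2.0  # assumed threshold on the motion parameter (e.g. m/s^2)
    PERIOD_1 = 0.5   # 1st period [s], used while the motion parameter is small
    PERIOD_2 = 0.1   # 2nd period [s], shorter, used while it is large

    def capture_loop(read_motion_param, capture, estimate_and_send):
        while True:
            # Choose the capture period from the current motion parameter.
            period = PERIOD_1 if read_motion_param() < THRESHOLD else PERIOD_2
            time.sleep(period)
            image = capture()         # imaging element 11 generates an image
            estimate_and_send(image)  # estimate the eye position and notify the HUD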
< example 3 >
(Generation of photographic image)
The 1 st controller 12 may cause the imaging element 11 to generate a captured image in accordance with the parameter indicating the motion of the wireless communication apparatus 1 detected by the motion sensor 13. For example, the 1 st controller 12 may determine whether the parameter indicating motion is equal to or greater than a threshold value, and cause the imaging element 11 to generate a captured image only when it is.
The estimation of the positions of the eyes and the control of the display panel in example 3 are the same as those in example 1.
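Example 3 reduces to an even simpler trigger: a captured image is generated only while the motion parameter is at or above the threshold (threshold and callables assumed):

    def maybe_capture(read_motion_param, capture, threshold=2.0):
        # Generate a captured image only when motion reaches the threshold.
        if read_motion_param() >= threshold:
            return capture()
        return None  # below the threshold, no image is generated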
< flow of treatment in example 1 >
Next, the operation performed by the 1 st controller 12 of the wireless communication apparatus 1 in the 1 st example will be described in detail with reference to fig. 6. When a start instruction for starting the process is input to the HUD2, the 1 st controller 12 starts the process.
First, the 1 st controller 12 determines whether or not a predetermined time has elapsed since the last image capture (step S11).
When it is determined that the predetermined time has elapsed since the last image capture, the 1 st controller 12 causes the image capture device 11 to generate an image (step S12).
The 1 st controller 12 acquires an image captured by the image pickup element 11 (step S13).
When the image is acquired in step S13, the 1 st controller 12 estimates the position of the user's eyes based on the image (step S14).
When the position of the eye is estimated in step S14, the 1 st controller 12 transmits position information indicating the position of the eye to the HUD2 (step S15).
The 1 st controller 12 determines whether or not an operation end instruction for ending the operation of the wireless communication device 1 is input (step S16).
If it is determined in step S16 that the end instruction has been input, the 1 st controller 12 ends the process. If it is determined in step S16 that the end instruction has not been input, the 1 st controller 12 returns to step S11 and repeats the processing.
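Gathering steps S11 to S16, the loop of fig. 6 can be sketched as below; the interval value and the callables standing in for the imaging element, the estimator, and the 1 st communication module are assumptions:

    import time

    INTERVAL = 0.5  # predetermined time between captures [s] (assumed)

    def controller_loop(capture, estimate_eyes, send_to_hud, end_requested):
        last = 0.0
        while not end_requested():                  # step S16
            if time.monotonic() - last < INTERVAL:  # step S11
                time.sleep(0.01)                    # wait for the interval to elapse
                continue
            image = capture()                       # steps S12-S13
            position = estimate_eyes(image)         # step S14
            send_to_hud(position)                   # step S15
            last = time.monotonic()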
< flow of treatment in example 2 >
Next, the operation performed by the 1 st controller 12 of the wireless communication apparatus 1 in the 2 nd example will be described in detail with reference to fig. 7.
First, the 1 st controller 12 acquires the parameter indicating the motion of the wireless communication apparatus 1 detected by the motion sensor 13 (step S21).
If the parameter indicating the motion is acquired in step S21, the 1 st controller 12 determines whether the parameter indicating the motion is smaller than a given threshold (step S22).
If it is determined in step S22 that the parameter indicating motion is smaller than the predetermined threshold value, the 1 st controller 12 determines whether the 1 st cycle has elapsed since the image was generated by the image pickup element 11 last time (step S23).
If it is determined in step S23 that the 1 st cycle has not elapsed, the 1 st controller 12 repeats the process of step S23 until the 1 st cycle has elapsed.
If it is determined in step S22 that the parameter indicating motion is equal to or greater than the predetermined threshold, the 1 st controller 12 determines whether or not the 2 nd cycle has elapsed since the image was generated by the image pickup element 11 last time (step S24).
If it is determined in step S24 that the 2 nd cycle has not elapsed, the 1 st controller 12 repeats the process of step S24 until the 2 nd cycle has elapsed.
If it is determined in step S23 that the 1 st cycle has elapsed or it is determined in step S24 that the 2 nd cycle has elapsed, the 1 st controller 12 causes the image pickup device 11 to generate an image (step S25).
The 1 st controller 12 acquires a captured image generated by the image pickup element 11 (step S26).
When the captured image is acquired in step S26, the 1 st controller 12 estimates the position of the user's eyes from the captured image (step S27).
When the position of the eye is estimated in step S27, the 1 st controller 12 transmits position information indicating the position of the eye to the HUD2 (step S28).
The 1 st controller 12 determines whether or not an end instruction for ending the operation of the wireless communication device 1 is input (step S29).
If it is determined in step S29 that the operation end instruction has been input, the 1 st controller 12 ends the process. If it is determined in step S29 that the operation end instruction has not been input, the 1 st controller 12 returns to step S21 to repeat the process.
< flow of treatment in example 3 >
Next, the operation performed by the 1 st controller 12 of the wireless communication apparatus 1 in example 3 will be described in detail with reference to fig. 8.
First, the 1 st controller 12 acquires a parameter indicating a motion from the motion sensor 13 (step S31).
When the parameter indicating the motion is acquired from the motion sensor 13, the 1 st controller 12 determines whether or not the parameter indicating the motion is equal to or greater than a threshold value (step S32).
If it is determined in step S32 that the parameter indicating motion is equal to or greater than the threshold value, the 1 st controller 12 causes the image pickup device 11 to generate a picked-up image (step S33).
If a captured image is generated in step S33, the 1 st controller 12 performs the processing from step S34 to step S37. The processing from steps S34 to S37 is the same as the processing from steps S13 to S16 in example 1.
An information processing apparatus such as a mobile phone, smartphone, or tablet terminal can be used as the wireless communication device 1 according to the above embodiment. Such an information processing apparatus can be realized by storing, in its memory, a program describing the processing for realizing the functions of the wireless communication device 1 and having its processor read and execute the program. The wireless communication apparatus 1 may also be configured to read and install the program from a non-transitory computer-readable medium. Non-transitory computer-readable media include, but are not limited to, magnetic storage media, optical storage media, magneto-optical storage media, and semiconductor storage media. Magnetic storage media include magnetic disks, hard disks, and magnetic tape. Optical storage media include optical discs such as a CD (Compact Disc), a DVD (Digital Versatile Disc), and a Blu-ray (registered trademark) Disc. Semiconductor storage media include a ROM (Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory.
As described above, in the present embodiment, the wireless communication device 1 is configured to estimate the position of the user's eyes from the captured image, and the HUD2 is configured to control the parallax image displayed on the display panel 52 according to the estimated eye position. Thus, even if the position of the user's eyes changes, the image light can appropriately reach the eyes, and the user can appropriately view the three-dimensional image.
In the present embodiment, the 1 st controller 12 is configured to cause the imaging device 11 to generate a captured image at a predetermined cycle, and to estimate the position of the user's eyes from the captured image. In the normal operation of the HUD2, even if the position of the user's eyes estimated in the initial setting changes, the current position of the user's eyes can be estimated from the newly captured image. The HUD2 can appropriately cause the display panel 52 to display a parallax image in accordance with the current position of the eyes of the user. As a result, even if the position of the eye changes, the user can appropriately visually recognize the three-dimensional image.
In the present embodiment, the 1 st controller 12 causes the imaging element 11 to generate captured images at a period based on the parameter indicating motion. The motion of the wireless communication apparatus 1 indicated by the parameter relates to the motion of the mobile body 20 in which the wireless communication apparatus 1 is placed, and the motion of the mobile body 20 relates to the motion of the user riding in it. Therefore, when a parameter indicating motion, such as acceleration, increases, the position of the user's eyes is likely to move. By causing the imaging element 11 to generate captured images at a period based on the parameter indicating motion, the 1 st controller 12 can estimate the eye position at a frequency matched to the likelihood of change. As a result, the 1 st controller 12 can follow changes in the eye position appropriately while suppressing the load required to control the imaging element 11.
Although the above embodiments have been described as typical examples, it will be apparent to those skilled in the art that a large number of modifications and substitutions can be made within the spirit and scope of the present invention. Therefore, the present invention should not be construed as being limited to the above-described embodiments, and various modifications and changes can be made without departing from the claims. For example, a plurality of blocks described in the embodiments and examples may be combined into one block, or one block may be divided.
For example, as shown in fig. 9, the three-dimensional display device 5 may be arranged so that the image light emitted from the display panel 52 passes through the light-transmitting regions 531 of the parallax barrier 53 and reaches the user's eyes directly, without passing through the reflector 3 and the optical member 4. In such a configuration, as in the configuration described above, the 2 nd controller 55 causes part of the effective region A to display the left-eye image and the remaining part to display the right-eye image, based on the visible region information received through the 2 nd communication module 54. For example, the 2 nd controller 55 causes sub-pixels whose portion included in the left visible region 52aL indicated by the visible region information exceeds a given ratio (for example, 50%) to display the left-eye image, and sub-pixels whose portion included in the right visible region 52aR exceeds the given ratio to display the right-eye image. The left eye then sees predominantly the left-eye image and the right eye predominantly the right-eye image. Since the right-eye image and the left-eye image are parallax images having parallax between them, the user visually recognizes the three-dimensional image.
For example, in the above-described embodiment, the 2 nd controller 55 of the HUD2 determines the left visible region VaL and the right visible region VaR, but this is not limiting. The 1 st controller 12 of the wireless communication apparatus 1 may determine the left visible region VaL and the right visible region VaR based on the position of the eyes, using the inter-eye distance E and the characteristics of the three-dimensional display device 5. In such a configuration, the 1 st controller 12 may generate visible region information indicating the left visible region VaL and the right visible region VaR and cause the 1 st communication module 14 to transmit it. The 2 nd controller 55 of the HUD2 may then display an image on the display panel 52 based on the visible region information transmitted from the 1 st communication module 14.
For example, in the above-described embodiment, the optical element is the parallax barrier 53, but this is not limiting. For example, as shown in fig. 10, the optical element may be a lenticular lens 56. In this case, the lenticular lens 56 is configured by arranging cylindrical lenses 561, each extending in the vertical direction, side by side in the horizontal direction on a plane. Like the parallax barrier 53, the lenticular lens 56 propagates part of the image light emitted from the sub-pixels to the position of the user's left eye, and another part to the position of the user's right eye.
Description of the symbols
1: a wireless communication device;
2: a head-up display;
3: a reflector;
4: an optical member;
5: a three-dimensional display device;
10: an image pickup optical system;
11: an image pickup element;
12: 1, a controller;
13: a motion sensor;
14: a 1 st communication module;
20: a moving body;
51: an irradiator;
52: a display panel;
53: a parallax barrier;
54: a 2 nd communication module;
55: a 2 nd controller;
56: a lenticular lens;
531: a light-transmitting region;
532: a dimming area;
561: a cylindrical lens;
A: an effective region;
V1: a 1 st virtual image;
V2: a 2 nd virtual image;
VaL: a left visible region;
VaR: a right visible region;
VBL: a left light-shielding region;
VBR: a right light-shielding region;
52aL: a left visible region;
52aR: a right visible region;
100: a wireless communication heads-up display system.

Claims (8)

1. A head-up display system for wireless communication includes a wireless communication device and a head-up display,
the wireless communication apparatus includes:
an imaging element configured to generate an imaged image;
a 1 st controller configured to estimate a position of an eye of a user based on the captured image; and
a 1 st communication module configured to transmit the position of the user's eyes estimated by the 1 st controller,
the head-up display includes:
a display panel configured to display a parallax image;
an optical element configured to define a propagation direction of the image light emitted from the display panel;
an optical system configured to project the image light whose propagation direction is defined by the optical element in a direction toward the eyes of the user;
a 2 nd communication module configured to receive the position of the eye from the 1 st communication module; and
and a 2 nd controller configured to control the parallax image to be displayed on the display panel based on the position of the eye received by the 2 nd communication module.
2. The wireless communication head-up display system according to claim 1, wherein
the 1st controller is configured to cause the imaging element to generate the captured image at a predetermined cycle and to estimate the position of the user's eyes based on the captured image.
3. The wireless communication head-up display system according to claim 1, wherein
the wireless communication device comprises a motion sensor configured to detect a parameter indicating motion of the wireless communication device, and
the 1st controller is configured to cause the imaging element to generate the captured image based on the parameter indicating the motion.
4. The wireless communication head-up display system according to claim 3, wherein
the 1st controller is configured to:
cause the imaging element to generate the captured image at a 1st cycle when the parameter indicating the motion is smaller than a threshold value, and
cause the imaging element to generate the captured image at a 2nd cycle shorter than the 1st cycle when the parameter indicating the motion is equal to or greater than the threshold value.
5. The wireless communication head-up display system according to claim 3, wherein
the 1st controller is configured to:
cause the imaging element to generate the captured image when the parameter indicating the motion is equal to or greater than a threshold value, and
not cause the imaging element to generate the captured image when the parameter indicating the motion is smaller than the threshold value.
6. A wireless communication device comprising:
an imaging element configured to generate a captured image;
a controller configured to estimate a position of eyes of a user based on the captured image; and
a communication module configured to transmit position information indicating the position of the user's eyes to a head-up display configured to display a parallax image based on the position information.
7. A mobile body comprising a wireless communication head-up display system that includes a wireless communication device and a head-up display,
the wireless communication device comprising:
an imaging element configured to generate a captured image;
a 1st controller configured to estimate a position of eyes of a user based on the captured image; and
a 1st communication module configured to transmit the position of the user's eyes estimated by the 1st controller,
the head-up display comprising:
a display panel configured to display a parallax image;
an optical element configured to define a propagation direction of image light emitted from the display panel;
an optical system configured to project the image light whose propagation direction is defined by the optical element in a direction toward the eyes of the user;
a 2nd communication module configured to receive the position of the eyes estimated by the 1st controller;
a 2nd controller configured to control a parallax image to be displayed on the display panel based on the position of the eyes received by the 2nd communication module; and
a reflector configured to reflect the image light whose propagation direction is defined by the optical element toward the user.
8. A program to be executed by a wireless communication device of a wireless communication head-up display system including the wireless communication device and a head-up display,
the wireless communication device comprising an imaging element, a controller, and a communication module,
the program causing the controller to perform control such that:
the imaging element is caused to generate a captured image, and a position of eyes of a user is estimated based on the captured image, and
the communication module transmits position information indicating the position of the user's eyes to the head-up display, which displays a parallax image based on the position information.
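Claims 2 through 5 recite, in prose, a small scheduling policy for when the imaging element generates a captured image. The Python sketch below gathers those cases into one loop; the period values, the threshold, and every object passed in are assumptions for illustration, not anything fixed by the claims.

```python
# Illustrative sketch of the capture scheduling recited in claims 2-5.
# PERIOD_1, PERIOD_2, the threshold, and the injected objects are all
# hypothetical; the claims fix no concrete values or APIs.

import time

PERIOD_1 = 0.5  # the 1st cycle (s): used while the motion parameter is small
PERIOD_2 = 0.1  # the 2nd cycle (s): shorter, used while the device is moving

def capture_loop(imaging_element, motion_sensor, communication_module,
                 estimate_eye_position, threshold=1.0,
                 suspend_when_still=False):
    """Run the claim 4 behavior by default, claim 5 when suspend_when_still=True."""
    while True:
        motion = motion_sensor.read()  # parameter indicating motion (claim 3)
        if motion >= threshold:
            period = PERIOD_2          # claim 4: faster 2nd cycle while moving
        elif suspend_when_still:
            time.sleep(PERIOD_1)       # claim 5: generate no captured image
            continue
        else:
            period = PERIOD_1          # claim 4: slower 1st cycle while still
        image = imaging_element.capture()         # generate the captured image
        position = estimate_eye_position(image)   # the controller's estimate
        communication_module.send(position)       # to the head-up display
        time.sleep(period)
```

With suspend_when_still=False the loop alternates between the 1st and 2nd cycles as in claim 4; with True it suppresses capture entirely while the device is still, as in claim 5; a single fixed period corresponds to the predetermined cycle of claim 2.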
CN201980068955.1A 2018-11-02 2019-10-28 Wireless communication head-up display system, wireless communication apparatus, moving object, and program Pending CN112868058A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018-207598 2018-11-02
JP2018207598A JP7034052B2 (en) 2018-11-02 2018-11-02 Wireless communication head-up display systems, wireless communication devices, mobiles, and programs
PCT/JP2019/042128 WO2020090714A1 (en) 2018-11-02 2019-10-28 Radio communication head-up display system, radio communication equipment, moving body, and program

Publications (1)

Publication Number Publication Date
CN112868058A (en) 2021-05-28

Family

ID=70463716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980068955.1A Pending CN112868058A (en) 2018-11-02 2019-10-28 Wireless communication head-up display system, wireless communication apparatus, moving object, and program

Country Status (5)

Country Link
US (1) US20210339628A1 (en)
EP (1) EP3876224B1 (en)
JP (1) JP7034052B2 (en)
CN (1) CN112868058A (en)
WO (1) WO2020090714A1 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101473266A (en) * 2006-06-22 2009-07-01 诺基亚公司 Method and system for image stabilization
CN102027737A (en) * 2008-05-16 2011-04-20 松下电器产业株式会社 Camera system
CN201937736U (en) * 2007-04-23 2011-08-17 德萨拉技术爱尔兰有限公司 Digital camera
CN104954668A (en) * 2014-03-25 2015-09-30 松下电器(美国)知识产权公司 Image-capturing device for moving body
US20150314682A1 (en) * 2012-12-14 2015-11-05 Visteon Global Technologies, Inc. System and method for automatically adjusting an angle of a three-dimensional display within a vehicle
US20170046880A1 (en) * 2014-05-12 2017-02-16 Panasonic Intellectual Property Management Co., Ltd. Display device and display method
CN107704805A (en) * 2017-09-01 2018-02-16 深圳市爱培科技术股份有限公司 method for detecting fatigue driving, drive recorder and storage device
CN108334871A (en) * 2018-03-26 2018-07-27 深圳市布谷鸟科技有限公司 The exchange method and system of head-up display device based on intelligent cockpit platform
WO2018139611A1 (en) * 2017-01-27 2018-08-02 公立大学法人大阪市立大学 3d display device, 3d display system, head-up display, head-up display system, 3d display device design method, and moving body
CN108621947A (en) * 2018-05-04 2018-10-09 福建省汽车工业集团云度新能源汽车股份有限公司 A kind of vehicle-mounted head-up-display system of automatic adjusument

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3668116B2 (en) 1999-09-24 2005-07-06 三洋電機株式会社 3D image display device without glasses
JP6075298B2 (en) * 2014-01-14 2017-02-08 トヨタ自動車株式会社 Information processing apparatus and mobile terminal
JP2016070951A (en) 2014-09-26 2016-05-09 パイオニア株式会社 Display device, control method, program, and storage medium
WO2017134861A1 (en) 2016-02-05 2017-08-10 日立マクセル株式会社 Head-up display device
WO2018199185A1 (en) 2017-04-26 2018-11-01 京セラ株式会社 Display device, display system, and mobile body
US20190141314A1 (en) * 2017-11-09 2019-05-09 Mindtronic Ai Co.,Ltd. Stereoscopic image display system and method for displaying stereoscopic images

Also Published As

Publication number Publication date
EP3876224B1 (en) 2024-03-27
JP2020071453A (en) 2020-05-07
US20210339628A1 (en) 2021-11-04
JP7034052B2 (en) 2022-03-11
EP3876224A4 (en) 2022-05-18
WO2020090714A1 (en) 2020-05-07
EP3876224A1 (en) 2021-09-08

Similar Documents

Publication Publication Date Title
US11221482B2 (en) Display apparatus, display system, and mobile body
CN112970248A (en) Three-dimensional display device, head-up display system, moving object, and program
CN113039785A (en) Three-dimensional display device, three-dimensional display system, head-up display, and moving object
WO2021106688A1 (en) Head-up display, head-up display system, and moving body
CN112889274B (en) Three-dimensional display device, head-up display, mobile object, and program
CN112868058A (en) Wireless communication head-up display system, wireless communication apparatus, moving object, and program
CN113016178B (en) Head-up display, head-up display system, moving object, and method for designing head-up display
CN112889275B (en) Communication head-up display system, communication device, mobile object, and program
JP7274392B2 (en) Cameras, head-up display systems, and moving objects
WO2020196052A1 (en) Image display module, image display system, moving body, image display method, and image display program
WO2020130048A1 (en) Three-dimensional display device, head-up display system, and moving object
CN114730096A (en) Head-up display system and moving object
CN113196142A (en) Three-dimensional display device, three-dimensional display system, head-up display, and moving object
JP7441333B2 (en) 3D display device, image display system and moving object
US11961429B2 (en) Head-up display, head-up display system, and movable body
CN113924520A (en) Head-up display system and moving object
CN114746795A (en) Head-up display module, head-up display system, and moving object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination