WO2020189258A1 - Display device, head-up display device, and head-mounted display - Google Patents

Display device, head-up display device, and head-mounted display

Info

Publication number
WO2020189258A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
display device
unit
virtual image
Prior art date
Application number
PCT/JP2020/008872
Other languages
English (en)
Japanese (ja)
Inventor
中村 彰宏
橋村 淳司
山田 範秀
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc.
Priority to JP2021507161A priority Critical patent/JPWO2020189258A1/ja
Publication of WO2020189258A1 publication Critical patent/WO2020189258A1/fr


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02Viewing or reading apparatus
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/54Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being generated by moving a 2D surface, e.g. by vibrating or rotating the 2D surface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/64Constructional details of receivers, e.g. cabinets or dust covers

Definitions

  • the present invention relates to a display device, a head-up display device, and a head-mounted display device.
  • In a known technique, a screen serving as the projection plane of a two-dimensional image is moved in the direction perpendicular to its main plane at a speed faster than human visual perception, and the displayed two-dimensional image is switched according to the screen position.
  • Because the screen moves at high speed while the two-dimensional images are projected, the observer sees the images corresponding to a plurality of positions at almost the same time. As a result, due to the afterimage effect of the eyes, the observer perceives a three-dimensional image drawn in space.
  • The timing for displaying the two-dimensional image is generated based on a synchronization signal extracted from the video signal. Therefore, when a liquid crystal display or a projector is used to display the two-dimensional image, a small amount of light is projected onto the screen even at positions where no two-dimensional image is displayed. This light, projected for a short period at each such position, is integrated by the afterimage effect of the observer's eyes, producing a phenomenon in which the screen appears to glow dimly (hereinafter referred to as "background light"). This causes the problem that the contrast of the three-dimensional image is lowered.
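The contrast loss described above can be illustrated with a rough numerical sketch. The function and all luminance values below are hypothetical and not from the publication; the point is only that a small leak at each empty scan position integrates, via the eye's temporal averaging, into a visible background level that bounds the contrast ratio.

```python
# Rough sketch of how leak light at non-display scan positions integrates
# into background luminance over one volume-scan cycle (values hypothetical).

def perceived_contrast(image_lum, leak_lum, n_positions, n_active):
    """Contrast ratio seen by an observer who temporally integrates the
    luminance contributed at every scan position of one cycle."""
    # Positions showing an image contribute the full image luminance;
    # the remaining positions contribute only the leak luminance.
    signal = image_lum
    background = leak_lum * (n_positions - n_active)
    return (signal + background) / background if background > 0 else float("inf")

# 5 scan positions, 1 active image, 1% modulator leak:
with_leak = perceived_contrast(image_lum=100.0, leak_lum=1.0, n_positions=5, n_active=1)
# Turning the light source off at empty positions removes the leak entirely:
no_leak = perceived_contrast(image_lum=100.0, leak_lum=0.0, n_positions=5, n_active=1)
print(with_leak)  # finite contrast, limited by background light
print(no_leak)    # unbounded: no background light
```

Under these toy numbers the leak caps the contrast at 26:1, while switching the light source off at empty positions removes the background term altogether, which is the effect the invention targets.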
  • An object of the present invention is to provide a display device, a head-up display device, and a head-mounted display device capable of suppressing background light.
  • One aspect is a display device that performs three-dimensional display by a volume scanning method, comprising: an acquisition unit that acquires image data; a display unit that displays images of the image data acquired by the acquisition unit; an image forming unit that provides a plurality of image forming regions at different positions in the optical axis direction of the light emitted from the display unit and sequentially forms, in those regions, images corresponding to the displayed images; and an image forming optical system that converts the images in the image forming regions.
  • Another aspect is a display device that performs three-dimensional display by a volume scanning method, comprising: an acquisition unit that acquires image data; a display unit that sequentially displays images of the image data acquired by the acquisition unit; an image forming optical system that converts the images displayed on the display unit and forms a real image or a virtual image at a projection distance corresponding to each position of the display unit in the optical axis direction of the light emitted from the display unit; a moving mechanism that moves the display unit in the optical axis direction; and a control unit that turns off the light source of the display unit when there is no image to be formed at a given position.
  • The image forming unit has a projection member that forms an image by projecting the image displayed on the display unit, and a moving mechanism that moves the projection member to predetermined positions differing in the optical axis direction.
  • A projection optical system for enlarging the image displayed on the display unit is further provided, and the projection member is an intermediate screen that diffuses the imaged light from the projection optical system.
  • A head-up display device comprises an object detection unit that detects an object existing in a detection area and determines the distance to the object, and a main control unit that causes the display device to form a virtual image at the projection distance corresponding to the distance to the object determined by the detection unit.
  • A head-mounted display device comprises an object detection unit that detects an object existing in a detection area and determines the distance to the object, a main control unit that causes the display device to form a real image or a virtual image at the projection distance corresponding to the distance determined by the detection unit, and an eyepiece optical system that guides the image formed by the display device into the user's field of view.
  • the light source of the display unit is turned off in the image forming region where there is no image to be formed, so that the background light can be suppressed. As a result, it is possible to suppress a decrease in the contrast of the three-dimensional image due to the background light.
  • FIG. 8 is a block diagram showing the hardware configuration of the head-up display device. FIG. 9 is a perspective view showing a specific display state produced by the head-up display device. FIG. 10A is a schematic diagram illustrating the virtual image display device according to the second embodiment and a head-mounted display device including it. FIG. 10B is a cross-sectional view of the head-mounted display device along line B-B of FIG. 10A.
  • the same elements are designated by the same reference numerals, and duplicate description will be omitted.
  • the dimensional ratios in the drawings are exaggerated for convenience of explanation and may differ from the actual ratios.
  • When the virtual image display device is mounted on a vehicle, the vertical direction is the Z direction, the direction parallel to the traveling direction of the vehicle is the Y direction, and the direction orthogonal to the Z and Y directions is the X direction.
  • <First Embodiment> FIGS. 1 and 2 are schematic diagrams illustrating a usage state in which the virtual image display device 20 according to the first embodiment and the head-up display device (hereinafter, "HUD device") 10 including it are mounted in the vehicle body 811 of the vehicle 800.
  • the user (driver) 900 is sitting in the driver's seat 816 while holding the steering wheel 813.
  • The virtual image display device (display device) 20 of the HUD device 10 displays the image information shown on the display unit 21 (described later) as a virtual image directed toward the user (driver) 900 via the display screen 243.
  • the configuration other than the display screen 243 of the virtual image display device 20 is installed so as to be embedded in the dashboard 814 of the vehicle body 811 behind the display 815 of the car navigation system or the like.
  • the virtual image display device 20 emits a display light D1 corresponding to the virtual image including driving-related information and the like toward the display screen 243.
  • the display screen 243 also called a combiner, is a semitransparent concave mirror or a plane mirror.
  • the front window may also serve as a combiner.
  • the display screen 243 is erected on the dashboard 814 by the support at the lower end, and reflects the display light D1 from the virtual image display device 20 toward the rear side (Y direction) of the vehicle body 811. That is, in the case of the illustration, the display screen 243 is a stand-alone type that is installed separately from the front window 812. The display light D1 reflected by the display screen 243 is guided to the pupil 910 of the user 900 sitting in the driver's seat 816 and an eye box (not shown) corresponding to the peripheral position thereof.
  • the user 900 can observe the display light D1 reflected by the display screen 243, that is, the virtual image i2 as a display image separated by a predetermined distance (virtual image distance) as if it were in front of the vehicle body 811.
  • the user 900 can observe the outside light transmitted through the display screen 243, that is, the front view, the real image of the automobile, and the like.
  • Thus, the user 900 can observe the virtual image i2, which includes the operation-related information and is formed by the reflection of the display light D1 on the display screen 243, superimposed on the external world image behind the display screen 243, that is, as a see-through image.
  • FIG. 3 is a schematic side view showing the configuration of the virtual image display device 20 according to the first embodiment.
  • The virtual image display device 20 includes a display unit 21, a projection optical system 22, a virtual image distance changing unit (image forming unit) 23, a virtual image forming optical system 24, a housing 26, an intermediate screen (projection member) 29, and a display control unit 30.
  • Each component of the virtual image display device 20 other than the display screen 243 is housed in the housing 26.
  • The optical axis AX passing through the display unit 21, the projection optical system 22, and the virtual image distance changing unit 23 to reach the mirror 241 of the virtual image forming optical system 24 is set at the same height in the Z direction.
  • When the optical axis before and after the virtual image distance changing unit 23 needs to be distinguished, the upstream side is referred to as the optical axis AX0 and the downstream side as the optical axis AX1; otherwise it is simply expressed as the optical axis AX.
  • the display unit 21 has a two-dimensional display surface 21a.
  • the image formed on the display surface 21a is magnified by the projection optical system 22 and projected onto the intermediate screen 29.
  • the projection image on the intermediate screen 29 can be switched at a relatively high speed.
  • FIG. 4 is a block diagram illustrating the configurations of the display unit 21 and the display control unit 30.
  • the display unit 21 includes a display element 211 and a light source unit 212.
  • the display element 211 may be a reflective element such as a DMD (Digital Micromirror Device) or an LCOS (Liquid Crystal On Silicon).
  • the light source unit 212 includes, for example, a plurality of LEDs arranged in a matrix in order to realize RGB display.
  • the LED of the light source unit 212 functions as a light source.
  • the light source unit 212 may use a laser, a lamp, or the like instead of the LED.
  • Alternatively, a transmissive element such as a liquid crystal panel can be used as the display element 211, with a backlight serving as the light source of the light source unit 212.
  • When the display element 211 is a liquid crystal display element, the projection optical system 22 can be omitted because the display element 211 itself has a wide angle of view.
  • the display unit 21 operates at a frame rate of 30 fps or more, preferably 150 fps or more. This makes it easy to make it appear that a plurality of virtual images i2 are displayed at the same time at different virtual image distances.
  • the projection optical system 22 is a fixed-focus lens system, and although not shown, it has a plurality of lenses.
  • The projection optical system 22 magnifies the image formed on the display surface 21a of the display unit 21 at an appropriate magnification and projects it as an intermediate image i1 onto the intermediate screen 29 (the intermediate image i1 itself is premised on the display operation of the display unit 21).
  • The projection optical system 22 has an aperture 221 arranged on the side of the projection optical system 22 closest to the intermediate screen 29. Arranging the aperture 221 in this way makes it relatively easy to set and adjust the F number on the intermediate screen 29 side of the projection optical system 22.
  • The intermediate screen 29 is a diffusion screen that images the light from the projection optical system 22 and controls its light distribution angle to a desired angle; the intermediate image i1 is formed at the imaging position (that is, at the planned imaging position of the intermediate image i1, or within the depth of focus in its vicinity).
  • The intermediate image i1 as a display image is not necessarily actually formed, but in the following it is assumed to be formed even when it is not, and its position may also be referred to as the position of the intermediate image.
  • the intermediate screen 29 for example, frosted glass, a lens diffuser, a microlens array, or the like can be used.
  • The virtual image distance changing unit 23 is provided together with the intermediate screen 29 and moves the intermediate screen 29, and hence the intermediate image i1, to a desired position along the optical axis AX.
  • The virtual image distance changing unit 23 has a guide unit 232 that guides a support frame portion 231 supporting the intermediate screen 29 so that it can move in the optical axis AX direction, and a drive unit 233 that reciprocates the support frame portion 231 together with the intermediate screen 29 at a desired speed in the optical axis AX direction.
  • By moving the intermediate screen 29, the distance between the observer and the virtual image i2 formed behind the display screen 243 by the virtual image forming optical system 24 can be increased or decreased.
  • the virtual image i2 as a series of projected images can be made three-dimensional.
  • The moving range of the intermediate screen 29 along the optical axis AX corresponds to the planned imaging position of the intermediate image i1 or its vicinity, and it is desirable to keep it within the depth of focus on the intermediate screen 29 side of the projection optical system 22.
  • This keeps both the intermediate image i1 and the virtual image i2 in a good state, substantially in focus.
  • The moving speed of the intermediate screen 29 is faster than human visual perception, so the virtual image i2 can be made to appear as if displayed at a plurality of places, that is, at a plurality of virtual image distances, at the same time.
  • For example, when the virtual image i2 is sequentially projected in five stages (long distance, medium-long distance, medium distance, medium-short distance, and short distance) and the display unit 21 displays at 150 fps, the display of the virtual image i2 at each distance is switched at 30 fps; the virtual images at the five distances appear to be displayed in parallel, and the switching is perceived as continuous.
  • the moving speed of the intermediate screen 29 is set to be synchronized with the display operation of the display unit 21.
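As a back-of-the-envelope check of the numbers above, a 150 fps display divided over five scan positions refreshes each virtual-image distance at 150/5 = 30 fps. The sketch below is a hypothetical timing helper (names and round-robin ordering are illustrative assumptions, not from the publication) building the frame-to-position schedule with which the screen motion must be synchronized.

```python
# Sketch: map display frames to scan positions for a volume-scanned display.
# With 150 fps and 5 distances, each distance refreshes at 150/5 = 30 fps.

DISTANCES = ["long", "medium-long", "medium", "medium-short", "short"]

def per_distance_fps(display_fps, n_positions):
    """Refresh rate of each individual virtual-image distance."""
    return display_fps / n_positions

def frame_schedule(n_frames, positions=DISTANCES):
    """Assign consecutive display frames to scan positions round-robin,
    i.e. the order in which the intermediate screen must reach them."""
    return [positions[i % len(positions)] for i in range(n_frames)]

print(per_distance_fps(150, 5))  # 30.0 fps per virtual-image distance
print(frame_schedule(7))         # first 7 frames of the cycle
```

The actual motion profile (reciprocating, rotating, etc.) only has to present each position at the instant its frame is displayed; the schedule itself is this simple modulo mapping.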
  • the virtual image distance changing unit 23 in the present embodiment reciprocates the intermediate screen 29 along the optical axis AX.
  • The reciprocating motion driven by the motor repeats acceleration and deceleration, which places a load on the motor and easily generates heat; as a result, the expected motion speed may not be obtained.
  • Therefore, a spring may be incorporated into the guide portion 232 so that its elastic force assists the acceleration and deceleration.
  • a cam structure, a slider crank mechanism, or the like may be used to make the rotary motion a reciprocating motion.
  • The movement of the intermediate screen 29 along the optical axis AX is not limited to a reciprocating motion.
  • For example, a rotating body with an eccentric axis, in which a plurality of intermediate screens 29 are attached to the side surface of a pillar whose cross-sectional shape is polygonal, elliptical, spiral, or the like, may be rotated about a rotation axis orthogonal to the optical axis AX direction, thereby moving an intermediate screen 29 along the optical axis AX.
  • Alternatively, a plurality of intermediate screens 29 installed on a rotating disk so as to be offset in the optical axis AX direction may be rotated about the optical axis AX direction as the rotation axis, thereby moving an intermediate screen 29 along the optical axis AX.
  • Alternatively, liquid crystal layers may be laminated along the optical axis AX, and the voltage applied to the liquid crystal layer at the position serving as the image forming region may be controlled to change its refractive index. This makes it possible to change the distance between the observer and the real or virtual image formed by the image forming optical system.
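The refractive-index approach above shifts the effective image position with no mechanical motion. One standard paraxial way to see the effect (this slab model and its values are illustrative assumptions, not stated in the publication) is that a slab of thickness t and index n displaces the apparent image plane by roughly t(1 - 1/n):

```python
# Paraxial sketch: longitudinal shift of an image plane produced by inserting
# a slab of refractive index n and thickness t (illustrative model only).

def apparent_shift(thickness_mm, n):
    """Shift of the apparent image plane, t * (1 - 1/n), in mm."""
    return thickness_mm * (1.0 - 1.0 / n)

print(apparent_shift(10.0, 1.5))  # 10 mm slab at n = 1.5
print(apparent_shift(10.0, 1.6))  # raising the index increases the shift
```

So electrically raising the index of one layer in the stack lengthens the optical path through it, moving the effective image forming region without any moving parts.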
  • FIG. 5 is a schematic diagram illustrating a configuration having a moving mechanism for moving the display unit in the optical axis direction.
  • As shown in FIG. 5, the virtual image distance may instead be changed by providing a moving mechanism 234 that moves the display unit 21 in the optical axis AX1 direction according to instructions from the display control unit 30.
  • the virtual image forming optical system 24 magnifies the intermediate image i1 formed on the intermediate screen 29 in cooperation with the display screen 243, and forms the virtual image i2 in front of the user 900.
  • the virtual image forming optical system 24 is composed of at least one mirror, but includes two mirrors 241 and 242 in the illustrated example.
  • the display control unit 30 controls the display unit 21 and the virtual image distance changing unit 23.
  • the display control unit 30 includes a display element control unit 31, a light source drive unit 32, and a storage unit 33.
  • the display element control unit 31 is a computer including a CPU (Central Processing Unit), a RAM (Random Access Memory), and a ROM (Read Only Memory) (not shown).
  • the display element control unit 31 functions as an acquisition unit for acquiring image data to be displayed on the display element 211, and controls the display element 211 and the light source drive unit 32.
  • the image data is stored in advance in the storage unit 33, or is stored in advance in the storage unit of the main control unit 60, which will be described later.
  • the order and timing at which the images of the image data are displayed on the display element 211 are managed by the display element control unit 31.
  • the light source driving unit 32 drives the light source of the light source unit 212 according to the instruction of the display element control unit 31.
  • When the image data is color, light of each RGB color is emitted from the light source unit 212 in a time-division manner.
  • the display element control unit 31 turns off the light source of the display unit 21 when there is no image to be displayed when the images of the image data are sequentially displayed on the display element 211.
  • The display element control unit 31 switches each pixel of the display element 211 on and off by controlling, for each pixel, the reflection of the light Ls from the light source unit 212 according to the acquired image data. As a result, reflected light Ld corresponding to the light Ls from the light source unit 212 is emitted from the display element 211, and the image is displayed on the display element 211.
  • the display control unit 30 controls the formation timing of the intermediate image i1 on the intermediate screen 29, and displays a three-dimensional virtual image i2 in which the virtual image distance (projection distance) changes behind the display screen 243. Specifically, by moving the position of the intermediate image i1, that is, the position of the intermediate screen 29 to the side closer to the virtual image forming optical system 24 on the optical axis AX1, the virtual image distance to the virtual image i2 is reduced. On the contrary, by moving the position of the intermediate screen 29 to the side farther from the virtual image forming optical system 24 on the optical axis AX1, the virtual image distance to the virtual image i2 is increased.
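The relationship just stated (intermediate image closer to the forming optics gives a shorter virtual-image distance, farther gives a longer one) can be checked with an idealized thin-mirror model. The focal length and positions below are illustrative assumptions, not values from the publication.

```python
# Idealized check with the mirror equation 1/f = 1/d_o + 1/d_i:
# a concave element of focal length f forms a virtual image (d_i < 0)
# of an object (the intermediate image) placed inside its focal length.

def virtual_image_distance(f, d_o):
    """Distance of the virtual image behind the optics, as a positive value."""
    d_i = 1.0 / (1.0 / f - 1.0 / d_o)   # negative for a virtual image
    assert d_i < 0, "object must lie inside the focal length"
    return -d_i

f = 0.20  # focal length in metres (illustrative)
near = virtual_image_distance(f, d_o=0.15)  # intermediate image closer to optics
far = virtual_image_distance(f, d_o=0.18)   # intermediate image farther away
print(near, far)  # moving the screen closer shortens the virtual-image distance
```

With these numbers the closer object position yields a virtual image 0.6 m behind the optics and the farther one 1.8 m, matching the direction of the dependence described above.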
  • By arranging the intermediate screen 29 on the optical axis AX1, not only can the intermediate image i1 be formed so as to be movable in the optical axis AX1 direction, but the viewing angle and the eyebox size can also be secured and the light utilization efficiency of the optical system increased. If the diffusion light distribution angle of the intermediate screen 29 is too narrow, the eyebox size becomes small. Conversely, if it is made too large, the F number of the virtual image forming optical system 24 must be reduced to maintain the light utilization efficiency, so the depth of focus becomes shallow and the displayable distance range is narrowed.
  • FIG. 6 is a flowchart for exemplifying the control method of the virtual image display device 20 in the first embodiment.
  • the processing of the flowchart shown in FIG. 6 is realized by the CPU of the display element control unit 31 executing the control program.
  • FIG. 7 is a schematic diagram conceptually showing an image forming region forming the intermediate image i1.
  • the processing when the display element 211 is a reflective element such as DMD or LCOS will be mainly described, but the processing when the display element 211 is a transmissive element such as a liquid crystal is also the same.
  • Here, N corresponds to the number of image forming regions for forming the intermediate image i1.
  • the image forming region for forming the intermediate image i1 is set at different positions in the optical axis AX1 direction of the light emitted from the display unit 21 according to the stage where the virtual image i2 is projected.
  • When the display element 211 is a reflective element such as a DMD or LCOS, each image forming region corresponds to the projection surface of the intermediate screen 29 moved to a different position in the optical axis AX1 direction.
  • When the display element 211 is a transmissive element such as a liquid crystal panel, each image forming region corresponds to the display surface 21a of the display unit 21 moved to a different position in the optical axis AX1 direction.
  • n (n: 1 to 5) is a variable for indicating any of the first to fifth image forming regions as the nth image forming region, and is initialized to "1".
  • step S102 it is determined whether or not there is an image to be formed in the nth image forming region.
  • When there is an image to be formed in the nth image forming region (step S102: YES), the display element control unit 31 acquires the image data corresponding to that region (step S103) and forms the image in the nth image forming region (step S104). More specifically, the display control unit 30 controls the virtual image distance changing unit 23 to move the intermediate screen 29 to the position on the optical axis AX1 corresponding to the nth image forming region, and causes the display unit 21 to display the image and project it onto the intermediate screen 29, thereby forming the intermediate image i1 in the nth image forming region.
  • When there is no image to be formed in the nth image forming region (step S102: NO), the light source of the display unit 21 is turned off (step S105). As a result, no light from the display unit 21 is projected onto the intermediate screen 29 in an image forming region with no image to form, preventing the intermediate screen 29 from emitting light.
  • Next, n is incremented (step S106), and n is compared with the predetermined number N (step S107).
  • When n exceeds N (step S107: YES), the process ends.
  • When n is N or less (step S107: NO), the process returns to step S102.
  • The processing of the flowchart shown in FIG. 6 is the processing for one frame, and it is repeated for each frame. In the example shown in FIG. 7, intermediate images i1 are formed in the second, fourth, and fifth image forming regions 202, 204, and 205 because there are images 206 to 208 to be formed there; on the other hand, since there is no image to be formed in the first and third image forming regions 201 and 203, the light source of the display unit 21 is turned off.
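The per-frame flow of FIG. 6 (steps S101 to S107) can be sketched in Python. The hardware interface names here are hypothetical stand-ins for the actual drive circuits; the loop structure is what the flowchart specifies: for each of the N image forming regions, form an image where one exists and turn the light source off where none does.

```python
# Sketch of the FIG. 6 per-frame loop, with a hypothetical hardware stub.
# N is the number of image forming regions (5 in the described example).

def process_frame(frame_images, hardware, N=5):
    """frame_images: dict mapping region index n (1..N) to image data,
    with entries only for regions that have an image to form."""
    n = 1                                # S101: initialize region index
    while n <= N:                        # S107: repeat until n exceeds N
        image = frame_images.get(n)      # S102: is there an image for region n?
        if image is not None:
            hardware.move_screen_to(n)   # S104: position the intermediate screen
            hardware.display(image)      #       and project the image onto it
        else:
            hardware.light_source_off()  # S105: suppress background light
        n += 1                           # S106: increment n

class LoggingHardware:
    """Stub recording the actions, standing in for the real drive circuits."""
    def __init__(self):
        self.log = []
    def move_screen_to(self, n):
        self.log.append(("move", n))
    def display(self, image):
        self.log.append(("display", image))
    def light_source_off(self):
        self.log.append(("off",))

# Regions 2, 4, 5 have images (as in FIG. 7); regions 1 and 3 do not.
hw = LoggingHardware()
process_frame({2: "img206", 4: "img207", 5: "img208"}, hw)
print(hw.log)
```

Running this reproduces the FIG. 7 behavior: the light source is switched off for the first and third regions and images are formed in the second, fourth, and fifth.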
  • That is, of the first to fifth image forming regions 201 to 205, the light source of the display unit 21 is turned off in the first and third image forming regions 201 and 203, so that the background light is suppressed.
  • As described above, in the present embodiment, the background light is suppressed by turning off the light source of the display unit 21 in image forming regions with no image to form, so the contrast of the virtual image i2 improves.
  • FIG. 8 is a block diagram illustrating a hardware configuration of the HUD device 10.
  • the HUD device 10 includes a driver detection unit 71, an environment monitoring unit 72, and a main control unit 60.
  • The main control unit 60 controls the entire HUD device 10 and three-dimensionally displays a virtual image corresponding to an object such as an oncoming vehicle or a passerby. An example of displaying the virtual image will be described later.
  • The driver detection unit 71 is a part that detects the presence of the user 900 in the vehicle 800, and includes an internal camera 71a for the driver's seat 816, an image processing unit 71b for the driver's seat 816, and a determination unit 71c.
  • The internal camera 71a is installed on the dashboard 814 in the vehicle body 811 facing the driver's seat 816 (see FIG. 2), and captures an image of the head of the user 900 sitting in the driver's seat 816 and its surroundings.
  • the driver's seat image processing unit 71b performs various image processing such as brightness correction on the image captured by the internal camera 71a, and facilitates the processing by the determination unit 71c.
  • The determination unit 71c detects the head and eyes (pupil 910) of the user 900 by extracting or cutting out objects from the driver's seat image processed by the driver's seat image processing unit 71b, and can calculate, from the depth information accompanying the driver's seat image, the presence or absence of the user 900's head in the vehicle body 811 as well as the spatial position of the user 900's eyes (and hence the direction of the line of sight).
  • the environment monitoring unit 72 functions as an object detection unit.
  • the environment monitoring unit 72 identifies objects such as automobiles, bicycles, and pedestrians that are close to the front, and determines the distance to the objects.
  • the environment monitoring unit 72 includes an external camera 72a, an external image processing unit 72b, and a determination unit 72c.
  • the external camera 72a is installed at an appropriate position inside and outside the vehicle body 811 and captures external images such as the front and sides of the user 900 or the vehicle 800.
  • the external image processing unit 72b performs various image processing such as brightness correction on the image captured by the external camera 72a, and facilitates the processing by the determination unit 72c.
  • The determination unit 72c detects the presence or absence of objects such as automobiles, bicycles, and pedestrians by extracting or cutting out objects from the external image processed by the external image processing unit 72b, and calculates the spatial position of an object in front of the vehicle 800 from the depth information accompanying the external image.
  • The internal camera 71a and the external camera 72a are, for example, compound-eye three-dimensional cameras. That is, both cameras 71a and 72a comprise camera elements, each consisting of an imaging lens and an image pickup element such as a CMOS (Complementary Metal-Oxide-Semiconductor) sensor, arranged in a matrix, together with drive circuits for the image pickup elements.
  • The plurality of camera elements constituting the cameras 71a and 72a are, for example, focused on different positions in the depth direction, or can detect relative parallax; by analyzing the state of the image obtained from each camera element (focus state, position of the object, and the like), the distance to each area or object in the image is determined.
  • a combination of a two-dimensional camera and an infrared distance sensor may be used instead of or together with the compound eye type cameras 71a and 72a as described above. As a result, it is possible to obtain distance information in the depth direction for each part in the captured screen. Further, instead of the compound eye type cameras 71a and 72a, a stereo camera in which two two-dimensional cameras are separately arranged can obtain distance information in the depth direction with respect to each part (area or object) in the captured screen. In addition, with a single two-dimensional camera, distance information in the depth direction may be obtained for each part in the captured screen by performing imaging while changing the focal length at high speed.
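For the stereo-camera variant mentioned above, depth is conventionally recovered from disparity with the relation Z = f·B/d (focal length in pixels times baseline, divided by the pixel disparity between the two views). The rig parameters below are illustrative assumptions, not values from the publication.

```python
# Sketch: depth from stereo disparity for a two-camera arrangement.
# Z = f * B / d, with f the focal length in pixels, B the baseline in metres,
# and d the disparity in pixels between the two views of the same object.

def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Distance to the object in metres; disparity must be positive."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or invalid match")
    return f_px * baseline_m / disparity_px

# Hypothetical rig: 800 px focal length, 12 cm baseline.
print(depth_from_disparity(800, 0.12, 8.0))  # larger disparity: nearer object
print(depth_from_disparity(800, 0.12, 2.0))  # smaller disparity: farther object
```

The inverse dependence on disparity is why nearby objects (large d) are located precisely while distance resolution degrades for far objects, which is one motivation for combining image-based detection with ranging sensors such as LIDAR as described next.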
  • LIDAR (Light Detection And Ranging)
  • distance information in the depth direction can be obtained for each part (area or object) in the detection area.
  • with LIDAR technology, the scattered light from pulsed laser irradiation is measured, making it possible to measure the distance to, and the extent of, a distant object, and thus to acquire the distance to each object in the field of view together with information on its extent.
  • by combining a ranging technology such as this LIDAR technology with a technology that detects an object's distance or the like from image information, the object detection accuracy can be improved.
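The ranging principle behind LIDAR mentioned above can be stated in one line: the round-trip time of a reflected laser pulse gives the distance as c·t/2. A minimal sketch, with an illustrative pulse time rather than real sensor data:

```python
# Sketch of LIDAR time-of-flight ranging: distance = c * t / 2, since the
# pulse travels out to the object and back. The example time is illustrative.

C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_distance(round_trip_s):
    """Distance in meters from the round-trip time of a laser pulse."""
    return C * round_trip_s / 2.0

# A pulse returning after roughly 333 ns corresponds to an object ~50 m away:
d = lidar_distance(50.0 * 2.0 / C)
print(round(d, 6))  # → 50.0
```

In practice many such pulses are swept across the detection area, producing the per-part distance information described above.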
  • the display control unit 30 operates the virtual image display device 20 under the control of the main control unit 60 to display a three-dimensional virtual image i2 in which the virtual image distance (also referred to as a projection distance) changes behind the display screen 243.
  • in an image forming region where there is no image to be formed, the light source of the display unit is turned off.
  • the display control unit 30 forms a virtual image i2 to be displayed on the virtual image display device 20 from display information including a display shape and a display distance received from the environment monitoring unit 72 via the main control unit 60.
  • the virtual image i2 is, for example, a sign such as a frame F (see FIG. 9, described later) located around an object OB such as a car, bicycle, or pedestrian existing behind the display screen 243, at the depth position of that object.
  • the HUD device 10 can display the virtual image i2 at an appropriate position regardless of the direction of the line of sight of the driver 900 by changing the virtual image distance; it is also possible to apply the driver detection unit 71's function of estimating the line-of-sight direction of the user 900 to the display of the virtual image i2.
  • the display control unit 30 receives, via the main control unit 60, a detection output from the driver detection unit 71 regarding the presence of the user 900 and the position of the eyes. This makes it possible to automatically start and stop projection of the virtual image i2 by the virtual image display device 20. It is also possible to project the virtual image i2 only in the direction of the line of sight of the user 900, or to emphasize the virtual image i2 in that direction, for example by brightening it or making it blink.
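The gaze-dependent emphasis described above can be sketched as a simple selection rule: only the virtual image whose direction lies near the estimated line of sight is brightened. The angular threshold and the per-image azimuth representation below are assumptions for illustration, not part of the specification.

```python
# Hedged sketch of gaze-based emphasis: brighten only the virtual image lying
# within an (assumed) angular threshold of the user's line-of-sight direction.

def emphasize_in_gaze(virtual_images, gaze_deg, threshold_deg=5.0):
    """Map each virtual image name to 'bright' or 'normal' given gaze azimuth."""
    result = {}
    for name, azimuth_deg in virtual_images.items():
        near_gaze = abs(azimuth_deg - gaze_deg) <= threshold_deg
        result[name] = "bright" if near_gaze else "normal"
    return result

# Hypothetical azimuths (degrees) for the virtual images of FIG. 9:
images = {"i21": -10.0, "i22": 2.0, "i24": 15.0}
print(emphasize_in_gaze(images, gaze_deg=0.0))
# → {'i21': 'normal', 'i22': 'bright', 'i24': 'normal'}
```

The same rule could instead gate projection entirely, matching the option of projecting i2 only in the gaze direction.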
  • FIG. 9 is a perspective view illustrating a specific display state by the HUD device 10.
  • in front of the user 900, who is the driver (observer), lies a detection area DF corresponding to the observation field of view.
  • in the detection area DF, that is, on the road and its surroundings, there are objects OB1 and OB3, which are persons such as pedestrians, and an object OB2, which is a moving body such as an automobile.
  • the main control unit 60 of the HUD device 10 projects three-dimensional virtual images i2 (i21, i22, i24) with the virtual image display device 20, adding frames F1, F2, and F3 as related-information images for the objects OB1, OB2, and OB3.
  • the environment monitoring unit 72 determines the projection distances of the virtual images i21, i22, and i23 for displaying the frames F (F1, F2, F3); each corresponds to the distance from the user 900 or the vehicle 800 to the respective object OB1, OB2, OB3.
  • the virtual images i23 and i25 are not actually projected because the intermediate image i1 is not formed.
  • the projection distances of the virtual images i21, i22, and i24 are discrete and may not exactly match the actual distances to the objects OB1, OB2, and OB3. However, if the difference between the projection distances of the virtual images and the actual distances to the objects is not large, parallax is unlikely to occur even if the viewpoint of the user 900 moves (in the X direction), and the positional relationship between the objects OB1, OB2, OB3 and the frames F1, F2, F3 can be substantially maintained.
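Because the available projection distances are discrete, each frame is in effect displayed at the nearest available virtual-image plane. A minimal sketch of that quantization; the set of plane distances is an assumed example, not taken from the specification:

```python
# Sketch: quantize an object's measured distance to the nearest of the
# discrete virtual-image projection distances. Plane distances are assumed.

def nearest_plane(object_distance_m, plane_distances_m):
    """Pick the discrete projection distance closest to the object distance."""
    return min(plane_distances_m, key=lambda p: abs(p - object_distance_m))

# Hypothetical projection distances of the image forming regions:
planes = [5.0, 10.0, 20.0, 40.0, 80.0]
print(nearest_plane(12.0, planes))  # → 10.0
print(nearest_plane(33.0, planes))  # → 40.0
```

As the text notes, the residual mismatch (e.g. 12 m shown at 10 m) is small enough that viewpoint motion produces little visible parallax between frame and object.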
  • the light source of the display unit is turned off in any image forming region where there is no image to be formed, so that background light can be suppressed. As a result, the decrease in contrast of the virtual image i2 caused by background light can be suppressed.
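The blanking behavior just described can be sketched as a per-region schedule: during the time slot of an image-forming region that has no content, the light source stays off. The region/content representation below is an assumption for illustration.

```python
# Hedged sketch of light-source blanking: each image forming region gets the
# source switched on only if it actually has content to form, suppressing
# background light (and the contrast loss it causes) in empty regions.

def light_source_schedule(region_frames):
    """Map each image-forming region id to True (source on) or False (off)."""
    return {region: bool(pixels) for region, pixels in region_frames.items()}

# Hypothetical content per region (ids echo regions 201-205 of the abstract):
frames = {
    201: [(3, 4), (3, 5)],  # a frame F is drawn in this region
    202: [],                # nothing to form: light source off
    203: [(7, 1)],
}
print(light_source_schedule(frames))  # → {201: True, 202: False, 203: True}
```

Only the decision logic is sketched; actual driving of the light source belongs to the display unit's drive circuit.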
  • HMD (head-mounted display) device
  • the virtual image display device of the present embodiment, mounted on the HMD device, is smaller and lighter than the virtual image display device mounted on the HUD device of the first embodiment, but has the same function as the virtual image display device of the first embodiment. That is, it displays the image information shown on the display unit to the user as a virtual image while changing the virtual image distance.
  • description of the same configuration as in the first embodiment will be omitted.
  • FIG. 10A is a schematic view illustrating the virtual image display device 40 according to the second embodiment and the HMD device 11 including the virtual image display device 40.
  • FIG. 10B is a cross-sectional view of the HMD device 11 along line B-B of FIG. 10A.
  • the HMD device 11 is an image display device worn on the head of a user (observer).
  • the HMD device 11 has a structure that imitates glasses for correcting eyesight. That is, the HMD device 11 has a pair of left and right temples 411 and 412, a bridge 413, a pair of left and right transparent members 414 and 415, a virtual image display device 40, an image pickup device 50, an object detection unit 52, and a control unit 53.
  • Temples 411 and 412 are, for example, long rod-shaped members made of an elastic material.
  • An ear hook portion to be hung on the user's ear is provided at one end of each of the temples 411 and 412, and the transparent members 414 and 415 are fixed to the other ends.
  • the bridge 413 is a short rod-shaped member for connecting a pair of left and right transparent members 414 and 415 to each other.
  • Transparent members 414 and 415 are fixed to both ends of the bridge 413. As a result, the pair of left and right transparent members 414 and 415 are held at regular intervals.
  • the transparent members 414 and 415 are made of a transparent material (glass, plastic, film, etc.) capable of transmitting light from the outside world to the user's eyes, so that the user wearing the HMD device 11 can observe the outside world.
  • the main body 41 of the virtual image display device 40 is provided on the upper part of the transparent member 415 corresponding to the right eye of the user, and the prism 42 is provided inside the transparent member 415.
  • An imaging device 50 is provided on the upper portion of the transparent member 414 corresponding to the left eye. Further, an object detection unit 52 and a control unit 53 are arranged on the temple 412.
  • the virtual image display device 40 displays an image superimposed on the scenery of the outside world.
  • the virtual image display device 40, based on information about the user's line-of-sight direction detected by a line-of-sight detection device (not shown), displays a pop-up image containing information about a nearby person existing in the outside scenery (for example, the person's name, place of residence, etc.).
  • the virtual image display device 40 may be a binocular type instead of the monocular type as shown in FIG. 10A.
  • the prism 42 is a polyhedron made of a transparent medium such as glass or quartz having a refractive index different from that of the surrounding space for refracting light.
  • the light output from the main body 41 of the virtual image display device 40 for displaying the superimposed image is refracted by the prism 42 and reaches the user's pupil 911 together with the light from the outside world. That is, the prism 42 functions as an eyepiece optical system and guides the image formed by the main body 41 into the user's field of view.
  • when there is no image to be formed in an image forming region, that is, when there is no image to be displayed on the display element, the light source of the display unit is turned off.
  • the image pickup apparatus 50 captures an image of the outside world in the optical axis direction of the camera lens 51.
  • the image pickup device 50 is fixedly held with respect to the transparent member 414 so that the optical axis of the camera lens 51 substantially coincides with the line-of-sight direction of the user. This enables the image pickup device 50 to capture the user's front field of view.
  • the object detection unit 52 identifies objects (for example, persons, vehicles, etc.) in front of the user and determines the distance to each object.
  • the object detection unit 52 performs various image processing, such as brightness correction, on the external image captured by the image pickup device 50, and detects the presence or absence of objects such as persons and vehicles by extracting or cutting out objects from the external image.
  • the object detection unit 52 calculates the spatial position of the object from the depth information accompanying the external image.
  • Control unit 53 operates the virtual image display device 40 to display a three-dimensional virtual image i2 in which the virtual image distance changes behind the prism 42.
  • the virtual image display device 40 and the HMD device 11 of the present embodiment provide the same effects as the virtual image display device 20 according to the first embodiment and the HUD device 10 including it.
  • the case where the HMD device 11 has a structure imitating glasses for vision correction has been described, but the HMD device 11 may instead have a goggle-type structure.
  • the virtual image display device 20 of the present invention and the HUD device 10 including it have been described above; however, additions, modifications, and omissions can be made as appropriate by those skilled in the art within the scope of the technical idea of the present invention.
  • the case has been described in which the intermediate screen 29 or the display 21 is moved along the optical axis AX so that the virtual image forming optical system 24 forms the virtual image i2 behind the display screen 243 or the prism 42, thereby changing the distance between the virtual image i2 and the observer.
  • the display device of the present invention can be applied not only to the case of forming a virtual image but also to the case of forming a real image.
  • for example, while projecting an image generated by a computer as a real image onto the intermediate screen 29, the intermediate screen 29 may be moved along the optical axis AX so as to change the distance between the observer and the real image formed by the image forming optical system. In this way, the real image can be changed while its projection distance changes, and the real image as a projected image can be made three-dimensional.
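The relationship between moving the intermediate screen along the optical axis and the resulting projection distance can be illustrated with the thin-lens equation 1/f = 1/a + 1/b. This is an illustrative optics sketch with assumed focal length and screen positions, not the specification's actual optical design.

```python
# Illustrative sketch (assumed values): shifting the intermediate screen 29
# changes the object distance a to the image-forming optics; solving the
# thin-lens equation 1/f = 1/a + 1/b for b moves the real image's projection
# distance accordingly.

def image_distance(object_distance_m, focal_length_m):
    """Solve 1/f = 1/a + 1/b for b; a real image forms only when a > f."""
    a, f = object_distance_m, focal_length_m
    if a <= f:
        raise ValueError("no real image forms when a <= f")
    return a * f / (a - f)

# With an assumed f = 0.1 m, moving the screen from 0.15 m to 0.12 m from the
# optics pushes the real image from ~0.3 m out to ~0.6 m:
print(round(image_distance(0.15, 0.1), 6))  # → 0.3
print(round(image_distance(0.12, 0.1), 6))  # → 0.6
```

Sweeping the screen through a range of positions while updating the displayed content is what lets the projected real image appear three-dimensional, in the same spirit as the volume-scanning virtual image display.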


Abstract

The problem addressed by the present invention is to provide a display device, a head-up display device, and a head-mounted display capable of suppressing background light. The solution is a virtual image display device (20) that performs three-dimensional display by a volume scanning method and comprises an acquisition unit (31), a display unit (21), an image forming unit (23), an image forming optical system (24), and a control unit (31). The acquisition unit (31) acquires image data. The display unit (21) displays an image of the image data acquired by the acquisition unit (31). The image forming unit (23) has a plurality of image forming regions (201-205) at different positions in the optical-axis direction of the light emitted by the display unit (21), and sequentially forms images corresponding to the displayed image in those regions. The image forming optical system (24) converts the images in the image forming regions and forms a real image or a virtual image at a projection distance corresponding to each image forming region's position in the optical-axis direction. The control unit (31) turns off a light source of the display unit (21) when no image to be formed is present in the image forming region at a given position.
PCT/JP2020/008872 2019-03-19 2020-03-03 Display device, head-up display device, and head-mounted display WO2020189258A1

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021507161A JPWO2020189258A1 (fr) 2019-03-19 2020-03-03

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019050620 2019-03-19
JP2019-050620 2019-03-19

Publications (1)

Publication Number Publication Date
WO2020189258A1 true WO2020189258A1 (fr) 2020-09-24

Family

ID=72520240

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/008872 WO2020189258A1 2019-03-19 2020-03-03 Display device, head-up display device, and head-mounted display

Country Status (2)

Country Link
JP (1) JPWO2020189258A1 (fr)
WO (1) WO2020189258A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002228977A * 2001-02-02 2002-08-14 Kenwood Corp Three-dimensional display device and three-dimensional display method
US20150362744A1 * 2012-02-07 2015-12-17 Samsung Display Co., Ltd. Three-dimensional image display device
WO2017138409A1 * 2016-02-10 2017-08-17 株式会社リコー Information display device
JP2018092005A * 2016-12-02 2018-06-14 パナソニックIpマネジメント株式会社 Display device
JP2018156062A * 2017-03-15 2018-10-04 株式会社リコー Display device, object device, and display method
WO2018199245A1 * 2017-04-28 2018-11-01 コニカミノルタ株式会社 Virtual image display device and moving body display system


Also Published As

Publication number Publication date
JPWO2020189258A1 (fr) 2020-09-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 20773420; Country of ref document: EP; Kind code of ref document: A1

ENP Entry into the national phase
Ref document number: 2021507161; Country of ref document: JP; Kind code of ref document: A

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 20773420; Country of ref document: EP; Kind code of ref document: A1