CN110275297B - Head-mounted display device, display control method, and recording medium


Info

Publication number
CN110275297B
Authority
CN
China
Prior art keywords
display
distance
image
user
virtual screen
Prior art date
Legal status
Active
Application number
CN201910183333.5A
Other languages
Chinese (zh)
Other versions
CN110275297A
Inventor
藤卷由贵
Current Assignee
Seiko Epson Corp
Original Assignee
Seiko Epson Corp
Priority date
Filing date
Publication date
Application filed by Seiko Epson Corp
Publication of CN110275297A
Application granted
Publication of CN110275297B

Classifications

    • G - PHYSICS
        • G02 - OPTICS
            • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
                • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
                    • G02B 27/0093 - with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
                    • G02B 27/01 - Head-up displays
                        • G02B 27/017 - Head mounted
                            • G02B 27/0172 - Head mounted characterised by optical features
                        • G02B 27/0101 - Head-up displays characterised by optical features
                            • G02B 2027/0132 - comprising binocular systems
                                • G02B 2027/0134 - comprising binocular systems of stereoscopic type
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06F - ELECTRIC DIGITAL DATA PROCESSING
                • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
                    • G06F 1/16 - Constructional details or arrangements
                        • G06F 1/1613 - Constructional details or arrangements for portable computers
                            • G06F 1/163 - Wearable computers, e.g. on a belt
                • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                            • G06F 3/012 - Head tracking input arrangements
                            • G06F 3/013 - Eye tracking input arrangements
                        • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
                            • G06F 3/0481 - based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                                • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
                            • G06F 3/0487 - using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F 3/0488 - using a touch-screen or digitiser, e.g. input of commands through traced gestures
                                    • G06F 3/04886 - by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
        • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
            • G09F - DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
                • G09F 9/00 - Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
                    • G09F 9/30 - in which the desired character or characters are formed by combining individual elements
                        • G09F 9/33 - being semiconductor devices, e.g. diodes
    • H - ELECTRICITY
        • H04 - ELECTRIC COMMUNICATION TECHNIQUE
            • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
                    • H04N 13/30 - Image reproducers
                        • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
                        • H04N 13/366 - Image reproducers using viewer tracking
                            • H04N 13/383 - using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
                • H04N 5/00 - Details of television systems
                    • H04N 5/64 - Constructional details of receivers, e.g. cabinets or dust covers
                        • H04N 5/642 - Disposition of sound reproducers

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided are a head-mounted display device, a display control method, and a recording medium that reduce the optical mismatch caused by deviations in convergence angle and focal length. The head-mounted display device includes: a display unit that displays a virtual screen set for each of a plurality of predetermined focal lengths; a distance-classified image generation unit that generates, for each pair of adjacent focal lengths, a distance-classified display image for displaying on the virtual screen the display objects that represent target objects located between those two focal lengths; and a display control unit that displays each generated distance-classified display image on the virtual screen corresponding to whichever of the two adjacent focal lengths lies farther along the visual field direction of the user of the head-mounted display device.

Description

Head-mounted display device, display control method, and recording medium
Technical Field
The present invention relates to a head-mounted display device.
Background
A head-mounted display device (Head Mounted Display, HMD) is a display device worn on the head that displays images in the user's field of view. Among HMDs, a transmissive type is known that, when worn, lets the user see the external scene through the device together with the display image. When the external scene and the display image are superimposed in a transmissive head-mounted display device, deviations between the convergence angle and the focal length of the external scene and those of the display image can produce an optical mismatch, causing the user eyestrain or discomfort. Various techniques for reducing this optical mismatch in transmissive head-mounted display devices have therefore been proposed. For example, patent document 1 proposes changing the convergence angle of the video display unit in accordance with the gaze-point distance of the user of a transmissive head-mounted display device.
Patent document 1: japanese patent application laid-open No. 2010-139589
However, the technique of patent document 1 does not sufficiently address the focal length. For example, when the video display unit of patent document 1 is moved along the user's line of sight in accordance with the gaze-point distance, display images of multiple display objects located at different gaze-point distances cannot be shown simultaneously, and the display lags when the user's gaze point moves. These problems are not limited to transmissive head-mounted display devices; they also occur in head-mounted display devices that block the view of the external scene.
Disclosure of Invention
According to one embodiment of the present invention, a head-mounted display device is provided. The head-mounted display device includes: a display unit that displays a virtual screen set for each of a plurality of predetermined focal lengths; a distance-classified image generation unit that generates, for each pair of adjacent focal lengths, a distance-classified display image for displaying on the virtual screen the display objects that represent target objects located between those two adjacent focal lengths; and a display control unit that displays each generated distance-classified display image on the virtual screen corresponding to whichever of the two adjacent focal lengths lies farther along the visual field direction of the user of the head-mounted display device.
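This display rule can be illustrated with a short sketch. The Python below is not from the patent; the function names, the handling of objects nearer than the first focal plane or farther than the last, and the example distances are assumptions made only for illustration.

    import bisect

    # Virtual screens sit at predetermined focal distances. Every display
    # object located between two adjacent focal distances is drawn on the
    # screen at the FARTHER of the two, as seen along the user's view axis.
    FOCAL_DISTANCES_M = [0.5, 1.0, 2.0]  # hypothetical 1st-3rd focal lengths

    def screen_index_for(object_distance_m):
        """Index of the virtual screen an object is rendered on.

        An object between focal planes i-1 and i goes to plane i (the back
        side). Objects nearer than the first plane go to the first plane;
        objects beyond the last plane go to the last plane (an assumption).
        """
        i = bisect.bisect_left(FOCAL_DISTANCES_M, object_distance_m)
        return min(i, len(FOCAL_DISTANCES_M) - 1)

    def group_by_screen(objects):
        """Group (name, distance_m) pairs into one distance-classified
        display image per virtual screen."""
        images = {i: [] for i in range(len(FOCAL_DISTANCES_M))}
        for name, dist in objects:
            images[screen_index_for(dist)].append(name)
        return images

    scene = [("A", 0.4), ("B", 0.8), ("C", 1.5), ("D", 5.0)]
    print(group_by_screen(scene))  # {0: ['A'], 1: ['B'], 2: ['C', 'D']}

Grouping objects this way means each object's display image is focused at or behind the object itself, which is the placement the display control unit described above produces.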
Drawings
Fig. 1 is an explanatory diagram showing a schematic configuration of a head-mounted display device according to an embodiment of the present invention.
Fig. 2 is a plan view of a main portion showing a structure of an optical system included in the display unit.
Fig. 3 is an explanatory diagram schematically showing a detailed structure of the right optical system.
Fig. 4 is a diagram showing a configuration of a main part of the display section as viewed by a user.
Fig. 5 is a view for explaining the angle of view of the camera.
Fig. 6 is a block diagram functionally showing the structure of the HMD.
Fig. 7 is a block diagram functionally showing the configuration of the control device.
Fig. 8 is an explanatory diagram illustrating an example of augmented reality display performed by the HMD.
Fig. 9 is an explanatory diagram schematically showing an example of a display object.
Fig. 10 is an explanatory diagram schematically showing a display distance of a display object.
Fig. 11 is a flowchart showing processing steps of the display control process.
Fig. 12 is an explanatory diagram schematically showing a 3DCG object map.
Fig. 13 is an explanatory view schematically showing the visual field of the user after the execution of step S130.
Fig. 14 is an explanatory view schematically showing the visual field of the user when the user looks at the 1st distance-classified display image.
Fig. 15 is an explanatory view schematically showing the visual field of the user when the user looks at the 2nd distance-classified display image.
Fig. 16 is an explanatory view schematically showing the visual field of the user when the user looks at the 3rd distance-classified display image.
Fig. 17 is an explanatory diagram schematically showing a display distance of a display object according to embodiment 2.
Fig. 18 is a flowchart showing the processing procedure of the display control processing of embodiment 2.
Fig. 19 is an explanatory view schematically showing the visual field of the user after the execution of step S135.
Fig. 20 is a flowchart showing the processing procedure of the display control processing of embodiment 3.
Fig. 21 is an explanatory view schematically showing the visual field of the user after the execution of step S135a.
Description of the reference symbols
10: a control device; 12: a lighting section; 14: a touch pad; 16: a direction key; 17: a determination key; 18: a power switch; 19: a vibrator; 20: a display unit; 21: a right holding portion; 22: a right display unit; 23: a left holding portion; 24: a left display unit; 26: a right light guide plate; 27: a front frame; 28: a left light guide plate; 30: a headphone microphone; 32: a right earphone; 34: a left earphone; 40: a connection cable; 46: a connector; 61: a camera; 62: a line-of-sight detection unit; 63: a microphone; 65: an illuminance sensor; 67: an LED indicator; 100: a head-mounted display device; 110: an operation section; 111: a 6-axis sensor; 113: a magnetic sensor; 115: a GNSS receiver; 117: a wireless communication unit; 118: a memory; 120: a controller board; 121: a nonvolatile storage unit; 122: a storage function section; 123: setting data; 124: content data; 130: a power supply unit; 132: a battery; 134: a power supply control circuit; 140: a main processor; 145: an image processing unit; 147: a display control unit; 149: an image pickup control unit; 150: a control function section; 151: an input/output control unit; 153: a distance-classified image generation unit; 155: an intermediate distance image generation unit; 157: a gaze-point distance calculation unit; 180: a voice codec; 182: a voice interface; 184: an external connector; 186: an external memory interface; 188: a USB connector; 192: a sensor hub; 196: an interface; 210: a display unit substrate; 211: an interface; 213: a receiving section; 215: an EEPROM; 217: a temperature sensor; 221: an OLED unit; 223: an OLED panel; 225: an OLED drive circuit; 230: a display unit substrate; 231: an interface; 233: a receiving section; 235: a 6-axis sensor; 237: a magnetic sensor; 239: a temperature sensor; 241: an OLED unit; 243: an OLED panel; 245: an OLED drive circuit; 251: a right optical system; 252: a left optical system; 261: a half mirror; 281: a half mirror; AI: an object image; CL: a virtual camera for the left eye; CR: a virtual camera for the right eye; EL: an end portion; ER: an end portion; Im1: a 1st distance-classified display image; Im2: a 2nd distance-classified display image; Im3: a 3rd distance-classified display image; Im4: a 4th distance-classified display image; Im5: a 5th distance-classified display image; L: image light; L1: a projection lens; LE: a left eye; LL: a long-distance optical system; LL1: a collimator lens; LL2: a focal length changing lens; ML: a middle-distance optical system; ML1: a collimator lens; ML2: a focal length changing lens; OB: an object; OL: external light; OM: a 3DCG object map; Ob1: a 1st display object; Ob2: a 2nd display object; Ob3: a 3rd display object; Ob4: a 4th display object; Ob5: a 5th display object; PN: a display area; PSM: a prism; RD: a line of sight; RE: a right eye; RIm1: a continuous distance-classified display image; RIm2: a continuous distance-classified display image; RIm2a: a continuous distance-classified display image; RIm2b: a continuous distance-classified display image; RIm2c: a continuous distance-classified display image; ROb1: a continuous display object; SC: an outside scene; SL: a short-distance optical system; SL1: a collimator lens; SL2: a focal length changing lens; VS1: a 1st virtual screen; VS2: a 2nd virtual screen; VS3: a 3rd virtual screen; VT: a field of view; W1: a 1st display distance; W2: a 2nd display distance; W3: a 3rd display distance; m1: a 1st reflection film; m2: a 2nd reflection film.
Detailed Description
A. Embodiment 1:
A1. Overall structure of the head-mounted display device:
Fig. 1 is an explanatory diagram showing a schematic configuration of the head-mounted display device 100 according to an embodiment of the present invention. The head-mounted display device 100 is a display device worn on the head of a user, and is also referred to as a head mounted display (HMD). The HMD 100 is a see-through (transmissive) head-mounted display device that superimposes an image on the outside world viewed through its glass.
The HMD 100 includes a display unit 20 for allowing a user to see an image, and a control device 10 for controlling the display unit 20.
The display unit 20 is worn on the user's head and, in the present embodiment, has an eyeglass shape. In the present embodiment, a virtual screen is set in the display unit 20 for each of a plurality of predetermined focal lengths. The display unit 20 includes a right display unit 22, a left display unit 24, a right light guide plate 26, and a left light guide plate 28 on a support body having a right holding portion 21, a left holding portion 23, and a front frame 27.
The right holding portion 21 and the left holding portion 23 extend rearward from the two end portions of the front frame 27 and, like the temples of eyeglasses, hold the display unit 20 on the user's head. Of the two ends of the front frame 27, the one located on the user's right in the worn state of the display unit 20 is referred to as the end ER, and the one on the user's left as the end EL. The right holding portion 21 extends from the end ER of the front frame 27 to a position corresponding to the right side of the user's head in the worn state of the display unit 20; the left holding portion 23 likewise extends from the end EL to a position corresponding to the left side of the user's head.
Right light guide plate 26 and left light guide plate 28 are provided on front frame 27. The right light guide plate 26 is positioned in front of the right eye of the user in the worn state of the display unit 20, and allows the right eye to see an image. The left light guide plate 28 is positioned in front of the left eye of the user in the state where the display unit 20 is worn, and allows the left eye to see an image.
The front frame 27 has a shape that interconnects one end of the right light guide plate 26 and one end of the left light guide plate 28; this connection position corresponds to the spot between the user's eyebrows when the display unit 20 is worn. A nose pad portion that rests on the user's nose in the worn state may be provided at the connection position on the front frame 27; in that case, the display unit 20 can be held on the user's head by the nose pad portion, the right holding portion 21, and the left holding portion 23. Further, a band that contacts the back of the user's head in the worn state may be coupled to the right holding portion 21 and the left holding portion 23; in that case, the display unit 20 can be held firmly on the user's head.
The right display unit 22 displays images via the right light guide plate 26. It is provided in the right holding portion 21 and, in the worn state of the display unit 20, lies near the right side of the user's head. The left display unit 24 displays images via the left light guide plate 28. It is provided in the left holding portion 23 and, in the worn state of the display unit 20, lies near the left side of the user's head.
The right and left light guide plates 26 and 28 of the present embodiment are optical portions (e.g., prisms) formed of a light-transmissive resin or the like, and guide the image light output from the right and left display units 22 and 24 to the eyes of the user. Further, light adjusting plates may be provided on the surfaces of the right light guide plate 26 and the left light guide plate 28. The light modulation panel is a thin plate-shaped optical element having a transmittance that differs depending on the wavelength band of light, and functions as a so-called wavelength filter. The light modulation panel is disposed, for example, so as to cover a surface (a surface opposite to a surface facing the eyes of the user) of the front frame 27. By appropriately selecting the optical characteristics of the light modulation panel, the transmittance of light in an arbitrary wavelength band such as visible light, infrared light, and ultraviolet light can be adjusted, and the amount of external light that enters right light guide plate 26 and left light guide plate 28 from the outside and passes through right light guide plate 26 and left light guide plate 28 can be adjusted.
The display unit 20 guides the image light generated by the right display unit 22 and the left display unit 24 to the right light guide plate 26 and the left light guide plate 28, respectively, and uses this image light to let the user see an image (an augmented reality (AR) image, also referred to as the "display image") together with the external scene seen through the display unit 20. When external light from in front of the user passes through the right light guide plate 26 and the left light guide plate 28 and enters the user's eyes, the image light forming the image and the external light both enter the user's eyes. The visibility of the image to the user is therefore affected by the intensity of the external light.
For this reason, how easily the image is seen can be adjusted, for example, by mounting light modulation panels on the front frame 27 and appropriately selecting or adjusting their optical characteristics. Typically, a light modulation panel would be selected that transmits at least enough light for the user wearing the HMD 100 to see the external scene. Suppressing sunlight also improves the visibility of the image. Using light modulation panels can further be expected to protect the right light guide plate 26 and the left light guide plate 28, suppressing damage, adhesion of dirt, and the like. The light modulation panels may be attachable to and detachable from the front frame 27, the right light guide plate 26, and the left light guide plate 28; a plurality of types of light modulation panels may be exchanged; or the light modulation panels may be omitted.
The camera 61 is disposed on the front surface of the front frame 27 of the display unit 20, at a position where it does not block the external light passing through the right light guide plate 26 and the left light guide plate 28. In the example of fig. 1, the camera 61 is disposed on the end ER side of the front frame 27, but it may instead be disposed on the end EL side or at the connection portion between the right light guide plate 26 and the left light guide plate 28.
The camera 61 is a digital camera having an imaging element such as a CCD or CMOS sensor and an imaging lens. Although the camera 61 of the present embodiment is a monocular camera, a stereo camera may be used. The camera 61 captures at least part of the outside world (real space) in the front direction of the HMD 100, in other words, a range overlapping the field of view the user sees while wearing the display unit 20. The width of the angle of view of the camera 61 can be set as appropriate; in the present embodiment, it is set so as to capture the user's entire field of view seen through the right light guide plate 26 and the left light guide plate 28. The camera 61 captures images under the control of the control function unit 150 (fig. 7) and outputs the obtained imaging data to the control function unit 150.
The HMD 100 may also include a distance measuring sensor that detects the distance to a measurement target located in a predetermined measurement direction. The distance measuring sensor can be disposed, for example, at the connection portion between the right light guide plate 26 and the left light guide plate 28 on the front frame 27, and its measurement direction may be the front direction of the HMD 100 (a direction overlapping the shooting direction of the camera 61). The distance measuring sensor may consist of a light emitting unit such as an LED or a laser diode and a light receiving unit that receives the emitted light after it is reflected by the measurement target; in this case, the distance is obtained by triangulation or by ranging based on the time difference. Alternatively, the distance measuring sensor may consist of a transmitting unit that emits ultrasonic waves and a receiving unit that receives the ultrasonic waves reflected by the measurement target; in this case, the distance is obtained by ranging based on the time difference. Like the camera 61, the distance measuring sensor measures distance in accordance with instructions from the control function unit 150 and outputs the detection result to the control function unit 150.
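As a rough illustration of the time-difference ranging mentioned above, the distance follows from the round-trip time of the emitted signal; the propagation speed depends on whether light or ultrasound is used. This generic sketch is not from the patent, and its constants and example value are illustrative.

    SPEED_OF_LIGHT_M_S = 299_792_458.0
    SPEED_OF_SOUND_M_S = 343.0  # in air at about 20 degrees C

    def tof_distance_m(round_trip_time_s, speed_m_s):
        """Distance = speed * round-trip time / 2 (out and back)."""
        return speed_m_s * round_trip_time_s / 2.0

    # An ultrasonic echo returning after 11.7 ms corresponds to about 2 m:
    print(tof_distance_m(11.7e-3, SPEED_OF_SOUND_M_S))  # ~2.007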
Fig. 2 is a plan view of a main part showing the structure of an optical system included in the display unit 20. For convenience of explanation, a right eye RE and a left eye LE of a user are illustrated in fig. 2. As shown in fig. 2, the right display unit 22 and the left display unit 24 are configured to be bilaterally symmetrical.
The right display unit 22 has an OLED (Organic Light Emitting Diode) unit 221 and a right optical system 251 as a structure for allowing the right eye RE to see an image (AR image). In fig. 2, the detailed configuration of the right optical system 251 is not shown.
The OLED unit 221 emits image light. It has an OLED panel 223 and an OLED drive circuit 225 that drives the OLED panel 223. The OLED panel 223 is a self-luminous display panel composed of light-emitting elements that emit light by organic electroluminescence and output R (red), G (green), and B (blue) light, respectively. In the OLED panel 223, pixels, each consisting of one R, one G, and one B element, are arranged in a matrix.
The OLED drive circuit 225 selects and energizes the light-emitting elements of the OLED panel 223 under the control of the control function unit 150 (fig. 7) described later, causing them to emit light. The OLED drive circuit 225 is fixed by bonding or the like to the back surface of the OLED panel 223, i.e., the rear side of its light-emitting surface. The OLED drive circuit 225 may be formed of, for example, a semiconductor device that drives the OLED panel 223, mounted on a substrate fixed to the back surface of the OLED panel 223; the temperature sensor 217 (fig. 6) described later is mounted on this substrate. The OLED panel 223 may instead have a structure in which light-emitting elements emitting white light are arranged in a matrix and color filters corresponding to R, G, and B are superimposed on them. An OLED panel 223 with a WRGB structure, which adds light-emitting elements emitting W (white) light to those emitting R, G, and B light, may also be used.
Fig. 3 is an explanatory diagram schematically showing the detailed structure of the right optical system 251. The right optical system 251 has one optical system for each of a plurality of focal lengths (SL, ML, and LL), together with a prism PSM and a projection lens L1. The image light L emitted from the OLED unit 221 has its focal length changed by one of the optical systems SL, ML, and LL, and is then guided to the right light guide plate 26 via the prism PSM and the projection lens L1. In the present embodiment, three focal lengths are predetermined: the 1st, 2nd, and 3rd focal lengths.
The optical systems SL, ML, and LL have the same structure as one another. Specifically, the short-distance optical system SL includes a collimator lens SL1 and a focal length changing lens SL2. The collimator lens SL1 is disposed on the OLED panel 223 side shown in fig. 2, and the focal length changing lens SL2 is disposed on the prism PSM side shown in fig. 3, the two lenses being arranged side by side. The collimator lens SL1 causes the image light L emitted from the OLED unit 221 to enter the focal length changing lens SL2 as a parallel light flux. The focal length changing lens SL2 changes the focal length of the image light L to the 1st focal length and causes the resulting image light L to enter the prism PSM. In the present embodiment, the 1st focal length is set to 50 cm.
The middle-distance optical system ML has a collimator lens ML1 and a focal length changing lens ML2. The lenses ML1 and ML2 have the same structures as the lenses SL1 and SL2 of the short-distance optical system SL. The image light L emitted from the OLED unit 221 is collimated by the collimator lens ML1 into a parallel light flux and enters the focal length changing lens ML2. The focal length changing lens ML2 changes the focal length of the image light L to the 2nd focal length and causes the resulting image light L to enter the prism PSM. In the present embodiment, the 2nd focal length is set to 1 meter.
The long-distance optical system LL has a collimator lens LL1 and a focal length changing lens LL2. The lenses LL1 and LL2 have the same structures as the lenses SL1 and SL2 of the short-distance optical system SL. The image light L emitted from the OLED unit 221 is collimated by the collimator lens LL1 into a parallel light flux and enters the focal length changing lens LL2. The focal length changing lens LL2 changes the focal length of the image light L to the 3rd focal length and causes the resulting image light L to enter the prism PSM. In the present embodiment, the 3rd focal length is set to 2 meters.
The prism PSM causes the image light L emitted from each of the optical systems SL, ML, and LL to enter the projection lens L1. The prism PSM is a so-called cross dichroic prism composed of four right-angle prisms whose right-angle faces are bonded together. A 1st reflection film m1 and a 2nd reflection film m2 are formed in a cross shape at the center of the prism PSM. The 1st reflection film m1 totally reflects only the image light L emitted from the middle-distance optical system ML toward the projection lens L1; it does not reflect the image light L emitted from the short-distance optical system SL or the long-distance optical system LL. The 2nd reflection film m2 totally reflects only the image light L emitted from the long-distance optical system LL toward the projection lens L1; it does not reflect the image light L emitted from the short-distance optical system SL or the middle-distance optical system ML. The image light L emitted from the short-distance optical system SL passes through both the 1st reflection film m1 and the 2nd reflection film m2 and enters the projection lens L1 directly.
The projection lens L1 directs the image light beams L of different focal lengths emitted from the prism PSM, parallel to one another, toward the right light guide plate 26. The projection lens L1 may be omitted.
As described above, the image light L emitted from the OLED unit 221 is guided to the right light guide plate 26 by the optical systems SL, ML, and LL and the prism PSM, so that a virtual screen can be set in the display unit 20 for each of the 1st, 2nd, and 3rd focal lengths. In the following description, the virtual screen corresponding to the 1st focal length is referred to as the 1st virtual screen, the one corresponding to the 2nd focal length as the 2nd virtual screen, and the one corresponding to the 3rd focal length as the 3rd virtual screen.
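For orientation, the three focal planes of this embodiment can also be expressed in diopters (the reciprocal of the distance in meters), the unit in which optical powers such as those of the focal length changing lenses SL2, ML2, and LL2 are commonly specified. The snippet below is illustrative only; the VS1 to VS3 names follow the reference symbols listed above.

    # Focal planes of the embodiment, in meters, keyed by virtual screen.
    virtual_screens = {"VS1": 0.5, "VS2": 1.0, "VS3": 2.0}
    for name, d_m in virtual_screens.items():
        print(f"{name}: {d_m} m -> {1.0 / d_m:.1f} D")
    # VS1: 0.5 m -> 2.0 D
    # VS2: 1.0 m -> 1.0 D
    # VS3: 2.0 m -> 0.5 D

Doubling the distance at each step halves the optical power, so the planes are spaced at 2.0 D, 1.0 D, and 0.5 D.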
A plurality of reflecting surfaces that reflect the image light L are formed on the optical path of the light guided inside the right light guide plate 26. The image light L is guided to the right eye RE side by multiple reflection inside the right light guide plate 26. A half mirror 261 (reflection surface) is formed on the right light guide plate 26 in front of the right eye RE. The image light L is reflected by the half mirror 261 and then emitted from the right light guide plate 26 to the right eye RE, and the image light L forms an image on the retina of the right eye RE, thereby allowing the user to see the image.
The left display unit 24 includes an OLED unit 241 and a left optical system 252, and is configured to make the left eye LE see an image (AR image). The OLED unit 241 emits image light. The left optical system 252 has a lens group or the like, and guides the image light L emitted from the OLED unit 241 to the left light guide plate 28. The OLED unit 241 has an OLED panel 243 and an OLED driving circuit 245 driving the OLED panel 243. The details of each part are the same as the OLED unit 221, the OLED panel 223, and the OLED driving circuit 225. A temperature sensor 239 (fig. 6) is mounted on the substrate fixed to the rear surface of the OLED panel 243. Further, the details of the left optical system 252 are the same as the right optical system 251 described above.
With the configuration described above, the HMD 100 functions as a see-through display device. The image light L reflected by the half mirror 261 and the external light OL transmitted through the right light guide plate 26 enter the user's right eye RE; the image light L reflected by the half mirror 281 and the external light OL transmitted through the left light guide plate 28 enter the left eye LE. The HMD 100 thus makes the image light L of internally processed images enter the user's eyes superimposed on the external light OL. As a result, the user can see the outside scene (the real world) through the right light guide plate 26 and the left light guide plate 28, and can see virtual images (AR images) formed by the image light L overlaid on that scene.
The right optical system 251 and the right light guide plate 26 are also collectively referred to as "right light guide unit", and the left optical system 252 and the left light guide plate 28 are also collectively referred to as "left light guide unit". The configurations of the right light guide unit and the left light guide unit are not limited to the above examples, and any form may be used as long as an image is formed in front of the eyes of the user using image light. For example, a diffraction grating may be used for the right light guide portion and the left light guide portion, or a transflective film may be used.
In fig. 1, the control device 10 and the display unit 20 are connected by a connection cable 40. The connection cable 40 is detachably connected to a connector provided at a lower portion of the control device 10, and is connected to various circuits inside the display unit 20 from a distal end of the left holding portion 23. The connection cable 40 has a metal cable or an optical fiber cable that transmits digital data. The connection cable 40 may also include a metal cable that transmits analog data. A connector 46 is provided in the middle of the connection cable 40.
The connector 46 is a jack for connecting a stereo mini plug, and the connector 46 and the control device 10 are connected by, for example, a wire for transmitting an analog voice signal. In the example of the present embodiment shown in fig. 1, the connector 46 is connected to the headphone microphone 30, and the headphone microphone 30 includes a right earphone 32 and a left earphone 34 constituting a stereo headphone, and a microphone 63.
For example, as shown in fig. 1, the microphone 63 is arranged so that its sound pickup portion faces the direction of the user's line of sight. The microphone 63 collects voice and outputs the voice signal to the voice interface 182 (fig. 6). The microphone 63 may be monaural or stereo, and may be directional or omnidirectional.
The control device 10 is a device for controlling the HMD 100. The control device 10 includes a lighting unit 12, a touch panel 14, a direction key 16, a determination key 17, and a power switch 18. The lighting unit 12 notifies the operation state (for example, power on/off) of the HMD 100 by the light emission state. As the lighting unit 12, for example, an LED (Light Emitting Diode) can be used.
The touch pad 14 detects touch operations on its operation surface and outputs signals corresponding to the detected content. Various touch pads, such as electrostatic, pressure-detecting, and optical types, can be used. The direction key 16 detects presses of the keys corresponding to the up, down, left, and right directions and outputs signals corresponding to the detected content. The determination key 17 detects a press and outputs a signal that confirms the content of the operation performed on the control device 10. The power switch 18 switches the power state of the HMD 100 by detecting a sliding operation of the switch.
Fig. 4 is a diagram showing the configuration of the main part of the display unit 20 as seen by the user. In fig. 4, the connection cable 40, the right earphone 32, and the left earphone 34 are not illustrated. In this state, the back surfaces of the right light guide plate 26 and the left light guide plate 28 are visible, and the half mirror 261 that irradiates image light to the right eye RE and the half mirror 281 that irradiates image light to the left eye LE appear as substantially quadrangular regions. The user sees the outside scene through the whole of the right light guide plate 26 and the left light guide plate 28, including the half mirrors 261 and 281, and sees rectangular display images at the positions of the half mirrors 261 and 281.
Fig. 5 is a diagram for explaining the angle of view of the camera 61. Fig. 5 schematically shows the camera 61, the right eye RE and the left eye LE of the user in a plan view, and the angle of view (imaging range) of the camera 61 is represented by θ. The angle of view θ of the camera 61 extends in the horizontal direction as shown in the figure, and also extends in the vertical direction as in a normal digital camera.
As described above, the camera 61 is disposed at the right end of the display unit 20 and captures the direction of the user's field of view (i.e., the front of the user). Therefore, the optical axis of the camera 61 lies in a direction that includes the line-of-sight directions of the right eye RE and the left eye LE. The external scene that the user can see while wearing the HMD 100 is not necessarily at infinity. For example, when the user gazes at an object OB with both eyes, the user's lines of sight are directed at the object OB, as indicated by reference signs RD and LD in the figure. In this case, the distance from the user to the object OB is usually about 30 cm to 10 m, and more often 1 m to 4 m. Accordingly, upper and lower limits on the distance from the user to the object OB during normal use may be determined as references for the HMD 100; these references may be obtained in advance and preset in the HMD 100, or set by the user. The optical axis and angle of view of the camera 61 are preferably set so that the object OB falls within the angle of view whenever its distance during normal use lies within those upper and lower limits.
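Although the calculation is not described in this excerpt, a gaze-point distance such as the one handled by the gaze-point distance calculation unit 157 can in principle be estimated from the convergence of the lines of sight RD and LD. The sketch below assumes a symmetric gaze geometry and an illustrative interpupillary distance; neither assumption comes from the patent.

    import math

    def gaze_distance_m(convergence_angle_rad, ipd_m=0.063):
        """d = (IPD / 2) / tan(angle / 2) when the gaze point lies on the
        midline between the eyes."""
        return (ipd_m / 2.0) / math.tan(convergence_angle_rad / 2.0)

    # About 3.6 degrees of convergence corresponds to roughly 1 m:
    print(gaze_distance_m(math.radians(3.6)))  # ~1.0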
In addition, the human visual field angle is generally about 200 degrees in the horizontal direction and about 125 degrees in the vertical direction. Within it, the effective field of view, which has excellent information receptivity, is about 30 degrees horizontally and about 20 degrees vertically. The stable field of fixation, within which a gaze point can be observed quickly and stably, is about 60 to 90 degrees horizontally and about 45 to 70 degrees vertically. Hence, when the gaze point is the object OB (fig. 5), the effective field of view is about 30 degrees horizontally and 20 degrees vertically around the lines of sight RD and LD, and the stable field of fixation is about 60 to 90 degrees horizontally and 45 to 70 degrees vertically. The actual field of view the user sees through the right light guide plate 26 and the left light guide plate 28 of the display unit 20 is called the real field of view (FOV). The real field of view is narrower than the visual field angle and the stable field of fixation, but wider than the effective field of view.
The angle of view θ of the camera 61 in the present embodiment is set to a range that can capture an image wider than the user's field of view. The angle of view θ is preferably set to a range that can capture at least a field wider than the user's effective field of view, and more preferably wider than the real field of view. It is still more preferably set to a range that can capture a field wider than the user's stable field of fixation, and most preferably wider than the visual field angle of both of the user's eyes. To this end, the camera 61 may be provided with a so-called wide-angle lens as its imaging lens so that it can capture a wide angle of view. The wide-angle lens may be a lens known as an ultra-wide-angle lens or a quasi-wide-angle lens. The camera 61 may also include a single-focus lens or a zoom lens, or a lens group consisting of a plurality of lenses.
Fig. 6 is a block diagram functionally showing the structure of the HMD 100. The control device 10 includes: a main processor 140 that executes a program to control the HMD 100; a storage unit; an input/output unit; sensors; an interface; and a power supply unit 130. The main processor 140 is connected to the storage unit, the input/output unit, the sensors, the interface, and the power supply unit 130, and is mounted on the controller board 120 built into the control device 10.
The storage unit includes a memory 118 and a nonvolatile storage unit 121. Memory 118 constitutes a work area for temporarily storing computer programs executed by main processor 140 and data processed by main processor 140. The nonvolatile memory section 121 is constituted by a flash memory or an eMMC (embedded Multi Media Card). The nonvolatile storage unit 121 stores a computer program executed by the main processor 140 and various data processed by the main processor 140. In the present embodiment, these storage units are mounted on the controller board 120.
The input/output unit includes the touch pad 14 and the operation unit 110. The operation unit 110 includes a direction key 16, a determination key 17, and a power switch 18 of the control device 10. The main processor 140 controls the respective input/output units and acquires signals output from the respective input/output units.
The sensors include a 6-axis sensor 111, a magnetic sensor 113, and a GNSS (Global Navigation Satellite System) receiver 115. The 6-axis sensor 111 is a motion sensor (inertial sensor) having a 3-axis acceleration sensor and a 3-axis gyro (angular velocity) sensor, and may be an IMU (Inertial Measurement Unit) in which these sensors are modularized. The magnetic sensor 113 is, for example, a 3-axis magnetic sensor. The GNSS receiver 115 detects the current position (longitude/latitude) of the control device 10 from navigation signals received from the artificial satellites constituting the GNSS. These sensors (the 6-axis sensor 111, the magnetic sensor 113, and the GNSS receiver 115) output detection values to the main processor 140 at a sampling frequency specified in advance. The timing at which each sensor outputs its detection value may follow an instruction from the main processor 140.
The interfaces include a wireless communication unit 117, a voice codec 180, an external connector 184, an external memory interface 186, a USB (Universal Serial Bus) connector 188, a sensor hub 192, an FPGA 194, and an interface 196. They function as interfaces with the outside.
The wireless communication unit 117 performs wireless communication between the HMD 100 and external apparatuses. The wireless communication unit 117 includes an antenna, an RF circuit, a baseband circuit, a communication control circuit, and the like (not shown), or is a device integrating them. It performs wireless communication based on standards such as Bluetooth (registered trademark) and wireless LAN including Wi-Fi (registered trademark).
The voice codec 180 is connected to the voice interface 182 and encodes/decodes voice signals input or output via the voice interface 182, which is an interface for inputting and outputting voice signals. The voice codec 180 may have an A/D converter that converts an analog voice signal into digital voice data and a D/A converter that performs the reverse conversion. The HMD 100 of the present embodiment outputs voice from the right earphone 32 and the left earphone 34 and collects voice with the microphone 63. The voice codec 180 converts digital voice data output from the main processor 140 into an analog voice signal and outputs it via the voice interface 182; it likewise converts an analog voice signal input to the voice interface 182 into digital voice data and outputs the data to the main processor 140.
The external connector 184 is a connector for connecting an external device (e.g., a personal computer, a smartphone, a game device, etc.) communicating with the main processor 140 to the main processor 140. The external device connected to the external connector 184 can be used for debugging a computer program executed by the main processor 140 and collecting a work log of the HMD 100, in addition to serving as a provider of content. The external connector 184 can take various forms. As the external connector 184, for example, an interface corresponding to wired connection such as a USB interface, a micro-USB interface, or a memory card interface, or an interface corresponding to wireless connection such as a wireless LAN interface or a Bluetooth interface can be used.
The external memory interface 186 is an interface that can be used to connect a removable type memory device. The external memory interface 186 includes, for example, a memory card slot for mounting a card-type recording medium and reading and writing data, and an interface circuit. The size, shape, specification, and the like of the card-type recording medium can be appropriately selected. The USB connector 188 is an interface capable of connecting a storage device based on the USB standard, a smart phone, a personal computer, and the like. The USB connector 188 includes, for example, a connector based on the USB standard and an interface circuit. The size, shape, version of the USB standard, and the like of the USB connector 188 can be appropriately selected.
The HMD 100 also has a vibrator 19. The vibrator 19 includes a motor with an eccentric rotor and the like (not shown), and generates vibration under the control of the main processor 140. For example, the HMD 100 vibrates the vibrator 19 in a predetermined vibration pattern when an operation on the operation unit 110 is detected or when the power of the HMD 100 is turned on or off. Instead of being provided in the control device 10, the vibrator 19 may be provided on the display unit 20 side, for example, in the right holding portion 21 (the right temple portion) of the display unit.
The sensor hub 192 and the FPGA 194 are connected to the display unit 20 via an interface (I/F) 196. The sensor hub 192 acquires detection values of various sensors included in the display unit 20 and outputs the detection values to the main processor 140. The FPGA 194 performs processing of data transmitted and received between the main processor 140 and each part of the display unit 20 and transmission via the interface 196. The interface 196 is connected to the right display unit 22 and the left display unit 24 of the display section 20, respectively. In the example of the present embodiment, the connection cable 40 is connected to the left holding portion 23, the wiring connected to the connection cable 40 is laid inside the display portion 20, and the right display unit 22 and the left display unit 24 are connected to the interface 196 of the control device 10, respectively.
The power supply unit 130 includes a battery 132 and a power supply control circuit 134, and supplies the electric power for operating the control device 10. The battery 132 is rechargeable. The power supply control circuit 134 detects the remaining capacity of the battery 132 and controls its charging. The power supply control circuit 134 is connected to the main processor 140 and outputs the detected remaining capacity and voltage of the battery 132 to the main processor 140. Power may also be supplied from the control device 10 to the display unit 20 based on the power supplied by the power supply unit 130. The main processor 140 may be configured to control the state of the power supply from the power supply unit 130 to each part of the control device 10 and the display unit 20.
The right display unit 22 has a display unit substrate 210, the OLED unit 221, the camera 61, an illuminance sensor 65, an LED indicator 67, and a temperature sensor 217. An interface (I/F) 211 connected to the interface 196, a receiving unit (Rx) 213, and an EEPROM (Electrically Erasable Programmable Read-Only Memory) 215 are mounted on the display unit substrate 210. The receiving unit 213 receives data input from the control device 10 via the interface 211. When receiving image data of an image to be displayed on the OLED unit 221, the receiving unit 213 outputs the received image data to the OLED drive circuit 225 (fig. 2).
The EEPROM 215 stores various data in a manner readable by the main processor 140. For example, it stores data on the light emission characteristics and display characteristics of the OLED units 221 and 241 of the display unit 20, and data on the sensor characteristics of the right display unit 22 and the left display unit 24. Specifically, it stores parameters related to gamma correction of the OLED units 221 and 241, data for compensating the detection values of the temperature sensors 217 and 239 described later, and the like. These data are generated by inspection at factory shipment of the HMD 100 and written into the EEPROM 215; after shipment, the main processor 140 reads the data from the EEPROM 215 and uses it in various processes.
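How this calibration data is applied is not detailed here; the following sketch merely illustrates the general idea of per-unit compensation using parameters of the kind stored in the EEPROM 215. The linear model, the function names, and the default values are assumptions for illustration.

    def compensate_temperature(raw_c, offset_c, gain=1.0):
        """Apply a per-unit calibration (gain, offset) to a raw reading
        from a temperature sensor such as 217 or 239."""
        return gain * raw_c + offset_c

    def gamma_correct(level, gamma=2.2):
        """Map a linear intensity in [0, 1] through a display gamma curve
        of the kind parameterized for the OLED units 221 and 241."""
        return level ** (1.0 / gamma)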
The camera 61 captures images in accordance with a signal input via the interface 211, and outputs the captured image data or a signal indicating the imaging result to the control device 10. As shown in fig. 1, the illuminance sensor 65 is provided at the end ER of the front frame 27 and is arranged to receive external light from in front of the user wearing the display unit 20. The illuminance sensor 65 outputs a detection value corresponding to the amount of received light (received light intensity). As shown in fig. 1, the LED indicator 67 is disposed near the camera 61 at the end ER of the front frame 27. The LED indicator 67 is lit while the camera 61 is capturing images, thereby indicating that imaging is in progress.
The temperature sensor 217 detects a temperature and outputs a voltage value or a resistance value corresponding to the detected temperature. The temperature sensor 217 is mounted on the back side of the OLED panel 223 (fig. 2). The temperature sensor 217 may be mounted on the same substrate as the OLED drive circuit 225, for example. With this structure, the temperature sensor 217 mainly detects the temperature of the OLED panel 223. The temperature sensor 217 may also be built into the OLED panel 223 or the OLED drive circuit 225 (fig. 2). For example, in the case where the OLED panel 223 is mounted together with the OLED drive circuit 225 as an integrated circuit on a single semiconductor chip (a so-called Si-OLED), the temperature sensor 217 may be mounted on that semiconductor chip.
The left display unit 24 has a display unit substrate 230, an OLED unit 241, and a temperature sensor 239. An interface (I/F) 231 connected to the interface 196, a receiving unit (Rx) 233, a 6-axis sensor 235, and a magnetic sensor 237 are mounted on the display unit substrate 230. The receiving unit 233 receives data input from the control device 10 via the interface 231. When receiving image data of an image to be displayed on the OLED unit 241, the receiving unit 233 outputs the received image data to the OLED drive circuit 245 (fig. 2).
The 6-axis sensor 235 is a motion sensor (inertial sensor) having a 3-axis acceleration sensor and a 3-axis gyroscope (angular velocity) sensor. The 6-axis sensor 235 may be an IMU in which the above sensors are integrated into a module. The magnetic sensor 237 is, for example, a 3-axis geomagnetic sensor. Since the 6-axis sensor 235 and the magnetic sensor 237 are provided in the display unit 20, they detect the movement of the user's head when the display unit 20 is worn on the head. The orientation of the display unit 20 (i.e., the field of view of the user) is determined based on the detected head movement.
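The patent does not state how the acceleration and angular velocity values are combined into a head orientation. As one illustration only, the following minimal Python sketch applies a standard complementary filter; the function name, the blend factor, and the axis conventions are assumptions, not the device's actual firmware.

```python
import math

def update_orientation(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Fuse 3-axis gyro and 3-axis accelerometer readings into a head pose.

    pitch, roll : current orientation estimate in radians
    gyro        : (gx, gy, gz) angular velocities in rad/s
    accel       : (ax, ay, az) accelerations in m/s^2
    dt          : sampling period in seconds
    alpha       : blend factor (assumed value), trusting the gyro short-term
    """
    # Short-term estimate: integrate the angular velocity.
    pitch_gyro = pitch + gyro[0] * dt
    roll_gyro = roll + gyro[1] * dt
    # Long-term, drift-free (but noisy) tilt estimate from gravity.
    ax, ay, az = accel
    pitch_acc = math.atan2(-ax, math.hypot(ay, az))
    roll_acc = math.atan2(ay, az)
    # Complementary filter: blend the two estimates.
    return (alpha * pitch_gyro + (1 - alpha) * pitch_acc,
            alpha * roll_gyro + (1 - alpha) * roll_acc)

# One update: head tilting up at 0.5 rad/s, sampled at 100 Hz (hypothetical values).
print(update_orientation(0.0, 0.0, (0.5, 0.0, 0.0), (0.0, 0.0, 9.81), dt=0.01))
```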
The temperature sensor 239 detects a temperature and outputs a voltage value or a resistance value corresponding to the detected temperature. The temperature sensor 239 is mounted on the back side of the OLED panel 243 (fig. 2). The temperature sensor 239 may be mounted on the same substrate as the OLED drive circuit 245, for example. With this structure, the temperature sensor 239 mainly detects the temperature of the OLED panel 243. The temperature sensor 239 may also be built into the OLED panel 243 or the OLED drive circuit 245 (fig. 2). The details are the same as those of the temperature sensor 217.
The camera 61, the illuminance sensor 65, and the temperature sensor 217 of the right display unit 22, and the 6-axis sensor 235, the magnetic sensor 237, and the temperature sensor 239 of the left display unit 24 are connected to the sensor hub 192 of the control device 10. The sensor hub 192 sets and initializes the sampling period of each sensor under the control of the main processor 140. The sensor hub 192 energizes each sensor, transmits control data, and acquires detection values in accordance with the sampling period of each sensor. The sensor hub 192 outputs the detection values of the sensors of the right display unit 22 and the left display unit 24 to the main processor 140 at predetermined timing. The sensor hub 192 may have a cache function of temporarily storing the detection values of the sensors. The sensor hub 192 may also have a function of converting the signal format or data format of the detection values of the sensors (for example, converting them into a unified format). The sensor hub 192 starts and stops energization of the LED indicator 67 under the control of the main processor 140, thereby turning the LED indicator 67 on or off.
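As one way to picture the sensor hub's role described above, the following Python sketch (all class, function, and sensor names are hypothetical, not the device's firmware API) polls each registered sensor at its own sampling period, caches the latest detection values, and hands them over in one batch:

```python
import time

class SensorHub:
    """Minimal sketch: poll each sensor at its own sampling period, cache the
    latest detection value, and output the cached values at a predetermined
    timing, mirroring the behavior attributed to the sensor hub 192."""

    def __init__(self):
        self.sensors = {}    # name -> (read_fn, period_s)
        self.cache = {}      # name -> latest detection value
        self.next_poll = {}  # name -> time of the next acquisition

    def register(self, name, read_fn, period_s):
        self.sensors[name] = (read_fn, period_s)
        self.next_poll[name] = 0.0

    def poll(self, now):
        # Acquire a detection value from every sensor whose period has elapsed.
        for name, (read_fn, period_s) in self.sensors.items():
            if now >= self.next_poll[name]:
                self.cache[name] = read_fn()
                self.next_poll[name] = now + period_s

    def flush(self):
        # Hand the cached detection values over to the main processor side.
        return dict(self.cache)

hub = SensorHub()
hub.register("illuminance", lambda: 312.0, period_s=0.1)  # hypothetical reader
hub.register("temperature", lambda: 36.5, period_s=1.0)   # hypothetical reader
hub.poll(now=time.monotonic())
print(hub.flush())
```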
The line-of-sight detection units 62 are disposed at positions corresponding to the areas below the outer corners of the user's left and right eyes. The left and right line-of-sight detection units 62 each have an infrared light emitting unit and an infrared light receiving unit (not shown). The right line-of-sight detection unit 62 receives infrared rays that are emitted from its infrared light emitting unit and reflected by the user's right eye. The left line-of-sight detection unit 62 receives infrared rays that are emitted from its infrared light emitting unit and reflected by the user's left eye. The left and right line-of-sight detection units 62 detect the lines of sight of the user's left and right eyes based on the intensity of the received infrared rays. The line-of-sight detection units 62 output the detected lines of sight to the control device 10 via the interface 196. The reflectance of infrared light differs depending on whether the infrared light strikes the iris (the dark part of the eye), the eyelid, or the white of the eye. Therefore, the line-of-sight detection units 62 can track the movement of the user's line of sight based on the intensity of the received infrared rays. The line-of-sight detection units 62 need not be provided on both sides; a single unit may be provided on either the left or right side.
Fig. 7 is a block diagram functionally showing the configuration of the control device 10. The control device 10 functionally includes a storage function unit 122 and a control function unit 150. The storage function unit 122 is a logical storage unit configured by the nonvolatile storage unit 121 (fig. 6). Instead of using only the nonvolatile storage unit 121, the storage function unit 122 may be configured to use the EEPROM 215 and the memory 118 in combination with the nonvolatile storage unit 121. The control function unit 150 is realized by the main processor 140 executing a computer program, that is, by hardware and software operating in cooperation.
The storage function unit 122 stores various data used in the processing of the control function unit 150. Specifically, the storage function unit 122 of the present embodiment stores setting data 123 and content data 124. The setting data 123 includes various setting values related to the operation of the HMD 100. For example, the setting data 123 includes parameters, determinants, calculation formulas, LUTs (Look Up Tables), and the like used when the control function unit 150 controls the HMD 100.
The content data 124 includes data of content (image data, video data, audio data, and the like) containing the images and videos to be displayed on the display unit 20 under the control of the control function unit 150. The content data 124 may also contain bidirectional content data. Bidirectional content is content of the following type: a user operation is acquired through the operation unit 110, the control function unit 150 executes processing corresponding to the acquired operation, and content corresponding to that processing is displayed on the display unit 20. In this case, the content data may include image data of a menu screen for acquiring user operations, data specifying the processing corresponding to each item of the menu screen, and the like.
The control function unit 150 executes various processes using the data stored in the storage function unit 122, and functions as an OS (Operating System) 143, an image processing unit 145, a display control unit 147, an imaging control unit 149, an input/output control unit 151, a distance-classified image generation unit 153, an intermediate distance image generation unit 155, and a gaze point distance calculation unit 157. In the present embodiment, each functional unit other than the OS 143 is configured as a computer program executed on the OS 143.
The image processing unit 145 generates signals to be transmitted to the right display unit 22 and the left display unit 24 based on the image data of the images and videos displayed on the display unit 20. The signals generated by the image processing unit 145 may be a vertical synchronization signal, a horizontal synchronization signal, a clock signal, an analog image signal, and the like. In addition to being realized by the main processor 140 executing a computer program, the image processing unit 145 may be configured by hardware different from the main processor 140 (for example, a digital signal processor (DSP)).
In addition, the image processing section 145 may perform resolution conversion processing, image adjustment processing, 2D/3D conversion processing, and the like as necessary. The resolution conversion process is a process of converting the resolution of the image data into a resolution suitable for the right display unit 22 and the left display unit 24. The image adjustment processing is processing for adjusting the brightness and saturation of the image data. The 2D/3D conversion processing is processing of generating two-dimensional image data from three-dimensional image data, or generating three-dimensional image data from two-dimensional image data. When these processes are performed, the image processing unit 145 generates a signal for displaying an image from the processed image data, and transmits the signal to the display unit 20 via the connection cable 40.
The display control unit 147 generates control signals for controlling the right display unit 22 and the left display unit 24, and controls the generation and emission of image light in each of the right display unit 22 and the left display unit 24 by these control signals. Specifically, the display control unit 147 controls the OLED drive circuits 225 and 245 to display images on the OLED panels 223 and 243. Based on the signals output from the image processing unit 145, the display control unit 147 controls the timing at which the OLED drive circuits 225 and 245 draw on the OLED panels 223 and 243, controls the luminance of the OLED panels 223 and 243, and the like. In the display control process described later, the display control unit 147 causes each virtual screen of the display unit 20 to display the distance-classified display images generated by the distance-classified image generation unit 153.
The imaging control unit 149 controls the camera 61 to perform imaging, generates captured image data, and temporarily stores it in the storage function unit 122. When the camera 61 is configured as a camera unit including a circuit for generating captured image data, the imaging control unit 149 acquires the captured image data from the camera 61 and temporarily stores it in the storage function unit 122.
The input/output control unit 151 appropriately controls the touch panel 14 (fig. 1), the direction key 16, and the enter key 17, and receives input commands from them. A received command is output to the OS 143, or to a computer program running on the OS 143 together with the OS 143.
The distance-classified image generation unit 153 generates a display image (hereinafter referred to as a "distance-classified display image") for displaying, on a virtual screen, a display object representing a display object existing between two adjacent focal lengths among the 3 focal lengths. The distance-classified display images are described in detail later.
The intermediate distance image generation unit 155 generates a display image (hereinafter referred to as an "intermediate distance display image") for displaying, on the virtual screen, a display object representing a display object existing at an intermediate position between the focal lengths of the 3 stages. At this time, the intermediate distance image generation unit 155 synthesizes the colors of the display objects existing at the 1st focal length, the 2nd focal length, and the 3rd focal length, and sets the synthesized color as the color of the intermediate distance display image. Alternatively, the intermediate distance image generation unit 155 may combine the luminances (brightnesses) of the display objects existing at the 1st focal length, the 2nd focal length, and the 3rd focal length, and set the combined luminance as the luminance of the intermediate distance display image.
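The color synthesis is not specified further in the patent. As an illustration under the assumption of a simple weighted average in RGB, a minimal Python sketch (the function name and example colors are hypothetical):

```python
def synthesize_intermediate_color(colors, weights=None):
    """Blend the RGB colors of the display objects at the 1st, 2nd, and 3rd
    focal lengths into one color for the intermediate distance display image.
    A weighted average is one plausible choice; the patent does not fix it."""
    if weights is None:
        weights = [1.0 / len(colors)] * len(colors)  # equal weights by default
    total = sum(weights)
    return tuple(
        round(sum(w * c[ch] for c, w in zip(colors, weights)) / total)
        for ch in range(3)
    )

# Hypothetical colors of the objects at the 1st, 2nd, and 3rd focal lengths.
print(synthesize_intermediate_color([(200, 30, 30), (30, 200, 30), (30, 30, 200)]))
# -> (87, 87, 87)
```

The same averaging could be applied to scalar luminance values instead of RGB triples, matching the alternative described above.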
The gaze point distance calculation unit 157 calculates a gaze point distance. In the present embodiment, the "gaze point distance" refers to the distance between the user and the gaze point at which the user gazes. The "gaze point" is the point at which the left and right virtual lines, each connecting the fovea centralis (the point on the retina with the highest visual acuity) of the user's eye and the center of the iris, intersect. The gaze point distance calculation unit 157 detects the gaze point of the user from the lines of sight detected by the line-of-sight detection units 62, and calculates the distance between the detected gaze point and the user's eyes as the gaze point distance. The gaze point distance calculation unit 157 may calculate the gaze point distance repeatedly. In this way, the gaze point distance can be determined in real time.
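The gaze point and its distance can be pictured as a triangulation from the two detected lines of sight. The patent gives no formula; the sketch below computes the midpoint of the closest approach between the two gaze rays, which is an assumption for the practical case where the rays do not intersect exactly (all names are hypothetical):

```python
import numpy as np

def gaze_point_distance(origin_l, dir_l, origin_r, dir_r):
    """Estimate the gaze point from two gaze rays and return its distance
    from the midpoint between the eyes.

    origin_l/origin_r : 3D positions of the left/right eyes
    dir_l/dir_r       : gaze direction vectors (need not be unit length)
    """
    o_l, o_r = np.asarray(origin_l, float), np.asarray(origin_r, float)
    d_l, d_r = np.asarray(dir_l, float), np.asarray(dir_r, float)
    w = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b
    if abs(denom) < 1e-9:          # parallel lines of sight: gazing at infinity
        return float("inf")
    t_l = (b * e - c * d) / denom  # parameter of the closest point on the left ray
    t_r = (a * e - b * d) / denom  # parameter of the closest point on the right ray
    gaze_point = 0.5 * ((o_l + t_l * d_l) + (o_r + t_r * d_r))
    return float(np.linalg.norm(gaze_point - 0.5 * (o_l + o_r)))

# Eyes 6 cm apart, both converging on a point 1 m ahead (hypothetical numbers).
print(gaze_point_distance([-0.03, 0, 0], [0.03, 0, 1], [0.03, 0, 0], [-0.03, 0, 1]))
# -> 1.0
```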
A2. Augmented reality representation:
Fig. 8 is an explanatory diagram illustrating an example of augmented reality display by the HMD 100. Fig. 8 illustrates the field of view VT of the user. As described above, the image light guided to both eyes of the user of the HMD 100 forms images on the user's retinas, whereby the user sees the object image AI of the display object as augmented reality (AR) within the display region PN. In the example shown in fig. 8, the object image AI is a menu screen of the OS of the HMD 100. The menu screen includes, for example, icon images IC for starting applications such as "analog watch", "message", "music", "navigation", "camera", "browser", "calendar", and "telephone". The right and left light guide plates 26 and 28 transmit light from the outside, thereby allowing the user to see the external scene SC. In this way, in the portion of the field of view VT where the object image AI is displayed, the user of the HMD 100 according to the present embodiment sees the object image AI overlapping the external scene SC. In the portion of the field of view VT where the object image AI is not displayed, only the external scene SC is seen.
Fig. 9 is an explanatory diagram schematically showing an example of display objects. Fig. 9 shows the field of view VT seen by the user of the HMD 100 through the display unit 20. In fig. 9, the external scene SC is not shown. In the display region PN, 3 display objects Ob1, Ob2, and Ob3 are displayed. In the present embodiment, a "display object" refers to a stereoscopic (3D) display image representing an object to be displayed. Specifically, the 1st display object Ob1 is a stereoscopic display image representing an "apple". The 2nd display object Ob2 is a stereoscopic display image representing a "tv". The 3rd display object Ob3 is a stereoscopic display image representing a "timepiece".
As shown in fig. 9, the 3 display objects Ob1, Ob2, and Ob3 are displayed so as to be arranged in the order of the 1st display object Ob1, the 2nd display object Ob2, and the 3rd display object Ob3 from the display unit 20 side (the side close to the user) in the visual field direction of the user of the HMD 100. The distances from the left eye LE and the right eye RE of the user of the HMD 100 to the display objects Ob1, Ob2, and Ob3 therefore differ for each of the display objects Ob1, Ob2, and Ob3.
Fig. 10 is an explanatory diagram schematically showing the display distances of the display objects Ob1, Ob2, and Ob3. In the present embodiment, the "display distance" refers to the distance between the viewpoint positions of the user's right eye RE and left eye LE and the display position of a display object. As shown in fig. 10, the display distance of the 1st display object Ob1 is the 1st display distance W1. The 1st display distance W1 is 50 cm, which is the same as the 1st focal length set by the focal length changing lens SL2 of the short-distance optical system SL in the display unit 20. The display distance of the 2nd display object Ob2 is the 2nd display distance W2. The 2nd display distance W2 is 1 meter, which is the same as the 2nd focal length set by the focal length changing lens ML2 of the intermediate-distance optical system ML in the display unit 20. The display distance of the 3rd display object Ob3 is the 3rd display distance W3. The 3rd display distance W3 is 2 meters, which is the same as the 3rd focal length set by the focal length changing lens LL2 of the long-distance optical system LL in the display unit 20.
As shown in figs. 9 and 10, when the display objects Ob1, Ob2, and Ob3, which differ in display distance, are displayed on a display screen corresponding to a single focal length, an optical mismatch may occur because the focal lengths and the convergence angles of the display objects Ob1, Ob2, and Ob3 differ from one another. In the present embodiment, however, the display control process described later cuts out the display objects Ob1, Ob2, and Ob3 between the focal lengths to generate display images, and displays each generated display image on the virtual screen corresponding to the farther of the two adjacent focal lengths in the visual field direction of the user of the HMD 100, whereby the optical mismatch can be reduced. The specific processing is described below.
A3. Display control processing:
Fig. 11 is a flowchart showing the processing steps of the display control process. The display control process is started when the user turns on the power switch 18 of the control device 10. The distance-classified image generation unit 153 generates a 3DCG object map (step S100). The 3DCG object map associates 3-dimensional computer graphics objects with their positions (coordinates). The 3DCG object map may be generated by a known method, for example, by arranging each of the display objects Ob1, Ob2, and Ob3 in a 3-dimensional computer graphics virtual environment according to its position relative to objects in the external scene SC (objects existing in the real space) viewed through the display unit 20.
Fig. 12 is an explanatory diagram schematically showing the 3DCG object map OM. As shown in fig. 12, the display objects Ob1, Ob2, and Ob3 are arranged in the XYZ space of the 3DCG object map OM, in the same positional relationship as shown in figs. 9 and 10. By specifying the display positions of the display objects Ob1, Ob2, and Ob3 with reference to objects existing in the real space as viewed from the viewpoint position of the user, the distance-classified image generation unit 153 can arrange the display objects Ob1, Ob2, and Ob3 in the 3DCG object map OM at positions that coincide with the positions of those real objects. As shown in fig. 12, a left-eye virtual camera CL and a right-eye virtual camera CR are set in the 3DCG object map OM. The virtual cameras CL and CR are used to generate the distance-classified display images described later.
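As a concrete picture of the 3DCG object map OM and the virtual cameras, the following minimal Python sketch lays out the three display objects of figs. 9 and 10 at their display distances (the x/y coordinates and eye positions are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class DisplayObject:
    name: str
    position: tuple       # (x, y, z) coordinates in the 3DCG object map OM

@dataclass
class VirtualCamera:
    position: tuple       # set to the viewpoint position of one of the user's eyes
    direction: tuple      # set to the visual field direction of the user

# Depth (z) is the distance along the visual field direction.
object_map_om = [
    DisplayObject("Ob1 (apple)",     ( 0.0, -0.1, 0.5)),  # 1st display distance: 50 cm
    DisplayObject("Ob2 (tv)",        ( 0.2,  0.0, 1.0)),  # 2nd display distance: 1 m
    DisplayObject("Ob3 (timepiece)", (-0.2,  0.1, 2.0)),  # 3rd display distance: 2 m
]
# Virtual cameras CL and CR at assumed eye positions roughly 6 cm apart.
camera_cl = VirtualCamera(position=(-0.03, 0.0, 0.0), direction=(0.0, 0.0, 1.0))
camera_cr = VirtualCamera(position=( 0.03, 0.0, 0.0), direction=(0.0, 0.0, 1.0))

for obj in object_map_om:
    print(obj.name, "at depth", obj.position[2], "m")
```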
As shown in fig. 11, the distance-classified image generation unit 153 sets the viewpoint positions of the left eye LE and the right eye RE (step S110). Specifically, the distance-classified image generation unit 153 sets the position of the left-eye virtual camera CL in the 3DCG object map OM to the viewpoint position of the left eye LE of the user of the HMD 100, and sets the position of the right-eye virtual camera CR to the viewpoint position of the right eye RE. At this time, the distance-classified image generation unit 153 sets the orientation of each of the virtual cameras CL and CR to the visual field direction of the user of the HMD 100. Step S110 may be executed simultaneously with step S100.
As shown in fig. 11, the distance-classified image generation unit 153 generates the distance-classified display images (step S120). Specifically, the distance-classified image generation unit 153 captures images of the display objects Ob1, Ob2, and Ob3 with the virtual cameras CL and CR, and cuts out, from the captured images, the display objects existing between two adjacent focal lengths. More specifically, the distance-classified image generation unit 153 cuts out, from the 3DCG object map OM, the display objects located between a focal length of 0 cm and the 1st focal length, and generates the distance-classified display image corresponding to the 1st focal length (hereinafter referred to as the "1st distance-classified display image"). In the present embodiment, a "focal length of 0 cm" means that the distance from the viewpoint position of the user is 0 cm. As described above, since the positions of the virtual cameras CL and CR are set to the viewpoint positions of the left eye LE and the right eye RE of the user of the HMD 100, the distance-classified image generation unit 153 cuts out the display objects existing between the virtual cameras CL and CR and the 1st focal length.
The distance-classified image generation unit 153 cuts out, from the 3DCG object map OM, the display objects located between the 1st focal length and the 2nd focal length, and generates the distance-classified display image corresponding to the 2nd focal length (hereinafter referred to as the "2nd distance-classified display image"). The distance-classified image generation unit 153 also cuts out, from the 3DCG object map OM, the display objects located between the 2nd focal length and the 3rd focal length, and generates the distance-classified display image corresponding to the 3rd focal length (hereinafter referred to as the "3rd distance-classified display image").
For example, in the example shown in fig. 12, an image in which the 1st display object Ob1 is cut out is generated as the 1st distance-classified display image. An image in which the 2nd display object Ob2 is cut out is generated as the 2nd distance-classified display image, and an image in which the 3rd display object Ob3 is cut out is generated as the 3rd distance-classified display image. In addition to the display objects existing between the 2nd focal length and the 3rd focal length, display objects existing beyond the 3rd focal length in the visual field direction of the user may also be cut out into the 3rd distance-classified display image.
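The clipping in step S120 can be viewed as binning each display object by its distance from the viewpoint. The patent describes this in terms of cut-out images; the minimal Python sketch below (function and variable names are hypothetical) reduces it to the band-assignment logic, where band i corresponds to the i-th distance-classified display image and hence to the virtual screen of the i-th, farther focal length:

```python
def classify_by_distance(objects, focal_lengths):
    """Bin display objects into the bands between adjacent focal lengths.

    objects       : list of (name, distance_from_viewpoint_m) pairs
    focal_lengths : ascending focal lengths; band i runs up to focal_lengths[i]
    Objects farther than the last focal length join the last band, matching
    the note that such objects are cut out into the 3rd image."""
    bands = [[] for _ in focal_lengths]
    for name, dist in objects:
        index = len(focal_lengths) - 1            # default: deepest band
        for i, far in enumerate(focal_lengths):
            if dist <= far:
                index = i
                break
        bands[index].append(name)
    return bands

# Ob1/Ob2/Ob3 at the display distances of fig. 10, plus a hypothetical
# object beyond the 3rd focal length.
print(classify_by_distance(
    [("Ob1", 0.5), ("Ob2", 1.0), ("Ob3", 2.0), ("far object", 5.0)],
    focal_lengths=[0.5, 1.0, 2.0]))
# -> [['Ob1'], ['Ob2'], ['Ob3', 'far object']]
```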
In step S120, when the display objects Ob1, Ob2, and Ob3 completely overlap one another in the visual field direction of the user, the distance-classified image generation unit 153 cuts out only the display object located on the near side in the visual field direction, and does not cut out the display objects located behind it. When the display objects Ob1, Ob2, and Ob3 partially overlap as viewed from the visual field direction of the user, the distance-classified image generation unit 153 cuts out the display object located on the near side as it is, and cuts out the display objects located on the far side so as not to overlap the display object on the near side. That is, for a display object located on the far side in the visual field direction, the distance-classified image generation unit 153 excludes the portion overlapped by the display object on the near side and cuts out only the remaining portion. In this way, only the portions visible to the virtual cameras CL and CR are cut out, and the portions invisible to them are not, which reduces the discomfort given to the user.
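This overlap handling amounts to a per-pixel visibility test against the nearest object as seen from a virtual camera. The patent does not give an algorithm; the sketch below (hypothetical names, with NumPy arrays standing in for rendered depth maps) shows one way the hidden portions could be masked out:

```python
import numpy as np

def mask_hidden_parts(depth_maps):
    """Per-pixel occlusion for the clipped display objects.

    depth_maps : dict of name -> 2D array of per-pixel depths as seen by a
                 virtual camera (np.inf where the object does not cover the pixel).
    Returns one visibility mask per object: a pixel survives only where that
    object is the nearest one, so parts hidden behind a nearer object are cut out."""
    names = list(depth_maps)
    stack = np.stack([depth_maps[n] for n in names])  # (objects, H, W)
    nearest = stack.argmin(axis=0)                    # frontmost object per pixel
    covered = np.isfinite(stack.min(axis=0))          # pixel covered by anything?
    return {n: (nearest == i) & covered for i, n in enumerate(names)}

# Two hypothetical 1x3-pixel objects: "front" hides the middle pixel of "back".
masks = mask_hidden_parts({
    "front": np.array([[np.inf, 0.5, np.inf]]),
    "back":  np.array([[2.0,    2.0, 2.0   ]]),
})
print(masks["back"])  # [[ True False  True]] -> overlapped portion removed
```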
As shown in fig. 11, the display control unit 147 causes the generated distance-classified display images to be displayed on the virtual screens (step S130). Specifically, the display control unit 147 simultaneously displays each distance-classified display image on the virtual screen corresponding to the farther of its two adjacent focal lengths in the visual field direction of the user. For example, the focal lengths corresponding to the 1st distance-classified display image are 0 cm and the 1st focal length, and the farther of these in the visual field direction of the user is the 1st focal length; accordingly, the 1st distance-classified display image is displayed on the 1st virtual screen corresponding to the 1st focal length. Similarly, the focal lengths corresponding to the 2nd distance-classified display image are the 1st and 2nd focal lengths, and the farther of these is the 2nd focal length, so the 2nd distance-classified display image is displayed on the 2nd virtual screen corresponding to the 2nd focal length. The focal lengths corresponding to the 3rd distance-classified display image are the 2nd and 3rd focal lengths, and the farther of these is the 3rd focal length, so the 3rd distance-classified display image is displayed on the 3rd virtual screen corresponding to the 3rd focal length.
Fig. 13 is an explanatory view schematically showing the field of view VT of the user after step S130 is executed. In fig. 13, the external scene SC is not illustrated, as in fig. 9. As shown in fig. 13, the distance-classified display images Im1, Im2, and Im3 are displayed in the display region PN. The 1st distance-classified display image Im1 is displayed on the 1st virtual screen VS1 shown by a one-dot chain line. The 2nd distance-classified display image Im2 is displayed on the 2nd virtual screen VS2 shown by a broken line. The 3rd distance-classified display image Im3 is displayed on the 3rd virtual screen VS3 shown by a two-dot chain line. As described above, since the distance-classified display images Im1, Im2, and Im3 are displayed on the virtual screens VS1, VS2, and VS3 corresponding to mutually different focal lengths, the user can view each of the distance-classified display images Im1, Im2, and Im3 with reduced focal length deviation and convergence angle deviation when gazing at it.
Fig. 14 is an explanatory view schematically showing the field of view VT of the user when the user is gazing at the 1st distance-classified display image Im1. In fig. 14, the external scene SC is not illustrated, as in fig. 13. As shown in fig. 14, when the user is gazing at the 1st distance-classified display image Im1, the display distance of the 1st display object Ob1 (the 1st display distance W1) is the same as the 1st focal length, and therefore the 1st distance-classified display image Im1 is in focus on the 1st virtual screen VS1. In contrast, the 2nd distance-classified display image Im2 is displayed on the 2nd virtual screen VS2 corresponding to the 2nd focal length, which is greater than the 1st display distance W1, and the 3rd distance-classified display image Im3 is displayed on the 3rd virtual screen VS3 corresponding to the 3rd focal length, which is also greater than the 1st display distance W1; the 2nd distance-classified display image Im2 and the 3rd distance-classified display image Im3 therefore appear blurred. In fig. 14, this blurring is shown schematically by hatching the 2nd distance-classified display image Im2 and the 3rd distance-classified display image Im3.
Fig. 15 is an explanatory view schematically showing the field of view VT of the user when the user is gazing at the 2nd distance-classified display image Im2. In fig. 15, the external scene SC is not illustrated, as in fig. 14. In fig. 15, as in fig. 14, the 1st and 3rd distance-classified display images Im1 and Im3 are hatched to show schematically that they appear blurred due to the focal deviation. As shown in fig. 15, when the user is gazing at the 2nd distance-classified display image Im2, the display distance of the 2nd display object Ob2 (the 2nd display distance W2) is the same as the 2nd focal length, and therefore the 2nd distance-classified display image Im2 is in focus on the 2nd virtual screen VS2. In contrast, the 1st distance-classified display image Im1 is displayed on the 1st virtual screen VS1 corresponding to the 1st focal length, which is smaller than the 2nd display distance W2, and the 3rd distance-classified display image Im3 is displayed on the 3rd virtual screen VS3 corresponding to the 3rd focal length, which is greater than the 2nd display distance W2; the 1st and 3rd distance-classified display images Im1 and Im3 therefore appear blurred.
Fig. 16 is an explanatory view schematically showing the field of view VT of the user when the user is gazing at the 3rd distance-classified display image Im3. In fig. 16, the external scene SC is not illustrated, as in fig. 14. In fig. 16, as in fig. 14, the 1st distance-classified display image Im1 and the 2nd distance-classified display image Im2 are hatched to show schematically that they appear blurred due to the focal deviation. As shown in fig. 16, when the user is gazing at the 3rd distance-classified display image Im3, the display distance of the 3rd display object Ob3 (the 3rd display distance W3) is the same as the 3rd focal length, and therefore the 3rd distance-classified display image Im3 is in focus on the 3rd virtual screen VS3. In contrast, the 1st distance-classified display image Im1 is displayed on the 1st virtual screen VS1 corresponding to the 1st focal length, and the 2nd distance-classified display image Im2 is displayed on the 2nd virtual screen VS2 corresponding to the 2nd focal length, both of which are smaller than the 3rd display distance W3; the 1st and 2nd distance-classified display images Im1 and Im2 therefore appear blurred.
As shown in fig. 11, after step S130 is performed, the process returns to step S100 described above.
According to the HMD 100 of the present embodiment described above, the distance-classified display images Im1, Im2, and Im3 for displaying, on the virtual screens VS1, VS2, and VS3, the display objects Ob1, Ob2, and Ob3 existing between adjacent focal lengths are generated, and each generated distance-classified display image is displayed on the virtual screen corresponding to the farther of the two adjacent focal lengths in the visual field direction of the user of the HMD 100. Optical mismatch due to convergence angle deviation and focal length deviation can thereby be reduced.
Further, when the plurality of display objects Ob1, Ob2, and Ob3 overlap as viewed from the visual field direction, part of the display object located on the far side in the visual field direction is not displayed; the discomfort given to the user can therefore be reduced compared with a configuration in which the hidden portions of the display objects on the far side are also displayed.
B. Embodiment 2:
The hardware configuration of the head-mounted display device 100 according to embodiment 2 is the same as that of the head-mounted display device 100 according to embodiment 1, and detailed description thereof is therefore omitted.
The control function unit according to embodiment 2 differs from the control function unit 150 according to embodiment 1 in the following respect: the display control unit 147 causes the entire display object representing a display object that exists continuously across a plurality of focal lengths (hereinafter referred to as a "continuous display object") to be displayed on the virtual screen corresponding to the focal length closest to the gaze point distance. The other configurations of the control function unit according to embodiment 2 are the same as those of the control function unit 150 according to embodiment 1, and detailed description thereof is therefore omitted.
Fig. 17 is an explanatory diagram schematically showing the display distances of the display objects Ob4, Ob5, and ROb1 of embodiment 2. As shown in fig. 17, the 4th display object Ob4 is a stereoscopic display image representing a "foliage plant". The continuous display object ROb1 is a stereoscopic display image representing a "table". The 5th display object Ob5 is a stereoscopic display image representing an "airplane". The display distance of the 4th display object Ob4 is the 1st display distance W1. The display distance of the continuous display object ROb1 spans the entire range from the 1st display distance W1 to the 3rd display distance W3. The display distance of the 5th display object Ob5 is the 3rd display distance W3. As in embodiment 1, the display distances W1, W2, and W3 correspond to the 1st, 2nd, and 3rd focal lengths described above, respectively.
Fig. 18 is a flowchart showing the processing procedure of the display control process of embodiment 2. The display control process of embodiment 2 differs from the processing steps of the display control process of embodiment 1 shown in fig. 11 in that steps S125, S131, and S135 are additionally executed. The other steps of the display control process according to embodiment 2 are the same as those of embodiment 1; the same steps are therefore assigned the same reference numerals and detailed description thereof is omitted.
As shown in fig. 18, when the distance-classified display images have been generated (step S120), the distance-classified image generation unit 153 determines whether or not a display object is the continuous display object ROb1 (step S125). Specifically, the distance-classified image generation unit 153 analyzes the images captured by the virtual cameras CL and CR to determine whether or not the display object exists continuously across a plurality of focal lengths. If it is determined that the display object is not the continuous display object ROb1 (no in step S125), step S130 is executed. On the other hand, if it is determined that the display object is the continuous display object ROb1 (yes in step S125), the gaze point distance calculation unit 157 calculates the gaze point distance (step S131). Specifically, the line-of-sight detection units 62 first detect the line of sight of the user. The gaze point distance calculation unit 157 then calculates the gaze point from the detected line of sight, and calculates the gaze point distance.
The display control unit 147 reduces and displays the distance-classified display image (step S135). Specifically, the display control unit 147 causes the distance-classified display image of the continuous display object ROb1 (hereinafter referred to as the "continuous distance-classified display image") to be displayed on the virtual screen corresponding to whichever of the 1st, 2nd, and 3rd focal lengths is closest to the calculated gaze point distance. Since the continuous display object ROb1 is continuous across a plurality of focal lengths, the display control unit 147 reduces the continuous distance-classified display image to match the size of the virtual screen and displays it on that single virtual screen.
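Steps S131 and S135 reduce to two small computations: choosing the virtual screen whose focal length is nearest the gaze point distance, and scaling the image to fit that screen. A minimal sketch follows, with hypothetical names and a single height value standing in for the screen and object sizes (the patent only states that the image is reduced to match the screen):

```python
def place_continuous_object(focal_lengths, gaze_distance,
                            object_extent_m, screen_extent_m):
    """Pick the virtual screen closest to the gaze point distance and
    compute the reduction factor for the continuous display object."""
    screen = min(range(len(focal_lengths)),
                 key=lambda i: abs(focal_lengths[i] - gaze_distance))
    scale = min(1.0, screen_extent_m / object_extent_m)  # never enlarge
    return screen, scale

# Gazing 0.6 m ahead with focal lengths of 0.5/1/2 m: the 1st screen is chosen
# and the object is halved to fit (hypothetical sizes).
print(place_continuous_object([0.5, 1.0, 2.0], gaze_distance=0.6,
                              object_extent_m=1.8, screen_extent_m=0.9))
# -> (0, 0.5)
```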
Fig. 19 is an explanatory view schematically showing the field of view VT of the user after step S135 is executed. In fig. 19, the external scene SC is not shown, as in fig. 13. As shown in fig. 19, the display images Im4, RIm1, and Im5 are displayed in the display region PN. As shown in fig. 17, since the 4th display object Ob4 is not the continuous display object ROb1, step S130 described above is executed, and, as shown in fig. 19, the 4th distance-classified display image Im4 is displayed on the 1st virtual screen VS1 corresponding to the 1st focal length. Likewise, since the 5th display object Ob5 is not the continuous display object ROb1, step S130 is executed and the 5th distance-classified display image Im5 is displayed on the 3rd virtual screen VS3 corresponding to the 3rd focal length.
On the other hand, as shown in fig. 17, since the continuous display object ROb1 spans the entire range from the 1st display distance W1 to the 3rd display distance W3, step S135 described above is executed, and the continuous distance-classified display image RIm1 is displayed on the virtual screen corresponding to the focal length closest to the calculated gaze point distance (the 1st virtual screen VS1 in the example shown in fig. 19). The continuous distance-classified display image RIm1 is displayed in a reduced size so that the entire continuous display object ROb1 is displayed on the 1st virtual screen VS1.
The HMD 100 according to embodiment 2 described above achieves the same effects as embodiment 1. Further, since the entire continuous display object ROb1, which exists continuously across a plurality of focal lengths, is displayed on the 1st virtual screen VS1 corresponding to the focal length closest to the gaze point distance, the continuous display object ROb1 can be displayed with high accuracy.
C. Embodiment 3:
The hardware configuration of the head-mounted display device 100 according to embodiment 3 is the same as that of the head-mounted display device 100 according to embodiment 1, and detailed description thereof is therefore omitted.
The control function unit of embodiment 3 differs from the control function unit 150 of embodiment 1 in the following respect: the display control unit 147 divides the continuous display object ROb1 for each focal length and displays the divided portions on the virtual screens VS1, VS2, and VS3 corresponding to the respective focal lengths. The other configurations of the control function unit according to embodiment 3 are the same as those of the control function unit 150 according to embodiment 1, and detailed description thereof is therefore omitted.
Fig. 20 is a flowchart showing the processing procedure of the display control process of embodiment 3. The display control process of embodiment 3 differs from the display control process of embodiment 2 shown in fig. 18 in that steps S115 and S133 are additionally executed, step S125 is omitted, and step S135a is executed instead of step S135. The other steps of the display control process of embodiment 3 are the same as those of embodiment 2; the same steps are therefore assigned the same reference numerals and detailed description thereof is omitted. Step S115 is the same as step S125 shown in fig. 18, but is given a different reference numeral for convenience of explanation.
As shown in fig. 20, when the viewpoint positions of the left eye LE and the right eye RE have been set (step S110), the distance-classified image generation unit 153 determines whether or not a display object is the continuous display object ROb1 (step S115). The processing content of step S115 is the same as that of step S125 described above. If it is determined that the display object is not the continuous display object ROb1 (no in step S115), step S120 described above is executed. On the other hand, if it is determined that the display object is the continuous display object ROb1 (yes in step S115), the distance-classified image generation unit 153 divides the continuous display object and generates the distance-classified display images (step S133). Specifically, the distance-classified image generation unit 153 generates, as the continuous distance-classified display images, separate images of the portions of the continuous display object from the virtual cameras CL and CR to the 1st focal length, from the 1st focal length to the 2nd focal length, and from the 2nd focal length to the 3rd focal length.
The display control unit 147 divides and displays the distance-classified display images (step S135a). Specifically, the display control unit 147 causes the 1st virtual screen VS1 to display the image of the portion of the continuous distance-classified display images ranging from the virtual cameras CL and CR to the 1st focal length. The display control unit 147 causes the 2nd virtual screen VS2 to display the image of the portion from the 1st focal length to the 2nd focal length, and causes the 3rd virtual screen VS3 to display the image of the portion from the 2nd focal length to the 3rd focal length.
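Step S133 can be sketched as splitting the depth extent of the continuous display object at each focal length. The illustration below (hypothetical names; depth measured from the virtual cameras) returns, for each non-empty segment, the index of the virtual screen on which it would be displayed in step S135a:

```python
def split_continuous_object(focal_lengths, near_m, far_m):
    """Divide a continuous display object spanning [near_m, far_m] into one
    segment per focal-length band; segment i goes on the i-th virtual screen."""
    segments = []
    band_near = 0.0
    for i, band_far in enumerate(focal_lengths):
        lo, hi = max(near_m, band_near), min(far_m, band_far)
        if lo < hi:
            segments.append((i, lo, hi))  # (virtual screen index, depth range)
        band_near = band_far
    return segments

# A hypothetical object running from 0.3 m to 2 m, with the embodiment's
# focal lengths of 0.5/1/2 m: one segment per virtual screen.
print(split_continuous_object([0.5, 1.0, 2.0], near_m=0.3, far_m=2.0))
# -> [(0, 0.3, 0.5), (1, 0.5, 1.0), (2, 1.0, 2.0)]
```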
Fig. 21 is an explanatory view schematically showing the field of view VT of the user after step S135a is executed. In fig. 21, the external scene SC is not illustrated, as in fig. 19. In fig. 21, the distance-classified display images Im4 and Im5 shown in fig. 19 are not illustrated. As shown in fig. 21, the continuous distance-classified display image RIm2 is displayed in the display region PN. The continuous distance-classified display image RIm2 is composed of 3 continuous distance-classified display images RIm2a, RIm2b, and RIm2c, each displayed on a different virtual screen. Specifically, the continuous distance-classified display image RIm2a is displayed on the 1st virtual screen VS1, the continuous distance-classified display image RIm2b is displayed on the 2nd virtual screen VS2, and the continuous distance-classified display image RIm2c is displayed on the 3rd virtual screen VS3.
As can be understood by comparing figs. 17 and 21, in embodiment 3 the continuous display object ROb1 is displayed without changing its size. When gazing at any of the virtual screens VS1, VS2, and VS3, the user can focus on the continuous distance-classified display image RIm2a, RIm2b, or RIm2c displayed on that virtual screen.
The HMD 100 according to embodiment 3 described above achieves the same effects as embodiment 1. Further, since the continuous display object ROb1, which exists continuously across a plurality of focal lengths, is divided for each focal length and displayed on the respective virtual screens VS1, VS2, and VS3, the user can see the continuous display object ROb1 with high accuracy when focusing on any of the virtual screens VS1, VS2, and VS3.
D. Other embodiments:
D1. Other embodiment 1:
In embodiment 1 described above, the display control unit 147 may display the intermediate distance display image on the virtual screen corresponding to the focal length closest to the gaze point distance. For example, when a display object exists between the 1st display object Ob1 and the 2nd display object Ob2 shown in fig. 10, the intermediate distance display image of that display object is, according to embodiment 1, displayed on the 2nd virtual screen VS2 corresponding to the 2nd focal length, which is the farther, in the line-of-sight direction of the user, of the 1st and 2nd focal lengths corresponding to the intermediate distance display image.
Instead, the display control unit 147 may display the intermediate distance display image on the virtual screen corresponding to the focal length closest to the gaze point distance calculated by the gaze point distance calculation unit 157. For example, the display control unit 147 may display the intermediate distance display image on the 1st virtual screen VS1, or on two virtual screens, namely the 1st virtual screen VS1 and the 2nd virtual screen VS2. In the latter configuration, the display control unit 147 may display the intermediate distance display images with different luminances (brightnesses) according to the gaze point distance; specifically, the luminance may be made higher as the focal length becomes larger. That is, in general, the same effects as those of the above embodiments are obtained as long as the intermediate distance display image is displayed on at least one of the 1st virtual screen VS1 and the 2nd virtual screen VS2.
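The patent leaves open exactly how the brightness would vary with the gaze point distance. One plausible reading, shown below as an assumption rather than the patent's method, weights the luminance on the two adjoining virtual screens linearly by how close the gaze point distance is to each focal length:

```python
def brightness_weights(f_near, f_far, gaze_distance):
    """Split luminance between the two adjoining virtual screens.

    f_near, f_far : focal lengths of the two screens showing the intermediate
                    distance display image
    Returns (weight for the near screen, weight for the far screen); linear
    weighting is an assumed interpolation, not specified by the patent."""
    span = f_far - f_near
    t = min(1.0, max(0.0, (gaze_distance - f_near) / span))
    return 1.0 - t, t

# Gazing 0.8 m ahead, between the 1st (0.5 m) and 2nd (1 m) focal lengths.
print(brightness_weights(0.5, 1.0, 0.8))  # -> (0.4, 0.6)
```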
D2. Other embodiment 2:
In each of the above embodiments, the distance-classified image generation unit 153 may generate the distance-classified display images using the gaze point distance. Specifically, in step S120, the distance-classified image generation unit 153 may cut out the display objects located between the viewpoint position of the user and the gaze point distance to generate a distance-classified display image, and in step S130, the display control unit 147 may display the generated distance-classified display image on the virtual screen corresponding to the focal length closest to the gaze point distance. Such a configuration also provides the same effects as those of the above embodiments.
D3. Other embodiment 3:
In embodiment 2 described above, the display control unit 147 displays the entire continuous distance-classified display image RIm1 on the virtual screen corresponding to the focal length closest to the gaze point distance, but the present invention is not limited to this. For example, the display control unit 147 may calculate the midpoint of the length of the continuous display object ROb1 in the line-of-sight direction of the user, calculate the distance between the user and that midpoint, and display the entire continuous distance-classified display image RIm1 on the virtual screen corresponding to the focal length closest to the calculated distance. Such a configuration also provides the same effects as those of the above embodiments.
D4. Other embodiment 4:
In each of the above embodiments, the line-of-sight detection unit 62 is configured by an infrared light emitting unit and an infrared light receiving unit, but the present invention is not limited to this. For example, the line-of-sight detection unit 62 may be a pupil imaging camera. In this configuration, the line-of-sight direction can be determined by analyzing images of the user's right eye RE and left eye LE captured by left and right pupil imaging cameras. The gaze point distance calculation unit 157 can then estimate the position the user is looking at from the determined line-of-sight direction, and can thus determine the gaze point of the user. Alternatively, the line-of-sight detection unit 62 may detect the line of sight of the user using known techniques, such as the detection result of an electromyographic sensor or eye tracking. Such a configuration also provides the same effects as those of the above embodiments.
D5. Other embodiment 5:
In the above embodiments, the focal lengths are not limited to the above examples. For example, instead of 50 cm, the 1st focal length may be set to any distance within the reach of the user's hand. Instead of 1 meter, the 2nd focal length may be set to an arbitrary distance between the 1st focal length and the 3rd focal length. Instead of 2 meters, the 3rd focal length may be set to an arbitrary distance that is greater than the 1st and 2nd focal lengths and within which the user can still focus. Such a configuration also provides the same effects as those of the above embodiments.
D6. Other embodiment 6:
In each of the above embodiments, the number of virtual screens set in the display unit 20 is 3. Instead, for example, the number of virtual screens may be 6 in total, that is, 3 virtual screens for the left eye LE and 3 virtual screens for the right eye RE. Any other number of virtual screens may also be set, as long as a virtual screen is set in the display unit 20 for each of a plurality of predetermined focal lengths. Such a configuration also provides the same effects as those of the above embodiments.
D7. Other embodiment 7:
In the above embodiments, the OLED units 221 and 241 include the OLED panels 223 and 243 and the OLED drive circuits 225 and 245 for driving them, and the OLED panels 223 and 243 are self-luminous display panels composed of light-emitting elements that emit light by organic electroluminescence and emit R (red), G (green), and B (blue) color light, respectively. In the OLED panels 223 and 243, a plurality of pixels, each consisting of one R, one G, and one B element, are arranged in a matrix. However, the present invention is not limited to this. For example, the right display unit 22 and the left display unit 24 may each be configured as a video element having an OLED panel as a light source unit and a modulation element that modulates the light emitted from the light source unit and outputs image light containing a plurality of color lights. The modulation device that modulates the light emitted from the OLED panel is not limited to a transmissive liquid crystal panel. For example, a reflective liquid crystal panel or a digital micromirror device may be used instead, or a laser-scanning, retina-projection HMD may be adopted. Such a configuration also provides the same effects as those of the above embodiments.
D8. Other embodiment 8:
In each of the above embodiments, when the display objects overlap in the visual field direction, the display control unit 147 displays a display object located on the far side in the visual field direction with the portion overlapped by the display object on the near side removed. However, the present invention is not limited to this. For example, the display control unit 147 may display the entire display object located on the far side in the visual field direction of the user. In this configuration, for example, the luminance and saturation of the display object located on the far side in the visual field direction may be made lower than those of the display object located on the near side, thereby reducing the visibility of the display object on the far side. That is, in general, when a plurality of display objects overlap in the visual field direction, the same effects as those of the above embodiments can be obtained as long as at least a part of the display object located on the far side in the visual field direction is configured not to be displayed or to be displayed with reduced visibility.
D9. Other embodiment 9:
in the above embodiments, the head-mounted display device 100 is a transmissive head-mounted display device, but the present invention is not limited thereto. For example, the HMD may be a video see-through HMD, or a monocular HMD. Such a configuration also provides the same effects as those of the above embodiments.
D10. Other embodiment 10:
In each of the above embodiments, the distance-classified display images Im1 to Im5 are displayed on the virtual screens VS1 to VS3, but the present invention is not limited to this. Instead of, or in addition to, this, a virtual screen different from the virtual screens VS1 to VS3 may be set in advance between the 1st virtual screen VS1 and the 2nd virtual screen VS2, or between the 2nd virtual screen VS2 and the 3rd virtual screen VS3. In that case, the distance-classified display image of a display object located between the 1st virtual screen VS1 and the 2nd virtual screen VS2 may be selectively displayed on either the additional virtual screen between them or the 2nd virtual screen VS2, or may be displayed on both. Such a configuration also provides the same effects as those of the above embodiments.
D11. Other embodiment 11:
In the above embodiments, the virtual screens VS1 to VS3 are set in accordance with predetermined focal lengths in the visual field direction of the user, but the present invention is not limited to this. For example, each of the virtual screens VS1 to VS3 may be set in accordance with a predetermined focal length in the line-of-sight direction of the user. In this configuration, in the display control process described above, the orientations of the virtual cameras CL and CR may be set to the line-of-sight direction of the user, and the distance-classified display images may be generated for the display objects existing in the line-of-sight direction of the user. Such a configuration also provides the same effects as those of the above embodiments.
The present invention is not limited to the above embodiments, and can be realized in various configurations without departing from the spirit thereof. For example, in order to solve some or all of the problems described above, or to achieve some or all of the effects described above, the technical features in the embodiments corresponding to the technical features in the aspects described in the summary of the invention may be replaced or combined as appropriate. Unless a technical feature is described as essential in the present specification, it may be deleted as appropriate.
E. Other aspects:
(1) According to one aspect of the present invention, a head-mounted display device is provided. The head-mounted display device includes: a display unit in which a virtual screen is set for each of a plurality of predetermined focal lengths; a distance-classified image generation unit that generates distance-classified display images for displaying, on the virtual screens, display objects representing display objects existing between two adjacent focal lengths; and a display control unit that displays each of the generated distance-classified display images on the virtual screen corresponding to the farther, in the visual field direction of a user of the head-mounted display device, of the two adjacent focal lengths.
According to the head-mounted display device of this aspect, distance-classified display images for displaying, on the virtual screens, display objects representing display objects existing between two adjacent focal lengths are generated, and each generated distance-classified display image is displayed on the virtual screen corresponding to the farther, in the visual field direction of the user of the head-mounted display device, of the two adjacent focal lengths. Optical mismatch due to convergence angle deviation and focal length deviation can thereby be reduced.
(2) In the head-mounted display device according to the above aspect, when a plurality of the display objects overlap one another as viewed in the visual field direction, the display control unit may cause at least a part of the display object located on the far side in the visual field direction among the plurality of display objects not to be displayed.
According to the head-mounted display device of this aspect, when a plurality of display objects overlap one another as viewed in the visual field direction, at least a part of the display object located on the farther side is not displayed. Compared with a configuration in which the farther display object is displayed in full, this reduces the sense of discomfort given to the user.
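As a rough sketch of this rule, the snippet below drops a farther display object entirely whenever its projected footprint overlaps a nearer one; hiding only the overlapped part, as the aspect also allows, would require real clipping. The rectangular footprints and class layout are assumptions made for the illustration.

```python
from dataclasses import dataclass

@dataclass
class DisplayObject:
    name: str
    distance: float  # distance from the user (m)
    rect: tuple      # (x0, y0, x1, y1) footprint projected along the view direction

def overlaps(a, b):
    """Axis-aligned rectangle intersection test."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def visible_objects(objs):
    """Keep near objects; drop any object overlapped by a nearer one."""
    kept = []
    for o in sorted(objs, key=lambda o: o.distance):  # near to far
        if any(overlaps(o.rect, k.rect) for k in kept):
            continue  # occluded by a nearer object: not displayed
        kept.append(o)
    return kept
```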
(3) The head-mounted display device of the above aspect may further include: a line-of-sight detection unit that detects the line of sight of the user; and a gaze point distance calculation unit that calculates, from the detected line of sight, a gaze point distance that is the distance between the user and the gaze point at which the user gazes. The display control unit may display, in its entirety, a continuous display object, which is a display object representing a display target that exists continuously across a plurality of the focal distances, on the virtual screen corresponding to the focal distance closest to the gaze point distance.
According to the head-mounted display device of this aspect, a continuous display object that exists continuously across a plurality of focal distances is displayed in its entirety on the virtual screen corresponding to the focal distance closest to the gaze point distance, so the continuous display object can be displayed with high accuracy.
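For illustration, one common way to obtain a gaze point distance from the detected lines of sight is to triangulate the vergence of the two eyes; the formula and parameter names below are our assumptions, since the patent requires only that some gaze point distance be calculated.

```python
import math

def gaze_point_distance(ipd_m, left_in_rad, right_in_rad):
    """Depth at which the two visual axes cross. With each eye rotated
    inward by its angle, the axes meet where
    tan(left) + tan(right) == ipd / depth."""
    t = math.tan(left_in_rad) + math.tan(right_in_rad)
    return float("inf") if t <= 0 else ipd_m / t

def nearest_screen(focal_distances, gaze_distance):
    """Focal distance of the virtual screen closest to the gaze point,
    where the continuous display object would be shown in its entirety."""
    return min(focal_distances, key=lambda f: abs(f - gaze_distance))

# gaze_point_distance(0.063, 0.0315, 0.0315) ~= 1.0 (m) for a 63 mm IPD
```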
(4) In the head-mounted display device of the above aspect, the display control unit may divide a continuous display object, which is a display object representing a display target that exists continuously across a plurality of the focal distances, for each focal distance, and display the resulting parts on the respective virtual screens.
According to the head-mounted display device of this aspect, a continuous display object that exists continuously across a plurality of focal distances is divided for each focal distance and displayed on the respective virtual screens, so the user sees the continuous display object with high accuracy whichever of those virtual screens the user focuses on.
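A minimal sketch of the division, assuming the depth intervals run between adjacent focal distances and each slice is shown on the screen at its interval's far bound (the interval boundaries and data layout are our assumptions):

```python
def split_depth_extent(near, far, focal_distances):
    """Clip an object's depth extent [near, far] against the intervals
    between adjacent focal distances; each piece is returned with the
    index of the screen at the interval's far bound."""
    bounds = [0.0] + list(focal_distances)  # interval i is (bounds[i], bounds[i+1]]
    pieces = []
    for i in range(len(focal_distances)):
        lo, hi = bounds[i], bounds[i + 1]
        seg_near, seg_far = max(near, lo), min(far, hi)
        if seg_near < seg_far:
            pieces.append((i, seg_near, seg_far))  # (screen index, depth slice)
    return pieces

# An object spanning 0.8 m to 2.5 m with screens at [0.5, 1.0, 3.0]:
# split_depth_extent(0.8, 2.5, [0.5, 1.0, 3.0]) -> [(1, 0.8, 1.0), (2, 1.0, 2.5)]
```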
(5) In the head-mounted display device of the above aspect, the display unit may be provided with a 1st virtual screen on which a 1st display object representing a 1st display target is displayed and a 2nd virtual screen on which a 2nd display object representing a 2nd display target is displayed. The head-mounted display device may further include an intermediate distance image generating unit that generates an intermediate distance display image for displaying, on a virtual screen, a display object representing a display target located between the 1st display target and the 2nd display target as seen in the visual field direction. The intermediate distance image generating unit may generate the intermediate distance display image in a color obtained by synthesizing the color of the 1st display object and the color of the 2nd display object, and the display control unit may display the generated intermediate distance display image on at least one of the 1st virtual screen and the 2nd virtual screen.
According to the head-mounted display device of this aspect, the intermediate distance display image is generated in a color obtained by synthesizing the colors of the 1st and 2nd display objects, and is displayed on at least one of the 1st and 2nd virtual screens. The user can therefore see the intermediate distance display image with high accuracy while gazing at either of those virtual screens.
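For illustration, a simple way to synthesize the color is linear interpolation between the two objects' colors in proportion to depth; the patent says only that the two colors are combined, so the weighting below is our assumption.

```python
def blend_color(c1, c2, d1, d2, d):
    """Blend RGB colors c1 (at depth d1) and c2 (at depth d2) for an
    intermediate object at depth d between them."""
    w = (d - d1) / (d2 - d1)       # 0 at d1, 1 at d2
    w = min(max(w, 0.0), 1.0)      # clamp in case d lies slightly outside
    return tuple(round((1 - w) * a + w * b) for a, b in zip(c1, c2))

# Halfway between a red object at 1 m and a blue one at 3 m:
# blend_color((255, 0, 0), (0, 0, 255), 1.0, 3.0, 2.0) -> (128, 0, 128)
```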
(6) The head-mounted display device of the above aspect may further include: a line-of-sight detection unit that detects the line of sight of the user; and a gaze point distance calculation unit that calculates, from the detected line of sight, a gaze point distance that is the distance between the user and the gaze point at which the user gazes. The distance-classified image generating unit may generate the distance-classified display image using the calculated gaze point distance, and the display control unit may display the distance-classified display image generated using the gaze point distance on the virtual screen corresponding to the gaze point distance.
According to the head-mounted display device of this aspect, the distance-classified display image is generated using the calculated gaze point distance and displayed on the virtual screen corresponding to that distance, so deviations of focus and of the convergence angle can be suppressed even when the user changes the gaze point.
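Read as code, this aspect amounts to re-rendering for the screen nearest the current gaze point distance whenever the gaze changes; render below is a hypothetical stand-in for the image generation step, not an API from the patent.

```python
def update_for_gaze(focal_distances, gaze_distance, render):
    """Regenerate the distance-classified image on the virtual screen
    whose focal distance is closest to the current gaze point distance."""
    target = min(range(len(focal_distances)),
                 key=lambda i: abs(focal_distances[i] - gaze_distance))
    render(screen_index=target, gaze_distance=gaze_distance)  # hypothetical hook
    return target
```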
The present invention can also be realized in various other forms, for example as a display control method for a head-mounted display device, a computer program for realizing the display control method, or a recording medium on which the computer program is recorded.

Claims (7)

1. A head-mounted display device comprising:
a display unit that displays a virtual screen set for each of a plurality of predetermined focal distances;
a distance-classified image generating unit that generates a distance-classified display image for displaying, on the virtual screen, a display object representing a display target existing between two adjacent focal distances;
a display control unit that displays each of the generated distance-classified display images on the virtual screen corresponding to the farther, in a visual field direction of a user of the head-mounted display device, of the two adjacent focal distances;
a line-of-sight detection unit that detects a line of sight of the user; and
a gaze point distance calculation unit that calculates, from the detected line of sight, a gaze point distance that is a distance between the user and a gaze point at which the user gazes, wherein
the distance-classified image generating unit generates the distance-classified display image using the calculated gaze point distance, and
the display control unit displays the distance-classified display image generated using the gaze point distance on the virtual screen corresponding to the gaze point distance.
2. The head-mounted display device according to claim 1, wherein,
when a plurality of display objects overlap one another as viewed in the visual field direction, the display control unit causes at least a part of the display object located on the farther side in the visual field direction among the plurality of display objects not to be displayed.
3. The head-mounted display device according to claim 1 or 2, wherein
the display control unit divides a continuous display object, which is a display object representing a display target that exists continuously across a plurality of the focal distances, for each of the focal distances, and displays the resulting parts on the respective virtual screens.
4. A head-mounted display device comprising:
a display unit that displays a virtual screen set for each of a plurality of predetermined focal distances;
a distance-classified image generating unit that generates a distance-classified display image for displaying, on the virtual screen, a display object representing a display target existing between two adjacent focal distances;
a display control unit that displays each of the generated distance-classified display images on the virtual screen corresponding to the farther, in a visual field direction of a user of the head-mounted display device, of the two adjacent focal distances;
a line-of-sight detection unit that detects a line of sight of the user; and
a gaze point distance calculation unit that calculates, from the detected line of sight, a gaze point distance that is a distance between the user and a gaze point at which the user gazes, wherein
the display control unit displays, in its entirety, a continuous display object representing a display target that exists continuously across a plurality of the focal distances on the virtual screen corresponding to the focal distance closest to the gaze point distance.
5. A head-mounted display device comprising:
a display unit that displays a virtual screen set for each of a plurality of predetermined focal distances;
a distance-classified image generating unit that generates a distance-classified display image for displaying, on the virtual screen, a display object representing a display target existing between two adjacent focal distances; and
a display control unit that displays each of the generated distance-classified display images on the virtual screen corresponding to the farther, in a visual field direction of a user of the head-mounted display device, of the two adjacent focal distances, wherein
the display unit is provided with a 1st virtual screen on which a 1st display object representing a 1st display target is displayed and a 2nd virtual screen on which a 2nd display object representing a 2nd display target is displayed,
the head-mounted display device further includes an intermediate distance image generating unit that generates an intermediate distance display image for displaying, on a virtual screen, a display object representing a display target located between the 1st display target and the 2nd display target as seen in the visual field direction,
the intermediate distance image generating unit generates the intermediate distance display image in a color obtained by synthesizing the color of the 1st display object and the color of the 2nd display object, and
the display control unit displays the generated intermediate distance display image on at least one of the 1st virtual screen and the 2nd virtual screen.
6. A display control method for a head-mounted display device having a display unit that displays a virtual screen set for each of a plurality of predetermined focal distances, the display control method comprising:
generating a distance-classified display image for displaying, on the virtual screen, a display object representing a display target existing between two adjacent focal distances;
displaying each of the generated distance-classified display images on the virtual screen corresponding to the farther, in a visual field direction of a user of the head-mounted display device, of the two adjacent focal distances;
detecting a line of sight of the user;
calculating, from the detected line of sight, a gaze point distance that is a distance between the user and a gaze point at which the user gazes;
generating the distance-classified display image using the calculated gaze point distance; and
displaying the distance-classified display image generated using the gaze point distance on the virtual screen corresponding to the gaze point distance.
7. A recording medium on which is recorded a computer program for realizing display control of a head-mounted display device having a display unit that displays a virtual screen set for each of a plurality of predetermined focal distances, the computer program causing a computer to realize:
generating a distance-classified display image for displaying, on the virtual screen, a display object representing a display target existing between two adjacent focal distances;
displaying each of the generated distance-classified display images on the virtual screen corresponding to the farther, in a visual field direction of a user of the head-mounted display device, of the two adjacent focal distances;
detecting a line of sight of the user;
calculating, from the detected line of sight, a gaze point distance that is a distance between the user and a gaze point at which the user gazes;
generating the distance-classified display image using the calculated gaze point distance; and
displaying the distance-classified display image generated using the gaze point distance on the virtual screen corresponding to the gaze point distance.
CN201910183333.5A 2018-03-13 2019-03-12 Head-mounted display device, display control method, and recording medium Active CN110275297B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018045023A JP7087481B2 (en) 2018-03-13 2018-03-13 Head-mounted display device, display control method, and computer program
JP2018-045023 2018-03-13

Publications (2)

Publication Number Publication Date
CN110275297A (en) 2019-09-24
CN110275297B (en) 2021-10-01

Family

ID=67903955

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910183333.5A Active CN110275297B (en) 2018-03-13 2019-03-12 Head-mounted display device, display control method, and recording medium

Country Status (3)

Country Link
US (2) US10838215B2 (en)
JP (1) JP7087481B2 (en)
CN (1) CN110275297B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102270699B1 (en) * 2013-03-11 2021-06-28 매직 립, 인코포레이티드 System and method for augmented and virtual reality
WO2014144526A2 (en) 2013-03-15 2014-09-18 Magic Leap, Inc. Display system and method
CN111034190B (en) * 2017-08-29 2022-04-15 索尼公司 Information processing apparatus, information processing method, and program
JP2021064906A (en) * 2019-10-16 2021-04-22 トヨタ自動車株式会社 Image display system
JP2021152564A (en) 2020-03-24 2021-09-30 セイコーエプソン株式会社 Virtual image display device and optical unit
TWI766316B (en) * 2020-07-22 2022-06-01 財團法人工業技術研究院 Light transmitting display system, image output method thereof and processing device thereof
CN112346252A (en) * 2020-11-09 2021-02-09 京东方科技集团股份有限公司 Near-to-eye display device
DE112022002831T5 (en) * 2021-05-28 2024-03-14 Sony Group Corporation IMAGE DISPLAY DEVICE AND OPTICAL LIGHT GUIDE SYSTEM
US20230037329A1 (en) * 2021-08-05 2023-02-09 Meta Platforms Technologies, Llc Optical systems and methods for predicting fixation distance
KR20230046008A (en) * 2021-09-29 2023-04-05 삼성전자주식회사 Method for displaying content by augmented reality device and augmented reality device for displaying content
TR2022003184A2 (en) * 2022-03-03 2022-04-21 Koçak İsmai̇l EYE-FOLLOWING VIRTUAL REALITY DEVICE AND WORKING METHOD FOR AUDIO LABELING IN VIRTUAL ENVIRONMENT

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08223609A (en) * 1995-02-09 1996-08-30 Atr Tsushin Syst Kenkyusho:Kk Three-dimensional display method and display device for enabling focus control
JPH09331552A (en) * 1996-06-10 1997-12-22 Atr Tsushin Syst Kenkyusho:Kk Multi-focus head mount type display device
JP2002196280A (en) 2000-10-17 2002-07-12 Olympus Optical Co Ltd Display device
JP5678709B2 (en) * 2011-02-10 2015-03-04 ソニー株式会社 Display device
US9217867B2 (en) * 2011-03-24 2015-12-22 Seiko Epson Corporation Head-mounted display device and control method for the head-mounted display device
US8803764B2 (en) * 2011-03-24 2014-08-12 Seiko Epson Corporation Head-mount type display device and method of controlling head-mount type display device
JP6379572B2 (en) * 2014-03-27 2018-08-29 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
JP6443677B2 (en) 2015-03-12 2018-12-26 日本精機株式会社 Head mounted display device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010139589A (en) * 2008-12-10 2010-06-24 Konica Minolta Opto Inc Video image display apparatus and head mounted display
CN102937745A (en) * 2012-11-13 2013-02-20 京东方科技集团股份有限公司 Open-type head-wearing display device and display method thereof
CN103487938A (en) * 2013-08-28 2014-01-01 成都理想境界科技有限公司 Head mounted display
CN103500446A (en) * 2013-08-28 2014-01-08 成都理想境界科技有限公司 Distance measurement method based on computer vision and application thereof on HMD
CN106249412A (en) * 2015-06-15 2016-12-21 三星电子株式会社 Head mounted display device
CN106646882A (en) * 2016-12-30 2017-05-10 北京七鑫易维信息技术有限公司 Head-mounted display device and adjusting parameter determining method thereof

Also Published As

Publication number Publication date
US10838215B2 (en) 2020-11-17
JP2019159076A (en) 2019-09-19
US20190285895A1 (en) 2019-09-19
US11536964B2 (en) 2022-12-27
US20210026145A1 (en) 2021-01-28
CN110275297A (en) 2019-09-24
JP7087481B2 (en) 2022-06-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant