JP2012203128A - Head mounted display and method for controlling head mounted display - Google Patents

Head mounted display and method for controlling head mounted display Download PDF

Info

Publication number
JP2012203128A
Authority
JP
Japan
Prior art keywords
image
head
unit
mounted display
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2011066393A
Other languages
Japanese (ja)
Inventor
Hitoshi Saito
均 齊藤
Original Assignee
Seiko Epson Corp
セイコーエプソン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Seiko Epson Corp, セイコーエプソン株式会社 filed Critical Seiko Epson Corp
Priority to JP2011066393A priority Critical patent/JP2012203128A/en
Priority claimed from US13/419,010 external-priority patent/US9217867B2/en
Publication of JP2012203128A publication Critical patent/JP2012203128A/en

Abstract

In a head-mounted display device, a technique is provided for detecting head movement of the user that exceeds a certain amount and for improving the visibility of the outside scene.
A head-mounted display device includes an image display unit, a control unit, and a detection unit. The image display unit includes an image light generation unit that generates and emits image light representing an image and a light guide unit that guides the emitted image light to the eyes of a user, and allows the user to visually recognize a virtual image. The control unit is connected to the image display unit and controls image display by the image display unit. The detection unit acquires change information indicating a change in the orientation of the image display unit and, using the change information, detects head movement of the user wearing the image display unit that exceeds a certain amount. When head movement exceeding the certain amount is detected, the control unit adjusts the luminance of the image light generation unit, or adjusts the image light generated by the image light generation unit, so as to reduce the visibility of the virtual image.
[Selection] Figure 2

Description

  The present invention relates to a head-mounted display device that is a display device mounted on the head and a method for controlling the head-mounted display device.

  A head-mounted display device (Head Mounted Display, HMD), which is a display device worn on the head, is known. A head-mounted display device, for example, generates image light representing an image using a liquid crystal display and a light source, and guides the generated image light to the user's eyes using a projection optical system, a light guide plate, or the like, thereby causing the user to recognize a virtual image.

  With the head-mounted display described above, the user can enjoy images (video) and audio anywhere while wearing the device like a pair of glasses. On the other hand, while the head-mounted display is worn, an image that blocks the outside scene is always displayed in front of the user's eyes, which the user may find inconvenient. To solve this problem, a technique has conventionally been known that detects that the user is walking, automatically stops the image display of the head mounted display, and thereby improves the visibility of the outside scene (for example, Patent Document 1).

[Patent Document 1] JP-A-9-212382
[Patent Document 2] JP-A-2008-116704
[Patent Document 3] JP-A-2003-51993
[Patent Document 4] JP-A-2007-134785
[Patent Document 5] JP-A-2004-96224

  The prior art reduces the inconvenience the user experiences while walking. However, there has also been a demand to reduce the inconvenience the user feels when turning his or her attention away from the image displayed on the head mounted display even when the user is not walking, for example.

  An object of the present invention is to provide a technique for improving the visibility of an outside scene in a head-mounted display device in accordance with movement of the user's head.

  An advantage of some aspects of the invention is that they solve at least part of the problems described above, and the invention can be implemented in the following forms or application examples.

[Application Example 1]
A head-mounted display device comprising:
an image display unit that includes an image light generation unit that generates and emits image light representing an image and a light guide unit that guides the emitted image light to the eyes of a user, the image display unit allowing the user to visually recognize a virtual image;
a control unit that is connected to the image display unit and controls image display by the image display unit; and
a detection unit that acquires change information indicating a change in the orientation of the image display unit and, using the change information, detects head movement of the user wearing the image display unit that exceeds a certain amount,
wherein, when head movement exceeding the certain amount is detected, the control unit adjusts the luminance of the image light generation unit, or adjusts the image light generated by the image light generation unit, so as to reduce the visibility of the virtual image.
With this configuration, the detection unit acquires change information indicating a change in the orientation of the image display unit and, using the change information, detects head movement of the user wearing the image display unit that exceeds a certain amount. When such movement is detected, the control unit adjusts the luminance of the image light generation unit, or adjusts the image light generated by the image light generation unit, so as to reduce the visibility of the virtual image. The head-mounted display device can therefore detect head movement of the user wearing the image display unit that exceeds a certain amount and improve the visibility of the outside scene.

[Application Example 2]
The head-mounted display device according to Application Example 1, wherein
the detection unit sets, in response to the occurrence of a predetermined trigger, an initial position that serves as a reference when detecting movement of the image display unit, acquires the change information relative to the initial position, and detects head movement exceeding the certain amount using the change information.
With this configuration, the detection unit sets an initial position that serves as a reference when detecting movement of the image display unit and acquires the change information relative to that initial position, so head movement of the user wearing the image display unit that exceeds a certain amount can be detected relative to the initial position.

[Application Example 3]
The head-mounted display device according to Application Example 2, wherein
the detection unit specifies the initial position and the change information each by a combination of a head angle, which corresponds to vertical head movement of the user wearing the image display unit, and a face direction, which corresponds to horizontal face movement.
With this configuration, the detection unit specifies the initial position and the change information by the combination of the head angle corresponding to vertical head movement of the user wearing the image display unit and the face direction corresponding to horizontal face movement.

[Application Example 4]
The head-mounted display device according to Application Example 2 or 3, wherein
the image display unit includes an angular velocity detection unit that detects an angular velocity of the image display unit, and
the detection unit acquires the detected angular velocity as the change information and determines that head movement exceeding the certain amount is detected when the angle obtained from the angular velocity exceeds a predetermined first threshold and the angular velocity exceeds a predetermined second threshold.
With this configuration, the detection unit determines that head movement exceeding a certain amount is detected when the angle obtained from the angular velocity detected by the angular velocity detection unit exceeds the predetermined first threshold and the detected angular velocity exceeds the predetermined second threshold. By setting the first threshold appropriately, small movements of the user can be ignored; by setting the second threshold appropriately, slow movements of the user can be ignored.
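The dual-threshold determination described above can be sketched in Python. This is a hypothetical illustration, not the patent's implementation: the function name, the sampling scheme, and the threshold values are all assumptions.

```python
def detect_head_motion(angular_velocities, dt, first_threshold_deg, second_threshold_dps):
    """Report head movement exceeding a certain amount only when BOTH the
    angle integrated from the gyro samples exceeds the first threshold AND
    the instantaneous angular velocity exceeds the second threshold."""
    angle = 0.0
    for omega in angular_velocities:   # angular-velocity samples in deg/s
        angle += omega * dt            # rectangular integration to an angle
        if abs(angle) > first_threshold_deg and abs(omega) > second_threshold_dps:
            return True
    return False
```

The first threshold filters out small movements (the integrated angle stays small), and the second filters out slow movements (the rate stays small), matching the two effects described above.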

[Application Example 5]
The head-mounted display device according to any one of Application Examples 2 to 4, wherein
the image display unit includes an acceleration detection unit that detects acceleration of the image display unit, and
the detection unit further acquires the detected acceleration as the change information and determines, using the inclination of the image display unit obtained from the acceleration, whether head movement exceeding the certain amount is detected.
With this configuration, the detection unit further uses the inclination of the image display unit obtained from the acceleration detected by the acceleration detection unit to determine whether head movement exceeding a certain amount is detected, so the accuracy of the determination by the detection unit can be improved.
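One common way to obtain an inclination from an accelerometer is to treat the measured acceleration as gravity while the head is quasi-static. The following sketch is a hypothetical illustration of that idea (the axis conventions, function names, and the cross-check logic are assumptions, not taken from the patent):

```python
import math

def tilt_from_acceleration(ax, ay, az):
    """Estimate pitch and roll (degrees) of the display from a gravity
    measurement, assuming the unit is nearly static so the accelerometer
    reading is dominated by gravity."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def cross_check(gyro_detected, pitch_deg, tilt_threshold_deg):
    # Use the accelerometer-derived tilt to confirm the gyro-based decision,
    # which is one way the inclination could improve detection accuracy.
    return gyro_detected and abs(pitch_deg) > tilt_threshold_deg
```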

[Application Example 6]
The head-mounted display device according to Application Example 2 or 3, wherein
the image display unit further includes:
a geomagnetism detection unit that detects the azimuth of the image display unit using geomagnetism; and
an acceleration detection unit that detects acceleration of the image display unit,
and wherein the detection unit acquires the detected azimuth and the detected acceleration as the change information and determines that head movement exceeding the certain amount is detected when the angle obtained from the azimuth exceeds a predetermined first threshold and the acceleration exceeds a predetermined second threshold.
With this configuration, the detection unit determines that head movement exceeding a certain amount is detected when the angle obtained from the azimuth detected by the geomagnetism detection unit exceeds the predetermined first threshold and the acceleration detected by the acceleration detection unit exceeds the predetermined second threshold. Thus, the same effect as in Application Example 4 can be obtained even in a configuration that includes a geomagnetism detection unit and an acceleration detection unit instead of the angular velocity detection unit of Application Example 4.
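This compass-based variant can be sketched as follows. The sketch is hypothetical: the wrap-around handling, names, and the use of acceleration magnitude as the "speed" signal are assumptions consistent with, but not stated in, the text above.

```python
def detect_with_compass(initial_azimuth_deg, current_azimuth_deg, accel_magnitude,
                        first_threshold_deg, second_threshold):
    """Variant detection: the turn angle comes from the geomagnetic (compass)
    azimuth relative to the initial position, and the quickness of the motion
    is judged from the accelerometer instead of a gyro."""
    # Wrap the azimuth difference into [-180, 180) degrees.
    delta = (current_azimuth_deg - initial_azimuth_deg + 180.0) % 360.0 - 180.0
    return abs(delta) > first_threshold_deg and accel_magnitude > second_threshold
```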

[Application Example 7]
The head-mounted display device according to any one of Application Examples 1 to 6, wherein
the predetermined trigger is at least one of power-on of the head-mounted display device, detection of activation of a predetermined application, and detection of a press of a predetermined button.
With this configuration, the initial position that serves as a reference when detecting movement of the image display unit can be set in response to the occurrence of at least one of power-on of the head-mounted display device, detection of activation of a predetermined application, and detection of a press of a predetermined button.

[Application Example 8]
The head-mounted display device according to any one of Application Examples 1 to 7, wherein
the image light generation unit includes:
a display element for generating the image; and
a light source for emitting image light representing the generated image,
and wherein, when head movement exceeding the certain amount is detected, the control unit adjusts the luminance of the image light generation unit by turning off or dimming the illumination light of the light source.
With this configuration, the control unit adjusts the luminance of the image light generation unit by turning off or dimming the illumination light of the light source when head movement exceeding a certain amount is detected, so the visibility of the outside scene through the image display unit can be improved.

[Application Example 9]
The head-mounted display device according to any one of Application Examples 1 to 7, wherein
when head movement exceeding the certain amount is detected, the control unit adjusts the image light generated by the image light generation unit by temporarily stopping the generation of the image light by the image light generation unit.
With this configuration, the control unit adjusts the image light generated by the image light generation unit by temporarily stopping its generation when head movement exceeding a certain amount is detected, so the visibility of the outside scene through the image display unit can be improved.

[Application Example 10]
The head-mounted display device according to any one of Application Examples 1 to 9, wherein
the image display unit includes an imaging unit that captures an image of the user's eyeball,
the detection unit further acquires, by analyzing the captured eyeball image, a line-of-sight movement amount indicating an amount of movement relative to the center position of the user's iris, and
the control unit further adjusts the luminance of the image light generation unit, or adjusts the image light generated by the image light generation unit, so as to reduce the visibility of the virtual image when the line-of-sight movement amount exceeds a predetermined third threshold.
With this configuration, the detection unit analyzes the captured image of the user's eyeball to acquire a line-of-sight movement amount indicating the amount of movement relative to the center position of the user's iris, and the control unit adjusts the luminance of the image light generation unit, or adjusts the image light generated by the image light generation unit, so as to reduce the visibility of the virtual image when the line-of-sight movement amount exceeds the predetermined third threshold. Thus, in addition to the effect of Application Example 1, it is possible to detect that the user has shifted his or her line of sight and to improve the visibility of the outside scene.
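The third-threshold comparison described above reduces to a simple distance test once the iris center has been located in the eye image. A hypothetical sketch (the names, pixel units, and the Euclidean-distance choice are assumptions; the patent does not specify the image-analysis method here):

```python
def gaze_shift_exceeds(iris_center, reference_center, third_threshold):
    """Compare the displacement of the detected iris center from its
    reference (straight-ahead) position against the third threshold."""
    dx = iris_center[0] - reference_center[0]
    dy = iris_center[1] - reference_center[1]
    return (dx * dx + dy * dy) ** 0.5 > third_threshold
```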

  The present invention can be realized in various modes: for example, a head-mounted display device, a method of controlling a head-mounted display device, a head-mounted display system, a computer program for realizing the functions of these methods, devices, or systems, a recording medium on which the computer program is recorded, and the like.

FIG. 1 is an explanatory diagram showing the external configuration of a head-mounted display device in an embodiment of the invention.
FIG. 2 is a block diagram functionally showing the configuration of the head mounted display.
FIG. 3 is an explanatory diagram showing how image light is emitted by the image light generation unit.
FIG. 4 is an explanatory diagram showing an example of a virtual image recognized by the user.
FIG. 5 is an explanatory diagram for explaining the motion detection process.
FIG. 6 is a flowchart showing the procedure of the motion detection process.
FIG. 7 is an explanatory diagram showing the state of the image light generation unit in step S114 of FIG. 6.
FIG. 8 is an explanatory diagram showing how the motion detection process is performed.
FIG. 9 is a block diagram functionally showing the configuration of the head mounted display in a second embodiment.
FIG. 10 is a flowchart showing the procedure of the line-of-sight detection process of the head mounted display in the second embodiment.
FIG. 11 is an explanatory diagram of a method of analyzing an eyeball image.
FIG. 12 is a block diagram functionally showing the configuration of a head mounted display in a modification.

  Next, embodiments of the present invention will be described in the following order based on examples.

A. First embodiment:
(A-1) Configuration of the head-mounted display device:
FIG. 1 is an explanatory diagram showing an external configuration of a head-mounted display device according to an embodiment of the present invention. The head-mounted display device HM is a display device mounted on the head, and is also called a head mounted display (HMD). The head-mounted display HM of the present embodiment is an optically transmissive head-mounted display device that allows a user to visually recognize a virtual image and at the same time directly view an outside scene.

  The head-mounted display HM includes an image display unit 20 that allows the user to visually recognize a virtual image while being mounted on the user's head, and a control unit (controller) 10 that controls the image display unit 20.

  The image display unit 20 is a wearing body that is worn on the user's head and, in the present embodiment, has the shape of a pair of glasses. The image display unit 20 includes an ear hook unit 21, a right display drive unit 22, a left display drive unit 24, a right optical panel 26, a left optical panel 28, a gyro sensor 61, and a setting button 62. The ear hook unit 21 is a member provided so as to pass over the user's ears from the ends of the right display drive unit 22 and the left display drive unit 24, and functions as temples. The right optical panel 26 and the left optical panel 28 are disposed so as to be positioned in front of the user's right and left eyes, respectively, when the user wears the image display unit 20. The right display drive unit 22 is disposed at the connection between the right ear hook unit 21 and the right optical panel 26, and the left display drive unit 24 is disposed at the connection between the left ear hook unit 21 and the left optical panel 28. Hereinafter, the right display drive unit 22 and the left display drive unit 24 are also collectively referred to simply as the "display drive unit", and the right optical panel 26 and the left optical panel 28 as the "optical panel".

  The gyro sensor 61, serving as an angular velocity detection unit, is disposed inside the housing of the right display drive unit 22. The gyro sensor 61 in the present embodiment is a piezoelectric-vibration biaxial angular velocity sensor and detects the angular velocities of the image display unit 20 about two axes (the x axis and the y axis). The setting button 62 is disposed on the surface of the housing of the left display drive unit 24, on the outer surface of the image display unit 20 (that is, the surface opposite to the side worn by the user). The setting button 62 is used to set the initial position that serves as a reference when detecting the orientation of the image display unit 20 in the motion detection process described later. Details will be described later.

  The display drive unit includes an LCD (Liquid Crystal Display), a projection optical system, and the like (not shown); details will be described later. The optical panel includes a light guide plate and a light control plate (not shown). The light guide plate is formed of a light-transmissive resin material or the like and emits image light taken in from the display drive unit toward the user's eyes. The light control plate is a thin plate-like optical element and is arranged so as to cover the front side of the image display unit 20 (the side opposite to the user's eyes). The light control plate protects the light guide plate, suppressing damage, adhesion of dirt, and the like, and by adjusting the light transmittance of the light control plate, the amount of external light entering the user's eyes, and thus the ease with which the virtual image is visually recognized, can be adjusted. The light control plate can be omitted.

  The image display unit 20 further includes a right earphone 32 for the right ear and a left earphone 34 for the left ear. The right earphone 32 and the left earphone 34 are respectively attached to the right and left ears when the user wears the image display unit 20.

  The image display unit 20 further includes a connection unit 40 for connecting the image display unit 20 to the control unit 10. The connection unit 40 includes a main body cord 48 connected to the control unit 10, a right cord 42 and a left cord 44 into which the main body cord 48 branches, and a connecting member 46 provided at the branch point. The right cord 42 is connected to the right display drive unit 22, and the left cord 44 is connected to the left display drive unit 24. The image display unit 20 and the control unit 10 transmit various signals via the connection unit 40. Connectors (not shown) that fit each other are provided at the end of the main body cord 48 opposite to the connecting member 46 and on the control unit 10, and the control unit 10 and the image display unit 20 are connected and disconnected by fitting and releasing the connector of the main body cord 48 and the connector of the control unit 10. For the right cord 42, the left cord 44, and the main body cord 48, for example, a metal cable or an optical fiber can be adopted.

  The control unit 10 is a device for operating the head mounted display HM. The control unit 10 includes a lighting unit 12, a touch pad 14, a cross key 16, and a power switch 18. The lighting unit 12 indicates the operating state of the head mounted display HM (for example, whether the power is ON or OFF) by its light emission state; for example, an LED (Light Emitting Diode) can be used as the lighting unit 12. The touch pad 14 detects the user's finger operation on its operation surface and outputs a signal corresponding to the detected content. The cross key 16 detects presses of keys corresponding to the up, down, left, and right directions and outputs a signal corresponding to the detected content. The power switch 18 switches the power state of the head mounted display HM by detecting a slide operation of the switch.

  FIG. 2 is a block diagram functionally showing the configuration of the head mounted display HM. The control unit 10 includes an input information acquisition unit 110, a storage unit 120, a power supply 130, a CPU 140, an interface 180, and transmission units (Tx) 51 and 52, and these units are connected to one another via a bus (not shown).

  The input information acquisition unit 110 has a function of acquiring signals corresponding to operation inputs by the user (for example, operation inputs to the touch pad 14, the cross key 16, and the power switch 18). The storage unit 120 includes a ROM, a RAM, a DRAM, a hard disk, and the like (not shown). The storage unit 120 stores threshold information CI. The threshold information CI holds the thresholds used in the motion detection process (described later); in the present embodiment, two thresholds (a first threshold and a second threshold) are stored in advance. The power supply 130 supplies power to each part of the head mounted display HM; for example, a secondary battery can be used as the power supply 130.

  The CPU 140 provides the functions of an operating system (OS) 150 by executing a program installed in advance. The CPU 140 also functions as a detection unit 145, an image processing unit 160, an audio processing unit 170, and a display control unit 190 by loading firmware and computer programs stored in the ROM and the hard disk into the RAM and executing them. Details of these units will be described later.

  The detection unit 145 acquires change information, that is, information indicating a change in the orientation of the image display unit 20 (in the present embodiment, the angular velocity detected by the gyro sensor 61), and executes the motion detection process using the change information. The motion detection process detects head movement of the user wearing the image display unit 20 of the head mounted display HM that exceeds a certain amount and erases the virtual image displayed on the image display unit 20. Details will be described later. The detection unit 145 corresponds to the "detection unit" in the claims.

  The interface 180 is an interface for connecting various external devices OA (for example, a personal computer PC, a mobile phone terminal, and a game terminal) as a content supply source to the control unit 10. As the interface 180, for example, a USB interface, a micro USB interface, a memory card interface, a wireless LAN interface, or the like can be provided. The content means information content including an image (still image, moving image), sound, and the like.

  The image processing unit 160 generates a clock signal PCLK, a vertical synchronization signal VSync, a horizontal synchronization signal HSync, and image data Data based on content input through the interface 180, and supplies these signals to the image display unit 20 through the connection unit 40. Specifically, the image processing unit 160 acquires the image signal included in the content. In the case of a moving image, for example, the acquired image signal is generally an analog signal composed of 30 frame images per second. The image processing unit 160 separates synchronization signals such as the vertical synchronization signal VSync and the horizontal synchronization signal HSync from the acquired image signal, and generates the clock signal PCLK using a PLL circuit (not shown) or the like in accordance with the periods of the separated vertical synchronization signal VSync and horizontal synchronization signal HSync.

  The image processing unit 160 converts the analog image signal from which the synchronization signals have been separated into a digital image signal using an A/D conversion circuit (not shown) or the like. Thereafter, the image processing unit 160 stores the converted digital image signal, frame by frame, in the DRAM in the storage unit 120 as image data Data (RGB data) of the target image. Note that the image processing unit 160 may execute image processing on the image data as necessary, such as resolution conversion, various tone corrections including adjustment of luminance and saturation, and keystone correction.

  The image processing unit 160 transmits the generated clock signal PCLK, vertical synchronization signal VSync, and horizontal synchronization signal HSync, and the image data Data stored in the DRAM in the storage unit 120, via the transmission units 51 and 52. The image data Data transmitted via the transmission unit 51 is also referred to as "right-eye image data", and the image data Data transmitted via the transmission unit 52 as "left-eye image data". The transmission units 51 and 52 function as transceivers for serial transmission between the control unit 10 and the image display unit 20.

  The display control unit 190 generates control signals for controlling the right display drive unit 22 and the left display drive unit 24. Specifically, using the control signals, the display control unit 190 individually controls ON/OFF of driving of the right LCD 241 by the right LCD control unit 211, ON/OFF of driving of the right backlight 221 by the right backlight control unit 201, ON/OFF of driving of the left LCD 242 by the left LCD control unit 212, and ON/OFF of driving of the left backlight 222 by the left backlight control unit 202, thereby controlling the generation and emission of image light by each of the right display drive unit 22 and the left display drive unit 24. For example, the display control unit 190 may cause both the right display drive unit 22 and the left display drive unit 24 to generate image light, cause only one of them to generate image light, or cause neither to generate image light.

  The display control unit 190 transmits the control signals for the right LCD control unit 211 and the left LCD control unit 212 via the transmission units 51 and 52, respectively. In addition, the display control unit 190 transmits the control signals for the right backlight control unit 201 and the left backlight control unit 202 via the transmission units 51 and 52, respectively.

  The audio processing unit 170 acquires an audio signal included in the content, amplifies the acquired audio signal, and supplies the acquired audio signal to the right earphone 32 and the left earphone 34 of the image display unit 20 via the connection unit 40.

  The image display unit 20 includes a right display drive unit 22, a left display drive unit 24, a right light guide plate 261 serving as the right optical panel 26, a left light guide plate 262 serving as the left optical panel 28, a gyro sensor 61, a setting button 62, a right earphone 32, and a left earphone 34.

  The right display drive unit 22 includes a receiving unit (Rx) 53, a right backlight (BL) control unit 201 and a right backlight (BL) 221 that function as a light source, a right LCD control unit 211 and a right LCD 241 that function as a display element, and a right projection optical system 251. The right backlight control unit 201, the right LCD control unit 211, the right backlight 221, and the right LCD 241 are also collectively referred to as the "image light generation unit".

  The receiving unit 53 functions as a receiver for serial transmission between the control unit 10 and the image display unit 20. The right backlight control unit 201 has a function of driving the right backlight 221 based on an input control signal. The right backlight 221 is a light emitter such as an LED. The right LCD control unit 211 has a function of driving the right LCD 241 based on the clock signal PCLK, the vertical synchronization signal VSync, the horizontal synchronization signal HSync, and the right-eye image data input via the receiving unit 53. The right LCD 241 is a transmissive liquid crystal panel in which a plurality of pixels are arranged in a matrix.

  FIG. 3 is an explanatory diagram illustrating how image light is emitted by the image light generation unit. The right LCD 241 has a function of modulating the illumination light IL emitted from the right backlight 221 into effective image light PL representing an image by driving the liquid crystal at each pixel position arranged in a matrix to change the transmittance of the light passing through the right LCD 241. As shown in FIG. 3, the backlight method is adopted in this embodiment; however, the image light may instead be emitted using a front light method or a reflection method.
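The modulation just described, in which the LCD scales the backlight's illumination light IL per pixel into image light PL, can be reduced to a toy model. This is a hypothetical sketch for intuition only; real panels modulate three sub-pixels per pixel and have nonlinear response.

```python
def modulate(illumination, transmittances):
    """Per-pixel modulation: each pixel's liquid crystal sets a transmittance
    in [0, 1], turning uniform illumination light IL into image light PL."""
    return [illumination * t for t in transmittances]
```

In this model, reducing `illumination` dims the whole image at once, which mirrors how turning off or dimming the backlight adjusts the luminance of the image light generation unit.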

  The right projection optical system 251 in FIG. 2 is configured by a collimating lens that converts the image light emitted from the right LCD 241 into parallel light beams. The right light guide plate 261 serving as the right optical panel 26 guides the image light output from the right projection optical system 251 to the user's right eye while reflecting it along a predetermined optical path. The right projection optical system 251 and the right light guide plate 261 are collectively referred to as the "light guide unit".

  The left display driving unit 24 includes a receiving unit (Rx) 54, a left backlight (BL) control unit 202 and a left backlight (BL) 222 that function as a light source, a left LCD control unit 212 and a left LCD 242 that function as a display element, and a left projection optical system 252. The left backlight control unit 202, the left LCD control unit 212, the left backlight 222, and the left LCD 242 are collectively referred to as the “image light generation unit”, and the left projection optical system 252 and the left light guide plate 262 are collectively referred to as the “light guide unit”. The right display driving unit 22 and the left display driving unit 24 form a pair. Since each part of the left display driving unit 24 has the same configuration and operation as the corresponding part of the right display driving unit 22 described above, a detailed description is omitted.

  FIG. 4 is an explanatory diagram illustrating an example of a virtual image recognized by the user. As described above, the image light guided to both eyes of the user of the head mounted display HM forms an image on the retina of the user, so that the user can visually recognize the virtual image. As shown in FIG. 4, a virtual image VI is displayed in the visual field VR of the user of the head mounted display HM. In addition, the user can see the outside scene SC through the right optical panel 26 and the left optical panel 28 except for the portion of the user's visual field VR where the virtual image VI is displayed. In the head mounted display HM of the present embodiment, the outside scene SC can be seen through the virtual image VI even in the portion where the virtual image VI is displayed in the visual field VR of the user.

(A-2) Motion detection process:
FIG. 5 is an explanatory diagram for explaining the motion detection process. The motion detection process of the head mounted display HM detects a head movement exceeding a certain amount by the user wearing the image display unit 20 and erases the virtual image displayed by the image display unit 20. Here, “a head movement exceeding a certain amount” means that the head of the user wearing the image display unit 20 (that is, the image display unit 20 itself) moves beyond a certain amount with respect to the initial position. In this embodiment, the initial position and the amount of head movement are specified by a combination of the head angle and the face direction of the user wearing the image display unit 20, with the angle and direction at the time the setting button 62 is pressed serving as the initial position.

  The head angle corresponds to the movement of the user's head in the vertical (up-down) direction PL, as shown in FIG. 5. For example, when the setting button 62 is pressed while the user's head is upright (0 degrees), 0 degrees is set as the initial position. When the user then tilts the head in the MV direction, the amount of head movement (head angle θ1) is 45 degrees, the difference between the initial position (0 degrees) and the position after the movement (45 degrees). The amount of head movement (head angle θ1) corresponds to the y-axis angular velocity obtained from the output value of the gyro sensor 61.

  The face direction corresponds to the movement of the user's face in the horizontal direction HL, as shown in FIG. 5. For example, when the setting button 62 is pressed while the user's face is facing front (0 degrees), 0 degrees is set as the initial position. When the user then turns the face in the MV direction, the amount of head movement (face direction θ2) is 70 degrees, the difference between the initial position (0 degrees) and the position after the movement (70 degrees). The amount of head movement (face direction θ2) corresponds to the x-axis angular velocity obtained from the output value of the gyro sensor 61.

  FIG. 6 is a flowchart showing the procedure of the motion detection process. The detection unit 145 determines whether or not the setting button 62 provided on the image display unit 20 has been pressed (step S102). When the setting button 62 has not been pressed (step S102: NO), the detection unit 145 returns to step S102 and continues monitoring whether the setting button 62 is pressed. When the setting button 62 has been pressed (step S102: YES), the detection unit 145 reads the first threshold value and the second threshold value from the threshold information CI stored in the storage unit 120 (step S104).

  After acquiring the threshold value, the detection unit 145 acquires the output value of the gyro sensor 61 (step S106). The detection unit 145 determines whether or not at least one of the x-axis angle and the y-axis angle obtained from the acquired output value exceeds the first threshold Th1 (step S110). Here, the detection unit 145 obtains the x-axis angle and the y-axis angle by integrating the x-axis angular velocity and the y-axis angular velocity, respectively. When both the x-axis and y-axis angles are equal to or smaller than the first threshold Th1 (step S110: NO), the detection unit 145 causes the process to transition to step S106 and acquires the output value of the gyro sensor 61 again.
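The integration of angular velocity into angle mentioned here can be sketched as follows (an illustrative rectangular sum; the 0.1-second sampling interval and the units are assumptions for explanation, not values given in the embodiment):

```python
def integrate_angle(angular_velocities, dt):
    """Integrate gyro angular-velocity samples (deg/s) over time to
    obtain the angle (deg) relative to the initial position, as the
    detection unit 145 does for the x- and y-axes in step S110."""
    angle = 0.0
    for omega in angular_velocities:
        angle += omega * dt  # rectangular (Euler) integration step
    return angle

# Ten samples of 45 deg/s at 0.1 s intervals -> 45 degrees of rotation.
print(integrate_angle([45.0] * 10, 0.1))
```

With the first threshold Th1 of 45 degrees suggested later in the text, this example rotation would be exactly at the boundary of "a head movement exceeding a certain amount".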

  On the other hand, when at least one of the x-axis and y-axis angles exceeds the first threshold Th1 (step S110: YES), the detection unit 145 determines whether at least one of the x-axis angular velocity and the y-axis angular velocity exceeds the second threshold Th2 (step S112). When both the x-axis and y-axis angular velocities are equal to or less than the second threshold Th2 (step S112: NO), the detection unit 145 causes the process to transition to step S106 and acquires the output value of the gyro sensor 61 again.

  On the other hand, when at least one of the x-axis and y-axis angular velocities exceeds the second threshold Th2 (step S112: YES), the detection unit 145 turns off the backlight (step S114). Specifically, the detection unit 145 requests the display control unit 190 of the control unit 10 to turn off the backlight.

  FIG. 7 is an explanatory diagram showing the state of the image light generation unit in step S114 of FIG. 6. In step S114 of FIG. 6, the display control unit 190 that has received the request from the detection unit 145 transmits to the image display unit 20 a control signal instructing the right backlight control unit 201 to turn off the right backlight 221 and a control signal instructing the left backlight control unit 202 to turn off the left backlight 222. The right backlight control unit 201 that has received the signal turns off the right backlight 221; similarly, the left backlight control unit 202 that has received the signal turns off the left backlight 222. As a result, as shown in FIG. 7, the images drawn on the right LCD 241 and the left LCD 242 are no longer emitted as image light, and the display of the virtual image VI disappears from the user's visual field VR.

  After the backlight is turned off, the detection unit 145 acquires the output value of the gyro sensor 61 (step S116). The detection unit 145 determines whether at least one of the x-axis angle and the y-axis angle obtained from the acquired output value is equal to or greater than the first threshold Th1 (step S118). When both the x-axis and y-axis angles are smaller than the first threshold Th1 (step S118: NO), the detection unit 145 causes the process to transition to step S116 and obtains the output value of the gyro sensor 61 again.

  On the other hand, when at least one of the x-axis and y-axis angles is equal to or greater than the first threshold Th1 (step S118: YES), the detection unit 145 turns on the backlight (step S120). Specifically, the detection unit 145 requests the display control unit 190 of the control unit 10 to turn on the backlight, and ends the process.

  In step S120 of FIG. 6, the display control unit 190 that has received the request from the detection unit 145 transmits to the image display unit 20 a control signal instructing the right backlight control unit 201 to turn on the right backlight 221 and a control signal instructing the left backlight control unit 202 to turn on the left backlight 222. The right backlight control unit 201 that has received the signal turns on the right backlight 221; similarly, the left backlight control unit 202 that has received the signal turns on the left backlight 222. As a result, the images drawn on the right LCD 241 and the left LCD 242 are emitted as image light, and the virtual image VI is displayed again in the user's visual field VR.
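The two-stage decision of steps S110 to S112 described above can be sketched as a single predicate; the threshold values used here are illustrative assumptions (in the embodiment they are read from the threshold information CI):

```python
def should_turn_off_backlight(x_angle, y_angle, x_rate, y_rate,
                              th1=45.0, th2=30.0):
    """Steps S110-S112: the backlight is switched off only when at
    least one angle exceeds the first threshold Th1 AND at least one
    angular velocity exceeds the second threshold Th2."""
    angle_exceeded = x_angle > th1 or y_angle > th1   # step S110
    rate_exceeded = x_rate > th2 or y_rate > th2      # step S112
    return angle_exceeded and rate_exceeded

print(should_turn_off_backlight(70.0, 0.0, 40.0, 0.0))  # fast, large turn
print(should_turn_off_backlight(70.0, 0.0, 10.0, 0.0))  # slow turn: display kept
```

The second call illustrates the role of Th2 explained below: a large but slow head movement does not erase the virtual image.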

  As described above, when the setting button 62 is pressed, the motion detection process (FIG. 6) from step S104 onward starts, and monitoring of the output value of the gyro sensor 61 begins. The gyro sensor 61 is a sensor that detects an angular velocity, that is, the amount of change in angle per unit time (in other words, a change in orientation). Accordingly, it can be said that pressing the setting button 62 sets, as the initial position of the image display unit 20, the angle of the user's head and the direction of the face at that moment, and that the gyro sensor 61 detects an angular velocity indicating a change in the orientation of the image display unit 20 with respect to that initial position. The angular velocity detected by the gyro sensor 61 corresponds to the “change information” in the claims.

  Note that the first and second threshold values stored in advance in the threshold information CI can be set arbitrarily. The first threshold value, which is a threshold for the angle, may be 45 degrees, for example. By gating the change in angle with the first threshold, the motion detection process can detect “a head movement exceeding a certain amount” by the user wearing the image display unit 20. In other words, when the angle of the image display unit 20 changes, it becomes possible to determine whether the user has turned away from the initial position (a head movement exceeding a certain amount) or has merely moved the head slightly.

  The second threshold value, which is a threshold for the angular velocity (the amount of angle change per unit time), is preferably set to a small value. By gating the change in angular velocity with the second threshold, the motion detection process can ignore (tolerate) slow movements of the user wearing the image display unit 20 and continue displaying the virtual image VI. The threshold information CI may be configured so that the user can change it arbitrarily.

  FIG. 8 is an explanatory diagram showing how the motion detection process is executed. FIG. 8A shows the state when the user wearing the image display unit 20 of the head mounted display HM presses the setting button 62. After the user presses the setting button 62, as long as the angle and angular velocity obtained from the output value of the gyro sensor 61 do not exceed the first and second thresholds (in other words, as long as the conditions of steps S110 to S112 are not satisfied), the backlight is not turned off (step S114) in the motion detection process, so the virtual image VI remains displayed in the user's visual field VR as shown in FIG. 8A.

  FIG. 8B shows the state in which the user wearing the image display unit 20 of the head mounted display HM has turned away from the initial position. When the user turns to the side, for example, from the initial position shown in FIG. 8A, the x-axis angle and angular velocity obtained from the output value of the gyro sensor 61 change. When these exceed the first and second thresholds and the conditions of steps S110 to S112 are satisfied, the backlight is turned off (step S114) in the motion detection process, and the display of the virtual image VI disappears from the user's visual field VR. With the virtual image VI that was blocking the visual field VR gone, the user can see the outside scene clearly. The same applies when the y-axis angle and angular velocity obtained from the output value of the gyro sensor 61 change because the user looks up or down, for example.

  Further, in step S114 of the motion detection process (FIG. 6), the detection unit 145 may, for example, perform any of the following processes instead of turning off the backlight.

  In step S114 of FIG. 6, the detection unit 145 requests the display control unit 190 of the control unit 10 to reduce the luminance of the backlight. Upon receiving the request, the display control unit 190 transmits, to the right backlight control unit 201 and the left backlight control unit 202, a control signal that designates the backlight luminance together with ON/OFF of the backlight. As the control signal designating the backlight luminance, for example, a PWM (Pulse Width Modulation) signal can be used. In this way, in step S114, the illumination light from the backlights (the right backlight 221 and the left backlight 222) is dimmed instead of being turned off. When the illumination light is dimmed, the image light emitted by the image light generation unit becomes weak (the luminance of the image light generation unit decreases), so the virtual image VI displayed in the user's visual field VR appears faint and blurred. As a result, the user can easily see the outside scene.
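How a PWM signal can encode the requested luminance may be sketched as follows; the 1000-microsecond period and the linear mapping are illustrative assumptions, not details of the embodiment:

```python
def pwm_on_time(luminance, period_us=1000):
    """Map a relative backlight luminance (0.0-1.0) to the on-time of
    a PWM signal with the given period; the LED's average brightness
    is proportional to this duty cycle."""
    luminance = max(0.0, min(1.0, luminance))  # clamp out-of-range requests
    return int(luminance * period_us)

print(pwm_on_time(0.25))  # a dimmed backlight at 25 % duty cycle
```

Because the eye averages the rapid on/off switching, reducing the duty cycle dims the backlight without changing the LED drive current.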

  Alternatively, in step S114 of FIG. 6, the detection unit 145 requests the display control unit 190 of the control unit 10 to temporarily stop driving the LCDs (liquid crystal panels). The display control unit 190 that has received the request transmits a control signal designating LCD driving OFF to the right LCD control unit 211 and the left LCD control unit 212. In this way, the drawing of images by the LCDs (the right LCD 241 and the left LCD 242) is stopped, and the generation and emission of image light in the image light generation unit are stopped, so the display of the virtual image VI disappears from the user's visual field VR. As a result, the user can easily see the outside scene.

  The detection unit 145 may also request the display control unit 190 of the control unit 10 to reduce the aperture ratio of the LCDs (liquid crystal panels). When the liquid crystal aperture ratio decreases, the image light emitted by the image light generation unit becomes weak, so the virtual image VI displayed in the user's visual field VR appears faint and blurred. Therefore, the user can easily see the outside scene.

  Further, in step S114 of FIG. 6, the detection unit 145 may request the image processing unit 160 of the control unit 10 to temporarily stop transmission of the image data Data. The image processing unit 160 that has received the request stops transmitting the image data Data to the image display unit 20. In this way, the drawing of images by the LCDs (the right LCD 241 and the left LCD 242) is stopped, and the generation and emission of image light in the image light generation unit are stopped, so the display of the virtual image VI disappears from the user's visual field VR. As a result, the user can easily see the outside scene.

  In addition, the detection unit 145 may request the image processing unit 160 to replace the image data Data with black dummy data. In this way, the image drawn on the LCDs (the right LCD 241 and the left LCD 242) becomes a black dummy image, and the image light generation unit emits image light corresponding to it, so the virtual image VI displayed in the user's visual field VR appears as if it has disappeared. Therefore, the user can easily see the outside scene.
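Substituting black dummy data can be sketched as below; representing a frame as a raster of (R, G, B) tuples is a hypothetical layout for illustration, not the embodiment's actual image-data format:

```python
def black_dummy_frame(width, height):
    """Build an all-black frame: every pixel is (R, G, B) = (0, 0, 0),
    so the LCD draws an image that contributes no visible image light."""
    return [[(0, 0, 0)] * width for _ in range(height)]

frame = black_dummy_frame(4, 2)
print(len(frame), len(frame[0]), frame[0][0])
```

Unlike stopping transmission altogether, sending a black frame keeps the display pipeline running, which may simplify restoring the virtual image afterward.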

  As described above, according to the first embodiment, the detection unit 145 uses the angular velocity detected by the gyro sensor 61 (change information indicating a change in orientation with respect to the initial position) to detect a head movement exceeding a certain amount by the user wearing the image display unit 20. When such a movement is detected, the control unit 10 adjusts the image light generated by the image light generation unit so as to reduce the visibility of the virtual image VI, either by adjusting the luminance of the light sources (the right backlight 221 and the left backlight 222), that is, the luminance of the image light generation unit, or by adjusting the image generated by the display elements (the right LCD 241 and the left LCD 242). As a result, the head mounted display HM can detect a head movement exceeding a certain amount by the user wearing the image display unit 20 and improve the visibility of the outside scene.

B. Second embodiment:
In the second embodiment of the present invention, a configuration is described in which, in addition to the motion detection process, a line-of-sight detection process is performed that detects the direction of the line of sight of the user wearing the image display unit and erases the virtual image. Below, only the parts whose configuration and operation differ from the first embodiment are described. In the drawings, components identical to those of the first embodiment are denoted by the same reference numerals as in the first embodiment described above, and detailed description thereof is omitted.

(B-1) Configuration of the head-mounted display device:
FIG. 9 is a block diagram functionally showing the configuration of the head mounted display HMa in the second embodiment. It differs from the first embodiment shown in FIG. 2 in that a control unit 10a is provided instead of the control unit 10, and an image display unit 20a is provided instead of the image display unit 20.

  The image display unit 20a further includes an eye camera 63 as an imaging unit. The eye camera 63 in the present embodiment is disposed on the inner surface of the right optical panel 26 (that is, the surface facing the user's eyes when the head mounted display HMa is worn). The eye camera 63 in this embodiment includes infrared illumination and a CCD camera, and acquires an eyeball image of the user's right eye.

  The control unit 10a includes threshold information CIa instead of the threshold information CI, and a detection unit 145a instead of the detection unit 145. In the threshold information CIa, a third threshold is stored in advance in addition to the first and second thresholds. The third threshold is used in the line-of-sight detection process (described later). The detection unit 145a executes the line-of-sight detection process in parallel with the motion detection process (FIG. 6). The line-of-sight detection process detects the direction of the line of sight of the user wearing the image display unit 20a and erases the virtual image.

(B-2) Operation detection process:
The operation detection process of the head mounted display HMa in the second embodiment is the same as that of the first embodiment shown in FIG.

(B-3) Eye-gaze detection processing:
FIG. 10 is a flowchart illustrating the procedure of the line-of-sight detection process of the head mounted display HMa in the second embodiment. First, the detection unit 145a determines whether or not the setting button 62 provided on the image display unit 20a has been pressed (step S102). The details are the same as in step S102 of the motion detection process (FIG. 6). When the setting button 62 has been pressed (step S102: YES), the detection unit 145a reads the third threshold value from the threshold information CIa (step S202). After acquiring the threshold, the detection unit 145a acquires an eyeball image captured by the eye camera 63 (step S204). Then, the detection unit 145a analyzes the acquired eyeball image (step S206).

  FIG. 11 is an explanatory diagram of the eyeball image analysis method. FIG. 11A shows an overview of the eye of a user wearing the head mounted display HMa, viewed from the side. By performing image analysis on the eyeball image captured by the eye camera 63, the detection unit 145a obtains a line-of-sight movement amount indicating how far the user's iris AE has moved relative to the center position AX. In this embodiment, the line-of-sight movement amount is expressed as a combination of the movement amount in the x-axis direction and the movement amount in the y-axis direction. FIG. 11B shows the iris AE located at the center position AX and the line-of-sight movement amount (x-axis, y-axis) at that time; since the iris AE is at the center position AX, the line-of-sight movement amount is (0, 0). FIG. 11C shows the user's iris AE moved to the left of the center position AX and the line-of-sight movement amount at that time; since the iris AE has moved horizontally from the center position AX, the line-of-sight movement amount is (+10, 0).

  In step S208 of FIG. 10, the detection unit 145a determines whether at least one of the x-axis movement amount and the y-axis movement amount of the iris obtained in step S206 exceeds the third threshold Th3 (step S208). When both the x-axis and y-axis movement amounts of the iris are equal to or smaller than the third threshold Th3 (step S208: NO), the detection unit 145a causes the process to transition to step S204 and acquires an eyeball image from the eye camera 63 again. On the other hand, when at least one of the x-axis and y-axis movement amounts of the iris exceeds the third threshold Th3 (step S208: YES), the detection unit 145a turns off the backlight (step S114). The details are the same as in step S114 of the motion detection process (FIG. 6).
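The movement-amount computation of step S206 and the threshold test of step S208 can be sketched together as follows; the pixel coordinates, the use of absolute values, and the value of Th3 are illustrative assumptions:

```python
def gaze_movement(iris_pos, center_pos):
    """Line-of-sight movement amount: the (x, y) offset of the iris AE
    from the center position AX, as obtained by image analysis."""
    return (iris_pos[0] - center_pos[0], iris_pos[1] - center_pos[1])

def gaze_exceeds(movement, th3=8):
    """Step S208: true when at least one of the x- and y-axis movement
    amounts exceeds the third threshold Th3 (magnitudes assumed)."""
    dx, dy = movement
    return abs(dx) > th3 or abs(dy) > th3

mv = gaze_movement((110, 50), (100, 50))  # iris moved 10 px horizontally
print(mv, gaze_exceeds(mv))
```

This corresponds to the (+10, 0) example of FIG. 11C: a purely horizontal shift of the iris that, under the assumed Th3, would trigger the backlight-off request.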

  After the backlight is turned off, the detection unit 145a acquires an eyeball image from the eye camera 63 (step S210). Then, the detection unit 145a analyzes the acquired eyeball image to obtain the line-of-sight movement amount of the iris (step S212); the details are the same as in step S206. The detection unit 145a determines whether at least one of the obtained x-axis and y-axis movement amounts of the iris is equal to or greater than the third threshold Th3 (step S214). When both the x-axis and y-axis movement amounts are smaller than the third threshold Th3 (step S214: NO), the detection unit 145a moves the process to step S210 and acquires an eyeball image from the eye camera 63 again. On the other hand, when at least one of the x-axis and y-axis movement amounts is equal to or greater than the third threshold Th3 (step S214: YES), the detection unit 145a turns on the backlight. The details are the same as in step S120 of the motion detection process (FIG. 6).

  As described above, according to the second embodiment, the detection unit 145a analyzes the user's eyeball image captured by the eye camera 63 to acquire a line-of-sight movement amount consisting of the movement amounts of the user's iris AE in the x-axis and y-axis directions with respect to the center position AX, and when the line-of-sight movement amount exceeds the predetermined third threshold Th3, the control unit 10a adjusts the luminance of the image light generation unit, or the image light generated by the image light generation unit, so as to reduce the visibility of the virtual image. As a result, according to the present embodiment, in addition to the effects of the first embodiment, it is possible to detect that the user has shifted his or her line of sight from the center position and thereby improve the visibility of the outside scene.

C. Variations:
The present invention is not limited to the above examples and embodiments, and various configurations can be adopted without departing from the gist of the invention. For example, a function realized by software may be realized by hardware. In addition, the following modifications are possible.

C1. Modification 1:
The above embodiments illustrate one configuration of the head mounted display. However, the configuration of the head mounted display can be determined arbitrarily without departing from the gist of the present invention; for example, components can be added, deleted, or replaced.

  In the above embodiment, for convenience of explanation, it is assumed that the control unit includes a transmission unit (51, 52) and the image display unit includes a reception unit (53, 54). However, each of the transmission units (51, 52) and the reception units (53, 54) of the above embodiment has a function capable of bidirectional communication, and can function as a transmission / reception unit.

  For example, as illustrated in FIG. 12, the connection unit may be omitted, and the control unit and the image display unit may be configured to perform wireless communication. Specifically, the control unit further includes a wireless communication unit (81), and the image display unit further includes a wireless communication unit (82) and a power source (280). In this case, the wireless communication unit 81 functions as the transmission unit (51, 52) in the above embodiment, and the wireless communication unit 82 functions as the reception unit (53, 54) in the above embodiment.

  For example, the configurations of the control unit and the image display unit illustrated in FIG. 1 can be changed arbitrarily. Specifically, the touch panel may be omitted from the control unit and operation may be performed using only the cross key, or another operation interface such as an operation stick may be provided on the control unit. The control unit may also be configured so that devices such as a keyboard and a mouse can be connected to it, receiving input from the keyboard or the mouse. In addition, a communication unit using Wi-Fi (wireless fidelity) or the like may be provided in the control unit.

  For example, the control unit illustrated in FIG. 1 is connected to the image display unit via a wired signal transmission path. However, the control unit and the image display unit may be connected by a connection via a wireless signal transmission path such as a wireless LAN, infrared communication, or Bluetooth (registered trademark).

  For example, although the head mounted display in the above embodiments is a binocular transmissive head mounted display, it may instead be configured as a non-transmissive head mounted display that blocks the outside scene when worn. A monocular head mounted display may also be used.

  For example, in the above embodiments the image light generation unit is configured using the left and right backlight control units, the left and right LCD control units, the left and right backlights, and the left and right LCDs; instead, organic EL (Organic Electro-Luminescence) displays and organic EL control units may be used. In that case, the organic EL displays and the organic EL control units correspond to the “image light generation unit” in the claims.

  For example, the functional units such as the detection unit, the image processing unit, the display control unit, and the audio processing unit have been described as being realized by the CPU loading a computer program stored in a ROM or on a hard disk into the RAM and executing it. However, these functional units may instead be configured using an ASIC (Application Specific Integrated Circuit) designed to realize the corresponding functions.

  For example, in the above-described embodiments the image display unit is a head mounted display worn like glasses, but the image display unit may instead be a normal flat display device (a liquid crystal display device, a plasma display device, an organic EL display device, or the like). Also in this case, the control unit and the image display unit may be connected via a wired signal transmission path or via a wireless signal transmission path. In this way, the control unit can be used as a remote controller for a normal flat display device.

  Further, instead of an image display unit worn like glasses, an image display unit of another shape, such as one worn like a hat, may be adopted. The earphones may be of an ear-hook type or a headband type, or may be omitted.

  For example, in the above embodiment, the secondary battery is used as the power source. However, the power source is not limited to the secondary battery, and various batteries can be used. For example, a primary battery, a fuel cell, a solar cell, a thermal cell, or the like may be used.

C2. Modification 2:
In the above embodiment, the image processing unit outputs the same image data as right-eye image data and left-eye image data. However, the image processing unit may be configured to allow the user to visually recognize a 3D virtual image by using different image data for the right eye image data and the left eye image data.

C3. Modification 3:
In the motion detection process (FIG. 6) of the above embodiment, the detection unit detects a head movement exceeding a certain amount by the user according to the output value from the gyro sensor. However, the aspect in the above embodiment is merely an example, and the detection unit can detect such a head movement using various other methods.

  For example, in step S110 of the motion detection process, it is determined whether at least one of the x-axis angle and the y-axis angle obtained from the output value of the gyro sensor exceeds the same threshold (the first threshold Th1). However, since the horizontal movement of the face is generally larger than the vertical (up-down) movement of the head, two first thresholds Th1 may be prepared, one for the vertical direction (the y-axis) and one for the horizontal direction (the x-axis). The same applies to step S112.
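Such a per-axis variant of step S110 might look like the following sketch; both threshold values are purely illustrative assumptions:

```python
def angle_exceeded_per_axis(x_angle, y_angle, th1_x=60.0, th1_y=30.0):
    """Variant of step S110 with separate first thresholds: a larger
    one for horizontal face movement (the x-axis) and a smaller one
    for vertical head movement (the y-axis)."""
    return x_angle > th1_x or y_angle > th1_y

print(angle_exceeded_per_axis(50.0, 0.0))  # within the horizontal margin
print(angle_exceeded_per_axis(0.0, 35.0))  # exceeds the vertical margin
```

With a single shared threshold, the first call above would already trigger; splitting the threshold by axis reflects that a 50-degree sideways glance is less unusual than a 35-degree nod.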

  For example, in the above embodiment a piezoelectric-vibration two-axis angular velocity sensor is used as the gyro sensor, detecting the vertical (up-down) movement of the user's head and the horizontal movement of the face. However, various other angular velocity sensors, such as one-axis and three-axis types, can be used as the gyro sensor. As the number of axes the gyro sensor can detect increases, the movement of the user's head can be detected in more detail.

  For example, the image display unit may further include an acceleration sensor, and in addition to the determinations in steps S108 to S112 of the motion detection process, the inclination, movement, vibration, impact, and the like of the image display unit obtained from the output value of the acceleration sensor may be used to determine “a head movement exceeding a certain amount”. Using the gyro sensor and the acceleration sensor in combination can improve the determination accuracy of the detection unit. In this case, the acceleration sensor corresponds to the “acceleration detection unit” in the claims.

  For example, instead of the gyro sensor, the image display unit may include an electronic compass that uses geomagnetism to detect the direction in which the image display unit is pointing, and an acceleration sensor that detects the acceleration of the image display unit. In this case, the detection unit can make the determination in step S110 of the motion detection process using the angle obtained from the azimuth detected by the electronic compass, and the determination in step S112 using the acceleration detected by the acceleration sensor. The electronic compass corresponds to the “geomagnetic detection unit” in the claims, and the acceleration sensor corresponds to the “acceleration detection unit” in the claims. In this way, the same effects as in the embodiment can be obtained even with a configuration that includes an electronic compass and an acceleration sensor instead of the gyro sensor.

C4. Modification 4:
In the motion detection process (FIG. 6) of the above embodiment, the detection unit sets the initial position when the setting button is pressed, and then continues the motion detection process. However, this trigger can be changed in various ways.

  For example, step S102 of the motion detection process may be omitted, and the detection unit may set the initial position and execute step S104 and subsequent steps when the head mounted display HM is powered on.

  For example, in step S102 of the motion detection process, the detection unit may set the initial position and execute step S104 and subsequent steps when it determines that the user has held the same posture for a certain time. Specifically, the detection unit can determine the user's posture by monitoring changes in the angle and angular velocity obtained from the output value of the gyro sensor. In this way, the user does not need to operate the setting button, so the usability of the head mounted display HM can be improved.
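One way the "same posture for a certain time" check could be implemented is a sliding window over recent gyro readings; the window length, angle tolerance, and velocity tolerance below are assumptions for illustration:

```python
def posture_held(samples, hold_time, dt, angle_tol=2.0, vel_tol=1.0):
    """Sketch of the posture-hold trigger: `samples` is a sequence of
    (angle, angular_velocity) readings taken every `dt` seconds. The
    posture counts as held when, over the most recent `hold_time`
    seconds, the angle stays within angle_tol degrees and the angular
    velocity stays below vel_tol deg/s. Tolerances are illustrative."""
    n = max(1, int(hold_time / dt))
    window = list(samples)[-n:]
    if len(window) < n:
        return False  # not enough history yet
    angles = [a for a, _ in window]
    return (max(angles) - min(angles) <= angle_tol
            and all(abs(v) <= vel_tol for _, v in window))
```

When this predicate becomes true, the detection unit could set the current angle as the initial position and proceed to step S104.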

  For example, in step S102 of the motion detection process, the detection unit may set the initial position and execute step S104 and subsequent steps when it detects activation of a specific application (for example, playback software) installed in the head mounted display HM. The specific application can be determined arbitrarily and may be configured to be designated by the user.

  For example, when the image display unit further includes an acceleration sensor, in step S102 of the motion detection process the detection unit may detect an operation of tapping the image display unit based on the acceleration obtained from the output value of the acceleration sensor, set the initial position, and execute step S104 and subsequent steps. In this way, the setting button can be omitted, improving the usability of the head mounted display HM.
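Detecting such a tap could, as one hedged sketch, look for a short acceleration spike surrounded by quieter samples; the spike and quiet levels are assumptions, not values from the embodiment:

```python
def tap_detected(accel_samples, spike=15.0, quiet=3.0):
    """Sketch of the tap trigger: a tap is taken to be one acceleration
    magnitude sample (m/s^2) above `spike` whose immediate neighbours
    are below `quiet`, distinguishing it from sustained motion.
    Both levels are illustrative placeholders."""
    for i in range(1, len(accel_samples) - 1):
        if (accel_samples[i] > spike
                and accel_samples[i - 1] < quiet
                and accel_samples[i + 1] < quiet):
            return True
    return False
```

The surrounding-quiet condition is one simple way to avoid treating ordinary head movement, which raises acceleration over many consecutive samples, as a tap.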

C5. Modification 5:
In the above embodiments, the arrangement of the acquisition unit for acquiring the change information was illustrated. However, this arrangement is merely an example, and various changes can be made.

  For example, in the first embodiment, the gyro sensor serving as the acquisition unit is disposed inside the housing of the right display unit. However, gyro sensors may be arranged in both the right display unit and the left display unit. In that case, in the motion detection process (FIG. 6), the output values of the left and right gyro sensors may be compared to obtain the angle and the like.
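One hedged sketch of comparing the two outputs follows; the disagreement limit and the choice to reject mismatched readings (rather than, say, prefer one side) are assumptions for illustration:

```python
def fused_angle(left_angle, right_angle, max_disagree=5.0):
    """Sketch of left/right gyro comparison: if the two angle readings
    (degrees) disagree by more than max_disagree, the pair is treated
    as unreliable and rejected; otherwise their mean is used.
    The rejection rule and limit are illustrative."""
    if abs(left_angle - right_angle) > max_disagree:
        return None  # sensors disagree; skip this sample
    return (left_angle + right_angle) / 2.0
```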

  For example, in the second embodiment, the eye camera as the acquisition unit is arranged on the inner surface of the right optical panel. However, the eye camera may be arranged on both the inner surface of the right optical panel and the inner surface of the left optical panel. In this case, in the line-of-sight detection process (FIG. 10), the line-of-sight movement amount may be obtained using the movements of the left and right irises detected by the left and right eye cameras.
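The line-of-sight comparison against the third threshold could, as one illustrative sketch, use the Euclidean distance between the detected iris centre and its resting centre position; the pixel units and the Th3 value are assumptions:

```python
import math

def gaze_exceeds(iris_center, rest_center, th3=20.0):
    """Sketch of the line-of-sight check: the line-of-sight movement
    amount is taken as the Euclidean distance (pixels) between the iris
    centre detected in the eye-camera image and the resting centre
    position, compared against a third threshold Th3 (illustrative)."""
    dx = iris_center[0] - rest_center[0]
    dy = iris_center[1] - rest_center[1]
    return math.hypot(dx, dy) > th3
```

With eye cameras on both sides, the left and right distances could be combined (for example averaged) before the comparison, which is one way the two-camera arrangement could improve robustness.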

Description of symbols:
10, 10a ... Control unit (controller)
12 ... Lighting part
14 ... Touch pad
16 ... Cross key
18 ... Power switch
20, 20a ... Image display unit
21 ... Ear hook part
22 ... Right display drive part
24 ... Left display drive part
26 ... Right optical panel
28 ... Left optical panel
32 ... Right earphone
34 ... Left earphone
40 ... Connector
42 ... Right cord
44 ... Left cord
46 ... Connecting member
48 ... Body cord
51 ... Transmitter
52 ... Transmitter
53 ... Receiver
54 ... Receiver
61 ... Gyro sensor (angular velocity detection unit)
62 ... Setting button
63 ... Eye camera (imaging unit)
81 ... Wireless communication unit
82 ... Wireless communication unit
110 ... Input information acquisition unit
120 ... Storage unit
130 ... Power supply
140 ... CPU
145, 145a ... Detection unit
160 ... Image processing unit
170 ... Audio processing unit
180 ... Interface
190 ... Display control unit
201 ... Right backlight control unit (image light generation unit, light source)
202 ... Left backlight control unit (image light generation unit, light source)
211 ... Right LCD control unit (image light generation unit, display element)
212 ... Left LCD control unit (image light generation unit, display element)
221 ... Right backlight (image light generation unit, light source)
222 ... Left backlight (image light generation unit, light source)
241 ... Right LCD (image light generation unit, display element)
242 ... Left LCD (image light generation unit, display element)
251 ... Right projection optical system (light guide unit)
252 ... Left projection optical system (light guide unit)
261 ... Right light guide plate (light guide unit)
262 ... Left light guide plate (light guide unit)
PCLK ... Clock signal
VSync ... Vertical synchronization signal
HSync ... Horizontal synchronization signal
Data ... Image data
CA ... Infrared
MA ... Infrared
OA ... External equipment
SC ... Outside scene
PC ... Personal computer
AE ... Iris
AX ... Center position
VI ... Virtual image
CI, CIa ... Threshold information
IL ... Illumination light
PL ... Image light
HM, HMa ... Head mounted display (head-mounted display device)
VR ... Field of view
Th1 ... First threshold
Th2 ... Second threshold
Th3 ... Third threshold

Claims (11)

  1. A head-mounted display device comprising:
    an image display unit that includes an image light generation unit that generates and emits image light representing an image and a light guide unit that guides the emitted image light to a user's eyes, and that allows the user to visually recognize a virtual image;
    a control unit that is connected to the image display unit and controls image display by the image display unit; and
    a detection unit that acquires change information indicating a change in the orientation of the image display unit and, using the change information, detects movement of the head exceeding a certain amount by a user wearing the image display unit,
    wherein, when movement of the head exceeding the certain amount is detected, the control unit adjusts the luminance of the image light generation unit or adjusts the image light generated by the image light generation unit so as to reduce the visibility of the virtual image.
  2. The head-mounted display device according to claim 1, wherein
    the detection unit sets, in response to occurrence of a predetermined trigger, an initial position that serves as a reference position when detecting movement of the image display unit, acquires the change information relative to the initial position, and detects the movement of the head exceeding the certain amount using the change information.
  3. The head-mounted display device according to claim 2, wherein
    the detection unit detects the movement of the head exceeding the certain amount using a combination of a head angle, corresponding to vertical head movement of the user wearing the image display unit, and a face direction, corresponding to horizontal face movement, each specified from the initial position and the change information.
  4. The head-mounted display device according to claim 2 or 3, wherein
    the image display unit includes an angular velocity detection unit that detects an angular velocity of the image display unit, and
    the detection unit acquires the detected angular velocity as the change information, and determines that movement of the head exceeding the certain amount is detected when the angle obtained from the angular velocity exceeds a predetermined first threshold and the angular velocity exceeds a predetermined second threshold.
  5. The head-mounted display device according to any one of claims 2 to 4, wherein
    the image display unit includes an acceleration detection unit that detects acceleration of the image display unit, and
    the detection unit further acquires the detected acceleration as the change information, and determines whether movement of the head exceeding the certain amount is detected using the inclination of the image display unit obtained from the acceleration.
  6. The head-mounted display device according to claim 2 or 3, wherein
    the image display unit further includes:
    a geomagnetic detection unit that detects the orientation of the image display unit using geomagnetism; and
    an acceleration detection unit that detects acceleration of the image display unit, and
    the detection unit acquires the detected azimuth and the detected acceleration as the change information, and determines that movement of the head exceeding the certain amount is detected when the angle obtained from the azimuth exceeds a predetermined first threshold and the acceleration exceeds a predetermined second threshold.
  7. The head-mounted display device according to any one of claims 1 to 6, wherein
    the predetermined trigger is at least one of power-on of the head-mounted display device, detection of activation of a predetermined application, and detection of a press of a predetermined button.
  8. The head-mounted display device according to any one of claims 1 to 7, wherein
    the image light generation unit includes:
    a display element for generating the image; and
    a light source for emitting image light representing the generated image, and
    the control unit adjusts the luminance of the image light generation unit by turning off the light source or reducing its illumination light when movement of the head exceeding the certain amount is detected.
  9. The head-mounted display device according to any one of claims 1 to 7, wherein
    the control unit adjusts the image light generated by the image light generation unit by temporarily stopping the generation of image light by the image light generation unit when movement of the head exceeding the certain amount is detected.
  10. The head-mounted display device according to any one of claims 1 to 9, wherein
    the image display unit includes an imaging unit that captures an image of the user's eyeball,
    the detection unit further acquires, by analyzing the captured image of the eyeball, a line-of-sight movement amount indicating an amount of movement relative to the center position of the user's iris, and
    the control unit further adjusts the luminance of the image light generation unit or adjusts the image light generated by the image light generation unit so as to reduce the visibility of the virtual image when the line-of-sight movement amount exceeds a predetermined third threshold.
  11. A method for controlling a head-mounted display device, comprising:
    (a) causing a user to visually recognize a virtual image using an image light generation unit that generates and emits image light representing an image and a light guide unit that guides the emitted image light to the user's eyes;
    (b) controlling the image display in step (a);
    (c) acquiring change information indicating a change in the orientation of the head-mounted display device and, using the change information, detecting movement of the head exceeding a certain amount by a user wearing the head-mounted display device; and
    (d) when movement of the head exceeding the certain amount is detected, adjusting the luminance of the image light generation unit or adjusting the image light generated by the image light generation unit so as to reduce the visibility of the virtual image.
JP2011066393A 2011-03-24 2011-03-24 Head mounted display and method for controlling head mounted display Pending JP2012203128A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011066393A JP2012203128A (en) 2011-03-24 2011-03-24 Head mounted display and method for controlling head mounted display

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2011066393A JP2012203128A (en) 2011-03-24 2011-03-24 Head mounted display and method for controlling head mounted display
US13/419,010 US9217867B2 (en) 2011-03-24 2012-03-13 Head-mounted display device and control method for the head-mounted display device
US14/566,014 US9678346B2 (en) 2011-03-24 2014-12-10 Head-mounted display device and control method for the head-mounted display device
US14/943,549 US9588345B2 (en) 2011-03-24 2015-11-17 Head-mounted display device and control method for the head-mounted display device

Publications (1)

Publication Number Publication Date
JP2012203128A true JP2012203128A (en) 2012-10-22

Family

ID=47184226

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011066393A Pending JP2012203128A (en) 2011-03-24 2011-03-24 Head mounted display and method for controlling head mounted display

Country Status (1)

Country Link
JP (1) JP2012203128A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013195924A (en) * 2012-03-22 2013-09-30 Sony Corp Head-mounted display
WO2014128752A1 (en) * 2013-02-19 2014-08-28 株式会社ブリリアントサービス Display control device, display control program, and display control method
JP2014186089A (en) * 2013-03-22 2014-10-02 Seiko Epson Corp Head-mounted type display device and method for controlling head-mounted type display device
WO2015084323A1 (en) 2013-12-03 2015-06-11 Nokia Corporation Display of information on a head mounted display
KR101560503B1 (en) * 2014-11-21 2015-10-15 성균관대학교산학협력단 Apparatus, system and method for controlling playback speed of video
WO2015189987A1 (en) * 2014-06-13 2015-12-17 日立マクセル株式会社 Wearable information display/input system, and portable information input/output device and information input method which are used therein
JP2016532178A (en) * 2013-06-08 2016-10-13 株式会社ソニー・インタラクティブエンタテインメント System and method for transitioning between transmissive mode and non-transmissive mode in a head mounted display
WO2016194232A1 (en) * 2015-06-05 2016-12-08 日立マクセル株式会社 Video display device and control method
US9568996B2 (en) 2013-02-13 2017-02-14 Seiko Epson Corporation Image display device and display control method for image display device
JPWO2015068440A1 (en) * 2013-11-08 2017-03-09 ソニー株式会社 Information processing apparatus, control method, and program
JPWO2015125363A1 (en) * 2014-02-21 2017-03-30 ソニー株式会社 Electronic apparatus and image providing method
JP2017072864A (en) * 2017-01-17 2017-04-13 株式会社Jvcケンウッド Image display device
WO2017104089A1 (en) * 2015-12-18 2017-06-22 日立マクセル株式会社 Collaborative head-mounted display system, system including display device and head-mounted display, and display device
WO2018084227A1 (en) * 2016-11-02 2018-05-11 シャープ株式会社 Terminal device, operating method, and program
WO2018173159A1 (en) * 2017-03-22 2018-09-27 マクセル株式会社 Image display device
JP2018190078A (en) * 2017-04-28 2018-11-29 株式会社スクウェア・エニックス Contents display program, computer device, contents display method, and contents display system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0678247A (en) * 1992-08-24 1994-03-18 Olympus Optical Co Ltd Head-mounted type display device
JPH0678248A (en) * 1992-08-28 1994-03-18 Sony Corp Visual device
JPH0819004A (en) * 1994-06-30 1996-01-19 Victor Co Of Japan Ltd Head mount display device
JPH08328512A (en) * 1995-05-26 1996-12-13 Canon Inc Head mounting type display device
US5621424A (en) * 1992-08-24 1997-04-15 Olympus Optical Co., Ltd. Head mount display apparatus allowing easy switching operation from electronic image to external field image
JPH10341387A (en) * 1997-06-10 1998-12-22 Canon Inc Display device
JP2002312117A (en) * 2001-04-17 2002-10-25 Japan Aviation Electronics Industry Ltd Cylindrical image spherical image control device
JP2004056272A (en) * 2002-07-17 2004-02-19 Japan Aviation Electronics Industry Ltd All-round image display apparatus and magnetism correction method of head tracker used for the same
JP2004096224A (en) * 2002-08-29 2004-03-25 Sony Corp Power supply control method and head mount device
JP2005321479A (en) * 2004-05-06 2005-11-17 Olympus Corp Head mounted type display device
JP2008264341A (en) * 2007-04-24 2008-11-06 Chube Univ Eye movement measurement method and eye movement measuring instrument


Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013195924A (en) * 2012-03-22 2013-09-30 Sony Corp Head-mounted display
US9568996B2 (en) 2013-02-13 2017-02-14 Seiko Epson Corporation Image display device and display control method for image display device
WO2014128752A1 (en) * 2013-02-19 2014-08-28 株式会社ブリリアントサービス Display control device, display control program, and display control method
US9933853B2 (en) 2013-02-19 2018-04-03 Mirama Service Inc Display control device, display control program, and display control method
JPWO2014128752A1 (en) * 2013-02-19 2017-02-02 株式会社ブリリアントサービス Display control device, display control program, and display control method
JP2014186089A (en) * 2013-03-22 2014-10-02 Seiko Epson Corp Head-mounted type display device and method for controlling head-mounted type display device
US9823473B2 (en) 2013-03-22 2017-11-21 Seiko Epson Corporation Head-mounted display device and control method for head-mounted display device
JP2016532178A (en) * 2013-06-08 2016-10-13 株式会社ソニー・インタラクティブエンタテインメント System and method for transitioning between transmissive mode and non-transmissive mode in a head mounted display
US9908048B2 (en) 2013-06-08 2018-03-06 Sony Interactive Entertainment Inc. Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display
JPWO2015068440A1 (en) * 2013-11-08 2017-03-09 ソニー株式会社 Information processing apparatus, control method, and program
JP2017502570A (en) * 2013-12-03 2017-01-19 ノキア テクノロジーズ オーユー Displaying information on the head mounted display
WO2015084323A1 (en) 2013-12-03 2015-06-11 Nokia Corporation Display of information on a head mounted display
EP3078019A4 (en) * 2013-12-03 2017-06-14 Nokia Technologies Oy Display of information on a head mounted display
US10386921B2 (en) 2013-12-03 2019-08-20 Nokia Technologies Oy Display of information on a head mounted display
JPWO2015125363A1 (en) * 2014-02-21 2017-03-30 ソニー株式会社 Electronic apparatus and image providing method
WO2015189987A1 (en) * 2014-06-13 2015-12-17 日立マクセル株式会社 Wearable information display/input system, and portable information input/output device and information input method which are used therein
KR101560503B1 (en) * 2014-11-21 2015-10-15 성균관대학교산학협력단 Apparatus, system and method for controlling playback speed of video
JPWO2016194232A1 (en) * 2015-06-05 2018-05-24 マクセル株式会社 Video display device and control method
WO2016194232A1 (en) * 2015-06-05 2016-12-08 日立マクセル株式会社 Video display device and control method
WO2017104089A1 (en) * 2015-12-18 2017-06-22 日立マクセル株式会社 Collaborative head-mounted display system, system including display device and head-mounted display, and display device
WO2018084227A1 (en) * 2016-11-02 2018-05-11 シャープ株式会社 Terminal device, operating method, and program
JP2017072864A (en) * 2017-01-17 2017-04-13 株式会社Jvcケンウッド Image display device
WO2018173159A1 (en) * 2017-03-22 2018-09-27 マクセル株式会社 Image display device
JP2018190078A (en) * 2017-04-28 2018-11-29 株式会社スクウェア・エニックス Contents display program, computer device, contents display method, and contents display system

Similar Documents

Publication Publication Date Title
EP2148504B1 (en) Information Display Device
US9046686B2 (en) Head-mount type display device
JP6186689B2 (en) Video display system
US9652036B2 (en) Device, head mounted display, control method of device and control method of head mounted display
CN104076512B (en) The control method of head-mount type display unit and head-mount type display unit
KR101845350B1 (en) Head-mounted display device, control method of head-mounted display device, and display system
EP3029550B1 (en) Virtual reality system
US9557566B2 (en) Head-mounted display device and control method for the head-mounted display device
US8670000B2 (en) Optical display system and method with virtual image contrast control
JP5884576B2 (en) Head-mounted display device and method for controlling head-mounted display device
CN105045375B (en) Head-mounted display device, control method therefor, control system, and computer program
US9217867B2 (en) Head-mounted display device and control method for the head-mounted display device
US9372345B2 (en) Head-mounted display device
US9959591B2 (en) Display apparatus, method for controlling display apparatus, and program
US9967487B2 (en) Preparation of image capture device in response to pre-image-capture signal
US9454006B2 (en) Head mounted display and image display system
US9824496B2 (en) Information display system using head mounted display device, information display method using head mounted display device, and head mounted display device
CN104423045B (en) Head-mount type display unit
US9411160B2 (en) Head mounted display, control method for head mounted display, and image display system
JP2012165084A (en) Head-mounted display device and head-mounted display device control method
JP6064464B2 (en) Head-mounted display device, head-mounted display device control method, and authentication system
US10162412B2 (en) Display, control method of display, and program
US10175923B2 (en) Display system, display device, information display method, and program
US10297062B2 (en) Head-mounted display device, control method for head-mounted display device, and computer program
US9542958B2 (en) Display device, head-mount type display device, method of controlling display device, and method of controlling head-mount type display device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20140121

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20141015

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20141028

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20141212

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20150317

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20150617

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20150624

A912 Removal of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A912

Effective date: 20150717