WO2018116519A1 - Head-mounted display - Google Patents

Head-mounted display

Info

Publication number
WO2018116519A1
WO2018116519A1 (PCT/JP2017/029078)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
display
image
user
head
Prior art date
Application number
PCT/JP2017/029078
Other languages
French (fr)
Japanese (ja)
Inventor
宏征 高橋
Original Assignee
ブラザー工業株式会社 (Brother Industries, Ltd.)
Priority date
Filing date
Publication date
Application filed by ブラザー工業株式会社 (Brother Industries, Ltd.)
Publication of WO2018116519A1 publication Critical patent/WO2018116519A1/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02 Viewing or reading apparatus
    • G PHYSICS
    • G02 OPTICS
    • G02F OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
    • G02F1/13 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
    • G PHYSICS
    • G02 OPTICS
    • G02F OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
    • G02F1/13 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133 Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/36 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/64 Constructional details of receivers, e.g. cabinets or dust covers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/66 Transforming electric information into light information

Definitions

  • This disclosure relates to a head mounted display.
  • a display phenomenon that reduces the visibility of the user may occur according to a change in the positional relationship between the display unit on which the image is displayed and the user's eyes.
  • techniques for suppressing the occurrence of such a display phenomenon have been proposed.
  • color breakup (also called color breaking) occurs when an image is displayed on a display unit by the field sequential method (hereinafter referred to as "FS method").
  • in the FS method, red (R), green (G), and blue (B) light is emitted while being switched continuously, and the switching speed exceeds the temporal resolution of the human eye, so the RGB colors are perceived as mixed.
  • color breakup occurs when the RGB images displayed sequentially over time do not mix well.
  • the RGB images are then visually recognized as afterimages and the like, and the visibility of the user is reduced.
  • techniques for suppressing this loss of visibility due to color breakup have been proposed (see, for example, Patent Document 1).
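  • as a minimal, hypothetical illustration of the temporal mixing described above (a sketch, not the patent's implementation; all names and numbers are assumptions), the following Python snippet shows how FS subframes average into a single perceived color, and how a cycle truncated by a gaze shift is seen as a colored fringe:

      def perceived(subframes):
          # The eye integrates light faster than the subframes switch, so the
          # perceived color is roughly the per-channel average of the subframes.
          n = len(subframes)
          return tuple(sum(sf[i] for sf in subframes) // n for i in range(3))

      # A "white" pixel in the FS method: full R, then full G, then full B.
      white_fs = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
      print(perceived(white_fs))   # (85, 85, 85): a uniform neutral tone

      # If the gaze shifts mid-cycle, one retinal spot may catch only some
      # subframes; the residue is seen as a colored fringe (color breakup).
      fringe = [(255, 0, 0), (0, 255, 0)]   # the B subframe landed elsewhere
      print(perceived(fringe))     # (127, 127, 0): a yellowish fringe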
  • in a head mounted display (HMD), a display unit is held on the user's head via an arm unit or the like.
  • the positional relationship between the user's eyes and the display unit is easily changed as compared with a general installation type display device.
  • display phenomena such as the above-mentioned color breakup that reduce the visibility of the user are therefore likely to occur.
  • An object of the present disclosure is to provide a head mounted display that can suppress a decrease in the visibility of the user in accordance with a change in the positional relationship between the user's eyes and the display unit.
  • the head-mounted display is provided with a display unit having a display element that displays a color image by controlling the output timing of RGB light, and for mounting the display unit on a user's head.
  • a mounting unit including at least an adjustment unit capable of adjusting a position of the display unit with respect to the user's eye, and a position detection unit that detects a positional relationship between the user's eye and the display unit.
  • a control means for displaying an image on the display element in a display method for outputting RGB light at an output timing corresponding to the positional relationship detected by the position detecting means.
  • in the HMD, when the positional relationship between the user's eyes and the display unit satisfies a predetermined condition, that positional relationship is likely to change depending on the direction of the user's line of sight.
  • the HMD outputs RGB light from the display element, according to the positional relationship between the user's eyes and the display unit, at an output timing at which a display phenomenon that reduces the visibility of the user is unlikely to occur, and displays an image on the display element. The HMD can thereby suppress a decrease in the visibility of the user.
  • the head-mounted display is provided with a display unit having a display element that displays a color image by controlling the output timing of RGB light, and for mounting the display unit on a user's head
  • a mounting unit having an adjustment unit capable of adjusting a position of the display unit with respect to the user's eye, and a position detection unit that detects a positional relationship between the user's eye and the display unit,
  • control means for causing the display element to display an image in a first method or a second method, whose RGB light output timings differ from each other, in accordance with the positional relationship detected by the position detection unit. The first method switches and outputs the RGB light in a first period in a time-division manner, and the second method switches and outputs the RGB light in a second period, shorter than the first period, in a time-division manner.
  • FIG. 2 is an enlarged view of the first ball joint 6 and the second ball joint 7.
  • FIG. 3 is a view showing the HMD 1 in a state where the HD 2 is arranged in position.
  • a head mounted display (hereinafter referred to as “HMD”) 1 is a video transmission type HMD.
  • the HMD 1 includes a wearing tool 10, a mounting unit 11, and a display device 2 (hereinafter referred to as “HD2”).
  • the upper side, lower side, left side, right side, front side, and rear side of the HMD 1 are defined.
  • the upper side, the lower side, the left side, the right side, the front side, and the rear side of the HMD 1 correspond to, for example, the upper side, the lower side, the upper left side, the lower right side, the left side, and the right side in FIG.
  • the upper side, the lower side, the left side, the right side, the front side, and the rear side of the HMD 1 correspond to the upper side, the lower side, the right side, the left side, the front side, and the rear side, respectively, for the user wearing the wearing tool 10.
  • the wearing tool 10 is made of a flexible material such as resin or metal (for example, stainless steel).
  • the wearing tool 10 includes a first portion 10A and second portions 10B and 10C.
  • for convenience, the wearing tool 10 is described as being divided into the first part 10A and the second parts 10B and 10C, but it is not actually divided into these separate members; it is an integral member as a whole.
  • the first part 10A and the second part 10B, 10C are each curved and elongated plate-like members.
  • the first portion 10A extends in the left-right direction between the position 102 and the position 103 in the wearing tool 10.
  • the first portion 10A is curved in a convex shape on the front side.
  • the position 102 is located on the left side of the center 101 in the left-right direction of the wearing tool 10.
  • the position 103 is located on the right side of the center 101 in the left-right direction of the wearing tool 10.
  • the second portion 10B extends rearward from the position 102 in the wearing tool 10.
  • the second portion 10C extends rearward from the position 103 in the wearing tool 10.
  • the second portions 10B and 10C each extend in a direction in which the rear end portions approach each other.
  • the wearing tool 10 is worn on the user's head with the first part 10A and the second parts 10B and 10C in contact with the user's forehead, the right side of the head, and the left side of the head, respectively.
  • the mounting portion 11 is fixed at the position 103 of the wearing tool 10.
  • the side surrounded by the first portion 10A and the second portions 10B and 10C in the wearing tool 10 is referred to as “inside”, and the side opposite to the inside is referred to as “outside”.
  • the mounting part 11 includes an arm part 12, a fixing part 14, a first ball joint 6, and a second ball joint 7.
  • the fixing part 14 is fixed to the position 103 of the wearing tool 10 with a screw.
  • the arm part 12 is substantially rod-shaped.
  • the arm portion 12 is made of resin, metal, or the like.
  • the arm part 12 extends in the vertical direction as viewed from the front.
  • the upper end portion of the arm portion 12 is connected to the fixed portion 14 via the first ball joint 6.
  • the lower end portion of the arm portion 12 is connected to the HD 2 via the second ball joint 7.
  • the arm part 12 connects the fixing part 14 and the HD 2.
  • the arm portion 12 and the fixing portion 14 can hold the HD 2 on the front side of the user's left eye in a state where the wearing tool 10 is worn on the user's head.
  • the first ball joint 6 and the second ball joint 7 can adjust the position of HD2 with respect to the user's left eye 8 (see FIG. 3 and the like).
  • the first ball joint 6 includes a ball stud 61 and a socket 62.
  • the ball stud 61 has a rod portion 61A and a sphere portion 61B.
  • the spherical body portion 61B is a spherical portion provided at one end of the rod portion 61A.
  • the socket 62 supports the sphere 61B so as to be slidable.
  • the direction orthogonal to the wearing tool 10 at the position 103, where the fixing part 14 is positioned, is referred to as the "orthogonal direction."
  • the socket 62 has a lid portion 621 and a receiving portion 622.
  • the receiving part 622 has a cylindrical shape and extends outward from the fixed part 14 along the orthogonal direction.
  • the spherical portion 61B is accommodated inside the receiving portion 622.
  • the receiving portion 622 has a thread on the outer surface.
  • the lid 621 has a thread on the inner surface.
  • the lid portion 621 is screwed into the receiving portion 622.
  • the lid 621 has a circular hole 621A.
  • the rod portion 61A of the ball stud 61 extends from the inside of the receiving portion 622 toward the hole portion 621A and passes through the hole portion 621A.
  • the rod portion 61A extends outward along the orthogonal direction from the sphere portion 61B, bends rightward, further extends along the left-right direction, and is supported by the arm portion 12.
  • the bending angle θ1 of the rod portion 61A is about 30 degrees.
  • the first ball joint 6 connects the fixing part 14 and the arm part 12 in a rotatable manner.
  • the movable range of the arm portion 12 with respect to the fixed portion 14 is limited by the contact of the rod portion 61A with the hole portion 621A.
  • the second ball joint 7 includes a ball stud 71 and a socket 72.
  • the ball stud 71 has a rod portion 71A and a sphere portion 71B.
  • the spherical portion 71B is a spherical portion provided at one end of the rod portion 71A.
  • the socket 72 is provided in the HD2.
  • the socket 72 supports the sphere 71B so as to be slidable.
  • the socket 72 has a lid portion 721 and a receiving portion 722.
  • the receiving part 722 is cylindrical and extends from the HD2.
  • the spherical portion 71B is accommodated in the receiving portion 722.
  • the receiving portion 722 has a thread on the outer surface.
  • the lid 721 has a thread on the inner surface.
  • the lid part 721 is screwed into the receiving part 722.
  • the lid 721 has a circular hole 721A.
  • the rod portion 71A of the ball stud 71 extends from the inside of the receiving portion 722 toward the hole portion 721A and passes through the hole portion 721A.
  • the other end portion of the rod portion 71A is supported by the arm portion 12.
  • the rod portion 71A extends from the arm portion 12 toward the left side, bends in a direction away from the wearing tool 10, and further extends, and is supported by the HD 2 via the sphere portion 71B and the socket 62.
  • the bending angle θ2 of the rod portion 71A is about 30 degrees.
  • the second ball joint 7 connects the HD 2 and the arm portion 12 in a rotatable manner.
  • the movable range of HD2 with respect to the arm portion 12 is limited by the rod portion 71A coming into contact with the hole portion 721A.
  • the HD 2 includes a housing 21.
  • the casing 21 has a hollow box shape.
  • a rectangular opening is provided at the rear end of the housing 21.
  • the opening is covered with a transparent plate member.
  • the liquid crystal panel 2A is accommodated in the housing 21.
  • the liquid crystal panel 2A displays a color image by controlling the output timing of each of red (R), green (G), and blue (B) light.
  • the liquid crystal panel 2A can display an image in one of two display methods whose RGB light output timings differ from each other: the field sequential method (hereinafter referred to as "FS method") or another method.
  • the liquid crystal panel 2A is electrically connected to the external device 9 (see FIG. 5) via the cable 2B.
  • the external device 9 is a control box that performs control for displaying an image on the liquid crystal panel 2A, for example.
  • the liquid crystal panel 2A displays an image on the display surface based on the image signal received from the external device 9 via the cable 2B.
  • the light of the image displayed on the display surface of the liquid crystal panel 2A is emitted toward the opening of the housing 21.
  • the light of the image passes through the opening of the housing 21 and further passes through the transparent plate member to the rear side.
  • the image light is emitted from the housing 21 to the outside.
  • the display surface of the liquid crystal panel 2A, in other words the portion of the liquid crystal panel 2A from which the image light is emitted, is referred to as the "emission part 22".
  • the user fixes the wearing tool 10 of the HMD 1 to his or her head.
  • the user holds the HD 2 and, as shown in FIG. 3A, adjusts the position of the HD 2 so that some position of the emission part 22 of the liquid crystal panel 2A faces the front side of the user's left eye 8.
  • because the first ball joint 6 and the second ball joint 7 rotate freely, the user can easily move the HD 2.
  • a region of positions of the HD 2 in which some position of the emission part 22 faces the front side of the user's left eye 8 is referred to as the "central region".
  • a center C11 of a sphere portion 61B (see FIG. 2A) of the first ball joint 6 is defined.
  • the center C12 of the emission part 22 of HD2 is defined.
  • the position of HD2 when the center C12 of the emitting portion 22 faces the front side of the user's left eye 8 (see FIG. 3A) is referred to as a “first position”.
  • the first position is included in the central region.
  • the distance between the centers C11 and C12 when the HD2 is disposed at the first position is referred to as a “first distance L1”.
  • a signal is output from the external device 9 (see FIG. 5) to the liquid crystal panel 2A of the HD 2 via the cable 2B (see FIG. 1).
  • An image is displayed on the emission part 22 of the liquid crystal panel 2A.
  • the light of the displayed image is emitted to the rear side through the opening of the housing 21.
  • the emitted image light enters the left eye 8 of the user. Thereby, the user recognizes the image.
  • the user may arrange and use the HD2 so that both the image displayed on the liquid crystal panel 2A of the HD2 and the scenery in front of the user can be visually recognized.
  • the user rotates the second ball joint 7 clockwise, as viewed from above, by a first predetermined angle or more from the state in which the HD 2 is arranged at the first position (see FIGS. 3A and 3B). That is, the user rotates the second ball joint 7 so that its rotation angle about the vertical direction is larger than the rotation angle of the first ball joint 6 about the vertical direction.
  • the user moves the HD 2 until the emission part 22 is not disposed opposite to the front side of the left eye 8.
  • the HD 2 moves in a direction away from the wearing tool 10.
  • the distance between the centers C11 and C12 changes to a second distance L2 that is longer than the first distance L1.
  • the position of the HD 2 after it has been moved by rotating the second ball joint 7 by the first predetermined angle or more is referred to as the "second position".
  • the distance between the centers C11 and C12 is the second distance L2.
  • a predetermined area including HD2 arranged at the second position is referred to as a “first peripheral area”.
  • the rod portion 71A of the ball stud 71 of the second ball joint 7 extends from the arm portion 12, bends in the direction away from the wearing tool 10, and is connected to the spherical portion 71B supported by the socket 72. For this reason, compared with the case where the rod portion 71A is not bent, the movable range of the HD 2 in the direction away from the face is wider, and the movable range of the HD 2 in the direction approaching the face is narrower. The user can therefore easily move the HD 2 in the direction away from the face and place it at the second position.
  • the user may use the HD2 by placing it out of the field of view so that the scenery in front of him can be seen well.
  • the user rotates the first ball joint 6 clockwise, as viewed from the right side, by a second predetermined angle or more from the state in which the HD 2 is arranged at the first position (see FIGS. 3A and 4). That is, the user rotates the first ball joint 6 so that its rotation angle about the left-right direction is larger than the rotation angle of the second ball joint 7 about the left-right direction.
  • the user moves the HD 2 until the emitting portion 22 is not disposed to face the front side of the left eye 8.
  • the distance between the centers C11 and C12 is maintained at a third distance L3 that is substantially the same as the first distance L1.
  • the position of the HD 2 after it has been moved by rotating the first ball joint 6 by the second predetermined angle or more is referred to as the "third position".
  • the distance between the centers C11 and C12 is the third distance L3.
  • a predetermined region including HD2 arranged at the third position is referred to as a “second peripheral region”.
  • the rod portion 61A of the ball stud 61 of the first ball joint 6 extends from the spherical portion 61B along the orthogonal direction, bends rightward, extends along the left-right direction, and is supported by the arm portion 12. That is, the extending direction (left-right direction) of part of the rod portion 61A coincides with the axial direction (left-right direction) about which the first ball joint 6 rotates. For this reason, the first ball joint 6 rotates about the left-right direction more easily than in the case where the rod portion 61A is not bent. The user can therefore easily move the HD 2 from the first position to the third position by rotating the first ball joint 6.
  • the wearing tool 10 includes an acceleration sensor 41.
  • the acceleration sensor 41 is electrically connected to the FPGA 33 (described later) of the HD 2.
  • the acceleration sensor 41 detects respective accelerations in the three-axis directions (X-axis direction, Y-axis direction, and Z-axis direction) that act on the wearing tool 10.
  • the acceleration sensor 41 outputs a signal indicating the detected acceleration to the FPGA 33.
  • the HD 2 includes a CPU 31, a storage unit 32, a liquid crystal panel 2A, FPGAs 33 to 36, acceleration sensors 37 and 38, and a camera 39.
  • the CPU 31 controls the entire HMD 1.
  • the storage unit 32, the liquid crystal panel 2A, the FPGAs 33, 34, and 36, and the acceleration sensor 38 are electrically connected to the CPU 31.
  • the acceleration sensor 37 is electrically connected to the FPGA 33.
  • the camera 39 is electrically connected to the FPGA 34.
  • the FPGA 35 is electrically connected to the FPGA 36.
  • the storage unit 32 stores a program for main processing (see FIG. 7) executed by the CPU 31, various parameters, and the like.
  • the HMD 1 may have one FPGA having all the functions of the FPGAs 33 to 36 instead of the FPGAs 33 to 36.
  • the acceleration sensor 37 detects the respective accelerations in the three-axis directions that act on the HD 2.
  • the acceleration sensor 37 outputs a signal indicating the detected acceleration to the FPGA 33.
  • the FPGA 33 is a programmable logic device (PLD) having a function of detecting the position of the HD 2 with respect to the wearing tool 10.
  • the FPGA 33 specifies acceleration acting on the HD 2 based on the signal output from the acceleration sensor 37.
  • the FPGA 33 specifies the acceleration acting on the wearing tool 10 based on the signal output from the acceleration sensor 41.
  • the FPGA 33 detects the position of the HD 2 with respect to the wearing tool 10 by calculating the difference between the two accelerations.
  • the FPGA 33 outputs a signal indicating the detected position of the HD 2 to the CPU 31.
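  • a minimal sketch of this difference-based detection, under the assumption that the two 3-axis readings are sampled together (NumPy-based; the helper names and the naive double integration are illustrative assumptions, not the FPGA 33's actual implementation):

      import numpy as np

      def relative_acceleration(accel_hd, accel_tool):
          # The wearing tool and the HD 2 share the head's motion, so the
          # per-axis difference of the two 3-axis readings isolates motion
          # of the HD 2 relative to the wearing tool.
          return np.asarray(accel_hd) - np.asarray(accel_tool)

      def relative_displacement(rel_accel_samples, dt):
          # Double integration turns relative acceleration into a relative
          # position estimate; a real design would also correct for drift.
          velocity = np.cumsum(rel_accel_samples, axis=0) * dt
          return np.cumsum(velocity, axis=0) * dt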
  • the camera 39 is provided in the casing 21 of the HD 2 (see FIG. 1) and can photograph the area behind the casing 21.
  • the camera 39 outputs image data of the captured image to the FPGA 34.
  • the FPGA 34 is a PLD having a function of detecting the direction of the user's line of sight.
  • the method for specifying the direction of the line of sight by the FPGA 34 is as follows.
  • from the image based on the image data output from the camera 39, the FPGA 34 specifies the inner corner of the user's eye as a reference point and the iris of the user's eye as a moving point.
  • the FPGA 34 specifies the direction of the user's line of sight from the position of the iris relative to the specified reference point.
  • the FPGA 34 outputs a signal indicating the detected line-of-sight direction to the CPU 31.
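  • a minimal sketch of this reference-point/moving-point scheme (hypothetical helper; the inputs are pixel coordinates found in the camera image, and the coarse labels are only for illustration):

      def gaze_direction(inner_corner, iris):
          # Offset of the iris (moving point) from the inner corner of the
          # eye (reference point) in image coordinates.
          dx = iris[0] - inner_corner[0]
          dy = iris[1] - inner_corner[1]
          horizontal = "left" if dx < 0 else "right"
          vertical = "up" if dy < 0 else "down"
          return (dx, dy), f"{horizontal}-{vertical}"

      offset, label = gaze_direction((120, 80), (134, 78))
      print(offset, label)   # (14, -2) right-up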
  • the acceleration sensor 38 detects the respective accelerations in the three-axis directions acting on the HD 2, similarly to the acceleration sensor 37.
  • the acceleration sensor 38 outputs a signal indicating the detected acceleration to the CPU 31.
  • the FPGA 35 is a PLD having a function of analyzing pixel colors.
  • the FPGA 35 detects the image signal transmitted from the external device 9 via the cable 2B.
  • the FPGA 35 analyzes each color of a plurality of pixels constituting the image based on the detected signal.
  • the FPGA 35 outputs a signal indicating the analysis result to the FPGA 36.
  • the FPGA 36 is a PLD having a function of detecting a ratio of a specific color area in the entire area of the image. Based on the signal output from the FPGA 35, the FPGA 36 detects the ratio of the black area in the entire area of the image displayed on the liquid crystal panel 2A.
  • the FPGA 36 outputs a signal indicating the detected ratio to the CPU 31.
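  • a minimal sketch of this two-stage pipeline, with the FPGA 35's per-pixel color analysis reduced to a simple black/non-black test (the cutoff value is an assumed parameter, not from the patent):

      def black_area_ratio(pixels, cutoff=16):
          # Stage 1 (FPGA 35 role): classify each RGB pixel's color.
          # Stage 2 (FPGA 36 role): report the black fraction of the frame.
          black = sum(1 for (r, g, b) in pixels if max(r, g, b) < cutoff)
          return black / len(pixels)

      frame = [(0, 0, 0)] * 750 + [(200, 180, 40)] * 250
      print(black_area_ratio(frame))   # 0.75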
  • the liquid crystal panel 2A switches the display method to one of two methods (FS method or another method) based on a signal output from the CPU 31.
  • the liquid crystal panel 2A causes the emission part 22 to display an image, in the display method according to the signal output from the CPU 31, based on the image signal output from the external device 9.
  • the first display method that the liquid crystal panel 2A of the HD 2 can use is the FS method.
  • the FS method is a method that allows a user to visually recognize a color image by switching and outputting red (R), green (G), and blue (B) light in a time-sharing manner.
  • FIG. 6A illustrates a timing chart of the vertical synchronization signal (Vsync) and the output signals of the R, G, and B light when a white image is displayed on the liquid crystal panel 2A by the FS method.
  • the R, G, and B lights are output in synchronization with the vertical synchronization signal (Vsync) while being switched in the first period T1.
  • the R, G, and B lights are arranged at different times so as to be output at different timings.
  • the light output time at each timing is t1.
  • the second display method that the liquid crystal panel 2A of the HD 2 can use is another method.
  • another method is a method that lets the user visually recognize an image by outputting only one of the R, G, and B lights output by the FS method, at the same timing as in the FS method.
  • FIG. 6B illustrates a timing chart of the vertical synchronization signal (Vsync) and the output signals of R, G, and B when the FS method of FIG. 6A is switched to another method that outputs only the green (G) light.
  • only one of the R, G, and B lights (the G light in the case of FIG. 6B) is output in each first period T1.
  • the white image in the FS method (FIG. 6A) is converted into a single-color (green) image by switching to another method.
  • the light output time t1 at each timing is the same as in the FS method.
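  • the difference between the two methods can be written out as subframe tables; a sketch assuming 8-bit channel values and the FIG. 6B case, where only the G light is kept (illustrative helpers, not the panel's actual drive logic):

      def fs_subframes(rgb):
          # FS method: one period T1 shows the R, G, and B components one
          # after another, each for output time t1.
          r, g, b = rgb
          return [(r, 0, 0), (0, g, 0), (0, 0, b)]

      def single_color_subframes(rgb):
          # Another method (FIG. 6B): only the G light is output, at the same
          # timing as its FS slot; the R and B slots stay dark. Successive
          # subframes never alternate in color, so color breakup cannot occur.
          return [(0, 0, 0), (0, rgb[1], 0), (0, 0, 0)]

      white = (255, 255, 255)
      print(fs_subframes(white))            # R, then G, then B subframes
      print(single_color_subframes(white))  # dark, green, dark: a green image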
  • the main process executed by the CPU 31 will be described with reference to FIG.
  • the main process is started when the external device 9 starts outputting the image signal and the CPU 31 executes the program stored in the storage unit 32.
  • the CPU 31 outputs a signal for displaying an image by the FS method to the liquid crystal panel 2A (S11).
  • the liquid crystal panel 2A displays an image on the emission part 22 by the FS method based on the signal output from the external device 9.
  • the CPU 31 specifies the position of the HD 2 with respect to the wearing tool 10 based on the signal output from the FPGA 33 (S13).
  • based on the specified position of the HD 2, the CPU 31 determines whether the HD 2 is arranged at any position in the central area (see FIG. 3A) (S15).
  • when the CPU 31 determines that the HD 2 is arranged at a position in the central area (S15: YES), it advances the process to S17. Details of the processing of S17 will be described later.
  • when the CPU 31 determines that the HD 2 is not arranged at any position in the central area (S15: NO), it determines whether the HD 2 is arranged at any position in the first peripheral area (see FIG. 3B) (S19). When it determines that the HD 2 is arranged at a position in the first peripheral area (S19: YES), the CPU 31 advances the process to S21. Details of the processing of S21 will be described later. On the other hand, when the CPU 31 determines that the HD 2 is not arranged at any position in the first peripheral area (S19: NO), it determines that the HD 2 is arranged at a position in the second peripheral area (see FIG. 4) and advances the process to S31. Details of the processing of S31 will be described later.
  • the CPU 31 specifies the direction of the user's line of sight based on the signal output from the FPGA 34 (S21).
  • the CPU 31 counts the number of times the specified direction of the line of sight changes by a predetermined value or more, and specifies the number of times per unit time (for example, one second) as the change frequency (S23).
  • the CPU 31 determines whether the specified change frequency is less than a predetermined frequency (S25).
  • when the CPU 31 determines that the change frequency is less than the predetermined frequency (S25: YES), it advances the process to S27. Details of the processing of S27 will be described later. On the other hand, when it determines that the change frequency is equal to or higher than the predetermined frequency (S25: NO), the CPU 31 advances the process to S31.
  • the CPU 31 specifies the ratio of the black area in the entire area of the image based on the signal output from the FPGA 36 (S27).
  • the CPU 31 determines whether the specified ratio is less than a predetermined threshold (S29).
  • when the CPU 31 determines that the ratio of the black area is less than the predetermined threshold (S29: YES), it advances the process to S17; when the ratio is equal to or greater than the threshold (S29: NO), it advances the process to S31.
  • when executing the process of S17, the CPU 31 outputs to the liquid crystal panel 2A a signal for displaying an image by the FS method (S17).
  • the liquid crystal panel 2A displays an image on the emission part 22 by the FS method based on the signal output from the external device 9.
  • the CPU 31 then returns the process to S13.
  • when executing the process of S31, the CPU 31 outputs to the liquid crystal panel 2A a signal for displaying an image by another method (S31).
  • the liquid crystal panel 2A causes the emission part 22 to display an image by another method based on the signal output from the external device 9.
  • the CPU 31 then returns the process to S13.
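  • the branching from S11 through S31 can be summarized in a short control-flow sketch (a hypothetical rendering of FIG. 7; the parameter names and thresholds stand in for the FPGA outputs described above and are assumptions, not the patent's API):

      def choose_display_method(region, gaze_change_freq, black_ratio,
                                freq_limit, black_max):
          # Returns "FS" or "another", following S15 -> S31 of FIG. 7.
          if region == "central":                  # S15: YES
              return "FS"                          # S17
          if region == "second peripheral":        # S19: NO
              return "another"                     # S31
          if gaze_change_freq >= freq_limit:       # S25: NO
              return "another"                     # S31
          if black_ratio < black_max:              # S27, S29: YES
              return "FS"                          # S17
          return "another"                         # S29: NO -> S31

      print(choose_display_method("first peripheral", 0.5, 0.1,
                                  freq_limit=2.0, black_max=0.3))  # FS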
  • the frequency with which the direction of the user's line of sight changes depends on the positional relationship of the HD 2 with respect to the user's eyes. Specifically, for example, when the HD 2 is arranged at a position in the central area (see FIG. 3A), the user is likely to be viewing the image on the HD 2, so the frequency of change in the direction of the line of sight is relatively small. On the other hand, when the HD 2 is arranged at a position in the first peripheral area (see FIG. 3B), the user is likely to be viewing both the image on the HD 2 and the scenery in front, so the frequency of change in the direction of the line of sight becomes relatively large. When the direction of the user's line of sight changes, there is a high possibility that color breakup, which reduces the visibility of the user, will occur.
  • the CPU 31 of the HMD 1 switches the display method for displaying an image on the liquid crystal panel 2A according to the positional relationship between the HD 2 and the user's eyes. Specifically, the CPU 31 displays an image on the liquid crystal panel 2A by another method when color breakup is likely to occur. In another method, the RGB light is output from the liquid crystal panel 2A at an output timing at which color breakup is unlikely to occur. The HMD 1 can thereby suppress the occurrence of color breakup and a resulting decrease in the visibility of the user.
  • the FPGA 33 detects the position of the HD 2 with respect to the wearing tool 10 worn on the user's head based on signals output from the acceleration sensors 37 and 41. Thereby, the FPGA 33 can specify the position of HD2 with respect to the left eye 8 of the user.
  • the CPU 31 determines whether or not color breakup is likely to occur based on the position of the HD 2 with respect to the user's left eye 8. When color breakup is likely to occur, the HMD displays an image on the liquid crystal panel 2A by another method, in which color breakup is unlikely to occur (S31).
  • RGB light is output from the liquid crystal panel 2A at an output timing at which a display phenomenon that reduces the visibility of the user is unlikely to occur, and an image is displayed on the liquid crystal panel 2A.
  • the HMD 1 can thereby suppress the occurrence of color breakup and a resulting decrease in the visibility of the user.
  • any position of the emitting portion 22 is disposed opposite to the front side of the user's left eye 8.
  • the user has a high possibility of visually recognizing the image displayed on the emission unit 22.
  • the frequency of change in the direction of the user's line of sight is small, and the direction of the line of sight is stable. That is, the HMD 1 is in a state where color breakup is unlikely to occur.
  • the CPU 31 displays an image on the liquid crystal panel 2A by the FS method (S17). Note that the FS method can increase the resolution of the displayed image as compared to another method.
  • the HMD 1 can thus display an image with better image quality on the liquid crystal panel 2A in a state where the possibility of color breakup is low.
  • when the display unit is arranged at a position in the first peripheral area (S15: NO, S19: YES), the user can visually recognize both the image displayed on the liquid crystal panel 2A and the scenery in front, so there is a high possibility that the direction of the line of sight changes frequently. In other words, color breakup is likely to occur.
  • the CPU 31 displays an image on the liquid crystal panel 2A by another method (S31).
  • the HMD 1 can thereby suppress the occurrence of color breakup and a decrease in the visibility of the user.
  • the HMD 1 arranges the HD 2 in any one of the first position, the second position, and the third position by rotating at least one of the first ball joint 6 and the second ball joint 7.
  • the second distance L2 when the HD 2 is arranged at the second position (see FIG. 3B) is longer than the first distance L1 when the HD 2 is arranged at the first position (see FIG. 3A).
  • the CPU 31 can therefore determine whether the HD 2 is arranged in the central region including the first position or in the first peripheral region including the second position based on the distance between the spherical portion 61B and the emission part 22.
  • the HMD 1 can specify whether the HD 2 is arranged at a position in the central area including the first position or at a position in the peripheral area including the second position depending on whether the distance between the emission part 22 and the spherical portion 61B of the first ball joint 6 is the first distance L1 or the second distance L2. That is, the CPU 31 determines how easily color breakup can occur based on the distance between the spherical portion 61B of the first ball joint 6 and the emission part 22, and can display an image on the liquid crystal panel 2A by the appropriate display method (the FS method or another method) according to the position of the HD 2.
  • the HD 2 is arranged at the third position by rotating the first ball joint 6 by the second predetermined angle or more from the state in which it is arranged at the first position (see FIGS. 3A and 4).
  • in this case, the rotation angle of the first ball joint 6 is larger than the rotation angle of the second ball joint 7. The CPU 31 can therefore determine whether the HD 2 is arranged in the central region including the first position or in the second peripheral region including the third position based on this rotation angle. That is, the CPU 31 determines how easily color breakup can occur according to the angle between the fixing part 14 and the arm part 12, and can display an image on the liquid crystal panel 2A by the appropriate display method (the FS method or another method).
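  • combining the two cues just described, a hypothetical classifier might look as follows (the thresholds are assumptions; the text fixes only the distances L1 and L2 and the second predetermined angle):

      def classify_region(dist_c11_c12, joint6_angle, L1, L2, second_angle):
          # A large rotation of the first ball joint 6 indicates the second
          # peripheral region (third position); otherwise the distance
          # between the centers C11 and C12 separates the central region
          # (near L1) from the first peripheral region (near L2).
          if joint6_angle >= second_angle:
              return "second peripheral"
          if dist_c11_c12 >= (L1 + L2) / 2:   # closer to L2 than to L1
              return "first peripheral"
          return "central"

      print(classify_region(5.0, 10, L1=5.0, L2=8.0, second_angle=45))  # central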
  • the user may place the HD2 in the second peripheral region in order to view the scenery in front of the eyes well by removing the HD2 from the field of view.
  • in this case, color breakup does not become a problem in the HMD 1.
  • when the CPU 31 determines that the HD 2 is arranged in the second peripheral area (S19: NO), it displays an image on the liquid crystal panel 2A by another method (S31).
  • the HMD 1 can thereby suppress color breakup when the HD 2 moves from a position in the second peripheral area (for example, the third position) to a position in the central area (for example, the first position).
  • the CPU 31 specifies the change frequency based on the number of times the direction of the line of sight changes by a predetermined value or more (S23).
  • when the change frequency is less than the predetermined frequency (S25: YES), color breakup is unlikely to occur even when an image is displayed on the liquid crystal panel 2A by the FS method. Therefore, in such a case, the CPU 31 displays an image on the liquid crystal panel 2A by the FS method (S17).
  • more precisely, the process proceeds to S27, and when the CPU 31 determines that the ratio of the black area is less than the predetermined threshold (S29: YES), the process proceeds to S17.
  • the HMD 1 can thus perform display control utilizing the features of the FS method when color breakup is unlikely to occur.
  • as the change frequency increases, color breakup becomes more likely to occur when an image is displayed on the liquid crystal panel 2A by the FS method.
  • the CPU 31 displays an image on the liquid crystal panel 2A by another method (S31).
  • by determining changes in the direction of the user's line of sight and, in the present embodiment, specifying their frequency, the HMD 1 can further suppress the occurrence of color breakup and thereby suppress a decrease in the visibility of the user.
  • the FPGA 34 specifies the direction of the line of sight of the user based on the image data output from the camera 39, so that the change in the direction of the line of sight can be specified easily and appropriately.
  • the FS method is a method that allows the user to visually recognize a color image by switching and outputting the R, G, and B lights in a time-division manner.
  • another method is a method that allows the user to visually recognize a monochromatic image by outputting only one of the R, G, and B lights output by the FS method, at the same timing as in the FS method.
  • color breakup occurs when the images of the individual RGB colors are visually recognized as afterimages or the like.
  • because the HMD 1 displays the image on the liquid crystal panel 2A by another method so that the user visually recognizes a monochromatic image, the occurrence of color breakup can be suppressed.
  • the HMD 1 can suppress the total output amount of light by displaying an image on the liquid crystal panel 2A by another method. Therefore, the HMD 1 can suppress power consumption by displaying an image on the liquid crystal panel 2A by another method, as compared with the case of displaying an image on the liquid crystal panel 2A by the FS method.
  • FIG. 8B illustrates a timing chart of the vertical synchronization signal (Vsync) and the output signals of the R, G, and B lights when the FS method of FIG. 8A is switched to another method.
  • in this modification, all of the R, G, and B lights are output at the same timing, in synchronization with the first period T1.
  • images of colors other than white in the FS method of FIG. 8A (for example, single-color R, G, and B images) are all converted into white images by switching to this other method.
  • the light output time t1 at each timing is the same as in the FS method.
  • in this modification, the HMD 1 can suppress the occurrence of color breakup by displaying an image on the liquid crystal panel 2A by another method. This is because color breakup arises when the lights of the RGB colors are output at different timings, whereas in this other method the lights of the RGB colors are output at the same timing. When an image is displayed on the liquid crystal panel 2A by this other method, all the RGB color images of the FS method are displayed as white images; that is, the original color image is converted into a monochrome image. In the above embodiment, by contrast, an image in the FS method is converted into a single-color image (see FIG. 6). Therefore, compared with the case of another method in the above embodiment, the HMD 1 of this modification can display an image on the liquid crystal panel 2A while maintaining the appearance of the original image as much as possible.
  • FIG. 9B illustrates a timing chart of the vertical synchronization signal (Vsync) and the output signals of the R, G, and B lights when the FS method of FIG. 9A is switched to another method. The R, G, and B lights are arranged so as to be output at different timings, as in the FS method. In this other method, the light output time t2 is half the output time t1 of the FS method.
  • in this other method, the output frequency of the light of each RGB color is twice that of the FS method (first period T1 → second period T2), and the output time of each light is half that of the FS method (output time t1 → output time t2). For this reason, the total output time of the light of each RGB color is the same as in the FS method.
  • in this modification, the HMD 1 can suppress the occurrence of color breakup by displaying an image on the liquid crystal panel 2A by another method. Color breakup is caused by the RGB color images failing to mix well when viewed by the user; in this other method, the lights of the RGB colors are output while being switched in the second period T2, which is shorter than in the FS method, so there is a high possibility that they will be mixed appropriately by the afterimage effect. Furthermore, the RGB images of the FS method are reproduced in the same manner in an image displayed by this other method. Accordingly, the HMD 1 can appropriately reproduce the appearance of the original image even when displaying an image on the liquid crystal panel 2A by this other method.
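  • a sketch of this period-halving variant next to the FS cycle (assumed 8-bit channel values; two full R-G-B cycles now fit in the original period T1):

      def halved_period_subframes(rgb):
          # FIG. 9 variant: the switching period is halved (T2 = T1 / 2) and
          # each output time is halved (t2 = t1 / 2), so the total on-time
          # per color over T1 is unchanged while the colors alternate twice
          # as fast, which mixes them better and suppresses color breakup.
          r, g, b = rgb
          cycle = [(r, 0, 0), (0, g, 0), (0, 0, b)]
          return cycle + cycle   # six half-length subframes per original T1

      print(halved_period_subframes((255, 255, 255)))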
  • the HMD 1 has a configuration in which the HD 2 is disposed in front of the user's left eye 8. However, the HD 2 may be disposed in front of the user's right eye.
  • in the process of S17, the CPU 31 may display an image on the liquid crystal panel 2A by yet another display method that differs from the FS method and in which color breakup is unlikely to occur. The CPU 31 may also specify the relative positional relationship between the user's left eye 8 and the HD 2 based on the position of the HD 2 specified from the signal output from the FPGA 33 and the position of the iris of the user's eye specified by the FPGA 34. In that case, the CPU 31 can specify the relative position of the HD 2 with respect to the user's left eye 8 more accurately than when the position of the HD 2 is specified based only on the signal output from the FPGA 33.
  • the HMD 1 may photograph the wearing tool 10 with the camera 39.
  • the CPU 31 may specify the position of the HD 2 based on the captured image data. Further, the HMD 1 may photograph the first ball joint 6 with the camera 39.
  • the CPU 31 may specify the distance between the centers C11 and C12 based on the captured image data.
  • the CPU 31 may specify the position of HD2 based on the specified distance.
  • the HMD 1 may include a sensor that detects the rotation angles of the first ball joint 6 and the second ball joint 7.
  • the CPU 31 may specify the position of the HD 2 based on the rotation angles of the first ball joint 6 and the second ball joint 7 specified based on this sensor.
  • the HMD 1 may not have the wearing tool 10.
  • the fixing portion 14 may be fixed to another member different from the wearing tool 10.
  • the fixing portion 14 may be fixed to a hat, a helmet, or the like worn by the user.
  • the first distance L1 when HD2 is disposed at the first position may be longer than the second distance L2 when HD2 is disposed at the second position.
  • the rotation angle of the first ball joint 6 when moving the HD 2 from the first position to the third position may be smaller than the rotation angle of the second ball joint 7.
  • the mounting portion 11 may have another adjustment mechanism, for example, a flexible joint formed of an elastic body such as rubber, instead of the first ball joint 6 and the second ball joint 7.
  • the HMD 1 may specify the direction of the user's line of sight by another method.
  • the HMD 1 may have a light source and a camera capable of infrared imaging when the user's face is illuminated with infrared rays.
  • the CPU 31 may detect corneal reflection from infrared reflected light photographed by the camera.
  • the CPU 31 may specify the position of the pupil with respect to the detected position of corneal reflection.
  • the CPU 31 may specify the direction of the line of sight based on the specified pupil position.
  • the FPGA 34 may output a signal indicating the position of the user's eyes to the CPU 31 instead of a signal indicating the direction of the line of sight.
  • the CPU 31 may specify the position of the user's eye based on the signal output from the FPGA 34.
  • the CPU 31 may count the number of times the position of the user's eye changes by a predetermined value or more, and may specify the number of times per unit time (for example, one second) as the change frequency (S23).
  • the FPGA 36 may detect whether the image is a black and white image or a color image based on a signal output from the FPGA 35, and may output a signal indicating the detection result to the CPU 31.
  • when the CPU 31 determines, based on the signal output from the FPGA 36, that the image is a color image, the CPU 31 may cause the liquid crystal panel 2A to display the image by the FS method (S17).
  • when the CPU 31 determines that the image is a black-and-white image, the CPU 31 may cause the liquid crystal panel 2A to display the image by another method (S31).
  • when it is determined that the HD 2 is arranged in the first peripheral area (S19: YES), the CPU 31 may display an image on the liquid crystal panel 2A by another method (S31) regardless of the line-of-sight direction and the ratio of the black area. When it is determined that the line-of-sight change frequency is less than the predetermined frequency (S25: YES), the CPU 31 may display an image on the liquid crystal panel 2A by the FS method regardless of the position of the HD 2 or the ratio of the black area (S17).
  • the position and quantity of the camera 39 provided in the HD2 casing 21 for photographing the user's left eye 8 are not particularly limited.
  • a camera 39A may be provided at the left end of the opening at the rear end of the housing 21 of the HD2, or a camera 39B may be provided at the right end. Both cameras 39A and 39B may be provided.
  • the shooting range can be easily directed to the user side. Therefore, the HMD 1 can appropriately photograph the user's left eye 8 with the camera 39.
  • when the wearing tool 10 of the HMD 1 is worn on the user's head, the camera 39A may photograph the eye on the side away from the HD 2 of the user's two eyes, that is, the right eye.
  • the FPGA 34 may specify the position of the right eye or the direction of the line of sight of the right eye.
  • the direction of the line of sight of the eye on the side away from the HD 2 coincides with the direction of the line of sight of the eye on the side close to the HD 2. Therefore, even when processing similar to that of the above embodiment is performed based on the direction of the line of sight of the right eye, the HMD 1 can control the display method in accordance with changes in the line of sight of the user's left eye 8.
  • because the HMD 1 is worn on the head, there is a tendency to want to reduce its size, and sufficient space for newly installing a detection device may not be available on the side of the HMD 1 where the HD 2 is arranged. Even in such a case, the HMD 1 can exploit the fact that the user's two eyes move together and arrange the detection device on the side where the HD 2 is not arranged, so that the display method can still be controlled in accordance with changes in the direction of the user's line of sight.
  • a half mirror 391 may be provided on the rear side of the liquid crystal panel 2A in the inside of the housing 21 of the HD2.
  • the half mirror 391 may reflect the external light incident from the opening at the rear end of the housing 21 to the right side.
  • a camera 39C capable of photographing the outside world based on the light reflected by the half mirror 391 may be provided.
  • the CPU 31 may specify the direction of the line of sight of the user's left eye 8 based on the data of the image captured by the camera 39C. With this configuration, the accuracy of the specified line-of-sight direction can be improved as compared with the configuration using the cameras 39A and 39B described above.
  • the camera 39 may be provided in any of the wearing tool 10 and the mounting part 11 (fixing part 14, arm part 12, etc.).
  • the direction of the line of sight of the right eye may be specified by photographing the right eye of the user.
  • when the camera 39 is provided in the wearing tool 10, the weight of the camera 39 is not added to the HD 2. The HMD 1 can therefore suppress a decrease in wearability due to an increase in the weight of the HD 2.
  • <Others> The HD 2 is an example of the "display unit" of the present disclosure.
  • the liquid crystal panel 2A is an example of the “display element” of the present disclosure.
  • the first ball joint 6 and the second ball joint 7 are examples of the “adjustment unit” of the present disclosure.
  • the FPGAs 33 and 34 are examples of the “position detection unit” of the present disclosure.
  • the CPU 31 that performs the processes of S17 and S31 is an example of the “control unit” of the present disclosure.
  • the CPU 31 that performs the process of S17 is an example of the “first control unit” of the present disclosure.
  • the CPU 31 that performs the process of S31 is an example of the “second control unit” of the present disclosure.
  • the first ball joint 6 is an example of the “first adjustment unit” in the present disclosure.
  • the second ball joint 7 is an example of the “second adjustment unit” in the present disclosure.
  • the FPGA 33 is an example of the “first position detection unit” of the present disclosure.
  • the FPGA 34 is an example of the “second position detection unit” of the present disclosure.
  • the camera 39 is an example of the “imaging unit” of the present disclosure.
  • the FPGA 36 is an example of the “ratio detection unit” of the present disclosure.
  • the first peripheral region and the second peripheral region are examples of the “peripheral region” of the present disclosure.
  • 1: HMD, 2: HD, 2A: liquid crystal panel, 6: first ball joint, 7: second ball joint, 8: left eye, 10: wearing tool, 11: mounting part, 12: arm part, 14: fixing part, 22: emission part, 31: CPU, 33 to 36: FPGA, 37, 38, 41: acceleration sensor, 39: camera

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Nonlinear Science (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Liquid Crystal (AREA)
  • Mathematical Physics (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Liquid Crystal Display Device Control (AREA)

Abstract

The present invention provides a head-mounted display capable of suppressing a decrease in visual recognizability by a user in response to a change in a positional relationship between an eye of the user and a display part. An HMD 1 is characterized by being provided with: a display device 2 having a liquid crystal panel 2A on which a color image is displayed by controlling the output timings of RGB lights; a mounting part 11 that is a member for mounting the display device 2 on a head of a user, and includes at least a first ball joint 6 and a second ball joint 7 that are capable of adjusting the position of the display device 2 with respect to the eye of the user; an FPGA for detecting a positional relationship between the eye of the user and the display device 2; and a CPU for displaying an image on the liquid crystal panel 2A in a display method for outputting RGB lights at output timings corresponding to the positional relationship detected by the FPGA.

Description

Head-mounted display
 The present disclosure relates to a head-mounted display.
 Depending on a change in the positional relationship between a display unit on which an image is displayed and the user's eye, a display phenomenon that reduces the user's visibility may occur. Techniques for suppressing the occurrence of such display phenomena have been proposed.
 For example, one such display phenomenon is color breakup (also called color breaking). Color breakup occurs when an image is displayed on a display unit by the field sequential method (hereinafter, the “FS method”). The FS method switches continuously among red (R), green (G), and blue (B) light, at a switching speed exceeding the temporal resolution of the human eye, and exploits the fact that a human then perceives the RGB colors as mixed. Color breakup occurs when the sequentially displayed R, G, and B images fail to blend properly: the individual R, G, and B images are perceived as afterimages or the like, reducing the user's visibility. Techniques for suppressing this loss of visibility due to color breakup have been proposed (see, for example, Patent Document 1).
Patent Document 1: JP 2003-58113 A
 In a head-mounted display (hereinafter, “HMD”), the display unit is held on the user's head via an arm portion or the like. The positional relationship between the user's eye and the display unit therefore changes more easily than with a typical stationary display device, so display phenomena such as the color breakup described above, which reduce the user's visibility, occur more readily.
 An object of the present disclosure is to provide a head-mounted display that can suppress a decrease in the user's visibility caused by a change in the positional relationship between the user's eye and the display unit.
 A head-mounted display according to a first aspect of the present disclosure includes: a display unit having a display element that displays a color image by controlling the output timing of RGB light; a mounting unit, a member for mounting the display unit on a user's head, which includes at least an adjustment unit capable of adjusting the position of the display unit with respect to the user's eye; position detection means that detects the positional relationship between the user's eye and the display unit; and control means that causes the display element to display an image by a display method that outputs RGB light at an output timing corresponding to the positional relationship detected by the position detection means.
 In a head-mounted display (HMD), when the positional relationship between the user's eye and the display unit satisfies a predetermined condition, the direction of the line of sight changes more easily, and the positional relationship between the user's eye and the display unit accordingly changes more easily. When the direction of the line of sight changes, a display phenomenon that reduces the user's visibility becomes more likely. In response, the HMD outputs RGB light from the display element at an output timing at which such a display phenomenon is unlikely to occur, according to the positional relationship between the user's eye and the display unit, and causes the display element to display the image. The HMD can thereby suppress a decrease in the user's visibility.
 A head-mounted display according to a second aspect of the present disclosure includes: a display unit having a display element that displays a color image by controlling the output timing of RGB light; a mounting unit, a member for mounting the display unit on a user's head, which has an adjustment unit capable of adjusting the position of the display unit with respect to the user's eye; position detection means that detects the positional relationship between the user's eye and the display unit; and control means that causes the display element to display an image by a first method or a second method, whose RGB output timings differ from each other, according to the positional relationship detected by the position detection means. The first method switches among the R, G, and B lights in a time-division manner with a first period; the second method switches among them in a time-division manner with a second period shorter than the first period. The second aspect provides the same effects as the first aspect. In addition, the R, G, and B images displayed by the FS method are reproduced equally well by the first method and the second method, so the HMD can appropriately reproduce the display mode of the original image whichever method is used.
FIG. 1 is a perspective view of the HMD 1.
FIG. 2 is an enlarged view of the first ball joint 6 and the second ball joint 7.
FIG. 3 shows the HMD 1 with the HD 2 placed at the first position or the second position.
FIG. 4 shows the HMD 1 with the HD 2 placed at the third position.
FIG. 5 is a block diagram showing the electrical configuration of the HMD 1.
FIG. 6 is a timing chart showing the field sequential method (A) and the alternative method (B).
FIG. 7 is a flowchart of the main process.
FIG. 8 is a timing chart showing the field sequential method (A) and the alternative method (B) in a first modification.
FIG. 9 is a timing chart showing the field sequential method (A) and the alternative method (B) in a second modification.
FIG. 10 shows the arrangement of the camera 39 with respect to the HD 2.
<Overview of the HMD 1>
 An embodiment of the present disclosure will now be described. As shown in FIG. 1, the head-mounted display (hereinafter, “HMD”) 1 is a video see-through HMD. The HMD 1 includes a wearing tool 10, a mounting unit 11, and a display device 2 (hereinafter, “HD 2”). To aid understanding of the figures, the upper, lower, left, right, front, and rear sides of the HMD 1 are defined as follows: they correspond, for example, to the upper, lower, upper-left, lower-right, left, and right sides in FIG. 1, and to the upper, lower, right, left, front, and rear sides, respectively, as seen by the user wearing the wearing tool 10.
 As shown in FIG. 1, the wearing tool 10 is made of a flexible material such as resin or metal (for example, stainless steel). The wearing tool 10 has a first portion 10A and second portions 10B and 10C. In the following, for ease of understanding, the wearing tool 10 is described as divided into the first portion 10A and the second portions 10B and 10C; in fact it is not separated into these members but is a single integral member.
 The first portion 10A and the second portions 10B and 10C are each curved, elongated plate-like members. The first portion 10A extends in the left-right direction between a position 102 and a position 103 of the wearing tool 10 and curves so as to be convex toward the front. The position 102 is to the left of the left-right center 101 of the wearing tool 10, and the position 103 is to its right. The second portion 10B extends rearward from the position 102, and the second portion 10C extends rearward from the position 103; their rear end portions extend toward each other. The wearing tool 10 is worn on the user's head with the first portion 10A, the second portion 10B, and the second portion 10C in contact with the user's forehead, right side of the head, and left side of the head, respectively. The mounting unit 11 is fixed at the position 103 of the wearing tool 10. Hereinafter, the side of the wearing tool 10 enclosed by the first portion 10A and the second portions 10B and 10C is called the “inside”, and the opposite side the “outside”.
 The mounting unit 11 includes an arm portion 12, a fixing portion 14, the first ball joint 6, and the second ball joint 7. The fixing portion 14 is fixed to the position 103 of the wearing tool 10 with a screw 14A (see FIG. 2(A)). The arm portion 12 is substantially rod-shaped, made of resin, metal, or the like, and, viewed from the front, extends in the vertical direction. The upper end portion of the arm portion 12 is connected to the fixing portion 14 via the first ball joint 6, and its lower end portion is connected to the HD 2 via the second ball joint 7; the arm portion 12 thus links the fixing portion 14 and the HD 2. With the wearing tool 10 worn on the user's head, the arm portion 12 and the fixing portion 14 can hold the HD 2 in front of the user's left eye. The first ball joint 6 and the second ball joint 7 allow the position of the HD 2 to be adjusted with respect to the user's left eye 8 (see FIG. 3 and elsewhere).
 As shown in FIG. 2(A), the first ball joint 6 includes a ball stud 61 and a socket 62. The ball stud 61 has a rod portion 61A and a sphere portion 61B; the sphere portion 61B is a spherical part provided at one end of the rod portion 61A. The socket 62 supports the sphere portion 61B slidably. In the following description, the direction orthogonal to the outer peripheral surface 100 of the wearing tool 10 at the position where the fixing portion 14 and the socket 62 are arranged is called the “orthogonal direction”. The socket 62 has a lid portion 621 and a receiving portion 622. The receiving portion 622 is cylindrical and extends outward from the fixing portion 14 along the orthogonal direction; the sphere portion 61B is accommodated inside it. The receiving portion 622 is threaded on its outer surface and the lid portion 621 on its inner surface, and the lid portion 621 is screwed onto the receiving portion 622. The lid portion 621 has a circular hole 621A. The rod portion 61A of the ball stud 61 extends from inside the receiving portion 622 toward, and passes through, the hole 621A. The rod portion 61A extends outward from the sphere portion 61B along the orthogonal direction, bends to the right, extends further along the left-right direction, and is supported by the arm portion 12. The bending angle θ1 of the rod portion 61A is about 30 degrees.
 As the sphere portion 61B slides in the socket 62, the first ball joint 6 rotatably connects the fixing portion 14 and the arm portion 12. The movable range of the arm portion 12 with respect to the fixing portion 14 is limited by the rod portion 61A abutting the edge of the hole 621A.
 As shown in FIG. 2(B), the second ball joint 7 includes a ball stud 71 and a socket 72. The ball stud 71 has a rod portion 71A and a sphere portion 71B; the sphere portion 71B is a spherical part provided at one end of the rod portion 71A. The socket 72 is provided on the HD 2 and supports the sphere portion 71B slidably. The socket 72 has a lid portion 721 and a receiving portion 722. The receiving portion 722 is cylindrical and extends from the HD 2; the sphere portion 71B is accommodated inside it. The receiving portion 722 is threaded on its outer surface and the lid portion 721 on its inner surface, and the lid portion 721 is screwed onto the receiving portion 722. The lid portion 721 has a circular hole 721A. The rod portion 71A of the ball stud 71 extends from inside the receiving portion 722 toward, and passes through, the hole 721A. The other end portion of the rod portion 71A is supported by the arm portion 12. The rod portion 71A extends from the arm portion 12 toward the left, bends away from the wearing tool 10, extends further, and is supported by the HD 2 via the sphere portion 71B and the socket 72. The bending angle θ2 of the rod portion 71A is about 30 degrees.
 As the sphere portion 71B slides in the socket 72, the second ball joint 7 rotatably connects the HD 2 and the arm portion 12. The movable range of the HD 2 with respect to the arm portion 12 is limited by the rod portion 71A abutting the edge of the hole 721A.
 As shown in FIG. 1, the HD 2 includes a housing 21, a hollow box. A rectangular opening covered with a transparent plate-like member is provided at the rear end of the housing 21, and the liquid crystal panel 2A is accommodated inside. The liquid crystal panel 2A displays a color image by controlling the output timings of red (R), green (G), and blue (B) light, and can display an image by either of two display methods whose RGB output timings differ from each other: the field sequential method (hereinafter, the “FS method”) or an alternative method. The liquid crystal panel 2A is electrically connected to an external device 9 (see FIG. 5) via a cable 2B. The external device 9 is, for example, a control box that performs the control for displaying images on the liquid crystal panel 2A. Based on the image signal received from the external device 9 via the cable 2B, the liquid crystal panel 2A displays an image on its display surface. The light of the displayed image is emitted toward the opening of the housing 21, passes through the opening and the transparent plate-like member to the rear, and exits the housing 21. Hereinafter, as shown in FIG. 2(B), the display surface of the liquid crystal panel 2A, that is, the portion of the liquid crystal panel 2A from which image light is emitted, is called the “emitting portion 22”.
<How to use the HMD 1>
 The user fixes the wearing tool 10 of the HMD 1 to the head, holds the HD 2, and, as shown in FIG. 3(A), adjusts the position of the HD 2 so that some position on the emitting portion 22 of the liquid crystal panel 2A faces the front of the user's left eye 8. Because the first ball joint 6 and the second ball joint 7 rotate freely, the user can move the HD 2 easily. Hereinafter, the region containing the HD 2 when some position on the emitting portion 22 faces the front of the user's left eye 8 is called the “central region”.
 The center C11 of the sphere portion 61B of the first ball joint 6 on the fixing portion 14 (see FIG. 2(A)) is defined, as is the center C12 of the emitting portion 22 of the HD 2. The position of the HD 2 when the center C12 of the emitting portion 22 faces the front of the user's left eye 8 (see FIG. 3(A)) is called the “first position”; the first position is included in the central region. The distance between the centers C11 and C12 when the HD 2 is at the first position is called the “first distance L1”.
 A signal is output from the external device 9 (see FIG. 5) to the liquid crystal panel 2A of the HD 2 via the cable 2B (see FIG. 1), and an image is displayed on the emitting portion 22 of the liquid crystal panel 2A. The light of the displayed image is emitted rearward through the opening of the housing 21. When the HD 2 is placed in the central region, the emitted image light enters the user's left eye 8, and the user perceives the image.
 The user may wish to position the HD 2 so that both the image displayed on the liquid crystal panel 2A of the HD 2 and the scene ahead are visible. In this case, from the state in which the HD 2 is at the first position (see FIG. 3(A)), the user rotates the second ball joint 7 clockwise, as viewed from above, by a first predetermined angle or more (see FIG. 3(B)). That is, the user rotates the second ball joint 7 so that its rotation angle about the vertical axis becomes larger than that of the first ball joint 6. The user thereby moves the HD 2 until the emitting portion 22 no longer faces the front of the left eye 8. As shown in FIG. 3(B), the HD 2 moves away from the wearing tool 10, and the distance between the centers C11 and C12 changes to a second distance L2, which is longer than the first distance L1.
 Hereinafter, the position of the HD 2 after it has been moved from the first position by rotating the second ball joint 7 by the first predetermined angle or more is called the “second position”. When the HD 2 is at the second position, the distance between the centers C11 and C12 is the second distance L2. Of the regions other than the central region, the predetermined region containing the HD 2 at the second position is called the “first peripheral region”. With the HD 2 at the second position, the user can see the scene ahead by directing the line of sight forward, and can see the image displayed on the emitting portion 22 of the liquid crystal panel 2A by shifting the line of sight toward the right side of the HMD 1 (the left side as seen by the user).
 As shown in FIG. 2(B), the rod portion 71A of the ball stud 71 of the second ball joint 7 extends from the arm portion 12, bends away from the wearing tool 10, and connects to the sphere portion 71B supported in the socket 72. Compared with an unbent rod portion 71A, this widens the movable range of the HD 2 in the direction away from the face and narrows it in the direction toward the face. The user can therefore easily move the HD 2 away from the face and place it at the second position.
 The user may also wish to move the HD 2 out of the field of view so that the scene ahead can be seen clearly. In this case, from the state in which the HD 2 is at the first position (see FIG. 3(A)), the user rotates the first ball joint 6 clockwise, as viewed from the right side, by a second predetermined angle or more (see FIG. 4). That is, the user rotates the first ball joint 6 so that its rotation angle about the left-right axis becomes larger than that of the second ball joint 7. As shown in FIG. 4, the user thereby moves the HD 2 until the emitting portion 22 no longer faces the front of the left eye 8. The distance between the centers C11 and C12 is maintained at a third distance L3, which is substantially the same as the first distance L1.
 Hereinafter, the position of the HD 2 after it has been moved from the first position by rotating the first ball joint 6 by the second predetermined angle or more is called the “third position”. When the HD 2 is at the third position, the distance between the centers C11 and C12 is the third distance L3. Of the regions other than the central region and the first peripheral region, the predetermined region containing the HD 2 at the third position is called the “second peripheral region”. With the HD 2 at the third position, the user can see the scene ahead while the HD 2 is kept largely out of the field of view.
 As shown in FIG. 2(A), the rod portion 61A of the ball stud 61 of the first ball joint 6 extends from the sphere portion 61B along the orthogonal direction, bends to the right, extends along the left-right direction, and is supported by the arm portion 12. The direction in which part of the rod portion 61A extends (left-right) thus coincides with the axis about which the first ball joint 6 rotates (left-right), making the first ball joint 6 easier to rotate about the left-right axis than it would be with an unbent rod portion 61A. The user can therefore easily rotate the first ball joint 6 to move the HD 2 from the first position to the third position.
<Electrical configuration of the HMD 1>
 The electrical configurations of the wearing tool 10 and the HMD 1 are described with reference to FIG. 5. The wearing tool 10 includes an acceleration sensor 41, which is electrically connected to an FPGA 33 (described later) of the HD 2. The acceleration sensor 41 detects the accelerations acting on the wearing tool 10 in three axial directions (X-axis, Y-axis, and Z-axis) and outputs a signal indicating the detected accelerations to the FPGA 33.
 The HD 2 has a CPU 31, a storage unit 32, the liquid crystal panel 2A, FPGAs 33 to 36, acceleration sensors 37 and 38, and a camera 39. The CPU 31 controls the HMD 1 as a whole. The storage unit 32, the liquid crystal panel 2A, the FPGAs 33, 34, and 36, and the acceleration sensor 38 are electrically connected to the CPU 31; the acceleration sensor 37 is electrically connected to the FPGA 33; the camera 39 is electrically connected to the FPGA 34; and the FPGA 35 is electrically connected to the FPGA 36. The storage unit 32 stores the program of the main process (see FIG. 7) executed by the CPU 31, various parameters, and the like. The HMD 1 may instead have a single FPGA providing all the functions of the FPGAs 33 to 36.
 The acceleration sensor 37 detects the accelerations acting on the HD 2 in three axial directions and outputs a signal indicating them to the FPGA 33. The FPGA 33 is a programmable logic device (PLD) with a function for detecting the position of the HD 2 with respect to the wearing tool 10. Based on the signal output from the acceleration sensor 37, the FPGA 33 identifies the acceleration acting on the HD 2; based on the signal output from the acceleration sensor 41, it identifies the acceleration acting on the wearing tool 10. By calculating the difference between the two accelerations, the FPGA 33 detects the position of the HD 2 relative to the wearing tool 10. Since the wearing tool 10 is worn on the user's head, the position of the HD 2 relative to the wearing tool 10 corresponds to its position relative to the user's head. The FPGA 33 outputs a signal indicating the detected position of the HD 2 to the CPU 31.
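 The disclosure states only that the difference of the two accelerations is used; one plausible numerical reading is to double-integrate that difference. A minimal Python sketch follows; the class name, the 100 Hz sample rate, and the drift-free Euler integration are illustrative assumptions, and the two sensor frames are assumed aligned so that gravity cancels in the difference.

    import numpy as np

    class RelativePositionTracker:
        # Estimates the displacement of the HD 2 relative to the wearing
        # tool 10 from the two 3-axis acceleration signals (sensors 37
        # and 41). A real implementation would also need drift correction.

        def __init__(self, sample_rate_hz=100.0):
            self.dt = 1.0 / sample_rate_hz
            self.velocity = np.zeros(3)   # relative velocity [m/s]
            self.position = np.zeros(3)   # relative displacement [m]

        def update(self, accel_hd2, accel_wearing_tool):
            # Difference of accelerations: motion of the HD 2 not shared
            # with the head-worn wearing tool (gravity cancels if the
            # sensor frames are aligned, as assumed here).
            rel_accel = np.asarray(accel_hd2) - np.asarray(accel_wearing_tool)
            # Simple Euler double integration for brevity.
            self.velocity += rel_accel * self.dt
            self.position += self.velocity * self.dt
            return self.position

 The CPU 31 would then classify the returned displacement into the central, first peripheral, or second peripheral region, as in the main process described later.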
 The camera 39 is provided on the housing 21 of the HD 2 (see FIG. 1) and can photograph the area behind the housing 21. The camera 39 outputs the image data of the captured image to the FPGA 34. The FPGA 34 is a PLD with a function for detecting the direction of the user's line of sight, as follows. From the image based on the image data output from the camera 39, the FPGA 34 identifies the inner corner of the user's eye as a reference point and the iris of the user's eye as a moving point, and identifies the direction of the user's line of sight from the position of the iris relative to the identified inner corner. The FPGA 34 outputs a signal indicating the detected direction of the line of sight to the CPU 31.
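 The reference-point/moving-point idea can be sketched as simple image-plane geometry, assuming the inner corner and the iris center have already been located in pixel coordinates; how they are detected (for example, by template matching) is not specified by the disclosure and is outside this sketch.

    import math

    def gaze_direction(inner_corner_xy, iris_center_xy):
        # Coarse gaze angle (degrees) from the iris position relative to
        # the inner corner of the eye, both in image pixels.
        dx = iris_center_xy[0] - inner_corner_xy[0]
        dy = iris_center_xy[1] - inner_corner_xy[1]
        return math.degrees(math.atan2(dy, dx))

    # Example: iris displaced from the inner corner by (+30, -20) pixels.
    print(gaze_direction((120, 80), (150, 60)))  # roughly -33.7 deg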
 Like the acceleration sensor 37, the acceleration sensor 38 detects the accelerations acting on the HD 2 in three axial directions and outputs a signal indicating them to the CPU 31.
 The FPGA 35 is a PLD with a function for analyzing pixel colors. It detects the image signal transmitted from the external device 9 via the cable 2B and, based on the detected signal, analyzes the color of each of the pixels constituting the image, outputting the analysis result to the FPGA 36. The FPGA 36 is a PLD with a function for detecting the ratio of a specific color's area to the whole image. Based on the signal output from the FPGA 35, the FPGA 36 detects the ratio of the black area in the whole image displayed on the liquid crystal panel 2A and outputs a signal indicating the detected ratio to the CPU 31.
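 A minimal sketch of the ratio detection, assuming the frame arrives as an RGB array; the cutoff below which a pixel counts as black is an illustrative assumption, since the disclosure says only that the ratio of the black area is detected.

    import numpy as np

    def black_area_ratio(frame_rgb, threshold=16):
        # Fraction of pixels whose R, G and B values are all below
        # `threshold` (0-255); `threshold` is an illustrative choice.
        black = np.all(frame_rgb < threshold, axis=-1)
        return black.mean()

    # Example: a 2x2 frame that is half black, half white.
    frame = np.zeros((2, 2, 3), dtype=np.uint8)
    frame[0] = 255
    print(black_area_ratio(frame))  # 0.5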
 Based on the signal output from the CPU 31, the liquid crystal panel 2A switches its display method between the two methods (the FS method or the alternative method). Based on the image signal output from the external device 9, the liquid crystal panel 2A displays the image on the emitting portion 22 by the display method indicated by the CPU 31.
<Display methods (FS method and alternative method)>
 The two display methods available on the liquid crystal panel 2A of the HD 2 are described with reference to FIG. 6. The first is the FS method, which lets the user perceive a color image by switching among red (R), green (G), and blue (B) light in a time-division manner. FIG. 6(A) illustrates the timing chart of the vertical synchronization signal (Vsync) and of the R, G, and B light output signals when a white image is displayed on the liquid crystal panel 2A by the FS method. In the FS method, the R, G, and B lights are output in synchronization with Vsync while being switched with a first period T1. The R, G, and B lights are staggered in time so as to be output at mutually different timings, and the output duration at each timing is t1.
 The second display method available on the liquid crystal panel 2A of the HD 2 is the alternative method, which lets the user perceive an image by outputting only one of the R, G, and B lights used in the FS method, at the same timing as in the FS method. FIG. 6(B) illustrates the timing chart of Vsync and of the R, G, and B output signals when the FS method of FIG. 6(A) is switched to an alternative method that outputs only green (G) light. In the alternative method, one of the R, G, and B lights (the G light in the example of FIG. 6(B)) is output with the first period T1. The white image of the FS method (FIG. 6(A)) is thus converted into a monochromatic (green) image by the switch to the alternative method. The output duration t1 at each timing is the same as in the FS method.
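 To make the two timings concrete, the following minimal sketch generates one frame of sub-field events for each method. The event-list representation, the T1/3 slot spacing, and the function names are illustrative assumptions; FIG. 6 itself fixes only that the colors are staggered, lit for t1 each, and repeat with period T1.

    def fs_schedule(T1, t1):
        # FS method (FIG. 6(A)): within one frame of period T1, R, G and B
        # are output in time-division slots, each lit for t1 (t1 <= T1/3).
        # Returns (start_time, duration, color) tuples.
        return [(i * T1 / 3, t1, c) for i, c in enumerate(("R", "G", "B"))]

    def alternative_schedule(T1, t1, keep="G"):
        # Alternative method of the embodiment (FIG. 6(B)): identical slot
        # timing, but only one color (G in the figure) is actually output.
        return [e for e in fs_schedule(T1, t1) if e[2] == keep]

    print(fs_schedule(T1=12.0, t1=4.0))
    # [(0.0, 4.0, 'R'), (4.0, 4.0, 'G'), (8.0, 4.0, 'B')]
    print(alternative_schedule(T1=12.0, t1=4.0))
    # [(4.0, 4.0, 'G')]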
<Main process>
 The main process executed by the CPU 31 is described with reference to FIG. 7. The main process starts when the external device 9 begins outputting an image signal and the CPU 31 executes the program stored in the storage unit 32. As shown in FIG. 7, the CPU 31 outputs to the liquid crystal panel 2A a signal for displaying images by the FS method (S11). Based on the signal output from the external device 9, the liquid crystal panel 2A displays the image on the emitting portion 22 by the FS method.
 Based on the signal output from the FPGA 33, the CPU 31 identifies the position of the HD 2 with respect to the wearing tool 10 (S13) and, from the identified position, determines whether the HD 2 is placed anywhere in the central region (see FIG. 3(A)) (S15). If the HD 2 is determined to be somewhere in the central region (S15: YES), the CPU 31 advances to S17, described later.
 If the HD 2 is determined not to be anywhere in the central region (S15: NO), the CPU 31 determines whether the HD 2 is placed anywhere in the first peripheral region (see FIG. 3(B)) (S19). If so (S19: YES), the CPU 31 advances to S21, described later. Otherwise (S19: NO), the CPU 31 determines that the HD 2 is placed somewhere in the second peripheral region (see FIG. 4) and advances to S31, described later.
 Based on the signal output from the FPGA 34, the CPU 31 identifies the direction of the user's line of sight (S21). The CPU 31 counts how many times the direction identified in S21 changes by a predetermined amount or more, and identifies the count per unit time (for example, one second) as the change frequency (S23). The CPU 31 then determines whether the identified change frequency is below a predetermined frequency (S25). If it is (S25: YES), the CPU 31 advances to S27, described later; if not (S25: NO), the CPU 31 advances to S31.
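 S21 to S25 can be read as a sliding-window counter. A minimal sketch, assuming the gaze direction arrives as an angle in degrees, with the 10-degree change threshold and the 1-second window as illustrative stand-ins for the “predetermined” values of the disclosure:

    import time
    from collections import deque

    class GazeChangeCounter:
        # Counts gaze-direction changes of at least `min_delta_deg` within
        # a sliding window of `window_s` seconds (S21 to S23).

        def __init__(self, min_delta_deg=10.0, window_s=1.0):
            self.min_delta = min_delta_deg
            self.window = window_s
            self.last_angle = None
            self.events = deque()

        def update(self, angle_deg, t=None):
            t = time.monotonic() if t is None else t
            if (self.last_angle is not None
                    and abs(angle_deg - self.last_angle) >= self.min_delta):
                self.events.append(t)
            self.last_angle = angle_deg
            # Drop change events that fell out of the window.
            while self.events and t - self.events[0] > self.window:
                self.events.popleft()
            return len(self.events) / self.window  # changes per second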
 Based on the signal output from the FPGA 36, the CPU 31 identifies the ratio of the black area in the whole image displayed on the liquid crystal panel 2A (S27) and determines whether the identified ratio is below a predetermined threshold (S29). If it is (S29: YES), the CPU 31 advances to S17; if not (S29: NO), the CPU 31 advances to S31.
 When executing S17, the CPU 31 outputs to the liquid crystal panel 2A a signal for displaying images by the FS method (S17); based on the signal output from the external device 9, the liquid crystal panel 2A displays the image on the emitting portion 22 by the FS method, and the CPU 31 returns to S13. When executing S31, the CPU 31 outputs to the liquid crystal panel 2A a signal for displaying images by the alternative method (S31); based on the signal output from the external device 9, the liquid crystal panel 2A displays the image on the emitting portion 22 by the alternative method, and the CPU 31 returns to S13.
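 Taken together, the flowchart of FIG. 7 reduces to the following decision function. The region labels and threshold values are illustrative assumptions standing in for the FPGA signals and the “predetermined” constants of the disclosure.

    def choose_display_method(region, gaze_change_freq, black_ratio,
                              freq_threshold=3.0, black_threshold=0.5):
        # One pass of the FIG. 7 decision (S13 to S31).
        if region == "central":                  # S15: YES
            return "FS"                          # S17
        if region == "second_peripheral":        # S19: NO
            return "ALT"                         # S31
        # First peripheral region (S19: YES):
        if gaze_change_freq >= freq_threshold:   # S25: NO
            return "ALT"                         # S31
        if black_ratio >= black_threshold:       # S29: NO
            return "ALT"                         # S31
        return "FS"                              # S25: YES and S29: YES

    # Example: HD 2 in the first peripheral region, steady gaze,
    # colorful image with little black area.
    print(choose_display_method("first_peripheral", 1.0, 0.1))  # FS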
<Main effects of this embodiment>
 In the HMD 1, the frequency with which the direction of the user's line of sight changes varies with the positional relationship of the HD 2 to the user's eye. For example, when the HD 2 is placed somewhere in the central region (see FIG. 3(A)), the user is most likely looking at the image on the HD 2, so the direction of the line of sight changes relatively infrequently. When the HD 2 is placed somewhere in the first peripheral region (see FIG. 3(B)), the user is likely looking at both the HD 2 image and the scene ahead, so the direction of the line of sight changes relatively frequently. When the direction of the line of sight changes, color breakup, which reduces the user's visibility, becomes more likely.
 In response, the CPU 31 of the HMD 1 switches the display method used for the liquid crystal panel 2A according to the positional relationship between the HD 2 and the user's eye. Specifically, when color breakup is likely, the CPU 31 causes the liquid crystal panel 2A to display the image by the alternative method, in which RGB light is output from the liquid crystal panel 2A at timings at which color breakup is unlikely. The HMD 1 can thereby prevent color breakup caused by changes in the direction of the user's line of sight from reducing the user's visibility.
 Based on the signals output from the acceleration sensors 37 and 41, the FPGA 33 detects the position of the HD 2 with respect to the wearing tool 10 worn on the user's head, and can thereby identify the position of the HD 2 with respect to the user's left eye 8. The CPU 31 determines from this position whether color breakup is likely. When it is, the HMD 1 displays the image on the liquid crystal panel 2A by the alternative method, in which color breakup is unlikely (S31). By outputting RGB light from the liquid crystal panel 2A at timings at which visibility-reducing display phenomena are unlikely whenever such phenomena are probable, the HMD 1 suppresses, based on the position of the HD 2 relative to the user's left eye 8, the loss of visibility that color breakup would cause.
 When the HD 2 is placed somewhere in the central region (S15: YES), some position on the emitting portion 22 faces the front of the user's left eye 8, and the user is most likely looking at the image displayed on the emitting portion 22. The direction of the line of sight is therefore expected to change little and to remain stable; that is, color breakup is unlikely. In this case, the CPU 31 causes the liquid crystal panel 2A to display the image by the FS method (S17). Because the FS method allows a higher displayed resolution than the alternative method, the HMD 1 can display a better-quality image on the liquid crystal panel 2A while the risk of color breakup is low. When the HD 2 is instead placed somewhere in the first peripheral region (S15: NO), the user is likely shifting the line of sight frequently to see both the image on the liquid crystal panel 2A and the scene ahead; that is, color breakup is likely. In this case, the CPU 31 causes the liquid crystal panel 2A to display the image by the alternative method (S31), suppressing color breakup and thus the loss of the user's visibility. In this way, the HMD 1 judges from the position of the HD 2 how likely color breakup is and, when it is likely, displays the image on the liquid crystal panel 2A by the alternative method.
 The HMD 1 places the HD 2 at the first, second, or third position by rotating at least one of the first ball joint 6 and the second ball joint 7. The second distance L2 with the HD 2 at the second position (see FIG. 3(B)) is longer than the first distance L1 with the HD 2 at the first position (see FIG. 3(A)). The CPU 31 can therefore judge whether the HD 2 is in the central region containing the first position or in the first peripheral region containing the second position from the distance between the fixing portion 14 of the mounting unit 11 and the emitting portion 22; in other words, from whether the distance between the emitting portion 22 and the sphere portion 61B of the first ball joint 6 is the first distance L1 or the second distance L2. That is, the CPU 31 judges from the distance between the sphere portion 61B of the first ball joint 6 and the emitting portion 22 how likely color breakup is, and can display the image on the liquid crystal panel 2A by the display method (FS or alternative) appropriate to the position of the HD 2.
 Meanwhile, the HD 2 is moved from the first position (see FIG. 3(A)) to the third position (see FIG. 4) by rotating the first ball joint 6 by the second predetermined angle or more, in which case the rotation angle of the first ball joint 6 is larger than that of the second ball joint 7. The CPU 31 can therefore judge from the angle of the arm portion 12 with respect to the fixing portion 14 whether the HD 2 is in the central region containing the first position or in the second peripheral region containing the third position. That is, the CPU 31 judges from the angle between the fixing portion 14 and the arm portion 12 how likely color breakup is, and can display the image on the liquid crystal panel 2A by the appropriate display method (FS or alternative).
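 The two discrimination cues just described, the C11-C12 distance and the joint angle, can be combined into one sketch. All numeric values here are invented for illustration; the disclosure fixes only the qualitative relations L2 > L1 and L3 being roughly equal to L1.

    def classify_region(dist_c11_c12, arm_angle_deg,
                        L1=0.08, angle_threshold_deg=45.0, dist_tol=0.01):
        # Region discrimination sketched from FIGS. 3 and 4: a longer
        # C11-C12 distance indicates the second position (first peripheral
        # region); an unchanged distance with a large first-ball-joint
        # angle indicates the third position (second peripheral region).
        if dist_c11_c12 > L1 + dist_tol:
            return "first_peripheral"    # second position, distance L2 > L1
        if arm_angle_deg >= angle_threshold_deg:
            return "second_peripheral"   # third position, distance L3 ~ L1
        return "central"                 # first position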
 The user may place the HD 2 in the second peripheral region in order to move it out of the field of view and see the scene ahead clearly. In this case, the user does not look at the image displayed on the emitting portion 22 of the liquid crystal panel 2A, so no color breakup is perceived. However, if the user then moves the HD 2 from somewhere in the second peripheral region back to the central region containing the original first position while the liquid crystal panel 2A is displaying images by the FS method, color breakup may occur immediately after the move. For this reason, when the CPU 31 determines that the HD 2 is placed in the second peripheral region (S19: NO), it causes the liquid crystal panel 2A to display the image by the alternative method (S31). The HMD 1 can thereby prevent color breakup from occurring when the HD 2 moves from anywhere in the second peripheral region (for example, the third position) to anywhere in the central region (for example, the first position).
 The CPU 31 identifies the change frequency from the number of times the direction of the line of sight changes by a predetermined amount or more (S23). When the change frequency is below the predetermined frequency (S25: YES), color breakup is unlikely even if images are displayed on the liquid crystal panel 2A by the FS method, so the CPU 31 proceeds toward FS display (S17). In the present embodiment, when the change frequency is below the predetermined frequency (S25: YES), the process advances to S27, and when the ratio of the black area is determined to be below the predetermined threshold (S29: YES), the process advances to S17. The HMD 1 can thus exercise display control that exploits the strengths of the FS method when color breakup is unlikely. Conversely, the larger the change frequency, the more easily color breakup occurs with FS display, so when the change frequency is at or above the predetermined frequency (S25: NO), the CPU 31 causes the liquid crystal panel 2A to display the image by the alternative method (S31). By determining changes in the direction of the user's line of sight, and in the present embodiment further identifying their frequency, the HMD 1 can suppress color breakup and thus the loss of the user's visibility. Since the FPGA 34 identifies the direction of the user's line of sight from the image data output by the camera 39, changes in that direction can be identified easily and appropriately.
 When an image is displayed by the FS method, color breakup is particularly likely the closer the image is to a black-and-white image, and the larger the ratio of the black area to the whole image, the closer the image is to black-and-white. Accordingly, when the black ratio is at or above the predetermined threshold (S29: NO), the HMD 1 displays the image on the liquid crystal panel 2A by the alternative method (S31). Because the alternative method is used precisely in the state where color breakup is likely, the HMD 1 can suppress color breakup effectively.
 The FS method lets the user perceive a color image by switching among the R, G, and B lights in a time-division manner, whereas the alternative method lets the user perceive a monochromatic image by outputting only one of the R, G, and B lights at the same timing as in the FS method. Since color breakup arises when the individual R, G, and B images are perceived as afterimages, displaying a monochromatic image by the alternative method suppresses its occurrence. The alternative method also reduces the total amount of light output, so displaying images on the liquid crystal panel 2A by the alternative method consumes less power than displaying them by the FS method.
<First Modification>
 The alternate method of the above embodiment can be replaced with another scheme. As shown in FIG. 8, the alternate method of the first modification lets the user perceive an image by outputting all of the R, G, and B light at the same timing. FIG. 8(B) shows the timing chart of the vertical synchronization signal (Vsync) and of the R, G, and B output signals when the FS method of FIG. 8(A) is switched to this alternate method: all of the R, G, and B light is output in synchronization with the first period T1, and the output time t1 at each timing is the same as in the FS method. Every non-white image of the FS method of FIG. 8(A) (for example, the R, G, and B single-color images) is therefore converted to a white image when the method is switched.
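 A hedged sketch of the resulting pixel conversion: because every primary is driven at the same instants, any pixel lit at all in the FS image appears white, and only fully black pixels stay black (8-bit format assumed for illustration):

```c
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb8_t;

/* First modification: all primaries are output at the same timing,
 * so any non-black pixel of the original frame is shown as white. */
static rgb8_t to_simultaneous_white(rgb8_t px)
{
    int lit = px.r | px.g | px.b;    /* any channel active?       */
    rgb8_t out = { 0, 0, 0 };
    if (lit)
        out.r = out.g = out.b = 255; /* rendered as white         */
    return out;
}
```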
 The HMD 1 can suppress the occurrence of color breakup by displaying images on the liquid crystal panel 2A by this alternate method: color breakup is caused by the R, G, and B lights being output at different timings, whereas here they are output at the same timing. When an image is displayed this way, all of the R, G, and B images of the FS method appear as white images; that is, switching from the FS method converts the original color image into a black-and-white image. In the above embodiment, by contrast, the FS image is converted into a single-color image (see FIG. 6). Compared with the alternate method of the above embodiment, the HMD 1 can therefore display the image on the liquid crystal panel 2A while preserving the appearance of the original image as far as possible.
<Second Modification>
 As shown in FIG. 9, the alternate method of the second modification lets the user perceive an image by outputting the R, G, and B lights while switching among them at a second period T2 that is half the first period T1. FIG. 9(B) shows the timing chart of the vertical synchronization signal (Vsync) and of the R, G, and B output signals when the FS method of FIG. 9(A) is switched to this alternate method. As in the FS method, the R, G, and B lights are staggered so as to be output at different timings, but the output time t2 is half the FS output time t1. In other words, each color is output twice as often as in the FS method (first period T1 → second period T2) and for half as long each time (output time t1 → output time t2), so the total output time of each of the R, G, and B lights is the same as in the FS method.
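 The bookkeeping can be verified in a few lines: halving the period doubles the number of output slots per color while halving each slot, so the per-color totals match. The numeric values below are symbolic placeholders, not taken from the patent:

```c
#include <assert.h>

int main(void)
{
    const double T1 = 1.0 / 60.0;   /* hypothetical first period  */
    const double t1 = T1 / 3.0;     /* FS output time per color   */

    const double T2 = T1 / 2.0;     /* second period = T1 / 2     */
    const double t2 = t1 / 2.0;     /* output time   = t1 / 2     */

    /* Per color and per T1 interval: FS emits once for t1; the
     * second modification emits twice (period T2) for t2 each.   */
    const double fs_total  = 1.0 * t1;
    const double alt_total = 2.0 * t2;

    assert(fs_total == alt_total);  /* totals match exactly       */
    (void)T2;
    return 0;
}
```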
 The HMD 1 can suppress the occurrence of color breakup by displaying images on the liquid crystal panel 2A by this alternate method. Color breakup is caused by the R, G, and B images failing to blend when perceived by the user; because this method switches among the R, G, and B lights at the second period T2, which is shorter than in the FS method, the colors are more likely to blend properly through the afterimage effect. Furthermore, the R, G, and B images of the FS method are reproduced unchanged in the displayed image, so even when the image is shown on the liquid crystal panel 2A by this alternate method, the HMD 1 can faithfully reproduce the appearance of the original image.
<Other Modifications>
 The present disclosure is not limited to the above embodiment, and various changes are possible. In the HMD 1 described above, the HD 2 is arranged in front of the user's left eye 8, but it may instead be arranged in front of the user's right eye. In the process of S17, the CPU 31 may display the image on the liquid crystal panel 2A by some other display method that differs from the FS method and is unlikely to cause color breakup. The CPU 31 may also specify the relative positional relationship between the user's left eye 8 and the HD 2 from the position of the HD 2 specified from the signal output by the FPGA 33 together with the position of the iris of the user's eye specified by the FPGA 34; this lets the CPU 31 specify the position of the HD 2 relative to the user's left eye 8 more accurately than when it relies on the signal from the FPGA 33 alone.
 The HMD 1 may photograph the wearing tool 10 with the camera 39, and the CPU 31 may specify the position of the HD 2 from the captured image data. The HMD 1 may likewise photograph the first ball joint 6 with the camera 39; the CPU 31 may then specify the distance between the centers C11 and C12 from the captured image data and specify the position of the HD 2 from that distance. The HMD 1 may also include a sensor that detects the rotation angles of the first ball joint 6 and the second ball joint 7, in which case the CPU 31 may specify the position of the HD 2 from the respective rotation angles of the two joints specified from this sensor.
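 As a rough illustration of the sensor-based variant, the position of the HD 2 follows from the joint angles by forward kinematics. The planar two-link reduction and all symbols below are assumptions; the actual joints are three-dimensional ball joints:

```c
#include <math.h>

typedef struct { float x, y; } pos2f;

/* Planar two-link forward kinematics: joint 1 (first ball joint)
 * at the origin, link lengths l1 (fixing portion to joint 2) and
 * l2 (joint 2 to the display).  theta1 and theta2 would come from
 * the joint-angle sensors. */
pos2f hd_position(float theta1, float theta2, float l1, float l2)
{
    pos2f p;
    p.x = l1 * cosf(theta1) + l2 * cosf(theta1 + theta2);
    p.y = l1 * sinf(theta1) + l2 * sinf(theta1 + theta2);
    return p;
}
```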
 The HMD 1 need not include the wearing tool 10. The fixing portion 14 may be fixed to a member other than the wearing tool 10, for example to a hat, helmet, or the like worn by the user. The first distance L1 when the HD 2 is arranged at the first position may be longer than the second distance L2 when the HD 2 is arranged at the second position. The rotation angle of the first ball joint 6 when the HD 2 is moved from the first position to the third position may be smaller than that of the second ball joint 7. Instead of the first ball joint 6 and the second ball joint 7, the mounting portion 11 may have another adjustment mechanism, for example a flexible joint formed of an elastic body such as rubber.
 The HMD 1 may specify the direction of the user's line of sight by another method. For example, the HMD 1 may have a light source that illuminates the user's face with infrared light and a camera capable of infrared imaging. The CPU 31 may detect the corneal reflection from the infrared light captured by the camera, specify the position of the pupil relative to the detected corneal reflection, and specify the direction of the line of sight from that pupil position.
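 This variant corresponds to the common pupil-center/corneal-reflection technique, in which the gaze direction is estimated from the offset between the pupil center and the infrared glint on the cornea. A minimal 2-D sketch; the per-axis gains would come from a per-user calibration and the values here are placeholders:

```c
typedef struct { float x, y; } point2f;
typedef struct { float yaw_deg, pitch_deg; } gaze_t;

/* Map the pupil-center minus glint offset (in pixels) to a gaze
 * angle with per-axis calibration gains. */
static gaze_t estimate_gaze(point2f pupil, point2f glint)
{
    const float kx = 0.12f, ky = 0.10f;   /* hypothetical gains */
    gaze_t g;
    g.yaw_deg   = kx * (pupil.x - glint.x);
    g.pitch_deg = ky * (pupil.y - glint.y);
    return g;
}
```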
 The FPGA 34 may output to the CPU 31 a signal indicating the position of the user's eye instead of a signal indicating the direction of the line of sight. The CPU 31 may specify the position of the user's eye from that signal, count the number of times the specified eye position changes by a predetermined amount or more, and further specify the number of such changes per unit time (for example, one second) as the change frequency (S23).
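 Counting position changes per unit time, as described here, might look like the following; the sampling rate and the change threshold are assumptions:

```c
#include <math.h>

/* Count eye-position changes of at least min_delta between
 * consecutive samples and return changes per second.
 * xs, ys   : sampled eye positions (n samples)
 * sample_hz: sampling rate (hypothetical)                       */
float change_frequency(const float *xs, const float *ys, int n,
                       float sample_hz, float min_delta)
{
    int changes = 0;
    for (int i = 1; i < n; ++i) {
        float dx = xs[i] - xs[i - 1];
        float dy = ys[i] - ys[i - 1];
        if (sqrtf(dx * dx + dy * dy) >= min_delta)
            ++changes;
    }
    if (n < 2 || sample_hz <= 0.0f)
        return 0.0f;
    /* n samples span (n - 1) / sample_hz seconds. */
    return (float)changes * sample_hz / (float)(n - 1);
}
```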
 The FPGA 36 may detect, from the signal output by the FPGA 35, whether the image is a black-and-white image or a color image, and output a signal indicating the result to the CPU 31. When the CPU 31 determines from that signal that the image is a color image, it may display the image on the liquid crystal panel 2A by the FS method (S17); when it determines that the image is a black-and-white image, it may display the image by the alternate method (S31).
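 The color/black-and-white test of this variant reduces to checking whether any pixel has unequal RGB channels; a sketch under an assumed 8-bit input, with the tolerance a guess:

```c
#include <stdint.h>
#include <stdbool.h>
#include <stdlib.h>

/* True if every pixel of an 8-bit RGB frame is (near-)gray, i.e.
 * the image is effectively black-and-white.  tol absorbs small
 * channel differences from noise or compression. */
bool is_monochrome(const uint8_t *rgb, size_t pixel_count, int tol)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        const uint8_t *p = &rgb[i * 3];
        if (abs(p[0] - p[1]) > tol || abs(p[1] - p[2]) > tol)
            return false;
    }
    return true;
}
```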
 When it is determined that the HD 2 is arranged at the first peripheral position or the second peripheral position (S19: YES, S19: NO), the CPU 31 may display the image on the liquid crystal panel 2A by the alternate method (S31) regardless of the direction of the line of sight or the ratio of the black area. Conversely, when the change frequency of the line of sight is determined to be less than the predetermined frequency (S25: YES), the CPU 31 may display the image on the liquid crystal panel 2A by the FS method (S17) regardless of the position of the HD 2 or the ratio of the black area.
 The position and number of cameras 39 provided on the housing 21 of the HD 2 to photograph the user's left eye 8 are not particularly limited. For example, as shown in FIG. 10(A), a camera 39A may be provided at the left end of the opening at the rear end of the housing 21 of the HD 2, a camera 39B may be provided at the right end, or both may be provided. When the camera 39 is provided on the HD 2, its imaging range can easily be directed toward the user, so the HMD 1 can photograph the user's left eye 8 appropriately. Moreover, because the wearing tool 10 of the HMD 1 is worn on the head, it tends to be made as small as possible, and sufficient space to fix the camera 39 may not be available on it. Even in such a case, mounting the camera 39 on the HD 2 allows the camera 39 to be held stably.
 Furthermore, the camera 39A may photograph the eye farther from the HD 2, that is, the right eye, and the FPGA 34 may specify the position of the right eye or the direction of its line of sight. Because the two eyes move together, the gaze direction of the eye farther from the HD 2 matches that of the eye nearer to it. Even when the processing of the above embodiment is performed on the basis of the right eye's gaze direction, the HMD 1 can therefore control the display method in accordance with changes in the position of the user's left eye 8. In addition, since the HMD 1 is worn on the head and tends to be miniaturized, sufficient space for an additional detection device may not be available on the side where the HD 2 is arranged. Even in such a case, by exploiting the fact that the user's eyes move together, the HMD 1 can place the detection device on the side where the HD 2 is not arranged and still control the display method in accordance with changes in the direction of the user's line of sight.
 As shown in FIG. 10(B), a half mirror 391 may be provided inside the housing 21 of the HD 2, behind the liquid crystal panel 2A. The half mirror 391 may reflect to the right the outside light entering through the opening at the rear end of the housing 21, and a camera 39C capable of imaging from the light reflected by the half mirror 391 may be provided to the right of the half mirror 391. The CPU 31 may specify the direction of the line of sight of the user's left eye 8 from the image data captured by the camera 39C. This configuration improves the accuracy of the specified gaze direction compared with the configuration of FIG. 10(A).
 The camera 39 may also be provided on the wearing tool 10 or on the mounting portion 11 (the fixing portion 14, the arm portion 12, or the like). When the camera 39 is attached to the wearing tool 10, the gaze direction of the right eye may be specified by photographing the user's right eye. Mounting the camera 39 on the wearing tool 10 also keeps its weight off the HD 2, so the HMD 1 can avoid the loss of wearability that a heavier HD 2 would cause.
<Others>
 The HD 2 is an example of the "display unit" of the present disclosure. The liquid crystal panel 2A is an example of the "display element". The first ball joint 6 and the second ball joint 7 are an example of the "adjustment unit". The FPGAs 33 and 34 are an example of the "position detection means". The CPU 31 performing the processes of S17 and S31 is an example of the "control means"; performing the process of S17, of the "first control means"; and performing the process of S31, of the "second control means". The first ball joint 6 is an example of the "first adjustment unit", and the second ball joint 7 of the "second adjustment unit". The FPGA 33 is an example of the "first position detection means", and the FPGA 34 of the "second position detection means". The camera 39 is an example of the "imaging unit". The FPGA 36 is an example of the "ratio detection means". The first peripheral region and the second peripheral region are examples of the "peripheral region".
1: HMD
2: HD
2A: Liquid crystal panel
6: First ball joint
7: Second ball joint
8: Left eye
10: Wearing tool
11: Mounting portion
12: Arm portion
14: Fixing portion
22: Emitting portion
31: CPU
33, 34, 35, 36: FPGA
37, 38, 41: Acceleration sensor
39: Camera

Claims (13)

  1. A head-mounted display comprising:
     a display unit having a display element that displays a color image by controlling the output timing of RGB light;
     a mounting portion, being a member for mounting the display unit on a user's head, that includes at least an adjustment unit capable of adjusting the position of the display unit with respect to the user's eye;
     position detection means for detecting a positional relationship between the user's eye and the display unit; and
     control means for causing the display element to display an image by a display method that outputs the RGB light at an output timing corresponding to the positional relationship detected by the position detection means.
  2. The head-mounted display according to claim 1, wherein
     the position detection means comprises first position detection means for detecting the position of the display unit, and
     the control means causes the display element to display the image by a display method that outputs the RGB light at an output timing corresponding to the position of the display unit detected by the first position detection means.
  3. The head-mounted display according to claim 2, wherein
     the adjustment unit can adjust the position of the display unit into either a central region, which includes the position of the display unit when the emitting portion from which the light of the image displayed on the display element is emitted faces the user's eye, or a peripheral region excluding the central region, and
     the control means comprises:
     first control means for causing the display element to display the image by a field sequential method when the position of the display unit detected by the first position detection means is included in the central region; and
     second control means for causing the display element to display the image by an alternate method different from the field sequential method when the position of the display unit detected by the first position detection means is included in the peripheral region.
  4. The head-mounted display according to claim 3, wherein
     the mounting portion has a fixing portion fixed to a wearing tool worn on the user's head and an arm portion connecting the fixing portion and the display unit,
     the adjustment unit has a first adjustment unit rotatably connecting the fixing portion and the arm portion and a second adjustment unit rotatably connecting the display unit and the arm portion, and, by rotation of at least one of the first adjustment unit and the second adjustment unit, can adjust the display unit between a first position, included in the central region, at which the distance between the emitting portion and the first adjustment unit is a first distance, and a second position, included in the peripheral region, at which the distance between the emitting portion and the first adjustment unit is a second distance longer than the first distance,
     the first control means causes the display element to display the image by the field sequential method when the position of the display unit detected by the first position detection means is included in the central region including the first position, and
     the second control means causes the display element to display the image by the alternate method different from the field sequential method when the position of the display unit detected by the first position detection means is included in the peripheral region including the second position.
  5. The head-mounted display according to claim 3, wherein
     the mounting portion has a fixing portion fixed to a wearing tool worn on the user's head and an arm portion connecting the fixing portion and the display unit,
     the adjustment unit has a first adjustment unit rotatably connecting the fixing portion and the arm portion and a second adjustment unit rotatably connecting the display unit and the arm portion, and, by rotation of at least one of the first adjustment unit and the second adjustment unit, can adjust the display unit between a first position included in the central region and a third position, included in the peripheral region, reached when, from the state in which the display unit is arranged at the first position, the first adjustment unit rotates by at least a predetermined angle larger than the rotation of the second adjustment unit,
     the first control means causes the display element to display the image by the field sequential method when the position of the display unit detected by the first position detection means is included in the central region including the first position, and
     the second control means causes the display element to display the image by the alternate method different from the field sequential method when the position of the display unit detected by the first position detection means is included in the peripheral region including the third position.
  6. The head-mounted display according to claim 1, wherein
     the position detection means comprises second position detection means for detecting the direction of the line of sight based on the position of the user's eye, and
     the control means causes the display element to display the image by a display method that outputs the RGB light at an output timing corresponding to changes in the direction of the line of sight detected by the second position detection means.
  7. The head-mounted display according to claim 6, wherein the control means comprises:
     first control means for causing the display element to display the image by a field sequential method when the frequency of change per unit time in the direction of the line of sight detected by the second position detection means is less than a predetermined frequency; and
     second control means for causing the display element to display the image by an alternate method different from the field sequential method when the frequency of change per unit time in the direction of the line of sight detected by the second position detection means is equal to or higher than the predetermined frequency.
  8. The head-mounted display according to claim 6 or 7, wherein the second position detection means detects a change in the position of, of the user's two eyes, the eye farther from the display unit.
  9. The head-mounted display according to any one of claims 6 to 8, further comprising an imaging unit that photographs the user's eye, wherein
     the second position detection means specifies the direction of the line of sight based on image data of the user's eye photographed by the imaging unit.
  10. The head-mounted display according to any one of claims 3 to 9, further comprising ratio detection means for detecting the ratio of the black area to the entire area of the image displayed on the display element, wherein
     the first control means causes the display element to display the image by the field sequential method when the ratio detected by the ratio detection means is less than a predetermined threshold, and
     the second control means causes the display element to display the image by the alternate method when the ratio detected by the ratio detection means is equal to or greater than the predetermined threshold.
  11. The head-mounted display according to any one of claims 3 to 10, wherein the field sequential method outputs the RGB light by switching among the colors in time division, and the alternate method outputs only one of the RGB lights.
  12. The head-mounted display according to any one of claims 3 to 10, wherein the field sequential method outputs the RGB light by switching among the colors in time division, and
     the alternate method outputs the RGB light simultaneously.
  13. A head-mounted display comprising:
     a display unit having a display element that displays a color image by controlling the output timing of RGB light;
     a mounting portion, being a member for mounting the display unit on a user's head, that has an adjustment unit capable of adjusting the position of the display unit with respect to the user's eye;
     position detection means for detecting a positional relationship between the user's eye and the display unit; and
     control means for causing the display element to display an image by a first method or a second method whose RGB output timings differ from each other, in accordance with the positional relationship detected by the position detection means,
     wherein the first method outputs the RGB light by switching among the colors in time division at a first period, and
     the second method outputs the RGB light by switching among the colors in time division at a second period shorter than the first period.
PCT/JP2017/029078 2016-12-20 2017-08-10 Head-mounted display WO2018116519A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016246892A JP6540676B2 (en) 2016-12-20 2016-12-20 Head mounted display
JP2016-246892 2016-12-20

Publications (1)

Publication Number Publication Date
WO2018116519A1 true WO2018116519A1 (en) 2018-06-28

Family

ID=62627332

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/029078 WO2018116519A1 (en) 2016-12-20 2017-08-10 Head-mounted display

Country Status (2)

Country Link
JP (1) JP6540676B2 (en)
WO (1) WO2018116519A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7444861B2 (en) * 2018-09-26 2024-03-06 マジック リープ, インコーポレイテッド Diffractive optical element with refractive power

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000276115A (en) * 1999-03-29 2000-10-06 Canon Inc Picture display device
JP2015526750A * 2012-06-12 2015-09-10 Recon Instruments Inc. Head-up display system for glasses
WO2016194232A1 (en) * 2015-06-05 2016-12-08 日立マクセル株式会社 Video display device and control method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009276593A (en) * 2008-05-15 2009-11-26 Nippon Seiki Co Ltd Head-up display device


Also Published As

Publication number Publication date
JP6540676B2 (en) 2019-07-10
JP2018101051A (en) 2018-06-28

Similar Documents

Publication Publication Date Title
JP6511386B2 (en) INFORMATION PROCESSING APPARATUS AND IMAGE GENERATION METHOD
JP6339239B2 (en) Head-mounted display device and video display system
JP6770536B2 (en) Techniques for displaying text more efficiently in virtual image generation systems
JP6378781B2 (en) Head-mounted display device and video display system
CN109960481B (en) Display system and control method thereof
US20170324899A1 (en) Image pickup apparatus, head-mounted display apparatus, information processing system and information processing method
JP6576536B2 (en) Information processing device
US9846305B2 (en) Head mounted display, method for controlling head mounted display, and computer program
US20160170482A1 (en) Display apparatus, and control method for display apparatus
US20210382316A1 (en) Gaze tracking apparatus and systems
JP2016122244A (en) Electronic apparatus, display device, and control method of electronic apparatus
EP3697086A1 (en) Information processing device, information processing method, and program
US11743447B2 (en) Gaze tracking apparatus and systems
WO2018116519A1 (en) Head-mounted display
JP6195126B2 (en) Image display device
WO2018061401A1 (en) Head-mounted display
JP6777052B2 (en) Head mounted display
JP2016116066A (en) Display device and control method of display device
EP3961572A1 (en) Image rendering system and method
JP2024007643A (en) Display system, control device, and method for displaying display system
JP6274531B2 (en) Head mounted display
JP2021047370A (en) Display device, method for controlling display device, control program of display device, and display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17885298

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17885298

Country of ref document: EP

Kind code of ref document: A1