WO2010073879A1 - Head-Mounted Display (Visiocasque) - Google Patents

Head-Mounted Display (Visiocasque)

Info

Publication number
WO2010073879A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
image
display
user
eyelid
Application number
PCT/JP2009/070151
Other languages
English (en)
Japanese (ja)
Inventor
Kunihiro Ito (邦宏 伊藤)
Original Assignee
Brother Industries, Ltd. (ブラザー工業株式会社)
Application filed by Brother Industries, Ltd. (ブラザー工業株式会社)
Publication of WO2010073879A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning

Definitions

  • The present invention relates to a head-mounted display.
  • In a known video display device, imaging means for detecting eye movement detects the blinking of the observer's eyes. When the number of blinks obtained by this imaging means falls to or below a predetermined reference value, the image based on normal image data is switched to an image based on specific image data (see, for example, Japanese Patent Laid-Open No. 2000-221953).
  • The present invention has been made to solve the above-described problems. Its object is to provide a head-mounted display in which the display state of a displayed image can be changed accurately by eye movement while preventing malfunction.
  • A head-mounted display of the present invention is mounted on a user's head and comprises: display means for projecting an image of display information onto one eye of the user so that the user can visually recognize the image; projection-side imaging means for imaging that one eye; other-side imaging means for imaging the user's other eye; eye-movement detection means for detecting, via at least one of the two imaging means, the eye movement of at least one of the user's eyes; eyelid-movement detection means for detecting a predetermined eyelid movement of the other eye via the other-side imaging means; and display control means that, when the predetermined eyelid movement is detected via the eyelid-movement detection means, controls the display means so as to change the display state of the image in accordance with the eye movement detected by the eye-movement detection means.
  • With this configuration, the user first performs the predetermined action with the eyelid of the other eye, onto which no image is projected, and can then change the display state of the image viewed with one eye by moving the line of sight of the eye onto which the image is projected, the other eye, or both eyes.
  • For example, by moving the gaze of one eye, the other eye, or both eyes to the left or right, the user can scroll a displayed page, such as a manual, to the left or right; by moving the gaze upward or downward, the user can scroll the page up or down.
  • As a result, the eyelid operation is performed with the eye that is not viewing the image, so the user is never forced to close the eye that is viewing the image and thereby miss the operation timing.
  • The eye-movement detection means may detect the eye movement of the user's one eye via either the projection-side imaging means or the other-side imaging means.
  • In that case, the user performs the predetermined action with the eyelid of the other eye, onto which no image is projected, and then changes the line-of-sight direction of the one eye while continuously checking the displayed image with that eye.
  • For example, by moving the line of sight of the one eye left or right, the user can scroll a page, such as a manual, displayed to that eye to the left or right; by moving it upward or downward, the user can scroll the page up or down.
  • Because the eyelid operation is performed with the eye that is not viewing the image, the user reliably avoids missing the operation timing, even when stopping an image that is being continuously scrolled or when operating while watching the image.
  • A head-mounted display according to another aspect of the present invention is mounted on the user's head and comprises: display means for projecting a display-information image onto one of the user's eyes so that the user can visually recognize the image; other-side imaging means for imaging the other eye; eye-movement detection means for detecting the eye movement of the other eye via the other-side imaging means; eyelid-movement detection means for detecting a predetermined eyelid movement of the other eye via the other-side imaging means; and display control means for controlling the display means so that, when the predetermined eyelid movement is detected via the eyelid-movement detection means, the display state of the image is changed in accordance with the eye movement detected by the eye-movement detection means.
  • With this configuration, the user performs the predetermined action with the eyelid of the other eye, onto which no image is projected, and can then change the display state of the image viewed with one eye by changing the gaze direction of the other eye. For example, by moving the line of sight of the other eye left or right, the user can scroll a page, such as a manual, displayed to one eye to the left or right; by moving it upward or downward, the user can scroll the page up or down.
  • Since the eyeball and eyelid movements are performed with the eye that is not viewing the image, the user does not miss the operation timing by closing an eyelid when timing an operation while observing the image, for example when operating a moving image or stopping an image during continuous scrolling.
  • The user can also time an operation by keeping the eyelid of the other eye closed for a predetermined time or longer and opening it at the moment of operation to begin line-of-sight input; this prevents erroneous operation and allows the operation to be performed at the exact intended timing.
  • The eyelid-movement detection means may detect, as the predetermined eyelid movement, the action of opening the eyelid of the other eye after it has been closed for a predetermined time or longer.
  • Alternatively, the eyelid-movement detection means may detect the action of closing the eyelid of the other eye as the predetermined eyelid movement, with the eye-movement detection means operating via the projection-side imaging means. In that case, while the eyelid-movement detection means detects that the eyelid of the other eye is closed, the display control means controls the display means so as to change the display state of the image in accordance with the eye movement of the user's one eye detected by the eye-movement detection means, and when the eyelid-movement detection means detects that the eyelid of the other eye has opened, it controls the display means so as to stop changing the display state of the image.
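The hold-while-closed variant above can be sketched as a small state machine: gaze scrolling becomes active once the non-viewing eyelid has stayed closed long enough, and opening that eyelid stops the display change. The following is an illustrative reconstruction only, not the patent's implementation; the names (`ScrollGate`, `on_frame`) and the 30-frame threshold are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

HOLD_FRAMES = 30  # assumed "predetermined time", in frames (about 1 s at 30 fps)

@dataclass
class ScrollGate:
    """Gaze scrolling is active only while the non-viewing eyelid has been
    closed long enough; opening the eyelid stops changing the display."""
    closed_frames: int = 0
    active: bool = False

    def on_frame(self, eyelid_closed: bool,
                 gaze_dir: Optional[str]) -> Optional[str]:
        """Per video frame: return a scroll command ('up'/'down'/'left'/'right')
        or None when no display change should occur."""
        if eyelid_closed:
            self.closed_frames += 1
            if self.closed_frames >= HOLD_FRAMES:
                self.active = True   # eyelid held closed: arm gaze scrolling
        else:
            self.closed_frames = 0
            self.active = False      # eyelid opened: stop changing the display
        return gaze_dir if self.active else None
```

For example, after 30 consecutive closed-eyelid frames the gate passes gaze directions through; a single open-eyelid frame disarms it again.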
  • FIG. 1 is a diagram showing the head-mounted display according to the first embodiment in its worn state.
  • FIG. 12 is a flowchart showing "content display control process 2", which controls the display of content by the emission device of the head-mounted display according to the second embodiment in accordance with the eye movement of the user's left eye.
  • FIG. 14 is a flowchart showing "content display control process 3", which controls the display of content by the emission device of the head-mounted display according to the third embodiment in accordance with the eye movements of both of the user's eyes.
  • FIG. 10 is an explanatory diagram illustrating an example of eye movement detected in the third embodiment.
  • FIG. 10 is a block diagram illustrating the electrical configuration of a head-mounted display according to the fourth embodiment.
  • FIG. 10 is a flowchart illustrating "content display control process 4", which controls the display of content by the emission device of the head-mounted display according to the fourth embodiment in accordance with the eye movement of the user's right eye.
  • A diagram showing an example of the relationship between the movement of the right eyeball and the displayed content (a work procedure manual).
  • The appearance of the head-mounted display 1 shown in FIGS. 1 and 2 is merely an example, and the present invention is not limited to a head-mounted display with this appearance.
  • The embodiments describe the present invention as applied to a retinal scanning display, but it can also be applied to various head-mounted displays that display images using liquid crystal elements, organic EL elements, or the like.
  • Likewise, the embodiments describe an optical see-through head-mounted display, but as long as the display presents an image to one of the user's eyes, the present invention can also be applied to a video see-through type or to a non-transparent type that blocks the outside view.
  • The head-mounted display 1 receives, as an image signal, various content information input from the outside, such as moving image files, still image files, and document files (for example, a work procedure manual or a spreadsheet file). Beam light modulated according to the image signal (hereinafter "image light") is scanned onto the retina of the left eye 5 of the user 2 (see FIGS. 4 and 6), so that an image corresponding to the content information (hereinafter "content") can be visually recognized by the left eye 5 of the user 2 (see FIG. 4).
  • Even while content is displayed, the user 2 can visually recognize external objects 3, scenery, and the like in the area of the field of view outside the content display area. That is, the head-mounted display 1 is a see-through type that projects image light to the left eye 5 of the user 2 while transmitting outside light, in other words, with the outside world still visible.
  • The head-mounted display 1 basically comprises a CCD camera 7, a CCD camera 8, an emission device 11, a prism 12, and a support member 13 that supports them.
  • The CCD camera 7 images the left eye 5 of the user 2, and the CCD camera 8 images the right eye 6 (see FIG. 6).
  • The emission device 11 emits image light Z1 toward the left eye 5 of the user 2, and the prism 12 reflects the image light Z1 toward that eye.
  • The prism 12 also includes a beam splitter (not shown) and transmits external light Z2 from the outside, guiding it to the left eye 5 of the user 2. In this way, the prism 12 causes both the image light Z1 incident from the side of the user 2 and the external light Z2 from the outside to enter the left eye 5, so that the user 2 can view the content superimposed on the outside scenery.
  • A half mirror can also be used in place of the prism 12: the image light Z1 from the emission device 11 is then reflected by the half mirror into the left eye 5 of the user 2, while the external light Z2 passes through the half mirror into the same eye.
  • The CCD camera 7 includes a CCD controller 15 (see FIG. 3) that controls the entire camera and a CCD sensor 16 (see FIG. 3) that captures the left eye 5 facing it. Likewise, the CCD camera 8 includes a CCD controller 17 (see FIG. 3) that controls the entire camera and a CCD sensor 18 (see FIG. 3) that captures the right eye 6 facing it.
  • The CCD controller 15 stores the image captured by the CCD sensor 16 in the left-eye image storage area of the RAM 53 of the control unit 36. This image is processed by the CPU 51, and, as described later, the display of content is controlled based on the result of this image processing (see FIG. 5).
  • The CCD controller 17 stores the image captured by the CCD sensor 18 in the right-eye image storage area of the RAM 53 of the control unit 36. This image is likewise processed by the CPU 51, and the start and end of content display (that is, whether content display is on or off) is controlled based on the result (see FIG. 5).
  • FIG. 3 is a block diagram showing an electrical configuration of the head mounted display 1.
  • The head-mounted display 1 includes the CCD camera 7, the CCD camera 8, the emission device 11, an input unit 31, an external connection unit 32, a video RAM 34, a font ROM 35, a control unit 36, a power supply unit 37, and the like.
  • The CCD camera 7 images the left eye 5 of the user 2, and the CCD camera 8 images the right eye 6. The emission device 11 emits image light to the left eye 5 of the user 2.
  • The input unit 31 accepts various operations and data, and the external connection unit 32 receives content information from the outside.
  • The video RAM 34 stores image data, such as content and text, to be displayed by the emission device 11, and the font ROM 35 stores the font data of the text displayed by the emission device 11.
  • The control unit 36 controls the entire head-mounted display 1. Details of the emission device 11 will be described later with reference to FIG.
  • The CCD camera 7 includes the CCD controller 15 and the CCD sensor 16; the CCD controller 15 stores the captured image of the left eye 5 taken by the CCD sensor 16 in the left-eye image storage area of the RAM 53 of the control unit 36.
  • The CCD camera 8 includes the CCD controller 17, the CCD sensor 18, and the like; the CCD controller 17 stores the captured image of the right eye 6 taken by the CCD sensor 18 in the right-eye image storage area of the RAM 53 of the control unit 36.
  • The input unit 31 includes an operation button group 41 comprising various function keys, and an input control circuit 42.
  • The input control circuit 42 detects key operations on the operation button group 41 and outputs a control signal corresponding to the operated key to the control unit 36. For example, when an eye-mouse start key (not shown) is operated, the input control circuit 42 outputs to the control unit 36 a control signal instructing execution of the program of "content display control process 1" described later (see FIG. 5). When an eye-mouse stop key (not shown) is operated, it outputs a control signal instructing the control unit 36 to end execution of that program.
  • The external connection unit 32 includes an external connection module 45, which is connected to an external device (for example, a personal computer) and acquires content information, and an external connection control circuit 46. The external connection control circuit 46 outputs the content information input from the external connection module 45 to the control unit 36, and the content information is stored in the flash memory 52.
  • the control unit 36 includes a CPU 51, a flash memory 52, a RAM 53, and the like, which are connected to each other via a bus line (not shown) and exchange data with each other.
  • The CPU 51 is an arithmetic processing unit that controls the entire head-mounted display 1 and manages all data related to its operation. The CPU 51 also converts various content information into image signals and stores them in the video RAM 34. As described later, the CPU 51 processes the images of the user's two eyes 5 and 6 captured by the CCD sensors 16 and 18 and stored in the left-eye and right-eye image storage areas of the RAM 53, detects the movement of each eyelid and eyeball, and controls the display of the content projected to the left eye 5 of the user 2 according to the detection results (see FIG. 5).
  • The flash memory 52 stores various programs executed by the CPU 51. For example, it stores the control processing programs that drive and control the emission device 11 to perform display control of content displayed on the head-mounted display 1, such as turning a page to the left or right and scrolling up or down, according to the eye movement of the left eye 5 or the right eye 6, as described later. The flash memory 52 also stores the content information input from the external connection control circuit 46.
  • The various content information includes work procedure manuals (document files), spreadsheet files, moving image files, and the like. Part or all of the various programs may instead be stored in a read-only ROM (not shown).
  • The RAM 53 temporarily stores the images captured by the CCD sensors 16 and 18, as well as various data used when the CPU 51 executes various controls.
  • the video RAM 34 temporarily stores image signals.
  • the image signal stored in the video RAM 34 is output to the video signal processing circuit 63 of the emission device 11 and projected onto the retina of the user 2 as content. Therefore, the CPU 51 can control the content to be displayed by reading the content information from the flash memory 52, converting the read content information into an image signal, and writing it into the video RAM 34.
  • the power supply unit 37 includes a battery 56 and a charge control circuit 57.
  • the battery 56 is a power source for driving the head mounted display 1.
  • The charging control circuit 57 supplies power from the battery 56 to the head-mounted display 1 and charges the battery 56 with power supplied from a charging adapter (not shown).
  • The emission device 11 is provided with a light source unit 61.
  • The light source unit 61 generates image light based on the image signal input from the video RAM 34 and performs scanning control of the image light in the horizontal and vertical directions based on the scanning control signal input from the CPU 51.
  • the light source unit 61 is provided with a video signal processing circuit 63.
  • The video signal processing circuit 63 receives the image signal from the video RAM 34 and the scanning control signal from the CPU 51, and generates the signals that serve as elements for synthesizing the video corresponding to the image signal. Specifically, it outputs blue (B), green (G), and red (R) output control signals 65A to 65C for each pixel, together with a horizontal synchronizing signal 66 and a vertical synchronizing signal 67.
  • The light source unit 61 is provided with a light source unit 71 and a light combining unit 72.
  • The light source unit 71 functions as an image-light output unit that converts the three output control signals 65A to 65C, output from the video signal processing circuit 63 for each dot clock, into image light.
  • The light combining unit 72 combines these three image lights into one to generate arbitrary image light.
  • the light source unit 71 includes a B laser 74 that generates blue image light and a B laser driver 75 that drives the B laser 74 in accordance with the output control signal 65A.
  • the light source unit 71 includes a G laser 76 that generates green image light and a G laser driver 77 that drives the G laser 76 in accordance with the output control signal 65B.
  • the light source unit 71 includes an R laser 78 that generates red image light and an R laser driver 79 that drives the R laser 78 in accordance with the output control signal 65C.
  • The lasers 74, 76, and 78 can be configured as, for example, semiconductor lasers or solid-state lasers with a harmonic generation mechanism. If semiconductor lasers are used, the drive current can be modulated directly to modulate the intensity of the image light; if solid-state lasers are used, each laser needs an external modulator to modulate the intensity of the image light.
  • the light combining unit 72 includes collimating optical systems 81, 82, 83, dichroic mirrors 84, 85, 86, and a coupling optical system 87.
  • Each of the collimating optical systems 81, 82, and 83 is provided so as to collimate the three image lights incident from the light source unit 71 into parallel lights.
  • Each dichroic mirror 84, 85, 86 combines the three collimated image lights.
  • the coupling optical system 87 guides the combined image light to the optical fiber 88.
  • The laser beams emitted from the lasers 74, 76, and 78 are collimated by the collimating optical systems 81, 82, and 83 and then enter the dichroic mirrors 84, 85, and 86. The dichroic mirrors selectively reflect or transmit each image light according to its wavelength; the light then reaches the coupling optical system 87, is condensed, and enters the optical fiber 88.
  • The emission device 11 further includes a collimating optical system 89, a horizontal scanning unit 92, a vertical scanning unit 93, a relay optical system 94 provided between the horizontal scanning unit 92 and the vertical scanning unit 93, and a relay optical system 96.
  • the collimating optical system 89 collimates the image light generated by the light source unit 61 and emitted through the optical fiber 88 into parallel light.
  • the horizontal scanning unit 92 reciprocally scans the image light collimated by the collimating optical system 89 in the horizontal direction.
  • the vertical scanning unit 93 scans the image light that has been reciprocated in the horizontal direction by the horizontal scanning unit 92 in the vertical direction.
  • the relay optical system 96 emits the image light thus scanned in the horizontal direction and the vertical direction (two-dimensionally scanned) to the pupil 95 of the left eye 5 of the user 2.
  • the prism 12 is disposed between the relay optical system 96 and the pupil 95 of the user 2 and guides the image light emitted from the emission device 11 to the pupil 95 of the user 2 by totally reflecting the light.
  • the horizontal scanning unit 92 is provided with a resonant deflection element 101, a horizontal scanning control circuit 102, and a horizontal scanning angle detection circuit 103.
  • the resonant deflection element 101 has a reflective surface 101A for reciprocally scanning image light in the horizontal direction.
  • the horizontal scanning control circuit 102 performs drive control based on the horizontal synchronizing signal 66 input from the video signal processing circuit 63 so as to resonate the resonance type deflection element 101 and swing the reflecting surface 101A.
  • the horizontal scanning angle detection circuit 103 detects a swinging state such as a swinging angle and a swinging frequency of the resonant deflection element 101.
  • the vertical scanning unit 93 is provided with a deflection element 105, a vertical scanning control circuit 106, and a vertical scanning angle detection circuit 107.
  • the deflection element 105 has a reflection surface 105A for scanning image light in the vertical direction.
  • the vertical scanning control circuit 106 drives and controls the deflection element 105 to swing the reflecting surface 105A.
  • the vertical scanning angle detection circuit 107 detects a swing state such as a swing angle and a swing frequency of the deflection element 105.
  • the horizontal scanning angle detection circuit 103 outputs a swing signal 104 indicating the detected swing state of the resonance type deflection element 101 to the CPU 51.
  • the vertical scanning angle detection circuit 107 outputs a swing signal 108 indicating the detected swing state of the deflection element 105 to the CPU 51.
  • the CPU 51 determines a scanning control signal to be output to the video signal processing circuit 63 based on the received swing signals 104 and 108.
  • the drive states of the resonant deflection element 101 and the deflection element 105 are feedback controlled by the CPU 51.
  • the swing signals 104 and 108 may be output to the video signal processing circuit 63.
  • the video signal processing circuit 63 may be configured to output the horizontal synchronization signal 66 and the vertical synchronization signal 67 based on the swing signals 104 and 108. That is, the drive state of the resonance type deflection element 101 and the deflection element 105 may be feedback controlled by the video signal processing circuit 63.
  • The light scanned in the horizontal direction by the resonant deflection element 101 is converged on the reflection surface 105A of the deflection element 105 by the relay optical system 94 provided between the horizontal scanning unit 92 and the vertical scanning unit 93. The converged light is then scanned in the vertical direction by the deflection element 105 and emitted to the relay optical system 96 as image light scanned two-dimensionally.
  • The relay optical system 96 converts the scanned image light emitted from the vertical scanning unit 93 into substantially parallel image light and converges the center lines of these image lights onto the pupil 95 of the user 2.
  • The arrangement of the horizontal scanning unit 92 and the vertical scanning unit 93 may be switched, so that the image light incident from the optical fiber 88 is scanned in the vertical direction by the vertical scanning unit 93 first and then in the horizontal direction by the horizontal scanning unit 92.
  • The program shown in the flowchart of FIG. 5 is stored in the flash memory 52 of the control unit 36 and is executed by the CPU 51 when a control signal corresponding to an operation of the eye-mouse start key (not shown) in the operation button group 41 is input from the input control circuit 42 to the control unit 36.
  • Execution of the program shown in the flowchart of FIG. 5 is terminated by the CPU 51 when a control signal corresponding to an operation of the eye-mouse stop key (not shown) in the operation button group 41 is input from the input control circuit 42 to the control unit 36.
  • The image signals captured by the CCD sensors 16 and 18 are stored in the left-eye and right-eye image storage areas of the RAM 53, respectively, at a predetermined frame rate (for example, about 30 frames per second). It is assumed that the content information to be displayed is stored in advance in the flash memory 52, and that, when a plurality of pieces of content information are stored, the one to be displayed has been set in advance.
  • In step (hereinafter abbreviated as "S") 11, if content information such as a work procedure manual (document file) input from the external connection control circuit 46 is stored in the flash memory 52, the CPU 51 reads the content information, converts it into an image signal, and stores it in the video RAM 34. The image signal is thereby input from the video RAM 34 to the video signal processing circuit 63, and the content (image) corresponding to the content information is displayed to the left eye 5 of the user 2 via the emission device 11.
  • When the content information is a work procedure manual (document file), the content corresponding to the first page (see FIG. 8), or to the first and second pages in a two-page spread (see FIG. 9), is displayed to the left eye 5 of the user 2 via the emission device 11.
  • The CPU 51 then performs image recognition on the image data stored in the right-eye image storage area of the RAM 53, that is, the image data of the right eye 6 of the user 2, for a predetermined number of frames (for example, about 30 frames), and stores in the RAM 53 the resulting image-recognition data for the right eye 6, the eye on the side where the emission device 11 is not mounted.
  • In this image recognition, when the white and the black (pupil-colored) parts of the eyeball are detected, the eyelid is recognized as not closed; when the white and black parts of the eyeball are not detected and mainly skin color is detected, the eyelid is recognized as closed.
  • The CPU 51 reads the image recognition data for the predetermined number of frames of the right eye 6 of the user 2 from the RAM 53 and executes a determination process for determining whether or not the eyelid 6A (see FIG. 6) of the right eye 6 is continuously closed. Specifically, the CPU 51 reads the image recognition data for the predetermined number of frames of the right eye 6, and determines that the eyelid 6A is closed when the eyeball of the right eye 6 is not detected, that is, when skin color is mainly detected.
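  • The continuous-closure determination could look like the following sketch, assuming each frame has already been classified as eyelid-open or eyelid-closed; the 30-frame window is the example value from the text, and the function name is illustrative.

```python
def eyelid_continuously_closed(frame_states, n_frames=30):
    """frame_states: per-frame booleans, True = eyelid closed in that
    frame (about 30 frames corresponds to roughly one second of video).
    Returns True only when the eyelid is closed in every one of the
    most recent n_frames frames, which is what distinguishes a
    deliberate closure from a brief physiological blink."""
    return len(frame_states) >= n_frames and all(frame_states[-n_frames:])
```

  • A blink closes the eyelid for only a few frames, so it never satisfies the all-frames condition.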
  • The CPU 51 performs image recognition on the image data stored in the left eye image storage area of the RAM 53, that is, the image data of the left eye 5 of the user 2, for a predetermined number of frames (for example, about 30 frames). Then, the CPU 51 stores in the RAM 53 the image recognition data for the predetermined number of frames of the left eye 5 of the user 2, that is, the left eye 5 on the side on which the emission device 11 is mounted.
  • The CPU 51 reads the image recognition data for the predetermined number of frames of the left eye 5 of the user 2 from the RAM 53 and executes a determination process for determining whether or not the eyeball 5A (see FIG. 6) of the left eye 5 has moved.
  • Specifically, the CPU 51 detects the moving direction of the eyeball 5A, that is, the moving direction of the pupil of the eyeball 5A, from the image recognition data for the predetermined number of frames of the left eye 5 captured by the CCD camera 7. When the pupil has moved, the CPU 51 determines that the eyeball 5A has moved in that direction, and stores the movement direction in the RAM 53 as the movement direction of the eyeball 5A. Note that the CPU 51 determines that the eyeball 5A has moved in each direction when the pupil of the eyeball 5A moves in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124 from the center position of the white of the eye. When the pupil has not moved, the CPU 51 determines that the eyeball 5A has not moved.
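  • Detecting the movement direction from the pupil position relative to the center position of the white of the eye can be sketched as follows; the pixel coordinate convention (y grows downward) and the dead-zone value are assumptions for illustration.

```python
def eyeball_direction(pupil_xy, center_xy, dead_zone=5):
    """Map the detected pupil position to one of four directions.

    pupil_xy / center_xy: pixel coordinates of the pupil and of the
    center position of the white of the eye. dead_zone (pixels) is an
    assumed tolerance below which no movement is reported. Image
    coordinates grow downward, so a negative dy means "up".
    Returns "up", "down", "left", "right", or None (no movement).
    """
    dx = pupil_xy[0] - center_xy[0]
    dy = pupil_xy[1] - center_xy[1]
    if max(abs(dx), abs(dy)) < dead_zone:
        return None  # pupil still near the center: eyeball has not moved
    if abs(dy) >= abs(dx):
        return "up" if dy < 0 else "down"
    return "left" if dx < 0 else "right"
```

  • The dominant axis decides the direction, so a diagonal movement is reported as whichever of the four directions it is closest to.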
  • An example in which content (image) display control is performed based on the contents of the control table 131 stored in advance in the flash memory 52 will be described with reference to FIGS. 7 to 9.
  • Here, the spread page display (display of two pages) is used, but it goes without saying that a single page may be displayed.
  • For example, when the eyeball 5A of the left eye 5 is moved in the upward direction 121 in a state where the second half of the third page and the first half of the fourth page are displayed, the entire third page is displayed.
  • The scroll amount can be a predetermined number of rows or columns. Further, the CPU 51 may recognize the moving speed of the eyeball 5A and determine the scroll amount according to the recognized speed.
  • When the eyeball 5A of the left eye 5 is moved in the right direction 124 while the second page and the third page are displayed, the fourth page and the fifth page are displayed. If the eyeball 5A of the left eye 5 is moved in the left direction 123 while the fourth page and the fifth page are displayed, the second page and the third page are displayed.
  • When it is determined in S15 that the eyeball 5A of the left eye 5 has moved in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, the CPU 51 performs “up scroll”, “down scroll”, “left scroll”, or “right scroll”, respectively, on the content (image) being displayed.
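  • The association between movement directions and operations held in the control table 131 can be illustrated as a simple lookup; the dictionary below is a hypothetical stand-in covering only the four scroll operations named above, not the actual table contents.

```python
# Hypothetical stand-in for the control table 131: each detected
# movement direction of the eyeball is associated with an operation
# on the displayed content (image).
CONTROL_TABLE = {
    "up": "up scroll",
    "down": "down scroll",
    "left": "left scroll",
    "right": "right scroll",
}

def operation_for(direction):
    """Return the display operation for a detected eye movement
    direction, or None when the eyeball has not moved."""
    return CONTROL_TABLE.get(direction)
```

  • Keeping the mapping in a table rather than in branching code mirrors the patent's design: the same movement-detection logic can drive different operations (scrolling, page turning, playback seeking) simply by swapping table contents.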
  • When the content is a moving image file and the eyeball 5A moves in the left direction 123, the CPU 51 returns the playback position of the moving image file; when the eyeball 5A moves in the right direction 124, the CPU 51 advances the playback position. The return/advance amount of the playback position can be one chapter or a predetermined time (for example, about 30 seconds).
  • Further, the CPU 51 may recognize the moving speed of the eyeball 5A and determine the return/advance amount of the playback position according to the recognized speed.
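  • Determining the amount from the recognized speed, as suggested here, might look like the following sketch; the reference speed and the linear scaling are assumptions, since the patent only says the amount is determined "according to the recognized speed".

```python
def seek_amount(eye_speed, base_seconds=30.0, base_speed=1.0):
    """Scale the return/advance amount of the playback position by the
    recognized moving speed of the eyeball.

    base_seconds corresponds to the predetermined time of about 30
    seconds mentioned in the text; base_speed is an assumed reference
    speed (e.g. in pixels per frame). A faster eye movement seeks
    farther, a slower one seeks less.
    """
    return base_seconds * (eye_speed / base_speed)
```

  • The same scaling idea applies equally to the scroll amount mentioned earlier.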
  • As described above, the CPU 51 performs image recognition on the captured image of the right eye 6 captured by the CCD sensor 18. When the eyelid 6A of the right eye 6 is closed, the CPU 51 performs image recognition on the captured image of the left eye 5 captured by the CCD sensor 16 and detects the moving direction of the eyeball 5A onto which the content (image) is projected.
  • The CPU 51 performs display control, such as upward scrolling and downward scrolling, of the content (image) based on the movement of the eyeball 5A in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124.
  • Thereby, by closing the right eye 6, onto which no content is projected, and moving the line of sight of the left eye 5 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, the user 2 can scroll or turn the pages of content (images) such as the displayed work procedure manual in the vertical and horizontal directions, and can change the playback position of a moving image displayed to the left eye 5.
  • That is, the content (image) can be manipulated by closing the eyelid 6A of the right eye 6 and moving the gaze direction of the left eye 5 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, and work efficiency can be improved.
  • For example, as shown in FIG. 1, when a work procedure manual relating to the object 3 is displayed on the head mounted display 1, a predetermined page of the work procedure manual or the like can be displayed by moving the line of sight of the left eye 5 up and down or left and right while handling the object 3 with both hands.
  • The user 2 moves the line of sight of the left eye 5 up and down or left and right only while the eyelid 6A of the right eye 6 is closed, and thereby manipulates the display state of the content (image) visually recognized by the left eye 5.
  • Further, the CPU 51 can accurately distinguish the operation of the eyelid 6A of the right eye 6 from a physiological blink, and can prevent malfunction of the head mounted display 1. Moreover, since the CPU 51 can distinguish, by the opening and closing of the eyelid 6A of the right eye 6, between a physiological movement of the left eye 5 and a movement of the eyeball 5A made for operation, it is possible to prevent malfunction of the head mounted display 1 due to physiological movements of the left eye 5.
  • Next, the head mounted display 151 according to the second embodiment will be described with reference to FIG. 10.
  • In the following description, the same reference numerals as in the first embodiment denote the same or corresponding parts of the head mounted display 1 according to the first embodiment.
  • the schematic configuration of the head mounted display 151 according to the second embodiment is the same as that of the head mounted display 1 according to the first embodiment.
  • Various control processes of the head mounted display 151 are substantially the same control processes as those of the head mounted display 1 according to the first embodiment.
  • the CPU 51 of the head mounted display 151 is different in that, instead of the “content display control process 1” (S11 to S16), a “content display control process 2” described later is executed.
  • The “content display control process 2” is a process executed by the CPU 51 of the head mounted display 151 according to the second embodiment, in which the content (image) displayed by the emission device 11 is display-controlled according to the eye movement of the left eye 5 of the user 2. The “content display control process 2” will be described with reference to FIG. 10.
  • In S111 to S112, the CPU 51 executes the same processing as in S11 to S12 (see FIG. 5). However, in S112, the CPU 51 stores in the RAM 53 image recognition data for the number of frames corresponding to a predetermined time (for example, about 2 seconds) of the right eye 6, on the side on which the emission device 11 is not mounted.
  • In S113, the CPU 51 reads from the RAM 53 the image recognition data for the number of frames (for example, about 60 frames) corresponding to the predetermined time (for example, about 2 seconds) of the right eye 6 of the user 2, and executes a determination process for determining whether or not the eyelid 6A of the right eye 6 has been continuously opened after being closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second).
  • When this is not the case, the CPU 51 executes the processing from S111 again.
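  • The gate condition described here — the eyelid closed for about one second and then opened, as opposed to a brief blink — can be sketched as a scan over the per-frame eyelid states; the frame counts are the example values from the text, and the function name is illustrative.

```python
def closed_then_opened(frame_states, closed_frames=30):
    """frame_states: booleans over ~2 s of video, True = eyelid closed.

    Returns True when the sequence contains a run of at least
    closed_frames consecutive closed frames (about 1 second) that is
    followed by at least one open frame, i.e. the eyelid was closed
    for the predetermined time and then continuously opened.
    """
    run = 0
    for closed in frame_states:
        if closed:
            run += 1
        else:
            if run >= closed_frames:
                return True  # closed long enough, now opened
            run = 0  # too short a closure (a blink); start over
    return False  # never opened after a long enough closure
```

  • A blink resets the run counter before it reaches the threshold, and an eyelid that stays closed never triggers, so only the deliberate close-then-open gesture is recognized.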
  • the CPU 51 executes the same processing as in S112 to S113.
  • When the eyelid 6A of the right eye 6 has not been continuously opened after being closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), the CPU 51 executes the processing from S114 again.
  • As described above, the CPU 51 performs image recognition on the captured image of the right eye 6 captured by the CCD sensor 18. Then, when the CPU 51 recognizes that the eyelid 6A of the right eye 6 has been continuously opened after being closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), the CPU 51 performs image recognition on the captured image of the left eye 5 captured by the CCD sensor 16 and detects the moving direction of the eyeball 5A onto which the content (image) is projected. Note that the CPU 51 determines that the eyeball 5A has moved in each direction when the pupil of the eyeball 5A moves in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124 from the center position of the white of the eye.
  • The CPU 51 performs display control, such as upward scrolling and downward scrolling, of the content (image) based on the movement of the eyeball 5A in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124.
  • Thereby, by closing the eyelid 6A of the right eye 6, onto which no content is projected, for a predetermined time (for example, about 1 second) and then opening it, and then moving the line-of-sight direction of the left eye 5 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, the user 2 can scroll or turn the pages of content (images) such as the work procedure manual displayed to the left eye 5 in the vertical or horizontal direction, and can change the playback position of a moving image displayed to the left eye 5.
  • Further, by again closing the eyelid 6A of the right eye 6, onto which no content is projected, for a predetermined time (for example, about 1 second) and then opening it, the user 2 can cancel the scroll display and page turning of content (images) such as the work procedure manual caused by changes in the line-of-sight direction of the left eye 5. For this reason, it becomes possible to prevent changes in the display state of the content (image) caused by changes in the line-of-sight direction of the left eye 5 that are unnecessary for the user 2.
  • That is, the content (image) can be manipulated by closing the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second), then opening it, and moving the line-of-sight direction of the left eye 5 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, and work efficiency can be improved.
  • For example, as in FIG. 1, when a work procedure manual relating to the object 3 is displayed on the head mounted display 151, a predetermined page of the work procedure manual or the like can be displayed by moving the line of sight of the left eye 5 up and down or left and right while handling the object 3 with both hands.
  • Further, by again closing the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second) and then opening it, the user 2 can cancel the operation of the content (image) according to the line-of-sight direction of the left eye 5, and can change the line-of-sight direction of both eyes 5 and 6 with a predetermined page of the work procedure manual or the like opened.
  • Thus, the user 2 can manipulate the display state of the content (image) visually recognized by the left eye 5 by closing the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second), then opening it, and moving the line of sight of the left eye 5 up and down or left and right. Further, the CPU 51 can accurately distinguish the operation of the eyelid 6A of the right eye 6 from a physiological blink, and can prevent malfunction of the head mounted display 151.
  • Next, the head mounted display 161 according to the third embodiment will be described with reference to FIGS. 11 and 12.
  • In the following description, the same reference numerals as in the first embodiment denote the same or corresponding parts of the head mounted display 1 according to the first embodiment.
  • the schematic configuration of the head mounted display 161 according to the third embodiment is the same as that of the head mounted display 1 according to the first embodiment.
  • Various control processes of the head mounted display 161 are substantially the same control processes as those of the head mounted display 151 according to the second embodiment.
  • However, the CPU 51 of the head mounted display 161 differs in that, instead of the “content display control process 2” (S111 to S118), it executes a “content display control process 3” described later, which display-controls the content (image) according to the eye movements of both eyes 5 and 6 of the user 2.
  • The “content display control process 3” is a process executed by the CPU 51 of the head mounted display 161 according to the third embodiment, in which the display of the content (image) displayed by the emission device 11 is controlled according to the eye movements of both eyes 5 and 6 of the user 2. The “content display control process 3” will be described with reference to FIGS. 11 and 12.
  • The program shown in the flowchart of FIG. 11 is stored in the flash memory 52 of the control unit 36, and is executed by the CPU 51 when a control signal corresponding to an operation of an eye mouse start key (not shown) included in the operation button group 41 is input from the input control circuit 42 to the control unit 36.
  • Further, the execution of the program shown in the flowchart of FIG. 11 is terminated by the CPU 51 when a control signal corresponding to an operation of an eye mouse stop key (not shown) included in the operation button group 41 is input from the input control circuit 42 to the control unit 36.
  • In S211 to S213, the CPU 51 executes the same processing as in S111 to S113 (see FIG. 10).
  • In S213, when the CPU 51 determines that the eyelid 6A of the right eye 6 has not been closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), that is, when the eyelid 6A of the right eye 6 has not been closed for the predetermined time or has only been closed in a blink (S213: NO), the CPU 51 executes the processing from S211 again.
  • the CPU 51 stores image recognition data for the number of frames corresponding to a predetermined time (for example, about 2 seconds) of the right eye 6 on the side where the emission device 11 is not mounted in the RAM 53.
  • When, in S213, the CPU 51 determines that the eyelid 6A of the right eye 6 has been continuously opened after being closed for the number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second) (S213: YES), the process proceeds to S214.
  • In S214, the CPU 51 performs image recognition on the image data at the same timing stored in the left eye image storage area and the right eye image storage area of the RAM 53, for a predetermined number of frames (for example, about 30 frames), and stores in the RAM 53 the image recognition data for the predetermined number of frames at the same timing of both eyes 5 and 6 of the user 2.
  • Next, the CPU 51 reads from the RAM 53 the image recognition data for the predetermined number of frames at the same timing of both eyes 5 and 6 of the user 2, and executes a determination process for determining whether or not the eyeballs 5A and 6B of both eyes 5 and 6 (see FIG. 12) have moved at the same time.
  • Specifically, the CPU 51 detects the moving direction of each of the eyeballs 5A and 6B, that is, the moving direction of the pupil of each of the eyeballs 5A and 6B, from the image recognition data for the predetermined number of frames at the same timing of both eyes 5 and 6 captured by the CCD cameras 7 and 8.
  • Note that the CPU 51 determines that the eyeballs 5A and 6B have moved in each direction when the pupils of the eyeballs 5A and 6B move in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124 from the center position of the white of the eye. When the pupils of both eyeballs have moved in the same direction, the CPU 51 determines that the eyeballs 5A and 6B have simultaneously moved in that direction, and stores this moving direction in the RAM 53 as the moving direction of the eyeballs 5A and 6B. Otherwise, the CPU 51 determines that the eyeballs 5A and 6B have not moved at the same time.
  • When it is determined that the eyeballs 5A and 6B have moved at the same time, the CPU 51 proceeds to the processing of S216.
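  • The simultaneous-movement determination for both eyes can be sketched as follows, assuming a per-frame movement direction (or None for no movement) has already been detected for each eye; the function name is illustrative.

```python
def simultaneous_direction(left_dirs, right_dirs):
    """left_dirs / right_dirs: per-frame detected movement directions
    of the eyeballs 5A and 6B ("up", "down", "left", "right", or
    None). Returns the common direction only when both eyes move in
    the same direction in the same frame; otherwise returns None,
    i.e. the eyeballs are determined not to have moved at the same
    time."""
    for left, right in zip(left_dirs, right_dirs):
        if left is not None and left == right:
            return left
    return None
```

  • Requiring agreement between both eyes is what filters out a one-eyed glance or detection noise in a single camera.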
  • the CPU 51 performs display control of content (image) based on the content of the control table 131 (see FIG. 7) stored in advance in the flash memory 52, and then proceeds to the processing of S217.
  • The scroll amount can be a predetermined number of rows or columns. Further, the moving speed of each of the eyeballs 5A and 6B may be recognized, and the scroll amount may be determined according to the recognized speed.
  • When the eyeballs 5A and 6B of both eyes 5 and 6 are simultaneously moved in the right direction 124 while the second page and the third page are displayed, the fourth page and the fifth page are displayed. If the eyeballs 5A and 6B of both eyes 5 and 6 are simultaneously moved in the left direction 123 while the fourth page and the fifth page are displayed, the second page and the third page are displayed.
  • Further, the CPU 51 may recognize the moving speed of each of the eyeballs 5A and 6B and determine the return/advance amount of the playback position according to the recognized speed. As a result, even if the user 2 closes the right eye 6 on the non-wearing side for a certain period of time in order to start an operation, the user 2 can always recognize the moving image with the left eye 5 on the wearing side and can perform the appropriate operation.
  • In S218, when the CPU 51 determines that the eyelid 6A of the right eye 6 has not been closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), that is, when the eyelid 6A of the right eye 6 has not been closed for the predetermined time or has only been closed in a blink (S218: NO), the CPU 51 executes the processing from S214 again.
  • When the eyelid 6A of the right eye 6 has been continuously opened after being closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second) (S218: YES), the CPU 51 ends the display control by the movement of the eyeballs 5A and 6B.
  • As described above, the CPU 51 performs image recognition on the captured image of the right eye 6 captured by the CCD sensor 18. Then, when the CPU 51 recognizes that the eyelid 6A of the right eye 6 has been continuously opened after being closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), the CPU 51 performs image recognition on the captured images of both eyes 5 and 6 captured by the CCD sensors 16 and 18, and detects the moving directions of the eyeballs 5A and 6B of both eyes 5 and 6.
  • The CPU 51 then performs display control, such as upward scrolling and downward scrolling, of the content (image) projected to the left eye 5 based on the movement of the eyeballs 5A and 6B in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124.
  • When the CPU 51 recognizes that the eyelid 6A of the right eye 6 has again been continuously opened after being closed for the predetermined number of frames, the CPU 51 ends the detection of the moving directions of the eyeballs 5A and 6B, ends the display control of the content (image) by the movement of the eyeballs 5A and 6B, and performs normal content display.
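  • The overall toggling behavior described here — the same close-then-open gesture of the right eyelid first starting the eye operation mode and, when repeated, ending it and returning to normal content display — can be summarized as a tiny state machine; the class and method names are illustrative, not from the patent.

```python
class EyeMouse:
    """Minimal sketch of the mode toggling: each recognized
    "closed ~1 s then opened" gesture of the right eyelid flips
    between detecting eye movement and normal content display."""

    def __init__(self):
        self.active = False  # start in normal content display

    def on_close_open_gesture(self):
        # The same gesture enables eye operation when inactive and
        # ends it when active.
        self.active = not self.active
        return self.active

mouse = EyeMouse()
```

  • Because one gesture serves as both the start and the stop signal, the user never needs a separate cancel control while both hands are occupied.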
  • Thereby, by closing the eyelid 6A of the right eye 6, onto which no content is projected, for a predetermined time (for example, about 1 second) and then opening it, and then simultaneously moving the line-of-sight directions of both eyes 5 and 6 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, the user 2 can scroll or turn the pages of the content (image) displayed to the left eye 5 and can change the playback position of a moving image displayed to the left eye 5.
  • Further, by again closing the eyelid 6A of the right eye 6, onto which no content is projected, for a predetermined time (for example, about 1 second) and then opening it, the user 2 can cancel the scroll display and page turning of content (images) such as the work procedure manual caused by changes in the line-of-sight directions of both eyes 5 and 6. Thus, the user 2 can prevent changes in the display state of the content (image) due to unnecessary changes in the viewing direction.
  • That is, the content (image) can be manipulated by closing the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second), then opening it, and simultaneously moving the line-of-sight directions of both eyes 5 and 6 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, thereby improving work efficiency.
  • Thereby, even while handling an object with both hands, a predetermined page of the work procedure manual or the like can be displayed.
  • Further, by again closing the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second) and then opening it, the user 2 can stop the operation of the content (image) according to the line-of-sight directions of both eyes 5 and 6.
  • Thus, the user 2 can manipulate the display state of the content (image) visually recognized by the left eye 5 by closing the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second), then opening it, and simultaneously moving the lines of sight of both eyes 5 and 6 up and down or left and right. Further, the CPU 51 can accurately distinguish the operation of the eyelid 6A of the right eye 6 from a physiological blink, and can prevent malfunction of the head mounted display 161.
  • Next, a head mounted display 171 according to the fourth embodiment will be described with reference to FIGS. 15 to 18.
  • In the following description, the same reference numerals as in the first embodiment denote the same or corresponding parts of the head mounted display 1 according to the first embodiment.
  • the schematic configuration of the head mounted display 171 according to the fourth embodiment is substantially the same as the head mounted display 1 (see FIG. 1) according to the first embodiment.
  • the head mounted display 171 is different from the head mounted display 1 according to the first embodiment in that the CCD camera 7 for imaging the left eye 5 of the user 2 is not provided.
  • The head mounted display 171 includes the CCD camera 8, the emission device 11, the input unit 31, the external connection unit 32, the video RAM 34, the font ROM 35, the control unit 36, the power supply unit 37, and the like.
  • the CCD camera 8 images the right eye 6 of the user 2.
  • the emission device 11 emits image light to the left eye 5 of the user 2.
  • the input unit 31 inputs various operations and data.
  • the external connection unit 32 receives content information from the outside.
  • the video RAM 34 stores image data such as contents and text to be displayed by the emission device 11.
  • The font ROM 35 stores font data of the text displayed by the emission device 11.
  • The control unit 36 controls the entire head mounted display 171.
  • control processes of the head mounted display 171 are substantially the same control processes as those of the head mounted display 151 according to the second embodiment.
  • However, the CPU 51 of the head mounted display 171 differs in that, instead of the “content display control process 2” (S111 to S118), it executes a “content display control process 4” described later, which display-controls the content (image) according to the eye movement of the right eye 6 of the user 2.
  • The “content display control process 4” is a process executed by the CPU 51 of the head mounted display 171 according to the fourth embodiment, in which the content (image) displayed by the emission device 11 is display-controlled according to the eye movement of the right eye 6 of the user 2. The “content display control process 4” will be described with reference to FIGS. 17 and 18.
  • The program shown in the flowchart of FIG. 17 is stored in the flash memory 52 of the control unit 36, and is executed by the CPU 51 when a control signal corresponding to an operation of an eye mouse start key (not shown) included in the operation button group 41 is input from the input control circuit 42 to the control unit 36.
  • Further, the execution of the program shown in the flowchart of FIG. 17 is terminated by the CPU 51 when a control signal corresponding to an operation of an eye mouse stop key (not shown) included in the operation button group 41 is input from the input control circuit 42 to the control unit 36.
  • In S311 to S313, the CPU 51 executes the same processing as in S111 to S113 (see FIG. 10).
  • In S313, when the CPU 51 determines that the eyelid 6A of the right eye 6 has not been closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), that is, when the eyelid 6A of the right eye 6 has not been closed for the predetermined time or has only been closed in a blink (S313: NO), the CPU 51 executes the processing from S311 again.
  • the CPU 51 stores, in the RAM 53, image recognition data for the number of frames corresponding to a predetermined time (for example, about 2 seconds) of the right eye 6 on the side where the emission device 11 is not mounted.
  • When the CPU 51 determines that the eyelid 6A of the right eye 6 has been continuously opened after being closed for the number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second) (S313: YES), the process proceeds to S314. In S314, the CPU 51 performs image recognition on the image data stored in the right eye image storage area of the RAM 53 for a predetermined number of frames (for example, about 30 frames), and stores in the RAM 53 the image recognition data for the predetermined number of frames of the right eye 6 of the user 2, that is, the right eye 6 on the side on which the emission device 11 is not mounted.
  • In S315, the CPU 51 reads the image recognition data for the predetermined number of frames of the right eye 6 of the user 2 from the RAM 53 and executes a determination process for determining whether or not the eyeball 6B (see FIG. 18) of the right eye 6 has moved. Note that the CPU 51 determines that the eyeball 6B has moved in each direction when the pupil of the eyeball 6B moves in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124 from the center position of the white of the eye.
  • the CPU 51 detects the moving direction of the eyeball 6B, that is, the moving direction of the pupil of the eyeball 6B, from the image recognition data for the predetermined number of frames of the right eye 6 captured by the CCD camera 8.
  • When the pupil has moved, the CPU 51 determines that the eyeball 6B has moved in that direction, and stores the moving direction in the RAM 53 as the moving direction of the eyeball 6B. When the pupil has not moved, the CPU 51 determines that the eyeball 6B has not moved.
  • For example, when the eyeball 6B of the right eye 6 is moved in the upward direction 121 in a state where the second half of the third page and the first half of the fourth page are displayed, the entire third page is displayed.
  • The scroll amount can be a predetermined number of rows or columns. Further, the moving speed of the eyeball 6B may be recognized, and the scroll amount may be determined according to the recognized speed.
  • When the eyeball 6B of the right eye 6 is moved in the right direction 124 while the second and third pages are displayed, the fourth and fifth pages are displayed. If the eyeball 6B of the right eye 6 is moved in the left direction 123 while the fourth and fifth pages are displayed, the second and third pages are displayed.
  • When it is determined in S315 that the eyeball 6B of the right eye 6 has moved in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, the CPU 51 performs “up scroll”, “down scroll”, “left scroll”, or “right scroll”, respectively, on the content (image) being displayed.
  • When the content is a moving image file and the eyeball 6B of the right eye 6 moves in the left direction 123, that is, “to the left”, the CPU 51 returns the playback position of the moving image file. When the eyeball 6B of the right eye 6 moves in the right direction 124, that is, “to the right”, the CPU 51 advances the playback position of the moving image file.
  • The return/advance amount of the playback position can be one chapter or a predetermined time (for example, about 30 seconds).
  • Further, the CPU 51 may recognize the moving speed of the eyeball 6B and determine the return/advance amount of the playback position according to the recognized speed. As a result, even if the user 2 closes the right eye 6 on the non-wearing side for a certain period of time in order to start an operation, the user 2 can always recognize the moving image with the left eye 5 on the wearing side and can perform the appropriate operation.
  • the CPU 51 executes the same processing as in S312 to S313.
  • When the CPU 51 determines that the eyelid 6A of the right eye 6 has not been closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), that is, when the eyelid 6A of the right eye 6 has not been closed for the predetermined time or has only been closed in a blink (S318: NO), the CPU 51 executes the processing from S314 again.
  • When the eyelid 6A of the right eye 6 has been continuously opened after being closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), the CPU 51 ends the display control by the movement of the eyeball 6B.
  • As described above, the CPU 51 performs image recognition on the captured image of the right eye 6 captured by the CCD sensor 18. Then, when the CPU 51 recognizes that the eyelid 6A of the right eye 6 has been continuously opened after being closed for a predetermined number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), the CPU 51 performs image recognition on the captured image of the right eye 6 captured by the CCD sensor 18 and detects the moving direction of the eyeball 6B of the right eye 6.
  • The CPU 51 then performs display control of the content (image) projected to the left eye 5 based on the movement of the eyeball 6B in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124.
  • Thereafter, when the CPU 51 recognizes that the eyelid 6A of the right eye 6 has again been continuously opened after being closed for the predetermined number of frames, the CPU 51 ends the detection of the moving direction of the eyeball 6B of the right eye 6, ends the display control of the content (image) projected to the left eye 5 by the movement of the eyeball 6B, and performs normal content display.
  • when the user 2 closes the eyelid 6A of the right eye 6, onto which no content is projected, for a predetermined time (for example, about 1 second) and then opens it, moving the line of sight of the right eye 6 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124 scrolls or turns the pages of the content (image), such as a work procedure manual, displayed to the left eye 5 in the vertical or horizontal direction, or changes the reproduction position of the displayed moving image.
  • when the user 2 again closes the eyelid 6A of the right eye 6, onto which no content is projected, for a predetermined time (for example, about 1 second) and then opens it, scroll display and page turning of the content (image), such as a work procedure manual, by the line-of-sight direction of the right eye 6 are canceled. The user 2 can thus prevent unintended changes in the display state of the content (image) caused by changes in the line-of-sight direction of the right eye 6.
  • by closing the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second) and then opening it, the user 2 can manipulate the content (image) by moving the line of sight of the right eye 6 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, so that work efficiency can be improved.
  • as shown in FIG. 15, when a work procedure manual or the like relating to the object 3 is displayed on the head mounted display 171, a predetermined page of the work procedure manual or the like can be displayed by moving the line of sight of the right eye 6 up and down or left and right while handling the object 3 with both hands.
  • when the user 2 again closes the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second) and then opens it, operation of the content (image) according to the line-of-sight direction of the right eye 6 stops, and the user 2 can change the line-of-sight direction of both eyes 5 and 6 while a predetermined page of the work procedure manual or the like remains open.
  • by closing the eyelid 6A of the right eye 6 for a predetermined time (for example, about 1 second) and then opening it, the user 2 can manipulate the display state of the content (image) being viewed by moving the line of sight of the right eye 6 up and down or left and right. Further, the CPU 51 can reliably distinguish this deliberate operation of the eyelid 6A of the right eye 6 from a physiological blink, preventing malfunction of the head mounted display 171.
  • the present invention is not limited to the first to fourth embodiments, and various improvements and modifications can be made without departing from the gist of the present invention. For example, the following modifications are possible.
  • the emission device 11 of the head mounted display 1 may be attached to the right eye 6 side.
  • the CPU 51 may determine whether or not the eyelid of the left eye 5 is closed based on the image data stored in the left eye image storage area of the RAM 53.
  • based on the image data stored in the right eye image storage area of the RAM 53, the CPU 51 may perform display control, such as upward and downward scrolling, of the content (image) displayed to the right eye 6 according to the movement of the eyeball 6B of the right eye 6 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124.
  • the emission device 11 of the head mounted display 151 may be attached to the right eye 6 side.
  • it may be determined whether the eyelid of the left eye 5 is opened again after having been closed for the number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second).
  • the CPU 51 stores in the RAM 53 image recognition data of the left eye 5, on the side where the emission device 11 is not mounted, for the number of frames corresponding to a predetermined time (for example, about 2 seconds).
  • if the eyelid of the left eye 5 is opened again after having been closed for the number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), the CPU 51 may perform display control, such as upward and downward scrolling, of the content (image) displayed to the right eye 6 based on the movement of the eyeball 6B of the right eye 6 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124, using the image data stored in the right eye image storage area of the RAM 53.
  • the emission device 11 of the head mounted display 161 may be attached to the right eye 6 side.
  • it may be determined whether the eyelid of the left eye 5 is opened again after having been closed for the number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second).
  • the CPU 51 stores in the RAM 53 image recognition data of the left eye 5, on the side where the emission device 11 is not attached, for the number of frames corresponding to a predetermined time (for example, about 2 seconds).
  • when the eyelid of the left eye 5 is opened again after having been closed for the number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), the CPU 51 may perform display control, such as upward and downward scrolling, of the content (image) displayed to the right eye 6 based on the movement in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124 determined from the image data stored in the left-eye image storage area and the right-eye image storage area of the RAM 53.
  • the emission device 11 of the head mounted display 171 may be attached to the right eye 6 side.
  • the CCD camera 7 that images the left eye 5 of the user 2 may be provided, and the CCD camera 8 that images the right eye 6 may not be provided.
  • in steps S312 to S313, it may be determined, based on the image data stored in the left eye image storage area of the RAM 53, whether the eyelid of the left eye 5 is opened again after having been closed for the number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second). In S312, however, the CPU 51 stores in the RAM 53 image recognition data of the left eye 5, on the side where the emission device 11 is not mounted, for the number of frames corresponding to a predetermined time (for example, about 2 seconds).
  • in steps S314 to S316, when the eyelid of the left eye 5 is opened again after having been closed for the number of frames (for example, about 30 frames) corresponding to a predetermined time (for example, about 1 second), display control, such as upward and downward scrolling, of the content (image) displayed to the right eye 6 may be performed based on the movement of the eyeball of the left eye 5 in the upward direction 121, the downward direction 122, the left direction 123, or the right direction 124.
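The frame-counting logic running through the passages above — treating the eyelid as deliberately closed only when it stays shut for the full number of frames corresponding to the predetermined time (about 30 frames, roughly 1 second), so that a shorter physiological blink is ignored — can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the threshold value and the function name are assumptions.

```python
# Illustrative sketch of the eyelid-state logic described in the patent text.
# The threshold is an assumption: ~30 frames at 30 fps corresponds to ~1 second.
CLOSED_FRAMES_REQUIRED = 30  # the "predetermined number of frames"

def detect_deliberate_closure(frames):
    """Scan a sequence of per-frame eyelid states (True = closed).

    Returns the index of the frame at which the eyelid reopens after
    having been closed for at least CLOSED_FRAMES_REQUIRED consecutive
    frames, or None if only shorter closures (blinks) occurred or the
    eyelid never reopened.
    """
    run = 0  # length of the current consecutive-closed run
    for i, closed in enumerate(frames):
        if closed:
            run += 1
        else:
            if run >= CLOSED_FRAMES_REQUIRED:
                return i  # eyelid reopened after a deliberate closure
            run = 0  # a shorter run was just a physiological blink
    return None
```

With this scheme a 10-frame blink produces no trigger, while a 30-frame closure followed by reopening arms the gaze-controlled display mode, matching the blink discrimination attributed to the CPU 51.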

Abstract

The invention relates to a head mounted display (1) characterized by comprising: a display unit (11) for projecting an image of display information onto a first eye (5) of a user (2) and causing the user to visually recognize the image; a projection-side image capture unit (7) for capturing an image of the first eye (5) of the user (2); a second-side image capture unit (8) for capturing an image of the second eye (6) of the user; an eyeball movement detection unit (51) for detecting the movement of the eyeball of at least one eye of the user (2) via the projection-side image capture unit (7) and/or the second-side image capture unit (8); an eyelid movement detection unit (51) for detecting a predetermined eyelid movement of the second eye (6) via the second-side image capture unit (8); and a display control unit (51) for controlling the display unit (11) so that the display state of the image is changed according to the eyeball movement detected by the eyeball movement detection unit (51) when the predetermined eyelid movement is detected by the eyelid movement detection unit.
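The control flow in the abstract — the display state changes according to the detected eyeball movement, but only once the predetermined eyelid movement has been detected — can be sketched as a small dispatcher. This is a hypothetical illustration: the direction names, command strings, and class structure are assumptions, not part of the published claims.

```python
# Hypothetical mapping from the detected gaze direction of the second eye
# to display-control commands for the projected content (directions 121-124).
COMMANDS = {
    "up": "scroll_up",        # upward direction 121
    "down": "scroll_down",    # downward direction 122
    "left": "page_back",      # left direction 123
    "right": "page_forward",  # right direction 124
}

class DisplayController:
    """Gaze-driven display control gated by a deliberate eyelid gesture."""

    def __init__(self):
        self.armed = False  # set by the predetermined eyelid movement

    def on_eyelid_gesture(self):
        # Toggle gaze control: arm on one deliberate closure, disarm on the next.
        self.armed = not self.armed

    def on_gaze(self, direction):
        # Ignore gaze movement unless the eyelid gesture has armed the mode,
        # preventing unintended display changes from ordinary eye movement.
        if not self.armed:
            return None
        return COMMANDS.get(direction)
```

The gating mirrors the claim structure: eyeball movement alone never changes the display; it takes effect only after the eyelid movement detection unit has reported the predetermined gesture.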
PCT/JP2009/070151 2008-12-24 2009-12-01 Head mounted display WO2010073879A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-326991 2008-12-24
JP2008326991A JP2010152443A (ja) Head mounted display

Publications (1)

Publication Number Publication Date
WO2010073879A1 true WO2010073879A1 (fr) 2010-07-01

Family

ID=42287497

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/070151 WO2010073879A1 (fr) 2008-12-24 2009-12-01 Head mounted display

Country Status (2)

Country Link
JP (1) JP2010152443A (fr)
WO (1) WO2010073879A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103455746A (zh) * 2013-09-10 2013-12-18 Baidu Online Network Technology (Beijing) Co., Ltd. Head-mounted display device
CN104076512A (zh) * 2013-03-25 2014-10-01 Seiko Epson Corporation Head-mounted display device and control method of head-mounted display device
JP2016177819A (ja) * 2016-04-26 2016-10-06 Kyocera Corporation Display device, control method, and control program
JP2016181264A (ja) * 2016-04-26 2016-10-13 Kyocera Corporation Display device, control method, and control program
CN106484107A (zh) * 2016-09-29 2017-03-08 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Information interaction method and virtual reality glasses
US11681371B2 (en) 2021-06-30 2023-06-20 Tobii Ab Eye tracking system

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5558983B2 (ja) * 2010-09-15 2014-07-23 NEC Casio Mobile Communications, Ltd. Screen display device, screen display control method, screen display control program, and information terminal device
JP6303274B2 (ja) * 2013-03-25 2018-04-04 Seiko Epson Corporation Head-mounted display device and control method of head-mounted display device
JP6094305B2 (ja) * 2013-03-26 2017-03-15 Seiko Epson Corporation Head-mounted display device and control method of head-mounted display device
US9389683B2 (en) 2013-08-28 2016-07-12 Lg Electronics Inc. Wearable display and method of controlling therefor
KR102081933B1 (ko) * 2013-08-28 2020-04-14 LG Electronics Inc. Head mounted display and control method
US9170646B2 (en) * 2013-09-04 2015-10-27 Johnson & Johnson Vision Care, Inc. Ophthalmic lens system capable of interfacing with an external device
KR20230142657A (ko) 2014-03-19 2023-10-11 Intuitive Surgical Operations, Inc. Medical devices, systems, and methods using eye gaze tracking
JP6689203B2 (ja) 2014-03-19 2020-04-28 Intuitive Surgical Operations, Inc. Medical system integrating eye gaze tracking for a stereo viewer
WO2016113951A1 (fr) 2015-01-15 2016-07-21 Sony Interactive Entertainment Inc. Head-mounted display and video display system
JP2016136351A (ja) * 2015-01-23 2016-07-28 Kyocera Corporation Electronic device and control method
JP6231541B2 (ja) 2015-06-25 2017-11-15 QD Laser, Inc. Image projection device
JP6554948B2 (ja) * 2015-07-07 2019-08-07 Seiko Epson Corporation Display device, display device control method, and program
JP6231585B2 (ja) * 2016-01-05 2017-11-15 QD Laser, Inc. Image projection device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001100903A (ja) * 1999-09-28 2001-04-13 Sanyo Electric Co Ltd Device with line-of-sight detection function
JP2003230539A (ja) * 2002-02-07 2003-08-19 Minolta Co Ltd Line-of-sight detection device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KOHEI ARAI ET AL.: "Shisen Nyuryoku ni yoru Kaiwa Shien System" [Conversation Support System Using Gaze Input], THE TRANSACTIONS OF THE INSTITUTE OF ELECTRICAL ENGINEERS OF JAPAN C, vol. 128, no. 11, 1 November 2008 (2008-11-01), pages 1679 - 1686 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104076512A (zh) * 2013-03-25 2014-10-01 Seiko Epson Corporation Head-mounted display device and control method of head-mounted display device
EP2784632A3 (fr) * 2013-03-25 2016-02-24 Seiko Epson Corporation Dispositif d'affichage monté sur la tête et procédé de commande dudit dispositif
US9335547B2 (en) 2013-03-25 2016-05-10 Seiko Epson Corporation Head-mounted display device and method of controlling head-mounted display device
US9921646B2 (en) 2013-03-25 2018-03-20 Seiko Epson Corporation Head-mounted display device and method of controlling head-mounted display device
CN103455746A (zh) * 2013-09-10 2013-12-18 Baidu Online Network Technology (Beijing) Co., Ltd. Head-mounted display device
JP2016177819A (ja) * 2016-04-26 2016-10-06 Kyocera Corporation Display device, control method, and control program
JP2016181264A (ja) * 2016-04-26 2016-10-13 Kyocera Corporation Display device, control method, and control program
CN106484107A (zh) * 2016-09-29 2017-03-08 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Information interaction method and virtual reality glasses
US11681371B2 (en) 2021-06-30 2023-06-20 Tobii Ab Eye tracking system

Also Published As

Publication number Publication date
JP2010152443A (ja) 2010-07-08

Similar Documents

Publication Publication Date Title
WO2010073879A1 (fr) Head mounted display
US8514148B2 (en) Head mount display
JP5104679B2 (ja) Head mounted display
WO2010071110A1 (fr) Head mounted display
JP5223689B2 (ja) Head-mounted display device and driving method thereof
JP6089705B2 (ja) Display device and method of controlling display device
EP3091387A1 (fr) Autofocusing head mounted display device
EP2163937A1 (fr) Head mounted display
JP5195537B2 (ja) Head mounted display
WO2010107072A1 (fr) Head mounted display
JP2010139901A (ja) Head mounted display
JP5012781B2 (ja) Head mounted display
JP6903998B2 (ja) Head mounted display
JP2011075956A (ja) Head mounted display
JP2011091789A (ja) Head mounted display
JP2010102215A (ja) Display device, image processing method, and computer program
JP2003225207A (ja) Line-of-sight detection device
JP2017009986A (ja) Image projection device
JP2010085786A (ja) Head-mounted display device
JP2006017991A (ja) Video display device
JP2010067154A (ja) Head mounted display, information browsing system, and management server
JP5251813B2 (ja) Work support system, head mounted display, and program
US20170261750A1 (en) Co-Aligned Retinal Imaging And Display System
JP5163535B2 (ja) Head mounted display
JP6394165B2 (ja) Virtual image display device and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09834683

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09834683

Country of ref document: EP

Kind code of ref document: A1