WO2016072271A1 - Display device, method for controlling display device, and control program therefor - Google Patents

Display device, method for controlling display device, and control program therefor Download PDF

Info

Publication number
WO2016072271A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
unit
detection
movement
screen
Prior art date
Application number
PCT/JP2015/079776
Other languages
French (fr)
Japanese (ja)
Inventor
加藤 剛
真志 水金
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 filed Critical コニカミノルタ株式会社
Publication of WO2016072271A1 publication Critical patent/WO2016072271A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning

Definitions

  • the present invention relates to a display device, a display device control method, and a control program therefor.
  • A typical portable terminal has a touch panel screen that serves both as an image display and as a user interface. By touching this screen, a user can perform the input needed to display a desired image or to enter information. However, there are cases where the user wants to operate the mobile device without touching the screen, for example when the user's hand is wet or dirty, and how to perform input in such cases is an issue.
  • Patent Document 1 discloses a mobile computing device including an infrared LED and an infrared proximity sensor. According to the technique of Patent Document 1, infrared light emitted from the infrared LED is reflected by a hand brought close to the computing device, and the reflected light is detected by the infrared proximity sensor, whereby the movement of the hand can be detected. A desired input such as a swipe can thus be performed in a non-contact manner by a hand movement (gesture), so there is no risk of contaminating the computing device even if the user's hand is dirty.
  • Patent Document 2 discloses an example of a non-contact user interface in which, when a user's hand enters the imaging range of a camera, the hand is extracted from the captured image and its gesture is recognized.
  • Gesture input is also suited to a head mounted display (hereinafter referred to as an "HMD"). Since a typical HMD has its display placed in front of the user's eyes, it is inherently difficult to perform input by touching the screen, so gesture input can be said to be preferable.
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to provide a display device, a display device control method, and a control program thereof that can easily and quickly display a desired screen.
  • The display device of the present invention includes: a display unit having a display member with a screen capable of displaying an image; a detection device that has a detection region and generates an output by detecting the direction in which an indication unit moved by the user moves within the detection region; and a control device that controls the screen display of the display unit based on the output of the detection device. When the control device detects, based on the output of the detection device, that the indication unit has moved in the detection region, it sets the movement direction of the indication unit as a first movement direction and performs screen display control corresponding to the first movement direction. Furthermore, a timer starts counting from the time the first movement direction is detected, and even if, at least once while the timer is running and before the specified time elapses, the detection device generates an output indicating that the indication unit has moved in a movement direction different from the first movement direction, the screen display control corresponding to that different movement direction is skipped.
  • The display device control method of the present invention is a control method for a display device that includes a display unit having a display member with a screen capable of displaying an image, and a detection device that has a detection region and generates an output by detecting the direction in which an indication unit moved by the user moves within the detection region, the method controlling the screen display of the display unit based on the output of the detection device. When it is detected, based on the output of the detection device, that the indication unit has moved in the detection region, the movement direction of the indication unit is set as a first movement direction and screen display control corresponding to the first movement direction is performed. Furthermore, even if, at least once from the time the first movement direction is detected until the specified time elapses from the start of timing, the detection device generates an output indicating that the indication unit has moved in a movement direction different from the first movement direction, the screen display control corresponding to that different movement direction is skipped.
  • The control program for a display device of the present invention is a control program for a display device that includes a display unit having a display member with a screen capable of displaying an image, a detection device that has a detection region and generates an output by detecting the direction in which an indication unit moved by the user moves within the detection region, and a control device that controls the screen display of the display unit based on the output of the detection device. The program causes the control device to: set, when it is detected based on the output of the detection device that the indication unit has moved in the detection region, the movement direction of the indication unit as a first movement direction and perform screen display control corresponding to the first movement direction; and skip the screen display control corresponding to a different movement direction even if, at least once from the time the first movement direction is detected until the specified time elapses from the start of timing, the detection device generates an output indicating that the indication unit has moved in a movement direction different from the first movement direction.
  • The present invention provides a display device, a control method for the display device, and a control program for the display device that allow a user to easily and quickly perform an intended screen display operation such as screen scrolling or a continuous screen transition.
  • HMD: head mounted display
  • FIG. 1 is a perspective view of a head mounted display (HMD) according to a first embodiment. FIG. 2 is a front view of the HMD. FIG. 3 is a top view of the HMD. FIG. 4 is a schematic sectional view showing the structure of the display unit. FIG. 5 is an enlarged view of the proximity sensor. FIG. 6 is a block diagram of the main circuits of the HMD. FIG. 7 is a front view of a user wearing the HMD on the head. FIG. 8 shows a side view (a) and a partial top view (b) of a user wearing the HMD on the head. FIG. 9 shows the image a user visually recognizes through the see-through image display unit. FIGS. 10 and 11 show examples of the signal waveforms output by the light receiving areas.
  • FIG. 1 is a perspective view of an HMD 100 including a display device according to the present embodiment.
  • FIG. 2 is a front view of the HMD 100 according to the present embodiment.
  • FIG. 3 is a view of the HMD 100 according to the present embodiment as viewed from above.
  • the right side and the left side of the HMD 100 refer to the right side and the left side for the user wearing the HMD 100.
  • the HMD 100 of the present embodiment includes a frame 101 and a display unit 104 that performs screen display.
  • the frame 101 constitutes a mounting member that holds the display unit 104 on the head so as to be positioned in front of the eyes.
  • a frame 101 that is U-shaped when viewed from above has a front part 101a to which two spectacle lenses 102 are attached, and side parts 101b and 101c extending rearward from both ends of the front part 101a.
  • the two spectacle lenses 102 attached to the frame 101 may or may not have refractive power.
  • The cylindrical main body 103 is fixed to the front part 101a of the frame 101, above the spectacle lens 102 on the right side (or on the left side, according to the user's dominant eye).
  • a display unit 104 is provided in the main body 103 which is a part of the mounting member.
  • Inside the main body 103, a display control unit 104DR (see FIG. 6 described later) that controls display of the display unit 104 based on instructions from the processor 121 described later is disposed. If necessary, display units may be arranged in front of both eyes.
  • FIG. 4 is a schematic cross-sectional view showing the configuration of the display unit 104.
  • the display unit 104 includes an image forming unit 104A and an image display unit 104B.
  • the image forming unit 104A is incorporated in the main body unit 103, and includes a light source 104a, a unidirectional diffuser 104b, a condenser lens 104c, and a display element 104d.
  • The image display unit 104B, which is a so-called see-through display member, is a generally plate-shaped unit that extends downward from the main body unit 103 so as to lie parallel to one spectacle lens 102 (see FIG. 1), and comprises an eyepiece prism 104f, a deflecting prism 104g, and a hologram optical element 104h.
  • the light source 104a has a function of illuminating the display element 104d.
  • Because the light source 104a emits light having a predetermined wavelength width, the image light obtained by illuminating the display element 104d also has a predetermined wavelength width, and when the hologram optical element 104h diffracts this image light, the user can observe the image over the entire observation angle of view at the position of the pupil B. Further, the peak wavelength of each color of the light source 104a is set near the peak wavelength of the diffraction efficiency of the hologram optical element 104h, which improves the light use efficiency.
  • If the light source 104a is composed of LEDs that emit RGB light, the cost of the light source 104a can be reduced, and when the display element 104d is illuminated, a color image can be displayed on the display element 104d and visually recognized by the user.
  • Since each of the RGB LED elements has a narrow emission wavelength width, using a plurality of such LED elements enables high color reproducibility and bright image display.
  • The display element 104d displays an image by modulating the light emitted from the light source 104a in accordance with image data, and is configured as a transmissive liquid crystal display element having pixels serving as light transmitting regions arranged in a matrix. Note that the display element 104d may instead be of a reflective type.
  • The eyepiece prism 104f guides the image light from the display element 104d, incident through its base end face PL1, to the user's pupil via the hologram optical element 104h by total internal reflection between its opposed parallel inner side face PL2 and outer side face PL3. It also transmits external light to the user's pupil, and is made of, for example, acrylic resin together with the deflecting prism 104g.
  • the eyepiece prism 104f and the deflection prism 104g are joined by an adhesive with the hologram optical element 104h sandwiched between inclined surfaces PL4 and PL5 inclined with respect to the inner surface PL2 and the outer surface PL3.
  • The deflecting prism 104g is joined to the eyepiece prism 104f, forming a substantially parallel flat plate integrated with it; this joining makes it possible to prevent distortion in the external image observed by the user through the display unit 104.
  • The hologram optical element 104h is a volume phase type reflection hologram that diffracts and reflects the image light (light having wavelengths corresponding to the three primary colors) emitted from the display element 104d, guides it to the pupil B, and thereby presents the image displayed on the display element 104d to the user's pupil as an enlarged virtual image.
  • The hologram optical element 104h diffracts (reflects) light in, for example, three wavelength ranges: 465±5 nm (B light), 521±5 nm (G light), and 634±5 nm (R light), each expressed as a peak wavelength of diffraction efficiency plus or minus the half-maximum wavelength width. Here, the peak wavelength of diffraction efficiency is the wavelength at which the diffraction efficiency reaches its peak, and the half-maximum wavelength width is the wavelength width over which the diffraction efficiency is at least half of that peak.
  • The reflection type hologram optical element 104h has high wavelength selectivity: it diffracts and reflects only light with wavelengths in the above ranges (near the exposure wavelengths), while light of other wavelengths is transmitted through it, so a high external light transmittance can be realized.
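The wavelength selectivity described above can be expressed as a small check. A minimal sketch, assuming the three bands quoted in the text (465±5, 521±5, 634±5 nm); the constant and function names are illustrative, not from the patent:

```python
# Diffraction bands of the reflection hologram, per the text:
# peak wavelength and half-maximum half-width, in nanometres.
DIFFRACTION_BANDS_NM = {"B": (465, 5), "G": (521, 5), "R": (634, 5)}

def is_diffracted(wavelength_nm: float) -> bool:
    """True if light of this wavelength falls within a diffraction band
    (reflected toward the pupil); otherwise it is transmitted as external light."""
    return any(abs(wavelength_nm - center) <= half_width
               for center, half_width in DIFFRACTION_BANDS_NM.values())
```

Light outside all three narrow bands passes straight through, which is why the element can overlay image light on the external scene.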
  • the light emitted from the light source 104a is diffused by the unidirectional diffusion plate 104b, condensed by the condenser lens 104c, and enters the display element 104d.
  • the light incident on the display element 104d is modulated for each pixel based on the image data input from the display control unit 104DR, and is emitted as image light. Thereby, a color image is displayed on the display element 104d.
  • Image light from the display element 104d enters the eyepiece prism 104f from its base end face PL1, is totally reflected a plurality of times by the inner side face PL2 and the outer side face PL3, and enters the hologram optical element 104h.
  • the light incident on the hologram optical element 104h is reflected there, passes through the inner side surface PL2, and reaches the pupil B.
  • the user can observe an enlarged virtual image of the image displayed on the display element 104d, and can visually recognize it as a screen formed on the image display unit 104B.
  • “screen” may refer to an image to be displayed.
  • the eyepiece prism 104f, the deflecting prism 104g, and the hologram optical element 104h transmit almost all of the external light, the user can observe an external field image (real image) through them. Therefore, the virtual image of the image displayed on the display element 104d is observed so as to overlap with a part of the external image. In this manner, the user of the HMD 100 can simultaneously observe the image provided from the display element 104d and the external image via the hologram optical element 104h. Note that when the display unit 104 is in the non-display state, the image display unit 104B is transparent, and only the external image can be observed.
  • In the present embodiment, the display unit 104 is configured by combining a light source, a liquid crystal display element, and an optical system; however, a self-luminous display element (for example, an organic EL display element) may be used instead. Further, a transmissive organic EL display panel that is transparent in the non-light-emitting state may be used.
  • On the front of the main body 103, a proximity sensor 105 constituting a detection device is arranged near the center, and a lens 106a of a camera 106 is arranged near the side, both facing forward. When the user passes a hand in front of the proximity sensor 105, screen display control such as screen switching is performed according to the direction in which the hand passes.
  • In this specification, a "proximity sensor" is a sensor that outputs a signal by detecting whether an object, for example a part of the human body, is present in a detection region within the proximity range in front of the sensor's detection surface, in order to detect that the object has come close to the user's eyes.
  • The proximity range may be set as appropriate according to the operator's characteristics and preferences; for example, the proximity range from the detection surface of the proximity sensor may be within 200 mm. If the distance from the proximity sensor is 200 mm or less, the user can move the palm in and out of the field of view with a bent arm, and can easily operate the device by hand gestures.
  • When an object enters the detection region within the proximity range in front of the proximity sensor, the proximity sensor outputs an effective signal to the control device, and the control device determines from that signal that the object is present in the proximity range.
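The proximity decision above amounts to a range test. A minimal sketch, assuming the 200 mm figure given in the text as the configurable limit; all names are illustrative:

```python
# Proximity range from the detection surface, per the text (adjustable per user).
PROXIMITY_RANGE_MM = 200

def in_proximity(distance_mm: float) -> bool:
    """True if an object at this distance from the detection surface should
    cause the sensor to output an effective signal to the control device."""
    return 0 <= distance_mm <= PROXIMITY_RANGE_MM
```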
  • a passive proximity sensor has a detection unit that detects invisible light and electromagnetic waves emitted from an object when the object approaches.
  • a passive proximity sensor there are a pyroelectric sensor that detects invisible light such as infrared rays emitted from an approaching human body, and a capacitance sensor that detects a change in electrostatic capacitance between the approaching human body and the like.
  • the active proximity sensor includes an invisible light and sound wave projection unit, and a detection unit that receives the invisible light and sound wave reflected and returned from the object.
  • Active proximity sensors include infrared sensors that project infrared rays and receive the rays reflected by an object, laser sensors that project a laser beam and receive the beam reflected by an object, and ultrasonic sensors that project ultrasonic waves and receive the waves reflected by an object. Note that passive proximity sensors excel in low power consumption, while active proximity sensors make reliable detection easier; for example, an active sensor can detect the hand even when the user is wearing a glove opaque to infrared light. A plurality of types of proximity sensors may also be used in combination.
  • Proximity sensors are generally smaller and cheaper than cameras, and consume less power.
  • Although a proximity sensor cannot perform complicated detection such as detecting the shape of an object, it can determine the approach or withdrawal of an object, so the HMD can be operated by passing or holding up the hand.
  • Moreover, the complicated image processing required for gesture recognition by analyzing images captured by a camera becomes unnecessary.
  • The "indication unit moved by the user" refers to an object that is moved according to the user's intention, such as a part of the user's body (for example, a hand or a finger).
  • FIG. 5 is an enlarged view of the proximity sensor 105 used in the present embodiment as viewed from the front.
  • The proximity sensor 105 includes a light receiving unit 105a that receives, as detection light, invisible light such as infrared light emitted from the human body. The light receiving unit 105a has light receiving areas RA to RD arranged in 2 rows and 2 columns; when detection light is received, each of the light receiving areas RA to RD individually outputs a corresponding signal.
  • The right sub-main body 107 is attached to the right side portion 101b of the frame 101, and the left sub-main body 108 is attached to the left side portion 101c of the frame 101. The right sub-main body 107 and the left sub-main body 108 each have an elongated plate shape, with elongated protrusions 107a and 108a on their respective inner sides. By engaging the elongated protrusion 107a with the side portion 101b, the right sub-main body 107 is attached to the frame 101 in a positioned state, and by engaging the elongated protrusion 108a with the side portion 101c, the left sub-main body 108 is attached to the frame 101 in a positioned state.
  • Inside the right sub-main body 107 are provided a geomagnetic sensor 109 for detecting geomagnetism and a gyro and acceleration sensor 110 for detecting angular velocity and acceleration, while a speaker (or earphone) 111A and a microphone 111B are provided in the left sub-main body 108.
  • The main body 103 and the right sub-main body 107 are connected via a wiring HS so that signals can be transmitted, and the main body 103 and the left sub-main body 108 are likewise connected via a wiring (not shown). As schematically illustrated in FIG. 1, the right sub-main body 107 is connected to a control unit CTU via a cord CD extending from its rear end.
  • a 6-axis sensor in which a gyro and an acceleration sensor are integrated may be used.
  • the HMD can be operated by sound based on an output signal generated from the microphone 111B according to the input sound.
  • the main main body 103 and the left sub main body 108 may be configured to be wirelessly connected.
  • FIG. 6 is a block diagram of main circuits of the HMD 100.
  • The control unit CTU includes: a processor 121 as a control device; an operation unit 122; a GPS reception unit 123 that receives radio waves from GPS satellites; a communication unit 124 that exchanges data with the outside; a ROM 125 that stores programs and the like; a RAM 126 for storing image data and the like; a battery 127 and a power supply circuit 128 for supplying power to each unit; and a storage device 129 such as an SSD or flash memory.
  • The processor 121 may be an application processor of the kind used in smartphones, but the type of the processor 121 is not limited; such a processor is well suited to a small HMD.
  • the processor 121, the proximity sensor 105, the display control unit 104DR, and the display unit 104 constitute a display device.
  • When the light receiving unit 105a detects the detection light emitted from the human body, its signal is input to the processor 121.
  • the processor 121 controls image display on the display unit 104 via the display control unit 104DR.
  • The processor 121 receives power from the battery 127 via the power supply circuit 128, operates according to a program stored in at least one of the ROM 125 and the storage device 129, responds to operation inputs such as power-on from the operation unit 122, can store input data such as image data in the RAM 126, and can communicate with the outside via the communication unit 124 as necessary. Furthermore, as will be described later, it can detect a gesture operation in a non-contact manner and execute image control corresponding to the detected gesture operation. The processor 121 can also execute a lock mode in accordance with a control program stored in at least one of the ROM 125 and the storage device 129; details of the lock mode will be described later.
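The division of labour described above (the processor 121 deciding, the display control unit 104DR driving the display unit 104) can be sketched roughly as follows. All class and method names are assumptions of this sketch; the patent does not specify a software API:

```python
class DisplayControlUnit:
    """Stand-in for the display control unit 104DR: shows whatever the
    processor tells it to show."""
    def show(self, image_id: str) -> None:
        self.current = image_id

class Processor:
    """Stand-in for the processor 121: turns a detected swipe direction
    into a page-turn command for the display control unit."""
    def __init__(self, display_ctrl: DisplayControlUnit):
        self.display_ctrl = display_ctrl

    def on_gesture(self, direction: str, images: list[str], index: int) -> int:
        # Page forward on a right-to-left pass, back on a left-to-right pass.
        if direction == "right_to_left":
            index = min(index + 1, len(images) - 1)
        elif direction == "left_to_right":
            index = max(index - 1, 0)
        self.display_ctrl.show(images[index])
        return index
```

The mapping of pass direction to page direction is one plausible choice, mirroring the FIG. 9 example of page turning from image G1 to G2.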
  • FIG. 7 is a front view when the user US wears the HMD 100 of the present embodiment on the head.
  • FIG. 8 is a side view (a) and a top view (b) when the user US wears the HMD 100 on the head, and shows it together with the user's hand.
  • FIG. 9 is a diagram showing an image visually recognized by the user US through the see-through type image display unit 104B.
  • the gesture operation is an operation in which at least the hand HD of the user US approaches or separates from the proximity sensor 105 and can be detected by the processor 121 of the HMD 100 via the proximity sensor 105.
  • the screen 104i of the image display unit 104B is arranged so as to overlap the effective visual field EV of the user's eye facing the image display unit 104B (here, positioned in the effective visual field EV).
  • the detection area SA of the proximity sensor 105 is in the visual field of the user's eye facing the image display unit 104B.
  • the detection area SA is located within the stable focus field of the user's eye or inside the visual field (within about 90 ° horizontal and within about 70 ° vertical), and more preferably located inside the stable focus field.
  • the proximity sensor 105 may be installed with its arrangement and orientation adjusted so as to overlap with the effective visual field EV or the inner visual field (horizontal within about 30 °, vertical within about 20 °).
  • FIG. 9 shows an example in which the detection area SA overlaps the screen 104i.
  • By setting the detection region SA of the proximity sensor 105 within the visual field of the eye of the user US while the user US wears the frame 101, which is a mounting member, on the head, the user can reliably recognize the approach and withdrawal of the hand to and from the detection area SA without eye movement, while observing the hand through the screen 104i.
  • The gesture operation can thus be performed reliably while the user recognizes the detection area SA, and if the detection area SA overlaps the screen 104i, the gesture operation can be performed still more reliably.
  • When the proximity sensor has a plurality of light receiving regions, as in the present embodiment, the plurality of light receiving regions as a whole are regarded as one light receiving unit, and the maximum detection range of that light receiving unit is regarded as the detection region.
  • When the detection area SA of the proximity sensor 105 is set to overlap the screen 104i, if an image indicating the detection area SA is displayed on the screen 104i (for example, the boundary of the area SA is displayed as a solid line), the user can reliably recognize the detection area SA, so the gesture operation can be performed still more reliably.
  • Detection of a gesture operation proceeds as follows. If there is nothing in front of the user US, the light receiving unit 105a receives no detection light, so the processor 121 determines that no gesture operation is being performed. On the other hand, as shown in FIG. 8, when the user US brings the hand HD close in front of the user, the detection light emitted from the hand HD can be detected by the light receiving unit 105a, as indicated by the dotted line; the processor 121 accordingly determines that a gesture operation has been performed.
  • As described above, the light receiving unit 105a has the light receiving areas RA to RD arranged in 2 rows and 2 columns (see FIG. 5). Therefore, when the user US moves the hand HD toward the front of the HMD 100 from the left, right, top, or bottom, the output timings of the signals detected in the light receiving areas RA to RD differ.
  • FIGS. 10 and 11 show examples of the signal waveforms of the light receiving areas RA to RD, where the vertical axis represents signal intensity and the horizontal axis represents time.
  • The processor 121 detects the timing of these signals and determines that the user US has performed a gesture operation by moving the hand HD from right to left (here, passing through the detection region). Accordingly, the processor 121 controls the display control unit 104DR to change the display, for example performing page turning from the image G1 to the image G2 in accordance with the movement of the hand HD (FIG. 9 shows the display partway through the switch, with the image G1 sliding out as it is replaced by the image G2).
  • In the other example, the signals of the light receiving areas RA and RB rise first, the signals of the light receiving areas RC and RD rise after a delay, and then the signals of the light receiving areas RA and RB fall, followed by the signals of the light receiving areas RC and RD.
  • the processor 121 detects the timing of this signal, and determines that the user US has performed the gesture operation by moving the hand HD from the top to the bottom (here, passing through the detection region).
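The timing-based direction judgment described above can be sketched as follows, using only the orderings stated in the text (RA/RB rising before RC/RD means top-to-bottom; RB/RD rising before RA/RC means left-to-right). The rise-time representation, the summing of paired areas, and the tie-breaking are assumptions of this sketch:

```python
def movement_direction(rise: dict[str, float]) -> str:
    """Infer the hand's pass direction from the times (arbitrary units,
    smaller = earlier) at which areas RA-RD start outputting a signal."""
    top, bottom = rise["RA"] + rise["RB"], rise["RC"] + rise["RD"]   # rows
    lead, trail = rise["RB"] + rise["RD"], rise["RA"] + rise["RC"]   # columns
    # Whichever axis shows the larger timing difference is taken as the axis
    # of movement (an assumption; the patent does not specify a tie-break).
    if abs(top - bottom) >= abs(lead - trail):
        return "top_to_bottom" if top < bottom else "bottom_to_top"
    return "left_to_right" if lead < trail else "right_to_left"
```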
  • the presence and movement of the hand HD can be reliably detected by using the proximity sensor and positioning the detection region within the visual field of the user's eye facing the image display unit. Accordingly, the user US can view the hand operation and the operation in conjunction with each other, and can realize an intuitive operation. Furthermore, since the proximity sensor is small and consumes less power than the camera, the continuous operation time of the HMD 100 can be extended.
  • To reduce power consumption, detection of gesture operations may be temporarily interrupted by stopping execution of the gesture detection process by the processor 121, or by stopping energization of the proximity sensor.
  • the operation for detecting the gesture operation can be started and stopped by any means (for example, the operation unit 122).
  • the processor 121 may start and stop the operation for detecting the gesture operation by detecting the gesture operation.
  • For example, if the proximity sensor 105 is operated intermittently and the user US brings the hand HD in front of the proximity sensor 105 and holds it there for a predetermined time (for example, about 1 second), the light receiving unit 105a continuously outputs an intermittent signal during that time; the processor 121, detecting this, can start the control for detecting gesture operations, or return the proximity sensor 105 to the normal detection operation.
  • the above gesture operation can also be used to cancel the sleep mode.
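The hold-to-wake behaviour above (hand held for about one second in front of the intermittently driven sensor) might be approximated like this; the sampling period and the list-of-samples interface are assumptions of this sketch:

```python
HOLD_TIME_S = 1.0       # hold duration suggested in the text ("about 1 second")
SAMPLE_PERIOD_S = 0.1   # assumed period of the intermittent sensor drive

def should_wake(samples: list[bool]) -> bool:
    """samples: recent presence readings, oldest first, one per sample period.
    Wake (start gesture detection / leave sleep mode) once presence has been
    continuous for HOLD_TIME_S."""
    needed = round(HOLD_TIME_S / SAMPLE_PERIOD_S)
    return len(samples) >= needed and all(samples[-needed:])
```

A brief pass of the hand produces only a short run of presence samples and is ignored, so ordinary gestures do not trigger the wake action.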
  • Since the light receiving unit 105a of the proximity sensor 105 can detect the relative amount of detection light received, moving the hand HD toward or away from the proximity sensor 105 changes the amount of received light, and from this change the processor 121 can distinguish different movements of the hand HD. It is conceivable to detect such a movement and assign it to the operation of starting or stopping gesture detection.
  • Since the proximity sensor 105 may, owing to its detection characteristics, detect an object other than the hand HD of the user US (for example, another person's hand), it is also preferable, from the viewpoint of reducing malfunctions, to use means other than hand detection for starting and stopping the gesture detection operation.
  • Using a voice recognition function, for example, speech such as "start" and "stop" uttered by the user US may be acquired by the microphone 111B, and the processor 121 may analyze and recognize the resulting output signal from the microphone to start or stop the gesture detection operation.
  • the volume can be adjusted by voice recognition.
  • with gesture recognition using a proximity sensor or a camera, if the user reciprocates the hand within the detection area of the proximity sensor to perform continuous scrolling or page turning in one direction, there is a risk that the motion will be recognized as reciprocating scrolling or reciprocating page turning. One could move the hand from one end of the detection area to the other end, withdraw it from the detection area, move it back to the first end outside the detection area, and then move it across the detection area again; however, this forces a complicated operation on the user and degrades usability.
  • the movement direction of the user's hand is stored as the first movement direction.
  • the screen display control according to the first movement direction is performed, and even if the output of the proximity sensor 105 indicating that the hand has passed in a movement direction different from the first movement direction is generated at least once between the detection of the first movement direction and the elapse of the specified time, the screen display control corresponding to the different movement direction is skipped. Accordingly, the user can easily and quickly perform the intended screen display operation, such as screen scrolling and continuous screen transition, without being forced to perform complicated operations.
  • in step S101 of the flowchart of FIG. 12, the processor 121 waits until the light receiving unit 105a detects the detection light.
  • in step S102, the approach direction of the hand HD is determined from the timing of the signals in the areas RA to RD as follows.
  • Signal rise in areas RB and RD first, followed by signal rise in areas RA and RC: the hand HD has entered in the direction from left to right.
  • in step S103, the processor 121 stores the determined approach direction of the hand HD.
  • in step S104, the processor 121 waits until the light receiving unit 105a no longer detects the detection light. When the detection light is no longer detected, the processor 121 determines, in step S105, the withdrawal direction of the hand HD from the timing of the signals in the areas RA to RD as follows. (5) Signal fall in areas RA and RC first, followed by signal fall in areas RB and RD: the hand HD has left in the direction from right to left. (6) Signal fall in areas RB and RD first, followed by signal fall in areas RA and RC: the hand HD has left in the direction from left to right.
  • in step S106, the processor 121 determines whether or not the entry direction and the withdrawal direction of the hand HD match. If they do not match, there is a possibility that the gesture operation was erroneously detected; in this example, the gesture operation is not accepted and the process proceeds to step S123. Note that the case where the entry direction and the withdrawal direction of the hand HD differ may instead be detected and used for another control. Alternatively, the time from the entry to the withdrawal of the hand HD may be measured, and the gesture operation may be judged valid only if the movement was completed within a predetermined time; for example, when a signal indicating withdrawal is not output within one second of the signal indicating entry, the gesture operation may be ignored or judged to be a gesture operation of another pattern.
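As an illustrative sketch (not the actual control program of the embodiment), the timing comparison of steps S102 to S106 can be modeled as follows. The pairing of the light receiving areas (RB/RD responding first on a left-to-right pass, RA/RC first on a right-to-left pass) follows patterns (5) and (6); the function names and the use of timestamps are assumptions for illustration.

```python
# Hypothetical sketch of steps S102-S106: infer the hand's travel direction
# across the light receiving areas RA-RD from response timestamps.
# Per patterns (5)/(6), RB/RD respond first on a left-to-right pass and
# RA/RC respond first on a right-to-left pass; the same comparison applies
# to signal rises (entry) and signal falls (withdrawal).

def passage_direction(times_rb_rd, times_ra_rc):
    """Compare the earliest response in each region pair."""
    if min(times_rb_rd) < min(times_ra_rc):
        return "left_to_right"
    return "right_to_left"

def infer_pass(rise_rb_rd, rise_ra_rc, fall_rb_rd, fall_ra_rc):
    """Accept the gesture only if entry and withdrawal directions match
    (step S106); otherwise report None as a possible false detection."""
    entry = passage_direction(rise_rb_rd, rise_ra_rc)
    withdrawal = passage_direction(fall_rb_rd, fall_ra_rc)
    return entry if entry == withdrawal else None
```

A pass whose entry and withdrawal directions disagree yields None, corresponding to the branch to step S123.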
  • when the processor 121 determines in step S106 that the entry direction and the withdrawal direction of the hand HD are the same, the processor 121 determines in step S107 whether the built-in lock mode timer is operating.
  • the lock mode timer operates for a specified time exceeding 0 seconds (for example, 0.5 seconds) from start to stop, and the lock mode is set while the timer is operating. Since there are individual differences in the moving speed and behavior of the hand HD, the specified time for setting the lock mode can be adjusted to optimize it for each user. If the specified time is too long, a user who wants to scroll in the reverse direction must wait longer; it is therefore desirable that the specified time be at most 3 seconds, and preferably 1 second or less. This specified time can be adjusted according to the user's preference, thereby providing a user-friendly HMD.
  • when the hand HD is detected for the first time, since the timer is not yet operating (the lock mode is not set), the processor 121 resets and starts the timer in step S108 to begin the time measurement and sets the lock mode.
  • the processor 121 further determines the gesture operation in step S109. Specifically, when the processor 121 determines that the gesture operations (1) and (5) have been performed in succession, it controls the display unit 104 via the display control unit 104DR in step S110 so that the screen moves from right to left (page feed or scroll). When the processor 121 determines that the gesture operations (2) and (6) have been performed in succession, it controls the display unit 104 via the display control unit 104DR in step S111 so that the screen moves from left to right. When the processor 121 determines that the gesture operations (3) and (7) have been performed in succession, it controls the display unit 104 via the display control unit 104DR in step S112 so that the screen moves from top to bottom. When the processor 121 determines that the gesture operations (4) and (8) have been performed in succession, it controls the display unit 104 via the display control unit 104DR in step S113 so that the screen moves from bottom to top.
  • the direction in which the user's hand enters or leaves the detection area is determined from the timing at which the plurality of light receiving areas detect the user's hand, and the screen display is controlled according to the determined direction, so the variations of control based on the signal from the proximity sensor can be increased.
  • the user US can view the movement of the hand HD entering and leaving in front of the user through the image display unit 104B, and page turning and scrolling of the image are performed in synchronization with, and in the same direction as, the movement of the hand. The desired operation can thus be performed intuitively, making the device easy to use.
  • the processor 121 stores the detected movement direction of the hand HD (the direction of passage through the detection area SA) as the first movement direction in the RAM 126 or a register area in the processor 121, and then moves the flow to step S123.
  • when the processor 121 determines in step S107 that the timer is operating, since the lock mode has already been set, the process proceeds to step S115, where it is determined whether or not the movement direction of the hand HD detected immediately before differs from the first movement direction.
  • if the movement directions differ, the processor 121 determines in step S116 whether the detected movement direction of the hand HD intersects the first movement direction. If the processor 121 determines that the detected movement direction does not intersect the first movement direction (that is, the hand moved in the opposite direction), the processor 121 shifts the flow to step S123. That is, while the lock mode is set, even if the hand HD is moved in the direction opposite to the first movement direction, the processor 121 skips the screen display control corresponding to that different movement direction, and no page feed or scrolling is performed in the reverse direction.
  • while the lock mode is set (while the timer is operating), an icon IND (see FIG. 13) indicating the lock mode setting can be displayed on the image display unit 104B, which enhances the convenience of the user US. It is desirable for the arrow included in the icon IND to indicate the first movement direction, because the user can then recognize that gesture operations in other directions are being ignored.
  • when it is determined in step S116 that the detected movement direction of the hand HD intersects the first movement direction, the processor 121 determines that the user wishes to release the lock mode, stops the timer in step S117, and releases the lock mode. That is, the user can release the lock mode at any time while it is set by moving the hand HD in a direction intersecting the first movement direction, so the mode can be released quickly. Thereafter, the flow moves to step S123, and when the gesture operation detection control process is continued, the next detected movement direction of the hand HD becomes the new first movement direction (No in step S107).
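The lock mode decisions of steps S107 to S117 can be sketched as a small state machine. The direction names, the `handle_pass` interface, the injected clock, and the 0.5-second default are illustrative assumptions, not the actual implementation.

```python
import time

# Hypothetical direction vocabulary; "up"/"down" cross "left"/"right".
OPPOSITES = {"left": "right", "right": "left", "up": "down", "down": "up"}

class GestureLock:
    """Sketch of the lock mode: the first pass direction is latched for a
    specified time; opposite-direction passes are ignored while locked, and
    a crossing (intersecting) pass releases the lock immediately."""

    def __init__(self, lock_seconds=0.5, now=time.monotonic):
        self.lock_seconds = lock_seconds
        self.now = now                 # clock injected for testability
        self.first_direction = None
        self.deadline = None

    def _locked(self):
        return self.deadline is not None and self.now() < self.deadline

    def handle_pass(self, direction):
        """Return the display action for one detected pass."""
        if not self._locked():
            # no lock yet: latch the first direction and start the timer (S108)
            self.first_direction = direction
            self.deadline = self.now() + self.lock_seconds
            return "scroll"
        if direction == self.first_direction:
            # same direction: act and reset the timer to extend the lock (S122)
            self.deadline = self.now() + self.lock_seconds
            return "scroll"
        if direction == OPPOSITES[self.first_direction]:
            return "skip"              # reverse pass ignored (S116 -> S123)
        self.deadline = None           # crossing pass: release the lock (S117)
        return "unlock"
```

After an "unlock", the next pass is treated as a fresh first direction, matching the No branch of step S107.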
  • when the processor 121 determines in step S115 that the detected movement direction of the hand HD is the same as the first movement direction, the processor 121 checks the count value of the timer in step S118. From the count value of the timer, the moving speed of the hand HD can be grasped, and further, the reciprocation speed when the hand HD is moved back and forth can be grasped. Then, based on the reciprocation speed of the hand HD, the processor 121 determines whether or not the screen movement amount per operation (for example, the page feed amount or scroll amount in screen display control such as page turning and scrolling according to the passing direction of the hand HD) needs to be adjusted (step S119).
  • for example, two thresholds relating to the moving speed of the hand HD are set in advance, and whether the movement amount of the screen needs to be adjusted is determined by comparing the moving speed of the hand HD with these thresholds. Alternatively, the current moving speed may be compared with the previous moving speed of the hand HD to determine whether adjustment is necessary. If adjustment is necessary, the movement amount is adjusted in step S120; for example, when the reciprocation speed of the hand HD is equal to or higher than the first threshold, the page feed amount and scroll amount are made larger than the standard value, and when it is less than the second threshold, they are made smaller than the standard value.
  • in step S121, the processor 121 causes the image to be page-fed or scrolled in the first movement direction.
  • while the lock mode is set, as shown in FIG. 13, the user can repeatedly page-feed and scroll the image only in the direction in which the hand HD first passed, for example by shaking the hand HD left and right across the detection area SA, so display control according to the intention of the user US can be performed at high speed.
  • the user US only needs to reciprocate the hand HD in order to page-feed and scroll the image continuously in one direction; no complicated operation is required, reducing the burden on the user. Further, by shaking the hand or fingers faster, the number of pages advanced per operation and the scroll width can be increased, further improving operability.
  • the processor 121 further resets the timer in step S122 and restarts the time measurement, thereby extending the lock mode until the next specified time expires.
  • an upper limit of the number of repetitions may be provided, and the screen display control may be stopped when the upper limit is exceeded.
  • unless a signal instructing the end of the gesture operation detection operation is input in step S123, the processor 121 returns the flow to step S101 and continues the control. When such a signal is input, the processor 121 ends the screen display control process by gesture operation.
  • the lock mode can be turned on / off by operating the operation unit 122.
  • when the lock mode is set to ON, control is performed according to the flowchart of FIG. 12; when the lock mode is set to OFF, the timer is forcibly placed in the non-operating state so that the lock mode is never set.
  • the gesture operation detection control process performed by the processor 121 will now be described specifically with reference to the schematic time chart shown in FIG. 14. As shown in FIG. 14(a), when the user shakes the hand from side to side (see FIG. 13), the proximity sensor 105 outputs a signal that alternately repeats ON and OFF, as shown in FIG. 14(b): when the hand is not detected, the signal value of the sensor 105 falls, and when the hand is detected, the signal value rises.
  • FIG. 14(c) shows the screen display control when the lock mode is turned ON. When the first passage of the hand is detected, the timer is started to set the lock mode, and the home screen is moved to the left so that the next screen (first screen) is displayed. Even when the hand then passes in the opposite direction, the corresponding display control is skipped and the first screen remains displayed. When the hand again passes in the first movement direction, the timer is reset to extend the lock mode, and the first screen is moved to the left to display the next screen (second screen). On the next opposite-direction pass, the second screen again remains displayed. Thereafter, screen feed is continuously performed in one direction until a timeout occurs.
  • FIG. 14(d) shows the screen display control when the lock mode is set to OFF. At the first pass of the hand, the home screen is moved leftward to display the next screen (first screen); at the next pass in the opposite direction, the first screen is moved rightward to display the original home screen. At the following pass, the home screen is again moved to the left to display the next screen (first screen), and at the pass after that, the first screen is moved rightward to display the original home screen. That is, when the lock mode is set to OFF, two screens are alternately displayed in accordance with the movement of the hand.
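The two time charts of FIG. 14(c) and FIG. 14(d) can be reproduced with a short simulation. The event encoding (+1 for a pass that feeds to the next screen, -1 for the opposite pass) and the 0.5-second lock time are illustrative assumptions.

```python
def simulate(passes, lock_on, lock_seconds=0.5):
    """Replay hand passes against the lock mode rules.
    'passes' is a list of (time, d) with d = +1 for a pass that feeds to the
    next screen and d = -1 for the opposite pass; returns the screen index
    (0 = home screen) after each event."""
    screen = 0
    first_dir = None
    deadline = None
    history = []
    for t, d in passes:
        locked = lock_on and deadline is not None and t < deadline
        if not locked:
            # no lock: act on the pass and latch it as the first direction
            first_dir = d
            screen = max(0, screen + d)
            deadline = t + lock_seconds
        elif d == first_dir:
            # same direction while locked: act and reset the timer
            screen = max(0, screen + d)
            deadline = t + lock_seconds
        # opposite direction while locked: skipped
        history.append(screen)
    return history
```

With alternating passes every 0.2 seconds, lock ON yields home → first → first → second → second (one-directional feed), while lock OFF alternates between the home screen and the first screen.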
  • as shown in FIG., the hand HD moves in one direction when passing through the detection area SA of the sensor 105, and the gesture operation is performed so that the return motion passes outside the detection area SA, that is, the hand is turned around in front of the user.
  • FIG. 16 is a diagram illustrating an example of an image that is switched by a gesture operation.
  • the movements of the hand up and down and left and right can each be recognized as a swipe operation, and the screens can be switched by sliding in accordance with them.
  • the following operation is controlled by the processor 121 in accordance with a signal from the proximity sensor 105.
  • the home screen G11 is displayed on the display unit 104. On the home screen G11 shown in FIG. 16, the current date, temperature, humidity, and the like are displayed at the top of the screen.
  • when a gesture operation is performed by moving the hand HD from bottom to top, a non-display mode in which no screen display is performed is entered, and the display unit 104 is switched to the non-display screen G01. In this state, the user US can observe only the external image through the display unit 104. From this screen display (that is, during execution of the non-display mode), when a gesture operation is performed by moving the hand HD from top to bottom, the non-display mode is canceled and the display unit 104 is switched back to the home screen G11.
  • when a gesture operation is performed by moving the hand HD from right to left on the home screen G11, the music mode is entered and the music title display screen G10 is displayed on the display unit 104. On this screen, the title, author, and the like of the music flowing from the speaker or earphone are displayed at the top of the screen, and the volume control VOL is displayed on the left side of the screen. Each time a gesture operation is performed by moving the hand HD from bottom to top, the volume of the song being played increases by one step; conversely, each time a gesture operation is performed by moving the hand HD from top to bottom, the volume decreases by one step, and the display changes accordingly. When a gesture operation is performed by moving the hand HD from left to right, the display unit 104 is switched back to the home screen G11.
  • when the imaging mode is entered, the imaging angle-of-view display screen G12 is displayed on the display unit 104. The user US can capture a subject within the rectangular frame displayed on the imaging angle-of-view display screen G12 with the camera 106 as a still image or a moving image. The imaging may be started and ended in accordance with the switching of the screen, or may be started and ended after a standby time has elapsed from that point; imaging may also be started and stopped by operation of the operation unit or by voice. From this screen display, when a gesture operation is performed by moving the hand HD from right to left, the display unit 104 is switched back to the home screen G11.
  • when a gesture operation is performed by moving the hand HD from top to bottom on the home screen G11, the screen is switched to a content browsing display screen, and an image G21n of a specific page of the content is displayed. Each time a gesture operation is performed by moving the hand HD from right to left, the display decrements the page and switches to the image G21n-1 of the previous page; each time a gesture operation is performed by moving the hand HD from left to right, the display switches to the image G21n+1 of the next page.
  • when the setting mode is entered, the setting display screen G31 is displayed on the display unit 104. When a gesture operation is performed by moving the hand HD from right to left, the display unit 104 switches to another setting display screen G30; when the hand HD is moved from left to right, the display unit 104 switches to another setting display screen G32. On each screen, different settings can be made using, for example, the operation unit 122.
  • when a gesture operation is performed by moving the hand HD from bottom to top on the setting display screen G31, the display unit 104 is switched back to the home screen G11 via the browsing screen G21. It is effective to use the above-described lock mode when, for example, the screen is changed continuously from the setting screen G30 to the setting screen G32, or when page feed is performed on the display screen G21.
  • in such cases, the page feed amount and the movement amount of items on the menu screen may be changed according to the reciprocation speed of the hand HD, as described above.
  • the user US can operate the screen by moving the hand HD in front of the display unit 104 while viewing the display screen of the display unit 104; unlike a device having an operation panel, there is no need to move the viewpoint to look at an operation unit, and the user can operate the device as if directly moving the displayed screen. An easy-to-use user interface can therefore be provided. Moreover, unlike gesture recognition by a camera, the recognized area can be accurately grasped.
  • each screen described above is merely an example; arbitrary items may be added or replaced among the various screens, including images associated with other functions. Moreover, which screen is switched to according to the moving direction of the hand, and how the animation at the time of switching is set, may be determined as appropriate according to the user's preference.
  • in the above embodiment, a pyroelectric sensor, which is a kind of passive sensor, is used as the proximity sensor; however, an active type sensor having a light emitting unit that emits invisible light such as infrared light and a light receiving unit that detects the invisible light emitted by the light emitting unit and reflected by an object may be used instead. If an active sensor is used, there is an advantage that the gesture operation can be detected even when the user is wearing gloves.
  • FIG. 17 is a front view of the smartphone 200 according to the second embodiment, which is shown together with the user's hand HD.
  • the smartphone 200 has substantially the same configuration as that of the above-described embodiment (see FIG. 6), and configurations having the same functions are denoted by the same reference numerals and their description is omitted; however, the smartphone 200 uses the camera 106 as the detection device instead of a proximity sensor. A proximity sensor may nevertheless be provided, as in the HMD 100 described above.
  • the imaging range of the camera 106 that is the imaging apparatus constitutes the detection area SA.
  • the built-in processor holds in advance shape data and color feature data of the user's hand HD, compares them with each frame of the video image (the image information captured by the camera 106), and determines whether or not the hand HD has entered the frame of the camera 106. After recognizing that the hand HD has entered, the processor stores the features and feature points of the hand HD as tracking targets. If a hand HD having the same features is present in the next frame, the processor measures its movement distance, direction, and speed.
  • when the processor determines that the hand HD has moved beyond the specified minimum movement distance (for example, about half the width of the screen) and within the specified speed range, it determines that the hand HD has passed through the detection area. Unless the lock mode timer is operating, the processor then sets the lock mode, stores this direction as the first movement direction, and performs screen scrolling or screen transition in the first movement direction. At this time, the hand HD does not necessarily have to pass from right to left beyond the frame of the camera 106.
  • while the lock mode timer is operating, even if the processor detects through the video image of the camera 106 that the hand HD has moved in the direction opposite to the first movement direction, screen scrolling, screen transition, and the like are not performed in the reverse direction, as in the above-described embodiment. Furthermore, when the processor detects through the video image of the camera 106 that the hand HD has moved in the direction opposite to the first movement direction, it may instead cause the screen to scroll or transition in the first movement direction.
  • if the moving speed of the hand is too fast, the hand cannot be reliably recognized in the frame images, and if it is too slow, the motion may be regarded as another action; it is therefore desirable to define a speed range.
  • since the hand recognition rate depends on the frame rate of the camera, a frame rate of about 15 fps or more is recommended.
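The camera-based passage check described above (minimum movement distance plus a permitted speed range) can be sketched as follows. The pixel positions, the half-frame minimum distance, and the speed limits are illustrative assumptions, not values from the embodiment.

```python
def detect_pass(positions, frame_interval, min_distance, v_min, v_max):
    """Sketch of the camera-based passage check. 'positions' is the tracked
    hand x-coordinate (pixels) in consecutive frames, e.g. from a 15 fps
    stream. A pass is reported only if the hand covered at least
    'min_distance' (about half the frame width in the text) at an average
    speed inside [v_min, v_max]: too fast and tracking is unreliable, too
    slow and the motion is treated as some other action."""
    if len(positions) < 2:
        return None
    displacement = positions[-1] - positions[0]
    duration = (len(positions) - 1) * frame_interval
    speed = abs(displacement) / duration
    if abs(displacement) < min_distance or not (v_min <= speed <= v_max):
        return None
    return "left_to_right" if displacement > 0 else "right_to_left"
```

The returned direction would then feed the same lock mode logic as the proximity sensor output in the first embodiment.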
  • HMD 101 Frame 101a Front part 101b Side part 101c Side part 101d Long hole 101e Long hole 102 Eyeglass lens 103 Main body part 104 Display 104A Image forming part 104B Image display part 104DR Display control part 104a Light source 104b Unidirectional diffuser 104c Condensing lens 104d Display element 104f Eyepiece prism 104g Deflection prism 104h Hologram optical element 104i Screen 105 Proximity sensor 105a Light receiving part 106 Camera 106a Lens 107 Right sub-main part 107a Protrusion 108 Left sub-main part 108a Protrusion 109 Acceleration sensor 110 Gyro 111A Speaker (or earphone) 111B Microphone 121 Processor 122 Operation unit 123 GPS reception unit 124 Communication unit 125 ROM 126 RAM 127 Battery 200 Smartphone CD code CTU Control unit HD Hand HS Wiring IND Icon PL1 Base end surface PL2 Inner surface PL3 Outer surface PL4 Inclined surface PL4, PL5

Abstract

The present invention provides a display device, a method for controlling the display device, and a control program therefor with which it is possible to easily and quickly display a desired screen image. The display device is equipped with: a display unit having a display member provided with a screen capable of displaying an image; a detection device having a detection area, the detection device detecting the direction of movement in which an indication unit moved by a user moves in the detection area and generating an output; and a control unit for controlling the screen display of the display unit on the basis of the output of the detection device. When it is detected on the basis of the output from the detection device that the indication unit moved by the user has moved in the detection area, the control device takes the direction of that movement as the first direction of movement, performs a screen display control that corresponds to the first direction of movement, and furthermore starts using a timer to measure a duration from when the first direction of movement is detected. Even when an output from the detection device indicating that the indication unit has passed through in a direction of movement differing from the first direction of movement is generated at least once while the timer is in operation from when the measurement started to when a prescribed amount of time elapses, the control device skips screen display control that corresponds to the different direction of movement.

Description

Display device, display device control method, and control program therefor
 The present invention relates to a display device, a display device control method, and a control program therefor.
 In recent years, portable terminals such as smartphones, which have developed rapidly, are often used to assist work in business and at home. A typical portable terminal has a touch-panel screen that serves both as an image display and as a user interface; by touching this screen, the user can perform the necessary input to display a desired image, enter information, and so on. However, there are cases where the user wants to operate the portable terminal without touching the screen, for example when the user's hands are wet or dirty, and how to perform input in such cases is an issue.
 Patent Document 1 discloses a mobile computing device including an infrared LED and an infrared proximity sensor. According to the technique of Patent Document 1, infrared light emitted from the infrared LED strikes a hand brought close to the computing device and is reflected, and the infrared proximity sensor detects the reflected light, whereby the movement of the hand can be detected; a desired input can thus be performed in a non-contact manner by hand movements (gestures) to carry out operations such as swiping. Therefore, even if the user's hand is dirty, there is no risk of contaminating the computing device.
Patent Document 1: JP 2013-534009 A. Patent Document 2: JP 2013-206412 A.
 Incidentally, there are cases where the user wants to perform screen scrolling or screen transition (page feed) continuously. In such a case, if a touch-panel screen can be touched, the desired operation can be performed easily and quickly by repeatedly stroking the screen in one direction with a finger. However, in gesture detection using the infrared proximity sensor disclosed in Patent Document 1, when the user reciprocates the hand within the detection area of the infrared proximity sensor, the computing device receiving the signal from the sensor recognizes the motion as reciprocating scrolling or reciprocating page feed, so continuous scrolling or page feed in one direction cannot be performed well. If the user's hand is moved from one end of the detection area of the infrared proximity sensor to the other end, withdrawn from the detection area, moved back to the first end outside the detection area, and then moved across the detection area again, continuous one-directional scrolling or page feed is possible, but this forces a complicated operation on the user and impairs usability.
 Patent Document 2 discloses, as an example using a non-contact user interface, a head mounted display (hereinafter referred to as an HMD) that, when the user's hand enters the imaging range of a camera, extracts the user's hand from the captured image and recognizes the hand's gesture to switch the displayed image to the next page.
 Since a typical HMD places the display in front of the user's eyes, input by touching the screen is inherently difficult, so it can be said that gesture input is preferable. However, when it is desired to continuously perform screen scrolling, page feed, and the like, as with the technique of Patent Document 1, in order to avoid the motion being recognized as a reciprocating hand motion, the user's hand must be moved from one end of the imaging range to the other end, withdrawn from the imaging range, moved back to the first end outside the imaging range, and then moved across the imaging range again, which is a complicated operation. Moreover, since the imaging range of a camera is often wider than the detection area of an infrared proximity sensor, there is also the problem that the burden on the user performing the gesture operation becomes large.
 The present invention has been made in view of the above circumstances, and an object thereof is to provide a display device, a display device control method, and a control program therefor that enable a desired screen display to be performed easily and quickly.
 A display device of the present invention comprises: a display unit having a display member provided with a screen capable of displaying an image; a detection device that has a detection area and that detects the movement direction in which an indication unit moved by the user moves through the detection area and generates an output; and a control device that controls the screen display of the display unit on the basis of the output of the detection device. When the control device detects, on the basis of the output of the detection device, that the indication unit has moved through the detection area, the control device takes the movement direction of the indication unit as the first movement direction and performs screen display control corresponding to the first movement direction. The control device further starts time measurement with a timer upon detection of the first movement direction, and even if the output of the detection device indicating that the indication unit has moved in a movement direction different from the first movement direction is generated at least once while the timer is operating, from the start of the time measurement until a specified time elapses, the control device skips the screen display control corresponding to the different movement direction.
A method of the present invention for controlling a display device is a method for a display device comprising: a display unit including a display member having a screen capable of displaying an image; and a detection device that has a detection region and generates an output by detecting the movement direction in which a pointing member moved by the user travels through the detection region, the method controlling the screen display of the display unit based on the output of the detection device, wherein:
when it is detected, based on the output of the detection device, that the pointing member has moved through the detection region, screen display control is performed according to the movement direction of the pointing member, taking that direction as the first movement direction; and
even if the detection device generates an output indicating, at least once between the start of timing upon detection of the first movement direction and the elapse of a specified time, that the pointing member has moved in a movement direction different from the first movement direction, the screen display control corresponding to that different movement direction is skipped.
A control program of the present invention for a display device is a control program for a display device comprising: a display unit including a display member having a screen capable of displaying an image; a detection device that has a detection region and generates an output by detecting the movement direction in which a pointing member moved by the user travels through the detection region; and a control device that controls the screen display of the display unit based on the output of the detection device,
the program causing the control device to:
perform, when it is detected based on the output of the detection device that the pointing member has moved through the detection region, screen display control according to the movement direction of the pointing member, taking that direction as the first movement direction; and
skip, even if the detection device generates an output indicating, at least once between the start of timing upon detection of the first movement direction and the elapse of a specified time, that the pointing member has moved in a movement direction different from the first movement direction, the screen display control corresponding to that different movement direction.
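The timer-based skip behavior common to the device, method, and program above can be expressed as a minimal Python sketch. The class and method names, the concrete timer value, and the choice to restart the timer on each accepted same-direction movement are illustrative assumptions, not details taken from this specification:

```python
import time


class GestureScreenController:
    """Minimal sketch of the claimed behavior: the first detected movement
    direction is acted on, and any different direction reported before the
    specified time elapses is skipped."""

    def __init__(self, specified_time_s=1.0, now=time.monotonic):
        self.specified_time_s = specified_time_s  # the "specified time" of the timer
        self.now = now                            # injectable clock (eases testing)
        self.first_direction = None
        self.timer_start = None
        self.handled = []                         # stands in for scrolling / page turning

    def on_movement_detected(self, direction):
        t = self.now()
        timer_running = (self.timer_start is not None
                         and (t - self.timer_start) < self.specified_time_s)
        if timer_running and direction != self.first_direction:
            return  # skip: different direction while the timer is running
        # First gesture, expired timer, or same direction: act on it and
        # (re)start the timer (restarting on same-direction is an assumption).
        self.first_direction = direction
        self.timer_start = t
        self.handled.append(direction)
```

With this sketch, a reciprocating hand motion (left, right, left, ...) within the specified time is treated as repeated movement in the first direction, which is the effect the invention aims for.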
According to the present invention, it is possible to provide a display device, a method for controlling a display device, and a control program therefor that allow the user to perform an intended screen display operation, such as continuous screen scrolling or continuous screen transitions, easily and quickly.
FIG. 1 is a perspective view of a head-mounted display (HMD) according to a first embodiment.
FIG. 2 is a front view of the HMD.
FIG. 3 is a top view of the HMD.
FIG. 4 is a schematic cross-sectional view showing the configuration of the display unit.
FIG. 5 is an enlarged view of the proximity sensor.
FIG. 6 is a block diagram of the main circuits of the HMD.
FIG. 7 is a front view of a user wearing the HMD on the head.
FIG. 8 shows a side view (a) and a partial top view (b) of a user wearing the HMD on the head.
FIG. 9 is a diagram showing the image visually recognized by the user through the see-through image display unit.
FIG. 10 is a diagram showing an example of the signal waveforms output by the plurality of light-receiving regions.
FIG. 11 is a diagram showing another example of the signal waveforms output by the plurality of light-receiving regions.
FIG. 12 is a flowchart showing the screen display control process, based on gesture operations, executed by the processor.
FIG. 13 is a diagram showing the user visually recognizing his or her hand HD through the see-through image display unit when the lock mode is set.
FIG. 14 is a time chart showing an example of the screen display control process based on gesture operations.
FIG. 15 is a diagram showing the movement of the hand HD when performing screen display control in the same direction with the lock mode off.
FIG. 16 is a diagram showing examples of images switched by gesture operations.
FIG. 17 is a front view of a smartphone 200 according to a second embodiment.
(First embodiment)
The present embodiment will be described below with reference to the drawings. FIG. 1 is a perspective view of an HMD 100 including a display device according to the present embodiment. FIG. 2 is a front view of the HMD 100 according to the present embodiment. FIG. 3 is a top view of the HMD 100 according to the present embodiment. Hereinafter, the right side and the left side of the HMD 100 refer to the right side and the left side as seen by the user wearing the HMD 100.
As shown in FIGS. 1 to 3, the HMD 100 of the present embodiment includes a frame 101 and a display unit 104 that performs screen display. The frame 101 constitutes a mounting member that holds the display unit 104 on the head so that it is positioned in front of the eyes. The frame 101, which is U-shaped when viewed from above, has a front portion 101a to which two spectacle lenses 102 are attached, and side portions 101b and 101c extending rearward from both ends of the front portion 101a. The two spectacle lenses 102 attached to the frame 101 may or may not have refractive power.
A cylindrical main body portion 103 is fixed to the front portion 101a of the frame 101 above the right spectacle lens 102 (it may instead be on the left, according to the user's dominant eye). The main body portion 103, which is part of the mounting member, is provided with the display unit 104. Disposed within the main body portion 103 is a display control unit 104DR (see FIG. 6, described later) that controls the display of the display unit 104 based on instructions from a processor 121, described later. If necessary, display units may be arranged in front of both eyes.
FIG. 4 is a schematic cross-sectional view showing the configuration of the display unit 104. The display unit 104 consists of an image forming unit 104A and an image display unit 104B. The image forming unit 104A is incorporated in the main body portion 103 and includes a light source 104a, a unidirectional diffuser 104b, a condenser lens 104c, and a display element 104d. The image display unit 104B, a so-called see-through display member, is a generally plate-shaped member arranged to extend downward from the main body portion 103, parallel to one of the spectacle lenses 102 (see FIG. 1), and includes an eyepiece prism 104f, a deflecting prism 104g, and a hologram optical element 104h.
The light source 104a has the function of illuminating the display element 104d, and is composed of, for example, an RGB-integrated LED that emits light in three wavelength bands whose peak wavelengths and half-maximum wavelength widths are 462±12 nm (B light), 525±17 nm (G light), and 635±11 nm (R light). Because the light source 104a thus emits light of a predetermined wavelength width, the image light obtained by illuminating the display element 104d can be given a predetermined wavelength width, and when that image light is diffracted by the hologram optical element 104h, the user can observe the image over the entire observation field angle at the position of the pupil B. Furthermore, the peak wavelength of each color of the light source 104a is set near the corresponding peak wavelength of the diffraction efficiency of the hologram optical element 104h, improving light utilization efficiency.
In addition, since the light source 104a is composed of LEDs emitting RGB light, the cost of the light source 104a can be kept low, a color image can be displayed on the display element 104d when it is illuminated, and the user can visually recognize that color image. Moreover, since each of the RGB LED elements has a narrow emission wavelength width, using a plurality of such elements enables bright image display with high color reproducibility.
The display element 104d displays an image by modulating the light emitted from the light source 104a in accordance with image data, and is composed of a transmissive liquid crystal display element having, in a matrix, pixels that serve as light-transmitting regions. The display element 104d may alternatively be of a reflective type.
The eyepiece prism 104f totally reflects the image light from the display element 104d, which enters through its base end face PL1, between the opposing parallel inner surface PL2 and outer surface PL3, and guides it to the user's pupil via the hologram optical element 104h, while also transmitting external light to the user's pupil; together with the deflecting prism 104g, it is made of, for example, an acrylic resin. The eyepiece prism 104f and the deflecting prism 104g sandwich the hologram optical element 104h between inclined surfaces PL4 and PL5, which are inclined with respect to the inner surface PL2 and the outer surface PL3, and are joined with an adhesive.
The deflecting prism 104g is joined to the eyepiece prism 104f so that the two together form a substantially parallel flat plate. Joining the deflecting prism 104g to the eyepiece prism 104f prevents distortion of the external image that the user observes through the display unit 104.
That is, if, for example, the deflecting prism 104g were not joined to the eyepiece prism 104f, external light would be refracted when passing through the inclined surface PL4 of the eyepiece prism 104f, so the external image observed through the eyepiece prism 104f would be distorted. By joining the deflecting prism 104g, which has the complementary inclined surface PL5, to the eyepiece prism 104f to form an integral, substantially parallel flat plate, however, the refraction occurring when external light passes through the inclined surfaces PL4 and PL5 (the hologram optical element 104h) can be canceled by the deflecting prism 104g. As a result, distortion of the external image observed through the see-through display can be prevented. Furthermore, if the spectacle lens 102 (see FIG. 1) is placed between the display unit 104 and the user's pupil, even a user who normally wears spectacles can observe the image without problems.
The hologram optical element 104h is a volume-phase reflective hologram that diffracts and reflects the image light emitted from the display element 104d (light of wavelengths corresponding to the three primary colors), guides it to the pupil B, and magnifies the image displayed on the display element 104d to present it to the user's pupil as a virtual image. The hologram optical element 104h is fabricated so as to diffract (reflect) light in, for example, three wavelength ranges whose diffraction-efficiency peak wavelengths and half-maximum wavelength widths are 465±5 nm (B light), 521±5 nm (G light), and 634±5 nm (R light). Here, the peak wavelength of the diffraction efficiency is the wavelength at which the diffraction efficiency peaks, and the half-maximum wavelength width is the wavelength width over which the diffraction efficiency is at least half of its peak value.
The reflective hologram optical element 104h has high wavelength selectivity and diffracts and reflects only light of wavelengths within the above ranges (near the exposure wavelengths); external light of other wavelengths passes through the hologram optical element 104h, so a high external-light transmittance can be achieved.
Next, the operation of the display unit 104 will be described. The light emitted from the light source 104a is diffused by the unidirectional diffuser 104b, condensed by the condenser lens 104c, and enters the display element 104d. The light entering the display element 104d is modulated pixel by pixel based on the image data input from the display control unit 104DR and emitted as image light. A color image is thereby displayed on the display element 104d.
The image light from the display element 104d enters the eyepiece prism 104f through its base end face PL1, is totally reflected multiple times between the inner surface PL2 and the outer surface PL3, and enters the hologram optical element 104h. The light entering the hologram optical element 104h is reflected there, passes through the inner surface PL2, and reaches the pupil B. At the position of the pupil B, the user can observe a magnified virtual image of the image displayed on the display element 104d and visually recognize it as a screen formed on the image display unit 104B. In this specification, the term "screen" may also refer to the displayed image.
On the other hand, since the eyepiece prism 104f, the deflecting prism 104g, and the hologram optical element 104h transmit almost all external light, the user can observe the external image (real image) through them. The virtual image of the image displayed on the display element 104d is therefore observed overlapping part of the external image. In this way, the user of the HMD 100 can simultaneously observe, via the hologram optical element 104h, both the image provided by the display element 104d and the external image. When the display unit 104 is in the non-display state, the image display unit 104B becomes transparent and only the external image can be observed. In this example, the display unit is configured by combining a light source, a liquid crystal display element, and an optical system, but a self-luminous display element (for example, an organic EL display element) may be used in place of the combination of a light source and a liquid crystal display element. Alternatively, a transmissive organic EL display panel that is transparent in the non-emitting state may be used in place of the combination of a light source, a liquid crystal display element, and an optical system. In any case, if the screen is arranged so as to fall within the visual field of the user's eye facing the image display unit 104B, and preferably so as to overlap at least part of the effective visual field, the user can easily view the image.
Further, as shown in FIGS. 1 and 2, on the front face of the main body portion 103 are provided, facing forward, a proximity sensor 105, which constitutes a detection device and is arranged toward the center of the frame 101, and a lens 106a of a camera 106, arranged toward the side. As will be described later, in this embodiment the movement of the user's hand is detected based on the output from the proximity sensor 105, and screen display control such as screen switching is performed according to the direction in which the hand passes.
In this specification, a "proximity sensor" is a sensor that, in order to detect that an object, for example a part of the human body, is close to the user's eyes, detects whether the object is present within a detection region in the proximity range in front of the sensor's detection surface and outputs a signal. The proximity range may be set appropriately according to the operator's characteristics and preferences; for example, it may be the range within 200 mm of the detection surface of the proximity sensor. If the distance from the proximity sensor is within 200 mm, the user can move the palm into and out of the field of view with the arm bent, so operations can easily be performed with hand gestures, and the risk of erroneously detecting a person other than the user, furniture, or the like is reduced. Here, the control device determines that an object is present in the proximity range based on the signal output from the proximity sensor when the object enters the detection region within that range. The proximity sensor may also be configured to output a valid signal to the control device when an object enters the detection region.
Proximity sensors are of passive and active types. A passive proximity sensor has a detection unit that detects invisible light or electromagnetic waves emitted from an object when the object approaches. Passive proximity sensors include pyroelectric sensors, which detect invisible light such as infrared radiation emitted from an approaching human body, and capacitance sensors, which detect the change in capacitance between the sensor and an approaching human body. An active proximity sensor has a projection unit that emits invisible light or sound waves and a detection unit that receives the invisible light or sound waves reflected back from the object. Active proximity sensors include infrared sensors, which project infrared light and receive the infrared light reflected by an object; laser sensors, which project laser light and receive the laser light reflected by an object; and ultrasonic sensors, which project ultrasonic waves and receive the ultrasonic waves reflected by an object. Passive proximity sensors excel in low power consumption. Active proximity sensors readily improve detection reliability; for example, they can detect the hand even when the user is wearing gloves that do not transmit infrared radiation. Multiple types of proximity sensors may be used in combination.
Compared with cameras, proximity sensors are generally smaller and cheaper and consume less power. Although a proximity sensor cannot perform complex detection such as detecting the shape of an object, it can determine whether an object is approaching or receding, so the HMD can be operated by passing a hand through, or holding a hand over, the detection region; moreover, the complex image processing required for gesture recognition by analyzing camera images is unnecessary.
In this specification, a "pointing member moved by the user" means an object driven according to the user's intention; it may be, for example, a part of the user's body (such as a hand or finger), or a pen or the like held in the user's hand.
FIG. 5 is an enlarged front view of the proximity sensor 105 used in this embodiment. In this embodiment, an example using a pyroelectric sensor as the proximity sensor 105 will be described. In FIG. 5, the proximity sensor 105 has a light-receiving unit 105a that receives, as detection light, invisible light such as infrared light emitted from the human body. The light-receiving unit 105a forms light-receiving regions RA to RD arranged in two rows and two columns, and when detection light is received, a corresponding signal is output individually from each of the light-receiving regions RA to RD. A pyroelectric sensor that detects infrared light emitted from the human body is unlikely to erroneously detect objects other than exposed human skin, which has the merit of effectively preventing false detection when, for example, working in a confined space.
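Because each of the four light-receiving regions outputs its own signal, the direction in which a hand passes can be inferred from the order in which the regions respond (as illustrated by the waveforms of FIGS. 10 and 11). The following Python sketch shows one plausible way to do this; the assumed layout (RA and RB in the top row, RC and RD in the bottom row) and the function itself are illustrative assumptions, not details taken from this specification:

```python
def infer_direction(first_rise):
    """Infer the hand's passing direction from the time at which each of the
    2x2 light-receiving regions first detects the hand.

    first_rise: dict mapping region name ("RA", "RB", "RC", "RD") to the
    time (seconds) at which that region's output signal first rose.
    Assumed layout (hypothetical):
        RA RB
        RC RD
    Returns "left", "right", "up", or "down".
    """
    left = min(first_rise["RA"], first_rise["RC"])    # earliest rise, left column
    right = min(first_rise["RB"], first_rise["RD"])   # earliest rise, right column
    top = min(first_rise["RA"], first_rise["RB"])     # earliest rise, top row
    bottom = min(first_rise["RC"], first_rise["RD"])  # earliest rise, bottom row
    # The axis with the larger inter-column / inter-row delay dominates.
    if abs(left - right) >= abs(top - bottom):
        return "right" if left < right else "left"
    return "down" if top < bottom else "up"
```

For example, if the left-column regions rise before the right-column regions, the hand is judged to be moving to the right.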
In FIGS. 1 and 2, a right sub-body portion 107 is attached to the right side portion 101b of the frame 101, and a left sub-body portion 108 is attached to the left side portion 101c of the frame 101. The right sub-body portion 107 and the left sub-body portion 108 have elongated plate shapes and have elongated protrusions 107a and 108a on their inner sides, respectively. By engaging the elongated protrusion 107a with the elongated hole 101d in the side portion 101b of the frame 101, the right sub-body portion 107 is attached to the frame 101 in a positioned state, and by engaging the elongated protrusion 108a with the elongated hole 101e in the side portion 101c of the frame 101, the left sub-body portion 108 is attached to the frame 101 in a positioned state.
The right sub-body portion 107 houses a geomagnetic sensor 109 (see FIG. 6, described later) that detects geomagnetism, and a gyro and acceleration sensor 110 (see FIG. 6, described later), including an angular velocity sensor, that generates outputs according to posture; the left sub-body portion 108 houses a speaker (or earphone) 111A and a microphone 111B (see FIG. 6, described later). The main body portion 103 and the right sub-body portion 107 are connected by a wiring HS so as to be able to transmit signals, and the main body portion 103 and the left sub-body portion 108 are connected by a wiring (not shown) so as to be able to transmit signals. As schematically shown in FIG. 3, the right sub-body portion 107 is connected to a control unit CTU via a cord CD extending from its rear end. A 6-axis sensor integrating a gyro and an acceleration sensor may also be used. The HMD can also be operated by voice, based on the output signal generated by the microphone 111B in response to input sound. The main body portion 103 and the left sub-body portion 108 may also be configured to be connected wirelessly.
FIG. 6 is a block diagram of the main circuits of the HMD 100. The control unit CTU includes a processor 121 serving as the control device, an operation unit 122, a GPS receiving unit 123 that receives radio waves from GPS satellites, a communication unit 124 that exchanges data with the outside, a ROM 125 that stores programs and the like, a RAM 126 that stores image data and the like, a power supply circuit 128 and a battery 127 for supplying power to each unit, and a storage device 129 such as an SSD or flash memory. An application processor of the kind used in smartphones can serve as the processor 121, but the type of processor 121 does not matter; for example, an application processor with the hardware necessary for image processing, such as a GPU and codec, built in as standard can be said to be suitable for a small HMD. Here, the processor 121, the proximity sensor 105, the display control unit 104DR, and the display unit 104 constitute the display device.
Further, when the light-receiving unit 105a detects the detection light emitted from the human body, its signal is input to the processor 121. The processor 121 also controls the image display of the display unit 104 via the display control unit 104DR.
Receiving power from the battery 127 via the power supply circuit 128, the processor 121 operates according to a program stored in at least one of the ROM 125 and the storage device 129; in accordance with operation inputs such as power-on from the operation unit 122, it can take in image data from the camera 106, store it in the RAM 126, and communicate with the outside via the communication unit 124 as necessary. Furthermore, as will be described later, it can detect gesture operations without contact and execute image control accordingly. The processor 121 can also execute a lock mode according to a control program stored in at least one of the ROM 125 and the storage device 129. Details of the lock mode will be described later.
FIG. 7 is a front view of the user US wearing the HMD 100 of this embodiment on the head. FIG. 8 shows a side view (a) and a partial top view (b) of the user US wearing the HMD 100, shown together with the user's hand. FIG. 9 is a diagram showing the image visually recognized by the user US through the see-through image display unit 104B. Here, a gesture operation is an operation in which at least the hand HD of the user US approaches or moves away from the proximity sensor 105, and one that the processor 121 of the HMD 100 can detect via the proximity sensor 105.
As shown in FIG. 9, the screen 104i of the image display unit 104B is arranged so as to overlap the effective visual field EV of the user's eye facing the image display unit 104B (here, so as to lie within the effective visual field EV). The detection area SA of the proximity sensor 105 is also within the visual field of that eye. Preferably, the detection area SA is located within the stable field of fixation of the user's eye or a narrower field (within about 90° horizontally and about 70° vertically), and more preferably overlaps the effective visual field EV or a narrower field (within about 30° horizontally and about 20° vertically), which lies inside the stable field of fixation; the position and orientation of the proximity sensor 105 may be adjusted accordingly.
FIG. 9 shows an example in which the detection area SA overlaps the screen 104i. By setting the detection area SA of the proximity sensor 105 to lie within the visual field of the eye of the user US while the user US wears the frame 101, which is the mounting member, on the head, the user can reliably see the hand entering and leaving the detection area SA of the proximity sensor 105 while observing the hand through the screen 104i, without moving the eyes. In particular, placing the detection area SA within the stable field of fixation or a narrower field allows the user to recognize the detection area SA and perform gesture operations reliably even while observing the screen. Placing the detection area SA within the effective visual field EV or a narrower field makes gesture operations still more reliable, and making the detection area SA overlap the screen 104i makes them more reliable still. When the proximity sensor has a plurality of light receiving areas, as in the present embodiment, the plurality of light receiving areas as a whole are regarded as one light receiving unit, and the maximum detection range of that light receiving unit is regarded as the detection area. When the detection area SA of the proximity sensor 105 is set to overlap the screen 104i as in FIG. 9, displaying an image indicating the detection area SA on the screen 104i (for example, outlining the area SA with a solid line) lets the user recognize the detection area SA with certainty, so that gesture operations can be performed still more reliably.
Detection of a gesture operation will now be described. If nothing is present in front of the user US, the light receiving unit 105a receives no detection light, so the processor 121 determines that no gesture operation is being performed. On the other hand, as shown in FIG. 8, when the user US brings his or her own hand HD in front of the eyes, the detection light radiated from the hand HD can be detected by the light receiving unit 105a, as indicated by the dotted lines. The processor 121 thereby determines that a gesture operation has been performed.
As described above, the light receiving unit 105a has the light receiving areas RA to RD arranged in two rows and two columns (see FIG. 5). Therefore, when the user US brings the hand HD toward the front of the HMD 100 from the left, right, above, or below, the output timings of the signals detected in the light receiving areas RA to RD differ.
FIGS. 10 and 11 show examples of the signal waveforms of the light receiving areas RA to RD, with signal intensity on the vertical axis and time on the horizontal axis. For example, when the user US moves the hand HD from right to left in front of the HMD 100 as in FIGS. 8 and 9, the detection light radiated from the hand HD enters the light receiving unit 105a, and the light receiving areas RA and RC receive the detection light first. Therefore, as shown in FIG. 10, the signals of the light receiving areas RA and RC rise first, the signals of the light receiving areas RB and RD rise with a delay, and then the signals of RA and RC fall, followed by the signals of RB and RD. The processor 121 detects this signal timing and determines that the user US has performed a gesture operation by moving the hand HD from right to left (here, passing through the detection area). Accordingly, the processor 121 controls the display control unit 104DR to change the display, for example turning the page from image G1 to image G2 in step with the movement of the hand HD, as shown in FIG. 9 (FIG. 9 shows a state partway through the transition, with image G1 sliding over to image G2, or image G1 being wiped away to reveal image G2).
In the example of FIG. 11, the signals of the light receiving areas RA and RB rise first, the signals of RC and RD rise with a delay, and then the signals of RA and RB fall, followed by the signals of RC and RD. The processor 121 detects this signal timing and determines that the user US has performed a gesture operation by moving the hand HD from top to bottom (here, passing through the detection area).
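The timing rules of FIGS. 10 and 11 can be condensed into a short sketch. The helper name and the sample rise times below are hypothetical illustrations, not part of the description; the rule encoded is the one stated above, namely that the pair of areas whose signals rise first determines the direction of the swipe.

```python
# Hypothetical sketch: infer the swipe direction from the times at which the
# four light receiving areas RA-RD (2 rows x 2 columns) first detect light.
# Per the description: RA/RC rising first means right-to-left movement,
# RA/RB rising first means top-to-bottom movement.

def infer_direction(rise_times):
    """rise_times: dict mapping 'RA','RB','RC','RD' to signal rise time (s)."""
    left_pair = (rise_times['RA'] + rise_times['RC']) / 2
    right_pair = (rise_times['RB'] + rise_times['RD']) / 2
    top_pair = (rise_times['RA'] + rise_times['RB']) / 2
    bottom_pair = (rise_times['RC'] + rise_times['RD']) / 2

    dh = right_pair - left_pair   # > 0: RA/RC rose first
    dv = bottom_pair - top_pair   # > 0: RA/RB rose first
    if abs(dh) >= abs(dv):
        return 'right-to-left' if dh > 0 else 'left-to-right'
    return 'top-to-bottom' if dv > 0 else 'bottom-to-top'

# FIG. 10 case: RA/RC rise first -> hand moves right to left
print(infer_direction({'RA': 0.00, 'RC': 0.00, 'RB': 0.08, 'RD': 0.08}))
# FIG. 11 case: RA/RB rise first -> hand moves top to bottom
print(infer_direction({'RA': 0.00, 'RB': 0.00, 'RC': 0.08, 'RD': 0.08}))
```

Averaging each pair, rather than comparing single areas, makes the sketch tolerant of small timing jitter between the two areas of a pair.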
According to the present embodiment, positioning the detection area of the proximity sensor within the visual field of the eye facing the image display unit allows the presence and movement of the hand HD to be detected reliably. The user US can thus see the hand movement and the resulting operation together, realizing intuitive operation. Furthermore, since the proximity sensor is small and consumes less power than a camera, the continuous operating time of the HMD 100 can be extended.
When it is desired to further reduce the power consumption of the HMD 100, or when priority is to be given to operation through a user interface other than gestures, detection of gesture operations may be suspended temporarily by stopping the gesture detection processing of the processor 121 or the power supply to the proximity sensor. Starting and stopping gesture detection can be done by any means (for example, the operation unit 122).
The starting and stopping of gesture detection by the processor 121 may itself be performed by detecting a gesture. For example, with the proximity sensor 105 operated intermittently, the user US can bring the hand HD in front of the proximity sensor 105 and hold it there for a predetermined time (for example, about one second); the light receiving unit 105a then continues to output an intermittent signal during that time, and the processor 121, on detecting this, can start the control for detecting gesture operations or return the proximity sensor 105 to its normal detection operation.
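This hold-to-toggle idea can be sketched as follows. The sampling period, the function name, and the exact hold time are assumptions for illustration; the description only suggests a hold of about one second over an intermittently driven sensor.

```python
# Hypothetical sketch of the "hold the hand for ~1 s to toggle gesture
# detection" idea: the proximity sensor is sampled intermittently, and a run
# of consecutive "hand present" samples long enough to span the hold time
# triggers the mode change. Sample period and hold time are assumed values.

SAMPLE_PERIOD_S = 0.2   # intermittent sampling interval (assumption)
HOLD_TIME_S = 1.0       # required hold duration (~1 s per the description)

def detect_hold(samples):
    """samples: iterable of booleans (True = hand detected), one sample per
    SAMPLE_PERIOD_S. Returns True once enough consecutive detections
    accumulate to cover HOLD_TIME_S."""
    needed = int(HOLD_TIME_S / SAMPLE_PERIOD_S)
    run = 0
    for present in samples:
        run = run + 1 if present else 0
        if run >= needed:
            return True
    return False

print(detect_hold([True] * 5))                 # held ~1 s -> True
print(detect_hold([True, True, False, True]))  # interrupted -> False
```

Requiring an unbroken run of detections distinguishes a deliberate hold from a passing swipe, which only produces one or two consecutive samples.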
When a sleep mode that puts the entire HMD 100 into a power-saving state is provided, the above gesture operation can also be used to cancel the sleep mode. Furthermore, since the light receiving unit 105a of the proximity sensor 105 can detect the relative amount of received detection light, the processor 121 can distinguish different motions of the hand HD, such as moving it toward or away from the proximity sensor 105, as changes in the received light amount. Detecting such a motion and assigning it to the operation of starting or stopping gesture detection is also conceivable.
However, because of its detection characteristics, the proximity sensor 105 can also detect objects other than the hand HD of the user US (for example, another person's hand), so for starting or stopping gesture detection it is preferable, also from the viewpoint of reducing malfunctions, to use means other than detecting the hand HD. As one example, the processor 121 may be given a speech recognition function: speech of the user US such as "start" or "stop" is picked up by the microphone 111B, and the processor 121 analyzes and recognizes the resulting output signal from the microphone to start or stop gesture detection. Volume can likewise be adjusted by speech recognition, for example while listening to music.
Next, screen display control processing based on gesture operations, including the display device control method executed by the processor 121, will be described. In gesture recognition using a proximity sensor or a camera, if the user moves the hand back and forth within the detection area of the sensor intending continuous one-way scrolling or page turning, the motion may be recognized as back-and-forth scrolling or back-and-forth page turning. One could instead move the hand from one end of the detection area to the other, withdraw it from the detection area, return it to the first end outside the detection area, and then move it across the detection area again, but this forces a complicated motion on the user and hurts usability. Therefore, in the present embodiment, when it is detected from the output of the proximity sensor 105 that the user's hand has passed through the detection area SA, the movement direction of the hand is stored as the first movement direction and screen display control corresponding to that first movement direction is performed; then, from the detection of the first movement direction until a specified time has elapsed, even if the proximity sensor 105 outputs, at least once, a signal indicating that the user's hand has passed in a movement direction different from the first, the screen display control corresponding to that different direction is skipped. This allows the user to perform the intended screen display operations, such as continuous scrolling or successive screen transitions, easily and quickly without being forced into complicated operations.
Hereinafter, the screen display control processing based on gesture operations will be described in detail with reference to the drawings. After gesture detection is started by the operation described above, the processor 121 waits in step S101 of the flowchart of FIG. 12 until the light receiving unit 105a detects the detection light. When the light is detected, in step S102 the processor determines the entry direction of the hand HD from the timing of the signals of the areas RA to RD, as follows.
(1) The signals of areas RA and RC rise first, then the signals of areas RB and RD: the hand HD entered moving from right to left.
(2) The signals of areas RB and RD rise first, then the signals of areas RA and RC: the hand HD entered moving from left to right.
(3) The signals of areas RA and RB rise first, then the signals of areas RC and RD: the hand HD entered moving from top to bottom.
(4) The signals of areas RC and RD rise first, then the signals of areas RA and RB: the hand HD entered moving from bottom to top.
In step S103, the processor 121 stores the determined entry direction of the hand HD. Next, in step S104, the processor 121 waits until the light receiving unit 105a no longer detects the detection light; when the light is no longer detected, in step S105 the processor determines the withdrawal direction of the hand HD from the timing of the signals of the areas RA to RD, as follows.
(5) The signals of areas RA and RC fall first, then the signals of areas RB and RD: the hand HD withdrew moving from right to left.
(6) The signals of areas RB and RD fall first, then the signals of areas RA and RC: the hand HD withdrew moving from left to right.
(7) The signals of areas RA and RB fall first, then the signals of areas RC and RD: the hand HD withdrew moving from top to bottom.
(8) The signals of areas RC and RD fall first, then the signals of areas RA and RB: the hand HD withdrew moving from bottom to top.
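Rules (1) to (8) use the same lookup for rise order (entry) and fall order (withdrawal): the two areas whose signals change first identify the direction. The following sketch is a hypothetical illustration (all function names and timing values are assumed), including a check that the entry and withdrawal directions agree before a gesture is accepted.

```python
# Hypothetical sketch of rules (1)-(8): the entry direction comes from the
# order in which the quadrant signals rise, the withdrawal direction from the
# order in which they fall, and a gesture is accepted only when both agree.

# Which pair of areas leads for each direction (same table for rise and fall).
LEADING_PAIR = {
    ('RA', 'RC'): 'right-to-left',   # rules (1) / (5)
    ('RB', 'RD'): 'left-to-right',   # rules (2) / (6)
    ('RA', 'RB'): 'top-to-bottom',   # rules (3) / (7)
    ('RC', 'RD'): 'bottom-to-top',   # rules (4) / (8)
}

def direction_from_order(event_times):
    """event_times: dict area -> time of the rise (or fall) event. The two
    areas with the earliest events form the leading pair."""
    first_two = tuple(sorted(sorted(event_times, key=event_times.get)[:2]))
    return LEADING_PAIR.get(first_two)

def classify_gesture(rise_times, fall_times):
    entry = direction_from_order(rise_times)
    withdrawal = direction_from_order(fall_times)
    # None signals a rejected (possibly erroneous) gesture.
    return entry if entry is not None and entry == withdrawal else None

# Right-to-left swipe: RA/RC rise first and also fall first.
rises = {'RA': 0.0, 'RC': 0.0, 'RB': 0.1, 'RD': 0.1}
falls = {'RA': 0.3, 'RC': 0.3, 'RB': 0.4, 'RD': 0.4}
print(classify_gesture(rises, falls))  # -> right-to-left
```

A mismatched pair of directions yields `None`, which a caller could also route to separate handling rather than simply discarding, as the description suggests.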
In step S106, the processor 121 determines whether the entry direction and the withdrawal direction of the hand HD match. If they do not match, there is a risk that the gesture operation has been detected erroneously, so in this example no gesture operation is recognized and the flow proceeds to step S123. A mismatch between the entry and withdrawal directions may instead be detected and used for some other control. The time from entry to withdrawal of the hand HD may also be measured, and the gesture judged valid only if the motion completes within a predetermined time. For example, if no withdrawal signal is output within one second or more of the entry signal, the gesture may be ignored or judged to be a different gesture pattern.
On the other hand, when it is determined in step S106 that the entry and withdrawal directions of the hand HD match, the processor 121 determines in the following step S107 whether a built-in lock-mode timer is running. The lock-mode timer runs for a specified time greater than 0 seconds (for example, 0.5 seconds) from start to stop, and the lock mode is set while this timer is running. Since the movement speed and behavior of the hand HD differ from person to person, adjusting the specified time for which the lock mode is set allows optimization for each user. If the specified time is too long, a user who wants to scroll in the reverse direction must wait correspondingly long, so at most 3 seconds, and preferably 1 second or less, is desirable. This specified time can be adjusted to the user's preference, which makes for a user-friendly HMD.
When the hand HD is detected for the first time, the timer is not yet running (lock mode not set), so in step S108 the processor 121 resets and starts the timer to begin timing, thereby setting the lock mode.
In step S109, the processor 121 further determines the gesture operation. Specifically, when the processor 121 determines that gesture operations (1) and (5) were performed in succession, in step S110 it controls the display unit 104 via the display control unit 104DR so that the screen moves from right to left (page feed or scroll). When it determines that gesture operations (2) and (6) were performed in succession, in step S111 it controls the display unit 104 via the display control unit 104DR so that the screen moves from left to right. When it determines that gesture operations (3) and (7) were performed in succession, in step S112 it controls the display unit 104 via the display control unit 104DR so that the screen moves from top to bottom. When it determines that gesture operations (4) and (8) were performed in succession, in step S113 it controls the display unit 104 via the display control unit 104DR so that the screen moves from bottom to top. Thus, according to the present embodiment, the direction in which the user's hand approaches or leaves the detection area is determined from the timing at which the plurality of light receiving areas detect the hand, and the screen display is controlled according to the determined direction, increasing the variety of control available from the proximity sensor's signals.
The user US can watch, through the image display unit 104B, the hand HD entering and leaving in front of the eyes while the image is paged or scrolled in synchronization, in the same direction as the hand movement, and can thus perform the desired operation intuitively, giving excellent usability.
Further, in the next step S114, the processor 121 stores the detected movement direction of the hand HD (the direction of passage through the detection area SA) as the first movement direction in the RAM 126 or in a register area within the processor 121, and then moves the flow to step S123.
On the other hand, when the processor 121 determines in step S107 that the timer is running, the lock mode is already set, so the flow proceeds to step S115, where it determines whether the movement direction of the hand HD detected immediately before step S107 differs from the first movement direction.
When it determines that the detected movement direction of the hand HD differs from the first movement direction, the processor 121 further determines in step S116 whether the detected movement direction crosses the first movement direction. When it determines that the detected movement direction does not cross the first movement direction (that is, the hand moved in the opposite direction), the processor 121 moves the flow to step S123. In other words, while the lock mode is set, even if the hand HD is moved in the direction opposite to the first movement direction, the processor 121 skips the screen display control corresponding to that different direction and does not page or scroll the image in the reverse direction. Screen display behavior contrary to the user's intention, such as flipping back and forth between adjacent pages or scrolling back and forth, can thus be avoided. In addition, while the lock mode is set (while the timer is running), an icon IND (see FIG. 13) indicating that the lock mode is set can be displayed in the image display unit 104B, enhancing convenience for the user US. It is desirable for the arrow in the icon IND to point in the first movement direction, so that the user can recognize that gestures in other directions will be ignored.
When it is determined in step S116 that the detected movement direction of the hand HD crosses the first movement direction, the processor 121 determines that the user wishes to cancel the lock mode, and in step S117 stops the timer and cancels the lock mode. That is, the user can cancel the lock mode at any time while it is set by moving the hand HD in a direction crossing the first movement direction, enabling a quick mode release. The flow then moves to step S123, and if the gesture detection control processing continues, the next detected movement direction of the hand HD becomes the new first movement direction (a No determination in step S107).
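The lock-mode behavior of steps S107 to S117 can be summarized as a small state sketch. This is a hypothetical illustration (the class and method names are assumed; the 0.5-second window is the example value given in the description): the first swipe sets the direction and starts the timer, same-direction swipes act and restart the timer, opposite swipes are skipped, and a crossing swipe releases the lock.

```python
# Hypothetical sketch of the lock mode of steps S107-S117.
LOCK_WINDOW_S = 0.5   # example specified time from the description

OPPOSITE = {'left': 'right', 'right': 'left', 'up': 'down', 'down': 'up'}

class GestureLock:
    def __init__(self):
        self.first_dir = None    # stored "first movement direction"
        self.expires_at = None   # time when the lock-mode timer stops

    def handle(self, direction, now):
        locked = self.expires_at is not None and now < self.expires_at
        if not locked or direction == self.first_dir:
            # (Re)start the timer and perform the swipe's screen action.
            self.first_dir = direction
            self.expires_at = now + LOCK_WINDOW_S
            return f'scroll {direction}'
        if direction == OPPOSITE[self.first_dir]:
            return 'skipped'          # reverse stroke ignored while locked
        self.first_dir = None         # crossing swipe releases the lock
        self.expires_at = None
        return 'lock released'

lock = GestureLock()
print(lock.handle('left', 0.0))   # scroll left (lock set)
print(lock.handle('right', 0.2))  # skipped (return stroke)
print(lock.handle('left', 0.4))   # scroll left (lock extended)
print(lock.handle('up', 0.6))     # crossing swipe -> lock released
```

After the release, the next swipe in any direction is treated as a fresh first movement direction, matching the No branch of step S107.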
On the other hand, when the processor 121 determines in step S115 that the detected movement direction of the hand HD is the same as the first movement direction, it first checks the count value of the timer in step S118. From the timer's count value, the movement speed of the hand HD can be determined, and hence the speed of the reciprocating motion when the hand HD is moved back and forth. Based on the speed of the reciprocating motion, the processor 121 then determines whether the amount of screen movement per operation (for example, the page-feed amount or scroll amount) needs to be adjusted for screen control such as page feeding or scrolling performed according to the direction of passage of the hand HD (step S119). For example, two thresholds for the movement speed of the hand HD may be set in advance and the speed compared against them to decide whether adjustment is needed, or the current speed may be compared with the previous movement speed of the hand HD. If adjustment is needed, the movement amount is adjusted in step S120. For example, when the speed of the reciprocating motion of the hand HD is at or above a first threshold, the page-feed or scroll amount is made larger than the standard value, and when it is below a second threshold, the amount is made smaller than the standard value. Alternatively, when the change from the previous movement speed is positive, the page-feed or scroll amount is increased according to the amount of change, and when it is negative, the amount is decreased accordingly. In step S121, the processor 121 then pages or scrolls the image in the first movement direction. As is clear from the above, while the lock mode is set, the user can, as shown in FIG. 13, simply wave the hand HD left and right through the detection area SA to repeatedly page or scroll the image only in the direction in which the hand first passed, so display control matching the intention of the user US can be performed at high speed. That is, when the user US wants to page or scroll continuously in one direction, it suffices to move the hand HD back and forth in the same and the opposite direction; no gestures with an excessive stroke are required, reducing the burden on the user. Moreover, waving the hand or fingers faster increases the number of pages advanced or the scroll width per operation, further improving operability.
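The two-threshold adjustment of steps S118 to S120 might look like the following sketch. The threshold values, the scaling factors, and the function name are assumptions for illustration; the description leaves the concrete values open.

```python
# Hypothetical sketch of the speed-dependent adjustment of steps S118-S120:
# the elapsed time between two same-direction passes (the timer count) gives
# the reciprocation speed, and the per-operation page-feed amount is scaled
# against two assumed thresholds.

STANDARD_PAGES = 1       # standard page-feed amount per pass
FAST_THRESHOLD = 2.0     # passes/s at or above which the amount grows (assumed)
SLOW_THRESHOLD = 0.8     # passes/s below which the amount shrinks (assumed)

def pages_per_pass(interval_s):
    """interval_s: time between two same-direction passes of the hand."""
    speed = 1.0 / interval_s          # reciprocation speed in passes per second
    if speed >= FAST_THRESHOLD:
        return STANDARD_PAGES * 2     # fast waving: larger than the standard value
    if speed < SLOW_THRESHOLD:
        return max(1, STANDARD_PAGES // 2)  # slow waving, never below one page
    return STANDARD_PAGES

print(pages_per_pass(0.3))   # fast waving
print(pages_per_pass(0.7))   # moderate waving
print(pages_per_pass(1.5))   # slow waving
```

The alternative scheme in the description, scaling by the change relative to the previous pass rather than by absolute thresholds, would replace the two constants with the previously measured interval.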
In step S122, the processor 121 then resets the timer and restarts timing, re-setting the lock mode until the next specified time expires. As a result, the image can be paged or scrolled repeatedly for as long as the user wishes. To prevent malfunction, however, an upper limit on the number of repetitions may be set, with screen display control stopped when it is exceeded.
In step S123, unless a signal instructing the end of gesture detection has been input, the processor 121 returns the flow to step S101 and continues the control. On the other hand, when it determines in step S123 that the user US has input a signal instructing the end of gesture detection, the processor 121 ends the screen display control processing based on gesture operations.
Since gesture operations are likely to be performed in succession, the system normally waits for the next gesture to be detected; however, when the idle time is long, for example because the user US forgot to perform the stop operation, gesture detection may be stopped to reduce power consumption. The lock mode can also be switched on and off by operating the operation unit 122. When the lock mode is switched on, it can be set according to the flowchart of FIG. 12; when it is switched off, the timer is forcibly kept inactive so that the lock mode is never set.
 プロセッサ121が行うジェスチャー操作の検出制御処理を、図14に示す模式的なタイムチャートを用いて具体的に説明する。図14(a)に示すように、ユーザーが手を左右に振った場合(図13参照)、センサ105は、図14(b)に示すような、交互にオンオフを繰り返す信号を出力する。ここでは、手を検出しないときはセンサ105の信号値は低くなり、手を検出すると信号値が増大する。 The gesture operation detection control process performed by the processor 121 will be described concretely with reference to the schematic time chart shown in FIG. 14. As shown in FIG. 14(a), when the user waves a hand from side to side (see FIG. 13), the sensor 105 outputs a signal that alternately turns on and off, as shown in FIG. 14(b). Here, the signal value of the sensor 105 is low when no hand is detected and increases when a hand is detected.
 図14(c)に、ロックモードのオン設定時の画面表示制御を示す。当初のホーム画面から、センサ105が右から左への手の移動を検出した時刻T1で、タイマを動作させてロックモードが設定されると共に、ホーム画面を左方へと移動させて次の画面(1番目の画面)を表示する。 FIG. 14(c) shows the screen display control when the lock mode is set to on. Starting from the initial home screen, at time T1 when the sensor 105 detects a hand movement from right to left, the timer is started to establish the lock mode, and the home screen is slid to the left so that the next screen (the first screen) is displayed.
 次いで、センサ105が左から右への手の移動を検出した時刻T2では、ロックモードが設定されているので、1番目の画面が表示されたままである。更にセンサ105が右から左への手の移動を検出した時刻T3で、規定時間を経過する前であるから、タイマをリセットしてロックモードを延長すると共に、1番目の画面を左方へと移動させて次の画面(2番目の画面)を表示する。次いで、センサ105が左から右への手の移動を検出した時刻T4では、ロックモードが設定されているので、2番目の画面が表示されたままである。以降、タイムアウトになるまで、画面送りが一方向に連続的に行われる。 Next, at time T2 when the sensor 105 detects a hand movement from left to right, the first screen remains displayed because the lock mode is set. Further, at time T3 when the sensor 105 detects a hand movement from right to left, since the specified time has not yet elapsed, the timer is reset to extend the lock mode, and the first screen is slid to the left so that the next screen (the second screen) is displayed. Then, at time T4 when the sensor 105 detects a hand movement from left to right, the second screen remains displayed because the lock mode is set. Thereafter, the screens advance continuously in one direction until the timer expires.
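The lock-mode behavior traced in this time chart (T1–T4, together with the timer reset of step S122 and the optional repetition cap) can be sketched as a small controller. This is an illustrative sketch rather than the patent's implementation; the class name, timeout value, and repetition cap are assumptions.

```python
import time

class LockModeController:
    """Illustrative sketch of the lock-mode control of FIG. 14(c).

    While the lock-mode timer is running, only swipes in the first
    detected direction advance the screen; opposite-direction swipes
    (the hand returning) are ignored, and each accepted swipe resets
    the timer, extending the lock (cf. step S122).
    """

    def __init__(self, timeout_s=1.0, max_repeats=50, clock=time.monotonic):
        self.timeout_s = timeout_s      # 規定時間 (the specified time)
        self.max_repeats = max_repeats  # optional cap to guard against runaway repeats
        self.clock = clock              # injectable clock, eases testing
        self.locked_direction = None
        self.deadline = None
        self.repeats = 0

    def on_swipe(self, direction):
        """Return the direction to advance the screen in, or None to skip."""
        now = self.clock()
        if self.locked_direction is None or now >= self.deadline:
            # No lock, or it timed out: this swipe fixes the first direction.
            self.locked_direction = direction
            self.deadline = now + self.timeout_s
            self.repeats = 1
            return direction
        if direction == self.locked_direction:
            if self.repeats >= self.max_repeats:
                return None             # stop to prevent malfunction
            self.deadline = now + self.timeout_s  # reset timer, extend lock
            self.repeats += 1
            return direction
        return None                     # different direction: skipped while locked
```

Feeding the controller the T1–T4 sequence of the time chart (left, right, left, right) yields left, None, left, None: the returning hand never moves the screen backwards, matching FIG. 14(c).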
 図14(d)に、ロックモードのオフ設定時の画面表示制御を示す。当初のホーム画面から、センサ105が右から左への手の移動を検出した時刻T1で、ホーム画面を左方へと移動させて次の画面(1番目の画面)を表示する。次に、センサ105が左から右への手の移動を検出した時刻T2で、1番目の画面を右方へと移動させて元のホーム画面を表示する。更に、ホーム画面から、センサ105が右から左への手の移動を検出した時刻T3で、ホーム画面を左方へと移動させて次の画面(1番目の画面)を表示する。次に、センサ105が左から右への手の移動を検出した時刻T4で、1番目の画面を右方へと移動させて元のホーム画面を表示する。つまり、ロックモードのオフ設定では、手の動きに合わせて、2つの画面が交互に表示されることとなる。 Fig. 14 (d) shows the screen display control when the lock mode is set to OFF. From the initial home screen, at time T1 when the sensor 105 detects the movement of the hand from right to left, the home screen is moved leftward to display the next screen (first screen). Next, at time T2 when the sensor 105 detects the movement of the hand from left to right, the first screen is moved rightward to display the original home screen. Further, at time T3 when the sensor 105 detects the movement of the hand from right to left from the home screen, the home screen is moved to the left to display the next screen (first screen). Next, at time T4 when the sensor 105 detects the movement of the hand from left to right, the first screen is moved rightward to display the original home screen. That is, when the lock mode is set to OFF, two screens are alternately displayed in accordance with the movement of the hand.
 ロックモードのオフ設定で、画像の頁送りやスクロールを一方向に繰り返して行いたい場合には、図15に示すように、手HDを,センサ105の検出領域SAを通過させる時には一方向に移動させ、戻すときは検出領域SAの外を通過するように、つまりユーザーの目前で手を回すようにジェスチャー操作する。 When it is desired to repeat page turning or scrolling in one direction with the lock mode set to off, as shown in FIG. 15, the hand HD is moved in one direction when passing through the detection area SA of the sensor 105 and is brought back outside the detection area SA, that is, the gesture is performed by circling the hand in front of the user.
 図16は、ジェスチャー操作によって切り変わる画像の一例を示す図である。各画面間の遷移は、手の上下左右への手の移動をスワイプ操作として認識し、これに合わせてスライドさせて切り替わるようにすることができる。ここで、手HDを上下に移動させる際は、ユーザーの手のひらと肘を水平に保持したまま移動させるか、肘を支点にして手のひらを上下に移動させることが誤検出を回避する上で好ましい。以下の動作は、近接センサ105の信号に応じて、プロセッサ121が制御を司るものとする。HMD100の電源オン時には、ディスプレイユニット104にホーム画面G11を表示する。図16に示すホーム画面G11では、画面上部に現在の日時、気温、湿度などが表示される。ここで、下から上へと手HDを移動させてジェスチャー操作を行うと、画面表示を行わない非表示モードに入り、ディスプレイユニット104は非表示画面G01へと切り変わる。かかる状態では、ユーザーUSは、ディスプレイユニット104を通して外界像のみを観察できる。この画面表示(つまり、非表示モードの実行中)から、上から下へと手HDを移動させてジェスチャー操作を行うと、非表示モードが解除され、ディスプレイユニット104はホーム画面G11に切り変わる。 FIG. 16 is a diagram illustrating an example of the images switched by gesture operations. For the transitions between screens, hand movements up, down, left, and right can be recognized as swipe operations, and the screens can be switched by sliding accordingly. Here, when moving the hand HD up and down, it is preferable, in order to avoid erroneous detection, to move it while keeping the palm and elbow horizontal, or to move the palm up and down with the elbow as a fulcrum. The following operations are assumed to be controlled by the processor 121 in accordance with the signal from the proximity sensor 105. When the HMD 100 is powered on, the home screen G11 is displayed on the display unit 104. On the home screen G11 shown in FIG. 16, the current date and time, temperature, humidity, and the like are displayed at the top of the screen. Here, when a gesture operation is performed by moving the hand HD from bottom to top, a non-display mode in which no screen is displayed is entered, and the display unit 104 switches to the non-display screen G01. In this state, the user US can observe only the external image through the display unit 104. From this screen display (that is, while the non-display mode is being executed), when a gesture operation is performed by moving the hand HD from top to bottom, the non-display mode is canceled and the display unit 104 switches to the home screen G11.
 又、ホーム画面G11の表示から、右から左へと手HDを移動させてジェスチャー操作を行うと、音楽再生モードに移行すると共に、ディスプレイユニット104に曲名表示画面G10を表示する。曲名表示画面G10では、画面上部に、スピーカまたはイヤホンより流れている曲目,作者などが表示されると共に、画面左にボリュームコントロールVOLが表示される。曲名表示画面G10が表示されている状態で、ユーザーUSが下から上へと手HDを移動させる度に、再生される曲のボリュームが一目盛ずつ増大し、逆に上から下へと手HDを移動させる度に、ボリュームが一目盛ずつ減少し、それと共に表示が変化する。つまり、ボリュームコントロールの画像が表示されている間に、検知領域に手が接近又は離間すると、その方向を判別して、ボリューム画像に対応付けられた機能(ボリューム)が調整され、それとともに表示が更新される。この画面表示から、左から右へと手HDを移動させてジェスチャー操作を行うと、ディスプレイユニット104はホーム画面G11に切り変わる。 Further, when a gesture operation is performed from the display of the home screen G11 by moving the hand HD from right to left, the device shifts to the music playback mode and the song title display screen G10 is displayed on the display unit 104. On the song title display screen G10, the title and artist of the song playing from the speaker or earphone are displayed at the top of the screen, and the volume control VOL is displayed on the left side of the screen. While the song title display screen G10 is displayed, each time the user US moves the hand HD from bottom to top, the playback volume of the song increases by one step; conversely, each time the hand HD is moved from top to bottom, the volume decreases by one step, and the display changes accordingly. In other words, while the volume control image is displayed, when the hand approaches or moves away within the detection region, the direction is determined, the function (volume) associated with the volume image is adjusted, and the display is updated along with it. From this screen display, when a gesture operation is performed by moving the hand HD from left to right, the display unit 104 switches to the home screen G11.
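The one-step volume adjustment on the song title display screen G10 amounts to incrementing or decrementing a clamped value according to the detected movement direction. A minimal sketch, in which the function name, step size, and 0–10 scale are illustrative assumptions rather than details from the patent:

```python
def adjust_volume(volume, direction, step=1, lo=0, hi=10):
    """Sketch of the one-notch volume adjustment on screen G10.

    A bottom-to-top swipe ("up") raises the volume one step and a
    top-to-bottom swipe ("down") lowers it; the result is clamped to
    the range of the on-screen volume control VOL.
    """
    if direction == "up":
        volume += step
    elif direction == "down":
        volume -= step
    return max(lo, min(hi, volume))   # clamp to the displayed scale
```

The clamping mirrors the on-screen gauge: repeated swipes past either end of the scale leave the volume pinned there rather than wrapping around.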
 又、ホーム画面G11の表示から、左から右へと手HDを移動させてジェスチャー操作を行うと、撮像モードに移行すると共に、ディスプレイユニット104に撮像画角表示画面G12を表示する。ユーザーUSは、撮像画角表示画面G12に表示された矩形枠内の被写体を、カメラ106により静止画又は動画にて撮像できることが分かる。画面の切り替わりに合わせて撮像開始/終了してもよいし、それを起点にして待機時間が経過した後に撮像開始/終了してもよい。операция部の操作や音声で撮像開始/終了してもよい。この画面表示から、右から左へと手HDを移動させてジェスチャー操作を行うと、ディスプレイユニット104はホーム画面G11に切り変わる。 Further, when a gesture operation is performed from the display of the home screen G11 by moving the hand HD from left to right, the device shifts to the imaging mode and the imaging angle-of-view display screen G12 is displayed on the display unit 104. The user US can thus see that the subject within the rectangular frame displayed on the imaging angle-of-view display screen G12 can be captured by the camera 106 as a still image or a moving image. Imaging may start or stop in step with the screen switching, or may start or stop after a standby time has elapsed from that point. Imaging may also be started or stopped by operating the operation unit or by voice. From this screen display, when a gesture operation is performed by moving the hand HD from right to left, the display unit 104 switches to the home screen G11.
 又、ホーム画面G11の表示から、上から下へと手HDを移動させてジェスチャー操作を行うと、コンテンツの閲覧表示画面に切り替わるとともに、コンテンツの特定頁の画像G21nを表示する。表示画面G21nの表示から、右から左へと手HDを移動させてジェスチャー操作を行う毎に、ディスプレイは頁をデクリメントし一つ前の頁の画像G21n-1に切り替わり、左から右へと手HDを移動させてジェスチャー操作を行う毎に、一つ後の頁の画像G21n+1に切り替わる。表示画面G21から、上から下へと手HDを移動させてジェスチャー操作を行うと、設定モードに移行すると共に、ディスプレイユニット104に設定表示画面G31を表示する。設定表示画面G31の表示から、右から左へと手HDを移動させてジェスチャー操作を行うと、ディスプレイユニット104は別の設定表示画面G30に切り替わり、或いは左から右へと手HDを移動させてジェスチャー操作を行うと、ディスプレイユニット104は別の設定表示画面G32に切り変わり、それぞれ別の設定を,例えば操作部122を利用して行うことができる。設定表示画面G31の表示から、下から上へと手HDを移動させてジェスチャー操作を行うと、ディスプレイユニット104は閲覧画面G21を経てホーム画面G11に切り変わる。上述したロックモードを、例えば、設定画面G30から設定画面G32へと画面を連続的に遷移させる場合や、表示画面G21で頁送りを行う場合に用いるのが有効である。尚、複数の選択項目を含む一表示画面(メニュー画面)から、候補の項目を選択するための縦方向や横方向への移動を、ジェスチャー操作の検出を用いて行っても良い。頁送りの量やメニュー画面における項目の移動量を、上述したように手HDの往復速度に応じて変化させてもよい。 Further, when a gesture operation is performed from the display of the home screen G11 by moving the hand HD from top to bottom, the display switches to the content browsing screen and an image G21n of a specific page of the content is displayed. From the display screen G21n, each time a gesture operation is performed by moving the hand HD from right to left, the display decrements the page and switches to the image G21n-1 of the previous page, and each time a gesture operation is performed by moving the hand HD from left to right, it switches to the image G21n+1 of the next page. When a gesture operation is performed from the display screen G21 by moving the hand HD from top to bottom, the device shifts to the setting mode and the setting display screen G31 is displayed on the display unit 104. From the display of the setting display screen G31, when a gesture operation is performed by moving the hand HD from right to left, the display unit 104 switches to another setting display screen G30; when a gesture operation is performed by moving the hand HD from left to right, the display unit 104 switches to yet another setting display screen G32; on each, a different setting can be made using, for example, the operation unit 122. When a gesture operation is performed from the setting display screen G31 by moving the hand HD from bottom to top, the display unit 104 switches to the home screen G11 via the browsing screen G21. It is effective to use the lock mode described above when, for example, the screens are transitioned continuously from the setting screen G30 to the setting screen G32, or when pages are turned on the display screen G21. In addition, the vertical or horizontal movement for selecting a candidate item on a single display screen (menu screen) containing a plurality of selection items may also be performed using gesture-operation detection. The amount of page feed, or the amount by which items move on the menu screen, may be varied according to the reciprocation speed of the hand HD, as described above.
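The screen transitions of FIG. 16 described above form a simple mapping from (current screen, swipe direction) to the next screen. The sketch below encodes them as a lookup table, where "left" denotes a right-to-left hand movement, "up" a bottom-to-top movement, and so on; the dict-based representation and function name are our own illustration, not the patent's implementation, and the page screens G21n are collapsed into a single entry "G21".

```python
# Illustrative transition table for the screens of FIG. 16.
# Key: (current screen, direction of hand movement); value: next screen.
TRANSITIONS = {
    ("G11", "up"): "G01",     # home -> non-display mode
    ("G01", "down"): "G11",   # cancel non-display mode
    ("G11", "left"): "G10",   # home -> music playback (song title screen)
    ("G10", "right"): "G11",  # back to home
    ("G11", "right"): "G12",  # home -> imaging mode (angle-of-view screen)
    ("G12", "left"): "G11",   # back to home
    ("G11", "down"): "G21",   # home -> content browsing
    ("G21", "down"): "G31",   # browsing -> settings
    ("G31", "left"): "G30",   # another settings screen
    ("G31", "right"): "G32",  # yet another settings screen
    ("G31", "up"): "G11",     # settings -> home (via browsing screen G21)
}

def next_screen(screen, direction):
    """Return the screen reached from `screen` by the given swipe,
    or stay on the current screen if no transition is defined."""
    return TRANSITIONS.get((screen, direction), screen)
```

A table like this keeps the gesture handler a single lookup, so adding or substituting screens (as the description permits) only changes data, not control flow.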
 ユーザーUSは、ディスプレイユニット104の表示画面を見ながら、その前方で手HDを動かすことで、画面操作ができるため、操作パネルを持つ装置のように操作部を見るための視点移動は不要となり、動作と操作が連動するように見せることができる。よって、使い勝手のよいユーザーインタフェースを提供できる。しかも、カメラによってジェスチャー認識するのとは異なり、認識される領域を正確に把握することができる。 Since the user US can operate the screen by moving the hand HD in front of the display unit 104 while viewing its display screen, there is no need to move the viewpoint to look at an operation unit as with a device having an operation panel, and the operation can appear directly linked to the hand movement. An easy-to-use user interface can therefore be provided. Moreover, unlike gesture recognition with a camera, the user can grasp the recognized region accurately.
 なお、各画面の表示内容はあくまでも一例であって、他の機能に関連づけられた画像を含む様々な画面から任意のものを追加したり差し替えたりしても構わない。また、手の移動方向に応じてどの画面に切り替えるか、切り替えるときのアニメーションをどのように設定するかなども、ユーザーの好みに応じて適宜設定すればよい。 Note that the display content of each screen is merely an example; arbitrary screens may be added or substituted from among various screens including images associated with other functions. Also, which screen to switch to according to the direction of hand movement, how to configure the animation used when switching, and so on may be set as appropriate according to the user's preference.
 又、本実施形態においては、近接センサとしてパッシブ型センサの一種である焦電センサを用いたが、これに代えて、赤外光等の不可視光を照射する発光部と、発光部から照射され、物体によって反射された不可視光を検出する受光部とを有するアクティブ型のセンサを用いてもよい。アクティブ型のセンサを用いれば、ユーザーが手袋をしているような場合でも、ジェスチャー操作の検出を行えるというメリットがある。 In this embodiment, a pyroelectric sensor, which is a kind of passive sensor, is used as the proximity sensor; instead, an active sensor may be used that has a light-emitting unit emitting invisible light such as infrared light and a light-receiving unit detecting the invisible light emitted from the light-emitting unit and reflected by an object. Using an active sensor has the advantage that gesture operations can be detected even when the user is wearing gloves.
(第2の実施形態)
 図17は、第2の実施形態にかかるスマートフォン200を正面から見た図であり、ユーザーの手HDと共に示している。スマートフォン200は、上述した実施形態と同様な構成(図6参照)を持ち、ここでは同じ機能を有する構成について同じ符号を付して説明を省略するが、スマートフォン200では、検出装置として近接センサの代わりにカメラ106を用いている。但し、上述したHMD100と同様に近接センサを備えても良い。
(Second Embodiment)
FIG. 17 is a front view of the smartphone 200 according to the second embodiment, shown together with the user's hand HD. The smartphone 200 has a configuration similar to that of the embodiment described above (see FIG. 6); components having the same functions are given the same reference numerals and their description is omitted. The smartphone 200, however, uses the camera 106 as its detection device instead of a proximity sensor. A proximity sensor may nevertheless be provided, as in the HMD 100 described above.
 ここでは、撮像装置であるカメラ106の撮像範囲が、検出領域SAを構成するものとする。ジェスチャー操作に基づく画面表示処理を実行する場合、内蔵されたプロセッサがカメラ106によって撮影された画像情報であるビデオ画像の各フレームを、予め記憶しているユーザーの手HDの形状や色の特徴データと比較し、手HDがカメラ106のフレーム内に入ったかどうかを判断する。手HDが進入したことを認識した後は、プロセッサは、その手HDの特徴と特徴点を追跡対象として記憶する。次のフレームでも同じ特徴の手HDが存在する場合は、プロセッサは、その移動距離、方向と速度を計測する。 Here, it is assumed that the imaging range of the camera 106, which is the imaging device, constitutes the detection area SA. When executing screen display processing based on gesture operations, the built-in processor compares each frame of the video image, which is the image information captured by the camera 106, with pre-stored feature data on the shape and color of the user's hand HD, and determines whether the hand HD has entered the frame of the camera 106. After recognizing that the hand HD has entered, the processor stores the features and feature points of the hand HD as a tracking target. If a hand HD with the same features is also present in the next frame, the processor measures its movement distance, direction, and speed.
 例えば、ユーザーが画面の右側から左側へ一方向に手HDを移動させた場合、プロセッサが、手HDの移動が規定された最低の移動距離(例えば、画面の半分の距離程度)を超え、且つ規定の速度範囲内で移動したと判断すれば、手HDが検出領域を通過したと判断し、ロックモード用のタイマが作動中でない限り、ロックモードを設定して、かかる方向を最初の移動方向として記憶すると共に、最初の移動方向に画面のスクロールや画面遷移等を行わせる。このとき、必ずしも手HDはカメラ106のフレームの枠外の右から左へ通過させる必要はない。 For example, when the user moves the hand HD in one direction from the right side to the left side of the screen, if the processor determines that the hand HD has moved beyond a specified minimum movement distance (for example, about half the width of the screen) and within a specified speed range, it determines that the hand HD has passed through the detection region; unless the lock-mode timer is already running, it sets the lock mode, stores this direction as the first movement direction, and causes the screen to scroll or transition in the first movement direction. At this time, the hand HD does not necessarily have to pass from right to left beyond the borders of the camera 106's frame.
 よって、ビデオ画像のフレーム内で最初の移動方向を確認した後は、上述した実施形態と同様に、最初の移動方向と逆方向に手HDを移動したことを、カメラ106のビデオ画像を通してプロセッサが検出した場合でも、逆方向に画面のスクロールや画面遷移等を行わせることはない。更に、最初の移動方向と逆方向に手HDを移動したことを、カメラ106のビデオ画像を通してプロセッサが検出した場合には、最初の移動方向に画面のスクロールや画面遷移等を行わせることができる。 Therefore, after the first movement direction has been confirmed within the frames of the video image, as in the embodiment described above, even if the processor detects through the video image of the camera 106 that the hand HD has moved in the direction opposite to the first movement direction, the screen is not scrolled or transitioned in the reverse direction. Furthermore, when the processor detects through the video image of the camera 106 that the hand HD has moved in the direction opposite to the first movement direction, it can cause the screen to scroll or transition in the first movement direction.
 但し、手の移動速度が速すぎる場合は、フレームの画像の中で確実に手を認識できなくなり、遅い場合は別の動作とみなすことができるため、速度範囲を規定することが望ましい。手の認識率は、カメラのフレームレートにも依るが、およそ15fps以上のレートが推奨される。 However, if the moving speed of the hand is too fast, it will be impossible to recognize the hand reliably in the frame image, and if it is slow, it can be regarded as another action, so it is desirable to define the speed range. Although the hand recognition rate depends on the frame rate of the camera, a rate of about 15 fps or more is recommended.
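The acceptance test described here, a minimum movement distance of about half the screen combined with a bounded speed range, can be sketched as follows, assuming the processor has tracked the hand's horizontal position in successive video frames. All threshold values and names are illustrative assumptions, not figures from the patent; speeds are expressed in frame-widths per second.

```python
def classify_swipe(positions, timestamps, frame_width,
                   min_fraction=0.5, min_speed=0.2, max_speed=5.0):
    """Sketch of the camera-based swipe check of the second embodiment.

    `positions` are the tracked hand's x coordinates (pixels) in
    successive video frames; `timestamps` are the frame times in seconds.
    The swipe is accepted only if the hand covered at least
    `min_fraction` of the frame width (the minimum movement distance,
    e.g. about half the screen) at a speed inside the allowed range:
    too fast and the hand cannot be recognized reliably in each frame,
    too slow and the motion is treated as some other action.
    """
    if len(positions) < 2:
        return None
    dx = positions[-1] - positions[0]
    dt = timestamps[-1] - timestamps[0]
    if dt <= 0:
        return None
    distance = abs(dx) / frame_width   # normalized travel across the frame
    speed = distance / dt              # frame-widths per second
    if distance < min_fraction or not (min_speed <= speed <= max_speed):
        return None                    # too short, too slow, or too fast
    return "left" if dx < 0 else "right"
```

With four samples at roughly 15 fps, half a frame of travel lands comfortably inside the speed window, which is one reason a frame rate of about 15 fps or more helps the tracker.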
 本発明は、本明細書に記載の実施形態に限定されるものではなく、他の実施形態・変形例を含むことは、本明細書に記載された実施形態や技術思想から本分野の当業者にとって明らかである。本明細書の記載及び実施形態は、あくまでも例証を目的としており、本発明の範囲は後述するクレームによって示されている。 The present invention is not limited to the embodiments described in this specification; it is obvious to those skilled in the art, from the embodiments and technical ideas described herein, that other embodiments and modifications are also included. The description and embodiments herein are for illustrative purposes only, and the scope of the present invention is indicated by the claims below.
100      HMD
101      フレーム
101a     前方部
101b     側部
101c     側部
101d     長孔
101e     長孔
102      眼鏡レンズ
103      主本体部
104      ディスプレイ
104A     画像形成部
104B     画像表示部
104DR    表示制御部
104a     光源
104b     一方向拡散板
104c     集光レンズ
104d     表示素子
104f     接眼プリズム
104g     偏向プリズム
104h     ホログラム光学素子
104i     画面
105      近接センサ
105a     受光部
106      カメラ
106a     レンズ
107      右副本体部
107a     突起
108      左副本体部
108a     突起
109      加速度センサ
110      ジャイロ
111A     スピーカ(又はイヤホン)
111B     マイク
121      プロセッサ
122      操作部
123      GPS受信部
124      通信部
125      ROM
126      RAM
127      電池
200      スマートフォン
CD       コード
CTU      制御ユニット
HD       手
HS       配線
IND      アイコン
PL1      基端面
PL2      内側面
PL3      外側面
PL4      傾斜面
PL4,PL5  傾斜面
RA-RD    受光領域
SA       検出領域
EV       有効視野
US       ユーザー
100 HMD
101 Frame
101a Front part
101b Side part
101c Side part
101d Long hole
101e Long hole
102 Eyeglass lens
103 Main body part
104 Display
104A Image forming unit
104B Image display unit
104DR Display control unit
104a Light source
104b Unidirectional diffuser
104c Condensing lens
104d Display element
104f Eyepiece prism
104g Deflection prism
104h Hologram optical element
104i Screen
105 Proximity sensor
105a Light receiving unit
106 Camera
106a Lens
107 Right sub-body part
107a Protrusion
108 Left sub-body part
108a Protrusion
109 Acceleration sensor
110 Gyro
111A Speaker (or earphone)
111B Microphone
121 Processor
122 Operation unit
123 GPS receiving unit
124 Communication unit
125 ROM
126 RAM
127 Battery
200 Smartphone
CD Cord
CTU Control unit
HD Hand
HS Wiring
IND Icon
PL1 Base end surface
PL2 Inner surface
PL3 Outer surface
PL4, PL5 Inclined surfaces
RA-RD Light receiving regions
SA Detection area
EV Effective field of view
US User

Claims (14)

  1.  画像を表示できる画面を備えた表示部材を有するディスプレイユニットと、
     検出領域を持ち、ユーザーにより動かされる指示部が前記検出領域を移動する移動方向を検出して出力を生成する検出装置と、
     前記検出装置の出力に基づいて、前記ディスプレイユニットの画面表示を制御する制御装置と、を備え、
     前記制御装置は、前記検出装置の出力に基づいて前記指示部が前記検出領域を移動したことを検出したときに、前記指示部の移動方向を最初の移動方向として、前記最初の移動方向に応じた画面表示制御を行い、
     更に前記最初の移動方向の検出時よりタイマによる計時を開始し、前記計時を開始してから規定時間が経過するまでの前記タイマの動作中に、少なくとも1回、前記指示部が前記最初の移動方向と異なる移動方向に移動したことを示す前記検出装置の出力が生成されても、前記異なる移動方向に応じた画面表示制御をスキップする表示装置。
    A display device comprising:
    a display unit having a display member with a screen capable of displaying an image;
    a detection device that has a detection region and generates an output by detecting a movement direction in which an instruction unit moved by a user moves through the detection region; and
    a control device that controls the screen display of the display unit based on the output of the detection device,
    wherein, when the control device detects based on the output of the detection device that the instruction unit has moved through the detection region, the control device performs screen display control corresponding to the movement direction of the instruction unit as a first movement direction, further starts timing with a timer upon detection of the first movement direction, and skips screen display control corresponding to a movement direction different from the first movement direction even if, at least once while the timer is running from the start of timing until a specified time elapses, an output of the detection device indicating that the instruction unit has moved in the different movement direction is generated.
  2.  前記制御装置は、前記タイマの動作中に、前記指示部が前記最初の移動方向に移動したことを示す前記検出装置の出力が生成されたときは、前記指示部が移動した前記最初の移動方向に応じた画面表示制御を行う請求項1に記載の表示装置。 The display device according to claim 1, wherein, when an output of the detection device indicating that the instruction unit has moved in the first movement direction is generated during operation of the timer, the control device performs screen display control corresponding to the first movement direction in which the instruction unit has moved.
  3.  前記制御装置は、前記タイマの動作中に、前記指示部が前記最初の移動方向と同じ方向に移動したことを示す前記検出手段の出力が生成されたときは、前記タイマの計時結果をリセットすると共に、再度前記タイマを動作させ、前記タイマの動作中に、前記指示部が前記最初の移動方向と異なる移動方向に移動したことを示す前記検出装置の出力が生成されても、前記異なる移動方向に応じた画面表示制御をスキップする請求項1又は2に記載の表示装置。 The display device according to claim 1 or 2, wherein, when an output of the detection means indicating that the instruction unit has moved in the same direction as the first movement direction is generated during operation of the timer, the control device resets the timing result of the timer and operates the timer again, and skips screen display control corresponding to a movement direction different from the first movement direction even if an output of the detection device indicating that the instruction unit has moved in the different movement direction is generated during operation of the timer.
  4.  前記制御装置は、前記タイマが動作中であることを示す表示を前記ディスプレイユニットに行わせる請求項1~3のいずれかに記載の表示装置。 The display device according to any one of claims 1 to 3, wherein the control device causes the display unit to display that the timer is operating.
  5.  前記指示部の移動と前記ディスプレイユニットの画面の移動が対応付けられており、
     前記最初の移動方向の検出時から前記指示部が前記最初の移動方向と同じ移動方向に移動したことを示す前記検出装置の出力が生成されるまでの時間に応じて、前記指示部の前記検出領域内での移動と対応した操作1回当たりの画面の移動量を調整する請求項1~4のいずれかに記載の表示装置。
    The display device according to any one of claims 1 to 4, wherein the movement of the instruction unit is associated with the movement of the screen of the display unit, and the amount of screen movement per operation corresponding to the movement of the instruction unit within the detection region is adjusted according to the time from the detection of the first movement direction until an output of the detection device indicating that the instruction unit has moved in the same movement direction as the first movement direction is generated.
  6.  前記制御装置は、前記最初の移動方向の検出時から前記指示部が前記最初の移動方向と同じ移動方向に移動したことを示す前記検出装置の出力が生成されるまでの時間と、予め定められた閾値とを比較し、その比較結果に基づいて、前記操作1回当たりの画面の移動量を調整する請求項5に記載の表示装置。 The display device according to claim 5, wherein the control device compares a predetermined threshold with the time from the detection of the first movement direction until an output of the detection device indicating that the instruction unit has moved in the same movement direction as the first movement direction is generated, and adjusts the amount of screen movement per operation based on the comparison result.
  7.  前記検出装置は、前記検出領域内において互いに交差する複数の方向に前記指示部が移動することを検出でき、前記制御装置は、前記タイマの動作中に、前記指示部が前記最初の移動方向と交差する方向に移動したことを表す前記検出装置の出力が生成された場合は前記タイマによる計時を中止し、前記タイマによる計時の中止後に前記指示部が移動したことを示す前記検出手段の出力が生成されたときは前記指示部の移動に応じた前記画面制御を行う請求項1~6のいずれかに記載の表示装置。 The display device according to any one of claims 1 to 6, wherein the detection device can detect movement of the instruction unit in a plurality of mutually intersecting directions within the detection region, and the control device stops the timing by the timer when an output of the detection device indicating that the instruction unit has moved in a direction intersecting the first movement direction is generated during operation of the timer, and performs the screen control corresponding to the movement of the instruction unit when an output of the detection means indicating that the instruction unit has moved is generated after the timing by the timer has been stopped.
  8.  前記検出装置は、前記指示部を撮像して前記制御装置に画像情報を出力する撮像装置である請求項1~7のいずれかに記載の表示装置。 The display device according to any one of claims 1 to 7, wherein the detection device is an imaging device that images the instruction unit and outputs image information to the control device.
  9.  前記検出装置は、前記指示部から放射又は前記指示部によって反射される検出光を検出して、前記制御装置に信号を出力する受光部を有する請求項1~7のいずれかに記載の表示装置。 The display device according to any one of claims 1 to 7, wherein the detection device includes a light receiving unit that detects detection light radiated from the instruction unit or reflected by the instruction unit and outputs a signal to the control device.
  10.  前記規定時間は調整可能である請求項1~9のいずれかに記載の表示装置。 10. The display device according to claim 1, wherein the specified time is adjustable.
  11.  前記ユーザーの頭部に装着され、前記ユーザーの少なくとも一方の眼前に、前記表示部材が位置するように前記ディスプレイユニットを保持する装着部材を有し、前記検出装置は前記装着部材に保持されている請求項1~10のいずれかに記載の表示装置。 The display device according to any one of claims 1 to 10, further comprising a mounting member that is mounted on the user's head and holds the display unit so that the display member is positioned in front of at least one eye of the user, wherein the detection device is held by the mounting member.
  12.  前記ディスプレイユニットの表示部材はシースルー型である請求項11に記載の表示装置。 The display device according to claim 11, wherein the display member of the display unit is a see-through type.
  13.  画像を表示できる画面を備えた表示部材を有するディスプレイユニットと、検出領域を持ち、ユーザーにより動かされる指示部が前記検出領域を移動する移動方向を検出して出力を生成する検出装置と、を備えた表示装置において、前記検出装置の出力に基づいて、前記ディスプレイユニットの画面表示を制御する表示装置の制御方法であって、
     前記検出装置の出力に基づいて前記指示部が前記検出領域を移動したことを検出したときに、前記指示部の移動方向を最初の移動方向として、前記最初の移動方向に応じた画面表示制御を行い、
     更に前記最初の移動方向の検出時より計時を開始してから規定時間が経過するまでの間に、少なくとも1回、前記指示部が前記最初の移動方向と異なる移動方向に移動したことを示す前記検出装置の出力が生成されても、前記異なる移動方向に応じた画面表示制御をスキップする、表示装置の制御方法。
    A method for controlling a display device that comprises a display unit having a display member with a screen capable of displaying an image and a detection device that has a detection region and generates an output by detecting a movement direction in which an instruction unit moved by a user moves through the detection region, the method controlling the screen display of the display unit based on the output of the detection device, the method comprising:
    performing, when it is detected based on the output of the detection device that the instruction unit has moved through the detection region, screen display control corresponding to the movement direction of the instruction unit as a first movement direction; and
    skipping screen display control corresponding to a movement direction different from the first movement direction even if, at least once from the start of timing upon detection of the first movement direction until a specified time elapses, an output of the detection device indicating that the instruction unit has moved in the different movement direction is generated.
  14.  画像を表示できる画面を備えた表示部材を有するディスプレイユニットと、検出領域を持ち、ユーザーにより動かされる指示部が前記検出領域を移動する移動方向を検出して出力を生成する検出装置と、前記検出装置の出力に基づいて、前記ディスプレイユニットの画面表示を制御する制御装置と、を備えた表示装置の制御プログラムであって、
     前記制御装置に、
     前記検出装置の出力に基づいて前記指示部が前記検出領域を移動したことを検出したときに、前記指示部の移動方向を最初の移動方向として、前記最初の移動方向に応じた画面表示制御を行わせ、
     更に前記最初の移動方向の検出時より計時を開始してから規定時間が経過するまでの間に、少なくとも1回、前記指示部が前記最初の移動方向と異なる移動方向に移動したことを示す前記検出装置の出力が生成されても、前記異なる移動方向に応じた画面表示制御をスキップさせる、表示装置の制御プログラム。
    A control program for a display device that comprises a display unit having a display member with a screen capable of displaying an image, a detection device that has a detection region and generates an output by detecting a movement direction in which an instruction unit moved by a user moves through the detection region, and a control device that controls the screen display of the display unit based on the output of the detection device,
    the program causing the control device to:
    perform, when it is detected based on the output of the detection device that the instruction unit has moved through the detection region, screen display control corresponding to the movement direction of the instruction unit as a first movement direction; and
    skip screen display control corresponding to a movement direction different from the first movement direction even if, at least once from the start of timing upon detection of the first movement direction until a specified time elapses, an output of the detection device indicating that the instruction unit has moved in the different movement direction is generated.
PCT/JP2015/079776 2014-11-04 2015-10-22 Display device, method for controlling display device, and control program therefor WO2016072271A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014224075 2014-11-04
JP2014-224075 2014-11-04

Publications (1)

Publication Number Publication Date
WO2016072271A1 true WO2016072271A1 (en) 2016-05-12

Family

ID=55908998

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/079776 WO2016072271A1 (en) 2014-11-04 2015-10-22 Display device, method for controlling display device, and control program therefor

Country Status (1)

Country Link
WO (1) WO2016072271A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6186488B1 (en) * 2016-12-09 2017-08-23 京セラ株式会社 Electronic device, control method and program
JP2018018290A (en) * 2016-07-27 2018-02-01 京セラ株式会社 Electronic device, control method and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010129069A (en) * 2008-12-01 2010-06-10 Fujitsu Ten Ltd Display device
JP2012168890A (en) * 2011-02-16 2012-09-06 Ntt Docomo Inc Display device, communication device, and program
JP2013178785A (en) * 2013-03-13 2013-09-09 Mitsubishi Electric Corp Map information processing device
JP2013235443A (en) * 2012-05-09 2013-11-21 Konica Minolta Inc Image forming apparatus, flick operation receiving method, and flick operation receiving program
JP2014186401A (en) * 2013-03-21 2014-10-02 Sharp Corp Information display device


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018018290A (en) * 2016-07-27 2018-02-01 Kyocera Corporation Electronic device, control method and program
US10606361B2 2016-07-27 2020-03-31 Kyocera Corporation Electronic device, control method, and non-transitory computer-readable recording medium
JP6186488B1 (en) * 2016-12-09 2017-08-23 Kyocera Corporation Electronic device, control method and program

Similar Documents

Publication Publication Date Title
JP6617974B2 (en) Electronic device, method for controlling electronic device, and control program therefor
JP5957875B2 (en) Head mounted display
US9753518B2 (en) Electronic apparatus and display control method
TWI498771B (en) Gesture recognition system and glasses with gesture recognition function
US10133407B2 (en) Display apparatus, display system, method for controlling display apparatus, and program
EP2579145A2 (en) Accessory to improve user experience with an electronic display
JP2013125247A (en) Head-mounted display and information display apparatus
JP6398870B2 (en) Wearable electronic device and gesture detection method for wearable electronic device
US8599171B2 (en) Optical position detecting device and display device with position detecting function
JP6116336B2 (en) CAMERA DEVICE AND WIRELESS COMMUNICATION TERMINAL HAVING THE SAME
CN105786163A (en) Display processing method and display processing device
TW201510772A (en) Gesture determination method and electronic device
WO2016052061A1 (en) Head-mounted display
WO2016072271A1 (en) Display device, method for controlling display device, and control program therefor
US11287945B2 (en) Systems and methods for gesture input
JP2006155244A (en) Information display device
KR102245374B1 (en) Wearable device and its control method
JP2012194626A (en) Display device
WO2017065051A1 (en) Input device, electronic apparatus, electronic apparatus input method, and input program for input device, electronic apparatus, and electronic apparatus input method
WO2017094557A1 (en) Electronic device and head-mounted display
WO2017065050A1 (en) Input device, electronic apparatus, electronic apparatus input method, and input program for input device, electronic apparatus, and electronic apparatus input method
JP6754329B2 (en) Image display device, head-mounted display, information display device, display processing method and program
TWI492099B (en) Glasses with gesture recognition function
WO2017104525A1 (en) Input device, electronic device, and head-mounted display
JP2017004532A (en) Head-mounted display and information display

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 15857270
    Country of ref document: EP
    Kind code of ref document: A1

NENP Non-entry into the national phase
    Ref country code: DE

122 Ep: PCT application non-entry in European phase
    Ref document number: 15857270
    Country of ref document: EP
    Kind code of ref document: A1

NENP Non-entry into the national phase
    Ref country code: JP