WO2016190057A1 - Wearable electronic device and gesture detection method for wearable electronic device


Info

Publication number
WO2016190057A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
processing unit
movement
wearable electronic
electronic device
Prior art date
Application number
PCT/JP2016/063617
Other languages
French (fr)
Japanese (ja)
Inventor
卓 天野
軌行 石井
Original Assignee
コニカミノルタ株式会社 (Konica Minolta, Inc.)
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Publication of WO2016190057A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The present invention relates to a wearable electronic device that can be worn on the body, and particularly to a wearable electronic device that can recognize a predetermined gesture.
  • The present invention also relates to a gesture detection method for such a wearable electronic device.
  • The mobile computing device disclosed in Patent Document 1 includes a sensor system comprising an infrared (IR) light emitting diode (LED) and an IR proximity sensor and configured to acquire data related to a three-dimensional user movement, and a sensor controller module communicatively coupled to the sensor system. The sensor controller module is configured to identify a property of the device that is indicative of the clarity of the data related to the three-dimensional user movement acquired by the sensor system and of the probability of correctly identifying the input gesture of the three-dimensional user movement, and to regulate the power consumption of at least one of the IR LED and the IR proximity sensor of the sensor system based on the identified device property.
  • However, the sensor system disclosed in Patent Document 1 assumes that the IR proximity sensor is stationary, and the IR proximity sensor acquires data related to a three-dimensional user movement, that is, a gesture, relative to the IR proximity sensor. For this reason, if the IR proximity sensor itself moves, it is difficult for the IR proximity sensor to detect a gesture.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide a wearable electronic device capable of detecting a gesture even when the wearing site moves, and a gesture detection method for the wearable electronic device.
  • In the wearable electronic device and the gesture detection method therefor according to the present invention, a first movement of a mounting member mounted on a first part of a living body is measured, and a second movement of a second part of the living body different from the first part, or of an indicator provided on the second part, is measured by a unit mounted on the mounting member. Based on the first and second measurement results, the gesture by the second movement of the second part or the indicator is recognized. Therefore, the wearable electronic device and the gesture detection method according to the present invention can detect a gesture even if the wearing site moves.
  • The wearable electronic device in the present embodiment is an electronic device that can be worn and can detect a predetermined gesture, and includes a mounting member for mounting on a predetermined first part of a living body, a first motion measuring unit for measuring a first movement of the mounting member, and a second motion measuring unit, mounted on the mounting member, for measuring a second movement of a second part of the living body different from the first part, or of an indicator provided on the second part and used for performing a gesture.
  • Such a wearable electronic device may be an electronic device for any purpose, but here, as an example, the case where the first part is the head, the mounting member is a member for mounting on the head (head mounting member), and the device is, more specifically, a so-called head mounted display (HMD) will be described below.
  • FIG. 1 is a perspective view showing a structural configuration of a wearable electronic device according to an embodiment.
  • FIG. 2 is a front view illustrating a structural configuration of the wearable electronic device according to the embodiment.
  • FIG. 3 is a top view illustrating a structural configuration of the wearable electronic device according to the embodiment.
  • FIG. 4 is a schematic cross-sectional view illustrating a configuration of a display unit in the wearable electronic device of the embodiment.
  • FIG. 5 is a block diagram illustrating an electrical configuration of the wearable electronic device according to the embodiment.
  • FIG. 6 is a diagram illustrating a configuration of a proximity sensor as an example of a second motion measurement unit in the wearable electronic device of the embodiment.
  • In the following description, the right side and the left side of the HMD 100 refer to the right side and the left side for the user wearing the HMD 100. An XYZ orthogonal coordinate system is set in which the direction from the right side to the left side is the X direction, the direction extending rearward is the Y direction, and the vertical direction is the Z direction; the X axis, Y axis, Z axis, X direction, Y direction, and Z direction are used as appropriate below.
  • the HMD 100 includes a frame 101 that is an example of a head mounting member to be mounted on the head.
  • a frame 101 that is substantially U-shaped when viewed from above includes a front part 101a to which two spectacle lenses 102 are attached, and side parts 101b and 101c that extend rearward (Y direction) from both ends of the front part 101a.
  • the two spectacle lenses 102 attached to the frame 101 may or may not have refractive power (optical power, reciprocal of focal length).
  • The cylindrical main body 103 is fixed to the front part 101a of the frame 101 above the right eyeglass lens 102 (it may be on the left side according to the user's dominant eye or the like).
  • the main body 103 is provided with a display unit 104.
  • Inside the main body 103, a display control unit 104DR (see FIG. 5 described later) that controls the display of the display unit 104 based on instructions from the control processing unit 121 described later is disposed. Note that a display unit may be disposed in front of both eyes as necessary.
  • the display unit 104 includes an image forming unit 104A and an image display unit 104B.
  • the image forming unit 104A is incorporated in the main body unit 103, and includes a light source 104a, a one-way diffusing plate 104b, a condenser lens 104c, and a display element 104d.
  • The image display unit 104B, which is a so-called see-through display member, includes an eyepiece prism 104f, a deflecting prism 104g, and a hologram optical element 104h, and is disposed as a whole in the form of a plate extending downward from the main body unit 103 so as to be parallel to one eyeglass lens 102 (see FIG. 1).
  • the light source 104a has a function of illuminating the display element 104d.
  • The light source 104a is composed of an RGB integrated light emitting diode (LED) that emits light in three wavelength bands whose peak wavelengths and half widths of light intensity are 462 ± 12 nm (blue light (B light)), 525 ± 17 nm (green light (G light)), and 635 ± 11 nm (red light (R light)).
  • The display element 104d displays an image by modulating the light emitted from the light source 104a in accordance with image data, and is configured by a transmissive liquid crystal display element having, in a matrix, pixels that serve as light transmitting regions. Note that the display element 104d may be of a reflective type.
  • The eyepiece prism 104f totally reflects the image light from the display element 104d, which enters through the base end face PL1, between the opposed parallel inner side face PL2 and outer side face PL3, and guides it to the user's pupil via the hologram optical element 104h; at the same time, it transmits external light and guides it to the user's pupil. Together with the deflecting prism 104g, it is formed of, for example, an acrylic resin.
  • the eyepiece prism 104f and the deflection prism 104g are joined by an adhesive with the hologram optical element 104h sandwiched between inclined surfaces PL4 and PL5 inclined with respect to the inner surface PL2 and the outer surface PL3.
  • The deflecting prism 104g is joined to the eyepiece prism 104f to form a substantially parallel flat plate integrated with the eyepiece prism 104f, which is disposed so as to overlap the spectacle lens 102 (see FIG. 1).
  • The hologram optical element 104h is a volume-phase reflection hologram that diffracts and reflects the image light (light with wavelengths corresponding to the three primary colors) emitted from the display element 104d, guides it to the pupil B, and enlarges the image displayed on the display element 104d so that it is guided to the user's pupil as an enlarged virtual image.
  • The hologram optical element 104h diffracts (reflects) light in, for example, three wavelength ranges of 465 ± 5 nm (B light), 521 ± 5 nm (G light), and 634 ± 5 nm (R light), expressed as the peak wavelength of the diffraction efficiency and the wavelength width at half maximum of the diffraction efficiency. Here, the peak wavelength of the diffraction efficiency is the wavelength at which the diffraction efficiency reaches its peak, and the wavelength width at half maximum of the diffraction efficiency is the wavelength width over which the diffraction efficiency is at least half of its peak.
  • In the display unit 104 having such a configuration, light emitted from the light source 104a is diffused by the one-way diffusing plate 104b, condensed by the condenser lens 104c, and made incident on the display element 104d.
  • the light incident on the display element 104d is modulated for each pixel based on the image data input from the display control unit 104DR, and is emitted as image light. Thereby, a color image is displayed on the display element 104d.
  • the image light from the display element 104d enters the eyepiece prism 104f from its base end face PL1, is totally reflected a plurality of times by the inner side face PL2 and the outer side face PL3, and enters the hologram optical element 104h.
  • the light incident on the hologram optical element 104h is reflected there, passes through the inner side surface PL2, and reaches the pupil B.
  • the user can observe an enlarged virtual image of the image displayed on the display element 104d, and can visually recognize it as a screen formed on the image display unit 104B.
  • the eyepiece prism 104f, the deflecting prism 104g, and the hologram optical element 104h transmit almost all of the external light, the user can observe the external image (real image) through these. Therefore, the virtual image of the image displayed on the display element 104d is observed so as to overlap with a part of the external image. In this way, the user of the HMD 100 can simultaneously observe the image provided from the display element 104d and the external image via the hologram optical element 104h.
  • When no image is displayed, the image display unit 104B is transparent, and the user can observe only the external image.
  • In the present embodiment, the display unit 104 is configured by combining a light source, a liquid crystal display element, and an optical system, but a self-luminous display element (for example, an organic EL display element) may be used instead. Further, a transmissive organic EL display panel having transparency in a non-light-emitting state may be used.
  • On the front of the main body 103, a second motion measuring unit 105 (for example, a proximity sensor 105a), the lens 106a of a camera 106, and an illuminance sensor 107 disposed between the second motion measuring unit 105 and the lens 106a are provided so as to face forward. Accordingly, in the example shown in FIGS. 1 to 3, the measurement direction of the second motion measuring unit 105, the optical axis of the camera 106, and the measurement direction of the illuminance sensor 107 are the same. Here, saying that these three directions are the same includes not only the case where the central axis of the detection range of the proximity sensor 105a, the optical axis of the camera 106, and the central axis of the measurement range of the illuminance sensor 107 are parallel, but also the case where the three are arranged in such a positional relationship that, even when the three axes slightly intersect, outputs with tendencies similar to those obtained when the three axes are parallel are obtained from the three.
  • the right sub-body portion 108-R is attached to the right side portion 101b of the frame 101
  • the left sub-body portion 108-L is attached to the left side portion 101c of the frame 101.
  • the right sub-main body portion 108-R and the left sub-main body portion 108-L have an elongated plate shape, and have elongated projections 108a-R and 108a-L, respectively, inside.
  • By engaging the elongated protrusion 108a-R with the elongated hole 101d in the side portion 101b of the frame 101, the right sub-main body portion 108-R is attached to the frame 101 in a positioned state; likewise, by engaging the elongated protrusion 108a-L with the elongated hole 101e in the side portion 101c of the frame 101, the left sub-main body portion 108-L is attached to the frame 101 in a positioned state.
  • a geomagnetic sensor 109 (see FIG. 5) and a first motion measuring unit (for example, a gyroscope and an acceleration sensor) 110 (see FIG. 5) are arranged in the right sub-main body portion 108-R.
  • a speaker (or earphone) 111A and a microphone 111B are arranged in the left sub-main body portion 108-L.
  • The main body 103 and the right sub-main body 108-R are connected via a wiring HS so as to be able to transmit signals, and the main body 103 and the left sub-main body 108-L are connected via a wiring (not shown) so as to be able to transmit signals.
  • the right sub-main body 108-R is connected to the control unit CTU via a cord CD extending from the rear end thereof.
  • the HMD 100 may be configured to be operated by voice based on an output signal generated from the microphone 111B according to the input voice.
  • The main body 103 and the left sub-main body 108-L may be configured to be connected wirelessly.
  • As shown in FIG. 5, the HMD 100 includes a control unit CTU, a camera 106, a geomagnetic sensor 109, a first motion measuring unit 110, a second motion measuring unit 105, a microphone 111B, an illuminance sensor 107, an image forming unit 104A, a display control unit 104DR, and a speaker 111A.
  • the control unit CTU includes a control processing unit 121, an operation unit 122, a GPS reception unit 123, a communication unit 124, a storage unit 125, a battery 126, and a power supply circuit 127.
  • the camera 106 is an apparatus that is connected to the control processing unit 121 and generates an image of a subject under the control of the control processing unit 121.
  • The camera 106 includes, for example, an imaging optical system that forms an optical image of a subject on a predetermined image forming surface, an image sensor whose light receiving surface coincides with the image forming surface and which converts the optical image into an electric signal, and a digital signal processor (DSP) that performs known image processing on the output of the image sensor to generate an image (image data).
  • the imaging optical system includes one or more lenses, and includes the lens 106a as one of them.
  • the camera 106 outputs the generated image data to the control processing unit 121.
  • the geomagnetic sensor 109 is a circuit that is connected to the control processing unit 121 and measures the front ( ⁇ Y direction) orientation in the HMD 100 by measuring the earth magnetism. The geomagnetic sensor 109 outputs the measured orientation to the control processing unit 121.
  • the first motion measuring unit 110 is connected to the control processing unit 121 and is a device for measuring the motion of the frame 101 which is an example of the head-mounted member.
  • the first motion measuring unit 110 is mounted on the frame 101.
  • the first motion measurement unit 110 outputs the measurement result to the control processing unit 121. More specifically, the first motion measurement unit 110 is, for example, a gyroscope and an acceleration sensor 110a, and is disposed in the right sub-main body unit 108-R as described above.
  • The gyro and acceleration sensor 110a is a circuit that measures, according to the posture of the frame 101, the roll angular velocity around the X axis, the pitch angular velocity around the Y axis, the yaw angular velocity around the Z axis, and the accelerations in the X, Y, and Z directions.
  • The gyro and acceleration sensor 110a outputs the measured angular velocities and accelerations to the control processing unit 121. Note that the gyro and acceleration sensor 110a may be a six-axis sensor in which these are integrated.
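  • For illustration, the kind of sample the first motion measuring unit delivers can be modeled as follows; this is a minimal Python sketch, and the type and field names are hypothetical (the patent does not specify a data format):

```python
from dataclasses import dataclass

@dataclass
class ImuSample:
    """One sample from the gyro and acceleration sensor 110a,
    expressed in the frame's XYZ coordinate system."""
    roll_rate: float   # angular velocity around the X axis [rad/s]
    pitch_rate: float  # angular velocity around the Y axis [rad/s]
    yaw_rate: float    # angular velocity around the Z axis [rad/s]
    accel_x: float     # acceleration in the X direction [m/s^2]
    accel_y: float     # acceleration in the Y direction [m/s^2]
    accel_z: float     # acceleration in the Z direction [m/s^2]
```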
  • The second motion measuring unit 105 is connected to the control processing unit 121 and is a device for measuring a second movement of a second part of the living body different from the first part, or of an indicator provided on the second part for performing a gesture.
  • In the present embodiment, the first part is the head of the living body, and the second part of the living body is a part different from the head, for example, a hand or fingers. The indicator provided on the second part of the living body is, for example, a rod-shaped member (for example, a pen or a pointing rod) gripped by the user's hand, or a member attached to the user's finger or arm with a mounting member.
  • the second motion measuring unit 105 further detects the presence / absence of the second part.
  • the second motion measuring unit 105 is mounted on the frame 101 which is an example of the head mounting member.
  • the second motion measuring unit 105 outputs the measurement result to the control processing unit 121. More specifically, the second motion measuring unit 105 is, for example, a proximity sensor 105a and is mounted on the main body 103 fixed to the front part 101a of the frame 101 as described above.
  • Here, the “proximity sensor” outputs a signal by detecting whether or not an object, for example, a part of a human body (such as a hand or a finger), exists within a detection area in a proximity range in front of the detection surface of the proximity sensor, in order to detect that the object is close to the user's eyes.
  • the proximity range may be set as appropriate according to the operator's characteristics and preferences. For example, the proximity range from the detection surface of the proximity sensor may be within a range of 200 mm. If the distance from the proximity sensor is within 200 mm, the user can easily put the palm and fingers into and out of the user's field of view with the arm bent, so that the user can easily operate with gestures using the hands and fingers. In addition, the possibility of erroneous detection of a human body or furniture other than the user is reduced.
  • A passive proximity sensor has a detection unit that detects the invisible light or electromagnetic waves emitted from an object when the object approaches. Examples of passive proximity sensors include a pyroelectric sensor that detects invisible light, such as infrared rays, emitted from an approaching human body, and a capacitance sensor that detects a change in capacitance between itself and the approaching human body.
  • An active proximity sensor includes a projection unit that projects invisible light or sound waves and a detection unit that receives the invisible light or sound waves reflected back from the object. Active proximity sensors include infrared sensors that project infrared rays and receive the infrared rays reflected by an object, laser sensors that project laser light and receive the laser light reflected by an object, and ultrasonic sensors that project ultrasonic waves and receive the ultrasonic waves reflected by an object. Note that a passive proximity sensor does not need to project energy toward an object and therefore excels in low power consumption, while an active proximity sensor can more easily improve detection reliability; for example, it can detect the movement of a user's hand even when the user wears a glove that blocks the detection light, such as infrared light, emitted from the human body. A plurality of types of proximity sensors may be used in combination.
  • In the present embodiment, the proximity sensor 105a includes four pyroelectric elements RA, RB, RC, and RD arranged in two rows and two columns; it receives invisible light, such as infrared light emitted from the human body, as detection light, and each of the pyroelectric elements RA to RD outputs a corresponding signal.
  • the outputs of the pyroelectric elements RA to RD vary in intensity according to the distance from the light receiving surface of the proximity sensor 105a to the object, and the intensity increases as the distance decreases.
  • the proximity sensor 105a outputs each output of the pyroelectric elements RA to RD to the control processing unit 121.
  • the microphone 111B is a circuit that is connected to the control processing unit 121 and converts acoustic vibration of sound into an electric signal.
  • The microphone 111B outputs the converted electric signal representing the external sound to the control processing unit 121.
  • the illuminance sensor 107 is a circuit that is connected to the control processing unit 121 and measures illuminance.
  • the illuminance sensor 107 outputs the measured illuminance to the control processing unit 121.
  • The illuminance sensor 107 includes, for example, a photodiode that outputs, by photoelectric conversion, a current whose magnitude corresponds to the light intensity of the incident light, and peripheral circuits such as an I-V conversion circuit that converts the current of the photodiode into a voltage.
  • the display control unit 104DR is a circuit that is connected to the control processing unit 121 and causes the image forming unit 104A to form an image by controlling the image forming unit 104A according to the control of the control processing unit 121.
  • the image forming unit 104A is as described above.
  • The speaker 111A is a circuit that is connected to the control processing unit 121 and generates and outputs a sound corresponding to an electric signal representing the sound, according to the control of the control processing unit 121.
  • The operation unit 122 is connected to the control processing unit 121 and is a device for inputting predetermined instructions, such as power on/off, to the HMD 100; it is, for example, one or more switches to which predetermined functions are assigned.
  • The GPS receiving unit 123 is connected to the control processing unit 121, is a device that measures the position of the HMD 100 using a satellite positioning system for measuring the current position on the earth according to the control of the control processing unit 121, and outputs the measurement result (latitude X, longitude Y, altitude Z) to the control processing unit 121. The GPS receiving unit 123 may be a receiver having an error-correction function, such as DGPS (Differential GPS).
  • The communication unit 124 is a circuit that is connected to the control processing unit 121 and inputs and outputs data to and from an external device according to the control of the control processing unit 121; examples include an RS-232C interface circuit for serial communication, an interface circuit using the Bluetooth (registered trademark) standard, an interface circuit performing infrared communication such as the IrDA (Infrared Data Association) standard, and an interface circuit using the USB (Universal Serial Bus) standard.
  • The communication unit 124 may also be a communication card or the like that communicates by wire or wirelessly with an external device such as a server device via a communication network such as Ethernet (registered trademark). Such a communication unit 124 generates a communication signal containing the transfer data input from the control processing unit 121 according to the communication protocol used in the communication network and transmits the generated communication signal to the external device via the communication network. The communication unit 124 also receives a communication signal from an external device via the communication network, extracts the data from the received communication signal, converts the extracted data into a format that the control processing unit 121 can process, and outputs it to the control processing unit 121.
  • In the present embodiment, the communication unit 124 is configured with a Wi-Fi module (communication card) compatible with IEEE 802.11b/g/n, which transmits and receives communication signals according to the Wi-Fi (Wireless Fidelity) standard, one of the wireless LAN standards.
  • the battery 126 is a battery that accumulates electric power and supplies the electric power.
  • the battery 126 may be a primary battery or a secondary battery.
  • the power supply circuit 127 is a circuit that supplies power supplied from the battery 126 to each part of the HMD 100 that requires power at a voltage corresponding to each part.
  • the storage unit 125 is a circuit that is connected to the control processing unit 121 and stores various predetermined programs and various predetermined data under the control of the control processing unit 121.
  • The various predetermined programs include control programs for controlling each part of the HMD 100 according to its function, and control processing programs such as a gesture processing program for recognizing a gesture by the second movement of the second part based on the first and second measurement results measured by the first and second motion measuring units 110 and 105.
  • the various predetermined data includes data necessary for controlling the HMD 100.
  • the storage unit 125 includes, for example, a ROM (Read Only Memory) that is a nonvolatile storage element, an EEPROM (Electrically Erasable Programmable Read Only Memory) that is a rewritable nonvolatile storage element, and the like.
  • the storage unit 125 includes a RAM (Random Access Memory) that serves as a working memory of the control processing unit 121 that stores data generated during the execution of the predetermined program.
  • the control processing unit 121 controls each part of the HMD 100 according to the function of each part, and based on the first and second measurement results measured by the first and second motion measuring units 110 and 105, A gesture based on the second movement is recognized, and processing corresponding to the recognized gesture is executed.
  • the control processing unit 121 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits.
  • In the control processing unit 121, a control processing program is executed, whereby a control unit 1211, a gesture processing unit 1212, and a detection result utilization unit 1213 are functionally configured. Part or all of the control unit 1211, the gesture processing unit 1212, and the detection result utilization unit 1213 may instead be configured by hardware.
  • the control unit 1211 controls each unit of the HMD 100 according to the function of each unit.
  • The gesture processing unit 1212 recognizes a gesture caused by the second movement of the second part based on the first and second measurement results measured by the first and second motion measuring units 110 and 105. More specifically, in the present embodiment, the gesture processing unit 1212 determines a predetermined, preset gesture by the second movement of the hand or fingers based on the first measurement result of the gyro and acceleration sensor 110a and the respective outputs of the plurality of pyroelectric elements RA to RD in the proximity sensor 105a. The gesture processing unit 1212 notifies the detection result utilization unit 1213 of the determination result (detection result).
  • More specifically, the gesture processing unit 1212 removes, from the second measurement result measured by the second motion measuring unit 105 (the proximity sensor 105a in this embodiment), the first movement component due to the first movement, based on the first measurement result measured by the first motion measuring unit 110 (the gyro and acceleration sensor 110a in this embodiment), and recognizes the gesture by the second movement of the second part based on the second measurement result from which the first movement component has been removed.
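  • In outline, this removal amounts to a vector subtraction in the X-Z plane. The following is a minimal Python sketch under that reading; the function name and the convention that the second argument is the apparent velocity the head movement induces at the sensor are assumptions for illustration, not the patent's definition:

```python
def remove_head_component(gesture_velocity, head_induced_velocity):
    """Remove the first-movement (head) component from the velocity
    tentatively observed by the proximity sensor.

    Both arguments are (vx, vz) tuples in the X-Z plane.  The second
    is assumed to be the apparent velocity that the head movement
    adds at the sensor (sign conventions are illustrative).
    """
    return (gesture_velocity[0] - head_induced_velocity[0],
            gesture_velocity[1] - head_induced_velocity[1])
```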
  • The gesture processing unit 1212 starts its operation when the second motion measuring unit 105 detects the presence of the second part. More specifically, in the present embodiment, the gesture processing unit 1212 starts its operation by switching from the sleep state to the active state when the second motion measuring unit 105 detects the presence of the second part. Alternatively, the gesture processing unit 1212 may start its operation by being powered on when the second motion measuring unit 105 detects the presence of the second part from a state in which its power supply is turned off.
  • the gesture processing unit 1212 is mounted not on the frame 101 but on the control unit CTU that is separate from the frame 101.
  • The detection result utilization unit 1213 executes predetermined processing based on the determination result of the gesture processing unit 1212. For example, when the determination result of the gesture processing unit 1212 is a so-called “flick”, the detection result utilization unit 1213 changes the display from the first image formed in the image forming unit 104A to the second image (turns the page) by the display control of the display control unit 104DR. Further, for example, when the determination result of the gesture processing unit 1212 is a so-called “slide”, the detection result utilization unit 1213 changes the display so that the image formed in the image forming unit 104A moves, by the display control of the display control unit 104DR.
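  • The dispatch from a recognized gesture to a display action might look as follows; a rough Python sketch, where `display` and its method names stand in for the display control unit 104DR and are not from the patent:

```python
def on_gesture(gesture, display):
    """Detection-result utilization: map a recognized gesture to a
    display-control action (hypothetical API)."""
    if gesture == "flick":
        display.turn_page()   # change from the first image to the second
    elif gesture == "slide":
        display.move_image()  # move the image formed in 104A
    # other gestures would be handled analogously
```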
  • FIG. 7 is a front view when the wearable electronic device of the embodiment is mounted.
  • FIG. 8 is a side view and a partial top view when the wearable electronic device of the embodiment is mounted; FIG. 8A is a side view, and FIG. 8B is a partial top view. FIG. 8 also shows the hand HD of the user US.
  • FIG. 9 is a diagram illustrating an example of an image visually recognized by the user through the see-through image display unit.
  • FIG. 10 is a diagram illustrating an example of the output of the proximity sensor in the wearable electronic device of the embodiment; FIG. 10A shows the output of the pyroelectric element RA, FIG. 10B the output of the pyroelectric element RB, FIG. 10C the output of the pyroelectric element RC, and FIG. 10D the output of the pyroelectric element RD.
  • the horizontal axis of each figure in FIG. 10 is time, and the vertical axis thereof is intensity (output level).
  • The gesture operation is an operation in which at least the hand HD or a finger of the user US enters or leaves the detection area of the proximity sensor 105a, and it can be detected by the gesture processing unit 1212 of the control processing unit 121 of the HMD 100 via the proximity sensor 105a.
  • the screen 104i of the image display unit 104B is arranged so as to overlap the effective visual field EV of the user's eye facing the image display unit 104B (here, positioned in the effective visual field EV).
  • the detection area SA of the proximity sensor 105a is in the visual field of the user's eye facing the image display unit 104B.
  • the detection area SA is located within the stable field of view of the user's eye or the field inside thereof (within about 90 ° horizontal and within about 70 ° vertical), and more preferably located inside the stable field of view.
  • the proximity sensor 105a may be installed with its arrangement and orientation adjusted so as to overlap with the effective visual field EV or the inner visual field (horizontal within about 30 °, vertical within about 20 °).
  • FIG. 9 shows an example in which the detection area SA overlaps the screen 104i.
  • By setting the detection area SA of the proximity sensor 105a within the visual field of the eye of the user US while the user US wears the frame 101, which is the head mounting member, on the head, the user can observe the hand HD through the screen 104i and reliably see, without moving the eyes, the hand approach and withdraw from the detection area SA of the proximity sensor 105a. Therefore, even while observing the screen, the user can reliably perform a gesture operation while recognizing the detection area SA. If the detection area SA overlaps the screen 104i, the gesture operation can be performed even more reliably.
  • When the proximity sensor 105a includes a plurality of pyroelectric elements RA to RD as in the present embodiment, the entire light receiving area of the plurality of pyroelectric elements RA to RD is regarded as one light receiving unit, and the maximum detection range is regarded as the detection area.
  • When there is nothing in front of the user US while the proximity sensor 105a is operating, the proximity sensor 105a receives no invisible light as detection light, so the control processing unit 121 determines that no gesture is being performed, and the gesture processing unit 1212 is set in the sleep state. On the other hand, when the user US holds the hand HD in front of the HMD 100, the proximity sensor 105a detects the invisible light radiated from the hand HD, and based on this the control processing unit 121 determines that a gesture is being performed and activates the gesture processing unit 1212.
  • In the present embodiment, a gesture is performed with the hand HD of the user US, but the gesture may instead be performed by the user US using an indicator made of a material capable of emitting invisible light.
  • the proximity sensor 105a has four pyroelectric elements RA to RD arranged in two rows and two columns (see FIG. 6). Therefore, when the user US brings the hand HD close to the front of the HMD 100 from either the left, right, up, or down directions, the output timings of signals detected by the pyroelectric elements RA to RD are different.
  • For example, when the user US moves the hand HD from right to left in front of the HMD 100, the invisible light emitted from the hand HD strikes the proximity sensor 105a, and the pyroelectric elements RA and RC receive the invisible light first. Therefore, as shown in FIG. 10, the signals of the pyroelectric elements RA and RC rise first, and the signals of the pyroelectric elements RB and RD rise after a delay; thereafter, the signals of the pyroelectric elements RA and RC fall, and the signals of the pyroelectric elements RB and RD fall after a delay (not shown). By detecting these signal timings, the gesture processing unit 1212 determines that the user US has made a gesture of moving the hand HD from right to left.
  • the gesture processing unit 1212 can determine that the user US has made a gesture by moving the hand HD from left to right.
  • the gesture processing unit 1212 can determine that the user US has made a gesture by moving the hand HD from above to below.
  • the gesture processing unit 1212 can determine that the user US has made a gesture by moving the hand HD from the bottom to the top.
  • Similarly, from the output timings of the signals of the pyroelectric elements RA to RD, the gesture processing unit 1212 can determine gestures in which the user US moves the hand HD from the upper right to the lower left, from the upper left to the lower right, from the lower left to the upper right, and from the lower right to the upper left.
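  • The direction logic reduces to comparing the rise times of the four elements: whichever column rises first gives the horizontal direction, whichever row rises first gives the vertical direction, and a clear lag on both axes indicates a diagonal sweep. A rough Python sketch, assuming the layout of the embodiment (RA/RB in the top row, RA/RC in the column reached first by a right-to-left sweep) and an illustrative tie threshold; the patent's determination also uses fall timings and past processing results:

```python
def classify_direction(rise, eps=0.01):
    """Infer the sweep direction from the rise times (seconds) of the
    four pyroelectric elements, given as a dict with keys RA/RB/RC/RD."""
    col_ac = min(rise["RA"], rise["RC"])  # column hit first in a right-to-left sweep
    col_bd = min(rise["RB"], rise["RD"])
    row_ab = min(rise["RA"], rise["RB"])  # top row
    row_cd = min(rise["RC"], rise["RD"])  # bottom row

    horizontal = "right_to_left" if col_ac < col_bd else "left_to_right"
    vertical = "top_to_bottom" if row_ab < row_cd else "bottom_to_top"

    if abs(col_ac - col_bd) < eps:   # columns tie: purely vertical sweep
        return vertical
    if abs(row_ab - row_cd) < eps:   # rows tie: purely horizontal sweep
        return horizontal
    return vertical + "+" + horizontal   # lag on both axes: diagonal sweep
```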
  • FIG. 11 is a flowchart illustrating gesture detection processing (main routine) in the wearable electronic device according to the embodiment.
  • FIG. 12 is a flowchart illustrating a gesture determination process (subroutine) in the gesture detection process of the wearable electronic device according to the embodiment.
  • According to the basic principle described above, the gesture processing unit 1212 can detect a gesture by the hand or finger, which is an example of the second part. However, since the proximity sensor 105a is disposed on the main body 103 of the frame 101, which is an example of the head mounting member, when the head, which is an example of the first part, moves, it is difficult for the gesture processing unit 1212 to detect the gesture correctly based only on processing according to this basic principle. For example, even when the hand or finger is stationary, if the user swings the head, the gesture processing unit 1212 may erroneously detect a gesture by the hand or finger.
  • the HMD 100 operates as follows.
  • First, the control unit 1211 samples the output of the proximity sensor 105a (each output of the pyroelectric elements RA to RD) at a predetermined sampling interval (for example, 10 ms, 20 ms, or 30 ms) and determines whether the output exceeds a predetermined threshold (first threshold) th1 (S11). The first threshold th1 is a value set to avoid erroneous determination due to noise. The control unit 1211 also samples the outputs of the gyro and acceleration sensor 110a at the timing of sampling the output of the proximity sensor 105a. When the output of the proximity sensor 105a does not exceed the first threshold th1 (No), the control unit 1211 returns the process to process S11 and executes process S11 at the next sampling timing.
  • On the other hand, when the output of the proximity sensor 105a exceeds the first threshold th1 (Yes), the control unit 1211 activates the gesture processing unit 1212 if it is in the sleep state, and the gesture processing unit 1212 executes the gesture determination process and tentatively determines the gesture (S12).
  • the gesture processing unit 1212 first determines the rising timing and falling timing of each output of the plurality of pyroelectric elements RA to RD in the proximity sensor 105a (S21).
  • Next, the gesture processing unit 1212 determines whether or not a gesture can be determined (S22). As described above, the gesture is determined based on the rise and fall timings of the signals output from the pyroelectric elements RA to RD in the proximity sensor 105a, so the gesture cannot be determined from a single execution of the process. Therefore, the gesture processing unit 1212 judges whether or not a gesture can be tentatively determined by combining the current processing result with a plurality of past processing results, such as the result of the previous processing and the one before it.
  • If, as a result of this determination, the gesture cannot be determined (No), the gesture processing unit 1212 acquires the output of the proximity sensor 105a (each output of the pyroelectric elements RA to RD) at the next sampling timing (S23), and returns the process to process S21.
  • On the other hand, if the gesture can be determined (Yes), the gesture processing unit 1212 takes it as the tentatively determined gesture, ends the gesture determination process shown in FIG. 12, and executes process S13 of FIG. 11.
  • In process S13, the gesture processing unit 1212 determines whether or not the output of the gyro and acceleration sensor 110a exceeds a predetermined threshold (second threshold) th2. The second threshold th2 is a value set to avoid erroneous determination due to noise.
  • If the output does not exceed the second threshold th2 (No), the gesture processing unit 1212 finally determines the gesture as the tentatively determined gesture (S15), and then executes process S16.
  • On the other hand, if the output exceeds the second threshold th2 (Yes), the gesture processing unit 1212 corrects the tentatively determined gesture and finally determines the gesture (S14), and then executes process S16.
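  • Putting processes S11 to S16 together, the main routine of FIG. 11 can be condensed as in the following Python sketch; the callables are placeholders for the steps described above, and the S17 presence re-check and sleep transition are omitted for brevity:

```python
from typing import Callable, Mapping

def gesture_main_loop(
    read_proximity: Callable[[], Mapping[str, float]],  # S11: outputs of RA..RD
    read_head_motion: Callable[[], float],  # magnitude of the gyro/accel output
    tentative_gesture: Callable[[], str],   # S12: the FIG. 12 subroutine
    correct: Callable[[str, float], str],   # S14: correction by head motion
    notify: Callable[[str], None],          # S16: detection result utilization
    th1: float,                             # noise threshold on the proximity output
    th2: float,                             # noise threshold on the head motion
) -> None:
    """Condensed sketch of the FIG. 11 main routine (S11 to S16)."""
    while True:
        outputs = read_proximity()          # S11: sample RA..RD
        head = read_head_motion()           # sampled at the same timing
        if max(outputs.values()) <= th1:
            continue                        # below the noise floor: keep sampling

        gesture = tentative_gesture()       # S12: provisional determination
        if head > th2:                      # S13: the head was moving
            gesture = correct(gesture, head)   # S14: correct the tentative gesture
        notify(gesture)                     # S15/S16: finalize and notify
```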
  • More specifically, the gesture processing unit 1212 obtains the velocity vector (speed in the X direction, speed in the Z direction) of the tentatively determined gesture. The outputs of the pyroelectric elements RA to RD are sampled at the predetermined sampling interval as described above, and the size of each pyroelectric element RA to RD (horizontal length (length in the X direction), vertical length (length in the Z direction), and diagonal length) can be measured in advance. Therefore, the sizes of the pyroelectric elements RA to RD are measured in advance and stored in the storage unit 125, and the velocity vector is obtained by dividing the size of the pyroelectric element by the time from the rise timing to the fall timing corresponding to the tentatively determined gesture.
  • For example, when the tentatively determined gesture is a gesture of moving the hand HD from right to left, the gesture processing unit 1212 obtains, based on the sampling interval, the time from the rise timing to the fall timing of the signal in the pyroelectric element RA or the pyroelectric element RC, and divides the horizontal length of the pyroelectric element RA or RC by the obtained time, thereby obtaining the X-direction speed of the tentatively determined gesture. In a gesture of moving the hand HD from right to left, the speed in the Z direction is zero, so the velocity vector of this tentatively determined gesture is obtained as (speed in the X direction, 0).
  • The time from the rise timing to the fall timing may be obtained by multiplying the number of samplings between them by the sampling interval; alternatively, a timer corresponding to each pyroelectric element RA to RD may be functionally provided in the control processing unit 121, restarted at the rise timing, and used to measure the time until the fall timing.
  • More preferably, the gesture processing unit 1212 obtains, based on the sampling interval, a first time from the rise timing to the fall timing of the signal in the pyroelectric element RA or RC and divides the horizontal length of the pyroelectric element RA or RC by the obtained first time to obtain a first speed (first speed in the X direction); it likewise obtains a second time from the rise timing to the fall timing of the signal in the pyroelectric element RB or RD and divides the horizontal length of the pyroelectric element RB or RD by the obtained second time to obtain a second speed (second speed in the X direction); it then takes the average of the first and second speeds as the X-direction speed of the tentatively determined gesture. The gesture processing unit 1212 may also obtain and average the corresponding speed for each of the pyroelectric elements RA to RD and use this average as the X-direction speed of the tentatively determined gesture. The same applies to the cases described below.
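  • As a worked example of the speed estimate with hypothetical numbers (the element widths and timings below are illustrative; the patent stores the measured element sizes in the storage unit 125 in advance):

```python
def x_speed_from_timings(width_m, rise_s, fall_s):
    """Speed across one pyroelectric element in the X direction:
    element width divided by the rise-to-fall time."""
    return width_m / (fall_s - rise_s)

# Hypothetical 10 mm wide elements: RA/RC crossed in 50 ms, RB/RD in 55 ms.
v1 = x_speed_from_timings(0.010, 0.000, 0.050)  # first speed (RA or RC): 0.20 m/s
v2 = x_speed_from_timings(0.010, 0.020, 0.075)  # second speed (RB or RD): ~0.18 m/s
v_x = (v1 + v2) / 2.0                           # averaged X speed: ~0.19 m/s
```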
  • The velocity vector when the tentatively determined gesture is a gesture of moving the hand HD from left to right is obtained in the same manner as the right-to-left case described above, as (speed in the −X direction, 0).
  • For example, when the tentatively determined gesture is a gesture of moving the hand HD from top to bottom, the gesture processing unit 1212 obtains, based on the sampling interval, the time from the rise timing to the fall timing of the signal in the pyroelectric element RA or the pyroelectric element RB, and divides the vertical length of the pyroelectric element RA or RB by the obtained time to obtain the −Z-direction speed of the tentatively determined gesture. In this gesture, the speed in the X direction is zero, so the velocity vector of this tentatively determined gesture is obtained as (0, speed in the −Z direction).
  • The velocity vector when the tentatively determined gesture is a gesture of moving the hand HD from bottom to top is obtained in the same manner as the top-to-bottom case described above, as (0, speed in the Z direction).
  • For example, when the tentatively determined gesture is a gesture of moving the hand HD from the upper right to the lower left, the gesture processing unit 1212 obtains, based on the sampling interval, the time from the rise timing to the fall timing of the signal in the pyroelectric element RA or the pyroelectric element RD, and divides the diagonal length of the pyroelectric element RA or RD by the obtained time to obtain the speed of the tentatively determined gesture in the direction from the upper right to the lower left. By decomposing the obtained speed into its X-direction and Z-direction components based on the aspect ratio of the pyroelectric element RA or RD, the velocity vector of this tentatively determined gesture is obtained as (speed in the X direction, speed in the Z direction).
  • The velocity vectors for the cases where the tentatively determined gesture is a movement of the hand HD from the upper left to the lower right, from the lower left to the upper right, or from the lower right to the upper left are each obtained in the same manner as the upper-right-to-lower-left case described above.
  • the gesture processing unit 1212 obtains a velocity vector in the first movement of the head, which is an example of the first part, from the acceleration in the X direction and the acceleration in the Z direction of the gyroscope and the acceleration sensor 110a.
  • Since a plurality of outputs are acquired from the gyro and acceleration sensor 110a by the time the gesture is tentatively determined, the average of the plurality of outputs may be used as the X-direction and Z-direction accelerations of the gyro and acceleration sensor 110a in the above-described processing, or the output acquired at the center timing among the plurality of outputs arranged in time series may be used.
  • Then, the gesture processing unit 1212 subtracts the velocity vector of the first movement of the head from the velocity vector of the tentatively determined gesture to obtain the velocity vector of the second movement of the hand HD, which is an example of the second part, and finally determines the gesture as the gesture corresponding to the obtained velocity vector. In this way, the tentatively determined gesture is corrected and the gesture is finally determined.
  • For example, when the user US moves the hand HD from right to left while swinging the head from top to bottom, the gesture is tentatively determined in process S12 (the repeated processes S21 to S23) as a movement of the hand HD from the lower right to the upper left. Then, in processes S13 and S14, the velocity vector for this tentatively determined lower-right-to-upper-left movement is obtained, and the velocity vector of the head obtained from the gyro and acceleration sensor 110a is subtracted from it to obtain the velocity vector of the hand HD. In the resulting velocity vector of the hand HD, the velocity component due to the top-to-bottom movement of the head has been removed, and the Z-direction speed becomes substantially zero. Therefore, the tentatively determined lower-right-to-upper-left gesture is corrected to a right-to-left gesture of the hand HD and finally determined.
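  • Continuing the hypothetical sketch given earlier, the correction in this example works out as follows (all numbers illustrative; X increases from right to left, Z upward):

```python
# Apparent lower-right -> upper-left sweep observed by the sensor,
# and the upward apparent component contributed by the downward head swing.
tentative = (0.19, 0.19)
head_induced = (0.0, 0.19)
vx, vz = remove_head_component(tentative, head_induced)
# (vx, vz) == (0.19, 0.0): the Z component vanishes, leaving a pure
# right-to-left gesture, as in the correction described above.
```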
  • step S16 the gesture processing unit 1212 notifies the finally determined gesture to the detection result utilization unit 1213.
  • Next, the control unit 1211 samples the output of the proximity sensor 105a (each output of the pyroelectric elements RA to RD) at the next sampling timing and, as in process S11, determines whether the output of the proximity sensor 105a exceeds the first threshold th1, that is, determines the presence or absence of the second part, in this example the hand HD (S17). If the output does not exceed the first threshold th1 (No), the control unit 1211 sets the gesture processing unit 1212 to the sleep state, returns the process to process S11, and executes process S11 at the next sampling timing. On the other hand, if the output of the proximity sensor 105a exceeds the first threshold th1 (Yes), that is, if the output of any of the pyroelectric elements RA to RD exceeds the first threshold th1, the control unit 1211 returns the process to process S12 and executes the next gesture determination process.
  • Then, the detection result utilization unit 1213 executes a predetermined process based on the determination result of the gesture processing unit 1212; for example, when the determination result of the gesture processing unit 1212 is a so-called “flick”, the detection result utilization unit 1213 changes the display from the first image formed in the image forming unit 104A to the second image (turns the page) by the display control of the display control unit 104DR.
  • As described above, the HMD 100, which is an example of the wearable electronic device in this embodiment, and the gesture detection method implemented in the HMD 100 recognize the gesture by the second movement of the second part by taking into account not only the second measurement result measured by the second motion measuring unit 105 (the proximity sensor 105a in this embodiment) but also the first measurement result measured by the first motion measuring unit 110 (the gyro and acceleration sensor 110a in this embodiment). Therefore, the gesture can be detected even if the wearing site moves.
  • In the present embodiment, the gesture processing unit 1212 does not operate (is stopped) until the second motion measuring unit 105 detects the presence of the second part, so power can be saved.
  • In the present embodiment, the gesture processing unit 1212 is mounted not on the frame 101, which is an example of the head mounting member, but on the control unit CTU, which is separate from the frame 101. Therefore, by mounting only the minimum necessary configuration on the head mounting member, the head mounting member can be reduced in size and weight, and the discomfort and annoyance of wearing the head mounting member can be reduced.
  • As indicated by a broken line in FIG. 5, the HMD 100 may further include, functionally in the control processing unit 121, a motion detection processing unit 1214 that detects a predetermined motion of the living body based on the first measurement result measured by the first motion measuring unit 110, and the gesture processing unit 1212 may recognize the gesture by the second movement of the second part based on the second measurement result measured by the second motion measuring unit 105 when the motion detection processing unit 1214 detects the predetermined motion. According to this configuration, when the living body performs a predetermined motion (for example, walking) other than the movement (for example, swinging) of the first part, the gesture can be recognized without considering the predetermined motion.
  • In this case, a measurement result of the first motion measuring unit 110 (the gyro and acceleration sensor 110a in this embodiment) while the living body performs the predetermined motion, for example walking, is obtained in advance, and the obtained measurement result is stored in advance in the storage unit 125 as a signal pattern corresponding to the predetermined motion.
  • The motion detection processing unit 1214 then determines whether or not the living body is performing the predetermined motion by judging whether the correlation between the first measurement result measured by the first motion measuring unit 110 and the signal pattern stored in the storage unit 125 falls within a predetermined range.
  • When the motion detection processing unit 1214 determines that the living body is performing the predetermined motion, the gesture processing unit 1212 recognizes the gesture by the second movement of the second part (the hand HD in this embodiment) based on the measurement result measured by the second motion measuring unit 105 (the proximity sensor 105a in this embodiment). Note that the motion detection processing unit 1214 may be configured by hardware.
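  • A minimal sketch of this pattern match, assuming the stored walking signature and the live measurement window are equal-length sequences of sensor magnitudes and using a normalized cross-correlation at zero lag; the correlation measure and threshold are illustrative, as the patent only requires that the correlation fall within a predetermined range:

```python
import math

def is_performing(pattern, measurement, threshold=0.8):
    """Return True when the live first-measurement window correlates
    with the stored signal pattern (e.g. walking) above `threshold`."""
    n = len(pattern)
    mean_p = sum(pattern) / n
    mean_m = sum(measurement) / n
    num = sum((p - mean_p) * (m - mean_m)
              for p, m in zip(pattern, measurement))
    den = math.sqrt(sum((p - mean_p) ** 2 for p in pattern)
                    * sum((m - mean_m) ** 2 for m in measurement))
    return den > 0 and num / den >= threshold
```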
  • Further, when the output of the first motion measuring unit 110 (the gyro and acceleration sensor 110a in this embodiment) exceeds a predetermined threshold (third threshold) th3, the gesture processing unit 1212 may cancel the output of the second motion measuring unit 105 (the proximity sensor 105a in this embodiment), that is, discard the tentative determination, and perform the gesture tentative determination process again.
  • the first motion measurement unit 110 is the gyro and the acceleration sensor 110a, but is not limited thereto.
  • the first motion measuring unit 110 includes an acceleration sensor that measures acceleration, an angular velocity sensor that measures angular velocity, a speed sensor that measures velocity, a vibration sensor that measures vibration, an inclinometer that measures inclination, and a geomagnetic sensor that measures orientation. One or more of them may be provided.
  • A wearable electronic device according to one aspect includes a mounting member for mounting on a first part of a living body; a first motion measuring unit for measuring a first movement of the mounting member; a second motion measuring unit, mounted on the mounting member, for measuring a second movement of a second part of the living body different from the first part, or of an indicator provided on the second part for performing a gesture; and a gesture processing unit for recognizing a gesture by the second movement based on the first and second measurement results measured by the first and second motion measuring units.
  • In another aspect, the first motion measuring unit includes at least one of an acceleration sensor that measures acceleration, an angular velocity sensor that measures angular velocity, a speed sensor that measures speed, and a vibration sensor that measures vibration.
  • In another aspect, the second motion measuring unit includes a passive sensor including a plurality of pyroelectric elements arranged in a two-dimensional matrix.
  • the second motion measurement unit includes an active sensor including an infrared light source that emits infrared light and a plurality of pyroelectric elements arranged in a two-dimensional matrix.
  • the gesture processing unit removes, from the second measurement result measured by the second motion measurement unit, a first movement component due to the first movement based on the first measurement result measured by the first motion measurement unit, and recognizes the gesture by the second movement of the second part based on the second measurement result from which the first movement component has been removed.
  • the gesture processing unit tentatively determines a gesture by the second part of the living body, or by the indicator provided in the second part, based on the second measurement result, and recognizes the gesture by the second movement of the second part or the indicator based on the tentatively determined gesture and the first measurement result.
  • the gesture processing unit modifies the gesture tentatively determined based on the second measurement result, based on the first measurement result, and thereby recognizes the gesture by the second movement.
  • the gesture processing unit obtains the velocity vector of the first part and the velocity vector of the second part, and recognizes the gesture by the second movement by subtracting the velocity vector of the first part from the velocity vector of the second part. (See the subtraction sketch after this list.)
  • because such a wearable electronic device recognizes the gesture by the second movement of the second part in consideration of not only the second measurement result measured by the second motion measurement unit but also the first measurement result measured by the first motion measurement unit, it can detect the gesture even if the wearing part moves.
  • the above-described wearable electronic device further includes a motion detection processing unit that detects a predetermined motion of the living body based on the first measurement result measured by the first motion measurement unit, and when the motion detection processing unit detects the predetermined motion, the gesture processing unit recognizes the gesture by the second movement of the second part, or of the indicator provided in the second part, based on the second measurement result measured by the second motion measurement unit.
  • such a wearable electronic device can recognize a gesture without taking the predetermined motion into account when the living body performs a predetermined motion (for example, walking) other than the movement of the first part (for example, a head swing).
  • the gesture processing unit cancels the tentatively determined gesture when the first measurement result exceeds a predetermined threshold.
  • the second motion measurement unit further detects the presence or absence of the second part, and the gesture processing unit starts its operation when the second motion measurement unit detects the presence of the second part.
  • the gesture processing unit starts its operation by turning on its power supply when, from a state in which the power supply is turned off, the second motion measurement unit detects the presence of the second part.
  • the gesture processing unit starts its operation by changing to an active state when the second motion measuring unit detects the presence of the second part from a sleep state.
  • such a wearable electronic device can save power because the gesture processing unit does not operate (remains paused) until the second motion measurement unit detects the presence of the second part.
  • the gesture processing unit is mounted on a member separate from the mounting member.
  • by mounting only the minimum necessary configuration on the mounting member, such a wearable electronic device can make the mounting member smaller and can reduce the discomfort or annoyance of wearing it.
  • the mounting member is a member for mounting on the head, the first part being the head.
  • according to this, a wearable electronic device to be worn on the head can be provided; even if such a wearable electronic device undergoes a first movement such as a head swing, it can detect a gesture of a second movement by, for example, a hand or fingers.
  • a gesture detection method for a wearable electronic device is a gesture detection method for a wearable electronic device attached to a first part of a living body by a mounting member, in which a first movement of the mounting member is measured to acquire a first measurement result, a second movement of a second part of the living body different from the first part, or of an indicator for performing a gesture provided in the second part, is measured to acquire a second measurement result, and a gesture by the second movement is recognized based on the first and second measurement results.
  • a gesture by the second part of the living body, or by an indicator provided in the second part, is tentatively determined based on the second measurement result, and the gesture by the second movement of the second part or the indicator is recognized based on the tentatively determined gesture and the first measurement result.
  • the gesture tentatively determined based on the second measurement result is modified based on the first measurement result, and the gesture by the second movement is thereby recognized.
  • a predetermined motion of the living body is further detected based on the first measurement result, and when the predetermined motion is detected, the gesture by the second movement is recognized based on the second measurement result.
  • because such a gesture detection method for a wearable electronic device recognizes the gesture by the second movement of the second part in consideration of not only the second measurement result measured in the second motion measurement step but also the first measurement result measured in the first motion measurement step, it can detect the gesture even if the wearing part moves.
  • accordingly, a wearable electronic device and a gesture detection method for the wearable electronic device that can detect a gesture even when the wearing part moves can be provided.
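The three mechanisms referenced above can be pictured with short sketches. All of the following Python code is illustrative only, not taken from the patent: function names, thresholds, and numeric values are assumptions.

First, a minimal sketch of detecting the predetermined motion (for example, walking) by checking whether recent first-measurement samples correlate with a stored signal pattern within a predetermined range:

    import numpy as np

    def is_performing_motion(recent_window, stored_pattern, threshold=0.8):
        # True if the recent samples from the first motion measurement unit 110
        # (e.g. vertical acceleration) correlate with the signal pattern stored
        # in the storage unit 125; the 0.8 threshold is an assumed value.
        a = (recent_window - recent_window.mean()) / (recent_window.std() + 1e-9)
        b = (stored_pattern - stored_pattern.mean()) / (stored_pattern.std() + 1e-9)
        corr = np.correlate(a, b, mode="valid") / len(b)  # normalized correlation
        return bool(corr.max() >= threshold)

    t = np.linspace(0.0, 2.0, 100)
    walking_pattern = np.sin(2 * np.pi * 2.0 * t)          # synthetic ~2 Hz bounce
    noisy_window = walking_pattern + 0.1 * np.random.randn(t.size)
    print(is_performing_motion(noisy_window, walking_pattern))  # usually True

Second, a sketch of cancelling a tentatively determined gesture when the output of the first motion measurement unit exceeds the third threshold th3 (the value 1.5, e.g. in rad/s, is assumed):

    def filter_tentative_gesture(tentative_gesture, head_motion_output, th3=1.5):
        # Discard the gesture tentatively determined from the proximity sensor
        # 105a when the head moved too much, so it can be determined again.
        if head_motion_output > th3:
            return None
        return tentative_gesture

    print(filter_tentative_gesture("flick_left", head_motion_output=0.2))  # kept
    print(filter_tentative_gesture("flick_left", head_motion_output=2.7))  # None

Third, the velocity-vector subtraction reduces to a one-line vector operation:

    import numpy as np

    def relative_velocity(v_second_part, v_first_part):
        # Velocity of the second part (e.g. the hand) with the head's own
        # motion removed.
        return np.asarray(v_second_part) - np.asarray(v_first_part)

    # In a common frame the hand is measured moving left at 0.30 m/s while the
    # head itself moves left at 0.25 m/s; the gesture-relevant motion is only
    # the 0.05 m/s difference.
    print(relative_velocity([-0.30, 0.0, 0.0], [-0.25, 0.0, 0.0]))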

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In this wearable electronic device and gesture detection method therefor, first movement of a mounted member mounted on a first region of a living body is measured, second movement of a second region different from the first region on the living body or a pointing body provided at the second region is measured, and on the basis of the results of the first and second measurements, gestures based on the second movement of the second region or the pointing body are recognized.

Description

Wearable electronic device and gesture detection method for wearable electronic device
The present invention relates to a wearable electronic device that can be worn on the body, and particularly to a wearable electronic device that can recognize a predetermined gesture. The present invention also relates to a gesture detection method for such a wearable electronic device.
In recent years, rapidly developed mobile terminal devices such as smartphones (multifunctional mobile phones) are generally equipped with a touch panel, and the user performs the necessary input operations through the touch panel, for example to display a desired image or to enter information. However, there are cases where the user wants to perform the input operation without touching the touch panel, for example when the user's hands are wet or dirty. For this reason, devices that can accept the input operation through gestures, that is, body and hand movements used as a means of communication, have been researched and developed; one example is the technique disclosed in Patent Document 1.
The mobile computing device disclosed in Patent Document 1 includes a sensor system, comprising an infrared (IR) light-emitting diode (LED) and an IR proximity sensor, configured to acquire data related to three-dimensional user movements, and a sensor controller module communicatively coupled to the sensor system and configured to identify properties of the device indicative of the clarity of the data related to the three-dimensional user movements acquired by the sensor system and of the probability of correctly identifying an input gesture from the three-dimensional user movements, and to regulate the power consumption of at least one of the IR LED and the IR proximity sensor of the sensor system based on the device properties.
However, the sensor system disclosed in Patent Document 1 acquires data related to three-dimensional user movements relative to the IR proximity sensor, that is, gestures, on the premise that the IR proximity sensor is stationary. For this reason, once the IR proximity sensor itself moves, it is difficult to detect a gesture with the IR proximity sensor.
Patent Document 1: Japanese Unexamined Patent Application Publication (Translation of PCT Application) No. 2013-534009
The present invention has been made in view of the above circumstances, and an object thereof is to provide a wearable electronic device capable of detecting a gesture even when the wearing part moves, and a gesture detection method for the wearable electronic device.
In the wearable electronic device and its gesture detection method according to the present invention, a first movement of a mounting member to be mounted on a first part of a living body is measured; a second movement of a second part of the living body different from the first part, or of an indicator provided in the second part, is measured by a unit mounted on the mounting member; and a gesture by the second movement of the second part or the indicator is recognized based on these first and second measurement results. Therefore, the wearable electronic device and the gesture detection method for the wearable electronic device according to the present invention can detect a gesture even if the wearing part moves.
The above and other objects, features, and advantages of the present invention will become apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a perspective view showing the structural configuration of a wearable electronic device according to an embodiment.
FIG. 2 is a front view showing the structural configuration of the wearable electronic device.
FIG. 3 is a top view showing the structural configuration of the wearable electronic device.
FIG. 4 is a schematic cross-sectional view showing the configuration of a display unit in the wearable electronic device.
FIG. 5 is a block diagram showing the electrical configuration of the wearable electronic device.
FIG. 6 is a diagram showing the configuration of a proximity sensor as an example of a second motion measurement unit in the wearable electronic device.
FIG. 7 is a front view of the wearable electronic device as worn.
FIG. 8 is a side view and a partial top view of the wearable electronic device as worn.
FIG. 9 is a diagram showing an example of an image visually recognized by the user through a see-through image display unit.
FIG. 10 is a diagram showing an example of the output of the proximity sensor as an example of the second motion measurement unit in the wearable electronic device.
FIG. 11 is a flowchart showing gesture detection processing (a main routine) in the wearable electronic device.
FIG. 12 is a flowchart showing gesture determination processing (a subroutine) in the gesture detection processing of the wearable electronic device.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings. Components given the same reference numerals in the figures are identical, and duplicate description of them is omitted as appropriate. In this specification, a generic reference is denoted by a reference numeral with its suffix omitted, and an individual component is denoted by a reference numeral with a suffix attached.
The wearable electronic device in the present embodiment is an electronic device that can be worn on the body and can detect a predetermined gesture, and includes a mounting member for mounting on a predetermined first part of a living body; a first motion measurement unit that measures a first movement of the mounting member; a second motion measurement unit, mounted on the mounting member, that measures a second movement of a second part of the living body different from the first part, or of an indicator for performing a gesture provided in the second part; and a gesture processing unit that recognizes a gesture by the second movement based on the first and second measurement results measured by the first and second motion measurement units. Such a wearable electronic device may be an electronic device for any purpose, but here, as an example, the case where the mounting member is a member for mounting on the head (a head mounting member), the first part being the head, and the device is a so-called head-mounted display (HMD) will be described more specifically below.
FIG. 1 is a perspective view showing the structural configuration of the wearable electronic device according to the embodiment. FIG. 2 is a front view showing the structural configuration of the wearable electronic device. FIG. 3 is a top view showing the structural configuration of the wearable electronic device. FIG. 4 is a schematic cross-sectional view showing the configuration of the display unit in the wearable electronic device of the embodiment. FIG. 5 is a block diagram showing the electrical configuration of the wearable electronic device. FIG. 6 is a diagram showing the configuration of a proximity sensor as an example of the second motion measurement unit in the wearable electronic device of the embodiment. Hereinafter, the right side and the left side of the HMD 100 refer to the right side and the left side for the user wearing the HMD 100. As shown in FIGS. 1 and 2, an XYZ orthogonal coordinate system is set with the direction from the right side toward the left side as the X direction, and the X axis, Y axis, Z axis, X direction, Y direction, and Z direction are used as appropriate.
First, the structural configuration of the HMD 100 will be described. As shown in FIGS. 1 to 3, the HMD 100 according to the present embodiment includes a frame 101, which is an example of a head mounting member to be mounted on the head. The frame 101, which is substantially U-shaped when viewed from above, includes a front part 101a to which two spectacle lenses 102 are attached, and side parts 101b and 101c that extend rearward (in the Y direction) from both ends of the front part 101a. The two spectacle lenses 102 attached to the frame 101 may or may not have refractive power (optical power, the reciprocal of focal length).
A cylindrical main body 103 is fixed to the front part 101a of the frame 101 above the right spectacle lens 102 (it may be the left one, depending on the user's dominant eye or the like). The main body 103 is provided with a display unit 104. Arranged in the main body 103 is a display control unit 104DR (see FIG. 5, described later) that controls display on the display unit 104 based on instructions from a control processing unit 121, described later. If necessary, a display unit may be arranged in front of each eye.
In FIG. 4, the display unit 104 includes an image forming unit 104A and an image display unit 104B. The image forming unit 104A is incorporated in the main body 103 and includes a light source 104a, a unidirectional diffusion plate 104b, a condenser lens 104c, and a display element 104d. The image display unit 104B, a so-called see-through display member, is an overall plate-shaped unit arranged to extend downward from the main body 103 in parallel with one spectacle lens 102 (see FIG. 1), and has an eyepiece prism 104f, a deflection prism 104g, and a hologram optical element 104h.
The light source 104a has the function of illuminating the display element 104d and is composed of, for example, an RGB-integrated light-emitting diode (LED) that emits light in three wavelength bands of 462 ± 12 nm (blue light (B light)), 525 ± 17 nm (green light (G light)), and 635 ± 11 nm (red light (R light)), expressed as the peak wavelength of light intensity and the wavelength width at half the light intensity.
The display element 104d displays an image by modulating the light emitted from the light source 104a according to image data, and is composed of a transmissive liquid crystal display element having pixels, each serving as a light-transmitting region, arranged in a matrix. The display element 104d may instead be of a reflective type.
The eyepiece prism 104f totally reflects the image light from the display element 104d, entering through its base end face PL1, between the opposed parallel inner side face PL2 and outer side face PL3, and guides it to the user's pupil via the hologram optical element 104h, while also transmitting external light to the user's pupil; together with the deflection prism 104g, it is formed of, for example, an acrylic resin. The eyepiece prism 104f and the deflection prism 104g sandwich the hologram optical element 104h between inclined surfaces PL4 and PL5, which are inclined with respect to the inner side face PL2 and the outer side face PL3, and are joined with an adhesive.
The deflection prism 104g is joined to the eyepiece prism 104f and forms, together with the eyepiece prism 104f, a substantially parallel flat plate. When the spectacle lens 102 (see FIG. 1) is placed between the display unit 104 and the user's pupil, even a user who normally wears glasses can observe the image.
The hologram optical element 104h is a volume-phase reflective hologram that diffracts and reflects the image light (light of wavelengths corresponding to the three primary colors) emitted from the display element 104d, guides it to the pupil B, and magnifies the image displayed on the display element 104d to present it to the user's pupil as a virtual image. This hologram optical element 104h is fabricated so as to diffract (reflect) light in, for example, three wavelength ranges of 465 ± 5 nm (B light), 521 ± 5 nm (G light), and 634 ± 5 nm (R light), expressed as the peak wavelength of diffraction efficiency and the wavelength width at half the diffraction efficiency. Here, the peak wavelength of diffraction efficiency is the wavelength at which the diffraction efficiency reaches its peak, and the wavelength width at half maximum is the wavelength width over which the diffraction efficiency is at least half of its peak value.
In the display unit 104 configured in this way, the light emitted from the light source 104a is diffused by the unidirectional diffusion plate 104b, condensed by the condenser lens 104c, and made incident on the display element 104d. The light incident on the display element 104d is modulated pixel by pixel based on the image data input from the display control unit 104DR and emitted as image light, so that a color image is displayed on the display element 104d. The image light from the display element 104d enters the eyepiece prism 104f through its base end face PL1, is totally reflected multiple times between the inner side face PL2 and the outer side face PL3, and enters the hologram optical element 104h. The light incident on the hologram optical element 104h is reflected there, passes through the inner side face PL2, and reaches the pupil B. At the position of the pupil B, the user can observe a magnified virtual image of the image displayed on the display element 104d and visually recognize it as a screen formed on the image display unit 104B.
On the other hand, since the eyepiece prism 104f, the deflection prism 104g, and the hologram optical element 104h transmit almost all external light, the user can observe an external-world image (a real image) through them. Therefore, the virtual image of the image displayed on the display element 104d is observed overlapping a part of the external image. In this way, the user of the HMD 100 can simultaneously observe, via the hologram optical element 104h, the image provided by the display element 104d and the external image. When the display unit 104 is in a non-display state, the image display unit 104B is transparent and only the external image can be observed. In the present embodiment, the display unit is configured by combining a light source, a liquid crystal display element, and an optical system; however, instead of the combination of a light source and a liquid crystal display element, a self-luminous display element (for example, an organic EL display element) may be used. Furthermore, instead of the combination of a light source, a liquid crystal display element, and an optical system, a transmissive organic EL display panel that is transparent in the non-emitting state may be used.
Returning to FIGS. 1 to 3, on the front face of the main body 103, a second motion measurement unit (for example, a proximity sensor) 105 arranged near the center of the frame 101, a lens 106a of a camera 106 arranged near the side, and an illuminance sensor 107 arranged between the second motion measurement unit 105 and the lens 106a are provided so as to face forward. Therefore, in the example shown in FIGS. 1 to 3, the measurement direction of the second motion measurement unit 105, the optical axis of the camera 106, and the measurement direction of the illuminance sensor 107 are the same. Here, these directions being the same covers not only the case where the central axis of the detection range of the proximity sensor 105a, the optical axis of the camera 106, and the central axis of the measurement range of the illuminance sensor 107 are parallel, but also the case where the three axes intersect slightly yet are arranged in such a positional relationship that the three units produce outputs with tendencies similar to those obtained when the three axes are parallel.
A right sub-body 108-R is attached to the right side part 101b of the frame 101, and a left sub-body 108-L is attached to the left side part 101c of the frame 101. The right sub-body 108-R and the left sub-body 108-L have an elongated plate shape and have elongated protrusions 108a-R and 108a-L on their inner sides, respectively. By engaging the elongated protrusion 108a-R with an elongated hole 101d in the side part 101b of the frame 101, the right sub-body 108-R is attached to the frame 101 in a positioned state, and by engaging the elongated protrusion 108a-L with an elongated hole 101e in the side part 101c of the frame 101, the left sub-body 108-L is attached to the frame 101 in a positioned state.
A geomagnetic sensor 109 (see FIG. 5) and a first motion measurement unit (for example, a gyro and acceleration sensor) 110 (see FIG. 5) are arranged in the right sub-body 108-R. A speaker (or earphone) 111A and a microphone 111B (see FIG. 5) are arranged in the left sub-body 108-L. The main body 103 and the right sub-body 108-R are connected by a wiring HS so that signals can be transmitted, and the main body 103 and the left sub-body 108-L are connected by a wiring (not shown) so that signals can be transmitted. As schematically shown in FIG. 3, the right sub-body 108-R is connected to a control unit CTU via a cord CD extending from its rear end. The HMD 100 may be configured to be operated by voice, based on an output signal generated by the microphone 111B according to the input voice. Furthermore, the main body 103 and the left sub-body 108-L may be configured to be connected wirelessly.
Next, the electrical configuration of the HMD 100 will be described. In FIG. 5, the HMD 100 includes the control unit CTU, the camera 106, the geomagnetic sensor 109, the first motion measurement unit 110, the second motion measurement unit 105, the microphone 111B, the illuminance sensor 107, the image forming unit 104A, the display control unit 104DR, and the speaker 111A. The control unit CTU includes a control processing unit 121, an operation unit 122, a GPS receiving unit 123, a communication unit 124, a storage unit 125, a battery 126, and a power supply circuit 127.
The camera 106 is a device that is connected to the control processing unit 121 and generates an image of a subject under the control of the control processing unit 121. The camera 106 includes, for example, an imaging optical system that forms an optical image of the subject on a predetermined imaging plane, an image sensor whose light-receiving surface is aligned with the imaging plane and which converts the optical image of the subject into an electrical signal, and a digital signal processor (DSP) that applies known image processing to the output of the image sensor to generate an image (image data). The imaging optical system includes one or more lenses, one of which is the lens 106a. The camera 106 outputs the generated image data to the control processing unit 121.
The geomagnetic sensor 109 is a circuit that is connected to the control processing unit 121 and measures the forward (−Y direction) azimuth of the HMD 100 by measuring the earth's magnetism. The geomagnetic sensor 109 outputs the measured azimuth to the control processing unit 121.
The first motion measurement unit 110 is a device that is connected to the control processing unit 121 and measures the movement of the frame 101, which is an example of the head mounting member. The first motion measurement unit 110 is mounted on the frame 101 and outputs its measurement result to the control processing unit 121. More specifically, the first motion measurement unit 110 is, for example, the gyro and acceleration sensor 110a and, as described above, is built into the right sub-body 108-R. The gyro and acceleration sensor 110a is a circuit that measures, according to the posture of the frame 101, the roll angular velocity about the X axis, the pitch angular velocity about the Y axis, the yaw angular velocity about the Z axis, and the accelerations in the X, Y, and Z directions, and it outputs the measured angular velocities and accelerations to the control processing unit 121. The gyro and acceleration sensor 110a may be a six-axis sensor in which these are integrated.
The second motion measurement unit 105 is a device that is connected to the control processing unit 121 and measures the second movement of a second part different from the first part, or of an indicator for performing a gesture provided in the second part. In the present embodiment, since the first part is the head of the living body, the second part of the living body is a part different from the head, for example, a hand or fingers. The indicator provided in the second part of the living body is, for example, a rod-shaped member gripped by the user's hand (for example, a pen or a pointing stick), or a rod-shaped member attached to the user's finger or arm with a mounting member. In the present embodiment, the second motion measurement unit 105 further detects the presence or absence of the second part. The second motion measurement unit 105 is mounted on the frame 101, which is an example of the head mounting member, and outputs its measurement result to the control processing unit 121. More specifically, the second motion measurement unit 105 is, for example, the proximity sensor 105a and, as described above, is mounted on the main body 103 fixed to the front part 101a of the frame 101.
In this specification, a "proximity sensor" is a sensor that, in order to detect that an object, for example a part of a human body (such as a hand or a finger), is close to the front of the user's eyes, detects whether the object exists within a detection region in the proximity range in front of the detection surface of the proximity sensor and outputs a signal. The proximity range may be set as appropriate according to the operator's characteristics and preferences; for example, it may be a range within 200 mm from the detection surface of the proximity sensor. If the distance from the proximity sensor is within 200 mm, the user can move the palm or fingers into and out of the user's visual field with the arm bent, so operations can easily be performed by gestures using the hand or fingers, and the risk of erroneously detecting a human body other than the user, furniture, or the like is reduced.
Proximity sensors come in passive and active types. A passive proximity sensor has a detection unit that detects invisible light or electromagnetic waves emitted from an object when the object approaches. Passive proximity sensors include pyroelectric sensors that detect invisible light such as infrared rays radiated from an approaching human body, and capacitance sensors that detect a change in capacitance between the sensor and an approaching human body. An active proximity sensor has a projection unit for invisible light or sound waves and a detection unit that receives the invisible light or sound waves reflected back from an object. Active proximity sensors include infrared sensors that project infrared light and receive the infrared light reflected by an object, laser sensors that project laser light and receive the laser light reflected by an object, and ultrasonic sensors that project ultrasonic waves and receive the ultrasonic waves reflected by an object. A passive proximity sensor does not need to project energy toward the object and is therefore excellent in low power consumption. An active proximity sensor makes it easy to improve the reliability of detection; for example, even when the user wears gloves that do not transmit the detection light radiated from the human body, such as infrared light, it can detect the movement of the user's hand. A plurality of types of proximity sensors may be combined.
In the present embodiment, a pyroelectric sensor including a plurality of pyroelectric elements arranged in a two-dimensional matrix is used as the proximity sensor 105a. In FIG. 6, the proximity sensor 105a includes four pyroelectric elements RA, RB, RC, and RD arranged in two rows and two columns; it receives invisible light, such as infrared light radiated from a human body, as detection light, and a corresponding signal is output from each of the pyroelectric elements RA to RD. The intensity of the output of each of the pyroelectric elements RA to RD varies with the distance from the light-receiving surface of the proximity sensor 105a to the object, increasing as the distance decreases. The proximity sensor 105a outputs the outputs of the pyroelectric elements RA to RD to the control processing unit 121.
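One sample of this four-element output can be represented as in the following minimal Python sketch; the class, the field names, and the 0.5 presence threshold are assumptions for illustration, not part of the patent.

    from dataclasses import dataclass

    @dataclass
    class ProximityFrame:
        # One sample of the four pyroelectric element outputs of the proximity
        # sensor 105a, arranged RA RB / RC RD in a 2 x 2 matrix; output strength
        # grows as the detected object approaches the light-receiving surface.
        ra: float
        rb: float
        rc: float
        rd: float

        def hand_present(self, threshold=0.5):
            # Treat the second part (e.g. the hand HD) as present when any
            # element output exceeds the threshold (illustrative value).
            return max(self.ra, self.rb, self.rc, self.rd) > threshold

    print(ProximityFrame(0.9, 0.1, 0.8, 0.1).hand_present())  # True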
The microphone 111B is a circuit that is connected to the control processing unit 121 and converts the acoustic vibration of sound into an electrical signal. The microphone 111B outputs the converted electrical signal representing the external sound to the control processing unit 121.
The illuminance sensor 107 is a circuit that is connected to the control processing unit 121, measures illuminance, and outputs the measured illuminance to the control processing unit 121. The illuminance sensor 107 includes, for example, a photodiode that outputs, by photoelectric conversion, a current whose magnitude corresponds to the intensity of the incident light, and peripheral circuits such as an I-V conversion circuit that converts the current value of the photodiode into a voltage value.
The display control unit 104DR is a circuit that is connected to the control processing unit 121 and, under the control of the control processing unit 121, controls the image forming unit 104A so as to cause it to form an image. The image forming unit 104A is as described above.
The speaker 111A is a circuit that is connected to the control processing unit 121 and, under the control of the control processing unit 121, generates and outputs a sound corresponding to an electrical signal representing the sound.
The operation unit 122 is a device that is connected to the control processing unit 121 and inputs predetermined preset instructions, such as power on/off, to the HMD 100; it is, for example, one or more switches to which predetermined functions are assigned.
The GPS receiving unit 123 is a device that is connected to the control processing unit 121 and, under the control of the control processing unit 121, measures the position of the HMD 100 by means of a satellite positioning system for measuring the current position on the earth, and outputs the positioning result (latitude X, longitude Y, altitude Z) to the control processing unit 121. The GPS receiving unit 123 may be a GPS with an error-correcting function, such as DGPS (Differential GPS).
The communication unit 124 is a circuit that is connected to the control processing unit 121 and inputs and outputs data to and from external devices under the control of the control processing unit 121; examples include an RS-232C interface circuit for serial communication, an interface circuit using the Bluetooth (registered trademark) standard, an interface circuit for infrared communication such as the IrDA (Infrared Data Association) standard, and an interface circuit using the USB (Universal Serial Bus) standard.
The communication unit 124 may also be a communication card or the like that communicates by wire or wirelessly, and may communicate with an external device such as a server via a communication network such as an Ethernet environment (Ethernet is a registered trademark). Such a communication unit 124 generates a communication signal containing the data to be transferred, input from the control processing unit 121, according to the communication protocol used in the communication network, and transmits the generated communication signal to the external device via the communication network. The communication unit 124 also receives a communication signal from an external device via the communication network, extracts the data from the received communication signal, converts the extracted data into a format that the control processing unit 121 can process, and outputs it to the control processing unit 121. The communication unit 124 includes, for example, a Wi-Fi module (communication card) compliant with IEEE 802.11b/g/n that transmits and receives communication signals under the Wi-Fi (Wireless Fidelity) standard, one of the wireless LAN standards.
The battery 126 is a battery that stores electric power and supplies that power; it may be a primary battery or a secondary battery. The power supply circuit 127 is a circuit that supplies the power provided by the battery 126 to each part of the HMD 100 that requires power, at a voltage appropriate to that part.
The storage unit 125 is a circuit that is connected to the control processing unit 121 and stores various predetermined programs and various predetermined data under the control of the control processing unit 121. The various predetermined programs include control processing programs such as a control program that controls each part of the HMD 100 according to the function of that part, and a gesture processing program that recognizes a gesture by the second movement of the second part based on the first and second measurement results measured by the first and second motion measurement units 110 and 105. The various predetermined data include data necessary for controlling the HMD 100. The storage unit 125 includes, for example, a ROM (Read Only Memory), which is a nonvolatile storage element, and an EEPROM (Electrically Erasable Programmable Read Only Memory), which is a rewritable nonvolatile storage element. The storage unit 125 also includes a RAM (Random Access Memory) serving as the working memory of the control processing unit 121 for storing data generated during the execution of the predetermined programs.
The control processing unit 121 controls each part of the HMD 100 according to the function of that part, recognizes a gesture by the second movement of the second part based on the first and second measurement results measured by the first and second motion measurement units 110 and 105, and executes processing corresponding to the recognized gesture. The control processing unit 121 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits. By executing the control processing programs, a control unit 1211, a gesture processing unit 1212, and a detection result utilization unit 1213 are functionally configured in the control processing unit 121. Part or all of the control unit 1211, the gesture processing unit 1212, and the detection result utilization unit 1213 may be configured by hardware.
The control unit 1211 controls each part of the HMD 100 according to the function of that part.
The gesture processing unit 1212 recognizes a gesture by the second movement of the second part based on the first and second measurement results measured by the first and second motion measurement units 110 and 105. More specifically, in the present embodiment, the gesture processing unit 1212 determines a predetermined preset gesture by the second movement of the hand or fingers based on the first measurement result of the gyro and acceleration sensor 110a and the outputs of the plurality of pyroelectric elements of the proximity sensor 105a, in this embodiment the four pyroelectric elements RA to RD, and it notifies the detection result utilization unit 1213 of the determination result (detection result). Preferably, the gesture processing unit 1212 removes, from the second measurement result measured by the second motion measurement unit 105 (the proximity sensor 105a in this embodiment), a first movement component due to the first movement based on the first measurement result measured by the first motion measurement unit 110 (the gyro and acceleration sensor 110a in this embodiment), and recognizes the gesture by the second movement of the second part based on the second measurement result from which the first movement component has been removed.
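The patent does not spell out the removal computation itself; the following is a minimal sketch under the assumption that head rotation induces an approximately linear spurious component in each element output, with per-element gains k obtained by prior calibration. All names and numbers here are illustrative only.

    import numpy as np

    def remove_first_movement_component(element_outputs, head_rates, k):
        # Subtract from the second measurement result (the four pyroelectric
        # element outputs RA..RD) a first movement component predicted from
        # the first measurement result (head yaw/pitch rates).
        predicted = k @ head_rates   # spurious component per element, shape (4,)
        return element_outputs - predicted

    outputs = np.array([0.90, 0.30, 0.85, 0.25])   # RA, RB, RC, RD
    head_rates = np.array([0.8, 0.1])              # yaw, pitch (rad/s)
    k = np.array([[0.5, 0.0], [0.2, 0.0], [0.5, 0.1], [0.2, 0.1]])
    print(remove_first_movement_component(outputs, head_rates, k))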
The gesture processing unit 1212 starts its operation when the second motion measurement unit 105 detects the presence of the second part. More specifically, in the present embodiment, the gesture processing unit 1212 starts its operation by changing from a sleep state to an active state when the second motion measurement unit 105 detects the presence of the second part. Alternatively, the gesture processing unit 1212 may start its operation by turning on its power supply when, from a state in which the power supply is off, the second motion measurement unit 105 detects the presence of the second part.
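This wake-up behavior amounts to a small state machine; the sketch below is illustrative only and also drops back to sleep when the hand leaves, as described later for the detection principle.

    import enum

    class State(enum.Enum):
        SLEEP = "sleep"
        ACTIVE = "active"

    class GestureProcessorPower:
        # The gesture processing unit stays asleep until the second motion
        # measurement unit reports that the second part is present.
        def __init__(self):
            self.state = State.SLEEP

        def on_presence(self, hand_present):
            if self.state is State.SLEEP and hand_present:
                self.state = State.ACTIVE   # start gesture determination
            elif self.state is State.ACTIVE and not hand_present:
                self.state = State.SLEEP    # pause again to save power

    gp = GestureProcessorPower()
    gp.on_presence(True)
    print(gp.state)  # State.ACTIVE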
In the present embodiment, the gesture processing unit 1212 is mounted not on the frame 101 but on the control unit CTU, which is a body separate from the frame 101.
The detection result utilization unit 1213 executes predetermined processing based on the determination result of the gesture processing unit 1212. For example, when the determination result of the gesture processing unit 1212 is a so-called "flick", the detection result utilization unit 1213 changes the display through the display control of the display control unit 104DR so as to turn the page from a first image formed by the image forming unit 104A to a second image. As another example, when the determination result is a so-called "slide", the detection result utilization unit 1213 changes the display through the display control of the display control unit 104DR so as to move the image formed by the image forming unit 104A.
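The mapping from a recognized gesture to a display action could look like the following sketch; FakeDisplay and the method names stand in for the display control path and are assumptions for illustration only.

    def apply_gesture(gesture, display):
        # Detection result utilization: route the gesture determined by the
        # gesture processing unit 1212 to a display change via 104DR.
        actions = {
            "flick": display.turn_page,   # first image -> second image
            "slide": display.move_image,  # translate the displayed image
        }
        action = actions.get(gesture)
        if action is not None:
            action()

    class FakeDisplay:  # illustrative stand-in, not part of the patent
        def turn_page(self):
            print("page turned")

        def move_image(self):
            print("image moved")

    apply_gesture("flick", FakeDisplay())  # prints: page turned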
First, the basic operation of detecting a gesture in the HMD 100 will be described. FIG. 7 is a front view of the wearable electronic device of the embodiment as worn. FIG. 8 is a side view and a partial top view of the wearable electronic device of the embodiment as worn; FIG. 8 also shows the hand HD of the user US, FIG. 8A being the side view and FIG. 8B the partial top view. FIG. 9 is a diagram showing an example of an image visually recognized by the user through the see-through image display unit. FIG. 10 is a diagram showing an example of the output of the proximity sensor in the wearable electronic device of the embodiment: FIG. 10A shows the output of the pyroelectric element RA, FIG. 10B that of the pyroelectric element RB, FIG. 10C that of the pyroelectric element RC, and FIG. 10D that of the pyroelectric element RD. The horizontal axis of each graph in FIG. 10 is time, and the vertical axis is intensity (output level). Here, a gesture operation is an operation in which at least the hand HD or a finger of the user US enters or leaves the detection region of the proximity sensor 105a, and it can be detected by the gesture processing unit 1212 of the control processing unit 121 of the HMD 100 via the proximity sensor 105a.
As shown in FIG. 9, the screen 104i of the image display unit 104B is arranged so as to overlap the effective visual field EV of the user's eye facing the image display unit 104B (here, so as to be positioned within the effective visual field EV). The detection region SA of the proximity sensor 105a lies within the visual field of the user's eye facing the image display unit 104B. Preferably, the detection region SA is located within the stable fixation field of the user's eye or a visual field inside it (within about 90° horizontally and about 70° vertically), and more preferably, the arrangement and orientation of the proximity sensor 105a are adjusted so that the detection region SA overlaps the effective visual field EV, which lies inside the stable fixation field, or a visual field inside it (within about 30° horizontally and about 20° vertically).
FIG. 9 shows an example in which the detection region SA overlaps the screen 104i. By setting the detection region SA of the proximity sensor 105a to be within the visual field of the eye of the user US while the user US wears the frame 101, the head mounting member, on the head, the user can reliably see the hand entering and leaving the detection region SA of the proximity sensor 105a without moving the eyes, while observing the hand HD through the screen 104i. In particular, placing the detection region SA of the proximity sensor 105a within the stable fixation field or a visual field inside it allows the user to reliably perform gesture operations while recognizing the detection region SA even while observing the screen. Placing the detection region SA within the effective visual field EV or a visual field inside it allows gesture operations to be performed even more reliably, and if the detection region SA overlaps the screen 104i, gesture operations can be performed still more reliably. When the proximity sensor 105a has a plurality of pyroelectric elements RA to RD, as in the present embodiment, the entire light-receiving area of the pyroelectric elements RA to RD is regarded as one light-receiving unit, and its maximum detection range is regarded as the detection region. As shown in FIG. 9, when the detection region SA of the proximity sensor 105a is set to overlap the screen 104i, displaying an image showing the detection region SA on the screen 104i (for example, displaying the boundary of the region SA as a solid line) lets the user reliably recognize the detection region SA, so that operations by gestures can be performed more reliably.
 Next, the basic principle of gesture detection will be described. If nothing is present in front of the user US while the proximity sensor 105a is operating, the proximity sensor 105a receives no invisible light as detection light, so the control processing unit 121 judges that no gesture is being performed and puts the gesture processing unit 1212 into a sleep state. On the other hand, as shown in FIG. 8, when the user US brings his or her own hand HD close in front of the eyes, the proximity sensor 105a detects the invisible light radiated from the hand HD, and based on the resulting output signal of the proximity sensor 105a the control processing unit 121 judges that a gesture is being performed and puts the gesture processing unit 1212 into an active state. In the following, gestures are described as being performed with the hand HD of the user US, but a finger or another body part may be used, and the user US may instead perform gestures with an indicator made of a material that can radiate invisible light.
 As described above, the proximity sensor 105a has four pyroelectric elements RA to RD arranged in two rows and two columns (see FIG. 6). Therefore, when the user US moves the hand HD toward the front of the HMD 100 from the left, right, top, or bottom, the output timings of the signals detected by the pyroelectric elements RA to RD differ.
 For example, referring to FIG. 8 and FIG. 9, in the case of a gesture in which the user US moves the hand HD from right to left in front of the HMD 100, the invisible light radiated from the hand HD enters the proximity sensor 105a, and the pyroelectric elements RA and RC receive it first. Therefore, as shown in FIG. 10, the signals of the pyroelectric elements RA and RC rise first, and the signals of the pyroelectric elements RB and RD rise after a delay. After that, the signals of the pyroelectric elements RA and RC fall, and the signals of the pyroelectric elements RB and RD fall after a delay (not shown). The gesture processing unit 1212 detects these signal timings and determines that the user US performed a gesture of moving the hand HD from right to left.
 Similarly, when the signals of the pyroelectric elements RB and RD rise, the signals of the pyroelectric elements RA and RC rise after a delay, then the signals of the pyroelectric elements RB and RD fall, and the signals of the pyroelectric elements RA and RC fall after a delay, the gesture processing unit 1212 can determine that the user US performed a gesture of moving the hand HD from left to right.
 Likewise, when the signals of the pyroelectric elements RA and RB rise, the signals of the pyroelectric elements RC and RD rise after a delay, then the signals of the pyroelectric elements RA and RB fall, and the signals of the pyroelectric elements RC and RD fall after a delay, the gesture processing unit 1212 can determine that the user US performed a gesture of moving the hand HD from top to bottom.
 Likewise, when the signals of the pyroelectric elements RC and RD rise, the signals of the pyroelectric elements RA and RB rise after a delay, then the signals of the pyroelectric elements RC and RD fall, and the signals of the pyroelectric elements RA and RB fall after a delay, the gesture processing unit 1212 can determine that the user US performed a gesture of moving the hand HD from bottom to top.
 Likewise, when the signal of the pyroelectric element RA rises, the signals of the pyroelectric elements RB and RC rise after a delay, and the signal of the pyroelectric element RD rises after a further delay, after which the signals fall in the same order, the gesture processing unit 1212 can determine that the user US performed a gesture of moving the hand HD from the upper right to the lower left.
 Likewise, when the signal of the pyroelectric element RB rises, the signals of the pyroelectric elements RA and RD rise after a delay, and the signal of the pyroelectric element RC rises after a further delay, after which the signals fall in the same order, the gesture processing unit 1212 can determine that the user US performed a gesture of moving the hand HD from the upper left to the lower right.
 Likewise, when the signal of the pyroelectric element RD rises, the signals of the pyroelectric elements RB and RC rise after a delay, and the signal of the pyroelectric element RA rises after a further delay, after which the signals fall in the same order, the gesture processing unit 1212 can determine that the user US performed a gesture of moving the hand HD from the lower left to the upper right.
 Likewise, when the signal of the pyroelectric element RC rises, the signals of the pyroelectric elements RA and RD rise after a delay, and the signal of the pyroelectric element RB rises after a further delay, after which the signals fall in the same order, the gesture processing unit 1212 can determine that the user US performed a gesture of moving the hand HD from the lower right to the upper left.
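 The timing logic of the eight cases above condenses into a small lookup keyed on the order in which the element signals rise. The following Python sketch is illustrative only: it assumes the 2 × 2 layout implied by the examples (RA and RC rise first for a right-to-left swipe, RA and RB rise first for a top-to-bottom swipe), and the grouping tolerance and timestamps are hypothetical values, not taken from the embodiment.

```python
# Hypothetical sketch: classify a swipe from the rise order of the four
# pyroelectric elements RA-RD, using the layout implied by the examples.
RISE_ORDER_TO_GESTURE = {
    (("RA", "RC"), ("RB", "RD")): "right to left",
    (("RB", "RD"), ("RA", "RC")): "left to right",
    (("RA", "RB"), ("RC", "RD")): "top to bottom",
    (("RC", "RD"), ("RA", "RB")): "bottom to top",
    (("RA",), ("RB", "RC"), ("RD",)): "upper right to lower left",
    (("RB",), ("RA", "RD"), ("RC",)): "upper left to lower right",
    (("RD",), ("RB", "RC"), ("RA",)): "lower left to upper right",
    (("RC",), ("RA", "RD"), ("RB",)): "lower right to upper left",
}

def classify(rise_times):
    """rise_times: dict mapping element name to rise timestamp in seconds."""
    groups = {}
    for name, t in rise_times.items():
        groups.setdefault(round(t, 2), []).append(name)  # 10 ms buckets
    order = tuple(tuple(sorted(groups[t])) for t in sorted(groups))
    return RISE_ORDER_TO_GESTURE.get(order, "unknown")

print(classify({"RA": 0.00, "RC": 0.00, "RB": 0.05, "RD": 0.05}))
# -> right to left
```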
 Next, gesture detection in this embodiment based on the basic principle described above will be described. FIG. 11 is a flowchart showing the gesture detection processing (main routine) in the wearable electronic device of the embodiment, and FIG. 12 is a flowchart showing the gesture determination processing (subroutine) within that gesture detection processing.
 In such an HMD 100, when the proximity sensor 105a, an example of the second motion measurement unit 105, is stationary, the gesture processing unit 1212 can detect a gesture of the hand, fingers, or the like (an example of the second part) by the basic principle described above. However, because the proximity sensor 105a is disposed on the main body 103 of the frame 101, an example of the head mounting member, it is difficult for the gesture processing unit 1212 to detect the gesture correctly with processing based on the basic principle alone once the head, an example of the first part, moves. For example, even when the hand or fingers are stationary, if the user shakes his or her head, the gesture processing unit 1212 may detect this as a gesture of the hand or fingers and misjudge it.
 Therefore, in this embodiment the HMD 100 operates as follows. In FIG. 11, the control unit 1211 samples the outputs of the proximity sensor 105a (the outputs of the pyroelectric elements RA to RD) at a predetermined sampling interval (for example, 10 ms, 20 ms, or 30 ms) and first determines whether the second part, in this example the hand HD, is present by determining whether any output of the proximity sensor 105a exceeds a predetermined first threshold th1 (S11). The first threshold th1 is set so as to avoid misjudgments caused by noise. The control unit 1211 also samples the output of the gyro and acceleration sensor 110a at the same timing as it samples the outputs of the proximity sensor 105a, and does so likewise throughout the processing described below.
 If the result of this determination is that no output of the proximity sensor 105a exceeds the first threshold th1 (No), the control unit 1211 returns to process S11 and executes it again at the next sampling timing. If, on the other hand, an output of the proximity sensor 105a exceeds the first threshold th1 (Yes), that is, if the output of any of the pyroelectric elements RA to RD exceeds the first threshold th1, the control unit 1211 switches the gesture processing unit 1212 from the sleep state to the active state if it is sleeping, causes it to execute the gesture determination processing, and tentatively determines a gesture (S12).
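 As a minimal sketch of this polling loop, the following fragment samples both sensors on the same clock and hands control to the gesture determination only while a pyroelectric output exceeds the first threshold. The reader callables, the value of th1, and the callback are assumptions introduced for illustration, not details fixed by the embodiment.

```python
# Illustrative sketch of the S11 polling loop; TH1 and the reader functions
# are assumed, and 20 ms is one of the sampling intervals mentioned above.
import itertools

TH1 = 0.5                # first threshold (noise rejection), assumed value
SAMPLE_INTERVAL = 0.02   # 20 ms

def presence_loop(read_proximity, read_head_motion, on_hand_present):
    for tick in itertools.count():
        t = tick * SAMPLE_INTERVAL
        outputs = read_proximity()        # {"RA": v, "RB": v, "RC": v, "RD": v}
        head = read_head_motion()         # gyro/acceleration, same timing
        if max(outputs.values()) > TH1:   # S11: hand present?
            on_hand_present(t, outputs, head)
        # a real device would wait SAMPLE_INTERVAL here before the next tick
```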
 This gesture determination processing will be described with reference to FIG. 12. First, the gesture processing unit 1212 determines the rise timing and the fall timing of each output of the pyroelectric elements RA to RD of the proximity sensor 105a (S21).
 Next, the gesture processing unit 1212 determines whether a gesture can be decided (S22). As described above, a gesture is judged from the rise timings and fall timings, arranged in time series, of the signals output from the pyroelectric elements RA to RD of the proximity sensor 105a, so a gesture cannot be decided from a single execution of the processing. The gesture processing unit 1212 therefore determines whether a gesture can be tentatively decided by combining the current processing result with a plurality of past processing results, for example the results of the previous execution and the one before it.
 If the result of this determination is that a gesture cannot be decided, the gesture processing unit 1212 acquires the outputs of the proximity sensor 105a (the outputs of the pyroelectric elements RA to RD) at the next sampling timing (S23) and returns to process S21. If, on the other hand, a gesture can be decided, the gesture processing unit 1212 takes the decided gesture as the tentatively determined gesture, ends the gesture determination processing shown in FIG. 12, and executes process S13 of FIG. 11.
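 Under the same assumptions, the S21 to S23 loop can be sketched as edge detection on the four channels: samples are consumed until every element has shown both a rise and a fall, after which the rise order is handed to the classify() function from the earlier sketch. The sample format and the reuse of the noise threshold as the edge threshold are illustrative choices.

```python
# Sketch of S21-S23: accumulate rise/fall timings per element until the
# gesture is decidable (S22), then classify by rise order.
def judge_gesture(samples, th1=0.5):
    """samples: iterable of (time, {"RA": v, "RB": v, "RC": v, "RD": v})."""
    rise, fall = {}, {}
    prev = {"RA": 0.0, "RB": 0.0, "RC": 0.0, "RD": 0.0}
    for t, outputs in samples:
        for name, v in outputs.items():
            if v > th1 and prev[name] <= th1:
                rise.setdefault(name, t)        # first rising edge
            elif v <= th1 and prev[name] > th1:
                fall.setdefault(name, t)        # first falling edge
            prev[name] = v
        if len(rise) == 4 and len(fall) == 4:   # S22: decidable
            return classify(rise), rise, fall
    return None  # hand left the detection area before a decision (S23 loop)
```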
 Returning to FIG. 11, in process S13 the gesture processing unit 1212 determines whether the first part, in this example the head, has moved by determining whether the output of the gyro and acceleration sensor 110a exceeds a predetermined second threshold th2. The second threshold th2 is set so as to avoid misjudgments caused by noise.
 If the result of this determination is that the output of the gyro and acceleration sensor 110a does not exceed the second threshold th2 (No), the gesture processing unit 1212 finally determines the gesture as the tentatively determined gesture (S15), without applying the correction using the first measurement result performed in process S14 described below, and then executes process S16. If, on the other hand, the output of the gyro and acceleration sensor 110a exceeds the second threshold th2 (Yes), that is, if either the X-direction acceleration or the Z-direction acceleration from the gyro and acceleration sensor 110a exceeds the second threshold th2, the gesture processing unit 1212 corrects the tentatively determined gesture to finally determine the gesture (S14) and then executes process S16.
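 The branch in S13 reduces to a single threshold test. In the sketch below, th2 is an assumed value and correct_fn stands in for the S14 correction developed in the following paragraphs; neither is specified numerically in the embodiment.

```python
# Sketch of the S13 branch: keep the tentative gesture if the head was
# still (S15), otherwise correct it with the head movement (S14).
TH2 = 0.3   # second threshold on the gyro/acceleration outputs, assumed

def finalize(tentative, tentative_velocity, head_ax, head_az, correct_fn):
    if max(abs(head_ax), abs(head_az)) <= TH2:
        return tentative                                     # S15
    return correct_fn(tentative_velocity, head_ax, head_az)  # S14
```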
 In process S14, in more detail, the gesture processing unit 1212 first obtains the velocity vector (X-direction speed, Z-direction speed) of the tentatively determined gesture. As described above, the outputs of the pyroelectric elements RA to RD are sampled at the predetermined sampling interval, and the size of each pyroelectric element RA to RD (its width, i.e. its length in the X direction; its height, i.e. its length in the Z direction; and its diagonal length) can be measured in advance. The sizes of the pyroelectric elements RA to RD are therefore measured in advance and stored in the storage unit 125, and the velocity vector is obtained by dividing, in accordance with the tentatively determined gesture, the relevant size of a pyroelectric element by the time from the rise timing of its signal to its fall timing.
 More specifically, when the tentatively determined gesture is a right-to-left movement of the hand HD, the gesture processing unit 1212 obtains, based on the sampling interval, the time from the rise timing of the signal of the pyroelectric element RA or RC to its fall timing, and divides the width of the pyroelectric element RA or RC by this time to obtain the X-direction speed of the tentatively determined gesture. In a right-to-left movement of the hand HD, the Z-direction speed is zero, so the velocity vector of the tentatively determined right-to-left gesture is obtained as (X-direction speed, 0). The time from the rise timing to the fall timing may be obtained by multiplying the number of samplings between the two timings by the sampling interval; alternatively, a timer corresponding to each pyroelectric element RA to RD may be functionally provided in the control processing unit 121, restarted at the rise timing, and used to measure the time until the fall timing.
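 Numerically, the computation is a single division. In the sketch below the element width is an assumed figure, since the embodiment only states that the sizes are measured in advance and stored in the storage unit 125.

```python
# Speed of a horizontal swipe: element width divided by the dwell time
# between the rise and fall of one element's signal (width is assumed).
ELEMENT_WIDTH_X = 0.004   # horizontal element size in metres, assumed

def x_speed(rise_t, fall_t):
    return ELEMENT_WIDTH_X / (fall_t - rise_t)

def x_speed_from_counts(n_samples, interval=0.02):
    return ELEMENT_WIDTH_X / (n_samples * interval)

print(x_speed(0.00, 0.08))      # 0.05 m/s
print(x_speed_from_counts(4))   # same dwell expressed as 4 samples of 20 ms
```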
 In this case, the gesture processing unit 1212 may instead obtain, based on the sampling interval, a first time from the rise timing of the signal of the pyroelectric element RA or RC to its fall timing and divide the first width, that of the pyroelectric element RA or RC, by this first time to obtain a first speed (a first X-direction speed); likewise obtain a second time from the rise timing of the signal of the pyroelectric element RB or RD to its fall timing and divide the second width, that of the pyroelectric element RB or RD, by this second time to obtain a second speed (a second X-direction speed); and take the average of the first and second speeds as the X-direction speed of the tentatively determined gesture. Alternatively, the gesture processing unit 1212 may obtain the corresponding speed for each of the pyroelectric elements RA to RD, average them, and take this average speed as the X-direction speed of the tentatively determined gesture. The same applies to the cases below.
 When the tentatively determined gesture is a left-to-right movement of the hand HD, the velocity vector is obtained as (speed in the −X direction, 0), in the same way as for the right-to-left movement described above.
 When the tentatively determined gesture is a top-to-bottom movement of the hand HD, the gesture processing unit 1212 obtains, based on the sampling interval, the time from the rise timing of the signal of the pyroelectric element RA or RB to its fall timing, and divides the height of the pyroelectric element RA or RB by this time to obtain the speed of the tentatively determined gesture in the −Z direction. In a top-to-bottom movement of the hand HD, the X-direction speed is zero, so the velocity vector of the tentatively determined top-to-bottom gesture is obtained as (0, speed in the −Z direction).
 When the tentatively determined gesture is a bottom-to-top movement of the hand HD, the velocity vector is obtained as (0, speed in the Z direction), in the same way as for the top-to-bottom movement described above.
 When the tentatively determined gesture is a movement of the hand HD from the upper right to the lower left, the gesture processing unit 1212 obtains, based on the sampling interval, the time from the rise timing of the signal of the pyroelectric element RA or RD to its fall timing, and divides the diagonal length of the pyroelectric element RA or RD by this time to obtain the speed of the tentatively determined gesture in the direction from the upper right toward the lower left. By decomposing this speed into an X-direction component and a Z-direction component based on the aspect ratio of the pyroelectric element RA or RD, the velocity vector of the tentatively determined upper-right-to-lower-left gesture is obtained as (X-direction speed, Z-direction speed).
 The velocity vectors for the cases where the tentatively determined gesture is a movement of the hand HD from the upper left to the lower right, from the lower left to the upper right, or from the lower right to the upper left are obtained in the same way as for the upper-right-to-lower-left movement described above.
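 For the diagonal cases, the decomposition by aspect ratio amounts to projecting the diagonal speed onto the two axes. The element dimensions below are assumptions for illustration.

```python
import math

# Diagonal swipe: speed along the element diagonal, split into X and Z
# components in proportion to the element's width and height (assumed sizes).
ELEMENT_W, ELEMENT_H = 0.004, 0.003   # metres, assumed

def diagonal_velocity(rise_t, fall_t):
    diag = math.hypot(ELEMENT_W, ELEMENT_H)   # diagonal length
    speed = diag / (fall_t - rise_t)          # speed along the diagonal
    return (speed * ELEMENT_W / diag,         # X component
            speed * ELEMENT_H / diag)         # Z component

print(diagonal_velocity(0.0, 0.1))   # (0.04, 0.03) m/s for a 0.1 s dwell
```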
 Next, the gesture processing unit 1212 obtains the velocity vector of the first movement of the head, an example of the first part, from the X-direction and Z-direction accelerations of the gyro and acceleration sensor 110a. By the time the gesture is tentatively determined, a plurality of outputs have been acquired from the gyro and acceleration sensor 110a. For example, the average of these outputs may be used as the X-direction and Z-direction accelerations in the above processing, or the output acquired at the middle timing of the time series may be used.
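 The embodiment does not spell out how the sampled accelerations are converted into a velocity vector; one plausible reading, sketched below, is to take the representative acceleration (average or middle sample) and multiply it by the duration of the judged gesture window. That conversion step is an assumption, not a detail taken from the text.

```python
# Hedged sketch of the head-velocity estimate: representative acceleration
# (average or middle sample) times the duration of the gesture window.
def head_velocity(ax_samples, az_samples, interval=0.02, use_middle=False):
    if use_middle:
        ax = ax_samples[len(ax_samples) // 2]
        az = az_samples[len(az_samples) // 2]
    else:
        ax = sum(ax_samples) / len(ax_samples)
        az = sum(az_samples) / len(az_samples)
    window = len(ax_samples) * interval   # duration of the judged gesture
    return ax * window, az * window       # crude (vx, vz) estimate
```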
 Next, the gesture processing unit 1212 subtracts the velocity vector of the first movement of the head from the velocity vector of the tentatively determined gesture to obtain the velocity vector of the second movement of the hand HD, an example of the second part. The gesture processing unit 1212 then finally determines the gesture as the gesture corresponding to this velocity vector. The tentatively determined gesture is thereby corrected, and the gesture is finally determined.
 For example, when the hand HD is moved from right to left while the head is moving from top to bottom, process S12 (the repetition of processes S21 to S23) tentatively determines a gesture of moving the hand HD from the lower right to the upper left. In processes S13 and S14, the velocity vector of this tentatively determined lower-right-to-upper-left gesture is obtained, the head velocity vector obtained from the gyro and acceleration sensor 110a is subtracted from it, and the velocity vector of the hand HD is obtained. In this hand velocity vector, the speed component caused by the top-to-bottom movement of the head has been removed, and the Z-direction speed becomes substantially zero. The tentatively determined lower-right-to-upper-left gesture is therefore corrected to a right-to-left gesture of the hand HD and finally determined as such.
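 The following fragment works through this example with illustrative numbers and an assumed sign convention (positive X for leftward and positive Z for upward apparent motion as seen by the sensor); the zero tolerance is likewise assumed.

```python
# Worked version of the example: subtract the head velocity from the
# tentative gesture velocity and treat near-zero components as zero.
EPS = 0.005   # tolerance below which a component counts as zero, assumed

def correct(tentative_v, head_v):
    vx = tentative_v[0] - head_v[0]
    vz = tentative_v[1] - head_v[1]
    vx = 0.0 if abs(vx) < EPS else vx
    vz = 0.0 if abs(vz) < EPS else vz
    return vx, vz

tentative = (0.05, 0.05)   # looked like lower right -> upper left
head = (0.0, 0.05)         # upward apparent motion induced by the nod
print(correct(tentative, head))   # (0.05, 0.0): right -> left after all
```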
 In process S16, the gesture processing unit 1212 notifies the detection result utilization unit 1213 of the finally determined gesture.
 Next, the control unit 1211 samples the outputs of the proximity sensor 105a (the outputs of the pyroelectric elements RA to RD) at the next sampling timing and, as in process S11, determines whether the second part, in this example the hand HD, is present by determining whether any output of the proximity sensor 105a exceeds the first threshold th1 (S17).
 If the result of this determination is that no output of the proximity sensor 105a exceeds the first threshold th1 (No), the control unit 1211 puts the gesture processing unit 1212 into the sleep state, returns to process S11, and executes it at the next sampling timing. If, on the other hand, an output of the proximity sensor 105a exceeds the first threshold th1 (Yes), that is, if the output of any of the pyroelectric elements RA to RD exceeds the first threshold th1, the control unit 1211 returns to process S12 and executes the next gesture determination processing.
 Meanwhile, upon receiving the notification from the gesture processing unit 1212 in process S16, the detection result utilization unit 1213 executes predetermined processing based on the determination result of the gesture processing unit 1212. For example, when the determination result of the gesture processing unit 1212 is a so-called "flick", the detection result utilization unit 1213 changes the display, through the display control of the display control unit 104DR, so that the page is turned from the first image formed in the image forming unit 104A to a second image.
 As described above, the HMD 100 as an example of the wearable electronic device of this embodiment, and the gesture detection method implemented in it, recognize the gesture made by the second movement of the second part (the hand HD in this embodiment) in consideration of not only the second measurement result measured by the second motion measurement unit 105 (the proximity sensor 105a in this embodiment) but also the first measurement result measured by the first motion measurement unit 110 (the gyro and acceleration sensor 110a in this embodiment), so a gesture can be detected even if the part wearing the device (the head in this embodiment) moves.
 In the HMD 100 as an example of the wearable electronic device of this embodiment, and in the gesture detection method implemented in it, the gesture processing unit 1212 does not operate until the second motion measurement unit 105 detects the presence of the second part (the gesture processing unit 1212 remains idle), so power can be saved.
 In the HMD 100 as an example of the wearable electronic device of this embodiment, and in the gesture detection method implemented in it, the gesture processing unit 1212 is mounted in the control unit CTU, which is separate from the frame 101, an example of the head mounting member. By mounting only the minimum necessary configuration on the head mounting member, the head-mounted portion can be made smaller and lighter, and the wearing sensation caused by wearing the head mounting member, for example discomfort or annoyance, can be reduced.
 In the embodiment described above, as indicated by the broken line in FIG. 5, the HMD 100 may further include, functionally in the control processing unit 121, a motion detection processing unit 1214 that detects a predetermined motion of the living body based on the first measurement result measured by the first motion measurement unit 110, and the gesture processing unit 1212 may, when the motion detection processing unit 1214 detects the predetermined motion, recognize the gesture made by the second movement of the second part based on the second measurement result measured by the second motion measurement unit 105. In this way, when the living body performs a predetermined motion (for example, walking) other than a movement of the first part (for example, head shaking), the gesture can be recognized without taking the predetermined motion into account. More specifically, the measurement result produced by the first motion measurement unit 110 (the gyro and acceleration sensor 110a in this embodiment) when the living body performs the predetermined motion, for example walking, is obtained in advance and stored in the storage unit 125 as a signal pattern corresponding to that motion. When judging a gesture as described above, the motion detection processing unit 1214 determines whether the living body is performing the predetermined motion by determining whether the first measurement result measured by the first motion measurement unit 110 correlates with the signal pattern stored in the storage unit 125 within a predetermined range. When the motion detection processing unit 1214 determines that the living body is performing the predetermined motion, the gesture processing unit 1212 recognizes the gesture made by the second movement of the second part (the hand HD in this embodiment) based on the second measurement result measured by the second motion measurement unit 105 (the proximity sensor 105a in this embodiment). The motion detection processing unit 1214 may also be configured in hardware.
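 One simple realization of the required correlation test is a normalized cross-correlation between the recent first-sensor trace and the stored pattern, as sketched below. The window length, the pattern itself, and the 0.8 threshold are assumptions; the embodiment only requires that the measurement correlate with the stored signal pattern within a predetermined range.

```python
# Sketch of the walking detector: normalized correlation between the recent
# motion trace and a stored walking pattern (threshold assumed).
def normalized_correlation(trace, pattern):
    n = min(len(trace), len(pattern))
    t, p = trace[-n:], pattern[:n]
    mt, mp = sum(t) / n, sum(p) / n
    num = sum((a - mt) * (b - mp) for a, b in zip(t, p))
    den = (sum((a - mt) ** 2 for a in t) *
           sum((b - mp) ** 2 for b in p)) ** 0.5
    return num / den if den else 0.0

def is_walking(trace, walking_pattern, threshold=0.8):
    return normalized_correlation(trace, walking_pattern) >= threshold
```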
 In the embodiment described above, the gesture processing unit 1212 may also cancel the output of the second motion measurement unit 105 (the proximity sensor 105a in this embodiment) and redo the tentative gesture determination when the output of the first motion measurement unit 110 (the gyro and acceleration sensor 110a in this embodiment) exceeds a predetermined third threshold th3.
 In the embodiment described above, the first motion measurement unit 110 is the gyro and acceleration sensor 110a, but it is not limited to this. The first motion measurement unit 110 may comprise one or more of an acceleration sensor that measures acceleration, an angular velocity sensor that measures angular velocity, a speed sensor that measures speed, a vibration sensor that measures vibration, an inclinometer that measures inclination, and a geomagnetic sensor that measures azimuth.
 This specification discloses techniques of various aspects as described above, and the main ones are summarized below.
 A wearable electronic device according to one aspect comprises: a mounting member to be mounted on a first part of a living body; a first motion measurement unit for measuring a first movement of the mounting member; a second motion measurement unit that is mounted on the mounting member and measures a second movement of a second part of the living body different from the first part, or of an indicator provided on the second part for performing a gesture; and a gesture processing unit that recognizes a gesture made by the second movement based on first and second measurement results measured by the first and second motion measurement units. Preferably, in the wearable electronic device described above, the first motion measurement unit comprises one or more of an acceleration sensor that measures acceleration, an angular velocity sensor that measures angular velocity, a speed sensor that measures speed, and a vibration sensor that measures vibration. Preferably, in these wearable electronic devices, the second motion measurement unit comprises a passive sensor having a plurality of pyroelectric elements arranged in a two-dimensional matrix. Preferably, in these wearable electronic devices, the second motion measurement unit comprises an active sensor having an infrared light source that emits infrared light and a plurality of pyroelectric elements arranged in a two-dimensional matrix. Preferably, in these wearable electronic devices, the gesture processing unit removes, from the second measurement result measured by the second motion measurement unit, a first movement component due to the first movement based on the first measurement result measured by the first motion measurement unit, and recognizes the gesture made by the second movement of the second part based on the second measurement result from which the first movement component has been removed. Preferably, in the wearable electronic device described above, the gesture processing unit tentatively determines a gesture of the second part of the living body, or of the indicator provided on the second part, based on the second measurement result, and recognizes the gesture made by the second movement of the second part or the indicator based on the tentatively determined gesture and the first measurement result. Preferably, in the wearable electronic device described above, the gesture processing unit modifies the gesture tentatively determined based on the second measurement result, based on the first measurement result, and recognizes the gesture made by the second movement. Preferably, in the wearable electronic device described above, the gesture processing unit obtains the velocity vector of the first part and the velocity vector of the second part, and recognizes the gesture made by the second movement by subtracting the velocity vector of the first part from the velocity vector of the second part.
 Such a wearable electronic device recognizes the gesture made by the second movement of the second part in consideration of not only the second measurement result measured by the second motion measurement unit but also the first measurement result measured by the first motion measurement unit, so a gesture can be detected even if the part wearing the device moves.
 In another aspect, the wearable electronic devices described above further comprise a motion detection processing unit that detects a predetermined motion of the living body based on the first measurement result measured by the first motion measurement unit, and when the motion detection processing unit detects the predetermined motion, the gesture processing unit recognizes the gesture made by the second movement of the second part, or of the indicator provided on the second part, based on the second measurement result measured by the second motion measurement unit.
 Such a wearable electronic device can, when the living body performs a predetermined motion (for example, walking) other than a movement of the first part (for example, head shaking), recognize the gesture without taking the predetermined motion into account.
 In another aspect, in the wearable electronic devices described above, the gesture processing unit cancels the tentatively determined gesture when the first measurement result exceeds a predetermined threshold.
 In another aspect, in the wearable electronic device described above, the second motion measurement unit further detects the presence or absence of the second part, and the gesture processing unit starts operating when the second motion measurement unit detects the presence of the second part. Preferably, the gesture processing unit starts operating by turning its power supply on, from a state in which the power supply is off, when the second motion measurement unit detects the presence of the second part. Preferably, the gesture processing unit starts operating by entering an active state, from a sleep state, when the second motion measurement unit detects the presence of the second part.
 In such a wearable electronic device, the gesture processing unit does not operate until the second motion measurement unit detects the presence of the second part (the gesture processing unit remains idle), so power can be saved.
 In another aspect, in the wearable electronic devices described above, the gesture processing unit is mounted on a member separate from the mounting member.
 Such a wearable electronic device can make the mounted portion smaller by mounting only the minimum necessary configuration on the mounting member, and can reduce the wearing sensation, for example discomfort or annoyance, caused by wearing the mounting member.
 In another aspect, in the wearable electronic devices described above, the mounting member is a member for mounting on the head, the first part being the head.
 This provides a wearable electronic device worn on the head, and such a wearable electronic device can detect a gesture of a second movement by, for example, a hand or fingers even when there is a first movement such as head shaking.
 A gesture detection method for a wearable electronic device according to another aspect is a gesture detection method for a wearable electronic device mounted on a first part of a living body by a mounting member, the method comprising: measuring a first movement of the mounting member to acquire a first measurement result; measuring a second movement of a second part of the living body different from the first part, or of an indicator provided on the second part for performing a gesture, to acquire a second measurement result; and recognizing a gesture made by the second movement based on the first and second measurement results. Preferably, in the gesture detection method described above, a gesture of the second part of the living body or of an indicator provided on the second part is tentatively determined based on the second measurement result, and the gesture made by the second movement of the second part or the indicator is recognized based on the tentatively determined gesture and the first measurement result. Preferably, the gesture tentatively determined based on the second measurement result is modified based on the first measurement result, and the gesture made by the second movement is recognized. Preferably, a predetermined motion of the living body is further detected based on the first measurement result, and when the predetermined motion is detected, the gesture made by the second movement is recognized based on the second measurement result.
 Such a gesture detection method for a wearable electronic device recognizes the gesture made by the second movement of the second part in consideration of not only the second measurement result measured in the second motion measurement step but also the first measurement result measured in the first motion measurement step, so a gesture can be detected even if the part wearing the device moves.
 This application is based on Japanese Patent Application No. 2015-104830 filed on May 22, 2015, the contents of which are incorporated herein.
 In order to express the present invention, it has been described above appropriately and sufficiently through the embodiments with reference to the drawings, but it should be recognized that a person skilled in the art can easily modify and/or improve the embodiments described above. Therefore, unless a modification or improvement carried out by a person skilled in the art is at a level that departs from the scope of the claims as recited, that modification or improvement is construed as being encompassed by the scope of those claims.
 According to the present invention, a wearable electronic device and a gesture detection method for a wearable electronic device can be provided.

Claims (13)

  1.  A wearable electronic device comprising:
     a mounting member to be mounted on a first part of a living body;
     a first motion measurement unit for measuring a first movement of the mounting member;
     a second motion measurement unit that is mounted on the mounting member and measures a second movement of a second part of the living body different from the first part, or of an indicator provided on the second part for performing a gesture; and
     a gesture processing unit that recognizes a gesture made by the second movement based on first and second measurement results measured by the first and second motion measurement units.
  2.  The wearable electronic device according to claim 1, wherein the gesture processing unit tentatively determines a gesture of the second part of the living body, or of the indicator provided on the second part, based on the second measurement result, and recognizes the gesture made by the second movement of the second part or the indicator based on the tentatively determined gesture and the first measurement result.
  3.  The wearable electronic device according to claim 2, wherein the gesture processing unit modifies the gesture tentatively determined based on the second measurement result, based on the first measurement result, and recognizes the gesture made by the second movement.
  4.  The wearable electronic device according to claim 3, wherein the gesture processing unit obtains a velocity vector of the first part and a velocity vector of the second part, and recognizes the gesture made by the second movement by subtracting the velocity vector of the first part from the velocity vector of the second part.
  5.  The wearable electronic device according to any one of claims 1 to 4, further comprising a motion detection processing unit that detects a predetermined motion of the living body based on the first measurement result measured by the first motion measurement unit, wherein, when the motion detection processing unit detects the predetermined motion, the gesture processing unit recognizes the gesture made by the second movement of the second part, or of the indicator provided on the second part, based on the second measurement result measured by the second motion measurement unit.
  6.  The wearable electronic device according to claim 3, wherein the gesture processing unit cancels the tentatively determined gesture when the first measurement result exceeds a predetermined threshold.
  7.  The wearable electronic device according to claim 1 or 2, wherein the second motion measurement unit further detects the presence or absence of the second part, and the gesture processing unit starts operating when the second motion measurement unit detects the presence of the second part.
  8.  The wearable electronic device according to any one of claims 1 to 7, wherein the gesture processing unit is mounted on a member separate from the mounting member.
  9.  The wearable electronic device according to any one of claims 1 to 8, wherein the mounting member is a member for mounting on the head, the first part being the head.
  10.  A gesture detection method for a wearable electronic device mounted on a first part of a living body by a mounting member, the method comprising:
     measuring a first movement of the mounting member to acquire a first measurement result;
     measuring a second movement of a second part of the living body different from the first part, or of an indicator provided on the second part for performing a gesture, to acquire a second measurement result; and
     recognizing a gesture made by the second movement based on the first and second measurement results.
  11.  The gesture detection method for a wearable electronic device according to claim 10, wherein a gesture of the second part of the living body, or of an indicator provided on the second part, is tentatively determined based on the second measurement result, and the gesture made by the second movement is recognized based on the tentatively determined gesture and the first measurement result.
  12.  The gesture detection method for a wearable electronic device according to claim 11, wherein the gesture tentatively determined based on the second measurement result is modified based on the first measurement result, and the gesture made by the second movement is recognized.
  13.  The gesture detection method for a wearable electronic device according to claim 10 or 11, wherein a predetermined motion of the living body is further detected based on the first measurement result, and when the predetermined motion is detected, the gesture made by the second movement is recognized based on the second measurement result.
PCT/JP2016/063617 2015-05-22 2016-05-06 Wearable electronic device and gesture detection method for wearable electronic device WO2016190057A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-104830 2015-05-22
JP2015104830 2015-05-22

Publications (1)

Publication Number Publication Date
WO2016190057A1 true WO2016190057A1 (en) 2016-12-01

Family

ID=57394051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/063617 WO2016190057A1 (en) 2015-05-22 2016-05-06 Wearable electronic device and gesture detection method for wearable electronic device

Country Status (1)

Country Link
WO (1) WO2016190057A1 (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH086708A (en) * 1994-04-22 1996-01-12 Canon Inc Display device
JP2000148381A (en) * 1998-11-05 2000-05-26 Telecommunication Advancement Organization Of Japan Input image processing method, input image processor and recording medium on which input image processing program has been recorded
JP2000353046A (en) * 1999-06-11 2000-12-19 Mr System Kenkyusho:Kk User interface device, user interface method, game device and program storage medium
JP2001103395A (en) * 1999-09-30 2001-04-13 Kawasaki Heavy Ind Ltd Head mount display device
JP2006293604A (en) * 2005-04-08 2006-10-26 Canon Inc Information processing method, information processor, and remote mixed reality sharing device
JP2007134785A (en) * 2005-11-08 2007-05-31 Konica Minolta Photo Imaging Inc Head mounted video display apparatus
JP2011237987A (en) * 2010-05-10 2011-11-24 Olympus Corp Operation input device and manipulator system
JP2013137413A (en) * 2011-12-28 2013-07-11 Brother Ind Ltd Head-mounted display
JP2013231520A (en) * 2012-04-27 2013-11-14 Panasonic Corp Air conditioner
WO2014128789A1 (en) * 2013-02-19 2014-08-28 株式会社ブリリアントサービス Shape recognition device, shape recognition program, and shape recognition method
JP2015069480A (en) * 2013-09-30 2015-04-13 ブラザー工業株式会社 Head-mount display, and control program

Similar Documents

Publication Publication Date Title
JP6398870B2 (en) Wearable electronic device and gesture detection method for wearable electronic device
US9360935B2 (en) Integrated bi-sensing optical structure for head mounted display
US9335547B2 (en) Head-mounted display device and method of controlling head-mounted display device
CN103529929B (en) Gesture recognition system and glasses capable of recognizing gesture actions
US20150009309A1 (en) Optical Frame for Glasses and the Like with Built-In Camera and Special Actuator Feature
US11947728B2 (en) Electronic device for executing function based on hand gesture and method for operating thereof
JP6607254B2 (en) Wearable electronic device, gesture detection method for wearable electronic device, and gesture detection program for wearable electronic device
US11733952B2 (en) Wearable electronic device including display, method for controlling display, and system including wearable electronic device and case
US20230244301A1 (en) Augmented reality device for changing input mode and method thereof
US20230199328A1 (en) Method of removing interference and electronic device performing the method
KR20150091724A (en) Wearable eyeglass device
US20230135420A1 (en) Wearable electronic device for displaying virtual object and method of controlling the same
US20220375172A1 (en) Contextual visual and voice search from electronic eyewear device
WO2016190057A1 (en) Wearable electronic device and gesture detection method for wearable electronic device
WO2016052061A1 (en) Head-mounted display
CN116830065A (en) Electronic device for tracking user gaze and providing augmented reality service and method thereof
WO2016072271A1 (en) Display device, method for controlling display device, and control program therefor
WO2017094557A1 (en) Electronic device and head-mounted display
JP6790769B2 (en) Head-mounted display device, program, and control method of head-mounted display device
US11762486B2 (en) Electronic device for performing plurality of functions using stylus pen and method for operating same
US20230251362A1 (en) Method of removing interference and electronic device performing the method
US11762202B1 (en) Ring-mounted flexible circuit remote control
US20240192492A1 (en) Wearable device outputting sound for object of interest and method for controlling the same
EP4350420A1 (en) Lens assembly including light-emitting element disposed on first lens, and wearable electronic device including same
US20240046578A1 (en) Wearable electronic device displaying virtual object and method for controlling the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16799765

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16799765

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP