WO2017018428A1 - Wearable device, control method, and control program - Google Patents
Wearable device, control method, and control program
- Publication number
- WO2017018428A1 (PCT/JP2016/071936)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wearable device
- display
- body motion
- upper limb
- rotating body
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
Description
- This application relates to a wearable device that can be worn on a user's head, a control method, and a control program.
- The purpose of this application is to provide a wearable device that is easier to use.
- A wearable device according to one aspect can be mounted on the head and includes a detection unit capable of detecting a user's upper limb in real space, and a control unit that executes a predetermined process upon detecting, from the detection result of the detection unit, a rotating body motion accompanied by rotation of the arm of the upper limb.
- A wearable device according to another aspect can be mounted on the head and includes an imaging unit. Its control unit executes a predetermined process upon detecting a rotating body motion accompanied by a reversal from one to the other of a first state, in which the upper limb included in the captured image shows its palm side, and a second state, in which the upper limb included in the captured image shows the back side of the hand.
- A wearable device according to another aspect can be mounted on the head and includes a detection unit capable of detecting a user's upper limb in real space, and a control unit that executes a predetermined process upon detecting, from the detection result of the detection unit, a specific body motion involving both a movement in which one part of the upper limb moves away from the wearable device and a movement in which another part of the upper limb approaches the wearable device.
- A control method according to one aspect is executed by a head-mountable wearable device that includes a detection unit capable of detecting a user's upper limb in real space and a control unit. The control unit executes a predetermined process upon detecting, from the detection result of the detection unit, a rotating body motion accompanied by rotation of the arm of the upper limb.
- A control program according to one aspect causes a head-mountable wearable device, which includes a detection unit capable of detecting a user's upper limb in real space, to execute a predetermined process upon detecting, from the detection result of the detection unit, a rotating body motion accompanied by rotation of the arm of the upper limb.
- FIG. 1 is a perspective view of a wearable device 1.
- FIG. 2 is a block diagram of the wearable device 1.
- FIG. 3A is a perspective view schematically showing the detection range 51 of the detection unit 5 and the display areas 21 of the display units 2a and 2b.
- FIG. 3B is a top view of FIG. 3A.
- FIG. 3C is a side view of FIG. 3A.
- FIG. 4 is a diagram for describing a first example of functions executed by the wearable device 1.
- FIG. 5 is a diagram for describing the first example of functions executed by the wearable device 1, following FIG. 4.
- FIG. 6 is a diagram illustrating a second example of functions executed by the wearable device 1.
- FIG. 7 is a diagram explaining a third example of functions executed by the wearable device 1.
- FIG. 8 is a diagram explaining a fourth example of functions executed by the wearable device 1.
- FIG. 9 is a diagram explaining a fifth example of functions executed by the wearable device 1.
- FIG. 10 is a diagram explaining a sixth example of functions executed by the wearable device 1.
- Further figures explain a second modification and a third modification of the third to sixth examples, and seventh, eighth, and ninth examples of the functions executed by the wearable device.
- FIG. 1 is a perspective view of the wearable device 1.
- The wearable device 1 is a head-mounted (glasses-type) device worn on the user's head.
- The wearable device 1 has a front surface portion 1a, a side surface portion 1b, and a side surface portion 1c.
- The front surface portion 1a is arranged in front of the user so as to cover both eyes when worn.
- The side surface portion 1b is connected to one end of the front surface portion 1a, and the side surface portion 1c is connected to the other end.
- When worn, the side surface portions 1b and 1c are supported by the user's ears like the temples of eyeglasses and stabilize the wearable device 1.
- The side surface portions 1b and 1c may instead be configured to connect behind the user's head when worn.
- The front surface portion 1a includes a display unit 2a and a display unit 2b on the surface facing the user's eyes when worn.
- The display unit 2a is disposed at a position facing the user's right eye when worn, and the display unit 2b at a position facing the left eye.
- The display unit 2a displays an image for the right eye, and the display unit 2b displays an image for the left eye.
- By including the display units 2a and 2b, which display images corresponding to each of the user's eyes when worn, the wearable device 1 can realize three-dimensional display using binocular parallax.
- The display units 2a and 2b are, for example, a pair of transmissive or semi-transmissive displays, but are not limited thereto.
- The display units 2a and 2b may include lenses such as eyeglass lenses, sunglass lenses, or ultraviolet-cut lenses, or may be provided separately from such lenses.
- The display units 2a and 2b may also be configured as a single display device, provided that different images can be independently presented to the user's right and left eyes.
- An imaging unit 3 (out-camera) is provided on the front surface portion 1a.
- The imaging unit 3 is disposed at the central portion of the front surface portion 1a.
- The imaging unit 3 acquires an image of a predetermined range of the scenery in front of the user.
- The imaging unit 3 can also acquire an image of a range corresponding to the user's field of view, meaning here the field of view when the user looks straight ahead.
- The imaging unit 3 may instead be constituted by two imaging units: one disposed near one end of the front surface portion 1a (the right-eye side when worn) and one near the other end (the left-eye side when worn).
- In that case, the imaging unit near the right-eye side acquires an image of the range corresponding to the field of view of the user's right eye, and the imaging unit near the left-eye side acquires an image of the range corresponding to the field of view of the left eye.
- An imaging unit 4 (in-camera) is also provided on the front surface portion 1a.
- When the wearable device 1 is mounted on the user's head, the imaging unit 4 faces the user's face and acquires images of the face, for example of the eyes.
- The front surface portion 1a is further provided with a detection unit 5, and the side surface portion 1c with an operation unit 6; both are described later.
- The wearable device 1 has a function of allowing the user to visually recognize various kinds of information.
- The wearable device 1 allows the user to see the foreground through the display units 2a and 2b, together with the display contents of the display units 2a and 2b.
- FIG. 2 is a block diagram of the wearable device 1.
- The wearable device 1 includes the display units 2a and 2b, the imaging unit 3 (out-camera), the imaging unit 4 (in-camera), the detection unit 5, the operation unit 6, a control unit 7, a communication unit 8, and a storage unit 9.
- The display units 2a and 2b each include a transmissive or semi-transmissive display device such as a liquid crystal display or an organic EL (Organic Electro-Luminescence) panel.
- The display units 2a and 2b display various kinds of information as images in accordance with control signals input from the control unit 7.
- The display units 2a and 2b may instead be projection devices that project an image onto the user's retina using a light source such as a laser beam.
- In that case, a half mirror may be installed on the lens portion of the eyeglass-shaped wearable device 1 so that an image irradiated from a separately provided projector is projected onto it (in the example shown in FIG. 1, the display units 2a and 2b are illustrated as rectangular half mirrors).
- As described above, the display units 2a and 2b may display various kinds of information three-dimensionally, and may display it as if it exists in front of the user (at a position away from the user).
- As a method of displaying information in this way, any of the frame sequential method, the polarization method, the linear polarization method, the circular polarization method, the top-and-bottom method, the side-by-side method, the anaglyph method, the lenticular method, the parallax barrier method, the liquid crystal parallax barrier method, and multi-parallax methods such as the two-parallax method may be employed.
- The imaging units 3 and 4 electronically capture images using an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
- The imaging units 3 and 4 convert the captured images into signals and output them to the control unit 7.
- The detection unit 5 detects a real object (predetermined object) existing in the foreground of the user.
- The detection unit 5 detects, for example, a real object that matches a pre-registered object or a pre-registered shape (for example, a human hand or finger).
- The detection unit 5 includes a sensor for detecting real objects.
- The detection unit 5 includes, for example, an infrared irradiation unit that irradiates infrared rays and, as its sensor, an infrared imaging unit that can receive infrared rays reflected from a real predetermined object.
- By providing the infrared irradiation unit on the front surface portion 1a of the wearable device 1, infrared rays can be irradiated in front of the user, and by providing the infrared imaging unit on the front surface portion 1a, infrared rays reflected from a predetermined object in front of the user can be detected. Note that, instead of or in addition to infrared light, the detection unit 5 may detect a real object using at least one of visible light, ultraviolet light, radio waves, sound waves, magnetism, and capacitance.
- The imaging unit 3 (out-camera) may also serve as the detection unit 5; that is, the imaging unit 3 may detect an object within the imaging range by analyzing the captured image.
- As shown in FIG. 1, the imaging unit 3 is provided on the front surface portion 1a of the wearable device 1 so that it can image a predetermined object in front of the user.
- The operation unit 6 is, for example, a touch sensor disposed on the side surface portion 1c.
- The touch sensor detects the user's contact and, according to the detection result, accepts basic operations such as starting and stopping the wearable device 1 and changing the operation mode.
- Although the operation unit 6 is shown disposed on the side surface portion 1c, it is not limited to this arrangement; it may be disposed on the side surface portion 1b, or on both side surface portions 1b and 1c.
- The control unit 7 includes a CPU (Central Processing Unit) as a calculation unit and a memory as a storage unit, and implements various functions by executing programs using these hardware resources. Specifically, the control unit 7 reads programs and data stored in the storage unit 9, expands them in the memory, and causes the CPU to execute the instructions contained in the expanded program. The control unit 7 then reads and writes data to and from the memory and the storage unit 9, and controls the operation of the display units 2a and 2b and the other components according to the execution results.
- The communication unit 8 communicates wirelessly.
- The wireless communication standards supported by the communication unit 8 include, for example, cellular-phone communication standards such as 2G, 3G, and 4G, and short-range wireless communication standards.
- Examples of cellular-phone communication standards include LTE (Long Term Evolution), W-CDMA (Wideband Code Division Multiple Access), WiMAX (Worldwide Interoperability for Microwave Access), GSM (registered trademark) (Global System for Mobile Communications), and PHS (Personal Handy-phone System).
- Examples of short-range wireless communication standards include IEEE 802.11, Bluetooth (registered trademark), IrDA (Infrared Data Association), NFC (Near Field Communication), and WPAN (Wireless Personal Area Network).
- The communication unit 8 may support one or more of the communication standards described above.
- The wearable device 1 can transmit and receive various signals by establishing a wireless communication connection with other electronic devices having a wireless communication function (smartphones, notebook computers, televisions, and the like).
- The wearable device 1 includes a connector to which another electronic device can be connected.
- The connector may be a general-purpose terminal such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Light Peak (Thunderbolt (registered trademark)), or an earphone-microphone connector.
- The connector may also be a dedicated terminal such as a dock connector.
- The connector may be connected to any device, including, for example, an external storage, a speaker, or a communication device.
- The storage unit 9 is composed of a nonvolatile storage device such as a flash memory and stores various programs and data.
- The programs stored in the storage unit 9 include a control program 90.
- The storage unit 9 may be configured by a combination of a portable storage medium such as a memory card and a reader/writer device that reads from and writes to the storage medium.
- In that case, the control program 90 may be stored in the storage medium.
- The control program 90 may also be acquired from a server device, a smartphone, a laptop computer, a television, or the like by wireless or wired communication.
- The control program 90 provides functions related to the various controls for operating the wearable device 1.
- The control program 90 includes a detection processing program 90a and a display control program 90b.
- The detection processing program 90a provides a function of detecting a predetermined object existing in the foreground of the user from the detection result of the detection unit 5, as well as a function of detecting the position and motion of that object.
- The display control program 90b provides a function of displaying an image so as to be visible to the user and changing the display mode of the image according to the motion of the predetermined object.
- In the following, the detection unit 5 is described as a sensor that detects a real predetermined object using infrared rays; that is, it includes an infrared irradiation unit that emits infrared rays and an infrared imaging unit that can receive (has sensitivity to) infrared rays reflected from the real predetermined object. The control unit 7 thus detects the real predetermined object from the image captured by the infrared imaging unit. Further, in the present embodiment, the display images are described as being displayed as if they exist at a position away from the wearable device 1.
- FIG. 3A is a perspective view schematically showing the detection range 51 of the detection unit 5 and the display areas 21 of the display units 2a and 2b.
- FIG. 3B is a top view of FIG. 3A.
- FIG. 3C is a side view of FIG. 3A.
- In FIGS. 3A to 3C, a three-dimensional orthogonal coordinate system consisting of an X axis, a Y axis, and a Z axis is defined.
- The X-axis direction is the horizontal direction, and the Y-axis direction is the vertical direction, that is, the long-axis direction of the user's body.
- The Z-axis direction is the user's front-rear direction.
- The positive Z-axis direction indicates the direction of greater depth in the irradiation by the infrared irradiation unit included in the detection unit 5.
- FIG. 3C corresponds to the field of view when the user looks at the front.
- The detection range 51 is a three-dimensional space.
- The detection unit 5, comprising the infrared irradiation unit and the infrared imaging unit, can detect a predetermined object in front of the user as a two-dimensional image and can also detect the shape of that object.
- Furthermore, the detection unit 5 can acquire depth data corresponding to the position-coordinate data of each pixel of that image (that is, it can acquire a depth image to which depth data is added).
- The depth data indicates the distance from the detection unit 5 to the real object (predetermined object) corresponding to each pixel of the two-dimensional image, as illustrated in the sketch below.
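- As a concrete illustration of this data layout, the following is a minimal sketch assuming a NumPy-backed container; the class name `DepthImage` and its fields are hypothetical, not part of the patent.

```python
import numpy as np

class DepthImage:
    """Two-dimensional image in which each pixel additionally carries
    depth data: the distance from the detection unit 5 to the real
    object (predetermined object) imaged at that pixel."""

    def __init__(self, width: int, height: int):
        self.width = width
        self.height = height
        # depth[y, x] = distance (e.g. in millimetres) from the detector
        # to the object at pixel (x, y)
        self.depth = np.zeros((height, width), dtype=np.float32)

    def distance_at(self, x: int, y: int) -> float:
        """Depth data corresponding to the position coordinates (x, y)."""
        return float(self.depth[y, x])
```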
- Based on the detection result of the detection unit 5, the control unit 7 detects the motion of the predetermined object, for example when the predetermined object is the user's arm, hand, finger, or a combination of these (collectively referred to as the upper limb).
- For example, the control unit 7 detects, as body motions, finger bending and extension, wrist bending, forearm rotation (pronation or supination), and hand or finger rotation associated with forearm rotation.
- In the present embodiment, the rotation of the forearm (pronation or supination), or the rotation of the hand or fingers accompanying the rotation of the forearm, is referred to as a "rotating body motion".
- The "rotating body motion" includes not only the reversal between the palm side and the back of the hand caused by a 180-degree rotation of the forearm, but also hand and finger rotation caused by forearm rotation of less than 180 degrees, and by rotation at an angle greater than 180 degrees.
- In addition to the above body motions, the control unit 7 may detect, as a body motion, the movement of the position of a specific part of the upper limb within the detection range 51, or the upper limb forming a specific shape; for example, a form in which the thumb is extended upward while the other fingers are gripped (a thumbs-up gesture) may be detected as a body motion.
- When detecting the rotating body motion among the body motions described above, the control unit 7 can detect the rotating body motion based on the change in the shape of the upper limb detected by the detection unit 5 during the rotation of the forearm.
- The control unit 7 can also detect the rotation angle of the upper limb in the rotating body motion based on that change in shape.
- Alternatively, the control unit 7 can detect the rotating body motion based on the change in the depth data of the upper limb during the rotation of the forearm.
- The control unit 7 may also define at least two regions of the upper limb in advance and detect a rotating body motion based on the relative change in the depth data of the two regions during the forearm rotation, as sketched below. For example, when the forearm is rotated (pronation or supination) with two of the five fingers extended, one fingertip moves closer to the detection unit 5 while the other moves farther away, so the rotating body motion can be detected from the resulting changes in the depth data.
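- A minimal sketch of this two-region comparison, assuming depth samples (in millimetres) for the two fingertip regions before and after the motion; the 15 mm threshold is an illustrative value, not one prescribed by the patent.

```python
def is_rotating_motion(depth_a: tuple, depth_b: tuple,
                       threshold_mm: float = 15.0) -> bool:
    """Detect a rotating body motion from the relative depth change of two
    predetermined regions of the upper limb (e.g. two extended fingertips).

    depth_a and depth_b are (before, after) distances of regions A and B
    from the detection unit 5. During pronation or supination one region
    moves closer to the detector while the other moves away, so the two
    depth changes should have opposite signs and a noticeable magnitude.
    """
    delta_a = depth_a[1] - depth_a[0]
    delta_b = depth_b[1] - depth_b[0]
    opposite_signs = delta_a * delta_b < 0
    large_enough = min(abs(delta_a), abs(delta_b)) >= threshold_mm
    return opposite_signs and large_enough

# Example: index fingertip approaches by 30 mm, thumb recedes by 25 mm.
print(is_rotating_motion((400.0, 370.0), (400.0, 425.0)))  # True
```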
- The control unit 7 can likewise detect the rotation angle of the upper limb in the rotating body motion based on the change in the depth data that accompanies the forearm rotation.
- Based on the depth data, the control unit 7 can further determine that the detected upper limb shows its palm side if the central part of the hand region is concave in the depth direction, and the back side of the hand if the central part is convex in the depth direction, as in the sketch below.
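- A sketch of this concavity test, assuming a depth array and a boolean mask of the pixels recognized as the hand; the 30/70 percentile split of the region into centre and periphery is an assumption made for illustration.

```python
import numpy as np

def palm_or_back(depth: np.ndarray, hand_mask: np.ndarray) -> str:
    """Compare the mean depth of the central part of the hand region with
    that of its periphery: a recessed centre (farther from the detector)
    means a concave surface -> palm side; a centre bulging toward the
    detector means a convex surface -> back of the hand."""
    ys, xs = np.nonzero(hand_mask)
    cy, cx = ys.mean(), xs.mean()
    radii = np.hypot(ys - cy, xs - cx)
    central = radii < np.percentile(radii, 30)     # central part of the hand
    peripheral = radii > np.percentile(radii, 70)  # peripheral part
    center_depth = depth[ys[central], xs[central]].mean()
    edge_depth = depth[ys[peripheral], xs[peripheral]].mean()
    return "palm" if center_depth > edge_depth else "back"
```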
- When the imaging unit 3 also serves as the detection unit, the control unit 7 detects a predetermined object within the detection range (imaging range) and its motion, in the same manner as with the detection unit 5.
- When detecting the rotating body motion among the body motions described above, the control unit 7 can detect it based on the change in the shape of the upper limb in the image captured by the imaging unit 3 during the rotation of the forearm.
- The control unit 7 can also detect the rotation angle of the upper limb in the rotating body motion based on that change in shape.
- The control unit 7 may also analyze the captured image and determine whether the palm side or the back side of the hand is shown, depending on whether fingernails are detected in the region recognized as the hand (if no nail is detected, the palm side is determined; if nails are detected, the back side of the hand). A rotating body motion may then be detected from the change, caused by the body motion, from one of the palm side and the back of the hand to the other. The control unit 7 can also detect the rotation angle of the upper limb in the rotating body motion based on the change in the shape of the nails, or in the size of the regions regarded as nails, during the forearm rotation.
- Alternatively, the determination of palm side versus back of hand may be based on whether palm prints (hand creases) are present in the region recognized as the hand in the captured image, and the rotating body motion may likewise be detected from the resulting reversal from one side to the other. A sketch of this classification follows.
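- The following sketch captures both variants; the two detector callables stand in for whatever image-analysis routine is used (template matching, a trained classifier, and so on), which the description does not prescribe.

```python
from typing import Callable, Optional
import numpy as np

def classify_hand_side(hand_region: np.ndarray,
                       nail_detector: Callable[[np.ndarray], bool],
                       crease_detector: Callable[[np.ndarray], bool]) -> Optional[str]:
    """Decide palm side vs. back of hand for the region of the captured
    image recognized as the hand."""
    if nail_detector(hand_region):
        return "back"    # nails detected -> back of the hand
    if crease_detector(hand_region):
        return "palm"    # palm prints (hand creases) detected -> palm side
    return None

def reversal_detected(previous: Optional[str], current: Optional[str]) -> bool:
    """A rotating body motion is recognized when the hand reverses from one
    of palm side / back side to the other between observations."""
    return previous is not None and current is not None and previous != current
```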
- As described above, the display units 2a and 2b display images so that the user perceives them in the display area 21, located away from the wearable device 1, rather than on the wearable device 1 itself (hereinafter, the images displayed by the display units 2a and 2b may be referred to as display images). The display units 2a and 2b may display the display image as a solid 3D object having depth, the depth corresponding to the thickness in the Z-axis direction. Alternatively, rather than making an image visible in the distant display area 21, the display units 2a and 2b may display images on the display units 2a and 2b of the wearable device 1 itself.
- FIG. 4 is a diagram illustrating a first example of functions executed by the wearable device 1.
- FIG. 4 shows the display unit 2a or 2b (hereinafter also simply referred to as the display unit 2), the display area 21, and the upper limb of the user of the wearable device 1; illustration of the other components of the wearable device 1 is omitted.
- FIG. 4 shows, in two dimensions, the area that the user can visually recognize. The same applies to the examples of FIGS. 5 to 20 described later.
- In step S1, the user views the back side BH of the right hand H (hereinafter also simply referred to as the hand BH) as the user's upper limb, beyond the display area 21.
- The hand BH is within the detection range 51 of the detection unit 5, so the wearable device 1 recognizes the presence of the hand BH from the detection result of the detection unit 5.
- In step S1, the wearable device 1 displays on the display unit 2 an icon group OB1 consisting of a plurality of icons, each indicating that a predetermined pre-associated function can be executed by a user operation (an instruction operation such as selection or execution).
- In the first example, the icon group OB1 is displayed as a transparent or translucent image, so the user can see the upper limb through the icon group OB1; however, the display is not limited to this, and the icon group OB1 may be displayed as an opaque image.
- In step S1, when the user moves the hand BH so that the fingertip of its index finger overlaps the display range of one icon OB101 in the icon group OB1, the wearable device 1 regards the icon OB101 as having been selected by the user and changes the display mode of the icon OB101 (step S2). The wearable device 1 estimates in advance the range of real space that the user sees superimposed on the display area 21, and can accordingly estimate, from the detected position of the index finger within that range, which position in the display area 21 the finger appears to overlap; a sketch of this hit test follows. In the present embodiment, an icon or icon group is treated as one kind of display image.
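- A sketch of this estimation as a simple hit test, assuming a linear mapping between the pre-estimated real-space range and the display area; the `Rect` helper and the icon table are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def to_display_coords(finger_xy, real_range: Rect, display: Rect):
    """Map a fingertip position inside the real-space range that the user
    sees superimposed on the display area 21 into display coordinates."""
    u = (finger_xy[0] - real_range.x) / real_range.w
    v = (finger_xy[1] - real_range.y) / real_range.h
    return display.x + u * display.w, display.y + v * display.h

def selected_icon(finger_xy, real_range: Rect, display: Rect, icons: dict):
    """Return the icon whose display range the index fingertip overlaps."""
    px, py = to_display_coords(finger_xy, real_range, display)
    for name, rect in icons.items():
        if rect.contains(px, py):
            return name   # regarded as selected, e.g. icon OB101
    return None
```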
- In step S3, the wearable device 1 detects that a rotating body motion has been performed. Based on this detection, the wearable device 1 regards the user as having performed an operation for executing the function associated with the icon OB101 and starts executing that function (step S4).
- In step S4, the wearable device 1 displays the execution screen SC1 of the function associated with the icon OB101 in the display area 21 of the display unit 2.
- In FIG. 4, the palm side of the right hand H is referred to as the hand PH.
- As described above, the wearable device 1 includes the detection unit 5, which can detect the user's upper limb in real space, and the control unit 7, which executes a predetermined process (in the first example, activation of the function associated with the icon OB101) upon detecting, from the detection result of the detection unit 5, a rotating body motion accompanied by rotation of the arm of the upper limb.
- The wearable device 1 also includes the display unit 2, which displays a display image in front of the user's eyes, and the control unit 7 may execute, as the predetermined process, a first process on the display image (in the first example, execution of the function associated with the icon OB101, or display of the execution screen SC1).
- In the following, the "first process" refers mainly to a process related to control of a predetermined display.
- Because the wearable device 1 executes the predetermined function not on the basis of mere movement of the upper limb but on the basis of a body motion involving forearm rotation, which is less likely to be performed unintentionally, erroneous operations can be prevented.
- In the above example, the detection unit 5 includes the infrared irradiation unit and the infrared imaging unit, but the imaging unit 3 may also serve as the detection unit.
- In other words, the wearable device 1 may be a head-mountable device that includes an imaging unit (the imaging unit 3, or the infrared imaging unit of the detection unit 5) and that detects the user's upper limb from the image captured by the imaging unit; the predetermined process may then be executed when a rotating body motion is detected that involves a reversal from one to the other of a first state, in which the upper limb in the captured image shows its palm side, and a second state, in which it shows the back of the hand.
- In the first example, the wearable device 1 detects the rotating body motion based on the body motion involving the reversal from the back of the hand to the palm side, that is, a 180-degree rotation of the forearm.
- However, it is not limited to this; the rotating body motion may instead be detected when the rotation of the upper limb accompanying the forearm rotation reaches a predetermined angle, as in the snippet below.
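- A minimal expression of that alternative; the 90-degree default is illustrative only, and a threshold of 180 degrees reduces it to detecting the full back-to-palm reversal.

```python
def is_rotating_body_motion(rotation_deg: float, threshold_deg: float = 90.0) -> bool:
    """Treat a forearm rotation as a rotating body motion once the detected
    rotation angle of the upper limb reaches a predetermined angle."""
    return abs(rotation_deg) >= threshold_deg
```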
- In the first example, the position of the fingertip of the index finger of the right hand H does not change before and after the rotating body motion; that is, the user performs the rotating body motion with the extended index finger as the rotation axis. However, the mode of the rotating body motion is not limited to this.
- A motion in which the rotation axis does not coincide with the index finger, so that the position of the fingertip differs before and after the motion, may also be detected as a rotating body motion. In that case, when the control unit 7 detects the rotating body motion, it may execute the first process on the display image selected based on the position of the upper limb immediately before the detection (in the first example, the process related to the icon OB101 selected in step S2).
- Conversely, when the position of the fingertip of the index finger (a predetermined region of the upper limb) differs before and after the rotating body motion, the wearable device 1 may refrain from executing the predetermined process for that rotating body motion, or it may execute the predetermined process based on the detection of the rotating body motion regardless.
- FIG. 5 is a diagram illustrating the first example of functions executed by the wearable device 1, following FIG. 4. Step S4 shown in FIG. 5 is the same state as step S4 in FIG. 4, that is, the state in which the function based on the icon OB101 is being executed.
- In step S4, when the user rotates the forearm in the direction opposite to the direction in step S2 of FIG. 4 (rotation in the direction indicated by the dotted arrow in FIG. 5), the wearable device 1 regards the user as having performed an operation for ending the function associated with the icon OB101 and ends the execution of the function (step S5).
- In step S5, the wearable device 1 hides the execution screen SC1 as the function execution ends.
- As described above, in the wearable device 1, the control unit 7 executes the predetermined process upon detecting, as the rotating body motion, a body motion involving one of the pronation and supination of the arm (in the first example, supination), and ends the first process upon detecting, during execution of the first process, a body motion involving the other (in the first example, pronation).
- In the first example, execution of the predetermined process may mean execution of the function associated with the icon OB101, or display of the function execution screen SC1 as the first process accompanying that execution; likewise, ending the first process may mean ending the execution of the function associated with the icon OB101, or hiding the function execution screen SC1 accordingly.
- The wearable device 1 according to the present embodiment may also behave differently from the above: when the control unit 7 has executed the first process upon detecting a body motion involving one of the pronation and supination of the arm, and then detects a body motion involving the other within a predetermined time after that execution, it may execute a second process whose control content is paired with the first. For example, if an electronic file selected before the body motion is deleted upon detection of a body motion involving one of the pronation and supination of the arm, and the other of the two motions is detected within a predetermined time after the deletion, the deleted electronic file may be returned (restored) to its original position.
- To this end, when the wearable device 1 detects a rotating body motion, it may store whether the motion involved pronation or supination of the arm, execute the predetermined process, and monitor whether the opposite rotating body motion is detected during the execution or within the predetermined time after it, as in the sketch below.
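- A sketch of that bookkeeping, assuming a 5-second window and print placeholders for the first and second processes; none of these names come from the patent.

```python
import time

class RotationGestureController:
    """Remembers the direction of the first rotating body motion and
    watches for the opposite rotation within a time window."""

    UNDO_WINDOW_S = 5.0  # the "predetermined time" (assumed value)

    def __init__(self):
        self.first_direction = None   # "pronation" or "supination"
        self.executed_at = None

    def on_rotation(self, direction: str) -> None:
        if self.first_direction is None:
            # First rotating body motion: store its direction and execute
            # the predetermined process (e.g. delete the selected file).
            self.first_direction = direction
            self.executed_at = time.monotonic()
            print("execute first process")
        elif direction != self.first_direction:
            if time.monotonic() - self.executed_at <= self.UNDO_WINDOW_S:
                # Opposite rotation within the window: run the paired
                # second process (e.g. restore the deleted file).
                print("execute second process (undo)")
            self.first_direction = None
            self.executed_at = None
```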
- In the first example, the function is executed based on the transition from the back BH of the hand to the palm PH and stopped based on the transition from the palm PH to the back BH; however, the reverse configuration may also be used, that is, executing the function on the transition from the palm side PH to the back side BH and stopping it on the reverse transition.
- The wearable device 1 may also perform the same predetermined process for either a body motion involving rotation of the forearm in a first direction (for example, supination) or a body motion involving rotation in the opposite second direction (for example, pronation).
- FIG. 6 is a diagram for explaining a second example of functions executed by the wearable device 1.
- In the second example, the wearable device 1 displays a hand object OH imitating the user's upper limb in the display area 21 of the display unit 2.
- The hand object OH is displayed as an image having substantially the same shape as the shape of the user's upper limb detected by the detection unit 5, at a display position based on the position of the upper limb in the predetermined real space detected by the detection unit 5.
- By appropriately setting, within the detection range 51 of the detection unit 5, the detection range used to specify positions in the display area 21, the wearable device 1 enables operations based on the body motion of the upper limb even when the upper limb is not raised into the user's field of view.
- In step S11, the user turns the back of the hand toward the detection unit 5 in real space. Based on detecting the back side of the user's upper limb from the detection result of the detection unit 5, the wearable device 1 displays on the display unit 2 a hand object OBH representing the back side of the hand.
- In step S11, the wearable device 1 also displays the icon group OB1 consisting of a plurality of icons.
- When the user moves the hand so that the hand object OBH overlaps the icon OB101, the wearable device 1 regards the icon OB101 as selected and changes its display mode (step S12).
- When the user rotates the forearm (in the direction of the dotted arrow in FIG. 6) with the icon OB101 selected, the wearable device 1 detects that a rotating body motion has been performed and, at the same time, reverses the display mode of the hand object OBH from the back-of-hand state to the palm-side state (step S13). Based on the detection of the rotating body motion, the wearable device 1 regards the user as having performed an operation for executing the function associated with the icon OB101 and starts executing the function (step S14).
- In step S14, the wearable device 1 displays the execution screen SC1 of the function associated with the icon OB101 in the display area 21 of the display unit 2.
- The hand object OH in the palm-side state is denoted as hand object OPH.
- In the second example, the front-rear relationship between the icon OB101 and the hand object OH superimposed on each other may be changed before and after the rotating body motion.
- For example, before the rotating body motion the hand object OH is displayed in front of the icon OB101, that is, with priority over the icon OB101; after the motion, the display may change so that the hand object OH is displayed behind the icon OB101, that is, so that the icon OB101 is displayed with priority over the hand object OH.
- In this description, a display mode in which two images partly overlap and the overlapping part of one display image is displayed with priority over that of the other is referred to as "a plurality of display images displayed in a front-rear relationship"; a sketch follows.
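- The front-rear relationship can be modelled as a back-to-front ordering, as in this sketch; the object names are those of the second example, and the list representation is an assumption made for illustration.

```python
def swap_front_rear(z_order: list, a: str, b: str) -> list:
    """z_order lists display images from back to front; where two images
    overlap, the later one is displayed with priority. A detected rotating
    body motion swaps the two designated images in this ordering."""
    i, j = z_order.index(a), z_order.index(b)
    z_order[i], z_order[j] = z_order[j], z_order[i]
    return z_order

# Hand object drawn in front of icon OB101 before the motion, behind it after:
z_order = ["OB101", "hand_object"]                       # back -> front
print(swap_front_rear(z_order, "OB101", "hand_object"))  # ['hand_object', 'OB101']
```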
- FIG. 7 is a diagram for explaining a third example of functions executed by the wearable device 1.
- In the third example, as in the second example, the wearable device 1 displays on the display unit 2 a hand object OH having substantially the same shape as the upper limb in real space, at a display position based on the position of the upper limb in real space.
- The wearable device 1 displays an object OB2 and an object OB3 in the display area 21 of the display unit 2.
- The objects OB2 and OB3 are displayed partially overlapping, with the object OB2 in front of the object OB3, that is, with the object OB2 given priority over the object OB3; the plurality of display images (objects OB2 and OB3) are thus displayed in a front-rear relationship.
- In the following, any object referred to simply as an "object" (excluding the hand object) corresponds to a display image.
- In step S21, the user turns the back of the hand toward the detection unit 5 in real space, and the wearable device 1 displays in the display area 21 the hand object OBH representing the back of the hand, based on the detection of the back side of the user's upper limb from the detection result of the detection unit 5. Because the user holds the index finger and thumb apart, the fingertip F of the index finger and the fingertip T of the thumb in the hand object OBH are displayed apart from each other, with the fingertip F superimposed on the object OB3 and the fingertip T superimposed on the object OB2.
- In this state, the wearable device 1 regards both the object OB2 and the object OB3 as selected by the user, and displays a circular display effect around the fingertip F of the index finger and around the fingertip T of the thumb of the hand object OBH.
- When the user then rotates the forearm (in the direction of the dotted arrow in FIG. 7) (step S22), the wearable device 1 detects that the rotating body motion has been performed and changes the front-rear relationship between the object OB2 and the object OB3 (step S23). As shown in step S23, based on this change, the object OB3 is now displayed in front of the object OB2, that is, with priority over the object OB2.
- After detecting the rotating body motion, the wearable device 1 displays in the display area 21 the hand object OPH representing the palm side of the upper limb.
- As described above, in the wearable device 1, the display unit 2 displays a plurality of display images, and the control unit 7 executes the first process upon detecting the rotating body motion in a state where the plurality of display images are designated.
- The control unit 7 can regard a display image as designated based on the hand object OH, displayed according to the position of the upper limb, being superimposed on the display image when the upper limb is at a predetermined position in real space. A display image may also be regarded as designated by the upper limb when it is estimated that the user sees the upper limb in real space superimposed on the display image.
- The control unit 7 may execute the first process when the rotating body motion is detected in a state where a first display image of the plurality is designated by one part of the upper limb (the fingertip of the index finger) and a second display image is designated by another part (the fingertip of the thumb).
- In the third example, the control unit 7 changes the front-rear relationship of the plurality of display images as the first process.
- Also in the third example, the object OB2 is designated by the fingertip T of the thumb overlapping it, and the object OB3 by the fingertip F of the index finger overlapping it; however, the designation is not limited to this configuration.
- FIG. 8 is a diagram for explaining a fourth example of functions executed by the wearable device 1.
- In the fourth example, as in the preceding examples, the wearable device 1 displays on the display unit 2 the hand object OH having substantially the same shape as the upper limb in real space, at a display position based on the position of the upper limb in real space.
- The wearable device 1 displays an object OB4 and an object OB5 in the display area 21 of the display unit 2; the two objects largely overlap, with the object OB4 displayed in front of, that is, with priority over, the object OB5.
- In step S31, the user turns the back of the hand toward the detection unit 5 in real space, and the wearable device 1 displays on the display unit 2 the hand object OBH representing the back of the hand, based on the detection of the back side of the user's upper limb. The user then moves the upper limb to a predetermined position in real space so that the hand object OBH overlaps the object OB4; the wearable device 1 recognizes from the detection result of the detection unit 5 that a part of the hand object OBH is superimposed on the object OB4.
- When the user rotates the forearm in this state (in the direction of the dotted arrow in FIG. 8), the wearable device 1 detects that the rotating body motion has been performed and changes the front-rear relationship between the object OB4 and the object OB5 (step S32). As shown in step S32, based on this change, the object OB5 is now displayed in front of the object OB4, that is, with priority over the object OB4.
- Thus, in the fourth example, the front-rear relationship of a plurality of display images can be changed by the rotating body motion even without designating two objects with two parts of the upper limb, as was done in the third example by superimposing the thumb fingertip T on the object OB2 and the index fingertip F on the object OB3.
- FIG. 9 is a diagram for explaining a fifth example of functions executed by the wearable device 1.
- In the fifth example, the wearable device 1 likewise displays on the display unit 2 the hand object OH having substantially the same shape as the upper limb in real space, at a display position based on the position of the upper limb in real space.
- The wearable device 1 displays an object OB6 and an object OB7 in the display area 21 of the display unit 2; the two objects overlap, with the object OB6 displayed in front of, that is, with priority over, the object OB7.
- In step S41, the user turns the back of the hand toward the detection unit 5 in real space, and the wearable device 1 displays on the display unit 2 the hand object OBH representing the back of the hand. Because the user holds the index finger and thumb apart, their fingertips in the hand object OBH are displayed apart from each other, with the fingertip of the index finger superimposed on the object OB7 and the fingertip of the thumb on the object OB6.
- In this state, the wearable device 1 regards both the object OB6 and the object OB7 as selected by the user, and displays a display effect around the fingertip of the index finger and around the fingertip of the thumb in the hand object OBH.
- When the user then rotates the forearm (in the direction of the dotted arrow in FIG. 9) (step S42), the wearable device 1 detects that the rotating body motion has been performed and switches the display positions of the designated objects OB6 and OB7 (step S43).
- When switching the display positions, the wearable device 1 moves the object OB6 so that its corner closest to the object OB7 before the motion (the upper-right corner in step S42) coincides with the former position of the upper-right corner of the object OB7, and moves the object OB7 so that its corner closest to the object OB6 (the lower-left corner in step S42) coincides with the former position of the lower-left corner of the object OB6.
- The manner of switching the display positions of the objects OB6 and OB7 is not limited to this; for example, the wearable device 1 may exchange a specific point of the object OB6 (for example, its center position) with the corresponding point of the object OB7 (its center position).
- When the rotating body motion is detected, the wearable device 1 may switch the relative display positions of the two display images along the direction in which the two designating parts of the upper limb are aligned, or along the detected rotation direction of the rotating body motion (in the fifth example, both are the X-axis direction). In that case, the relative display positions of the two display images in the Y-axis direction are not particularly limited; conversely, if the designating parts are aligned in the Y-axis direction, or the rotation direction is the Y-axis direction, the relative positions in the Y-axis direction may be switched instead.
- The wearable device 1 may also move the display position of the object OB6, superimposed on the thumb fingertip of the hand object OH, to a position at least superimposed on the thumb fingertip after the rotating body motion, and likewise move the object OB7, superimposed on the index fingertip, to a position at least superimposed on the index fingertip after the motion. A sketch of the axis-wise position swap follows.
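- A sketch of switching the relative positions along one axis only, as in the fifth example where both the fingertip alignment and the rotation direction are the X axis; the positions are illustrative [x, y] pairs.

```python
def swap_along_axis(pos_a: list, pos_b: list, axis: int = 0) -> tuple:
    """Switch the relative relationship of two display images' positions
    along a single axis (axis 0 = X), leaving the other axis untouched."""
    new_a, new_b = list(pos_a), list(pos_b)
    new_a[axis], new_b[axis] = pos_b[axis], pos_a[axis]
    return new_a, new_b

# OB6 left of OB7 before the rotating body motion, right of it afterwards;
# the Y coordinates of the two objects are left unchanged.
ob6, ob7 = swap_along_axis([100, 240], [260, 200], axis=0)
print(ob6, ob7)  # [260, 240] [100, 200]
```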
- As described above, in the fifth example, the control unit 7 switches the display positions of a plurality of display images as the first process, based on the detection of the rotating body motion.
- In the fifth example, the fingertip positions of the two fingers are switched by the rotating body motion, and the display control illustrated simply exchanges the display positions before and after the motion; however, the control is not limited to this configuration.
- FIG. 10 is a diagram for explaining a sixth example of functions executed by the wearable device 1.
- The left side of FIG. 10 shows the area the user can view two-dimensionally (corresponding to the XY plane in FIG. 3), and the right side shows the area as viewed from above the user's head in the vertical direction (corresponding to the XZ plane in FIG. 3).
- In the sixth example, the wearable device 1 likewise displays on the display unit 2 the hand object OH having substantially the same shape as the upper limb in real space, at a display position based on the position of the upper limb in real space, and displays an object OB8 and an object OB9 in the display area 21.
- In step S51, the user turns the back of the hand toward the detection unit 5 in real space, and the wearable device 1 displays in the display area 21 the hand object OBH representing the back of the hand (left side of step S51). Because the user holds the index finger and thumb apart, the fingertip F of the index finger and the fingertip T of the thumb in the hand object OBH are displayed apart from each other, with the fingertip F superimposed on the object OB9 and the fingertip T on the object OB8.
- In this state, the wearable device 1 regards both the object OB8 and the object OB9 as selected by the user.
- In step S51, the fingertip F of the index finger and the fingertip T of the thumb are at approximately the same distance in the Z-axis direction; that is, the user sees both fingertips at positions approximately equidistant from the user. In the X-axis direction, the fingertips F and T are separated by a distance d1, indicated by a double arrow.
- When the user rotates the forearm from the state of step S51 (in the direction of the dotted arrow in FIG. 10), the wearable device 1 detects that the rotating body motion has been performed and detects the X-axis direction component d2 of the distance between the fingertip F of the index finger and the fingertip T of the thumb. The distance d2 in step S52 is smaller than the distance d1 in step S51.
- The wearable device 1 detects the change in the distance d between the fingertips F and T caused by the rotating body motion, and treats the amount of change as corresponding to an angle of rotation.
- Based on the detection of the rotating body motion and on the decrease of the distance between the fingertips F and T from d1 to d2, the wearable device 1 reduces the distance between the object OB8 and the object OB9 in the X-axis direction (step S52).
- When the user further rotates the forearm from the state of step S52 (in the direction of the dotted arrow in FIG. 10), the wearable device 1 again detects the amount of change in the distance d between the fingertip F of the index finger and the fingertip T of the thumb.
- When, after the distance d between the fingertips F and T has become zero, it increases again, the wearable device 1 detects that the relative positions of the fingertips F and T in the X-axis direction have been switched.
- That is, whereas the fingertip F of the index finger was positioned to the right of the fingertip T of the thumb in step S52, the wearable device 1 detects that in step S53 the fingertip F is positioned to the left of the fingertip T.
- Upon detecting this switch of the relative positions of the fingertips F and T in the X-axis direction, the wearable device 1 changes the relative positions of the objects OB8 and OB9 in the X-axis direction accordingly, as shown in step S53.
- Whereas the objects were displayed with the object OB9 to the right of the object OB8 in step S52, in step S53 the display positions are changed so that the object OB9 is to the left of the object OB8, and the objects OB9 and OB8 are displayed apart by a distance corresponding to the distance d3 between the fingertips F and T.
- the wearable device 1 when the control unit 7 detects a rotating body motion, a part of the upper limb (index fingertip) and another part (thumb fingertip) associated with the rotating body motion.
- the relative position between the first display image and the second display image is changed in accordance with the change in the component in the predetermined direction (distance d) in the distance between the first display image and the second display image.
- Alternatively, the wearable device 1 may change the relative position between the first display image and the second display image in accordance with the rotation angle in the rotating body motion, rather than with the change in the component in the predetermined direction (the distance d) of the distance between the part of the upper limb and the other part caused by the rotating body motion.
- In this case, when the user rotates the forearm from the state of step S51 (in the direction of the dotted arrow in FIG. 10), the wearable device 1 detects that the rotating body motion has been performed and detects the rotation angle of the upper limb in the rotating body motion.
- As the rotation angle, the amount of change in the angle θ formed by a virtual line v, which connects an arbitrary point (for example, the center) of the fingertip F of the index finger and an arbitrary point (for example, the center) of the fingertip T of the thumb, and a reference line x parallel to the X axis may be used.
- In the state shown in step S51, the fingertip F of the index finger and the fingertip T of the thumb are both at approximately the same distance from the user; that is, the virtual line v1 is parallel to the reference line x, and the angle θ is zero.
- In step S52, the virtual line v2 is no longer parallel to the reference line x due to the rotating body motion, and the angle θ has changed from the angle θ1 to the angle θ2.
- The rotation angle may also be defined as the angle by which the line segment between the fingertip F of the index finger and the fingertip T of the thumb is inclined about an arbitrary point, for example, the center point of that line segment.
- As the method of detecting the rotation angle, the various methods described above or other known methods may be employed as appropriate.
- The wearable device 1 regards the fingertip F of the index finger and the fingertip T of the thumb as having approached each other in the X-axis direction based on the fact that the angle θ formed by the virtual line v and the reference line x has changed from the angle θ1 to the angle θ2 (0° < θ1 < θ2 < 90°) by the rotating body motion, and, using this as a trigger, changes the display positions so as to reduce the distance in the X-axis direction between the object OB8 and the object OB9 (step S52). When displaying the object OB8 and the object OB9 with a smaller distance in the X-axis direction, the wearable device 1 displays them partially overlapped.
- When the user further rotates the forearm from the state of step S52 (in the direction of the dotted arrow in FIG. 10), the wearable device 1 again detects the rotation angle of the upper limb, that is, the amount of change in the angle θ.
- Based on the fact that the angle θ formed by the virtual line v and the reference line x changes from the angle θ2 (0° < θ2 < 90°) to the angle θ3 (90° < θ3 < 180°) in the transition from step S52 to step S53 caused by the rotating body motion, the wearable device 1 detects that the relative positions in the X-axis direction of the fingertip F of the index finger and the fingertip T of the thumb have been switched.
- Based on this switch in the relative positions in the X-axis direction of the index fingertip F and the thumb fingertip T, the wearable device 1 changes the relative display positions in the X-axis direction of the objects OB8 and OB9.
- The wearable device 1 changes the positions of the objects OB8 and OB9 and displays them apart from each other by a distance corresponding to the angle θ3.
- In the range where the angle θ satisfies 0° ≤ θ ≤ 90°, the wearable device 1 changes the display mode so that the object OB8 and the object OB9 become closer as the angle θ becomes larger.
- In step S53, the virtual line v3 is not parallel to the reference line x.
- As described above, in the wearable device 1, when the control unit 7 detects the rotating body motion, it detects the rotation angle (the amount of change in the angle θ) in the rotating body motion and, as the first process, changes the relative positions of the plurality of display images in accordance with the rotation angle (the amount of change in the angle θ).
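For illustration, the angle θ between the fingertip line and the X axis can be computed directly from 2D fingertip coordinates. This Python sketch assumes projected (x, y) positions in the plane of FIG. 3; the 90° crossing test follows the behavior described above, while the spacing rule and the display calls are assumptions made for the example.

```python
import math

def rotation_angle(index_tip, thumb_tip):
    """Angle theta (degrees, in [0, 180)) between the virtual line v through
    fingertips F and T and the reference line x parallel to the X axis."""
    dx = index_tip[0] - thumb_tip[0]
    dy = index_tip[1] - thumb_tip[1]
    return math.degrees(math.atan2(dy, dx)) % 180.0

def on_rotation(theta_prev, theta_now, objects):
    if theta_prev < 90.0 <= theta_now:    # crossing 90 deg: positions switched
        objects.swap_positions()
    elif theta_now > theta_prev:          # within 0..90 deg: move objects closer
        objects.set_spacing(90.0 - theta_now)   # illustrative spacing rule
```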
- In the above description, the relative positions of the plurality of display images are changed in accordance with either the change in the component in the predetermined direction of the distance between a part of the upper limb and another part caused by the rotating body motion, or the rotation angle in the rotating body motion; however, the configuration is not limited to these. For example, the wearable device 1 may measure the duration of the rotating body motion and change the relative positions of the plurality of display images based on the duration.
- The wearable device 1 may regard the rotating body motion as having started based on detecting that a part of the upper limb has approached the wearable device 1 by a first predetermined distance while another part of the upper limb has moved away from the wearable device 1 by a second predetermined distance.
- In the above description, the configuration in which the front-rear relationship or the display positions of the two display images are changed based on detecting the rotating body motion in a state where at least a part of the hand object OH is superimposed on at least one of the two display images has been exemplified; however, the configuration is not limited to this.
- For example, the object OB8 may be selected by bending the index finger with the index finger of the hand object OBH superimposed on the object OB8 (step S61), and the object OB9 may then be selected by bending the index finger with it superimposed on the object OB9 (step S62).
- When the rotating body motion is performed in a state where the objects OB8 and OB9 are selected and the hand object OBH is not superimposed on either of them (step S63), the wearable device 1 switches the display positions of the object OB8 and the object OB9 based on detecting the rotating body motion (step S64).
- Alternatively, the wearable device 1 may recognize in advance a direction P1 defined by the display position of the object OB8 and the display position of the object OB9 (step S71).
- The direction P1 is defined by a virtual line passing through a predetermined location (for example, the center coordinate position) of the object OB8 and the corresponding predetermined location (the center coordinate position) of the object OB9. When detecting the rotating body motion, the wearable device 1 also detects a direction P2 defined by a virtual line passing through the fingertips of the index finger and the thumb.
- The wearable device 1 determines whether the angle formed by the direction P1 and the direction P2 is within a predetermined range and, when it determines that the angle is within the predetermined range, switches the display positions of the object OB8 and the object OB9 (step S72). Even with such a configuration, the front-rear relationship or the display positions of a plurality of display images can be changed without superimposing the upper limb on the display images.
- The predetermined angle range may be defined as, for example, less than 30°.
- Instead of directly comparing the direction P1 and the direction P2, the wearable device 1 may, for example, decompose each of the direction P1 and the direction P2 into an X-axis direction component and a Y-axis direction component and, if their dominant components coincide, change the front-rear relationship or the display positions of the plurality of display images based on the rotating body motion. In the example of FIG. 12, since the X-axis direction component is larger than the Y-axis direction component in both the direction P1 and the direction P2, the control unit 7 determines that the detected rotating body motion is valid as an operation for the first process.
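Both direction tests described above are simple vector comparisons. The Python sketch below shows one plausible reading: the 30° threshold comes from the text, while the treatment of P1 and P2 as 2D direction vectors and the function names are assumptions.

```python
import math

def angle_between(p1, p2):
    """Unsigned angle in degrees between two nonzero 2D direction vectors."""
    dot = p1[0] * p2[0] + p1[1] * p2[1]
    norm = math.hypot(*p1) * math.hypot(*p2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def rotation_is_valid(p1, p2, max_angle=30.0):
    """P1: object-to-object direction; P2: fingertip-to-fingertip direction."""
    if angle_between(p1, p2) < max_angle:        # angle-threshold variant
        return True
    dominant = lambda v: 'x' if abs(v[0]) > abs(v[1]) else 'y'
    return dominant(p1) == dominant(p2)          # dominant-component variant
```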
- Alternatively, when the wearable device 1 detects the rotating body motion of the upper limb in a state where the index finger and thumb of the hand object OBH are extended, it generates a virtual line P3 passing through the fingertips of the index finger and the thumb of the hand object OBH immediately before the rotating body motion is performed (step S81).
- The wearable device 1 determines whether the virtual line P3 can pass through both the object OB8 and the object OB9 and, when it determines that the line can pass through both, switches the display positions of the object OB8 and the object OB9 (step S82). Even with such a configuration, the front-rear relationship or the display positions of a plurality of display images can be changed without superimposing the upper limb on the display images.
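Whether the virtual line P3 passes through both objects reduces to a line-rectangle intersection test. A minimal Python sketch follows; modeling each object as an axis-aligned rectangle is an assumption made for the example.

```python
def line_hits_rect(p, q, rect):
    """True if the infinite line through points p and q intersects the
    axis-aligned rectangle rect = (xmin, ymin, xmax, ymax): the rectangle's
    corners must not all lie strictly on one side of the line."""
    a = q[1] - p[1]
    b = p[0] - q[0]
    c = -(a * p[0] + b * p[1])
    corners = [(rect[0], rect[1]), (rect[0], rect[3]),
               (rect[2], rect[1]), (rect[2], rect[3])]
    sides = [a * x + b * y + c for x, y in corners]
    return min(sides) <= 0.0 <= max(sides)

def p3_selects_both(index_tip, thumb_tip, rect_ob8, rect_ob9):
    """Virtual line P3 through both fingertips must cross OB8 and OB9."""
    return (line_hits_rect(index_tip, thumb_tip, rect_ob8) and
            line_hits_rect(index_tip, thumb_tip, rect_ob9))
```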
- In the above description, the configuration in which the relative positions of a plurality of display images are changed has been exemplified as the configuration for executing different display control in accordance with the rotation angle in the rotating body motion; however, the configuration is not limited to this.
- FIG. 14 is a diagram for explaining a seventh example of functions executed by the wearable device 1.
- The left side of FIG. 14 shows an area (corresponding to the XY plane in FIG. 3) that the user can view two-dimensionally, with the back side BH of the user's right hand directed toward the user.
- The index finger of the hand BH is extended; the extension direction of the index finger is defined as the Y′ axis, and the direction perpendicular to the Y′ axis is defined as the X′ axis (the X′-Y′ plane is assumed to be substantially parallel to the XY plane).
- The right side of FIG. 14 shows the index finger as viewed from above along the Y′ axis.
- In step S91, the wearable device 1 displays, in the display area 21 of the display unit 2, an icon OB10 indicating that the mail function can be executed by the user's selection and execution operation.
- The wearable device 1 regards the icon OB10 as having been selected by the user based on the fingertip of the index finger of the hand BH being superimposed on the display range of the icon OB10.
- The wearable device 1 estimates in advance the range of the real space that the user visually recognizes as superimposed on the display area 21, and can accordingly estimate, from the detection position of the index finger within that range, which position in the display area 21 the index finger is visually recognized as superimposed on.
- When the user rotates the forearm by a first predetermined angle ρ1 about the extension direction of the index finger from the state of step S91, that is, with the icon OB10 selected (rotating in the direction of the dotted arrow in FIG. 14), the state transitions to that shown in step S92.
- When detecting that the rotating body motion has been performed, the wearable device 1 detects the first rotation angle ρ1. Based on the detection of the rotating body motion, the wearable device 1 regards the user as having performed an operation for executing the function associated with the icon OB10 and starts executing that function (step S92).
- In step S92, as the function associated with the icon OB10 is executed, the wearable device 1 displays execution screens SC2 and SC3 of the function on the display unit 2.
- The execution screens SC2 and SC3 are images showing simple information on the latest mail exchange with each mail partner.
- In step S93, based on the fact that the rotation angle in the rotating body motion has reached a second predetermined angle ρ2 larger than the first predetermined angle ρ1, the wearable device 1 displays on the display unit 2 execution screens SC2 and SC3 that contain a more detailed amount of information than in the case of the first predetermined angle ρ1 (for example, part of the mail text is newly added) and are larger in image size.
- Further, based on the fact that the rotation angle in the rotating body motion has reached the second predetermined angle ρ2, the wearable device 1 displays an execution screen SC4 on the display unit 2 in addition to the execution screens SC2 and SC3.
- The execution screen SC4 is, for example, an image showing information on the latest mail exchange with a mail partner different from those of the execution screens SC2 and SC3.
- In step S94, based on the fact that the rotation angle in the rotating body motion has reached a third predetermined angle ρ3 larger than the second predetermined angle ρ2, the wearable device 1 displays on the display unit 2 an execution screen SC2 that contains a more detailed amount of information than in the case of the second predetermined angle ρ2 (for example, a screen on which past mail contents can be browsed) and is larger in image size.
- As described above, in the wearable device 1, when the control unit 7 detects the rotating body motion, it detects the rotation angle in the rotating body motion and executes, as the first process, processing in accordance with the rotation angle. As the first process, the control unit 7 displays at least one other image related to the display image (the execution screens SC in the seventh example) and changes, in accordance with the rotation angle, the amount of information included in the other image, the size of the other image, or the number of the other images.
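The staged behavior of the seventh example maps a rotation angle to a detail level, a screen scale, and a screen count. The Python sketch below illustrates such a mapping; the threshold values and the returned tuples are invented for the example, since the text only requires that ρ1 < ρ2 < ρ3.

```python
# Hypothetical thresholds (degrees); only their ordering rho1 < rho2 < rho3
# is taken from the text.
RHO1, RHO2, RHO3 = 15.0, 45.0, 75.0

def mail_view_for(angle):
    """Map the detected rotation angle to (detail_level, scale, num_screens)."""
    if angle >= RHO3:
        return ('past_mail_browsable', 2.0, 3)   # step S94: most detail, largest
    if angle >= RHO2:
        return ('with_body_excerpt', 1.5, 3)     # step S93: SC4 added
    if angle >= RHO1:
        return ('summary', 1.0, 2)               # step S92: SC2 and SC3
    return ('icon_only', 0.0, 0)                 # step S91: icon OB10 only
```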
- FIG. 15 is a diagram illustrating an eighth example of functions executed by the wearable device 1.
- FIG. 15 shows a region (corresponding to the XY plane in FIG. 3) that the user can view two-dimensionally, with the back side BH of the user's right hand directed toward the user.
- the wearable device 1 displays a web browser screen SC5 in the display area 21 of the display unit 2.
- Two operation examples are shown together: a first operation example shown in steps S101 to S103 and a second operation example shown in steps S111 to S113.
- In step S101, the user superimposes the index finger of the hand BH on a predetermined character string SC501 on the screen SC5 and bends the index finger.
- The wearable device 1 recognizes that the predetermined character string on the screen SC5 has been selected by the user by detecting the position of the index finger of the hand BH in the real space and detecting that the index finger has been bent.
- When the user rotates the forearm from the state of step S101, that is, with the character string SC501 selected (rotating in the direction of the dotted arrow in FIG. 15), the wearable device 1 detects that the rotating body motion has been performed and determines whether the motion includes a movement of the upper limb position of a predetermined length or more.
- When the wearable device 1 determines that the position of the upper limb after the rotating body motion has not changed compared with the state of step S101 (step S102), that is, that the motion does not include a movement of the upper limb position of a predetermined length or more, the wearable device 1 transitions the display, based on this determination, to another web browser screen SC6 corresponding to the character string SC501 selected by the user, for example, as shown in step S103.
- In the second operation example, the user moves the upper limb, as shown in step S112, from the state where the hand BH is superimposed on a predetermined position on the screen SC5 (step S111) while rotating the forearm (in the direction of the dotted arrow in FIG. 15). The wearable device 1 detects that the rotating body motion has been performed and determines whether the rotating body motion includes a movement of the upper limb position of a predetermined length or more.
- In this case, the wearable device 1 determines that the rotating body motion includes a movement of the upper limb position of a predetermined length or more and performs display control different from that of the first operation example, as shown in step S113: instead of transitioning to the web browser screen SC6, the display transitions to another web browser screen SC7.
- As described above, in the wearable device 1, when the control unit 7 detects the rotating body motion, it determines whether the motion is a first rotating body motion that includes a movement of the upper limb position of a predetermined length or more or a second rotating body motion that does not include such a movement, and makes the control content differ between the predetermined process based on the first rotating body motion and the predetermined process based on the second rotating body motion.
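Classifying the two motion types amounts to accumulating how far the upper limb travels while the rotation is in progress. A minimal Python sketch, assuming sampled upper-limb positions and an invented travel threshold:

```python
import math

def classify_rotation(track, min_travel=0.05):
    """Return 'first' for a rotating body motion that includes movement of the
    upper limb position of at least min_travel (here metres, an assumed unit),
    or 'second' for a rotation performed in place. `track` is the sequence of
    upper-limb positions sampled while the rotation was detected."""
    travel = sum(math.dist(track[i], track[i + 1])
                 for i in range(len(track) - 1))
    return 'first' if travel >= min_travel else 'second'
```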
- In the above examples, configurations in which predetermined display control is performed as the predetermined process based on the wearable device 1 detecting the rotating body motion have been illustrated; however, the predetermined process is not limited to display control.
- FIG. 16 is a diagram illustrating a ninth example of functions executed by the wearable device 1.
- FIG. 16 shows an area (corresponding to the XY plane in FIG. 3) that can be viewed two-dimensionally by the user.
- The wearable device 1 has activated the imaging function and displays the captured images sequentially captured by the imaging unit 3 on the display unit 2 as a preview window PW.
- In step S121, the user moves the right hand H in front of the wearable device 1 and points the back side of the right hand H toward the wearable device 1.
- As a result, the back side BH of the right hand H is displayed in the preview window PW.
- When the user rotates the forearm in front of the wearable device 1 while visually recognizing the preview window PW (in the direction indicated by the dotted arrow in FIG. 16) (step S122), the wearable device 1 detects that the rotating body motion has been performed by analyzing the captured images. The wearable device 1 then changes the processing content of the imaging function as the predetermined process based on the rotating body motion.
- For example, whereas an object OB11 indicating the still image capturing mode is displayed in step S121, the display is changed in step S123 to an object OB12 indicating the moving image capturing mode.
- The wearable device 1 may also be configured to change various setting values related to the imaging function based on the rotating body motion, for example, the correction value in exposure correction, the ISO sensitivity, the white balance, the shutter speed, the aperture value, the depth of field, the focal length, the zoom ratio, and the like. In this case, the setting values may be changed in accordance with, for example, the number of repetitions of the rotating body motion.
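As an illustration of this ninth example, one rotation can flip the capture mode while repeated rotations step through a setting ladder. The Python sketch below is a hedged reading; the concrete ISO values and the wrap-around behavior are assumptions.

```python
ISO_STEPS = [100, 200, 400, 800, 1600]   # illustrative setting ladder

def toggle_mode(mode):
    """One rotating body motion flips still <-> video (steps S121 -> S123)."""
    return 'video' if mode == 'still' else 'still'

def iso_for(repetitions):
    """Repeated rotating body motions step through a setting, wrapping around."""
    return ISO_STEPS[repetitions % len(ISO_STEPS)]
```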
- FIG. 17 is a diagram illustrating a tenth example of functions executed by the wearable device 1.
- FIG. 17 shows an area (corresponding to the XY plane in FIG. 3) that can be viewed two-dimensionally by the user.
- In step S131, the wearable device 1 displays a display image OB13 on the display unit 2.
- A notebook personal computer 100 is present, as another electronic device, at a position close to the user or easily visible to the user.
- From the state of step S131, the user, with the wearable device 1 worn, changes the orientation of the head, for example, and transitions to a state in which the notebook personal computer 100 is visually recognized through the display area 21 of the wearable device 1 (step S132).
- At this time, the wearable device 1 determines that the notebook personal computer 100 is present in front of the wearable device 1 based on the detection result of the detection unit 5 or the captured image of the imaging unit 3.
- In step S132, the user visually recognizes the display image OB13 as superimposed on the notebook personal computer 100.
- In the illustrated example, the display image OB13 is opaque, so that the notebook personal computer 100 cannot be seen in the region where the display image OB13 and the notebook personal computer 100 overlap; however, the display image OB13 may be transparent or translucent. In such a case, the user can view the notebook personal computer 100 through the display image OB13.
- When, in step S132, the user moves the upper limb within the detection range of the detection unit 5 of the wearable device 1 and points the back side of the upper limb toward the detection unit 5, the wearable device 1 displays on the display unit 2 a hand object OBH that has substantially the same shape as the upper limb and represents the back side of the upper limb.
- When the user rotates the forearm from the state of step S132 (in the direction of the dotted arrow in FIG. 17) with at least a part of the hand object OBH superimposed on the display image OB13, so that the hand object OBH is reversed (step S133), the wearable device 1 detects that a rotating body motion has been performed. The wearable device 1 then hides the display image OB13 based on the detection of the rotating body motion (step S134).
- As described above, in the wearable device 1, the control unit 7 determines whether another display device is present in front of the wearable device 1 and, when another display device is present in front of the wearable device 1 and the rotating body motion is detected, hides the display image. With such a configuration, when a display image displayed by the wearable device 1 hinders the visual recognition of the display content of the display device, the user can immediately resolve the hindrance with a simple operation.
- When determining whether the notebook personal computer 100 is in front of the wearable device 1, the wearable device 1 may determine that the notebook personal computer 100 is present in front of it based on detecting a part or all of the notebook personal computer 100 within the detection range 51 of the detection unit 5 or the imaging range of the imaging unit 3, or based on detecting a part or all of the notebook personal computer 100 within a predetermined range of the detection range 51 or the imaging range (for example, a range of about 30 degrees that easily enters the user's field of view).
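The in-front check and the hide action of this tenth example can be sketched in a few lines of Python. The bearing-based field-of-view test and the display call are assumptions; the roughly 30-degree range comes from the text above.

```python
def device_in_front(device_bearing_deg, fov_deg=30.0):
    """True if the detected display device lies within a +/- fov_deg / 2 cone
    around the wearable device's forward direction."""
    return abs(device_bearing_deg) <= fov_deg / 2.0

def on_rotating_motion(display, device_bearing_deg):
    """Hide the overlay image when a display device sits in front of the user
    and the rotating body motion is detected (steps S133 -> S134)."""
    if device_bearing_deg is not None and device_in_front(device_bearing_deg):
        display.hide('OB13')    # hypothetical display API
```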
- FIG. 18 is a diagram illustrating an eleventh example of functions executed by the wearable device 1.
- The eleventh example is an example in which the wearable device 1 executes a predetermined communication process with another display device based on the user's body motion.
- FIG. 18 shows an area (corresponding to the XY plane in FIG. 3) that can be viewed two-dimensionally by the user.
- In step S141, the wearable device 1 displays on the display unit 2 an image list OB14 in which a plurality of display images including a display image OB141 are displayed as a list.
- A notebook personal computer 100 is present, as another electronic device, at a position close to the user or easily visible to the user.
- When the user wearing the wearable device 1 changes the orientation of the head from the state of step S141 and transitions to a state in which the notebook personal computer 100 is viewed through the display area 21 (step S142), the wearable device 1 determines, based on the detection result of the detection unit 5 or the captured image of the imaging unit 3, that the notebook personal computer 100 is present in front of the wearable device 1.
- Based on the determination that the notebook personal computer 100 is present in front of the wearable device 1, the wearable device 1 changes the display mode of the plurality of display images displayed in the image list OB14; for example, as shown in step S142, the display images are rearranged within the display area 21 to positions where they are not visually recognized as superimposed on the notebook personal computer 100.
- When, in step S142, the user moves the upper limb within the detection range of the detection unit 5 of the wearable device 1 and points the back side of the upper limb toward the detection unit 5, the wearable device 1 displays on the display unit 2 a hand object OBH that has substantially the same shape as the upper limb and represents the back side of the upper limb.
- When the user rotates the forearm from the state of step S142 (in the direction of the dotted arrow in FIG. 18) with at least a part of the hand object OBH superimposed on the display image OB141, the hand object OBH is reversed (step S143), and the wearable device 1 detects that a rotating body motion has been performed.
- Based on the detection of the rotating body motion, the wearable device 1 regards the display image OB141 as having been selected by the user and changes the display mode of the display image OB141.
- For example, the wearable device 1 changes the display mode so that the display image OB141 appears in front of the hand object OBH after the rotating body motion.
- When the user, from the state of step S143, rotates the forearm (in the direction of the dotted arrow in FIG. 18, i.e., pronation) with at least a part (the fingertip) of the hand object OBH superimposed on the display image OB141, that is, moves the fingertip position to a region of the display area 21 overlapping the display unit of the notebook personal computer 100 while performing the rotating body motion (step S144), the wearable device 1 determines that the user has performed an operation for transferring the image data corresponding to the display image OB141 to the notebook personal computer 100. The wearable device 1 establishes a wireless communication connection with the notebook personal computer 100 and transmits the image data to the notebook personal computer 100. As shown in step S145, the notebook personal computer 100 displays a display image OB141′ having the same content as the display image OB141 on its display unit based on the image signal received from the wearable device 1.
- As described above, the wearable device 1 includes the communication unit 8 that communicates with other electronic devices, and the control unit 7 determines whether another display device is present in front of the wearable device 1. When another display device is present in front of the wearable device 1 and the rotating body motion is detected, the control unit 7 executes, as the predetermined process, a second process including a data transfer process through communication with the other electronic device.
- The wearable device 1 may detect the position to which at least a part (the fingertip) of the hand object OBH has moved with the rotating body motion, from the position superimposed on the display image OB141 to a region of the display area 21 overlapping the display unit of the notebook personal computer 100, and may control the notebook personal computer 100 so that the display image OB141′ is displayed at a position overlapping the detected position or a position near it.
- On the other hand, when at least a part (the fingertip) of the hand object OBH moves from the position superimposed on the display image OB141 to the region of the display area 21 overlapping the display unit of the notebook personal computer 100 without the rotating body motion being performed, the wearable device 1 may determine that the operation is not an operation for transferring the image data corresponding to the display image OB141 to the notebook personal computer 100.
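The transfer decision of this eleventh example combines two conditions: the fingertip must end up over the region of the display area that overlaps the PC's screen, and the move must be accompanied by the rotating body motion. A minimal Python sketch, with the communication object and the rectangle representation as assumptions:

```python
def decide_transfer(rotated, fingertip, pc_screen_rect, comm, image_data):
    """Send the image data only for a rotating-motion drag that ends over the
    region of the display area 21 overlapping the PC's display unit; an
    ordinary drag without the rotation triggers no transfer."""
    xmin, ymin, xmax, ymax = pc_screen_rect
    over_pc = xmin <= fingertip[0] <= xmax and ymin <= fingertip[1] <= ymax
    if rotated and over_pc:
        comm.connect()          # hypothetical wireless link to the PC
        comm.send(image_data)   # display image OB141 appears as OB141' on the PC
        return True
    return False
```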
- FIG. 19 is a diagram for explaining a twelfth example of functions executed by the wearable device 1.
- the twelfth example is an example in which the wearable device 1 executes a predetermined communication process with another display device based on the user's physical motion, similarly to the eleventh example.
- FIG. 19 shows an area (corresponding to the XY plane in FIG. 3) that can be viewed two-dimensionally by the user.
- In step S151, the wearable device 1 displays on the display unit 2 an image list OB15 in which a plurality of display images including a display image OB151 are displayed as a list.
- A notebook personal computer 100 is present, as another electronic device, at a position close to the user or easily visible to the user.
- When the user wearing the wearable device 1 changes the orientation of the head from the state of step S151 and transitions to a state in which the notebook personal computer 100 is viewed through the display area 21 (step S152), the wearable device 1 determines, based on the detection result of the detection unit 5 or the captured image of the imaging unit 3, that the notebook personal computer 100 is present in front of the wearable device 1.
- When, in step S152, the user moves the upper limb within the detection range of the detection unit 5 of the wearable device 1 and points the back side of the upper limb toward the detection unit 5, the wearable device 1 displays on the display unit 2 a hand object OBH that has substantially the same shape as the upper limb and represents the back side of the upper limb.
- When the user rotates the forearm from the state of step S152 (in the direction of the dotted arrow in FIG. 19) with at least a part of the hand object OBH superimposed on the display image OB151, the hand object OBH is reversed (step S153), and the wearable device 1 detects that a rotating body motion has been performed. When the wearable device 1 detects the rotating body motion, it determines whether at least a part of the display image OB151, on which the user has superimposed at least a part of the hand object OBH, is superimposed on the display unit of the notebook personal computer 100; if it determines that they are superimposed, it regards the user as having performed an operation for transferring the image data corresponding to the display image OB151 to the notebook personal computer 100.
- the wearable device 1 then establishes a wireless communication connection with the notebook computer 100 and transmits image data to the notebook computer 100.
- the notebook computer 100 displays the display image OB151 'having the same content as the display image OB151 on the display unit 2 based on the image signal received from the wearable device 1.
- On the other hand, when the wearable device 1 detects the rotating body motion and determines that the display image OB151, on which the user has superimposed at least a part of the hand object OBH, is not superimposed on the display unit of the notebook personal computer 100, the wearable device 1 regards the operation for transferring the image data corresponding to the display image OB151 as not having been performed.
- FIG. 20 is a diagram for explaining a thirteenth example of functions executed by the wearable device 1.
- the thirteenth example is an example in which the wearable device 1 executes a predetermined communication process with another display device based on the user's body movement, similarly to the eleventh example and the twelfth example.
- FIG. 20 shows a region (corresponding to the XY plane in FIG. 3) that can be viewed two-dimensionally by the user.
- In step S161, the user is viewing a television 200 as another electronic device through the display area 21 of the display unit 2.
- That is, the user is viewing the video displayed on the television 200 through the display area 21 of the display unit 2.
- When, in step S162, the user moves the upper limb within the detection range 51 of the detection unit 5 of the wearable device 1 and points the back side of the upper limb toward the detection unit 5, the wearable device 1 displays on the display unit 2 a hand object OBH that has substantially the same shape as the upper limb and represents the back side of the upper limb.
- When the user rotates the forearm (in the direction of the dotted arrow in FIG. 20), the hand object OBH is reversed (step S162), and the wearable device 1 detects that a rotating body motion has been performed. When the wearable device 1 detects the rotating body motion, it determines whether the rotating body motion has been performed in a state where at least a part of the hand object OBH is superimposed on the television 200 or the display unit of the television 200, as viewed in the front-rear direction of the wearable device 1 or in a direction intersecting the XY plane at a predetermined angle.
- In other words, the wearable device 1 determines whether the rotating body motion has been detected in a state where the user has specified the television 200 or the video displayed on the television 200.
- When the wearable device 1 has detected the rotating body motion in the state where the user has specified the television 200 or the video displayed on the television 200, that is, in the state where at least a part of the hand object OBH is superimposed on the television 200 or its display unit, the wearable device 1 establishes a wireless communication connection with the television 200 and makes a transmission request for image data to the television 200.
- Upon receiving the request, the television 200 transmits image data corresponding to the video displayed on the television 200 to the wearable device 1.
- The wearable device 1 displays, on the display unit 2, a video SC8 similar to the video displayed on the television 200 based on the image data received from the television 200 (step S163).
- The wearable device 1 may recognize in advance, according to a setting made by the user before the rotating body motion is detected, that the transmission request destination of the image data is the television 200.
- When the user changes the shape of the hand object OBH in the state where the operation for displaying the video SC8 on the display unit 2 of the wearable device 1 has been completed (step S163) and rotates the forearm in the changed state (step S164), the display is switched to a program list SC9 of broadcasts that the television 200 can receive, as an image different from the video SC8 (step S165).
- A number of examples of functions executed by the wearable device 1 have been shown in the above embodiments.
- In these examples, the operation is performed either while visually recognizing the upper limb existing in the real space without displaying the hand object OH, as in the first example, or while visually recognizing the displayed hand object OH, as in the second example; however, the present invention is not limited to this. In any of the examples, either configuration is applicable: performing the operation while visually recognizing the upper limb existing in the real space without displaying the object OH, or performing the operation while visually recognizing the displayed object OH.
- For example, the wearable device 1 may reduce and display one of the two display images and enlarge and display the other based on the detection of the rotating body motion.
- In the above embodiments, configurations have been exemplified in which the wearable device 1 executes a predetermined process based on detecting, among body motions, a rotating body motion accompanied by rotation of the arm of the upper limb, or in which it discriminates between a first state, in which the upper limb included in the captured image captured by the imaging unit 3 (or an infrared imaging unit as the detection unit 5) is on the palm side, and a second state, in which the upper limb is on the back-of-hand side, and executes a predetermined process triggered by detecting a rotating body motion accompanied by a reversal from one of these states to the other.
- Although the upper limb has been illustrated as the right upper limb, the present invention is not limited thereto; the upper limb may be the left upper limb, or both the right and left upper limbs may be used.
- The wearable device 1 may also execute the predetermined processes exemplified in the above embodiments based on detecting, from the detection result of the detection unit 5, a specific body motion that involves both a motion in which a part of the upper limbs (for example, the right upper limb) moves away from the wearable device 1 and a motion in which another part of the upper limbs (for example, the left upper limb) approaches the wearable device 1.
- For example, when the user performs, as the specific body motion, a motion of pulling the left hand toward the user at the same time as pushing the right hand forward, the wearable device 1 may execute the various predetermined operations described above, as with the rotating body motion.
- In the above embodiments, the first process related to the display image and the second process including the data transfer process through communication with another electronic device have been exemplified as the predetermined processes that the wearable device 1 executes based on the rotating body motion; however, the predetermined process is not limited to these.
- For example, the wearable device 1 may execute, as the predetermined process based on the detection of the rotating body motion, kana/kanji conversion of input characters, Japanese/English translation, conversion to a prediction candidate predicted based on the input characters, or the like.
- In this case, the wearable device 1 may sequentially change the conversion candidates in the kana/kanji conversion based on the detected number of rotations of the rotating body motion. Similarly, the wearable device 1 may sequentially change, based on the detected number of rotations of the rotating body motion, the translation candidates in the Japanese/English translation or the prediction candidates predicted based on the input characters.
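Cycling through candidates by rotation count is a simple modulo lookup. A minimal Python sketch follows; the candidate list and the wrap-around behavior are assumptions for the example.

```python
def candidate_for(candidates, rotations):
    """Step through kana/kanji (or translation/prediction) candidates, one
    candidate per detected rotation of the rotating body motion."""
    if not candidates:
        return None
    return candidates[rotations % len(candidates)]

# e.g. candidate_for(['感じ', '漢字', '幹事'], rotations=2) -> '幹事'
```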
- In the above embodiments, the wearable device 1 has been shown as having an eyeglass shape, but the shape of the wearable device 1 is not limited to this.
- the wearable device 1 may have a helmet-type shape that covers substantially the upper half of the user's head.
- the wearable device 1 may have a mask type shape that covers almost the entire face of the user.
- Although the display unit 2 has been illustrated as having a pair of display parts 2a and 2b provided in front of the user's left and right eyes, the wearable device may instead have a single display part provided in front of one of the user's left and right eyes.
- Although the configuration in which the edge of the front part surrounds the entire circumference of the edge of the display area of the display unit 2 has been exemplified, the present invention is not limited to this; a configuration in which the edge of the front part surrounds only a part of the edge of the display area of the display unit 2 may be used.
- Although a configuration in which the imaging unit (or detection unit) detects a hand or finger as the user's upper limb has been shown, the hand or finger can be detected in the same manner even when a glove, mitten, or the like is worn.
- Although the embodiments have been described in terms of the wearable device 1, the present invention is not limited to this and may be configured as a control method or a control program including each of the components described above.
Description
1a Front part
1b Side part
1c Side part
2 Display unit
3 Imaging unit
4 Imaging unit
5 Detection unit
6 Operation unit
7 Control unit
8 Communication unit
9 Storage unit
Claims (22)
- A wearable device wearable on the head, comprising: a detection unit capable of detecting a user's upper limb present in a real space; and a control unit that executes a predetermined process based on having detected, from a detection result of the detection unit, a rotating body motion accompanied by rotation of an arm of the upper limb.
- The wearable device according to claim 1, further comprising a display unit that displays a display image in front of the user's eyes, wherein the control unit executes, as the predetermined process, a first process related to the display image.
- The wearable device according to claim 1 or 2, wherein the control unit executes the predetermined process based on having detected, as the rotating body motion, a body motion accompanied by one of a pronation motion and a supination motion of the arm, and terminates the predetermined process based on having detected, during execution of the predetermined process, a body motion accompanied by the other of the pronation motion and the supination motion.
- The wearable device according to claim 1 or 2, wherein the control unit executes the predetermined process based on having detected, as the rotating body motion, a body motion accompanied by one of a pronation motion and a supination motion of the arm, and executes a second process including control content paired with the predetermined process based on having detected, within a predetermined time after execution of the predetermined process, a body motion accompanied by the other of the pronation motion and the supination motion.
- The wearable device according to any one of claims 2 to 4, wherein, upon detecting the rotating body motion, the control unit executes the first process on the display image selected based on the position of the upper limb at a point in time before the detection of the rotating body motion.
- The wearable device according to any one of claims 2 to 5, wherein the display unit displays a plurality of the display images, and the control unit executes the first process upon detecting the rotating body motion in a state where the plurality of display images are designated.
- The wearable device according to claim 6, wherein the control unit regards the plurality of display images as having been designated based on the upper limb being at a predetermined position in the real space.
- The wearable device according to claim 6 or 7, wherein the control unit executes the first process upon detecting the rotating body motion in a state where a first display image of the plurality of display images is designated by a part of the upper limb and a second display image of the plurality of display images is designated by another part of the upper limb.
- The wearable device according to any one of claims 6 to 8, wherein, as the first process, the control unit changes the front-rear relationship of the plurality of display images.
- The wearable device according to any one of claims 6 to 8, wherein, as the first process, the control unit switches the display positions of the plurality of display images.
- The wearable device according to claim 8, wherein, upon detecting the rotating body motion, the control unit changes the relative position between the first display image and the second display image in accordance with a change in a component in a predetermined direction of the distance between the part and the other part accompanying the rotating body motion.
- The wearable device according to any one of claims 6 to 8, wherein, upon detecting the rotating body motion, the control unit detects a rotation angle in the rotating body motion and, as the first process, changes the relative positions of the plurality of display images in accordance with the rotation angle.
- The wearable device according to any one of claims 2 to 8, wherein, upon detecting the rotating body motion, the control unit detects a rotation angle in the rotating body motion and executes, as the first process, processing in accordance with the rotation angle.
- The wearable device according to claim 13, wherein, as the first process, the control unit displays at least one other image related to the display image and changes, in accordance with the rotation angle, the amount of information included in the other image, the size of the other image, or the number of the other images.
- The wearable device according to any one of claims 2 to 8, wherein, upon detecting the rotating body motion, the control unit determines whether the rotating body motion is a first rotating body motion that includes a movement of the position of the upper limb of a predetermined length or more or a second rotating body motion that does not include a movement of the position of the upper limb of a predetermined length or more, and makes the control content differ between the predetermined process based on the first rotating body motion and the predetermined process based on the second rotating body motion.
- The wearable device according to claim 2, wherein, as the first process, the control unit hides the display image.
- The wearable device according to claim 16, wherein the control unit determines whether another display device is present in front of the wearable device and, when another display device is present in front of the wearable device, hides the display image upon detecting the rotating body motion.
- The wearable device according to claim 1, further comprising a communication unit that communicates with another electronic device, wherein the control unit determines whether another display device is present in front of the wearable device and, when the other display device is present in front of the wearable device, executes, as the predetermined process, a second process including a data transfer process through communication with the other electronic device upon detecting the rotating body motion.
- A wearable device wearable by a user, comprising: an imaging unit; and a control unit that detects the user's upper limb from a captured image captured by the imaging unit, wherein the control unit executes a predetermined process triggered by having detected a rotating body motion accompanied by a reversal from one of a first state, in which the upper limb included in the captured image is on the palm side, and a second state, in which the upper limb included in the captured image is on the back-of-hand side, to the other.
- A wearable device wearable on the head, comprising: a detection unit capable of detecting a user's upper limbs present in a real space; and a control unit that executes a predetermined process based on having detected, from a detection result of the detection unit, a specific body motion involving both a motion in which a part of the upper limbs moves away from the wearable device and a motion in which another part of the upper limbs approaches the wearable device.
- A control method executed by a wearable device wearable on the head and comprising a detection unit capable of detecting a user's upper limb present in a real space and a control unit, wherein the control unit executes a predetermined process based on having detected, from a detection result of the detection unit, a rotating body motion accompanied by rotation of an arm of the upper limb.
- A control program that, in a wearable device wearable on the head and comprising a detection unit capable of detecting a user's upper limb present in a real space and a control unit, causes the control unit to execute a predetermined process based on having detected, from a detection result of the detection unit, a rotating body motion accompanied by rotation of an arm of the upper limb.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017530889A JP6510648B2 (ja) | 2015-07-29 | 2016-07-26 | Wearable device, control method, and control program
US15/747,754 US20180217680A1 (en) | 2015-07-29 | 2016-07-26 | Wearable device, control method, and control code |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015149242 | 2015-07-29 | ||
JP2015-149242 | 2015-07-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017018428A1 true WO2017018428A1 (ja) | 2017-02-02 |
Family
ID=57884709
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/071936 WO2017018428A1 (ja) | 2015-07-29 | 2016-07-26 | ウェアラブル装置、制御方法及び制御プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20180217680A1 (ja) |
JP (1) | JP6510648B2 (ja) |
WO (1) | WO2017018428A1 (ja) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10509469B2 (en) | 2016-04-21 | 2019-12-17 | Finch Technologies Ltd. | Devices for controlling computers based on motions and positions of hands |
US10705113B2 (en) | 2017-04-28 | 2020-07-07 | Finch Technologies Ltd. | Calibration of inertial measurement units attached to arms of a user to generate inputs for computer systems |
US10540006B2 (en) | 2017-05-16 | 2020-01-21 | Finch Technologies Ltd. | Tracking torso orientation to generate inputs for computer systems |
US10379613B2 (en) * | 2017-05-16 | 2019-08-13 | Finch Technologies Ltd. | Tracking arm movements to generate inputs for computer systems |
US10341648B1 (en) * | 2017-09-20 | 2019-07-02 | Amazon Technologies, Inc. | Automated detection of problem indicators in video of display output |
US10521011B2 (en) * | 2017-12-19 | 2019-12-31 | Finch Technologies Ltd. | Calibration of inertial measurement units attached to arms of a user and to a head mounted device |
US10509464B2 (en) | 2018-01-08 | 2019-12-17 | Finch Technologies Ltd. | Tracking torso leaning to generate inputs for computer systems |
US11016116B2 (en) | 2018-01-11 | 2021-05-25 | Finch Technologies Ltd. | Correction of accumulated errors in inertial measurement units attached to a user |
US10416755B1 (en) | 2018-06-01 | 2019-09-17 | Finch Technologies Ltd. | Motion predictions of overlapping kinematic chains of a skeleton model used to control a computer system |
US11474593B2 (en) | 2018-05-07 | 2022-10-18 | Finch Technologies Ltd. | Tracking user movements to control a skeleton model in a computer system |
US10782651B2 (en) * | 2018-06-03 | 2020-09-22 | Apple Inc. | Image capture to provide advanced features for configuration of a wearable device |
US11009941B2 (en) | 2018-07-25 | 2021-05-18 | Finch Technologies Ltd. | Calibration of measurement units in alignment with a skeleton model to control a computer system |
WO2020209624A1 (en) | 2019-04-11 | 2020-10-15 | Samsung Electronics Co., Ltd. | Head mounted display device and operating method thereof |
US10809797B1 (en) | 2019-08-07 | 2020-10-20 | Finch Technologies Ltd. | Calibration of multiple sensor modules related to an orientation of a user of the sensor modules |
US11531392B2 (en) | 2019-12-02 | 2022-12-20 | Finchxr Ltd. | Tracking upper arm movements using sensor modules attached to the hand and forearm |
US12032736B2 (en) * | 2022-02-23 | 2024-07-09 | International Business Machines Corporation | Gaze based text manipulation |
US20240070995A1 (en) * | 2022-08-31 | 2024-02-29 | Snap Inc. | Wrist rotation manipulation of virtual objects |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013505508A (ja) * | 2009-09-22 | 2013-02-14 | Pebblestech Ltd. | Remote control of computer devices
JP2014119295A (ja) * | 2012-12-14 | 2014-06-30 | Clarion Co Ltd | Control device and portable terminal
WO2014181380A1 (ja) * | 2013-05-09 | 2014-11-13 | Sony Computer Entertainment Inc. | Information processing device and application execution method
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009508274A (ja) * | 2005-09-13 | 2009-02-26 | スペースタイムスリーディー・インコーポレーテッド | 3次元グラフィカル・ユーザ・インターフェースを提供するシステム及び方法 |
US9696808B2 (en) * | 2006-07-13 | 2017-07-04 | Northrop Grumman Systems Corporation | Hand-gesture recognition method |
JP2012168932A (ja) * | 2011-02-10 | 2012-09-06 | Sony Computer Entertainment Inc | 入力装置、情報処理装置および入力値取得方法 |
US8854433B1 (en) * | 2012-02-03 | 2014-10-07 | Aquifi, Inc. | Method and system enabling natural user interface gestures with an electronic system |
US10019144B2 (en) * | 2013-02-15 | 2018-07-10 | Quick Eye Technologies Inc. | Organizer for data that is subject to multiple criteria |
- 2016-07-26 US US15/747,754 patent/US20180217680A1/en not_active Abandoned
- 2016-07-26 WO PCT/JP2016/071936 patent/WO2017018428A1/ja active Application Filing
- 2016-07-26 JP JP2017530889A patent/JP6510648B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
JPWO2017018428A1 (ja) | 2018-03-29 |
JP6510648B2 (ja) | 2019-05-08 |
US20180217680A1 (en) | 2018-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017018428A1 (ja) | Wearable device, control method, and control program | |
JP6400197B2 (ja) | Wearable device | |
US11262835B2 (en) | Human-body-gesture-based region and volume selection for HMD | |
JP6595597B2 (ja) | Wearable device, control method, and control program | |
US20200209961A1 (en) | Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device | |
KR101983725B1 (ko) | Electronic device and control method of electronic device | |
KR102458344B1 (ko) | Method and device for changing the focus of a camera | |
US9904360B2 (en) | Head tracking based gesture control techniques for head mounted displays | |
US11782514B2 (en) | Wearable device and control method thereof, gesture recognition method, and control system | |
TW201403380A (zh) | Gesture recognition system and glasses capable of recognizing gesture motions | |
US20220012922A1 (en) | Information processing apparatus, information processing method, and computer readable medium | |
KR20200040716A (ko) | Visibility improvement method using gaze tracking, storage medium, and electronic device | |
JP6483514B2 (ja) | Wearable device, control method, and control program | |
KR102039948B1 (ko) | Mobile terminal for rendering virtual human organs based on augmented reality or virtual reality, and system using the same | |
WO2016006070A1 (ja) | Portable information terminal device and head-mounted display cooperating therewith | |
JP6686319B2 (ja) | Image projection device and image display system | |
US8970483B2 (en) | Method and apparatus for determining input | |
JP6999822B2 (ja) | Terminal device and method for controlling terminal device | |
KR20180097031A (ko) | Augmented reality system including a portable terminal device and a projection device | |
TW201445366A (zh) | Gesture recognition system and glasses capable of recognizing gesture motions
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16830533 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2017530889 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 15747754 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 16830533 Country of ref document: EP Kind code of ref document: A1 |