WO2018079301A1 - Information processing apparatus, method, and program - Google Patents
Information processing apparatus, method, and program
- Publication number
- WO2018079301A1 (PCT/JP2017/037127)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- processing apparatus
- sensor
- trigger operation
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/28—Supervision thereof, e.g. detecting power-supply failure by out of limits supervision
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/3293—Power saving characterised by the action undertaken by switching to a less power-consuming processor, e.g. sub-CPU
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor; Earphones; Monophonic headphones
- H04R1/1041—Mechanical or electronic switches, or control elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1636—Sensing arrangement for detection of a tap gesture on the housing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor; Earphones; Monophonic headphones
- H04R1/1008—Earpieces of the supra-aural or circum-aural type
Definitions
- The present technology relates to improving the operability of human interfaces.
- As human interfaces for operation input, buttons and touch panels are currently the mainstream.
- Patent Document 1 describes control of an electronic device without buttons, disclosing that sensing by a touch device using an electrostatic switch starts only after a proximity sensor reacts.
- However, buttons and touch devices can be difficult to use, especially on small devices, and a simpler human interface is desired.
- In view of the above, an object of the present technology is to provide an information processing apparatus capable of diversifying operation inputs without imposing a burden on the user.
- One aspect of the present technology is an information processing apparatus including a control unit.
- The control unit detects a trigger operation according to the type of first sensor information.
- It also recognizes, based on second sensor information, a gesture operation performed by the user as an operation input.
- It then determines the operation input based on the combination of the detected trigger operation and the recognized gesture operation.
- With this configuration, the operation input is not limited to one per gesture operation; the same gesture can express a plurality of operations. It is therefore possible to diversify operation inputs without imposing a burden on the user.
- The control unit may recognize the gesture operation based on the second sensor information input within a predetermined time after detecting the trigger operation.
- When the first sensor information is input, the control unit may detect a single trigger operation if the length of time during which the first sensor information continues to be input is shorter than a predetermined threshold, and a continuous trigger operation if it is longer.
- The operation input may then be determined based on the combination of the detected single or continuous trigger operation and the recognized gesture operation.
- In this way, two variants, single and continuous, can be defined for one trigger operation, widening the variations of operation input.
- When a continuous trigger operation is detected, the information input by the operation input may include a value corresponding to the detected time length of the continuous trigger operation.
- The control unit may set the end point of the time length of the continuous trigger operation to the time point at which the input of the first sensor information ends.
- The control unit may set the start point of the time length of the continuous trigger operation to the time point at which the input of the first sensor information started, or alternatively to the time point at which the continuous trigger operation was detected.
- As a further alternative, the control unit may set the start point to the time point at which the gesture operation was recognized.
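The three start-point choices can be illustrated with a short sketch. The following is a minimal illustration, not the patented implementation; the function, the policy names, and the timestamp parameters are all hypothetical:

```python
# A sketch of extracting the continuous-trigger time length under the
# three start-point policies described above; the end point is always
# the moment the first sensor information stops. Names are hypothetical.
from enum import Enum

class StartPoint(Enum):
    SENSOR_INPUT_STARTED = 1         # input of first sensor information began
    CONTINUOUS_TRIGGER_DETECTED = 2  # continuity threshold was exceeded
    GESTURE_RECOGNIZED = 3           # gesture recognition completed

def trigger_duration(t_input_start: float,
                     t_trigger_detected: float,
                     t_gesture_recognized: float,
                     t_input_end: float,
                     policy: StartPoint) -> float:
    """Time length of the continuous trigger operation, in seconds."""
    start = {
        StartPoint.SENSOR_INPUT_STARTED: t_input_start,
        StartPoint.CONTINUOUS_TRIGGER_DETECTED: t_trigger_detected,
        StartPoint.GESTURE_RECOGNIZED: t_gesture_recognized,
    }[policy]
    return max(0.0, t_input_end - start)
```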
- The sensor that outputs the first sensor information may be installed in a housing configured to be attachable to the user's body.
- The sensor may be installed at a position where an operation by the user's hand can be detected.
- The position where the sensor is installed may be outside the user's field of view when the housing is worn.
- With such an information processing apparatus, it is possible to provide an operation input interface for a wearable device such as a head-mounted display or a wearable computer.
- The information processing apparatus may have a first power mode and a second power mode as its power consumption modes, the first power mode consuming less power than the second power mode.
- The control unit may be configured to switch the power consumption mode of the information processing apparatus to the second power mode upon detecting the trigger operation while the apparatus is in the first power mode.
- In this way, the trigger operation serves as a trigger for switching to an operation mode with higher power consumption, so the overall power consumption of the apparatus can be kept down.
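A minimal sketch of this mode switching, assuming a simple two-state machine (all names here are hypothetical, not the patent's implementation):

```python
# A sketch of power-mode switching triggered by trigger detection.
FIRST_POWER_MODE = "low_power"    # gesture sensor idle or power-saving
SECOND_POWER_MODE = "normal"      # gesture sensor active

class PowerManager:
    def __init__(self):
        self.mode = FIRST_POWER_MODE

    def on_trigger_detected(self):
        """Waking only on a trigger keeps average consumption low."""
        if self.mode == FIRST_POWER_MODE:
            self.mode = SECOND_POWER_MODE
            self.activate_gesture_sensor()

    def activate_gesture_sensor(self):
        pass  # start streaming second sensor information (e.g. motion data)
```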
- Another aspect of the present technology is an information processing method including the steps of: detecting a trigger operation corresponding to the type of first sensor information; recognizing, based on second sensor information, a gesture operation performed by a user as an operation input; and determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
- Yet another aspect of the present technology is a program that causes a computer to execute the steps of: detecting a trigger operation corresponding to the type of first sensor information; recognizing, based on second sensor information, a gesture operation performed by a user as an operation input; and determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
- FIG. 8 is a timing chart (part 1) for explaining the processing of the information processing apparatus according to the embodiment; FIG. 9 is a timing chart (part 2) for the same; FIG. 10 is a diagram showing an external configuration example of a modification of the embodiment.
- The information processing apparatus 1 includes a wearable device.
- The wearable device of the present embodiment is not particularly limited as long as it can secure space for mounting the sensor group described later; headphones and head-mounted displays are examples.
- The present technology can also be applied to a wristband-type wearable device, a clothing-type wearable device such as a jacket, and the like.
- In the following description, headphones are employed as the wearable device of this embodiment; an earphone type may be used instead.
- The information processing apparatus 1 includes a user interface that receives the user's operation input.
- There is no limitation on the purpose for which the user's operation input is used; it may, for example, operate a news reading application. The following description shows an example in which it operates a music player that reproduces music content.
- FIG. 1 is a diagram for explaining the outline of the present embodiment.
- The information processing apparatus 1 according to the present embodiment has a gesture recognition function for recognizing the user's movement. However, gesture recognition is performed only after the user performs a trigger operation and the information processing apparatus 1 detects it. The present embodiment provides a plurality of trigger operations; one way to realize detection of a plurality of trigger operations is to prepare one sensor corresponding to each trigger operation. The information processing apparatus 1 then interprets the recognized gesture according to the type of the detected trigger operation and the type of the recognized gesture.
- Here, to “interpret” means to decide to replace an operation input, expressed as a series of a trigger operation and a gesture operation, with a processing command for the information processing apparatus 1.
- FIG. 1 shows three example gesture operations: “nodding”, “turning the neck to the right and back”, and “turning the neck to the left and back”.
- The interpretation of each gesture operation changes according to the trigger operation performed beforehand.
- FIG. 1A illustrates a case where the first trigger operation is performed, and FIG. 1B illustrates a case where the second trigger operation is performed.
- If the first trigger operation comes first, the information processing apparatus 1 interprets the nodding gesture operation as “play/stop” of the music player (FIG. 1A). On the other hand, if the second trigger operation comes first, the apparatus interprets the same gesture operation as “music album selection” (FIG. 1B). Similar processing is performed for the gesture of turning to the right and the gesture of turning to the left.
- Proximity sensors are installed on the left and right sides of the headphones.
- Holding a hand over one of them is detected as the first trigger operation (FIG. 1A) or the second trigger operation (FIG. 1B), respectively.
- A third trigger operation, for example holding both hands over both proximity sensors, is also detected; for the third trigger operation as well, a set of operation inputs corresponding to each gesture operation is defined (FIG. 1C).
- Gestures such as nodding or turning to the right are simple gestures that anyone can learn.
- However, since these are everyday movements, interpreting them directly as operation inputs to the information processing apparatus 1 would produce inputs the user did not intend.
- Because gesture recognition is performed only after a preceding trigger operation is detected, such unintended input can be prevented.
- Moreover, the operation inputs that the same gesture can express are diversified according to the type of trigger operation. According to the present embodiment, therefore, operation inputs can be diversified without imposing a burden on the user.
- FIG. 2 is a diagram illustrating an external configuration example of the information processing apparatus 1 according to the present embodiment.
- FIG. 3 is a block diagram illustrating an internal configuration example.
- In terms of hardware configuration, the information processing apparatus 1 may include, for example, headphones 2 and a mobile terminal 3.
- The headphones 2 have a sensor group 10.
- Any form, such as an earphone type, a wristband type, or a jacket type, may be used as long as the device has a housing 20, attachable to the user, in which the sensor group 10 is installed.
- Such a device is called a wearable device.
- The term “housing” may suggest a box shape, but here it simply means that an exterior is provided; the shape is not limited.
- The headphones 2 and the mobile terminal 3 have a wireless communication unit 19 and a wireless communication unit 39, respectively, and can communicate with each other; there is no restriction on the specific communication mode, and wired communication may be used instead.
- Each wireless communication unit includes an antenna and a wireless communication circuit and communicates according to a wireless communication standard such as Bluetooth (registered trademark); detection information from the sensor group 10 is thereby passed to the mobile terminal 3.
- A smartphone can be used as the mobile terminal 3.
- The sensor group 10 shown in FIG. 2 includes a right proximity sensor 11R, a left proximity sensor 11L, a motion sensor 12, and a noise-cancelling microphone 13.
- A switch (not shown) connected so as to short-circuit the microphone may also be used as one of the sensors constituting the sensor group 10.
- The right proximity sensor 11R and the left proximity sensor 11L may use an infrared method or any other method.
- The right proximity sensor 11R and the left proximity sensor 11L continue to output detection signals while they are sensing.
- The right proximity sensor 11R and the left proximity sensor 11L are used to detect the user's trigger operation.
- The motion sensor 12 detects triaxial acceleration and triaxial angular velocity.
- The motion sensor 12 is used to recognize the user's gesture operation; it may also be used to recognize a trigger operation.
- The noise-cancelling microphone 13 collects ambient sound around the headphones 2 for noise cancelling, a technique that reduces noise by outputting a signal of opposite phase.
- An operation of tapping this microphone may be detected as a trigger operation.
- Hereinafter, sensor information necessary for detecting a trigger operation is referred to as first sensor information, and sensor information necessary for recognizing a gesture operation is referred to as second sensor information.
- Sensor information is a bundle of sensor signals transmitted by at least one sensor of the sensor group 10 in the course of sensing.
- The first sensor information and the second sensor information may overlap.
- In this embodiment, sensor information from at least one of the right proximity sensor 11R and the left proximity sensor 11L is the first sensor information, and sensor information from the motion sensor 12 is the second sensor information.
- As shown in FIG. 3, in addition to the configuration shown in FIG. 2, the mobile terminal 3 of the information processing apparatus 1 includes an information processing unit 31.
- The information processing unit 31 may be at least one of the arithmetic processing devices of the mobile terminal 3.
- For example, the information processing unit 31 may be configured as a central processing unit (CPU), a DSP (Digital Signal Processor), or an SoC (System on Chip), or may be designed as a combination of one or more such arithmetic processing devices.
- The information processing unit 31 executes a software program read into a memory such as the RAM 21 (22) and thereby functions as a trigger detection unit 32, a gesture recognition unit 33, and an operation input signal generation unit 34. The function of each unit is described below.
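How the three units could cooperate is sketched below; this is a hypothetical decomposition for illustration, with invented names, not the actual program executed by the information processing unit 31:

```python
# A sketch of the pipeline formed by the trigger detection unit 32,
# the gesture recognition unit 33, and the operation input signal
# generation unit 34; the callables and table are hypothetical.
class InformationProcessingUnit:
    def __init__(self, detect_trigger, recognize_gesture, lookup_table):
        self.detect_trigger = detect_trigger        # unit 32
        self.recognize_gesture = recognize_gesture  # unit 33
        self.lookup_table = lookup_table            # used by unit 34

    def process(self, first_sensor_info, second_sensor_info):
        trigger = self.detect_trigger(first_sensor_info)
        if trigger is None:
            return None  # no trigger: gesture recognition is not performed
        gesture = self.recognize_gesture(second_sensor_info)
        if gesture is None:
            return None  # no gesture within the window: no operation input
        return self.lookup_table.get((trigger, gesture))  # unit 34
```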
- The trigger detection unit 32 detects a trigger operation corresponding to the type of the input first sensor information. For example, when the first sensor information is a signal representing sensing by the right proximity sensor 11R, the trigger operation “the user raises the right hand and holds it over the right side of the headphones 2” is detected; when it is a signal representing sensing by the left proximity sensor 11L, the trigger operation “the user raises the left hand and holds it over the left side of the headphones 2” is detected.
- In this way, the trigger detection unit 32 detects the trigger operation corresponding to each piece of input digital data.
- The gesture recognition unit 33 recognizes, based on the second sensor information, a gesture operation performed by the user as an operation input.
- The gesture operations are defined in advance according to the nature of the wearable device and the types of available sensors.
- In the present embodiment, headphones are used as the wearable device and the motion sensor 12 is used for gesture recognition.
- Accordingly, the predetermined gesture operations are “nodding”, “swinging the neck to the right”, and “swinging the neck to the left”.
- The gesture recognition unit 33 performs gesture recognition based on the second sensor information input within a predetermined time after the trigger detection unit 32 detects the trigger operation.
- The second sensor information may be transmitted to the information processing unit 31 only after the trigger detection unit 32 detects the trigger operation.
- The motion sensor 12 may be kept inactive until the trigger detection unit 32 detects the trigger operation, or may be configured to operate in a power-saving mode until then.
- After the trigger operation is detected, the gesture recognition unit 33 keeps attempting to recognize a gesture operation until the timeout time elapses.
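The timeout-bounded recognition window might look like the following sketch; the polling approach, the 2-second timeout, and the function names are assumptions for illustration:

```python
# A sketch of gesture recognition limited to a fixed window after the
# trigger operation is detected. Values and names are hypothetical.
import time

GESTURE_TIMEOUT_S = 2.0

def wait_for_gesture(recognize, poll_interval_s=0.05):
    """Poll recognize() until a gesture is found or the window closes."""
    deadline = time.monotonic() + GESTURE_TIMEOUT_S
    while time.monotonic() < deadline:
        gesture = recognize()   # returns a gesture name or None
        if gesture is not None:
            return gesture
        time.sleep(poll_interval_s)
    return None                 # timed out: treated as no operation input
```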
- The operation input signal generation unit 34 interprets the operation input that the user intends to make through the gesture operation, from the combination of the trigger operation detected by the trigger detection unit 32 and the gesture operation recognized by the gesture recognition unit 33, as described with reference to FIG. 1.
- The operation input signal generation unit 34 interprets the operation input with reference to a lookup table (not shown).
- The operation input signal generation unit 34 generates the command obtained as a result of this interpretation as a signal, and the generated signal is output to the subsequent stage as an operation input signal.
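A lookup table of this kind could be as simple as the sketch below; the entries merely echo the FIG. 1 examples, and all names are hypothetical:

```python
# A sketch of interpreting a (trigger, gesture) combination via a
# lookup table; table contents are illustrative, not the patent's.
LOOKUP_TABLE = {
    ("hold_right_hand", "nod"): "play_stop",     # FIG. 1A example
    ("hold_left_hand", "nod"): "select_album",   # FIG. 1B example
}

def interpret(trigger: str, gesture: str):
    """Return the command for a combination, or None if unassigned."""
    return LOOKUP_TABLE.get((trigger, gesture))
```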
- The configuration example of this embodiment has been described above. Note that the internal configuration of the information processing apparatus 1 is not limited to the example illustrated in FIG. 3; some or all of the components included in the mobile terminal 3 may instead be included on the wearable device side. FIGS. 4 and 5 show the internal configuration in such cases.
- FIG. 4 shows a modification of the present embodiment in which trigger detection and gesture recognition are executed on the wearable device side.
- In this modification, the mobile terminal 3 has a CPU 38 and a RAM 22, and the CPU 38 executes a software program read into the RAM 22 to realize the operation input signal generation unit 34. Since the output values (sensor information) of the sensor group 10 need not be communicated wirelessly to the mobile terminal 3, battery savings may be achieved.
- FIG. 5 shows another modification, in which everything up to operation input signal generation is executed on the wearable device side.
- In this case, the mobile terminal 3 is not necessary as a component.
- FIGS. 6 and 7 are flowcharts showing the flow of processing of the information processing apparatus 1 according to this embodiment.
- FIGS. 8 and 9 are timing charts referred to in explaining that flow of processing.
- The information processing apparatus 1 waits until first sensor information is detected by the trigger detection unit 32 (ST100). In this state, the functions of the gesture recognition unit 33 and the motion sensor 12 may be limited in a first power consumption mode, such as a power-saving mode, that consumes less power.
- When first sensor information is input, the trigger detection unit 32 detects a trigger operation (ST100, Yes).
- The trigger detection unit 32 determines the type of trigger operation according to the type of the input first sensor information and stores the type in a register provided in the information processing unit 31 (ST101).
- The stored type of trigger operation is referred to later, in ST105.
- There are a plurality of types of first sensor information, such as sensing information from the right proximity sensor 11R, sensing information from the left proximity sensor 11L, and information transmitted by the microphone 13 when it is tapped.
- The trigger operation corresponding to each type is detected here. For example, a trigger operation of holding up the right hand is detected from sensing information of the right proximity sensor 11R; holding up the left hand, from sensing information of the left proximity sensor 11L; holding up both hands, from sensing information of both proximity sensors; and tapping, from the information transmitted by the microphone 13 when it is tapped.
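As a sketch, the mapping from sensor source to trigger type in ST101 could look like this (the flag arguments and trigger names are hypothetical):

```python
# A sketch of ST101: deriving the trigger type from which sensor(s)
# produced the first sensor information.
def trigger_type(right_proximity: bool, left_proximity: bool,
                 mic_tapped: bool):
    if right_proximity and left_proximity:
        return "hold_both_hands"
    if right_proximity:
        return "hold_right_hand"
    if left_proximity:
        return "hold_left_hand"
    if mic_tapped:
        return "tap_microphone"
    return None  # no trigger operation detected
```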
- Next, the operation input signal generation unit 34 determines whether the first sensor information continues to be detected (ST102). For example, when the right proximity sensor 11R transmitted sensing information in ST100, it is determined whether that sensing information is still being input. The operation input signal generation unit 34 judges whether sensing information continues to be input beyond a predetermined threshold, and thereby determines whether the trigger operation has continuity.
- Next, the information processing unit 31 activates the sensor used for detecting a gesture operation (ST103).
- In the present embodiment, the motion sensor 12 is activated.
- The operation mode of the information processing apparatus 1 after activation of this sensor may be referred to as the normal operation mode (second power consumption mode); the apparatus is thus switched from the power-saving mode to the normal operation mode.
- The information processing unit 31 may perform this switching.
- The gesture recognition unit 33 then waits for second sensor information until a predetermined timeout elapses (ST104). If the user's gesture operation cannot be recognized from the sensing information of the motion sensor 12 (an example of the second sensor information) within the predetermined time (ST104, No), the process ends.
- When the user's gesture operation can be recognized within the predetermined time (ST104, Yes), the operation input signal generation unit 34 interprets what operation input the user intends by that gesture, based on the combination of the trigger operation type stored in ST101 and the gesture operation recognized in ST104 (ST105).
- Depending on the combination, the same gesture operation may become, for example, a “zoom in” command or a “decrease volume” command (examples of operation inputs).
- It may likewise become, for example, a “return to previous page” command (another example of an operation input).
- The information processing unit 31 or the mobile terminal 3 executes the operation input interpreted in ST105 (ST106); depending on the content of the operation input, the headphones 2 or the information processing apparatus 1 as a whole may execute it.
- The gesture recognition unit 33 sets a predetermined time until timeout.
- In this embodiment, the starting point of the predetermined time is the end point of detection of the first sensor information, but it may instead be the start point of detection.
- If the timeout elapses, the gesture recognition unit 33 determines that there was no gesture operation and hence no user operation input. Therefore, even when a trigger operation was not intended by the user, erroneous input to the information processing apparatus 1 can be suppressed.
- Whether second sensor information is input is examined only within the predetermined time from detection of the trigger operation. Therefore, to perform an operation input, the user must perform the gesture operation after explicitly performing the trigger operation. This configuration, too, suppresses erroneous input to the information processing apparatus 1.
- When second sensor information is input, the gesture recognition unit 33 recognizes whether it represents a gesture operation.
- Execution of the operation input in ST106 continues until the trigger operation ends (ST107, Yes).
- For example, when the trigger operation is holding up the right hand so that the right proximity sensor 11R reacts, the operation input interpreted from the combination of the trigger operation type and the gesture operation continues to be executed while the right hand is held over the sensor.
- The continuity of the trigger operation will now be described with reference to FIG. 9.
- In the chart, the upper level of each sensor trace indicates that sensor information is being input, and the lower level indicates that it is not.
- The horizontal axis is time.
- The continuity of the trigger operation in ST102 of FIG. 6 is determined using the predetermined threshold shown in FIG. 9. The operation input signal generation unit 34 determines continuity based on whether the duration of input, measured from the time the input of the first sensor information rises, exceeds the predetermined threshold. When it does not, as shown in FIG. 9A, the trigger operation is determined to be a single trigger operation; when it does, as shown in FIGS. 9B and 9C, it is determined to be a continuous trigger operation.
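A sketch of this classification, with a hypothetical threshold value:

```python
# A sketch of the continuity determination of ST102 / FIG. 9: single
# if the input ends before the threshold, continuous otherwise.
CONTINUITY_THRESHOLD_S = 0.5  # hypothetical value

def classify_trigger(input_duration_s: float) -> str:
    """Classify a trigger operation by how long its sensor input lasted."""
    if input_duration_s < CONTINUITY_THRESHOLD_S:
        return "single_trigger"    # FIG. 9A
    return "continuous_trigger"    # FIGS. 9B and 9C
```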
- The first sensor information is treated as a different type depending on whether the trigger operation continues. For example, sensor information output by the same right proximity sensor 11R is handled as different types of sensor information for a single trigger operation and for a continuous trigger operation.
- So far, a method has been described for assigning different operation inputs to the same gesture input for each device that detects the trigger operation, or for each combination of such devices.
- A method of differentiating the operation input between a single trigger operation and a continuous trigger operation has also been described.
- A method of including a continuous value in an operation input by using a continuous trigger operation is described below.
- With a button, a continuous value can be specified by a long press; however, this becomes a problem when the human interface available for input is restricted, as with a wearable device.
- On a wearable device without buttons, input by gesture operation may be possible, but specifying a continuous value by gesture is difficult, because what feels natural varies from user to user.
- In the present embodiment, a value corresponding to the detected time length of the continuous trigger operation is therefore used as the continuous value.
- The time length of the continuous trigger operation is extracted as described below.
- The end point of the time length of the continuous trigger operation may be the time at which the continuous trigger operation ends, that is, the time at which the input of the first sensor information stops. For example, when a gesture operation such as nodding is performed while a hand is held over the proximity sensor to keep it reacting, the end point of the continuous trigger operation is the moment the user, having kept the hand held over the sensor, withdraws it.
- The start point of the time length of the continuous trigger operation is not limited. There are the three patterns described above: the time at which input of the first sensor information started, the time at which the trigger operation was determined to have continuity, and the time at which the gesture operation was recognized. Any of these patterns may be selected.
- The operation input signal generation unit 34 generates a command based on the interpreted operation input and passes it to a subsequent processing block.
- When a continuous value is extracted from a continuous trigger operation by the method described above, a command is generated that takes the continuous value, or a value corresponding to it, as an argument.
- Examples of values corresponding to the continuous value include a proportional value, a value quantized according to some criterion, and a value calculated by a predetermined function (for example, a sigmoid function).
- A desired method for deriving the value corresponding to the continuous value can be selected according to the target of the operation input.
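The three example mappings could be sketched as follows; the gains, bin widths, and midpoint are hypothetical parameters:

```python
# A sketch of deriving a command argument from the continuous-trigger
# duration: proportional, quantized, and sigmoid variants.
import math

def proportional(duration_s: float, gain: float = 10.0) -> float:
    return gain * duration_s                  # e.g. volume change steps

def quantized(duration_s: float, bin_s: float = 0.5) -> int:
    return int(duration_s // bin_s)           # coarse discrete levels

def sigmoid(duration_s: float, midpoint_s: float = 2.0) -> float:
    # saturating value in (0, 1), insensitive at the extremes
    return 1.0 / (1.0 + math.exp(-(duration_s - midpoint_s)))
```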
- One could use proximity sensor sensing as a simple alternative to button-press detection, but in that case inputs unintended by the user may increase.
- A touch device using an electrostatic switch requires a certain surface area, so the downsizing that is important for the human interface of a wearable device may not be achievable.
- Patent Document 1 uses so-called “touch gestures” for operation input, but there is a limit to the number of touch gestures a user can memorize, so it is difficult to increase the variations of operation input.
- In the present embodiment, by contrast, the operation input is not limited to one per gesture operation; a plurality of operations can be expressed. That is, according to the present embodiment, operation inputs can be diversified without imposing a burden on the user.
- A trigger may sometimes fire unintentionally; even in such a case, input unintended by the user can be prevented, and no erroneous operation input is performed.
- When the first sensor information is input, a single trigger operation is detected if the length of time during which the first sensor information continues to be input is shorter than a predetermined threshold, and a continuous trigger operation is detected if it is longer. The operation input is then interpreted from the combination of the detected single or continuous trigger operation and the recognized gesture operation. Two variants, single and continuous, can thus be defined for one trigger operation, widening the variations of operation input.
- Furthermore, a value corresponding to the detected time length of the continuous trigger operation is included in the information input by the operation input. Gestures that differ from person to person are thus absorbed and recognized robustly, while simple actions such as holding a hand over the proximity sensor and releasing it to stop the reaction make operation input using continuous values possible.
- Although the headphones 2 are shown as the wearable device in the above embodiment, the present technology is not limited to this.
- Other examples of wearable devices include headsets and neckband-type terminals worn on the shoulders.
- The above-described embodiment can be implemented with modifications for such devices.
- FIG. 10 is a diagram illustrating an external configuration example of the information processing apparatus 1 according to a modified embodiment in which the present technology is applied to a head-mounted display.
- In this modification, a glasses-type head-mounted display 2a is employed as the housing 20 configured to be worn on the user's body.
- The glasses-type head-mounted display 2a includes an electrostatic touch device 14.
- The installation position of the touch device 14 is not limited, but may be any position where an operation by the user's hand can be detected while the glasses-type head-mounted display 2a is worn.
- In this example, the touch device 14 is arranged near the temple of the user wearing the glasses-type head-mounted display 2a.
- The touch device 14 provides a function similar to that provided by the right proximity sensor 11R and the left proximity sensor 11L in the configuration example illustrated in FIG. 2. That is, the touch device 14 generates, by sensing, the first sensor information necessary for detecting the trigger operation and inputs it to the information processing apparatus 1.
- The touch device 14 may be of a type that can detect the user's finger contact not only at a single point but also at multiple points.
- A multipoint type is a touch device 14 that can sense one or more touch points simultaneously. In this case, for example, touching with one finger and touching with two fingers can constitute different trigger operations.
- The trigger detection unit 32 may be configured to detect different types of trigger operations according to the number of contact points sensed by the touch device 14.
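For example, the distinction by contact-point count could be sketched as follows (the mapping is hypothetical):

```python
# A sketch of detecting different trigger operations depending on the
# number of contact points sensed by the touch device 14.
def detect_touch_trigger(contact_points: int):
    triggers = {1: "one_finger_trigger", 2: "two_finger_trigger"}
    return triggers.get(contact_points)  # None for unsupported counts
```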
- The location of the touch device 14 is not limited to the vicinity of the temple; it may be anywhere the user's hand can physically reach, so that an operation by the user's hand can be detected. Furthermore, the touch device 14 may be disposed outside the field of view of the user wearing the housing 20. Even when the touch device 14 is placed where the wearing user cannot see it directly, the present modification allows the trigger operation and the gesture operation to be performed without relying on vision, so operation input to the information processing apparatus 1 remains possible.
- The glasses-type head-mounted display 2a may have either a transmissive or a non-transmissive display.
- (1) An information processing apparatus including a control unit that detects a trigger operation corresponding to the type of first sensor information, recognizes, based on second sensor information, a gesture operation performed by the user as an operation input, and determines the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
- (2) The information processing apparatus according to (1) above, in which the control unit recognizes the gesture operation based on the second sensor information input within a predetermined time after detecting the trigger operation.
- (3) The information processing apparatus according to (1) or (2) above, in which the control unit detects, when the first sensor information is input, a single trigger operation if the length of time during which the first sensor information continues to be input is shorter than a predetermined threshold and a continuous trigger operation if it is longer, and determines the operation input based on a combination of the detected single trigger operation or continuous trigger operation and the recognized gesture operation.
- (4) The information processing apparatus according to (3) above, in which, when the continuous trigger operation is detected, the control unit includes, in the information input by the operation input, a value corresponding to the detected time length of the continuous trigger operation.
- (5) The information processing apparatus according to (4) above, in which the control unit sets the end point of the time length of the continuous trigger operation to the time point at which the input of the first sensor information ends.
- (6) The information processing apparatus according to (4) or (5) above, in which the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the input of the first sensor information started.
- (7) The information processing apparatus according to (4) or (5) above, in which the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the continuous trigger operation was detected.
- (8) The information processing apparatus according to (4) or (5) above, in which the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the gesture operation was recognized.
- (9) The information processing apparatus according to any one of (1) to (8) above, in which the sensor that outputs the first sensor information is installed in a housing configured to be wearable on the user's body, and the sensor is installed at a position where an operation by the user's hand can be detected.
- (10) The information processing apparatus according to (9) above, in which the position where the sensor is installed is outside the user's field of view when the housing is worn.
- (11) The information processing apparatus according to (9) or (10) above, in which the first sensor information is sensing information of a touch sensor capable of sensing one or more contact points, and the control unit detects the trigger operation according to the number of contact points sensed by the touch sensor.
- (12) The information processing apparatus according to any one of (1) to (11) above, in which the information processing apparatus has a first power mode and a second power mode as its power consumption modes, the control unit switches the power consumption mode to the second power mode upon detecting the trigger operation while the apparatus is in the first power mode, and the first power mode consumes less power than the second power mode.
- (13) An information processing method including: detecting a trigger operation corresponding to the type of first sensor information; recognizing, based on second sensor information, a gesture operation performed by a user as an operation input; and determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
- (14) A program that causes a computer to execute the steps of: detecting a trigger operation corresponding to the type of first sensor information; recognizing, based on second sensor information, a gesture operation performed by a user as an operation input; and determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
The control unit detects a trigger operation according to the type of first sensor information.
It also recognizes, based on second sensor information, a gesture operation performed by the user as an operation input.
It then determines the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
Alternatively, the control unit may set the start point of the time length of the continuous trigger operation to the time point at which the continuous trigger operation was detected.
Alternatively, the control unit may set the start point of the time length of the continuous trigger operation to the time point at which the gesture operation was recognized.
In that case, the sensor may be installed at a position where an operation by the user's hand can be detected.
In that case, the position where the sensor is installed may be a position outside the user's field of view when the housing is worn.
In that case, the control unit may be configured to switch the power consumption mode of the information processing apparatus to the second power mode upon detecting the trigger operation while the information processing apparatus is in the first power mode.
Detecting a trigger operation according to the type of first sensor information.
Recognizing, based on second sensor information, a gesture operation performed by the user as an operation input.
Determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
Detecting a trigger operation according to the type of first sensor information.
Recognizing, based on second sensor information, a gesture operation performed by the user as an operation input.
Determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
Note that the above effects are not necessarily limiting; together with or in place of them, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved.
1. Overview of an information processing apparatus according to an embodiment of the present technology
2. Configuration
2-1. External configuration
2-2. Internal configuration
3. Operation
4. Summary
5. Modifications
6. Supplementary notes
The information processing apparatus 1 according to the present embodiment includes a wearable device. The wearable device of the present embodiment is not particularly limited as long as it can secure space for mounting the sensor group described later; headphones and head-mounted displays are examples. The technology is also applicable to wristband-type wearable devices, clothing-type wearable devices such as jackets, and the like. In the following description, headphones are employed as the wearable device of the present embodiment; an earphone type may also be used.
FIG. 2 is a diagram showing an external configuration example of the information processing apparatus 1 according to the present embodiment, and FIG. 3 is a block diagram showing an internal configuration example.
As shown in FIG. 3, the information processing apparatus 1 according to the present embodiment has, in addition to the configuration shown in FIG. 2, a configuration in which the mobile terminal 3 incorporates an information processing unit 31. The information processing unit 31 may be at least one of the arithmetic processing devices of the mobile terminal 3; for example, it may be configured as a central processing unit (CPU), a DSP (Digital Signal Processor), or an SoC (System on Chip). It may also be designed so that one or more arithmetic processing devices are combined to constitute the information processing unit 31.
FIGS. 6 and 7 are flowcharts showing the flow of processing of the information processing apparatus 1 according to the present embodiment. FIGS. 8 and 9 are timing charts referred to in explaining the flow of processing of the information processing apparatus 1 according to the present embodiment. The description below refers to these drawings.
The embodiment described above focuses on enabling complex operation input without burdening the user. Because lightness and miniaturization are pursued in wearable devices, the human interface that can be mounted may be limited to something like a button; however, complex operations cannot be performed with a button alone.
Furthermore, the present embodiment can be modified and implemented as described below.
Part of the technical ideas disclosed in this specification can be described as in (1) to (14) below.
(1)
An information processing apparatus including:
a control unit that detects a trigger operation according to a type of first sensor information, recognizes, based on second sensor information, a gesture operation performed by a user as an operation input, and determines the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
(2)
The information processing apparatus according to (1) above, in which
the control unit recognizes the gesture operation based on the second sensor information input within a predetermined time after detection of the trigger operation.
(3)
The information processing apparatus according to (1) or (2) above, in which
the control unit
detects, when the first sensor information is input, a single trigger operation if the length of time during which the first sensor information continues to be input is shorter than a predetermined threshold, and a continuous trigger operation if it is longer, and
determines the operation input by a combination of the detected single trigger operation or continuous trigger operation and the recognized gesture operation.
(4)
The information processing apparatus according to (3) above, in which,
when the continuous trigger operation is detected, the control unit includes, in the information input by the operation input, a value corresponding to the detected time length of the continuous trigger operation.
(5)
The information processing apparatus according to (4) above, in which
the control unit sets the end point of the time length of the continuous trigger operation to the time point at which the input of the first sensor information ends.
(6)
The information processing apparatus according to (4) or (5) above, in which
the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the input of the first sensor information started.
(7)
The information processing apparatus according to (4) or (5) above, in which
the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the continuous trigger operation was detected.
(8)
The information processing apparatus according to (4) or (5) above, in which
the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the gesture operation was recognized.
(9)
The information processing apparatus according to any one of (1) to (8) above, in which
the sensor that outputs the first sensor information is installed in a housing configured to be wearable on the user's body, and
the sensor is installed at a position where an operation by the user's hand can be detected.
(10)
The information processing apparatus according to (9) above, in which
the position where the sensor is installed is a position outside the user's field of view when the housing is worn.
(11)
The information processing apparatus according to (9) or (10) above, in which
the first sensor information is sensing information of a touch sensor capable of sensing one or more contact points, and
the control unit detects the trigger operation according to the number of contact points sensed by the touch sensor.
(12)
The information processing apparatus according to any one of (1) to (11) above, in which
the information processing apparatus has a first power mode and a second power mode as power consumption modes of the information processing apparatus,
the control unit switches the power consumption mode of the information processing apparatus to the second power mode upon detecting the trigger operation while the information processing apparatus is in the first power mode, and
the first power mode consumes less power than the second power mode.
(13)
An information processing method including:
detecting a trigger operation according to a type of first sensor information;
recognizing, based on second sensor information, a gesture operation performed by a user as an operation input; and
determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
(14)
A program that causes a computer to execute the steps of:
detecting a trigger operation according to a type of first sensor information;
recognizing, based on second sensor information, a gesture operation performed by a user as an operation input; and
determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
2…Headphones
2a…Glasses-type head-mounted display
3…Mobile terminal
11R…Right proximity sensor
11L…Left proximity sensor
12…Motion sensor
13…Microphone
14…Touch device
19…Wireless communication unit
20…Housing
21, 22…RAM
31…Information processing unit (SoC/CPU)
32…Trigger detection unit
33…Gesture recognition unit
34…Operation input signal generation unit
38…CPU
39…Wireless communication unit
Claims (14)
- 1. An information processing apparatus including: a control unit that detects a trigger operation according to a type of first sensor information, recognizes, based on second sensor information, a gesture operation performed by a user as an operation input, and determines the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
- 2. The information processing apparatus according to claim 1, wherein the control unit recognizes the gesture operation based on the second sensor information input within a predetermined time after detection of the trigger operation.
- 3. The information processing apparatus according to claim 1, wherein the control unit detects, when the first sensor information is input, a single trigger operation if the length of time during which the first sensor information continues to be input is shorter than a predetermined threshold, and a continuous trigger operation if it is longer, and determines the operation input by a combination of the detected single trigger operation or continuous trigger operation and the recognized gesture operation.
- 4. The information processing apparatus according to claim 3, wherein, when the continuous trigger operation is detected, the control unit includes, in the information input by the operation input, a value corresponding to the detected time length of the continuous trigger operation.
- 5. The information processing apparatus according to claim 4, wherein the control unit sets the end point of the time length of the continuous trigger operation to the time point at which the input of the first sensor information ends.
- 6. The information processing apparatus according to claim 4, wherein the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the input of the first sensor information started.
- 7. The information processing apparatus according to claim 4, wherein the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the continuous trigger operation was detected.
- 8. The information processing apparatus according to claim 4, wherein the control unit sets the start point of the time length of the continuous trigger operation to the time point at which the gesture operation was recognized.
- 9. The information processing apparatus according to claim 1, wherein the sensor that outputs the first sensor information is installed in a housing configured to be wearable on the user's body, and the sensor is installed at a position where an operation by the user's hand can be detected.
- 10. The information processing apparatus according to claim 9, wherein the position where the sensor is installed is a position outside the user's field of view when the housing is worn.
- 11. The information processing apparatus according to claim 9, wherein the first sensor information is sensing information of a touch sensor capable of sensing one or more contact points, and the control unit detects the trigger operation according to the number of contact points sensed by the touch sensor.
- 12. The information processing apparatus according to claim 1, wherein the information processing apparatus has a first power mode and a second power mode as power consumption modes of the information processing apparatus, the control unit switches the power consumption mode of the information processing apparatus to the second power mode upon detecting the trigger operation while the information processing apparatus is in the first power mode, and the first power mode consumes less power than the second power mode.
- 13. An information processing method including: detecting a trigger operation according to a type of first sensor information; recognizing, based on second sensor information, a gesture operation performed by a user as an operation input; and determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
- 14. A program that causes a computer to execute the steps of: detecting a trigger operation according to a type of first sensor information; recognizing, based on second sensor information, a gesture operation performed by a user as an operation input; and determining the operation input based on a combination of the detected trigger operation and the recognized gesture operation.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112017005377.3T DE112017005377T5 (de) | 2016-10-25 | 2017-10-13 | Information processing apparatus, method, and program |
KR1020197010728A KR20190068543A (ko) | 2016-10-25 | 2017-10-13 | Information processing apparatus, method, and program |
JP2018547559A JP7135859B2 (ja) | 2016-10-25 | 2017-10-13 | Information processing apparatus, method, and program |
US16/333,037 US10712831B2 (en) | 2016-10-25 | 2017-10-13 | Information processing apparatus, method, and program |
CN201780064524.9A CN109891364B (zh) | 2016-10-25 | 2017-10-13 | Information processing apparatus, method, and program |
US16/924,597 US20200341557A1 (en) | 2016-10-25 | 2020-07-09 | Information processing apparatus, method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-208919 | 2016-10-25 | ||
JP2016208919 | 2016-10-25 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/333,037 A-371-Of-International US10712831B2 (en) | 2016-10-25 | 2017-10-13 | Information processing apparatus, method, and program |
US16/924,597 Continuation US20200341557A1 (en) | 2016-10-25 | 2020-07-09 | Information processing apparatus, method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018079301A1 true WO2018079301A1 (ja) | 2018-05-03 |
Family
ID=62024813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- PCT/JP2017/037127 WO2018079301A1 (ja) | Information processing apparatus, method, and program | 2016-10-25 | 2017-10-13 |
Country Status (6)
Country | Link |
---|---|
US (2) | US10712831B2 (ja) |
JP (1) | JP7135859B2 (ja) |
KR (1) | KR20190068543A (ja) |
CN (1) | CN109891364B (ja) |
DE (1) | DE112017005377T5 (ja) |
WO (1) | WO2018079301A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- KR20200141699 (ko) | 2019-06-11 | 2020-12-21 | 주식회사 신영 | Reinforcing member |
- CN115484524A (zh) * | 2021-06-16 | 2022-12-16 | 开酷科技股份有限公司 | Earphone device with gesture recognition function |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2011227638A (ja) * | 2010-04-19 | 2011-11-10 | Panasonic Corp | Handwriting input device, handwriting input method, and handwriting input program |
- WO2015060856A1 (en) * | 2013-10-24 | 2015-04-30 | Bodhi Technology Ventures Llc | Wristband device input using wrist movement |
- JP2016149587A (ja) * | 2015-02-10 | 2016-08-18 | Seiko Epson Corporation | Head-mounted display, and control method and control program therefor |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9019201B2 (en) * | 2010-01-08 | 2015-04-28 | Microsoft Technology Licensing, Llc | Evolving universal gesture sets |
US8418993B2 (en) * | 2010-02-02 | 2013-04-16 | Chung-Chia Chen | System and method of touch free automatic faucet |
- JP5379036B2 (ja) * | 2010-02-09 | 2013-12-25 | Sony Computer Entertainment Inc. | Information processing device, control method therefor, program, and information storage medium |
- JP2011170856A (ja) * | 2010-02-22 | 2011-09-01 | Ailive Inc | System and method for motion recognition using multiple detection streams |
- DE102011090162B4 (de) * | 2011-12-30 | 2022-05-05 | Robert Bosch Gmbh | Alarm device for a pilot headset |
EP2821888B1 (en) * | 2013-07-01 | 2019-06-12 | BlackBerry Limited | Gesture detection using ambient light sensors |
- JP2016018432A (ja) | 2014-07-09 | 2016-02-01 | Rohm Co., Ltd. | User interface device |
- JP2016139174A (ja) * | 2015-01-26 | 2016-08-04 | Seiko Epson Corporation | Head-mounted display, head-mounted display control method, and control program |
- JP5981591B1 (ja) | 2015-03-17 | 2016-08-31 | Colopl Inc. | Computer program and computer system for controlling object operations in an immersive virtual space |
-
2017
- 2017-10-13 WO PCT/JP2017/037127 patent/WO2018079301A1/ja active Application Filing
- 2017-10-13 KR KR1020197010728A patent/KR20190068543A/ko active IP Right Grant
- 2017-10-13 US US16/333,037 patent/US10712831B2/en active Active
- 2017-10-13 JP JP2018547559A patent/JP7135859B2/ja active Active
- 2017-10-13 CN CN201780064524.9A patent/CN109891364B/zh active Active
- 2017-10-13 DE DE112017005377.3T patent/DE112017005377T5/de active Pending
-
2020
- 2020-07-09 US US16/924,597 patent/US20200341557A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2011227638A (ja) * | 2010-04-19 | 2011-11-10 | Panasonic Corp | Handwriting input device, handwriting input method, and handwriting input program |
WO2015060856A1 (en) * | 2013-10-24 | 2015-04-30 | Bodhi Technology Ventures Llc | Wristband device input using wrist movement |
- JP2016149587A (ja) * | 2015-02-10 | 2016-08-18 | Seiko Epson Corporation | Head-mounted display, and control method and control program therefor |
Also Published As
Publication number | Publication date |
---|---|
US20190204933A1 (en) | 2019-07-04 |
US20200341557A1 (en) | 2020-10-29 |
DE112017005377T5 (de) | 2019-08-01 |
CN109891364A (zh) | 2019-06-14 |
JP7135859B2 (ja) | 2022-09-13 |
JPWO2018079301A1 (ja) | 2019-09-12 |
CN109891364B (zh) | 2022-05-03 |
KR20190068543A (ko) | 2019-06-18 |
US10712831B2 (en) | 2020-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3089018B1 (en) | Method, apparatus, and device for information processing | |
- CN110164440B (zh) | Voice interaction wake-up electronic device, method, and medium based on mouth-covering action recognition | |
US20160349845A1 (en) | Gesture Detection Haptics and Virtual Tools | |
- JP2021082333A (ja) | Terminal device, terminal device control method, and program | |
US20160299570A1 (en) | Wristband device input using wrist movement | |
- CN108710615A (zh) | Translation method and related device | |
EP3113014B1 (en) | Mobile terminal and method for controlling the same | |
- TW201638728A (zh) | Computing device and method for processing movement-related data | |
- CN109067965A (zh) | Translation method, translation device, wearable device, and storage medium | |
- WO2018079301A1 (ja) | Information processing apparatus, method, and program | |
- WO2016049842A1 (zh) | Hybrid interaction method for a portable or wearable smart device | |
US20230325002A1 (en) | Techniques for neuromuscular-signal-based detection of in-air hand gestures for text production and modification, and systems, wearable devices, and methods for using these techniques | |
- WO2012159323A1 (zh) | Method for simulating left and right button input with a capacitive-screen stylus, capacitive-screen stylus, and terminal | |
EP3139248B1 (en) | Method for gesture based human-machine interaction, portable electronic device and gesture based human-machine interface system | |
- KR20160039961A (ko) | Motion interface device for recognizing user movement | |
- CN111176422B (zh) | Smart wearable device, operating method thereof, and computer-readable storage medium | |
- WO2023275957A1 (ja) | Information processing system, information processing terminal, and recall operation recognition method | |
- CN107329574A (zh) | Input method and system for an electronic device | |
- JP2010250708A (ja) | Glove-type mobile terminal | |
EP4080329A1 (en) | Wearable control system and method to control an ear-worn device | |
- TWM622556U (zh) | Smart watch operating system based on a touch ring | |
- KR101805111B1 (ko) | Grip-type input interface device and method | |
AU2016100962B4 (en) | Wristband device input using wrist movement | |
- CN118012325A (zh) | Gesture recognition device | |
TWI352930B (ja) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17866108 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2018547559 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20197010728 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17866108 Country of ref document: EP Kind code of ref document: A1 |