WO2022019085A1 - Information processing device and information processing method - Google Patents


Info

Publication number
WO2022019085A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
gesture
feedback
information processing
processing apparatus
Prior art date
Application number
PCT/JP2021/025072
Other languages
French (fr)
Japanese (ja)
Inventor
Junichi Shimizu (清水 惇一)
Kei Takahashi (高橋 慧)
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2022019085A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/10 Earpieces; Attachments therefor; Earphones; Monophonic headphones

Definitions

  • This disclosure relates to an information processing device and an information processing method.
  • devices that can be operated by gestures do not necessarily have high usability.
  • the user may not know whether the operation is correctly detected by the device, and the user may take time to operate the device.
  • the information processing apparatus of one form according to the present disclosure includes an acquisition unit that acquires information regarding the input speed of a user's gesture performed in the detection area of a detection unit that detects an object, and an output control unit that controls output regarding feedback to the user based on the information regarding the input speed.
  • a plurality of components having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral.
  • a plurality of elements having substantially the same functional configuration are distinguished as needed, such as output devices 20A and 20B.
  • output devices 20A and 20B are simply referred to as output devices 20.
  • a detection unit (for example, a proximity sensor)
  • a detection result (for example, a sensor value)
  • a device capable of gesture detection may be simply referred to as an information processing device.
  • FIGS. 1 and 2 are diagrams showing how the user makes a gesture.
  • FIG. 1 shows headphones 10A as an example of the information processing apparatus of the present embodiment.
  • FIG. 2 shows an earphone 10B as an example of the information processing apparatus of the present embodiment.
  • the device operation by the gesture is performed by the user performing a predetermined gesture in the space near the information processing device.
  • the headphones 10A and the earphones 10B are attached to the user's ears, and the user U makes a gesture for operating the device in the space near his/her ears.
  • the user U makes a gesture of moving the hand H from the front to the back
  • the user U makes a gesture of moving the hand H from the back to the front.
  • the motion of moving the hand includes the motion of moving the finger.
  • the description of "hand” appearing in the following explanation can be replaced with “finger” as appropriate.
  • the XYZ coordinate system may be used for the explanation in order to facilitate understanding.
  • the X-axis direction, the Y-axis direction, and the Z-axis direction are all directions determined with reference to the user.
  • the X-axis plus direction is the front direction of the user U
  • the X-axis minus direction is the back direction of the user U.
  • the Y-axis plus direction is the left-hand side direction of the user U
  • the Y-axis minus direction is the right-hand side direction of the user U.
  • the Z-axis plus direction is the upward direction
  • the Z-axis minus direction is the downward direction.
  • the user U is assumed to be in a standing or sitting position, the X-axis and the Y-axis are in the horizontal direction, and the Z-axis is in the vertical direction (gravity direction).
  • the X-axis direction, Y-axis direction, and Z-axis direction appearing in the following description may be read as appropriate according to the posture of the user. For example, when the user is lying on his back, the X-axis direction is the vertical direction, and the Y-axis and Z-axis directions are the horizontal directions.
  • the gesture in which the user U moves the hand H in the horizontal direction (X-axis direction) may be referred to as a left/right swipe.
  • the gesture in which the user U moves the hand H in the vertical direction (Z-axis direction) may be referred to as an up/down swipe.
  • the information processing device detects the user's gesture based on the detection result of the object in the detection unit.
  • the information processing apparatus executes a function corresponding to the gesture (for example, Next/Prev or Vol+/Vol-).
  • While device operation by gesture has the advantage that the user can easily operate the device, there is a problem that it is difficult for the user to confirm the success or failure of the operation by gesture.
  • the information processing device is a device worn in the user's ear such as headphones 10A and earphones 10B.
  • the operation on the information processing apparatus is performed in a place that the user cannot see, so it is difficult for the user to know whether his/her own hand is located within the detectable range of the gesture.
  • since the gesture is performed in the air without touching the information processing device, it is also difficult for the user to know at which position in the air the gesture is being performed. In that case, it is difficult for the user to grasp whether the gesture being executed is being executed as intended.
  • likewise, if the input speed is inappropriate, the information processing device cannot correctly detect the user's gesture. In this case as well, it is difficult for the user to grasp whether his/her operation is too fast or too slow, and therefore whether the gesture being executed is being executed as intended.
  • the gesture may not be detected by the information processing device.
  • a gesture different from the one intended by the user may be detected by the information processing apparatus, and a function different from the one intended by the user may be executed. In this case, the user notices that the gesture was not correctly detected only some time after performing it, or only when a function other than the intended one is executed.
  • FIG. 3 is a diagram for explaining the detectable range of the gesture.
  • the detection region OR of the detection unit is formed in a certain range around the information processing apparatus 10.
  • the information processing device 10 may be headphones 10A or earphones 10B.
  • a part of the detection area OR is the gesture detectable range AR.
  • the detectable range AR is a quadrangular range within the detection area OR.
  • the shape of the detectable range AR is not limited to a quadrangle when viewed from the Y-axis direction, and may be, for example, a circle or an ellipse.
  • the detectable range AR is a part of the detection area OR, but the detectable range AR may be the entire range of the detection area OR.
  • the detectable range AR may be variable. For example, at least one of the position, size, and shape of the detectable range AR may be changed based on the information from the sensor included in the information processing apparatus 10.
  • the information processing device 10 acquires information indicating the position of the hand of the user U performing the gesture. Then, the information processing apparatus 10 constantly provides feedback while the hand of the user U is within the detectable range AR. At this time, the feedback may be the sound output from the earphones or headphones. The information processing apparatus 10 does not have to provide feedback when the hand of the user U is not within the detectable range AR. As a result, the user U can know whether or not his/her hand is within the detectable range AR.
  • the information processing apparatus 10 may acquire information regarding the input speed of the user's gesture performed in the detection area OR. Then, the information processing apparatus 10 performs output control regarding feedback to the user U based on the information regarding the input speed. For example, the information processing apparatus 10 acquires information on the movement speed of the hand of the user U, and performs output control regarding feedback based on the information on the movement speed.
  • FIGS. 4A and 4B are diagrams for explaining changes in feedback based on information on the moving speed of the hand of the user U.
  • in FIG. 4A, the hand H of the user U is moving at a moving speed V11
  • in FIG. 4B, the hand H of the user U is moving at a moving speed V12 faster than the moving speed V11.
  • the information processing apparatus 10 is configured to provide feedback at all times and in real time.
  • real-time feedback means feedback without delay, or feedback whose delay falls within a certain allowable range.
  • the information processing apparatus 10 changes the feedback to the user U according to the moving speed of the hand H of the user U.
  • the information processing apparatus 10 changes the feedback depending on whether or not the moving speed of the hand H of the user U exceeds a predetermined threshold value (hereinafter referred to as the threshold value Vth).
  • the threshold value Vth is larger than the moving speed V11 shown in FIG. 4A and smaller than the moving speed V12 shown in FIG. 4B.
  • when the moving speed of the hand H of the user U does not exceed the threshold value Vth, the information processing apparatus 10 provides the first feedback.
  • the first feedback may be feedback informing the user that the gesture is being input normally. Further, the first feedback may be zero feedback.
  • zero feedback means feedback without output; that is, the information processing apparatus 10 outputs nothing as feedback.
  • not outputting anything can also be regarded as a kind of feedback if, for example, it informs the user that the gesture is or is not being input normally.
  • when the moving speed of the hand H of the user U exceeds the threshold value Vth, the information processing apparatus 10 provides a second feedback different from the first feedback.
  • the second feedback may be, for example, a sound having a higher pitch (frequency) than the first feedback. If the first feedback is not zero feedback, the second feedback may be zero feedback.
  • the information processing apparatus 10 does not necessarily have to always provide feedback.
  • the information processing apparatus 10 does not have to give feedback to the user when the moving speed of the user's hand H does not exceed the threshold value Vth. Then, the information processing apparatus 10 may provide feedback when the moving speed exceeds the threshold value Vth. This makes it possible to inform the user U that the gesture input speed is too high.
  • the threshold value is not limited to one and may be multiple.
  • there may be a plurality of further feedbacks (for example, a fourth feedback, a fifth feedback, and so on)
  • the information processing apparatus 10 may continuously change the feedback according to the moving speed of the hand H of the user U.
  • for example, the frequency of the sound may be gradually changed according to the change in the moving speed of the hand H of the user U, as in the sketch below.
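  • As a rough illustration of this speed-dependent feedback, the following Python sketch selects a first or second feedback tone around a threshold, or scales the pitch continuously with the hand speed. This is a minimal sketch under assumed values; the threshold, frequencies, and the play_tone stub are illustrative, not taken from the disclosure.

      V_TH = 0.5  # assumed speed threshold [m/s], standing in for Vth

      def play_tone(frequency_hz: float) -> None:
          # Stub for the actual sound output of the headphones/earphones.
          print(f"feedback tone at {frequency_hz:.0f} Hz")

      def feedback_for_speed(speed: float, continuous: bool = False) -> None:
          """Emit the first feedback below Vth and the second above it.

          With continuous=True the pitch instead changes gradually with
          the moving speed, as in the variant described above.
          """
          if continuous:
              play_tone(400.0 + 800.0 * speed)  # pitch rises with speed
          elif speed <= V_TH:
              play_tone(440.0)  # first feedback: speed is acceptable
          else:
              play_tone(880.0)  # second feedback: warns the hand is too fast

      feedback_for_speed(0.3)                   # slow hand -> first feedback
      feedback_for_speed(0.9)                   # fast hand -> second feedback
      feedback_for_speed(0.9, continuous=True)  # gradually scaled pitch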
  • since the information processing apparatus 10 provides feedback according to the position of the hand H of the user U, even if the detectable range AR is in a position the user U cannot see, the user U can learn the appropriate position for performing the gesture (the position of the detectable range AR) while performing it. Further, since the information processing apparatus 10 changes the feedback according to the moving speed of the hand H of the user U, the user U can learn the appropriate input speed of the gesture while performing it. As a result, the user comes to perform the gesture at an appropriate position and speed, and the accuracy of gesture detection by the information processing apparatus 10 increases. The user can therefore use the information processing apparatus 10 without feeling much stress, so the usability of the information processing apparatus 10 is improved.
  • << Embodiment 1 >> The outline of the present embodiment has been described above. The information processing apparatus 10 according to the first embodiment will be described below.
  • the information processing device 10 is an information processing terminal that includes a detection unit that detects an object and executes various functions based on a user's input operation.
  • the information processing apparatus 10 is a device capable of outputting sound, which is attached to a portion of the body of the user U that cannot be visually recognized by the user U.
  • the information processing device 10 is an acoustic output device that can be worn by a user such as earphones and headphones.
  • the information processing device 10 may be a display device such as an AR (Augmented Reality) glass or an MR (Mixed Reality) glass that can be worn by a user.
  • the information processing apparatus 10 is a device that can be worn by the user U and has a predetermined portion that is located at least at an ear of the user U when worn. If the information processing device 10 is an earphone or a headphone, the predetermined portion is the speaker and the housing that stores the speaker. It is assumed that at least a detection unit for detecting an object is arranged in this predetermined portion.
  • the information processing apparatus 10 is configured to be able to detect a user's gesture, and executes a function such as music playback based on the user's gesture.
  • FIG. 5 is a diagram showing a configuration example of the information processing apparatus 10 according to the embodiment of the present disclosure.
  • the information processing apparatus 10 includes an input detection unit 11, a state detection unit 12, an output unit 13, a communication unit 14, a storage unit 15, and a control unit 16.
  • the configuration shown in FIG. 5 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the information processing apparatus 10 may be distributed and implemented in a plurality of physically separated configurations.
  • the input detection unit 11 is a detection unit that detects a user's input operation.
  • the input detection unit 11 is a non-contact type detection unit that detects an object in a non-contact manner.
  • the input detection unit 11 includes one or a plurality of proximity sensors (also referred to as non-contact sensors), and detects an object located in the detection area.
  • the proximity sensor may be an optical sensor or a capacitance type sensor.
  • An optical proximity sensor is a sensor that detects the light reflected by an object.
  • the capacitance type proximity sensor is a sensor that detects a change in capacitance that occurs between an object and the sensor.
  • when the input detection unit 11 is a non-contact type, it may be a 2D sensor or a 3D sensor. When the input detection unit 11 is a 2D sensor, the detection area it forms is a two-dimensional area; when the input detection unit 11 is a 3D sensor, the detection area it forms is a three-dimensional area.
  • the input detection unit 11 is not limited to the non-contact type.
  • the input detection unit 11 may be a contact type detection unit.
  • the input detection unit 11 includes one or a plurality of touch sensors, and detects an object that comes into contact with a predetermined place of the information processing apparatus 10.
  • the predetermined location is, for example, the side surface of the headphones (the surface opposite the speaker).
  • the predetermined location is, for example, the side surface of the earphone (the surface opposite the speaker).
  • the side surface of the headphones or the side surface of the earphones may be referred to as the housing surface.
  • FIG. 6 is a diagram showing a configuration example of the input detection unit 11.
  • FIG. 6 shows a state in which the headphone-type information processing apparatus 10 is viewed from the side surface of the headphones.
  • the configuration shown in FIG. 6 is merely an example, and the configuration of the input detection unit 11 is not limited to the configuration shown in FIG.
  • the input detection unit 11 includes an infrared emitter ID and four photosensors PD1, PD2, PD3, and PD4 as non-contact sensors.
  • the infrared emitter ID is arranged in the center of the housing surface, and the four photosensors PD1, PD2, PD3, and PD4 are arranged at equal intervals in the circumferential direction so as to surround the infrared emitter ID.
  • the input detection unit 11 detects an object by detecting, with the four photosensors PD1, PD2, PD3, and PD4, the infrared light emitted from the infrared emitter ID and reflected on the surface of the object (one possible use of such readings is sketched below).
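  • One plausible way to turn the four photosensor readings into a hand position is an intensity-weighted centroid of the sensor positions; this is an assumption about how such a layout could be used, not a method stated in the disclosure:

      # Hypothetical sketch: estimating a 2D hand position from the four
      # photosensors PD1..PD4 arranged around the central infrared emitter ID.
      # The sensor coordinates and the centroid method are assumptions.

      SENSORS = {"PD1": (0.0, 1.0), "PD2": (1.0, 0.0),
                 "PD3": (0.0, -1.0), "PD4": (-1.0, 0.0)}

      def estimate_position(readings: dict) -> tuple:
          """Intensity-weighted centroid: a stronger reflection pulls the
          estimate toward the corresponding sensor."""
          total = sum(readings.values())
          if total == 0:
              raise ValueError("no object detected")
          x = sum(SENSORS[k][0] * v for k, v in readings.items()) / total
          y = sum(SENSORS[k][1] * v for k, v in readings.items()) / total
          return (x, y)

      # A hand closer to PD2 reflects more infrared light onto PD2.
      print(estimate_position({"PD1": 0.2, "PD2": 0.9, "PD3": 0.1, "PD4": 0.1}))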
  • the input detection unit 11 is provided with a touch pad TP as a contact type sensor.
  • the touch pad TP is composed of a plurality of touch sensors arranged in a plane.
  • the touch pad TP is arranged on the housing surface, and the input detection unit 11 detects an object in contact with the surface of the touch pad TP.
  • the state detection unit 12 is a sensor unit that detects the state of the information processing device 10.
  • the state detection unit 12 is composed of one or a plurality of sensors that detect the state of the information processing device 10.
  • the state detection unit 12 includes, for example, one or a plurality of acceleration sensors. Further, the state detection unit 12 may include one or a plurality of gyro sensors. Further, the state detection unit 12 may include one or a plurality of geomagnetic sensors.
  • the state detection unit 12 may include a motion sensor in which these plurality of types of sensors are combined.
  • the state detection unit 12 detects the state of the information processing apparatus 10 based on the detection results of these sensors. For example, the state detection unit 12 detects the direction of gravity based on their sensor values. Since gravitational acceleration acts on the information processing device 10 at all times, the device can detect the direction of gravity by averaging the directions of the accelerations detected by the acceleration sensor over a certain period of time, as sketched below.
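  • A minimal sketch of this gravity estimate, assuming a small window of accelerometer samples (the sample data and window length are illustrative):

      import math

      def gravity_direction(samples):
          """Average acceleration over a window; motion roughly cancels
          out, so the mean points along the constant gravity vector."""
          n = len(samples)
          mean = [sum(s[i] for s in samples) / n for i in range(3)]
          norm = math.sqrt(sum(c * c for c in mean))
          return [c / norm for c in mean]

      # At rest the sensor mostly measures ~9.8 m/s^2 along gravity,
      # plus small motion noise that averages away.
      window = [(0.1, -0.2, 9.7), (-0.1, 0.1, 9.9), (0.0, 0.0, 9.8)]
      print(gravity_direction(window))  # approximately (0, 0, 1)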
  • the output unit 13 is an output interface that outputs information to the user.
  • the output unit 13 may be an acoustic device such as a speaker or a buzzer, or may be a vibration device such as a vibration motor.
  • the output unit 13 may be a display device such as a liquid crystal display or an organic EL (electroluminescence) display, or a lighting device such as an LED (Light Emitting Diode) lamp.
  • the output unit 13 functions as an output means of the information processing apparatus 10.
  • the output unit 13 outputs various information to the user according to the control of the control unit 16.
  • the communication unit 14 is a communication interface for communicating with other devices.
  • the communication unit 14 may be a network interface or a device connection interface. Further, the communication unit 14 may be a wired interface or a wireless interface.
  • the communication unit 14 functions as a communication means of the information processing device 10.
  • the communication unit 14 communicates with other devices according to the control of the control unit 16. Other devices are, for example, terminal devices such as music players and smartphones.
  • the storage unit 15 is a storage device capable of reading and writing data, such as a DRAM (Dynamic Random Access Memory), an SRAM (Static Random Access Memory), a flash memory, or a hard disk.
  • the storage unit 15 functions as a storage means for the information processing device 10.
  • the storage unit 15 stores, for example, the settings related to the detectable range AR of the gesture.
  • the control unit 16 is a controller that controls each unit of the information processing device 10.
  • the control unit 16 is realized by, for example, a processor such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit).
  • the control unit 16 is realized by the processor executing various programs stored in the storage device inside the information processing device 10 using a RAM (Random Access Memory) or the like as a work area.
  • the control unit 16 may be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the CPU, MPU, ASIC, and FPGA can all be regarded as controllers.
  • the control unit 16 includes an acquisition unit 161, a gesture detection unit 162, a command execution unit 163, an output control unit 164, and an estimation unit 165.
  • each block (acquisition unit 161 to estimation unit 165) constituting the control unit 16 is a functional block indicating a function of the control unit 16.
  • These functional blocks may be software blocks or hardware blocks.
  • each of the above-mentioned functional blocks may be one software module realized by software (including a microprogram), or may be one circuit block on a semiconductor chip (die).
  • each functional block may be one processor or one integrated circuit. The method of configuring the functional block is arbitrary.
  • the control unit 16 may be configured in a functional unit different from the above-mentioned functional block. Further, another device may perform a part or all of the operations of each block (acquisition unit 161 to estimation unit 165) constituting the control unit 16. For example, a terminal device such as a music player or a smartphone may perform some or all operations of each block constituting the control unit 16. The operation of each block constituting the control unit 16 will be described later.
  • FIG. 7 is a diagram showing the correspondence between the functions of the information processing apparatus 10 and the gestures.
  • the information processing device 10 has a function related to answering a telephone in addition to a function related to playing music.
  • the information processing apparatus 10 has the functions shown in the following (1) to (7).
  • "function" can be paraphrased as "command”. (1) Play / Pause (2) AnswerCall / End (3) Next / Prev (4) Vol + / Vol- (5) Candle (6) Quick Attention (7) Ambient Control
  • the user's actions constituting one gesture may include a first action that triggers the start of the gesture and a second action following the first action.
  • the first operation is, for example, a swing-up operation of the hand H of the user U
  • the second operation is, for example, an operation such as a swipe following the first operation. It is also possible to regard each of the first motion and the second motion as one gesture.
  • the functions and gestures described here are just examples.
  • the functions and gestures of the information processing apparatus 10 are not limited to the functions and gestures shown below.
  • the first operation and the second operation are not limited to the operations shown below.
  • the combination of the function and the gesture is not limited to the combination shown below.
  • the information processing device 10 is assumed to be a headphone-type device worn on the user's ear, but the information processing device 10 is not limited to the headphone-type device.
  • the information processing device 10 may be an earphone type device.
  • the information processing apparatus 10 may have a predetermined portion to be attached to the user's ear and other portions.
  • the apparatus attached to the user's ear may be a predetermined device other than the information processing apparatus 10.
  • the predetermined device is, for example, an output device (for example, headphones or earphones) connected to the information processing device 10 wirelessly or by wire. If the information processing device 10 is worn on the user's ear, the predetermined device is the information processing device 10 itself.
  • the term "predetermined device" can also be paraphrased as "predetermined apparatus".
  • "Play" is a function for playing a song
  • "Pause” is a function for pausing the playback of a song.
  • the "Play / Pause” is associated with the swing-up motion of the hand H as the first motion, and is associated with the pinch motion as the second motion.
  • FIG. 8 is a diagram for explaining a pinch operation. As shown in FIG. 8, the pinch operation is an operation of picking an object with a finger.
  • the information processing apparatus 10 can detect that the user U has performed a pinch operation even if the user U has not actually picked up an object (for example, the information processing apparatus 10).
  • FIG. 9 is a diagram for explaining the hold operation. As shown in FIG. 9, the hold operation is an operation of grasping an object with the hand H.
  • the information processing apparatus 10 can detect that the user U has performed the hold operation even if the user U does not actually grasp an object (for example, the information processing apparatus 10).
  • Next / Prev is a function for cueing the next song
  • Prev is a function for cueing the previous song or the song being played.
  • the "Next / Prev” is associated with the swing-up motion of the hand H as the first motion, and is associated with the left / right swipe as the second motion.
  • “Next” is associated with the left swipe (front swipe) of the left and right swipes as the second operation
  • "Prev” is associated with the right swipe of the left and right swipes as the second operation (prev). After swipe) is associated.
  • “Next / Prev” may be associated with a gesture that performs the second operation without the first operation.
  • FIGS. 10A and 10B are diagrams for explaining the left/right swipe.
  • FIG. 10A shows a left swipe and FIG. 10B shows a right swipe.
  • the left / right swipe is an operation (gesture) involving a movement width, unlike the pinch operation and the hold operation.
  • the left swipe is an operation in which the user U slides the hand H in the forward direction (X-axis plus direction) by a predetermined movement width W1 as shown in FIG. 10A.
  • the right swipe is an operation in which the user U slides the hand H in the backward direction (X-axis minus direction) by a predetermined movement width W2 as shown in FIG. 10B.
  • FIGS. 10A and 10B both show the user U performing the gesture near the left ear.
  • note that when the gesture is performed near the right ear, the right swipe becomes the action of sliding the hand H forward
  • and the left swipe becomes the action of sliding the hand H backward.
  • in that case, "Next" is associated with, for example, the right swipe (front swipe) as the second action
  • and "Prev" is associated with, for example, the left swipe (back swipe) as the second action.
  • "Vol+" is a function for raising the volume
  • "Vol-" is a function for lowering the volume.
  • "Vol+/Vol-" is associated with the swing-up motion of the hand H as the first motion and with an up/down swipe as the second motion.
  • "Vol+" is associated with the up swipe of the up and down swipes as the second action
  • "Vol-" is associated with the down swipe of the up and down swipes as the second action.
  • "Vol+/Vol-" may also be associated with a gesture that performs the second operation without the first operation.
  • FIGS. 11A and 11B are diagrams for explaining the up/down swipe.
  • FIG. 11A shows an up swipe and FIG. 11B shows a down swipe.
  • the up/down swipe is an operation (gesture) involving a movement width, unlike the pinch operation and the hold operation.
  • the up swipe is an operation in which the user U slides the hand H upward (Z-axis plus direction) by a predetermined movement width W3, as shown in FIG. 11A.
  • the down swipe is an operation in which the user U slides the hand H downward (Z-axis minus direction) by a predetermined movement width W4, as shown in FIG. 11B.
  • "Vol+/Vol-" is a function for adjusting the volume. To execute this function, it is therefore desirable that the information processing apparatus 10 can acquire from the user U not only whether to raise or lower the volume but also the amount by which to raise or lower it. Therefore, in the present embodiment, "Vol+/Vol-" is assumed to be a function accompanied by an operation amount (in the case of this function, the amount by which the volume is raised or lowered). For example, the movement amount or movement speed of the hand H during the up/down swipe is associated with the operation amount of "Vol+/Vol-" (see the sketch below). Alternatively, by configuring the information processing apparatus 10 to raise or lower the volume by a fixed amount per input operation, this function can be made a function without an operation amount.
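  • A minimal sketch of a function with an operation amount, assuming the faster the up/down swipe, the larger the volume step; the gain and clamp values are illustrative assumptions, not values from the disclosure:

      MAX_STEP = 10  # assumed cap on the volume change per swipe

      def volume_delta(swipe_speed: float, direction_up: bool, gain: float = 8.0) -> int:
          """Map hand speed [m/s] to a signed volume step."""
          step = min(MAX_STEP, max(1, round(gain * swipe_speed)))
          return step if direction_up else -step

      volume = 20
      volume += volume_delta(0.4, direction_up=True)   # gentle swipe -> +3
      volume += volume_delta(1.2, direction_up=False)  # brisk swipe  -> -10
      print(volume)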
  • "Cancel" is a function for canceling the operation performed by the user U.
  • "Cancel" is associated with the swing-up motion of the hand H as the first motion and with the motion of holding the hand over the device as the second motion.
  • FIGS. 12A and 12B are diagrams for explaining the operation of holding the hand H over the device. FIG. 12A is a view of the user U from the left side, and FIG. 12B is a view of the user U from the front. As shown in FIGS. 12A and 12B, this operation is one in which the user U holds the spread hand H toward the information processing apparatus 10.
  • Quick Attention is a function for the user U to quickly hear the surrounding sound.
  • “Quick Attention” is a function that quickly lowers the volume of the sound being output (for example, music, call voice, or ringtone) to make it easier to hear the surrounding sound.
  • the "Quick Attention” is associated with the swinging motion of the hand H as the first motion, and is associated with the touch motion as the second motion.
  • FIG. 13 is a diagram for explaining the touch operation. As shown in FIG. 13, the touch operation is an operation in which the user U touches a part of the information processing apparatus 10 (for example, the housing surface).
  • Ambient Control is a function for the user U to listen to music while checking the surrounding sounds. Specifically, “Ambient Control” is a function of capturing external sounds during music reproduction.
  • the "Ambient Control” is associated with a swing-up motion of the hand H as a first motion, and is associated with a motion of moving the hand H away as a second motion.
  • FIG. 14 is a diagram for explaining an operation of moving the hand H away.
  • the motion of moving the hand H away is a motion (gesture) with a movement width, unlike the pinch motion, the hold motion, the touch motion, and the motion of holding the hand.
  • the operation of moving the hand H away is an operation in which, after the user U holds the hand H over the information processing apparatus 10, the user U moves the hand H in a predetermined separation direction by a predetermined movement width W4 (in the example of FIG. 14, in the Y-axis plus direction).
  • FIG. 15 is a flowchart showing a command execution process according to the present embodiment.
  • the command execution process is a process in which the information processing apparatus 10 executes a command input by the user U using a gesture.
  • the "command” is a command from the inside or the outside of the device for causing the information processing device 10 to execute a function. It is also possible to regard the "command” as a word indicating the function itself of the information processing apparatus 10, such as Play / Pause.
  • the description of "command” appearing in the following explanation can be replaced with "function”.
  • the command execution process is executed by the control unit 16 of the information processing device 10.
  • the command execution process is started, for example, when the information processing apparatus 10 is turned on.
  • the acquisition unit 161 of the control unit 16 acquires the sensor value from the input detection unit 11 and the state detection unit 12 (step S101). For example, the acquisition unit 161 acquires information about an object in the detection region OR from a non-contact type sensor such as a proximity sensor. Further, the acquisition unit 161 acquires information regarding contact with the housing surface from a contact-type sensor such as a touch sensor. Further, the acquisition unit 161 may acquire information regarding the state of the information processing device 10 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
  • the acquisition unit 161 acquires information regarding the input speed of the gesture of the user U performed in the detection area OR based on the sensor value acquired in step S101 (step S102). For example, the acquisition unit 161 acquires information on the movement speed of the user U's hand in the detection area OR based on the sensor value from the input detection unit 11. In addition to the information regarding the input speed, the acquisition unit 161 may acquire information indicating the position of the hand H of the user U who is performing the gesture in the detection area OR.
  • the control unit 16 may change at least one of the position, size, and shape of the detectable range AR so that the user U can perform an appropriate input operation when acquiring information on the input speed.
  • the control unit 16 may change the position, size, and shape of the detectable range AR based on information acquired from the input detection unit 11 or the state detection unit 12.
  • the output control unit 164 of the control unit 16 performs output control regarding feedback to the user U (step S103).
  • the output control of the output control unit 164 is roughly classified into the following (1) to (4). (1) Feedback based on speed information (2) Feedback based on position information (3) Feedback based on speed information and position information (4) Feedback based on estimation results
  • the output control unit 164 can execute a plurality of processes selected from the output control processes shown in the following (1) to (4) in parallel or in combination.
  • the output control unit 164 controls the output related to feedback to the user U based on the information related to the input speed. For example, the output control unit 164 performs output control regarding feedback to the user U based on the information on the movement speed of the hand H of the user U.
  • the feedback may be sound or vibration.
  • the output control unit 164 may provide feedback in real time while the user U is performing the gesture. For example, the output control unit 164 may provide feedback corresponding to the movement speed of the user U's hand without delay after acquiring the movement speed information.
  • the output control unit 164 may change the feedback to the user U according to the moving speed of the hand H of the user U.
  • for example, the information processing apparatus 10 may change the feedback depending on whether or not the moving speed of the hand H of the user U exceeds the predetermined threshold value Vth.
  • when the moving speed does not exceed the threshold value Vth, the output control unit 164 provides the first feedback.
  • when the moving speed exceeds the threshold value Vth, the output control unit 164 provides a second feedback different from the first feedback. As a result, the output control unit 164 can inform the user U of the appropriate moving speed of the hand H.
  • either one of the first feedback and the second feedback may be zero feedback.
  • alternatively, the output control unit 164 may give no feedback to the user in the normal case, and give feedback when the hand H of the user is moving at a speed slower than the predetermined threshold value Vth.
  • there may be a first threshold value Vth11 and a second threshold value Vth12 larger than the first threshold value Vth11.
  • the speed between the first threshold value Vth11 and the second threshold value Vth12 is the appropriate moving speed of the hand H of the user U assumed by the device developer.
  • either of the first threshold value Vth11 and the second threshold value Vth12 may be regarded as the above-mentioned predetermined threshold value Vth.
  • when the moving speed of the hand H of the user U does not exceed the first threshold value Vth11, the output control unit 164 provides the first feedback. When the moving speed exceeds the first threshold value Vth11 but does not exceed the second threshold value Vth12, the output control unit 164 provides a second feedback different from the first feedback. When the moving speed exceeds the second threshold value Vth12, the output control unit 164 provides a third feedback different from the second feedback. The first feedback and the third feedback may be different or the same. As a result, the output control unit 164 can inform the user U of an appropriate moving speed of the hand H that is neither too fast nor too slow, as the sketch below illustrates.
  • the feedback may be sound. That is, the output control unit 164 may perform sound output control as output control related to feedback. In this case, the output control unit 164 may change the sound output mode based on the information regarding the input speed. More specifically, the output control unit 164 may change the sound as feedback according to the moving speed of the hand H of the user U.
  • the sound element changed by the output control unit 164 may be a volume, a frequency, or an output pattern. Frequency can be rephrased as pitch. Further, if the sound to be fed back is a voice, the element of the sound may be the intonation of the voice.
  • the first feedback, the second feedback, and the third feedback described above may be sounds having different volumes, frequencies, output patterns, or intonations. Of course, if the user U can recognize different sounds, the sound elements changed by the output control unit 164 are not limited to the volume, frequency, output pattern, and intonation. Further, when the feedback is changed, the output control unit 164 may change the combination of a plurality of sound elements.
  • the output control unit 164 may provide feedback according to the position of the hand H of the user U. For example, the output control unit 164 may provide feedback at all times while the position of the user U's hand is within the detectable range AR. When the feedback is sound, the output control unit 164 may always output sound while the hand H of the user U is within the detectable range AR. Then, the output control unit 164 may output no sound as feedback when the hand H of the user U is not within the detectable range AR. As a result, the user U can know whether or not his/her hand is within the detectable range AR.
  • the output control unit 164 may provide feedback according to both the moving speed of the hand H of the user U and its position. At this time, the output control unit 164 may change the threshold value Vth according to the position of the user's hand in the detection area OR. For example, the output control unit 164 may change the predetermined threshold value Vth for changing the feedback according to the position of the user's hand in the detectable range AR. At this time, the output control unit 164 may divide the detectable range AR into a plurality of regions for processing.
  • FIG. 16 is a diagram for explaining a detectable range AR divided into a plurality of regions. In the example of FIG. 16, the detectable range AR is divided into a central region AR1 that occupies a certain range at the center of the detectable range AR and an inner edge region AR2 that occupies a certain range inward from the edge of the detectable range AR.
  • the output control unit 164 may change the predetermined threshold value Vth for changing the feedback depending on whether the hand H of the user U is in the central region AR1 or in the inner edge region AR2.
  • for example, the output control unit 164 sets the predetermined threshold value Vth to a threshold value Vth21 when the hand H of the user U is in the central region AR1, and to a threshold value Vth22 smaller than Vth21 when the hand H of the user U is in the inner edge region AR2 (see the sketch below).
  • both the threshold value Vth21 and the threshold value Vth22 are regarded as a kind of the predetermined threshold value Vth.
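  • A minimal sketch of this region-dependent threshold, assuming a square detectable range and illustrative threshold values (the region bounds and numbers are assumptions, not values from the disclosure):

      V_TH21, V_TH22 = 1.0, 0.4  # assumed thresholds: Vth22 < Vth21

      def in_central_region(x: float, z: float) -> bool:
          # Assume AR is the square [-1, 1]^2 and AR1 is its inner half.
          return abs(x) <= 0.5 and abs(z) <= 0.5

      def threshold_for(x: float, z: float) -> float:
          return V_TH21 if in_central_region(x, z) else V_TH22

      def feedback(x: float, z: float, speed: float) -> str:
          return ("second feedback (slow down)" if speed > threshold_for(x, z)
                  else "first feedback")

      print(feedback(0.0, 0.0, 0.7))  # center: 0.7 < Vth21 -> first feedback
      print(feedback(0.9, 0.0, 0.7))  # near edge: 0.7 > Vth22 -> second feedback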
  • FIGS. 17A and 17B are diagrams for explaining the change in the feedback threshold value Vth based on information on the position of the hand H of the user U.
  • in FIG. 17A, the hand H of the user U is moving at a moving speed V21 in the central region AR1
  • in FIG. 17B, the hand H of the user U is moving in the inner edge region AR2 at a moving speed V22 slower than the moving speed V21.
  • here, it is assumed that the output control unit 164 gives the first feedback to the user U when the moving speed of the hand H of the user U does not exceed the predetermined threshold value Vth, and gives a second feedback different from the first feedback when the moving speed exceeds the predetermined threshold value Vth.
  • in FIG. 17A, the moving speed V21 of the hand H of the user U is faster than the moving speed V22 shown in FIG. 17B, but slower than the threshold value Vth21 associated with the central region AR1. Therefore, the output control unit 164 gives the user U the first feedback.
  • in FIG. 17B, the moving speed V22 of the hand H of the user U is slower than the moving speed V21 shown in FIG. 17A, but faster than the threshold value Vth22 associated with the inner edge region AR2. Therefore, the output control unit 164 gives the user U the second feedback.
  • in this way, the information processing apparatus 10 can reduce excessive feedback on gestures in a region where a high moving speed is allowed (for example, the central region AR1).
  • at the same time, the information processing apparatus 10 can provide highly sensitive feedback on gestures in a region where a high moving speed is not allowed (for example, the inner edge region AR2).
  • the information processing apparatus 10 can reduce the possibility that the gesture is performed outside the detectable range AR without giving the user stress due to excessive feedback.
  • the output control unit 164 may provide feedback based on the estimation result of whether or not the gesture will be performed outside the detectable range AR.
  • the estimation unit 165 of the control unit 16 estimates whether the input operation of the gesture of the user U will be performed outside the detectable range AR, based on information on the position of the hand H of the user U performing the gesture and information on the moving speed of the hand H.
  • for example, the estimation unit 165 estimates from the swipe start position and the moving speed of the user's hand H whether the swipe end position will fall outside the detectable range AR. Then, when it is estimated that the input operation of the gesture will be performed outside the detectable range AR, the output control unit 164 gives the user U predetermined feedback as a warning, as sketched below.
  • the predetermined feedback may be provided independently of the feedback based on the speed information or the position information described in (1) to (3) above.
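  • A minimal sketch of this estimate, assuming the swipe can be linearly extrapolated from its start position and speed; the range bounds, swipe duration, and warn stub are assumptions:

      AR_MIN, AR_MAX = -1.0, 1.0  # assumed 1D bounds of AR along the swipe axis
      SWIPE_DURATION = 0.3        # assumed typical remaining swipe time [s]

      def will_exit_range(start: float, velocity: float) -> bool:
          """Predict the swipe end position by linear extrapolation."""
          end = start + velocity * SWIPE_DURATION
          return not (AR_MIN <= end <= AR_MAX)

      def warn(msg: str) -> None:
          print("warning feedback:", msg)

      if will_exit_range(start=0.8, velocity=1.5):  # fast swipe near the edge
          warn("gesture is about to leave the detectable range")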
  • the gesture detection unit 162 of the control unit 16 determines whether or not a gesture has been detected (step S104). For example, the gesture detection unit 162 determines whether the first operation has been detected in the detectable range AR and then the second operation has been detected in the detectable range AR. Note that "detection" of a gesture is used here in a broad sense that includes recognition of the gesture. When no gesture is detected (step S104: No), the gesture detection unit 162 returns the process to step S101.
  • when a gesture is detected (step S104: Yes), the command execution unit 163 of the control unit 16 executes the command corresponding to the detected gesture (step S105).
  • the command execution unit 163 executes the function of the information processing apparatus 10 associated with the detected gesture.
  • when the command execution is completed, the control unit 16 returns the process to step S101 and executes steps S101 to S105 again (the whole loop is sketched below).
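  • A condensed sketch of the command execution flow (steps S101 to S105); read_sensors, detect_gesture, the COMMANDS table, and the speed extraction are hypothetical stand-ins for the units described above:

      COMMANDS = {"pinch": "Play/Pause", "front_swipe": "Next", "back_swipe": "Prev"}

      def command_loop(read_sensors, detect_gesture, give_feedback, running):
          while running():
              values = read_sensors()                        # S101: sensor values
              speed = values.get("hand_speed", 0.0)          # S102: input speed
              give_feedback(speed)                           # S103: feedback output
              gesture = detect_gesture(values)               # S104: gesture detected?
              if gesture is None:
                  continue                                   # S104: No -> back to S101
              print("execute:", COMMANDS.get(gesture, "?"))  # S105: execute command

      # One illustrative pass: a front swipe at moderate speed triggers "Next".
      ticks = iter([True, False])
      command_loop(read_sensors=lambda: {"hand_speed": 0.4, "gesture": "front_swipe"},
                   detect_gesture=lambda v: v.get("gesture"),
                   give_feedback=lambda s: print(f"feedback for speed {s}"),
                   running=lambda: next(ticks))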
  • since feedback is given to the user U based on information on the gesture input speed, the user U can improve the accuracy of gesture input based on the feedback. As a result, the gesture detection accuracy of the information processing apparatus 10 increases, so the information processing apparatus 10 can realize high usability.
  • << Embodiment 2 >>
  • in the first embodiment, gestures are input to the information processing apparatus 10 one at a time, but the information processing apparatus 10 may be configured so that gestures can be input continuously.
  • the information processing apparatus 10 may be configured so that the user can continuously input the second operation after executing the first operation.
  • for example, the information processing apparatus 10 may be configured so that the user can continuously input the up swipe after executing the swing-up operation.
  • a gesture that can be continuously input is, for example, a gesture associated with a function with an operation amount (for example, an amount of raising or lowering the volume) such as "Vol+/Vol-". It is assumed that the operation amount of the function is associated with the moving speed of the hand H of the user U performing the gesture.
  • the information processing apparatus 10 of the second embodiment will be described. Since the functional block configuration of the information processing apparatus 10 of the second embodiment is the same as that of the first embodiment, the description thereof will be omitted.
  • FIG. 18 is a flowchart showing a command execution process according to the present embodiment.
  • the command execution process is executed by the control unit 16 of the information processing apparatus 10.
  • the command execution process is started, for example, when the information processing apparatus 10 is turned on.
  • the acquisition unit 161 of the control unit 16 acquires the sensor value from the input detection unit 11 and the state detection unit 12 (step S201).
  • the acquisition unit 161 may acquire information about an object in the detection region OR from a non-contact type sensor such as a proximity sensor. Further, the acquisition unit 161 may acquire information regarding contact with the housing surface from a contact-type sensor such as a touch sensor. Further, the acquisition unit 161 may acquire information regarding the state of the information processing device 10 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
  • the acquisition unit 161 acquires the information of the first moving speed based on the sensor value acquired in step S201 (step S202).
  • the information of the first moving speed is information on the moving speed of the hand H of the user U performing the first gesture (second operation) of a continuous input of a gesture that can be continuously input.
  • the information of the first moving speed may also be information on the moving speed of the hand H of the user U performing a gesture that cannot be continuously input.
  • the output control unit 164 of the control unit 16 may provide feedback to the user U based on the information of the first movement speed.
  • the gesture detection unit 162 of the control unit 16 determines whether or not the gesture has been detected (step S203). For example, the gesture detection unit 162 determines whether or not the first operation is detected in the detectable range AR and then the second operation is detected in the detectable range AR. When the gesture is not detected (step S203: No), the gesture detection unit 162 returns the process to step S201.
  • when a gesture is detected (step S203: Yes), the command execution unit 163 of the control unit 16 executes the command corresponding to the detected gesture (step S204).
  • the command execution unit 163 executes the function of the information processing apparatus 10 associated with the detected gesture.
  • the command execution unit 163 determines whether or not the conditions for accepting continuous input of gestures are satisfied (step S205). For example, the gesture detection unit 162 determines whether the gesture detected in step S203 is a gesture capable of continuous input. Alternatively, if the gesture capable of continuous input has a predetermined number of continuous inputs, the gesture detection unit 162 determines whether the number of continuous inputs of the user U's gesture exceeds the predetermined number of continuous inputs.
  • the acceptance condition may be based on the estimation result of whether or not the gesture input operation after the second continuous input is performed outside the detectable range.
  • for example, the estimation unit 165 of the control unit 16 estimates, based on information on the position and moving speed of the hand of the user U performing the first continuous input of a predetermined gesture, whether the input operation of the second continuous input of the gesture will be performed outside the detectable range AR of the gesture. Then, when it is estimated that the input operation of the gesture for the second continuous input will be performed outside the detectable range AR, the command execution unit 163 determines that the reception condition is not satisfied and need not execute the command related to the second gesture of the continuous input. As a result, an operation that is not intended as continuous input can be prevented from being erroneously detected as continuous input.
  • if the reception condition is not satisfied (step S205: No), the gesture detection unit 162 returns the process to step S201.
  • when the reception condition is satisfied (step S205: Yes), the acquisition unit 161 acquires the sensor value from the input detection unit 11 and the state detection unit 12 (step S206). Then, the acquisition unit 161 acquires the information of the second moving speed based on the sensor value acquired in step S206 (step S207).
  • the information of the second moving speed is information on the moving speed of the hand H of the user U performing the second or subsequent gesture (second operation) of the continuous input of a gesture that can be continuously input.
  • the output control unit 164 performs output control regarding feedback when the second and subsequent gestures of the continuous input are performed, based on the information of the first moving speed and the information of the second moving speed (step S208). For example, the output control unit 164 performs this output control based on the comparison result between the first moving speed and the second moving speed. More specifically, when the second moving speed differs from the first moving speed by a predetermined speed or more, the output control unit 164 changes the feedback for the second gesture of the continuous input from the feedback for the first gesture of the continuous input (see the sketch below). The feedback may be sound or vibration.
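  • A minimal sketch of this comparison in step S208, assuming a fixed margin stands in for the "predetermined speed" (the margin value is an assumption):

      SPEED_MARGIN = 0.3  # assumed "predetermined speed" separating the two

      def continuous_feedback(first_speed: float, second_speed: float) -> str:
          """Second and subsequent inputs are judged relative to the first."""
          if abs(second_speed - first_speed) >= SPEED_MARGIN:
              return "changed feedback (speed differs from the first input)"
          return "same feedback as the first input"

      print(continuous_feedback(first_speed=0.5, second_speed=0.6))  # similar
      print(continuous_feedback(first_speed=0.5, second_speed=1.0))  # much faster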
  • the gesture detection unit 162 determines whether or not the gesture has been detected (step S209). For example, the gesture detection unit 162 determines whether or not the first operation is detected in the detectable range AR and then the second operation is detected in the detectable range AR. When the gesture is not detected (step S209: No), the gesture detection unit 162 returns the process to step S205.
  • when a gesture is detected (step S209: Yes), the command execution unit 163 of the control unit 16 executes the command corresponding to the detected gesture (step S210).
  • the command execution unit 163 executes the function of the information processing apparatus 10 associated with the detected gesture.
  • after that, the control unit 16 returns the process to step S205.
  • since the information processing apparatus 10 performs output control regarding feedback based on the comparison result between the first moving speed and the second moving speed, it is easy for the user to understand how much faster or slower the input speed of the second and subsequent continuous-input gestures is than the input speed of the first continuous-input gesture.
  • when the operation amount of the function is associated with the moving speed, the user U can grasp, from the comparison with the first input, the input speed of the second and subsequent gestures that corresponds to the desired operation amount.
  • as a result, the user U can achieve the desired input with a small number of continuous operations, for example raising the volume in one stroke.
  • << Embodiment 3 >> In the embodiments described above, the information processing apparatus 10 itself detects the movement of the hand H of the user and executes the command.
  • however, the detection of the movement of the hand H of the user and the execution of the command may be performed by different devices.
  • the information processing system 1 of the third embodiment will be described.
  • FIG. 19 is a diagram showing a configuration example of the information processing system 1.
  • the information processing system 1 is a system that executes various functions based on the gesture of the user. As shown in FIG. 19, the information processing system 1 includes an output device 20 and a terminal device 30. In the example of FIG. 19, the information processing system 1 includes an output device 20A and an output device 20B. In the example of FIG. 19, the output device 20 and the terminal device 30 are wirelessly connected, but the output device 20 and the terminal device 30 may be configured to be connectable by wire.
  • the information processing system 1 may be provided with only one output device 20 or may be provided with a plurality of output devices 20.
  • the output device 20 is an earphone
  • the information processing system 1 may include a pair of output devices 20 that are wirelessly connected to the terminal device 30 and are attached to the left and right ears of the user U, respectively.
  • the output device 20A is an earphone worn on the left ear of the user U
  • the output device 20B is an earphone worn on the right ear of the user U.
  • One output device 20 does not necessarily have to be one integrated device. A plurality of separate devices that are functionally or practically related may also be regarded as one output device 20. For example, a pair of left and right earphones worn on the left and right ears of the user U may be regarded as one output device 20. Of course, one output device 20 may be one integrated earphone worn on one ear of the user U, or one integrated headphone worn on both ears of the user U.
  • For example, the output device 20 is an acoustic output device that the user can wear, such as earphones or headphones. The output device 20 may also be a display device that the user can wear, such as AR glasses or MR glasses.
  • the output device 20 is a device that can be worn by the user U, and is provided with a portion located at least in the ear of the user U when worn. Then, at least a detection unit for detecting an object is arranged in this portion.
  • The detection unit is a functional block corresponding to the input detection unit 11 described in the first embodiment. If there are left and right portions located at the ears of the user U, both portions may each have a detection unit, or only one of them may have a detection unit.
  • FIG. 20 is a diagram showing a configuration example of the output device 20 according to the embodiment of the present disclosure.
  • the output device 20 includes an input detection unit 21, a state detection unit 22, an output unit 23, a communication unit 24, and a control unit 26.
  • the configuration shown in FIG. 20 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the output device 20 may be distributed and implemented in a plurality of physically separated configurations.
  • the input detection unit 21 is a detection unit that detects a user's input operation.
  • the state detection unit 22 is a sensor unit that detects the state of the output device 20.
  • the output unit 23 is an output interface that outputs information to the user.
  • the communication unit 24 is a communication interface for communicating with other devices such as the terminal device 30.
  • the control unit 26 is a controller that controls each unit of the output device 20.
  • The input detection unit 21, the state detection unit 22, the output unit 23, the communication unit 24, and the control unit 26 have the same configurations as the input detection unit 11, the state detection unit 12, the output unit 13, the communication unit 14, and the control unit 16 included in the information processing apparatus 10, respectively. In this case, the description of the information processing apparatus 10 may be replaced with the output device 20 as appropriate.
  • Next, the configuration of the terminal device 30 will be described.
  • the terminal device 30 is an information processing terminal capable of communicating with the output device 20.
  • the terminal device 30 is a kind of information processing device of the present embodiment.
  • the terminal device 30 is, for example, a mobile phone, a smart device (smartphone or tablet), a PDA (Personal Digital Assistant), or a personal computer.
  • the terminal device 30 may be a device such as a commercial camera equipped with a communication function, or may be a mobile body equipped with a communication device such as an FPU (Field Pickup Unit).
  • the terminal device 30 may be an M2M (Machine to Machine) device or an IoT (Internet of Things) device.
  • the terminal device 30 controls the output device 20 from the outside of the output device 20 via a wire or wirelessly.
  • FIG. 21 is a diagram showing a configuration example of the terminal device 30 according to the embodiment of the present disclosure.
  • the terminal device 30 includes an input unit 31, a state detection unit 32, an output unit 33, a communication unit 34, a storage unit 35, and a control unit 36.
  • the configuration shown in FIG. 21 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the terminal device 30 may be distributed and implemented in a plurality of physically separated configurations.
  • the input unit 31 is an input interface that accepts user input operations.
  • the input unit 31 is a button or a touch panel.
  • the state detection unit 32 is a sensor unit that detects the state of the terminal device 30.
  • the output unit 33 is an output interface that outputs information to the user.
  • the communication unit 34 is a communication interface for communicating with other devices such as the output device 20.
  • the storage unit 35 is a storage device capable of reading and writing data.
  • the storage unit 35 stores, for example, information about the detectable range AR of the gesture.
  • the control unit 36 is a controller that controls each unit of the terminal device 30.
  • the control unit 36 includes an acquisition unit 361, a gesture detection unit 362, a command execution unit 363, an output control unit 364, and a guessing unit 365.
  • The acquisition unit 361, the gesture detection unit 362, the command execution unit 363, the output control unit 364, and the guessing unit 365 have the same configurations as the acquisition unit 161, the gesture detection unit 162, the command execution unit 163, the output control unit 164, and the guessing unit 165 included in the control unit 16 of the information processing apparatus 10, respectively, except that the acquisition unit 361 acquires information from the input detection unit 21 and the state detection unit 22 via communication. In the description of these units, the information processing apparatus 10 may be appropriately replaced with the output device 20 or the terminal device 30.
  • the information processing system 1 can execute a command execution process in the same manner as the information processing apparatus 10 of the first embodiment.
  • the command execution process of the information processing system 1 is the same as the command execution process of the first embodiment except that the terminal device 30 acquires the sensor value from the output device 20 via communication.
  • the command execution process will be described with reference to FIG.
  • FIG. 15 is a flowchart showing a command execution process according to the present embodiment.
  • the command execution process is executed by the control unit 36 of the terminal device 30.
  • the command execution process is executed, for example, when the terminal device 30 establishes communication with the output device 20.
  • the acquisition unit 361 of the control unit 36 acquires the sensor value from the input detection unit 21 and the state detection unit 22 of the output device 20 via the communication unit 34 (step S101). For example, the acquisition unit 361 acquires information about an object in the detection region OR from a non-contact type sensor such as a proximity sensor. Further, the acquisition unit 361 acquires information regarding contact with the housing surface from a contact-type sensor such as a touch sensor. Further, the acquisition unit 361 may acquire information regarding the state of the output device 20 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
  • Next, the acquisition unit 361 acquires information regarding the input speed of the user's gesture performed in the detection area OR based on the sensor values acquired in step S101 (step S102). For example, the acquisition unit 361 acquires information of the movement speed of the hand of the user U in the detection area OR based on the sensor values from the input detection unit 21. In addition to the information regarding the input speed, the acquisition unit 361 may acquire information indicating the position of the hand H of the user U performing the gesture in the detection area OR. A sketch of this speed estimation follows.
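  • As a minimal sketch, the movement-speed information could be derived from two successive hand-position samples; the sampling representation is an assumption for illustration.

      import math

      def movement_speed(p0, p1, dt: float) -> float:
          # Estimate the hand's movement speed from two successive (x, z)
          # position samples taken dt seconds apart.
          dx = p1[0] - p0[0]
          dz = p1[1] - p0[1]
          return math.hypot(dx, dz) / dt

      # Two samples 50 ms apart:
      print(movement_speed((0.00, 0.00), (0.02, 0.01), dt=0.05))  # ~0.447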
  • control unit 36 may change at least one of the position, size, and shape of the detectable range AR so that the user U can perform an appropriate input operation when acquiring information on the input speed.
  • control unit 36 may change the position, size, and shape of the detectable range AR based on the information acquired from the input detection unit 21 or the state detection unit 22.
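  • A minimal sketch of such an adjustment, assuming the detectable range AR is represented as an axis-aligned rectangle and that a device-tilt estimate is available from the state detection unit 22; the threshold and offset values are illustrative.

      def adjust_detectable_range(ar: dict, tilt_deg: float,
                                  max_tilt_deg: float = 20.0) -> dict:
          # Shift and slightly enlarge the detectable range AR when the
          # device is tilted, so the user can still input comfortably.
          adjusted = dict(ar)
          if abs(tilt_deg) > max_tilt_deg:
              shift = 0.02 if tilt_deg > 0 else -0.02  # move AR toward the hand
              adjusted["x_min"] += shift
              adjusted["x_max"] += shift
              adjusted["z_min"] -= 0.01                # enlarge slightly
              adjusted["z_max"] += 0.01
          return adjusted

      ar = {"x_min": -0.1, "x_max": 0.1, "z_min": -0.1, "z_max": 0.1}
      print(adjust_detectable_range(ar, tilt_deg=25.0))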
  • the output control unit 364 of the control unit 36 performs output control regarding feedback to the user U (step S103).
  • For example, the output control unit 364 performs output control regarding feedback to the user U based on the information regarding the input speed.
  • the feedback may be sound or vibration.
  • the output control unit 364 may control the output unit 33 included in the terminal device 30 to provide feedback, or may control the output unit 23 included in the output device 20 to provide feedback.
  • the concept of output control also includes control that causes another device to output a predetermined output via communication.
  • For example, the output control unit 364 may provide feedback according to the position of the hand H of the user U, or according to both the movement speed of the hand H of the user U and its position. Further, the output control unit 364 may provide feedback based on the result of estimating whether or not the gesture will be performed outside the detectable range AR.
  • the gesture detection unit 362 of the control unit 36 determines whether or not the gesture has been detected (step S104). When the gesture is not detected (step S104: No), the gesture detection unit 362 returns the process to step S101.
  • When a gesture is detected (step S104: Yes), the command execution unit 363 of the control unit 36 executes the command corresponding to the detected gesture (step S105).
  • the command execution unit 363 executes the function of the output device 20 associated with the detected gesture.
  • When the command execution is completed, the control unit 36 returns the process to step S101 and executes the processes of steps S101 to S105 again.
  • Since feedback is given to the user U based on the gesture input speed, the user U can improve the gesture input accuracy based on the feedback. As a result, the gesture detection accuracy of the terminal device 30 increases, so that the information processing system 1 can realize high usability. A sketch of the overall loop follows.
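  • The following is a minimal sketch of the command execution loop of steps S101 to S105 as it might run on the terminal device 30; the FakeLink class, the toy detection rule, and the callback names are assumptions standing in for the real communication and detection logic.

      class FakeLink:
          # Stand-in for communication with the output device 20 (assumption).
          def __init__(self, frames):
              self.frames = iter(frames)

          def read_sensor_values(self):  # S101: sensor values via communication
              return next(self.frames, None)

      def detect_gesture(frame):         # S104: toy detection rule
          return frame.get("gesture")

      def command_execution_loop(link, execute, give_feedback):
          # Steps S101-S105 on the terminal device 30 (minimal sketch).
          while True:
              frame = link.read_sensor_values()   # S101
              if frame is None:
                  break                           # no more data in this sketch
              speed = frame.get("speed", 0.0)     # S102: input-speed information
              give_feedback(speed)                # S103: feedback output control
              gesture = detect_gesture(frame)     # S104: gesture detected?
              if gesture is not None:
                  execute(gesture)                # S105: run the mapped command

      link = FakeLink([{"speed": 0.3}, {"speed": 0.6, "gesture": "swipe_forward"}])
      command_execution_loop(link,
                             execute=lambda g: print("command:", g),
                             give_feedback=lambda v: print("feedback for speed", v))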
  • the information processing system 1 can also execute the command execution process of the second embodiment.
  • the command execution process of the information processing system 1 is the same as the command execution process of the second embodiment except that the terminal device 30 acquires the sensor value from the output device 20 via communication.
  • the command execution process will be described with reference to FIG.
  • FIG. 18 is a flowchart showing a command execution process according to the present embodiment.
  • the command execution process is executed by the control unit 36 of the terminal device 30.
  • the command execution process is executed, for example, when the terminal device 30 establishes communication with the output device 20.
  • The acquisition unit 361 of the control unit 36 acquires the sensor values from the input detection unit 21 and the state detection unit 22 of the output device 20 (step S201).
  • the acquisition unit 361 may acquire information about an object in the detection region OR from a non-contact type sensor such as a proximity sensor. Further, the acquisition unit 361 may acquire information regarding contact with the housing surface from a contact-type sensor such as a touch sensor. Further, the acquisition unit 361 may acquire information regarding the state of the output device 20 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
  • the acquisition unit 361 acquires the information of the first moving speed based on the sensor value acquired in step S201 (step S202).
  • the output control unit 364 of the control unit 36 may provide feedback to the user U based on the information of the first movement speed.
  • the gesture detection unit 362 of the control unit 36 determines whether or not the gesture has been detected (step S203). When the gesture is not detected (step S203: No), the gesture detection unit 362 returns the process to step S201.
  • When a gesture is detected (step S203: Yes), the command execution unit 363 of the control unit 36 executes the command corresponding to the detected gesture (step S204). For example, the command execution unit 363 executes the function of the output device 20 associated with the detected gesture.
  • the command execution unit 363 determines whether or not the conditions for accepting continuous input of gestures are satisfied (step S205).
  • the acceptance condition may be based on the estimation result of whether or not the gesture input operation after the second continuous input is performed outside the detectable range.
  • If the reception condition is not satisfied (step S205: No), the gesture detection unit 362 returns the process to step S201.
  • When the reception condition is satisfied (step S205: Yes), the acquisition unit 361 acquires the sensor values from the input detection unit 21 and the state detection unit 22 (step S206). Then, the acquisition unit 361 acquires the information of the second movement speed based on the sensor values acquired in step S206 (step S207).
  • Next, the output control unit 364 performs output control regarding feedback for the second and subsequent continuous-input gestures based on the information of the first movement speed and the information of the second movement speed (step S208). For example, the output control unit 364 controls this feedback output based on the result of comparing the first movement speed with the second movement speed.
  • the output control unit 364 may control the output unit 33 included in the terminal device 30 or the output unit 23 included in the output device 20 when giving feedback.
  • the feedback may be sound or vibration.
  • Next, the gesture detection unit 362 determines whether or not a gesture has been detected (step S209). When no gesture is detected (step S209: No), the gesture detection unit 362 returns the process to step S205. When a gesture is detected (step S209: Yes), the command execution unit 363 of the control unit 36 executes the command corresponding to the detected gesture (step S210).
  • the command execution unit 363 executes the function of the output device 20 associated with the detected gesture.
  • When the command execution is completed, the control unit 36 returns the process to step S205.
  • Since the terminal device 30 performs output control regarding feedback based on the result of comparing the first movement speed with the second movement speed, the user U can perform a desired input with a small number of continuous operations, for example raising the volume in one stroke.
  • the information regarding the input speed of the gesture of the user U is assumed to be the information of the moving speed of the hand H of the user U performing the gesture.
  • the information regarding the input speed is not limited to the information on the movement speed of the hand H of the user U.
  • the information regarding the input speed may be the time required for the user U to input the gesture.
  • For example, the information processing apparatus 10 may acquire, as the information regarding the input speed, information of the time required from the detection of the first operation to the end of the second operation. Then, the information processing apparatus 10 may perform output control regarding feedback to the user U based on this time information. In this case, the information processing apparatus 10 may give feedback immediately after the gesture of the user U is completed (that is, immediately after the second operation ends).
  • the terminal device 30 may acquire information on the time required from the detection of the first operation to the end of the second operation as information on the input speed. Then, the terminal device 30 may perform output control regarding feedback to the user U based on this time information. In this case, the terminal device 30 may give feedback immediately after the gesture of the user U is completed (that is, immediately after the second operation is completed).
  • With this configuration as well, the user U can improve the gesture input accuracy based on the feedback. As a result, the gesture detection accuracy of the information processing apparatus 10 increases, so that high usability can be realized. A sketch of this duration-based variant follows.
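  • A minimal sketch of the duration-based variant, assuming illustrative time bounds for what counts as a well-paced gesture:

      import time

      def gesture_duration_feedback(t_first_detected: float, t_second_ended: float,
                                    min_s: float = 0.1, max_s: float = 0.8) -> str:
          # Derive feedback from the time between detecting the first
          # operation and the end of the second operation.
          duration = t_second_ended - t_first_detected
          if duration < min_s:
              return "too_fast"
          if duration > max_s:
              return "too_slow"
          return "ok"

      start = time.monotonic()
      # ... the user performs the gesture here ...
      end = start + 0.4  # stand-in for a measured timestamp
      print(gesture_duration_feedback(start, end))  # -> ok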
  • In the above description, the feedback provided by the information processing apparatus 10 to the user U is sound or vibration, but the feedback is not limited to these.
  • the feedback may be an output to a display device or an output to a blinking device.
  • the information processing device 10 may output the feedback by itself, or may have another device connected via communication perform the feedback output.
  • Likewise, the feedback provided by the terminal device 30 to the user U is sound or vibration, but the feedback is not limited to these.
  • the feedback may be an output to a display device or an output to a blinking device.
  • the device that outputs the feedback may be the terminal device 30, or the output device 20 connected to the terminal device 30.
  • the terminal device 30 may cause a device other than the terminal device 30 and the output device 20 to perform feedback output.
  • the detectable range AR is divided into two regions, a central region AR1 and an inner edge region AR2.
  • the detectable range AR may be divided into more than two regions.
  • For example, the detectable range AR may be divided into a region located near the center of the detectable range AR and a plurality of regions layered toward the edge of the detectable range AR so as to surround the central region.
  • the information processing apparatus 10 or the terminal apparatus 30 may set a predetermined threshold value Vth for changing the feedback to a different value in each of the plurality of regions including the central region.
  • For example, the information processing apparatus 10 or the terminal device 30 may set the predetermined threshold value Vth to a smaller value in regions closer to the edge. This enables more fine-grained feedback. A sketch of such region-dependent thresholds follows.
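  • The following sketch assigns a speed threshold per region, assuming concentric regions indexed from the center outward by the hand's distance from the center of AR; the boundary and threshold values are illustrative.

      def region_index(r: float, boundaries=(0.03, 0.06, 0.10)) -> int:
          # Map the hand's distance r from the center of AR to a region
          # index: 0 = central region, larger = closer to the edge.
          for i, b in enumerate(boundaries):
              if r <= b:
                  return i
          return len(boundaries)  # outside the outermost boundary

      def speed_threshold(r: float, thresholds=(0.8, 0.6, 0.4, 0.2)) -> float:
          # Predetermined threshold Vth, smaller toward the outer regions.
          return thresholds[region_index(r)]

      # Near the center a faster motion is tolerated than near the edge.
      print(speed_threshold(0.01))  # -> 0.8 (central region)
      print(speed_threshold(0.09))  # -> 0.4 (outer region)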
  • the functions of the information processing apparatus 10 of the first and second embodiments may include a predetermined function accompanied by an operation amount.
  • For example, the predetermined function accompanied by an operation amount may be a function related to volume operation (Vol+/Vol-) or a function related to playback speed.
  • the predetermined function accompanied by the operation amount is not limited to these, and may be, for example, a function related to fast forward, rewind, and slow playback.
  • a gesture with a movement width such as a swipe, may be associated with a predetermined function.
  • Then, the information processing apparatus 10 may determine the operation amount of the predetermined function (for example, the amount by which the volume is raised or lowered) based on the information of the movement speed of the hand H of the user U when the gesture is input.
  • Similarly, the functions of the terminal device 30 of the third embodiment may include a predetermined function accompanied by an operation amount, a gesture with a movement width such as a swipe may be associated with that function, and the terminal device 30 may determine the operation amount (for example, the amount by which the volume is raised or lowered) based on the information of the movement speed of the hand H of the user U when the gesture is input. A sketch of such a speed-to-operation-amount mapping follows.
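  • A minimal sketch, assuming a linear mapping from swipe speed to a volume step clamped to a maximum; the constants are illustrative.

      def volume_step_from_speed(v: float, gain: float = 10.0,
                                 max_step: int = 8) -> int:
          # Map the hand's movement speed during a swipe to a volume step
          # (operation amount); faster swipes change the volume more.
          step = int(round(gain * v))
          return max(1, min(step, max_step))

      # A slow swipe nudges the volume; a fast swipe changes it in one stroke.
      print(volume_step_from_speed(0.1))  # -> 1
      print(volume_step_from_speed(0.6))  # -> 6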
  • the information processing device 10 and the output device 20 are devices that can be worn by the user (wearable devices), but are not necessarily wearable devices.
  • the information processing device 10 and the output device 20 may be devices installed and used in a structure or a moving body such as a television, a car navigation system, a driver's cab, and various operation panels. Further, the information processing device 10 and the output device 20 may be the mobile body itself.
  • the mobile body may be a mobile terminal such as a smartphone, a mobile phone, a personal computer, a music player, or a portable TV, or may be a remote controller for operating a device.
  • The moving body may be a body that moves on land (for example, a vehicle such as a car, bicycle, bus, truck, motorcycle, train, or linear motor car) or a body that moves underground (for example, a subway moving through a tunnel).
  • The moving body may also be a body moving on water (for example, a ship such as a passenger ship, a cargo ship, or a hovercraft) or a body moving underwater (for example, a submersible, a submarine, or an unmanned submersible). Further, the moving body may be a body moving in the atmosphere (for example, an aircraft such as an airplane, an airship, or a drone) or a body moving outside the atmosphere (for example, an artificial celestial body such as a space station).
  • the structure is, for example, a high-rise building, a house, a steel tower, a station facility, an airport facility, a port facility, a stadium, or the like.
  • the concept of structure includes not only buildings but also structures such as tunnels, bridges, dams, walls and iron pillars, and equipment such as cranes, gates and windmills.
  • the concept of a structure includes not only a structure on land or in the ground, but also a structure on water such as a pier and a mega float, and a structure underwater such as an ocean observation facility.
  • the information processing device 10 may be a device installed and used in a structure or a moving body.
  • control device for controlling the information processing device 10, the output device 20, or the terminal device 30 of the present embodiment may be realized by a dedicated computer system or a general-purpose computer system.
  • For example, a communication program for executing the above operations is stored and distributed on a computer-readable recording medium such as an optical disk, a semiconductor memory, a magnetic tape, or a flexible disk. The control device is then configured by installing the program on a computer and executing the above-described processing.
  • the control device may be an information processing device 10, an output device 20, or an external device (for example, a personal computer) of the terminal device 30.
  • the control device may be an internal device (for example, control unit 16, control unit 26, or control unit 36) of the information processing device 10, the output device 20, or the terminal device 30.
  • the above communication program may be stored in a disk device provided in a server device on a network such as the Internet so that it can be downloaded to a computer or the like.
  • The above functions may also be realized by collaboration between the OS (Operating System) and application software. In that case, the part other than the OS may be stored on a medium and distributed, or the part other than the OS may be stored in a server device so that it can be downloaded to a computer or the like.
  • Each component of each device shown in the figures is a functional concept and does not necessarily have to be physically configured as shown. That is, the specific form of distribution and integration of each device is not limited to that shown in the figures, and all or part of each device may be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
  • The present embodiment can also be implemented as any configuration constituting a device or a system, for example, a processor as a system LSI (Large Scale Integration), a module using a plurality of processors, a unit using a plurality of modules, or a set in which other functions are further added to a unit (that is, a configuration of a part of a device).
  • In the present embodiment, a system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • the present embodiment can have a cloud computing configuration in which one function is shared by a plurality of devices via a network and jointly processed.
  • As described above, the information processing apparatus 10 or the terminal device 30 provides feedback to the user U based on the input speed of the gesture of the user U. Therefore, the user U can improve the input accuracy of the gesture based on the feedback. As a result, the gesture detection accuracy of the information processing apparatus 10 or the terminal device 30 increases, so that high usability can be realized.
  • The present technology can also have the following configurations.
(1)
An information processing apparatus comprising:
an acquisition unit that acquires information regarding the input speed of a user's gesture performed in the detection area of a detection unit that detects an object; and
an output control unit that performs output control regarding feedback to the user based on the information regarding the input speed.
(2)
The acquisition unit acquires information of the movement speed of the hand of the user performing the gesture, and
the output control unit performs the output control regarding the feedback based on the information of the movement speed.
The information processing apparatus according to (1) above.
(3)
The output control unit provides the feedback in real time.
(4)
The output control unit changes the feedback to the user according to the movement speed of the user's hand.
(5)
The output control unit changes the feedback depending on whether or not the movement speed exceeds a predetermined threshold value.
(6)
The output control unit provides first feedback as the feedback when the movement speed does not exceed the predetermined threshold value, and provides second feedback different from the first feedback when the movement speed exceeds the predetermined threshold value.
The information processing apparatus according to (5) above.
(7)
The output control unit does not provide the feedback when the movement speed does not exceed the predetermined threshold value, and provides the feedback when the movement speed exceeds the predetermined threshold value.
(8)
The output control unit provides first feedback as the feedback when the movement speed does not exceed a first threshold value as the predetermined threshold value, provides second feedback different from the first feedback when the movement speed exceeds the first threshold value and does not exceed a second threshold value larger than the first threshold value, and provides third feedback different from the second feedback when the movement speed exceeds the second threshold value.
The information processing apparatus according to (5) above.
(9)
The acquisition unit acquires information indicating the position of the hand of the user performing the gesture, and
the output control unit changes the predetermined threshold value according to the position of the user's hand.
The information processing apparatus according to any one of (5) to (8) above.
(10)
The output control unit changes the predetermined threshold value depending on whether the position of the user's hand is in the central region of the detectable range of the gesture or in the inner edge region of the detectable range.
The information processing apparatus according to (9) above.
(11)
A guessing unit is further provided that estimates, based on information of the position and the movement speed of the hand of the user performing the gesture, whether or not the input operation of the user's gesture will be performed outside the detectable range of the gesture, and
the output control unit provides predetermined feedback to the user when it is estimated that the input operation of the user's gesture will be performed outside the detectable range.
The information processing apparatus according to any one of (1) to (10) above.
(12)
The gesture includes a predetermined gesture that can be continuously input,
the acquisition unit acquires information of a first movement speed indicating the movement speed of the user's hand performing the predetermined gesture for the first continuous input, and information of a second movement speed indicating the movement speed of the user's hand performing the predetermined gesture for the second continuous input, and
the output control unit performs output control regarding feedback when the predetermined gesture of the second continuous input is performed, based on the result of comparing the second movement speed with the first movement speed.
The information processing apparatus according to any one of (1) to (11) above.
(13)
When the second movement speed differs from the first movement speed by a predetermined speed or more, the output control unit changes the feedback for the predetermined gesture of the second continuous input from the feedback for the predetermined gesture of the first continuous input.
The information processing apparatus according to (12) above.
(15)
The acquisition unit acquires information indicating the position of the hand of the user performing the gesture, and
the output control unit constantly provides the feedback while the position of the user's hand performing the gesture is within the detectable range of the gesture.
The information processing apparatus according to any one of (1) to (14) above.
(16)
The predetermined device including the detection unit is a device capable of outputting sound that is worn on a portion of the user's body that the user cannot visually recognize, and
the output control unit controls the sound output of the predetermined device as the output control regarding the feedback.
(17)
The output control unit changes the output mode of the sound based on the information regarding the input speed.
(18)
The device that detects the gesture is a headphone or an earphone.
The information processing apparatus according to (16) or (17) above.
(19)
The information processing apparatus is a predetermined device including the detection unit, or a device that controls the predetermined device from the outside of the predetermined device via wire or wirelessly.
The information processing apparatus according to any one of (1) to (18) above.
(20)
An information processing method comprising:
acquiring information regarding the input speed of a user's gesture performed in the detection area of a detection unit that detects an object; and
performing output control regarding feedback to the user based on the information regarding the input speed.
(21)
An information processing program for causing a computer to function as:
an acquisition unit that acquires information regarding the input speed of a user's gesture performed in the detection area of a detection unit that detects an object; and
an output control unit that performs output control regarding feedback to the user based on the information regarding the input speed.
  • 1 Information processing system
10 Information processing apparatus
10A Headphones
10B Earphones
11, 21 Input detection unit
12, 22, 32 State detection unit
13, 23, 33 Output unit
14, 24, 34 Communication unit
15, 35 Storage unit
16, 26, 36 Control unit
20, 20A, 20B Output device
30 Terminal device
31 Input unit
161, 361 Acquisition unit
162, 362 Gesture detection unit
163, 363 Command execution unit
164, 364 Output control unit
165, 365 Guessing unit
OR Detection area
AR Detectable range
AR1 Central region
AR2 Inner edge region
U User
H Hand

Abstract

This information processing device is provided with an acquisition unit which acquires information relating to the input speed of a user's gestures performed in the detection region of a detection unit which detects objects, and an output control unit which, on the basis of the information relating to the input speed, performs output control relating to feedback to the user.

Description

Information processing device and information processing method

The present disclosure relates to an information processing device and an information processing method.

Various methods of operating devices are being studied. For example, in recent years, devices that can be operated by gestures have been developed.

International Publication No. 2009/118183

At present, devices that can be operated by gestures do not necessarily offer high usability. For example, with current devices, even if the user performs an operation by gesture, the user may not know whether the operation was correctly detected by the device, and operating the device may take time.

Therefore, the present disclosure proposes an information processing device and an information processing method capable of realizing a gesture operation function with high usability.

In order to solve the above problems, an information processing device according to one aspect of the present disclosure includes an acquisition unit that acquires information regarding the input speed of a user's gesture performed in the detection area of a detection unit that detects an object, and an output control unit that performs output control regarding feedback to the user based on the information regarding the input speed.
A diagram showing how a user performs a gesture.
A diagram showing how a user performs a gesture.
A diagram for explaining the detectable range of a gesture.
A diagram for explaining a change in feedback based on information of the movement speed of the hand of the user U.
A diagram for explaining a change in feedback based on information of the movement speed of the hand of the user U.
A diagram showing a configuration example of the information processing device according to the embodiment of the present disclosure.
A diagram showing a configuration example of the input detection unit.
A diagram showing the correspondence between the functions of the information processing device and gestures.
A diagram for explaining a pinch operation.
A diagram for explaining a hold operation.
A diagram for explaining a left-right swipe.
A diagram for explaining a left-right swipe.
A diagram for explaining an up-down swipe.
A diagram for explaining an up-down swipe.
A diagram for explaining an operation of holding a hand over the device.
A diagram for explaining an operation of holding a hand over the device.
A diagram for explaining a touch operation.
A diagram for explaining an operation of moving a hand away.
A flowchart showing command execution processing according to the present embodiment.
A diagram for explaining a detectable range divided into a plurality of regions.
A diagram for explaining a change in the feedback threshold based on information of the position of the user's hand.
A diagram for explaining a change in the feedback threshold based on information of the position of the user's hand.
A flowchart showing command execution processing according to the present embodiment.
A diagram showing a configuration example of the information processing system.
A diagram showing a configuration example of the output device according to the embodiment of the present disclosure.
A diagram showing a configuration example of the terminal device according to the embodiment of the present disclosure.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In each of the following embodiments, the same parts are denoted by the same reference numerals, and duplicate description is omitted.

In the present specification and drawings, a plurality of components having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral. For example, a plurality of elements having substantially the same functional configuration are distinguished as output devices 20A and 20B as necessary. However, when it is not necessary to distinguish each of a plurality of elements having substantially the same functional configuration, only the same reference numeral is used. For example, when it is not necessary to distinguish between the output devices 20A and 20B, they are simply referred to as output devices 20.
The present disclosure will be described in the order of the items shown below.
1. Outline of the present disclosure
 1-1. Issues
 1-2. Solution
2. Embodiment 1
 2-1. Configuration of the information processing device
 2-2. Gestures detectable by the information processing device
 2-3. Operation of the information processing device
3. Embodiment 2
 3-1. Operation of the information processing device
4. Embodiment 3
 4-1. Configuration of the information processing system
 4-2. Configuration of the output device
 4-3. Configuration of the terminal device
 4-4. Operation of the information processing system
5. Modifications
6. Conclusion
<<1. Outline of the present disclosure>>

First, an outline of the present disclosure will be described.

<1-1. Issues>

In recent years, technologies that enable device operation by gestures have been developed. For example, techniques have been developed that allow a user to use gestures to perform command input operations (for example, Next/Prev) on devices that the user can wear, such as headphones and earphones.

Many devices capable of gesture detection include a detection unit (for example, a proximity sensor) that can detect an object without contact, and detect the user's gesture based on the detection result (for example, a sensor value) of the detection unit. In the following description, a device capable of gesture detection may be simply referred to as an information processing device.

FIGS. 1 and 2 are diagrams showing how a user performs a gesture. FIG. 1 shows headphones 10A as an example of the information processing device of the present embodiment, and FIG. 2 shows earphones 10B as another example. Device operation by gesture is performed by the user making a predetermined gesture in the space near the information processing device. In the examples of FIGS. 1 and 2, the headphones 10A and the earphones 10B are worn on the user's ears, and the user U makes a gesture for operating the device in the space near his or her ear. In the example of FIG. 1, the user U makes a gesture of moving the hand H from front to back, and in the example of FIG. 2, the user U makes a gesture of moving the hand H from back to front.

In the present embodiment, the motion of moving the hand includes the motion of moving a finger. The word "hand" in the following description can be replaced with "finger" as appropriate.

In the following description, an XYZ coordinate system may be used to facilitate understanding. The X-axis, Y-axis, and Z-axis directions are all determined with reference to the user. The X-axis plus direction is the front direction of the user U, and the X-axis minus direction is the back direction of the user U. The Y-axis plus direction is the left-hand side of the user U, and the Y-axis minus direction is the right-hand side of the user U. The Z-axis plus direction is upward, and the Z-axis minus direction is downward. In the present embodiment, the user U is assumed to be standing or sitting, so that the X-axis and Y-axis are horizontal and the Z-axis is vertical (the direction of gravity). The X-axis, Y-axis, and Z-axis directions in the following description may be read as appropriate according to the user's posture. For example, when the user is lying on his or her back, the X-axis direction is vertical and the Y-axis and Z-axis directions are horizontal.

In the following description, a gesture in which the user U moves the hand H in the horizontal direction (X-axis direction) may be referred to as a left-right swipe, and a gesture in which the user U moves the hand H in the vertical direction (Z-axis direction) may be referred to as an up-down swipe.

As described above, the information processing device detects the user's gesture based on the detection unit's detection result for an object. When the information processing device detects a gesture, it executes the function corresponding to that gesture (for example, Next/Prev or Vol+/Vol-).

While device operation by gesture has the advantage that the user can operate the device casually, it has the problem that it is difficult for the user to confirm whether a gesture operation succeeded.

For example, suppose the information processing device is a device worn on the user's ear, such as the headphones 10A or the earphones 10B. In this case, as shown in FIGS. 1 and 2, operations on the information processing device are performed at a location the user cannot see, so it is difficult for the user to grasp whether his or her hand is located within the gesture detectable range. In particular, when the gesture is performed in the air, the user is not touching the information processing device and therefore has difficulty grasping at which position in the air the gesture is being performed. As a result, it is difficult for the user to grasp whether the gesture being performed is being executed as intended.

Further, if the motion of the user's gesture is extremely fast or extremely slow, the information processing device cannot correctly detect the gesture. In this case as well, it is difficult for the user to grasp whether his or her motion is too fast or too slow, and thus whether the gesture being performed is being executed as intended.

Consequently, even if the user makes a gesture, the gesture may not be detected by the information processing device. A gesture different from the one the user intended may also be detected, causing a function different from the intended one to be executed. In such cases, the user notices that the gesture was not correctly detected only some time after making the gesture, or when a function other than the intended one is executed.
<1-2. Solution>

Therefore, in the present embodiment, the above problems are solved by the means described below.

Before outlining the solution, the detectable range of a gesture will be described. FIG. 3 is a diagram for explaining the gesture detectable range. In the example of FIG. 3, when the user U is viewed from the Y-axis plus direction, the detection region OR of the detection unit is formed in a certain range around the information processing device 10. The information processing device 10 may be the headphones 10A or the earphones 10B.

In the example of FIG. 3, a partial range of the detection region OR is the gesture detectable range AR. Specifically, the detectable range AR is a quadrangular range within the detection region OR. The shape of the detectable range AR is not limited to a quadrangle when viewed from the Y-axis direction; it may be, for example, a circle or an ellipse. Although the detectable range AR is a partial range of the detection region OR in the example of FIG. 3, the detectable range AR may be the entire detection region OR. The detectable range AR may also be variable. For example, at least one of the position, size, and shape of the detectable range AR may be changed based on information from a sensor included in the information processing device 10.

Hereinafter, an outline of the solution will be given, taking the detectable range AR shown in FIG. 3 as an example.

First, the information processing device 10 acquires information indicating the position of the hand of the user U performing the gesture. The information processing device 10 then provides feedback constantly while the hand of the user U is within the detectable range AR. This feedback may be sound output from the earphones or headphones. When the hand of the user U is not within the detectable range AR, the information processing device 10 need not provide feedback. This lets the user U know whether his or her hand is within the detectable range AR. A minimal sketch of this in-range gating follows.
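The following is a minimal sketch of gating the constant feedback on whether the hand is inside the detectable range AR; the rectangular representation of AR and the tone name are illustrative assumptions, not taken from the disclosure.

    from typing import Optional

    def in_range_feedback(x: float, z: float, ar: dict) -> Optional[str]:
        # Emit a continuous feedback tone only while the hand is inside
        # the detectable range AR (ar: dict with x_min/x_max/z_min/z_max).
        inside = ar["x_min"] <= x <= ar["x_max"] and ar["z_min"] <= z <= ar["z_max"]
        return "in_range_tone" if inside else None  # None: no feedback output

    ar = {"x_min": -0.1, "x_max": 0.1, "z_min": -0.1, "z_max": 0.1}
    print(in_range_feedback(0.05, 0.0, ar))  # -> in_range_tone
    print(in_range_feedback(0.20, 0.0, ar))  # -> None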
 また、情報処理装置10は、検出領域ORで行われるユーザのジェスチャの入力速度に関する情報を取得してもよい。そして、情報処理装置10は、入力速度に関する情報に基づいて、ユーザUへのフィードバックに関する出力制御を行う。例えば、情報処理装置10は、ユーザUの手の移動速度の情報を取得し、移動速度の情報に基づいて、フィードバックに関する出力制御を行う。 Further, the information processing apparatus 10 may acquire information regarding the input speed of the user's gesture performed in the detection area OR. Then, the information processing apparatus 10 performs output control regarding feedback to the user U based on the information regarding the input speed. For example, the information processing apparatus 10 acquires information on the movement speed of the hand of the user U, and performs output control regarding feedback based on the information on the movement speed.
 図4A及び図4Bは、ユーザUの手の移動速度の情報に基づくフィードバックの変化を説明するための図である。図4Aの例では、ユーザUの手Hは移動速度V11で移動しており、図4Bの例では、ユーザUの手Hは移動速度V11より速い移動速度V12で移動している。 4A and 4B are diagrams for explaining changes in feedback based on information on the movement speed of the user U's hand. In the example of FIG. 4A, the hand H of the user U is moving at the moving speed V 11 , and in the example of FIG. 4B, the hand H of the user U is moving at the moving speed V 12 faster than the moving speed V 11 .
 図4A及び図4Bの例では、情報処理装置10は、フィードバックを常時かつリアルタイムに行うよう構成されている。ここで、リアルタイムのフィードバックとは、遅延の無いフィードバック或いは一定の範囲の遅延でのフィードバックを意味する。そして、情報処理装置10は、ユーザUの手Hの移動速度に応じて、ユーザUへのフィードバックを変化させる。例えば、情報処理装置10は、ユーザUの手Hの移動速度が所定の閾値(以下、閾値Vthという。)を超えているか否かでフィードバックを変化させる。 In the examples of FIGS. 4A and 4B, the information processing apparatus 10 is configured to provide feedback at all times and in real time. Here, the real-time feedback means feedback without delay or feedback with a certain range of delay. Then, the information processing apparatus 10 changes the feedback to the user U according to the moving speed of the hand H of the user U. For example, the information processing apparatus 10 changes the feedback depending on whether or not the moving speed of the hand H of the user U exceeds a predetermined threshold value (hereinafter, referred to as a threshold value Vth).
 例えば、閾値Vthが、図4Aに示す移動速度V11より大きく、図4Bに示す移動速度V12より小さいとする。図4Aの例では、ユーザUの手Hは、閾値Vthより遅い移動速度V11で移動しているので、情報処理装置10は、第1のフィードバックを行う。第1のフィードバックは、ユーザにジェスチャが正常に入力されていることを知らせるためのフィードバックであってもよい。また、第1のフィードバックは、ゼロフィードバック(Zero-feedback)であってもよい。 For example, it is assumed that the threshold value V th is larger than the moving speed V 11 shown in FIG. 4A and smaller than the moving speed V 12 shown in FIG. 4B. In the example of FIG. 4A, since the hand H of the user U is moving at a moving speed V 11 slower than the threshold value V th , the information processing apparatus 10 provides the first feedback. The first feedback may be feedback to inform the user that the gesture has been input normally. Further, the first feedback may be zero feedback.
 ここで、ゼロフィードバックとは、出力の無いフィードバック、すなわち、情報処理装置10がフィードバックとしての出力を行わないことを意味する。本実施形態において、出力を行わないことも、それが、例えば、ユーザにジェスチャが正常に入力されていること又は正常に入力されていないことを知らせているのであれば、フィードバックの一種とみなすことができる。 Here, zero feedback means feedback without output, that is, the information processing apparatus 10 does not output as feedback. In the present embodiment, not outputting is also regarded as a kind of feedback if it informs the user that the gesture is input normally or is not input normally, for example. Can be done.
 一方、図4Bの例では、ユーザUの手Hは、閾値Vthより速い移動速度V12で移動しているので、情報処理装置10は、第1のフィードバックとは異なる第2のフィードバックを行う。第2のフィードバックは、このとき、第2のフィードバックは、第1のフィードバックよりピッチ(周波数)の高い音であってもよい。例えば、第2のフィードバックは、第1のフィードバックより周波数の高い音であってもよい。第1のフィードバックがゼロフィードバックでないのであれば、第2のフィードバックは、ゼロフィードバックであってもよい。 On the other hand, in the example of FIG. 4B, since the hand H of the user U is moving at a movement speed V 12 faster than the threshold value V th , the information processing apparatus 10 provides a second feedback different from the first feedback. .. The second feedback is, at this time, the second feedback may be a sound having a higher pitch (frequency) than the first feedback. For example, the second feedback may be a sound with a higher frequency than the first feedback. If the first feedback is not zero feedback, the second feedback may be zero feedback.
 なお、情報処理装置10は、必ずしも、常時、フィードバックを行う必要はない。例えば、情報処理装置10は、ユーザの手Hの移動速度が閾値Vthを超えていない場合には、ユーザへのフィードバックを行わなくてもよい。そして、情報処理装置10は、移動速度が閾値Vthを超えている場合に、フィードバックを行うようにしてもよい。これにより、ユーザUにジェスチャの入力速度が速いことを知らせることができる。 It should be noted that the information processing apparatus 10 does not necessarily have to always provide feedback. For example, the information processing apparatus 10 does not have to give feedback to the user when the moving speed of the user's hand H does not exceed the threshold value Vth. Then, the information processing apparatus 10 may provide feedback when the moving speed exceeds the threshold value Vth. This makes it possible to inform the user U that the gesture input speed is high.
The number of threshold values is not limited to one; there may be a plurality of them. In this case, in addition to the first feedback and the second feedback, there may be further feedbacks (for example, a third feedback, a fourth feedback, and so on). Of course, the information processing apparatus 10 may change the feedback continuously according to the moving speed of the hand H of the user U. For example, the frequency of the sound may be changed gradually as the moving speed of the hand H of the user U changes.
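As a purely illustrative sketch (not part of the disclosed embodiment), the single-threshold selection and the continuous variant described above could look as follows; the function names, units, and constant values are hypothetical assumptions.

```python
# A minimal sketch of speed-dependent feedback selection, assuming a
# single threshold Vth and a tone whose pitch encodes the feedback.
# All names and constant values are hypothetical.

from typing import Optional

V_TH = 0.5  # predetermined threshold Vth (assumed units: m/s)

def feedback_for_speed(speed: float) -> Optional[dict]:
    """Return feedback parameters for the current hand speed.
    At or below Vth -> first feedback (a lower-pitched tone);
    above Vth -> second feedback (a higher-pitched tone).
    Returning None instead would model zero feedback (no output)."""
    if speed <= V_TH:
        return {"kind": "tone", "pitch_hz": 440.0}  # first feedback
    return {"kind": "tone", "pitch_hz": 880.0}      # second feedback

def continuous_pitch(speed: float, v_min: float = 0.0,
                     v_max: float = 2.0) -> float:
    """Continuous variant: change the pitch gradually with speed."""
    ratio = min(max((speed - v_min) / (v_max - v_min), 0.0), 1.0)
    return 440.0 + ratio * (880.0 - 440.0)
```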
According to this solution, since the information processing apparatus 10 provides feedback according to the position of the hand H of the user U, even if the detectable range AR is at a position that the user U cannot see, the user U can learn the appropriate position for performing a gesture (the position of the detectable range AR) while performing the gesture. Further, since the information processing apparatus 10 changes the feedback according to the moving speed of the hand H of the user U, the user U can learn the appropriate input speed of a gesture while performing it. As a result, the user comes to perform gestures at an appropriate position or speed, so the gesture detection accuracy of the information processing apparatus 10 increases. Consequently, the user can use the information processing apparatus 10 without feeling much stress, and the usability of the information processing apparatus 10 improves.
<< 2. Embodiment 1 >>
The outline of the present embodiment has been described above. Hereinafter, the information processing apparatus 10 according to the first embodiment will be described.
<2-1. Information processing device configuration>
First, the configuration of the information processing apparatus 10 will be described.
The information processing apparatus 10 is an information processing terminal that includes a detection unit for detecting an object and executes various functions based on the user's input operations. For example, the information processing apparatus 10 is a device capable of outputting sound that is worn on a part of the body of the user U that the user U cannot see. For example, the information processing apparatus 10 is an acoustic output device wearable by the user, such as earphones or headphones. The information processing apparatus 10 may also be a display device wearable by the user, such as AR (Augmented Reality) glasses or MR (Mixed Reality) glasses.
In the present embodiment, the information processing apparatus 10 is assumed to be a device wearable by the user U and to have a predetermined portion that is located at least at the ear of the user U when worn. If the information processing apparatus 10 is an earphone or a headphone, the predetermined portion is the speaker and the housing that stores it. At least a detection unit for detecting an object is arranged at this predetermined portion. The information processing apparatus 10 is configured to be able to detect the user's gestures and executes functions such as music playback based on the user's gestures.
FIG. 5 is a diagram showing a configuration example of the information processing apparatus 10 according to the embodiment of the present disclosure. The information processing apparatus 10 includes an input detection unit 11, a state detection unit 12, an output unit 13, a communication unit 14, a storage unit 15, and a control unit 16. Note that the configuration shown in FIG. 5 is a functional configuration, and the hardware configuration may differ from it. The functions of the information processing apparatus 10 may also be distributed and implemented across a plurality of physically separated components.
The input detection unit 11 is a detection unit that detects the user's input operations. For example, the input detection unit 11 is a non-contact type detection unit that detects an object without contact. The input detection unit 11 includes one or more proximity sensors (also referred to as non-contact sensors) and detects an object located within the detection area. The proximity sensor may be an optical sensor or a capacitive sensor. An optical proximity sensor detects light reflected by an object, while a capacitive proximity sensor detects a change in the capacitance that arises between the object and the sensor.
When the input detection unit 11 is of the non-contact type, it may be a 2D sensor or a 3D sensor. When the input detection unit 11 is a 2D sensor, the detection area it forms is two-dimensional; when it is a 3D sensor, the detection area it forms is three-dimensional.
The input detection unit 11 is not limited to the non-contact type; it may be a contact type detection unit. In this case, the input detection unit 11 includes one or more touch sensors and detects an object that touches a predetermined location on the information processing apparatus 10. If the information processing apparatus 10 is a headphone type device, the predetermined location is, for example, the side surface of the headphones (the surface opposite the speaker surface). If the information processing apparatus 10 is an earphone type device, the predetermined location is, for example, the side surface of the earphone (the surface opposite the speaker surface). In the following description, the side surface of the headphones or earphones may be referred to as the housing surface.
FIG. 6 is a diagram showing a configuration example of the input detection unit 11. FIG. 6 shows the headphone type information processing apparatus 10 viewed from the side surface of the headphones. The configuration shown in FIG. 6 is merely an example, and the configuration of the input detection unit 11 is not limited to it.
In the example of FIG. 6, the input detection unit 11 includes, as non-contact sensors, an infrared emitter ID and four photosensors PD1, PD2, PD3, and PD4. The infrared emitter ID is arranged at the center of the housing surface, and the four photosensors PD1, PD2, PD3, and PD4 are arranged at equal intervals in the circumferential direction so as to surround the infrared emitter ID. The input detection unit 11 detects an object by detecting, with the four photosensors PD1, PD2, PD3, and PD4, the infrared light emitted from the infrared emitter ID and reflected by the surface of the object.
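As an illustrative sketch only, a hand position over the housing surface might be estimated from the relative intensities at the four photosensors; the sensor layout vectors and the intensity-weighted centroid model below are assumptions for illustration, not the disclosed method.

```python
# A rough sketch of estimating a 2D hand position from the four
# photosensors PD1..PD4 surrounding the infrared emitter. The assumed
# layout: x points toward the front of the head, z points up.

SENSOR_DIRS = {
    "PD1": (0.0, 1.0),    # up
    "PD2": (1.0, 0.0),    # front
    "PD3": (0.0, -1.0),   # down
    "PD4": (-1.0, 0.0),   # back
}

def estimate_position(intensities: dict) -> tuple:
    """Estimate the hand position as the intensity-weighted centroid of
    the sensor directions; stronger reflection pulls the estimate
    toward that sensor."""
    total = sum(intensities.values())
    if total == 0:
        return (0.0, 0.0)  # no object detected
    x = sum(SENSOR_DIRS[k][0] * v for k, v in intensities.items()) / total
    z = sum(SENSOR_DIRS[k][1] * v for k, v in intensities.items()) / total
    return (x, z)

# Example: reflection strongest at the front sensor -> hand toward front.
print(estimate_position({"PD1": 0.1, "PD2": 0.7, "PD3": 0.1, "PD4": 0.1}))
```

Tracking this estimate over successive readings would also yield the movement direction and speed used for swipe detection.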
The input detection unit 11 also includes a touch pad TP as a contact type sensor. The touch pad TP is composed of a plurality of touch sensors arranged in a plane and is disposed on the housing surface. The input detection unit 11 detects an object that contacts the surface of the touch pad TP.
The state detection unit 12 is a sensor unit that performs detection related to the state of the information processing apparatus 10, and is composed of one or more sensors for that purpose. The state detection unit 12 includes, for example, one or more acceleration sensors. It may also include one or more gyro sensors, one or more geomagnetic sensors, or a motion sensor that combines these plural types of sensors.
The state detection unit 12 performs detection related to the state of the information processing apparatus 10 based on the detection results of these sensors. For example, the state detection unit 12 detects the direction of gravity based on their sensor values. Since gravitational acceleration is always applied to the information processing apparatus 10, the direction of gravity can be detected by averaging the directions of acceleration detected by the acceleration sensor over a certain period of time.
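The averaging just described might be sketched as follows; the sample format and window handling are assumptions made for illustration.

```python
# A minimal sketch of the gravity-direction estimate: average the
# accelerometer readings over a window and normalize. Motion-induced
# accelerations tend to average out, leaving the constant gravity
# component.

import math

def gravity_direction(accel_samples):
    """accel_samples: iterable of (ax, ay, az) readings collected over
    a fixed period. Returns a unit vector toward gravity, or None if
    no estimate is possible."""
    n = 0
    sx = sy = sz = 0.0
    for ax, ay, az in accel_samples:
        sx, sy, sz = sx + ax, sy + ay, sz + az
        n += 1
    norm = math.sqrt(sx * sx + sy * sy + sz * sz)
    if n == 0 or norm == 0.0:
        return None  # not enough data to estimate
    return (sx / norm, sy / norm, sz / norm)
```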
The output unit 13 is an output interface that outputs information to the user. For example, the output unit 13 may be an acoustic device such as a speaker or a buzzer, or a vibration device such as a vibration motor. The output unit 13 may also be a display device such as a liquid crystal display or an organic EL (Electroluminescence) display, or a lighting device such as an LED (Light Emitting Diode) lamp. The output unit 13 functions as the output means of the information processing apparatus 10 and outputs various information to the user under the control of the control unit 16.
The communication unit 14 is a communication interface for communicating with other devices. It may be a network interface or a device connection interface, and may be a wired or wireless interface. The communication unit 14 functions as the communication means of the information processing apparatus 10 and communicates with other devices under the control of the control unit 16. The other devices are, for example, terminal devices such as music players and smartphones.
The storage unit 15 is a storage device capable of reading and writing data, such as a DRAM (Dynamic Random Access Memory), an SRAM (Static Random Access Memory), a flash memory, or a hard disk. The storage unit 15 functions as the storage means of the information processing apparatus 10 and stores, for example, the settings related to the detectable range AR of gestures.
The control unit 16 is a controller that controls each unit of the information processing apparatus 10. The control unit 16 is realized by, for example, a processor such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit). For example, the control unit 16 is realized by the processor executing various programs stored in a storage device inside the information processing apparatus 10, using a RAM (Random Access Memory) or the like as a work area. The control unit 16 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array). CPUs, MPUs, ASICs, and FPGAs can all be regarded as controllers.
As shown in FIG. 5, the control unit 16 includes an acquisition unit 161, a gesture detection unit 162, a command execution unit 163, an output control unit 164, and an estimation unit 165. Each of the blocks constituting the control unit 16 (the acquisition unit 161 through the estimation unit 165) is a functional block indicating a function of the control unit 16. These functional blocks may be software blocks or hardware blocks. For example, each functional block may be one software module realized by software (including microprograms) or one circuit block on a semiconductor chip (die). Of course, each functional block may be one processor or one integrated circuit. The method of configuring the functional blocks is arbitrary.
The control unit 16 may be configured in functional units different from the above-mentioned functional blocks. In addition, some or all of the operations of the blocks constituting the control unit 16 (the acquisition unit 161 through the estimation unit 165) may be performed by another device. For example, a terminal device such as a music player or a smartphone may perform some or all of the operations of the blocks constituting the control unit 16. The operation of each block constituting the control unit 16 will be described later.
<2-2. Gestures that can be detected by the information processing apparatus>
The configuration of the information processing apparatus 10 has been described above. Before describing the operation of the information processing apparatus 10 according to the present embodiment in detail, the gestures that the information processing apparatus 10 can detect will be described.
FIG. 7 is a diagram showing the correspondence between the functions of the information processing apparatus 10 and gestures. The information processing apparatus 10 has functions related to answering telephone calls in addition to functions related to music playback. For example, the information processing apparatus 10 has the functions shown in (1) to (7) below. Note that "function" can be rephrased as "command".
(1) Play/Pause
(2) Answer Call/End
(3) Next/Prev
(4) Vol+/Vol-
(5) Cancel
(6) Quick Attention
(7) Ambient Control
Hereinafter, the functions (1) to (7) and the gestures associated with each of them will be described. Note that the user's motions constituting one gesture may include a first motion that triggers the start of the gesture and a second motion that follows the first motion. The first motion is, for example, a motion of swinging up the hand H of the user U, and the second motion is, for example, a motion such as a swipe following the first motion. It is also possible to regard the first motion and the second motion each as one gesture.
The functions and gestures described here are merely examples. The functions and gestures of the information processing apparatus 10 are not limited to those shown below, nor are the first and second motions limited to the motions shown below. The combinations of functions and gestures are also not limited to those shown below.
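Purely as an editorial illustration of this two-stage structure (not the disclosed implementation), a detector could gate the second motion behind the triggering first motion; all event names below are hypothetical.

```python
# A schematic two-stage detector for the first-motion/second-motion
# structure: the second motion is only accepted after the triggering
# first motion (swing-up) has been seen.

class TwoStageGestureDetector:
    def __init__(self):
        self.armed = False  # True after the first motion (swing-up)

    def on_event(self, event: str):
        """Feed primitive motion events; return a completed gesture
        name, or None while the gesture is still incomplete."""
        if not self.armed:
            if event == "swing_up":   # first motion: trigger
                self.armed = True
            return None
        self.armed = False            # consume the trigger
        if event in ("pinch", "hold", "swipe_left", "swipe_right",
                     "swipe_up", "swipe_down", "palm_over", "touch",
                     "move_away"):
            return event              # second motion completes a gesture
        return None

detector = TwoStageGestureDetector()
assert detector.on_event("pinch") is None     # no trigger yet
detector.on_event("swing_up")
assert detector.on_event("pinch") == "pinch"  # e.g. the Play/Pause gesture
```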
In the following description, the information processing apparatus 10 is assumed to be a headphone type device worn on the user's ear, but it is not limited to a headphone type device. For example, the information processing apparatus 10 may be an earphone type device. The information processing apparatus 10 may also have a predetermined portion worn on the user's ear and other portions. Further, although the information processing apparatus 10 is assumed to be worn on the user's ear in the following description, the device worn on the user's ear may be a predetermined device other than the information processing apparatus 10. The predetermined device is, for example, an output device (for example, headphones or earphones) connected to the information processing apparatus 10 wirelessly or by wire. If the information processing apparatus 10 itself is worn on the user's ear, the predetermined device is the information processing apparatus 10 itself. "Predetermined device" can be paraphrased as "predetermined equipment".
Hereinafter, the functions of the information processing apparatus 10 and the gestures associated with them will be described with reference to FIG. 7.
(1) Play/Pause
"Play" is a function for playing a song, and "Pause" is a function for pausing playback of the song. "Play/Pause" is associated with the swing-up motion of the hand H as the first motion and with a pinch motion as the second motion. FIG. 8 is a diagram for explaining the pinch motion. As shown in FIG. 8, the pinch motion is a motion of picking up an object with the fingers. Note that the information processing apparatus 10 can detect that the user U has performed a pinch motion even if the user U is not actually picking up an object (for example, the information processing apparatus 10).
(2) Answer Call/End
"Answer Call" is a function for answering an incoming call, and "End" is a function for ending the call. "Answer Call/End" is associated with the swing-up motion of the hand H as the first motion and with a pinch motion or a hold motion as the second motion. FIG. 9 is a diagram for explaining the hold motion. As shown in FIG. 9, the hold motion is a motion of grasping an object with the hand H. Note that the information processing apparatus 10 can detect that the user U has performed a hold motion even if the user U is not actually grasping an object (for example, the information processing apparatus 10).
(3) Next/Prev
"Next" is a function for cueing the next song, and "Prev" is a function for cueing the previous song or the song being played. "Next/Prev" is associated with the swing-up motion of the hand H as the first motion and with a left or right swipe as the second motion. Here, "Next" is associated with the left swipe (forward swipe) as the second motion, and "Prev" is associated with the right swipe (backward swipe) as the second motion. Note that "Next/Prev" may also be associated with a gesture in which the second motion is performed without the first motion.
FIGS. 10A and 10B are diagrams for explaining the left and right swipes. FIG. 10A shows the left swipe, and FIG. 10B shows the right swipe. Unlike the pinch motion and the hold motion, the left and right swipes are motions (gestures) involving a movement width. As shown in FIG. 10A, the left swipe is a motion in which the user U slides the hand H forward (in the X-axis plus direction) by a predetermined movement width W1. As shown in FIG. 10B, the right swipe is a motion in which the user U slides the hand H backward (in the X-axis minus direction) by a predetermined movement width W2.
Note that FIGS. 10A and 10B both show the user U performing a gesture near the left ear. Unlike the examples of FIGS. 10A and 10B, when the user U performs a gesture near the right ear, care is required because the right swipe becomes the motion of sliding the hand H forward and the left swipe becomes the motion of sliding the hand H backward. In this case, "Next" is associated with, for example, the right swipe (forward swipe) as the second motion, and "Prev" is associated with, for example, the left swipe (backward swipe) as the second motion.
(4) Vol+/Vol-
"Vol+" is a function for raising the volume, and "Vol-" is a function for lowering the volume. "Vol+/Vol-" is associated with the swing-up motion of the hand H as the first motion and with an up or down swipe as the second motion. Here, "Vol+" is associated with the up swipe as the second motion, and "Vol-" is associated with the down swipe as the second motion. Note that "Vol+/Vol-" may also be associated with a gesture in which the second motion is performed without the first motion.
FIGS. 11A and 11B are diagrams for explaining the up and down swipes. FIG. 11A shows the up swipe, and FIG. 11B shows the down swipe. Unlike the pinch motion and the hold motion, the up and down swipes are motions (gestures) involving a movement width. As shown in FIG. 11A, the up swipe is a motion in which the user U slides the hand H upward (in the Z-axis plus direction) by a predetermined movement width W3. As shown in FIG. 11B, the down swipe is a motion in which the user U slides the hand H downward (in the Z-axis minus direction) by a predetermined movement width W4.
Note that "Vol+/Vol-" is a function for adjusting the volume. Therefore, to execute this function, it is desirable that the information processing apparatus 10 acquires from the user U not only information on whether to raise or lower the volume but also information on the amount by which to raise or lower it. Thus, in the present embodiment, "Vol+/Vol-" is assumed to be a function accompanied by an operation amount (for this function, the amount by which the volume is raised or lowered). In this function, for example, the movement amount or the movement speed of the hand H during the up or down swipe is associated with the operation amount of "Vol+/Vol-". Note that by configuring the information processing apparatus 10 so that one input operation raises or lowers the volume by a fixed amount, this function can also be made a function without an operation amount.
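As an illustration of associating the operation amount with the swipe, a minimal sketch follows; the gain and clamping constants, and the choice of speed rather than distance as the driving quantity, are hypothetical assumptions.

```python
# A sketch of tying the operation amount of "Vol+/Vol-" to the swipe:
# a faster swipe produces a larger volume step, clamped to max_step.

def volume_delta(direction: str, swipe_speed: float,
                 gain: float = 10.0, max_step: float = 20.0) -> float:
    """Map an up/down swipe and its speed to a signed volume change."""
    step = min(swipe_speed * gain, max_step)
    return step if direction == "up" else -step

# A fixed-step variant makes the function free of an operation amount,
# as the text notes:
def fixed_volume_delta(direction: str, step: float = 5.0) -> float:
    return step if direction == "up" else -step
```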
(5) Cancel
"Cancel" is a function for canceling an operation performed by the user U. "Cancel" is associated with the swing-up motion of the hand H as the first motion and with a motion of holding the hand over the apparatus as the second motion. FIGS. 12A and 12B are diagrams for explaining the motion of holding the hand H over the apparatus. FIG. 12A shows the user U viewed from the left side, and FIG. 12B shows the user U viewed from the front. As shown in FIGS. 12A and 12B, this motion is one in which the user U holds the open hand H toward the information processing apparatus 10.
(6) Quick Attention
"Quick Attention" is a function that allows the user U to quickly hear surrounding sounds. Specifically, "Quick Attention" quickly lowers the volume of the sound being output (for example, music, call voice, or a ringtone) to make surrounding sounds easier to hear. "Quick Attention" is associated with the swing-up motion of the hand H as the first motion and with a touch motion as the second motion. FIG. 13 is a diagram for explaining the touch motion. As shown in FIG. 13, the touch motion is a motion in which the user U touches a part of the information processing apparatus 10 (for example, the housing surface).
(7) Ambient Control
"Ambient Control" is a function that allows the user U to listen to music while checking surrounding sounds. Specifically, "Ambient Control" captures external sounds during music playback. "Ambient Control" is associated with the swing-up motion of the hand H as the first motion and with a motion of moving the hand H away as the second motion. FIG. 14 is a diagram for explaining the motion of moving the hand H away. Unlike the pinch motion, the hold motion, the touch motion, and the motion of holding the hand over the apparatus, the motion of moving the hand H away is a motion (gesture) involving a movement width. As shown in FIG. 14, the motion of moving the hand H away is a motion in which the user U holds the hand H over the information processing apparatus 10 and then moves the hand H in a predetermined separation direction (the Y-axis plus direction in the example of FIG. 14) by a predetermined movement width W4.
<2-3. Information processing device operation>
The gestures that the information processing apparatus 10 can detect have been described above. Next, the operation of the information processing apparatus 10 that detects such gestures will be described.
FIG. 15 is a flowchart showing the command execution process according to the present embodiment. The command execution process is a process in which the information processing apparatus 10 executes a command input by the user U using a gesture. Here, a "command" is an instruction from inside or outside the apparatus for causing the information processing apparatus 10 to execute a function. A "command" can also be regarded as a word referring to a function itself of the information processing apparatus 10, such as Play/Pause. The word "command" appearing in the following description can be replaced with "function".
The command execution process is executed by the control unit 16 of the information processing apparatus 10 and is started, for example, when the information processing apparatus 10 is powered on.
First, the acquisition unit 161 of the control unit 16 acquires sensor values from the input detection unit 11 and the state detection unit 12 (step S101). For example, the acquisition unit 161 acquires information about an object in the detection area OR from a non-contact type sensor such as a proximity sensor, and acquires information about contact with the housing surface from a contact type sensor such as a touch sensor. The acquisition unit 161 may also acquire information about the state of the information processing apparatus 10 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
Subsequently, based on the sensor values acquired in step S101, the acquisition unit 161 acquires information on the input speed of the gesture of the user U performed in the detection area OR (step S102). For example, the acquisition unit 161 acquires information on the movement speed of the hand of the user U within the detection area OR based on the sensor values from the input detection unit 11. In addition to the information on the input speed, the acquisition unit 161 may acquire information indicating the position, within the detection area OR, of the hand H of the user U performing the gesture.
Note that, in acquiring the information on the input speed, the control unit 16 may change at least one of the position, size, and shape of the detectable range AR so that the user U can perform an appropriate input operation. For example, the control unit 16 may change the position, size, and shape of the detectable range AR based on information acquired from the input detection unit 11 or the state detection unit 12.
Next, the output control unit 164 of the control unit 16 performs output control related to feedback to the user U (step S103). The output control performed by the output control unit 164 is roughly classified into the following (1) to (4).
(1) Feedback based on speed information
(2) Feedback based on position information
(3) Feedback based on speed information and position information
(4) Feedback based on estimation results
Hereinafter, (1) to (4) will each be described. Note that the output control unit 164 can execute a plurality of processes selected from the output control processes shown in (1) to (4) below, in parallel or in combination.
(1) Feedback based on speed information
The output control unit 164 performs output control related to feedback to the user U based on the information on the input speed. For example, the output control unit 164 performs output control related to feedback to the user U based on the information on the movement speed of the hand H of the user U. The feedback may be sound or vibration. The output control unit 164 may also provide feedback in real time while the user U is performing the gesture. For example, the output control unit 164 may provide feedback corresponding to the movement speed of the hand of the user U without delay after acquiring the movement speed information.
When the feedback is provided in real time, the output control unit 164 may change the feedback to the user U according to the movement speed of the hand H of the user U. For example, the information processing apparatus 10 may change the feedback depending on whether or not the movement speed of the hand H of the user U exceeds the predetermined threshold value Vth. For example, when the hand H of the user U is moving at a speed slower than the predetermined threshold value Vth, the output control unit 164 provides the first feedback. On the other hand, when the hand H of the user U is moving at a speed faster than the predetermined threshold value Vth, the output control unit 164 provides the second feedback, which differs from the first feedback. In this way, the output control unit 164 can inform the user U of an appropriate movement speed of the hand H.
Note that either the first feedback or the second feedback may be zero feedback. For example, the output control unit 164 may provide no feedback when the user's hand H is moving at a speed slower than the predetermined threshold value Vth, and may provide feedback when the user's hand H is moving at a speed faster than the predetermined threshold value Vth.
There may also be a plurality of threshold values for changing the feedback. For example, as threshold values for changing the feedback, there may be a first threshold value Vth11 and a second threshold value Vth12 larger than the first threshold value Vth11. In the present embodiment, as an example, a speed between the first threshold value Vth11 and the second threshold value Vth12 is assumed to be the appropriate movement speed of the hand H of the user U assumed by the device developer. Either the first threshold value Vth11 or the second threshold value Vth12 may be regarded as the above-mentioned predetermined threshold value Vth.
In this case, the output control unit 164 provides the first feedback when the movement speed of the hand H of the user U does not exceed the first threshold value Vth11. When the movement speed of the hand H of the user U exceeds the first threshold value Vth11 but does not exceed the second threshold value Vth12, the output control unit 164 provides the second feedback, which differs from the first feedback. When the movement speed of the hand H of the user U exceeds the second threshold value Vth12, the output control unit 164 provides a third feedback, which differs from the second feedback. The first feedback and the third feedback may be different or the same. In this way, the output control unit 164 can inform the user U of an appropriate movement speed of the hand H that is neither too fast nor too slow.
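A sketch of this two-threshold selection follows; the numeric values of Vth11 and Vth12 are hypothetical.

```python
# A sketch of the two-threshold scheme: speeds between Vth11 and Vth12
# are treated as the appropriate band. Threshold values are assumed.

V_TH11 = 0.3  # first threshold (assumed units)
V_TH12 = 1.0  # second threshold, larger than the first

def select_feedback(speed: float) -> str:
    if speed <= V_TH11:
        return "first feedback"    # below the appropriate speed band
    if speed <= V_TH12:
        return "second feedback"   # within the appropriate speed band
    return "third feedback"        # above the appropriate speed band
```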
As described above, the feedback may be sound. That is, the output control unit 164 may perform sound output control as the output control related to feedback. In this case, the output control unit 164 may change the sound output mode based on the information on the input speed. More specifically, the output control unit 164 may change the sound used as feedback according to the movement speed of the hand H of the user U.
When the output control unit 164 changes the sound output mode, the sound element it changes may be the volume, the frequency, or the output pattern. Frequency can be rephrased as pitch. If the sound used as feedback is a voice, the sound element may be the intonation of the voice. The first feedback, the second feedback, and the third feedback described above may be sounds that differ in volume, frequency, output pattern, or intonation. Of course, as long as the user U can recognize the sounds as different, the sound elements changed by the output control unit 164 are not limited to volume, frequency, output pattern, and intonation. When changing the feedback, the output control unit 164 may also change a combination of a plurality of sound elements.
(2) Feedback based on position information
The output control unit 164 may provide feedback according to the position of the hand H of the user U. For example, the output control unit 164 may provide feedback at all times while the position of the hand of the user U is within the detectable range AR. When the feedback is sound, the output control unit 164 may output sound at all times while the hand H of the user U is within the detectable range AR, and may stop outputting the sound used as feedback when the hand H of the user U is not within the detectable range AR. This allows the user U to know whether or not his or her hand is within the detectable range AR.
(3) Feedback based on speed information and position information
The output control unit 164 may provide feedback according to both the movement speed of the hand H of the user U and its position. In this case, the output control unit 164 may change the threshold value Vth according to the position of the user's hand in the detection area OR. For example, the output control unit 164 may change the predetermined threshold value Vth for changing the feedback according to the position of the user's hand within the detectable range AR. In doing so, the output control unit 164 may divide the detectable range AR into a plurality of regions for processing. FIG. 16 is a diagram for explaining the detectable range AR divided into a plurality of regions. In the example of FIG. 16, the detectable range AR is divided into a central region AR1, which occupies a certain range at the center of the detectable range AR, and an inner edge region AR2, which occupies a certain range inward from the edge of the detectable range AR.
The output control unit 164 may then vary the predetermined threshold value Vth for changing the feedback depending on whether the hand H of the user U is in the central region AR1 or in the inner edge region AR2. For example, the output control unit 164 sets the predetermined threshold value Vth to a threshold value Vth21 when the hand H of the user U is in the central region AR1, and sets it to a threshold value Vth22 smaller than Vth21 when the hand H of the user U is in the inner edge region AR2. In the present embodiment, both the threshold value Vth21 and the threshold value Vth22 are regarded as kinds of the predetermined threshold value Vth.
FIGS. 17A and 17B are diagrams for explaining the change in the feedback threshold value Vth based on the information on the position of the hand H of the user U. In the example of FIG. 17A, the hand H of the user U is moving in the central region AR1 at a movement speed V21. In the example of FIG. 17B, the hand H of the user U is moving in the inner edge region AR2 at a movement speed V22, which is slower than the movement speed V21.
Here, it is assumed that the movement speed V21 is slower than the threshold value Vth21 for the central region AR1, and that the movement speed V22 is faster than the threshold value Vth22 for the inner edge region AR2. The output control unit 164 provides the first feedback to the user U when the movement speed of the hand H of the user U is slower than the predetermined threshold value Vth, and provides the second feedback, which differs from the first feedback, when the movement speed of the hand H of the user U is faster than the predetermined threshold value Vth.
In the case of FIG. 17A, the movement speed V21 of the hand H of the user U is faster than the movement speed V22 shown in FIG. 17B, but slower than the threshold value Vth21 associated with the central region AR1. Therefore, the output control unit 164 provides the first feedback to the user U. In the case of FIG. 17B, on the other hand, the movement speed V22 of the hand H of the user U is slower than the movement speed V21 shown in FIG. 17A, but faster than the threshold value Vth22 associated with the inner edge region AR2. Therefore, the output control unit 164 provides the second feedback to the user U.
As a result, the information processing apparatus 10 can reduce excessive feedback for gestures in a region where a high movement speed is allowed (for example, the central region AR1), while providing highly sensitive feedback for gestures in a region where a high movement speed is not allowed (for example, the inner edge region AR2). Consequently, the information processing apparatus 10 can reduce the possibility of gestures being performed outside the detectable range AR without stressing the user with excessive feedback.
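The position-dependent threshold of this example might be sketched as follows; the disc geometry used to model the detectable range and the numeric values are assumptions.

```python
# A sketch of the position-dependent threshold: the inner edge region
# AR2 gets a stricter (smaller) threshold than the central region AR1.

V_TH21 = 1.0   # threshold while the hand is in the central region AR1
V_TH22 = 0.5   # smaller threshold in the inner edge region AR2

def threshold_for_position(dist_from_center: float,
                           r_central: float = 0.4) -> float:
    """Detectable range modeled as a disc; positions within r_central
    of the center belong to AR1, the remainder to AR2."""
    return V_TH21 if dist_from_center <= r_central else V_TH22

def feedback(dist_from_center: float, speed: float) -> str:
    vth = threshold_for_position(dist_from_center)
    return "second feedback" if speed > vth else "first feedback"

# Same speed, different regions -> different feedback:
assert feedback(0.2, 0.7) == "first feedback"   # central region AR1
assert feedback(0.8, 0.7) == "second feedback"  # inner edge region AR2
```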
(4) Feedback based on estimation results
The output control unit 164 may provide feedback based on an estimation result of whether or not a gesture will end up being performed outside the detectable range AR. For example, the estimation unit 165 of the control unit 16 estimates whether or not the input motion of the gesture of the user U will be performed outside the detectable range AR, based on the information on the position of the hand H of the user U performing the gesture and the information on the movement speed of the hand H.
For example, suppose that the gesture the user U is trying to input is a swipe. In this case, the estimation unit 165 estimates from the start position of the swipe and the movement speed of the user's hand H whether or not the end position of the swipe will fall outside the detectable range AR. When it is estimated that the input motion of the gesture will be performed outside the detectable range AR, the output control unit 164 provides predetermined feedback to warn the user U. The predetermined feedback may be provided independently of the feedback based on the speed information or the position information described in (1) to (3) above.
Providing the user with feedback based on the estimation result further reduces the possibility that gestures are performed outside the detectable range AR.
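The estimation described above amounts to extrapolating the hand's trajectory and checking the predicted end point; a sketch follows, with an assumed remaining swipe duration and a circular model of the detectable range AR.

```python
# A sketch of the exit estimation: extrapolate the swipe from its start
# position and current velocity and check whether the predicted end
# point leaves the detectable range AR. Constants are assumed.

SWIPE_DURATION = 0.3  # assumed typical remaining swipe time in seconds
AR_RADIUS = 1.0       # detectable range modeled as a disc

def will_exit_range(pos, vel, duration=SWIPE_DURATION) -> bool:
    """pos, vel: 2D tuples (hand position in AR-centered coordinates
    and hand velocity). Returns True if the predicted end point falls
    outside the detectable range, i.e. a warning should be issued."""
    end_x = pos[0] + vel[0] * duration
    end_y = pos[1] + vel[1] * duration
    return (end_x ** 2 + end_y ** 2) ** 0.5 > AR_RADIUS

if will_exit_range(pos=(0.7, 0.0), vel=(2.0, 0.0)):
    print("warning feedback: swipe is predicted to leave the range")
```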
Next, the gesture detection unit 162 of the control unit 16 determines whether a gesture has been detected (step S104). For example, the gesture detection unit 162 determines whether the second motion has been detected within the detectable range AR after the first motion has been detected within the detectable range AR. Note that "detection" in the context of gesture detection is detection in a broad sense that includes gesture recognition. When no gesture has been detected (step S104: No), the gesture detection unit 162 returns the process to step S101.
On the other hand, when a gesture has been detected (step S104: Yes), the command execution unit 163 of the control unit 16 executes the command corresponding to the detected gesture (step S105). For example, the command execution unit 163 executes the function of the information processing apparatus 10 associated with the detected gesture.
When the execution of the command is completed, the control unit 16 returns the process to step S101 and executes the processes of steps S101 to S105 again.
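The loop of steps S101 to S105 might be summarized schematically as follows; the helper objects and their methods are hypothetical, and the gesture-to-command table merely mirrors the associations described with reference to FIG. 7.

```python
# A schematic version of the command execution loop (steps S101-S105).
# The sensors/output/detector/commands objects are hypothetical stand-ins
# for the input detection unit 11, output control unit 164, gesture
# detection unit 162, and command execution unit 163.

GESTURE_TO_COMMAND = {
    "pinch": "Play/Pause",
    "swipe_left": "Next",
    "swipe_right": "Prev",
    "swipe_up": "Vol+",
    "swipe_down": "Vol-",
    "palm_over": "Cancel",
    "touch": "Quick Attention",
    "move_away": "Ambient Control",
}

def command_loop(sensors, output, detector, commands):
    while True:
        values = sensors.read()                       # S101: sensor values
        speed, position = sensors.hand_state(values)  # S102: input speed
        output.feedback(speed, position)              # S103: output control
        gesture = detector.detect(values)             # S104: gesture?
        if gesture is None:
            continue
        commands.execute(GESTURE_TO_COMMAND[gesture])  # S105: execute
```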
According to the present embodiment, feedback is provided to the user U based on the information on the gesture input speed, so the user U can improve the accuracy of gesture input based on the feedback. As a result, the gesture detection accuracy of the information processing apparatus 10 increases, and the information processing apparatus 10 can realize high usability.
<< 3. Embodiment 2 >>
In the first embodiment, one gesture is input to the information processing apparatus 10 at a time, but the information processing apparatus 10 may be configured so that gestures can be input continuously. For example, the information processing apparatus 10 may be configured so that the user can input the second motion repeatedly after performing the first motion. Taking the gesture for "Vol+" as an example, the information processing apparatus 10 may be configured so that the user can input up swipes repeatedly after performing the swing-up motion.
In the following description, a gesture that can be input continuously is a gesture associated with a function accompanied by an operation amount (for example, the amount by which the volume is raised or lowered), such as "Vol+/Vol-". The operation amount of the function is associated with the movement speed of the hand H of the user U performing the gesture.
Hereinafter, the information processing apparatus 10 of the second embodiment will be described. Since the functional block configuration of the information processing apparatus 10 of the second embodiment is the same as that of the first embodiment, its description is omitted.
<3-1. Information processing device operation>
Hereinafter, the operation of the information processing apparatus 10 according to the second embodiment will be described.
FIG. 18 is a flowchart showing the command execution process according to the present embodiment. The command execution process is executed by the control unit 16 of the information processing apparatus 10 and is started, for example, when the information processing apparatus 10 is powered on.
First, the acquisition unit 161 of the control unit 16 acquires sensor values from the input detection unit 11 and the state detection unit 12 (step S201). For example, the acquisition unit 161 may acquire information about an object in the detection area OR from a non-contact type sensor such as a proximity sensor, information about contact with the housing surface from a contact type sensor such as a touch sensor, and information about the state of the information processing apparatus 10 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
Subsequently, the acquisition unit 161 acquires information on a first movement speed based on the sensor values acquired in step S201 (step S202). The information on the first movement speed is information on the movement speed of the hand H of the user U performing the first gesture (second motion) in a series of continuous inputs of a continuously inputtable gesture. The information on the first movement speed may also be information on the movement speed of the hand H of the user U performing a gesture that cannot be input continuously. Here, the output control unit 164 of the control unit 16 may provide feedback to the user U based on the information on the first movement speed.
Next, the gesture detection unit 162 of the control unit 16 determines whether a gesture has been detected (step S203). For example, the gesture detection unit 162 determines whether the second motion has been detected within the detectable range AR after the first motion has been detected within the detectable range AR. When no gesture has been detected (step S203: No), the gesture detection unit 162 returns the process to step S201.
On the other hand, when a gesture has been detected (step S203: Yes), the command execution unit 163 of the control unit 16 executes the command corresponding to the detected gesture (step S204). For example, the command execution unit 163 executes the function of the information processing apparatus 10 associated with the detected gesture.
Next, the command execution unit 163 determines whether the conditions for accepting continuous input of the gesture are satisfied (step S205). For example, the gesture detection unit 162 determines whether the gesture detected in step S203 is a gesture capable of continuous input. Alternatively, if the gesture capable of continuous input has a predetermined maximum number of continuous inputs, the gesture detection unit 162 determines whether the number of continuous inputs of the user U's gesture exceeds that predetermined number.
The acceptance condition may also be based on a prediction of whether the input action of the second and subsequent continuous inputs of the gesture would be performed outside the detectable range. For example, the estimation unit 165 of the control unit 16 predicts, based on information on the position and movement speed of the hand of the user U performing the first continuous input of the predetermined gesture, whether the input action of the second continuous input of the predetermined gesture would be performed outside the detectable range AR. When it is predicted that the input action of the second continuous input would be performed outside the detectable range AR, the command execution unit 163 may determine that the acceptance condition is not satisfied and refrain from executing the command for the second continuous input. This suppresses erroneous detection, as continuous input, of an action not intended as continuous input.
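A compact sketch of how step S205 might combine the repeat-limit check with this out-of-range prediction is given below. The one-dimensional hand position, the linear extrapolation, and all names are assumptions made for illustration.

```python
# Hypothetical sketch of the acceptance check of step S205, combining a
# repeat-count limit with a prediction (linear extrapolation of the hand
# position) of whether the next input would leave the detectable range AR.
def accepts_continuous_input(gesture_repeatable: bool,
                             input_count: int,
                             max_inputs: int,
                             hand_pos: float,
                             hand_speed: float,
                             ar_min: float,
                             ar_max: float,
                             horizon_s: float = 0.3) -> bool:
    if not gesture_repeatable:
        return False
    if max_inputs and input_count >= max_inputs:  # max_inputs == 0: no limit
        return False
    # Predict where the hand will be when the next gesture would start.
    predicted_pos = hand_pos + hand_speed * horizon_s
    return ar_min <= predicted_pos <= ar_max
```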
If the acceptance condition is not satisfied (step S205: No), the gesture detection unit 162 returns the process to step S201.
On the other hand, if the acceptance condition is satisfied (step S205: Yes), the acquisition unit 161 acquires sensor values from the input detection unit 11 and the state detection unit 12 (step S206). The acquisition unit 161 then acquires information on a second movement speed based on the sensor values acquired in step S206 (step S207). The information on the second movement speed is information on the movement speed of the hand H of the user U performing the second or subsequent continuous input (the second action) of the gesture that can be continuously input.
Next, the output control unit 164 performs output control of the feedback given when the second and subsequent continuous inputs of the gesture are performed, based on the information on the first movement speed and the information on the second movement speed (step S208). For example, the output control unit 164 performs this output control based on the result of comparing the first movement speed with the second movement speed. More specifically, when the second movement speed differs from the first movement speed by a predetermined speed or more, the output control unit 164 changes the feedback for the second continuous input from the feedback given for the first continuous input. The feedback may be sound or vibration.
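The comparison in step S208 might look like the following sketch; the speed delta and the feedback labels are illustrative assumptions.

```python
# Hypothetical sketch of step S208: the feedback for the second and subsequent
# continuous inputs changes only when the new speed differs from the first
# input's speed by at least a predetermined amount.
def feedback_for_repeat_input(first_speed: float,
                              current_speed: float,
                              min_delta: float = 0.3) -> str:
    if abs(current_speed - first_speed) >= min_delta:
        return "changed_pattern"  # e.g., a different sound or vibration pattern
    return "normal_pattern"       # same feedback as for the first input
```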
Next, the gesture detection unit 162 determines whether a gesture has been detected (step S209). For example, the gesture detection unit 162 determines whether, after detecting the first action within the detectable range AR, it has detected the second action within the detectable range AR. When no gesture has been detected (step S209: No), the gesture detection unit 162 returns the process to step S205.
On the other hand, when a gesture has been detected (step S209: Yes), the command execution unit 163 of the control unit 16 executes the command corresponding to the detected gesture (step S210). For example, the command execution unit 163 executes the function of the information processing apparatus 10 associated with the detected gesture.
When the command execution is completed, the control unit 16 returns the process to step S205.
According to the present embodiment, the information processing apparatus 10 performs output control of the feedback based on the result of comparing the first movement speed with the second movement speed, so the user can easily tell how much faster or slower the input speed of the second and subsequent continuous inputs is compared with that of the first continuous input. When the operation amount of a function is associated with the movement speed, the user U can grasp, from the comparison with the first input, the input speed of the second and subsequent gestures that corresponds to the desired operation amount. As a result, the user U can perform the desired input with fewer continuous actions, for example raising the volume in one stroke.
<<4. Embodiment 3>>
In the first and second embodiments, the information processing apparatus 10 both detects the movement of the user's hand H and executes the commands. However, the detection of the hand movement and the execution of the commands may be performed by different devices. Hereinafter, the information processing system 1 of the third embodiment will be described.
<4-1. Configuration of the information processing system>
FIG. 19 is a diagram showing a configuration example of the information processing system 1. The information processing system 1 is a system that executes various functions based on the user's gestures. As shown in FIG. 19, the information processing system 1 includes an output device 20 and a terminal device 30; in the example of FIG. 19, it includes an output device 20A and an output device 20B. Although the output device 20 and the terminal device 30 are connected wirelessly in the example of FIG. 19, they may be configured to be connectable by wire.
The information processing system 1 may include only one output device 20 or a plurality of output devices 20. For example, if the output device 20 is an earphone, the information processing system 1 may include a pair of output devices 20 that are wirelessly connected to the terminal device 30 and worn on the left and right ears of the user U, respectively. As an example, the output device 20A is an earphone worn on the left ear of the user U, and the output device 20B is an earphone worn on the right ear of the user U.
Note that one output device 20 does not necessarily have to be a single integrated device. A plurality of separate devices that are functionally or practically related can also be regarded as one output device 20. For example, a pair of left and right earphones worn on the left and right ears of the user U may be regarded as one output device 20. Of course, one output device 20 may be a single integrated earphone worn on one ear of the user U, or a single integrated pair of headphones worn on both ears of the user U.
<4-2. Configuration of the output device>
First, the configuration of the output device 20 will be described.
The output device 20 is an acoustic output device that can be worn by a user, such as earphones or headphones. The output device 20 may also be a display device that can be worn by the user, such as AR glasses or MR glasses. In the present embodiment, the output device 20 is assumed to be a device that can be worn by the user U and to include a part located at least at an ear of the user U when worn. At least a detection unit that detects an object is arranged in this part. Taking the first embodiment as an example, the detection unit is the functional block corresponding to the input detection unit 11. If there are two such parts located at the left and right ears of the user U, both parts may each include a detection unit, or only one of them may include a detection unit.
FIG. 20 is a diagram showing a configuration example of the output device 20 according to the embodiment of the present disclosure. The output device 20 includes an input detection unit 21, a state detection unit 22, an output unit 23, a communication unit 24, and a control unit 26. Note that the configuration shown in FIG. 20 is a functional configuration, and the hardware configuration may differ from it. The functions of the output device 20 may also be distributed across a plurality of physically separated components.
The input detection unit 21 is a detection unit that detects the user's input operations. The state detection unit 22 is a sensor unit that performs detection relating to the state of the output device 20. The output unit 23 is an output interface that outputs information to the user. The communication unit 24 is a communication interface for communicating with other devices such as the terminal device 30. The control unit 26 is a controller that controls each unit of the output device 20.
In other respects, the configurations of the input detection unit 21, the state detection unit 22, the output unit 23, the communication unit 24, and the control unit 26 are the same as those of the input detection unit 11, the state detection unit 12, the output unit 13, the communication unit 14, and the control unit 16 of the information processing apparatus 10. In that case, references to the information processing apparatus 10 may be read as references to the output device 20 where appropriate.
<4-3. Configuration of the terminal device>
Next, the configuration of the terminal device 30 will be described.
The terminal device 30 is an information processing terminal capable of communicating with the output device 20 and is a kind of information processing apparatus of the present embodiment. The terminal device 30 is, for example, a mobile phone, a smart device (smartphone or tablet), a PDA (Personal Digital Assistant), or a personal computer. The terminal device 30 may also be a device such as a professional camera equipped with a communication function, or a mobile body equipped with communication equipment such as an FPU (Field Pickup Unit). Further, the terminal device 30 may be an M2M (Machine to Machine) device or an IoT (Internet of Things) device. The terminal device 30 controls the output device 20 from outside the output device 20 via a wired or wireless connection.
FIG. 21 is a diagram showing a configuration example of the terminal device 30 according to the embodiment of the present disclosure. The terminal device 30 includes an input unit 31, a state detection unit 32, an output unit 33, a communication unit 34, a storage unit 35, and a control unit 36. Note that the configuration shown in FIG. 21 is a functional configuration, and the hardware configuration may differ from it. The functions of the terminal device 30 may also be distributed across a plurality of physically separated components.
The input unit 31 is an input interface that accepts the user's input operations, for example buttons or a touch panel. The state detection unit 32 is a sensor unit that performs detection relating to the state of the terminal device 30. The output unit 33 is an output interface that outputs information to the user. The communication unit 34 is a communication interface for communicating with other devices such as the output device 20. The storage unit 35 is a storage device from which data can be read and to which data can be written; it stores, for example, information about the detectable range AR of gestures. The control unit 36 is a controller that controls each unit of the terminal device 30 and includes an acquisition unit 361, a gesture detection unit 362, a command execution unit 363, an output control unit 364, and an estimation unit 365.
In other respects, the configurations of the input detection unit 21, the state detection unit 22, the output unit 23, the communication unit 24, and the control unit 26 are the same as those of the corresponding units of the information processing apparatus 10, as described above. Likewise, the configurations of the acquisition unit 361, the gesture detection unit 362, the command execution unit 363, the output control unit 364, and the estimation unit 365 are the same as those of the acquisition unit 161, the gesture detection unit 162, the command execution unit 163, the output control unit 164, and the estimation unit 165 of the control unit 16 of the information processing apparatus 10, except that the acquisition unit 361 acquires information from the input detection unit 21 and the state detection unit 22 via communication. In that case, references to the information processing apparatus 10 may be read as references to the output device 20 or the terminal device 30 where appropriate.
<4-4. Operation of the information processing system>
The configuration of the information processing system 1 has been described above; next, the operation of the information processing system 1 having such a configuration will be described.
(Command execution process 1)
The information processing system 1 can execute a command execution process similar to that of the information processing apparatus 10 of the first embodiment. The command execution process of the information processing system 1 is the same as that of the first embodiment except that the terminal device 30 acquires the sensor values from the output device 20 via communication. Hereinafter, as in the first embodiment, the command execution process will be described with reference to FIG. 15.
FIG. 15 is a flowchart showing the command execution process according to the present embodiment. The command execution process is executed by the control unit 36 of the terminal device 30 and is executed, for example, when the terminal device 30 establishes communication with the output device 20.
First, the acquisition unit 361 of the control unit 36 acquires sensor values from the input detection unit 21 and the state detection unit 22 of the output device 20 via the communication unit 34 (step S101). For example, the acquisition unit 361 acquires information about an object in the detection region OR from a non-contact sensor such as a proximity sensor, and acquires information about contact with the housing surface from a contact sensor such as a touch sensor. The acquisition unit 361 may also acquire information about the state of the output device 20 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
Subsequently, the acquisition unit 361 acquires information on the input speed of the user's gesture performed in the detection region OR, based on the sensor values acquired in step S101 (step S102). For example, the acquisition unit 361 acquires information on the movement speed of the hand of the user U within the detection region OR based on the sensor values from the input detection unit 21. In addition to the information on the input speed, the acquisition unit 361 may acquire information indicating the position, within the detection region OR, of the hand H of the user U performing the gesture.
In acquiring the information on the input speed, the control unit 36 may change at least one of the position, size, and shape of the detectable range AR so that the user U can perform an appropriate input operation. For example, the control unit 36 may change the position, size, and shape of the detectable range AR based on information acquired from the input detection unit 21 or the state detection unit 22.
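One way such an adjustment could work is sketched below: the detectable range AR, modeled here as a circle, is shifted and scaled from a tilt reading of the device's state sensors. The geometry and the constants are assumptions for illustration, not the disclosed implementation.

```python
from typing import Tuple

# Hypothetical sketch: adjust the detectable range AR (modeled as a circle)
# based on the device tilt reported by the state detection unit.
def adjust_detectable_range(tilt_deg: float,
                            base_center: Tuple[float, float],
                            base_radius: float) -> Tuple[Tuple[float, float], float]:
    cx, cy = base_center
    cx += 0.002 * tilt_deg  # shift the center slightly toward the tilt direction
    # Enlarge the range a little when the device is tilted, up to a cap.
    radius = base_radius * (1.0 + min(abs(tilt_deg), 30.0) / 300.0)
    return (cx, cy), radius
```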
The output control unit 364 of the control unit 36 performs output control of the feedback to the user U (step S103). For example, the output control unit 364 performs this output control based on the information on the input speed. The feedback may be sound or vibration.
Here, the output control unit 364 may provide the feedback by controlling the output unit 33 of the terminal device 30, or by controlling the output unit 23 of the output device 20. The concept of output control also includes control that causes another device to produce a predetermined output via communication.
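The routing choice described here could be sketched as follows; the message format and the `local_out`/`comm` interfaces are hypothetical.

```python
# Hypothetical sketch: the terminal device 30 either drives its own output
# unit 33 or, via the communication link, has the output device 20 produce
# the feedback on its output unit 23.
def emit_feedback(pattern: str, use_remote: bool, local_out, comm) -> None:
    if use_remote:
        comm.send({"cmd": "feedback", "pattern": pattern})  # output on device 20
    else:
        local_out.play(pattern)                             # output on device 30
```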
The output control unit 364 may also provide feedback according to the position of the hand H of the user U, or according to both the movement speed of the hand H of the user U and its position. The output control unit 364 may further provide feedback based on a prediction of whether the gesture would be performed outside the detectable range AR.
Next, the gesture detection unit 362 of the control unit 36 determines whether a gesture has been detected (step S104). When no gesture has been detected (step S104: No), the gesture detection unit 362 returns the process to step S101.
On the other hand, when a gesture has been detected (step S104: Yes), the command execution unit 363 of the control unit 36 executes the command corresponding to the detected gesture (step S105). For example, the command execution unit 363 executes the function of the output device 20 associated with the detected gesture.
When the command execution is completed, the control unit 36 returns the process to step S101 and executes the processing of steps S101 to S105 again.
In this embodiment as well, feedback is given to the user U based on the input speed of the gesture, so the user U can improve the input accuracy of the gesture based on the feedback. As a result, the gesture detection accuracy of the terminal device 30 increases, and the information processing system 1 can realize high usability.
(Command execution process 2)
The information processing system 1 can also execute the command execution process of the second embodiment. The command execution process of the information processing system 1 is the same as that of the second embodiment except that the terminal device 30 acquires the sensor values from the output device 20 via communication. Hereinafter, as in the second embodiment, the command execution process will be described with reference to FIG. 18.
FIG. 18 is a flowchart showing the command execution process according to the present embodiment. The command execution process is executed by the control unit 36 of the terminal device 30 and is executed, for example, when the terminal device 30 establishes communication with the output device 20.
First, the acquisition unit 361 of the control unit 36 acquires sensor values from the input detection unit 21 and the state detection unit 22 (step S201). For example, the acquisition unit 361 may acquire information about an object in the detection region OR from a non-contact sensor such as a proximity sensor, may acquire information about contact with the housing surface from a contact sensor such as a touch sensor, and may acquire information about the state of the output device 20 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
Subsequently, the acquisition unit 361 acquires information on the first movement speed based on the sensor values acquired in step S201 (step S202). Here, the output control unit 364 of the control unit 36 may provide feedback to the user U based on the information on the first movement speed.
Next, the gesture detection unit 362 of the control unit 36 determines whether a gesture has been detected (step S203). When no gesture has been detected (step S203: No), the gesture detection unit 362 returns the process to step S201.
On the other hand, when a gesture has been detected (step S203: Yes), the command execution unit 363 of the control unit 36 executes the command corresponding to the detected gesture (step S204). For example, the command execution unit 363 executes the function of the output device 20 associated with the detected gesture.
Next, the command execution unit 363 determines whether the conditions for accepting continuous input of the gesture are satisfied (step S205). Here, the acceptance condition may be based on a prediction of whether the input action of the second and subsequent continuous inputs of the gesture would be performed outside the detectable range.
If the acceptance condition is not satisfied (step S205: No), the gesture detection unit 362 returns the process to step S201.
On the other hand, if the acceptance condition is satisfied (step S205: Yes), the acquisition unit 361 acquires sensor values from the input detection unit 21 and the state detection unit 22 (step S206). The acquisition unit 361 then acquires information on the second movement speed based on the sensor values acquired in step S206 (step S207).
Next, the output control unit 364 performs output control of the feedback given when the second and subsequent continuous inputs of the gesture are performed, based on the information on the first movement speed and the information on the second movement speed (step S208). For example, the output control unit 364 performs this output control based on the result of comparing the first movement speed with the second movement speed.
In providing the feedback, the output control unit 364 may control the output unit 33 of the terminal device 30 or the output unit 23 of the output device 20. The feedback may be sound or vibration.
Next, the gesture detection unit 362 determines whether a gesture has been detected (step S209). When no gesture has been detected (step S209: No), the gesture detection unit 362 returns the process to step S205.
On the other hand, when a gesture has been detected (step S209: Yes), the command execution unit 363 of the control unit 36 executes the command corresponding to the detected gesture (step S210). For example, the command execution unit 363 executes the function of the output device 20 associated with the detected gesture.
When the command execution is completed, the control unit 36 returns the process to step S205.
According to the present embodiment, the terminal device 30 performs output control of the feedback based on the result of comparing the first movement speed with the second movement speed, so the user U can perform the desired input with fewer continuous actions, for example raising the volume in one stroke.
<<5. Modifications>>
Each of the above embodiments is merely an example, and various modifications and applications are possible.
(Modifications relating to the feedback process)
In the above embodiments, the information on the input speed of the gesture of the user U was assumed to be information on the movement speed of the hand H of the user U performing the gesture. However, the information on the input speed is not limited to the movement speed of the hand H of the user U. For example, the information on the input speed may be the time the user U required to input the gesture.
For example, suppose one gesture consists of a first action and a subsequent second action. In this case, the information processing apparatus 10 according to the first and second embodiments may acquire, as the information on the input speed, information on the time required from the detection of the first action to the end of the second action. The information processing apparatus 10 may then perform output control of the feedback to the user U based on this time information. In this case, the information processing apparatus 10 may provide the feedback immediately after the gesture of the user U ends (that is, immediately after the second action ends).
Similarly, the terminal device 30 according to the third embodiment may acquire, as the information on the input speed, information on the time required from the detection of the first action to the end of the second action, and may perform output control of the feedback to the user U based on this time information. In this case, the terminal device 30 may provide the feedback immediately after the gesture of the user U ends (that is, immediately after the second action ends).
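A minimal sketch of this time-based variant is shown below; the timing with `time.monotonic()` and the detector interface are assumptions for illustration.

```python
import time

# Hypothetical sketch of the time-based input-speed metric: measure the time
# from detecting the first action to the end of the second action, so that
# feedback can be issued immediately after the gesture ends.
class GestureTimer:
    def __init__(self) -> None:
        self.t_first: float = 0.0

    def on_first_action(self) -> None:
        self.t_first = time.monotonic()

    def on_second_action_end(self) -> float:
        """Return the gesture duration in seconds (the input-speed information)."""
        return time.monotonic() - self.t_first
```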
In this modification as well, the user U can improve the input accuracy of the gesture based on the feedback. As a result, the gesture detection accuracy of the information processing apparatus 10 increases, and the information processing apparatus 10 can realize high usability.
(Modifications relating to the feedback means)
In the first and second embodiments, the feedback that the information processing apparatus 10 provides to the user U was assumed to be sound or vibration, but the feedback is not limited to these. The feedback may be output to a display device or to a blinking device. In that case, the information processing apparatus 10 may produce the feedback output itself, or may have another device connected via communication produce the feedback output.
Similarly, in the third embodiment, the feedback that the terminal device 30 provides to the user U was assumed to be sound or vibration, but the feedback is not limited to these; it may be output to a display device or to a blinking device. In that case, the device that produces the feedback output may be the terminal device 30 or the output device 20 connected to the terminal device 30. The terminal device 30 may also have a device other than the terminal device 30 and the output device 20 produce the feedback output.
(Modifications relating to changing the feedback)
In the above embodiments, as an example, the detectable range AR was divided into two regions, the central region AR1 and the inner edge region AR2. However, the detectable range AR may be divided into more than two regions. For example, the detectable range AR may be divided into a region located near the center of the detectable range AR and a plurality of regions layered around that central region toward the edge of the detectable range AR. The information processing apparatus 10 or the terminal device 30 may then set the predetermined threshold Vth for changing the feedback to a different value for each of the plurality of regions including the central region. Specifically, the information processing apparatus 10 or the terminal device 30 may make the predetermined threshold Vth smaller toward the outer regions. This enables finer-grained feedback.
(Modifications relating to the operation amount)
The functions of the information processing apparatus 10 of the first and second embodiments may include a predetermined function that involves an operation amount. The predetermined function involving an operation amount may be a function relating to volume operation (Vol+/Vol-) or a function relating to playback speed. It is not limited to these and may be, for example, a function relating to fast-forwarding, rewinding, or slow playback. A gesture involving a movement width, such as a swipe, may be associated with the predetermined function. The information processing apparatus 10 may then determine the operation amount of the predetermined function (for example, the amount by which the volume is raised or lowered) based on information on the movement speed of the hand H of the user U when inputting the gesture.
Similarly, the functions of the terminal device 30 of the third embodiment may include a predetermined function that involves an operation amount, and a gesture involving a movement width, such as a swipe, may be associated with the predetermined function. The terminal device 30 may then determine the operation amount of the predetermined function (for example, the amount by which the volume is raised or lowered) based on information on the movement speed of the hand H of the user U when inputting the gesture.
This makes it possible to input the operation amount with a single gesture, improving the usability of the information processing apparatus 10 or the terminal device 30.
(Modifications relating to the device form)
In the above embodiments, the information processing apparatus 10 and the output device 20 were assumed to be devices that can be worn by the user (wearable devices), but they do not necessarily have to be wearable devices. For example, the information processing apparatus 10 and the output device 20 may be devices installed and used in a structure or a mobile body, such as a television, a car navigation system, a driver's cab, or various operation panels. The information processing apparatus 10 and the output device 20 may also be the mobile body itself.
Here, the mobile body may be a mobile terminal such as a smartphone, a mobile phone, a personal computer, a music player, or a portable television, or a remote controller for operating a device. The mobile body may also be a body that moves on land (for example, a vehicle such as an automobile, bicycle, bus, truck, motorcycle, train, or linear motor car) or one that moves underground (for example, through a tunnel, such as a subway). The mobile body may be one that moves on water (for example, a ship such as a passenger ship, cargo ship, or hovercraft) or one that moves underwater (for example, a submersible boat, a submarine, or an unmanned underwater vehicle). Further, the mobile body may be one that moves within the atmosphere (for example, an aircraft such as an airplane, airship, or drone) or one that moves outside the atmosphere (for example, an artificial celestial body such as a space station).
The structure is, for example, a building such as a high-rise building, a house, a steel tower, a station facility, an airport facility, a port facility, or a stadium. The concept of a structure includes not only buildings but also non-building structures such as tunnels, bridges, dams, walls, and iron pillars, and equipment such as cranes, gates, and windmills. The concept of a structure also includes not only structures on land or underground but also structures on water, such as piers and mega-floats, and underwater structures, such as ocean observation facilities.
(Other modifications)
In the above embodiments, the information processing apparatus 10 and the output device 20 were assumed to be devices that can be worn by the user (wearable devices), but they do not necessarily have to be wearable devices. For example, the information processing apparatus 10 may be a device installed and used in a structure or a mobile body.
The control device that controls the information processing apparatus 10, the output device 20, or the terminal device 30 of the present embodiment may be realized by a dedicated computer system or by a general-purpose computer system.
For example, a communication program for executing the above operations is stored and distributed on a computer-readable recording medium such as an optical disc, a semiconductor memory, a magnetic tape, or a flexible disk. The control device is then configured, for example, by installing the program on a computer and executing the above processing. The control device may be a device external to the information processing apparatus 10, the output device 20, or the terminal device 30 (for example, a personal computer), or a device internal to one of them (for example, the control unit 16, the control unit 26, or the control unit 36).
The above communication program may also be stored in a disk device of a server device on a network such as the Internet so that it can be downloaded to a computer. The above functions may also be realized through cooperation between an OS (Operating System) and application software. In this case, the part other than the OS may be stored on a medium and distributed, or the part other than the OS may be stored in a server device so that it can be downloaded to a computer.
Of the processes described in the above embodiments, all or part of the processes described as being performed automatically can also be performed manually, and all or part of the processes described as being performed manually can be performed automatically by known methods. In addition, the processing procedures, specific names, and information including various data and parameters shown in the above description and drawings can be changed arbitrarily unless otherwise specified. For example, the various pieces of information shown in each figure are not limited to those illustrated.
Each component of each illustrated device is a functional concept and does not necessarily have to be physically configured as illustrated. That is, the specific form of distribution and integration of each device is not limited to that illustrated, and all or part of it can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
The above embodiments can also be combined as appropriate in areas where the processing contents do not contradict each other. The order of the steps shown in the flowcharts of the above embodiments can also be changed as appropriate.
Further, for example, the present embodiment can be implemented as any configuration constituting a device or a system, for example, a processor as a system LSI (Large Scale Integration) or the like, a module using a plurality of processors, a unit using a plurality of modules, or a set in which other functions are further added to a unit (that is, a configuration of part of a device).
In the present embodiment, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
Further, for example, the present embodiment can adopt a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
<<6. Conclusion>>
As described above, according to an embodiment of the present disclosure, the information processing apparatus 10 or the terminal device 30 provides feedback to the user U based on the input speed of the gesture of the user U. The user U can therefore improve the input accuracy of the gesture based on the feedback. As a result, the gesture detection accuracy of the information processing apparatus 10 increases, and the information processing apparatus 10 can realize high usability.
Although the embodiments of the present disclosure have been described above, the technical scope of the present disclosure is not limited to the above embodiments as they are, and various changes can be made without departing from the gist of the present disclosure. Components spanning different embodiments and modifications may also be combined as appropriate.
The effects in each embodiment described in the present specification are merely examples and are not limiting; other effects may also be obtained.
The present technology can also have the following configurations.
(1)
An information processing apparatus comprising:
an acquisition unit that acquires information on an input speed of a user's gesture performed in a detection region of a detection unit that detects an object; and
an output control unit that performs output control of feedback to the user based on the information on the input speed.
(2)
The information processing apparatus according to (1), wherein
the acquisition unit acquires information on a movement speed of a hand of the user performing the gesture, and
the output control unit performs the output control of the feedback based on the information on the movement speed.
(3)
The information processing apparatus according to (2), wherein the output control unit provides the feedback in real time.
(4)
The information processing apparatus according to (2) or (3), wherein the output control unit changes the feedback to the user according to the movement speed of the user's hand.
(5)
The information processing apparatus according to (4), wherein the output control unit changes the feedback depending on whether the movement speed exceeds a predetermined threshold.
(6)
The information processing apparatus according to (5), wherein the output control unit provides a first feedback as the feedback when the movement speed does not exceed the predetermined threshold, and provides a second feedback different from the first feedback when the movement speed exceeds the predetermined threshold.
(7)
The information processing apparatus according to (5), wherein the output control unit does not provide the feedback when the movement speed does not exceed the predetermined threshold, and provides the feedback when the movement speed exceeds the predetermined threshold.
(8)
The information processing apparatus according to (5), wherein the output control unit
provides a first feedback as the feedback when the movement speed does not exceed a first threshold serving as the predetermined threshold,
provides a second feedback different from the first feedback when the movement speed exceeds the first threshold but does not exceed a second threshold larger than the first threshold, and
provides a third feedback different from the second feedback when the movement speed exceeds the second threshold.
(9)
The information processing apparatus according to any one of (5) to (8), wherein
the acquisition unit acquires information indicating a position of the hand of the user performing the gesture, and
the output control unit changes the predetermined threshold according to the position of the user's hand.
(10)
The information processing apparatus according to (9), wherein the output control unit changes the predetermined threshold depending on whether the position of the user's hand is in a central region of a detectable range of the gesture or in an inner edge region of the detectable range.
(11)
The information processing apparatus according to any one of (1) to (10), further comprising an estimation unit that predicts, based on information on the position and movement speed of the hand of the user performing the gesture, whether the input action of the gesture of the user will be performed outside a detectable range of the gesture, wherein
the output control unit provides predetermined feedback to the user when it is predicted that the input action of the gesture of the user will be performed outside the detectable range.
(12)
The information processing apparatus according to any one of (1) to (11), wherein
the gesture includes a predetermined gesture that can be continuously input,
the acquisition unit acquires information on a first movement speed indicating the movement speed of the user's hand performing the first continuous input of the predetermined gesture and information on a second movement speed indicating the movement speed of the user's hand performing the second continuous input of the predetermined gesture, and
the output control unit performs output control of feedback given when the predetermined gesture of the second continuous input is performed, based on a result of comparing the second movement speed with the first movement speed.
(13)
The information processing apparatus according to (12), wherein, when the second movement speed differs from the first movement speed by a predetermined speed or more, the output control unit changes the feedback for the predetermined gesture of the second continuous input from the feedback for the predetermined gesture of the first continuous input.
(14)
An execution unit that, when the gesture is detected, executes a function associated with the gesture, and
A guessing unit that estimates, based on information on the position and movement speed of the user's hand performing the predetermined gesture of the first continuous input, whether or not the input operation of the predetermined gesture of the second continuous input will be performed outside the detectable range of the gesture, wherein
The execution unit does not execute the function related to the predetermined gesture of the second continuous input when it is estimated that the input operation of the predetermined gesture of the second continuous input will be performed outside the detectable range,
The information processing apparatus according to (12) or (13).
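A non-limiting sketch combining (12) to (14): the second input's speed is compared with the first's, the feedback is changed when they diverge by at least a given margin, and the associated function is suppressed when the second input is predicted to fall outside the detectable range. The function name and the margin value are assumptions.

```python
def handle_second_input(first_speed: float,
                        second_speed: float,
                        predicted_out_of_range: bool,
                        speed_delta: float = 0.3) -> tuple[bool, str]:
    """Return (execute_function, feedback_mode) for the second input."""
    # (14): suppress the function when the input is predicted to miss the range.
    execute = not predicted_out_of_range
    # (13): change the feedback when the two speeds diverge enough.
    feedback = ("changed" if abs(second_speed - first_speed) >= speed_delta
                else "same")
    return execute, feedback
```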
(15)
The acquisition unit acquires information indicating the position of the hand of the user performing the gesture, and
The output control unit constantly provides the feedback while the position of the user's hand performing the gesture is within the detectable range of the gesture.
The information processing apparatus according to any one of (1) to (14).
(16)
The predetermined device including the detection unit is a device capable of outputting sound, which is attached to a portion of the user's body that cannot be visually recognized by the user.
The output control unit controls the sound output of the predetermined device as the output control related to the feedback.
The information processing apparatus according to any one of (1) to (15).
(17)
The output control unit changes the output mode of the sound based on the information regarding the input speed.
The information processing apparatus according to (16) above.
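For the audible feedback of (16) and (17), one plausible change of the sound output mode is to scale the pitch of a short feedback tone with the input speed; the linear mapping below and its constants are assumptions for illustration only.

```python
def feedback_pitch_hz(speed: float,
                      base_hz: float = 440.0,
                      hz_per_unit: float = 200.0,
                      max_hz: float = 1200.0) -> float:
    """Map a hand movement speed to the pitch of a feedback tone."""
    return min(base_hz + hz_per_unit * speed, max_hz)
```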
(18)
The device for detecting the gesture is a headphone or an earphone.
The information processing apparatus according to (16) or (17).
(19)
The information processing apparatus is a predetermined device including the detection unit, or a device that controls the predetermined device from outside the predetermined device via a wired or wireless connection.
The information processing apparatus according to any one of (1) to (18).
(20)
An information processing method comprising:
acquiring information regarding the input speed of a gesture of a user performed in a detection area of a detection unit that detects an object; and
performing output control regarding feedback to the user based on the information regarding the input speed.
(21)
An information processing program for causing a computer to function as:
an acquisition unit that acquires information regarding the input speed of a gesture of a user performed in a detection area of a detection unit that detects an object, and
an output control unit that performs output control regarding feedback to the user based on the information regarding the input speed.
1 Information processing system
10 Information processing device
10A Headphone
10B Earphone
11, 21 Input detection unit
12, 22, 32 State detection unit
13, 23, 33 Output unit
14, 24, 34 Communication unit
15, 35 Storage unit
16, 26, 36 Control unit
20, 20A, 20B Output device
30 Terminal device
31 Input unit
161, 361 Acquisition unit
162, 362 Gesture detection unit
163, 363 Command execution unit
164, 364 Output control unit
165, 365 Guessing unit
OR Detection area
AR Detectable range
AR1 Central region
AR2 Inner edge region
U User
H Hand

Claims (20)

1. An information processing apparatus comprising:
an acquisition unit that acquires information regarding an input speed of a gesture of a user performed in a detection area of a detection unit that detects an object; and
an output control unit that performs output control regarding feedback to the user based on the information regarding the input speed.
2. The information processing apparatus according to claim 1, wherein
the acquisition unit acquires information on a movement speed of a hand of the user performing the gesture, and
the output control unit performs the output control regarding the feedback based on the information on the movement speed.
3. The information processing apparatus according to claim 2, wherein the output control unit provides the feedback in real time.
4. The information processing apparatus according to claim 2, wherein the output control unit changes the feedback to the user according to the movement speed of the user's hand.
5. The information processing apparatus according to claim 4, wherein the output control unit changes the feedback depending on whether or not the movement speed exceeds a predetermined threshold value.
6. The information processing apparatus according to claim 5, wherein the output control unit provides first feedback as the feedback when the movement speed does not exceed the predetermined threshold value, and provides second feedback different from the first feedback when the movement speed exceeds the predetermined threshold value.
7. The information processing apparatus according to claim 5, wherein the output control unit does not provide the feedback when the movement speed does not exceed the predetermined threshold value, and provides the feedback when the movement speed exceeds the predetermined threshold value.
8. The information processing apparatus according to claim 5, wherein the output control unit
provides first feedback as the feedback when the movement speed does not exceed a first threshold value serving as the predetermined threshold value,
provides second feedback different from the first feedback when the movement speed exceeds the first threshold value but does not exceed a second threshold value larger than the first threshold value, and
provides third feedback different from the second feedback when the movement speed exceeds the second threshold value.
9. The information processing apparatus according to claim 5, wherein
the acquisition unit acquires information indicating a position of the hand of the user performing the gesture, and
the output control unit changes the predetermined threshold value according to the position of the user's hand.
10. The information processing apparatus according to claim 9, wherein the output control unit changes the predetermined threshold value depending on whether the position of the user's hand is in a central region of a detectable range of the gesture or in an inner edge region of the detectable range.
11. The information processing apparatus according to claim 1, further comprising
a guessing unit that estimates, based on information on the position and movement speed of the hand of the user performing the gesture, whether or not the input operation of the gesture by the user will be performed outside a detectable range of the gesture, wherein
the output control unit provides predetermined feedback to the user when it is estimated that the input operation of the gesture by the user will be performed outside the detectable range.
12. The information processing apparatus according to claim 1, wherein
the gesture includes a predetermined gesture that can be continuously input,
the acquisition unit acquires information on a first movement speed indicating the movement speed of the user's hand performing the predetermined gesture at the first continuous input, and information on a second movement speed indicating the movement speed of the user's hand performing the predetermined gesture at the second continuous input, and
the output control unit performs output control regarding feedback when the predetermined gesture of the second continuous input is performed, based on a comparison result between the second movement speed and the first movement speed.
13. The information processing apparatus according to claim 12, wherein, when the second movement speed differs from the first movement speed by a predetermined speed or more, the output control unit changes the feedback for the predetermined gesture of the second continuous input from the feedback for the predetermined gesture of the first continuous input.
14. The information processing apparatus according to claim 12, further comprising:
an execution unit that, when the gesture is detected, executes a function associated with the gesture; and
a guessing unit that estimates, based on information on the position and movement speed of the user's hand performing the predetermined gesture of the first continuous input, whether or not the input operation of the predetermined gesture of the second continuous input will be performed outside the detectable range of the gesture, wherein
the execution unit does not execute the function related to the predetermined gesture of the second continuous input when it is estimated that the input operation of the predetermined gesture of the second continuous input will be performed outside the detectable range.
15. The information processing apparatus according to claim 1, wherein
the acquisition unit acquires information indicating the position of the hand of the user performing the gesture, and
the output control unit constantly provides the feedback while the position of the user's hand performing the gesture is within the detectable range of the gesture.
16. The information processing apparatus according to claim 1, wherein
a predetermined device including the detection unit is a device capable of outputting sound that is worn on a part of the user's body that the user cannot visually recognize, and
the output control unit controls the sound output of the predetermined device as the output control regarding the feedback.
17. The information processing apparatus according to claim 16, wherein the output control unit changes the output mode of the sound based on the information regarding the input speed.
18. The information processing apparatus according to claim 16, wherein the device that detects the gesture is a headphone or an earphone.
19. The information processing apparatus according to claim 1, wherein the information processing apparatus is a predetermined device including the detection unit, or a device that controls the predetermined device from outside the predetermined device via a wired or wireless connection.
20. An information processing method comprising:
acquiring information regarding the input speed of a gesture of a user performed in a detection area of a detection unit that detects an object; and
performing output control regarding feedback to the user based on the information regarding the input speed.
PCT/JP2021/025072 2020-07-20 2021-07-02 Information processing device and information processing method WO2022019085A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020123992 2020-07-20
JP2020-123992 2020-07-20

Publications (1)

Publication Number Publication Date
WO2022019085A1 true WO2022019085A1 (en) 2022-01-27

Family

ID=79729714

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/025072 WO2022019085A1 (en) 2020-07-20 2021-07-02 Information processing device and information processing method

Country Status (1)

Country Link
WO (1) WO2022019085A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013229051A (en) * 2012-02-15 2013-11-07 Immersion Corp Interactivity model for shared feedback on mobile devices
US20200162599A1 (en) * 2012-06-15 2020-05-21 Muzik Inc. Audio/Video Wearable Computer System with Integrated Projector
JP2017509181A (en) * 2014-01-03 2017-03-30 ハーマン インターナショナル インダストリーズ インコーポレイテッド Gesture-interactive wearable spatial audio system

Similar Documents

Publication Publication Date Title
US11188187B2 (en) Information processing apparatus, information processing method, and recording medium
US10055879B2 (en) 3D human face reconstruction method, apparatus and server
US9513714B2 (en) Methods and apparatuses for gesture-based user input detection in a mobile device
US9514512B2 (en) Method and apparatus for laying out image using image recognition
US9958938B2 (en) Gaze tracking for a mobile device
US11416080B2 (en) User intention-based gesture recognition method and apparatus
US20190043521A1 (en) Automatic Gain Adjustment for Improved Wake Word Recognition in Audio Systems
EP2804126A1 (en) Detection of loss and automatically locking of a mobile device
EP3832605B1 (en) Method and device for determining potentially visible set, apparatus, and storage medium
US20130197916A1 (en) Terminal device, speech recognition processing method of terminal device, and related program
WO2021103841A1 (en) Control vehicle
CN108694073A (en) Control method, device, equipment and the storage medium of virtual scene
WO2020155980A1 (en) Control method and terminal device
US20140194147A1 (en) Apparatus and method for reducing battery consumption of mobile terminal
WO2022019085A1 (en) Information processing device and information processing method
US10133966B2 (en) Information processing apparatus, information processing method, and information processing system
CN107943484A (en) The method and apparatus for performing business function
KR20140115656A (en) Method and apparatus for controlling operation in a electronic device
WO2022014609A1 (en) Information processing device and information processing method
CN112717409B (en) Virtual vehicle control method, device, computer equipment and storage medium
JP6170963B2 (en) Mobile device, water immersion estimation method and water immersion estimation program
CN113962138B (en) Method, device, equipment and storage medium for determining parameter value of mobile platform
CN116643558A (en) Robot path-finding method, path-finding device, apparatus, and computer-readable storage medium
CN113094282B (en) Program block running method, device, equipment and storage medium
JP6182571B2 (en) Portable device, control method, and control program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21846621

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21846621

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP