WO2022014609A1 - Information processing device and information processing method - Google Patents
Information processing device and information processing method
- Publication number
- WO2022014609A1 (PCT/JP2021/026355)
- Authority: WIPO (PCT)
- Prior art keywords
- gesture
- information
- user
- information processing
- detectable range
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor ; Earphones; Monophonic headphones
Definitions
- This disclosure relates to an information processing device and an information processing method.
- While device operation by gesture has the advantage that the user can easily operate the device, there is a problem that false detection is likely to occur.
- device operation by gesture may cause problems such as erroneously detecting a user's unconscious movement as a gesture, or detecting a user's gesture as a gesture different from the one the user intended.
- the information processing apparatus includes a gesture detection unit that detects a user's gesture performed in the detection area of a detection unit for detecting an object, and a correction unit that corrects at least one of the detectable range of the gesture in the detection area and the reference direction for detecting the gesture, based on at least one of information regarding the gesture and information regarding the state of a predetermined device that includes the detection unit.
- a plurality of components having substantially the same functional configuration may be distinguished by appending different letters to the same reference numerals.
- a plurality of elements having substantially the same functional configuration are distinguished as needed, such as output devices 20A and 20B.
- output devices 20A and 20B are simply referred to as output devices 20.
- a detection unit (for example, a proximity sensor)
- a detection result (for example, a sensor value)
- a device capable of gesture detection may be simply referred to as an information processing device.
- FIGS. 1 and 2 are diagrams showing how the user makes a gesture.
- FIG. 1 shows an earphone 10A as an example of the information processing apparatus of the present embodiment.
- FIG. 2 shows the headphones 10B as an example of the information processing apparatus of the present embodiment.
- the device operation by the gesture is performed by the user performing a predetermined gesture in the space near the information processing device.
- the earphone 10A and the headphone 10B are attached to the user's ear, and the user U makes a gesture for operating the device in the space near his / her ear.
- the user U makes a gesture of moving the hand H from the front to the back
- the user U makes a gesture of moving the hand H from the back to the front.
- the motion of moving the hand includes the motion of moving the finger.
- the description of "hand” appearing in the following explanation can be replaced with “finger” as appropriate.
- the XYZ coordinate system may be used for the explanation in order to facilitate understanding.
- the X-axis direction, the Y-axis direction, and the Z-axis direction are all directions determined with reference to the user.
- the X-axis plus direction is the front direction of the user U
- the X-axis minus direction is the back direction of the user U.
- the Y-axis plus direction is the left-hand side direction of the user U
- the Y-axis minus direction is the right-hand side direction of the user U.
- the Z-axis plus direction is the upward direction
- the Z-axis minus direction is the downward direction.
- the user U is assumed to be in a standing or sitting position, the X-axis and the Y-axis are in the horizontal direction, and the Z-axis is in the vertical direction (gravity direction).
- the X-axis direction, Y-axis direction, and Z-axis direction appearing in the following description may be appropriately read according to the posture of the user. For example, when the user is lying on his back, the X-axis direction is the vertical direction, and the Y-axis and Z-axis directions are the horizontal directions.
- the gesture in which the user U moves the hand H in the horizontal direction (X-axis direction) may be referred to as a left / right swipe.
- the gesture in which the user U moves the hand H in the vertical direction (Z-axis direction) may be referred to as a vertical swipe.
- the information processing device detects the user's gesture based on the detection result of the object in the detection unit.
- the information processing apparatus executes a function corresponding to the gesture (for example, Next / Prev or Vol + / Vol-).
- the information processing device is a device that can be worn by the user.
- the causes of the erroneous detection are (1-1-1) a change in the position of the detection area of the detection unit and (1-1-2) a change in the orientation of the information processing apparatus.
- false positives caused by these will be described in detail.
- the device that can be worn by the user may be abbreviated as the device that can be worn.
- FIG. 3A is a diagram showing a state in which the earphone 10A is attached in a standard orientation
- FIG. 3B is a diagram showing a state in which the earphone 10A is attached in an irregular orientation.
- the detection region OR is located in a certain range centered on the position of the ear of the user U when viewed from the Y-axis plus direction. That is, in the example of FIG. 3A, the detection region OR is located in the range intended by the developer of the apparatus.
- the detection region OR is located in a range extending from slightly below the ear of the user U to the temporal region. That is, in the example of FIG. 3B, the detection region OR is not located in the range intended by the developer of the apparatus.
- in the example of FIG. 3A, the detection area OR is located at a position that matches the user's sense, whereas in the example of FIG. 3B, the detection area OR is located at a position that does not match the user's sense.
- if the detection area OR is located at a position that does not match the user's sense, even an operation that the user U does not intend as a gesture may be detected as a gesture.
- the action of the user U trying to scratch his head and bringing his hand to the temporal region is likely to be an action within the detection region OR in the example of FIG. 3B.
- the operation may be detected as a gesture by the information processing apparatus.
- conversely, an operation by which the user U intentionally performs a gesture may fall outside the detection area OR in some cases.
- the gesture performed slightly below the earphone 10A is likely to be an operation near the user's ear, but outside the detection region OR.
- the information processing device cannot detect the user's action as a gesture.
- depending on the mounting, the detection region OR may also extend to a position that does not match the user U's sense. In that case, the same problem as described above occurs.
- the change in the position of the detection area of the detection unit may occur even when the information processing device is a device other than the earphone type.
- the change in the position of the detection area of the detection unit may occur even when the information processing device is a headphone type device as shown in FIG. 2, for example. Therefore, when the information processing device is a device other than the earphone type, the same problem as described in this (1-1-1) occurs.
- FIG. 4A is a diagram showing a state in which the headphones 10B are worn in a standard wearing direction
- FIG. 4B is a diagram showing a state in which the headphones 10B are worn in an irregular direction.
- D1 to D4 in the figure are directions with respect to the device. More specifically, D1 to D4 in the figure indicate the upward, downward, forward, and backward directions when the information processing apparatus is mounted in the standard mounting orientation. D1 is upward, D2 is downward, D3 is forward, and D4 is backward. In the following description, D1 to D4 are referred to as device directions.
- when the headphones 10B are worn in the standard orientation, the vertical direction (Z-axis direction) matches the device directions D1 and D2, and the horizontal direction (X-axis direction) matches the device directions D3 and D4.
- when the headphones 10B are worn in an irregular orientation, the vertical direction (Z-axis direction) does not match the device directions D1 and D2, and the horizontal direction (X-axis direction) does not match the device directions D3 and D4 either.
- as a result, the orientation of the information processing device (for example, the device directions D1 to D4) changes with respect to the directions of the user U (for example, the X-axis direction and the Z-axis direction).
- the information processing apparatus determines whether or not the gesture performed by the user is a left / right swipe based on the horizontal direction (device directions D3, D4) at the time of standard mounting.
- the horizontal direction (X-axis direction) and the horizontal direction (device directions D3, D4) of the information processing apparatus at the time of standard mounting coincide with each other. Therefore, the information processing apparatus can detect that the user's gesture is a left / right swipe without any problem.
- the horizontal direction (X-axis direction) and the horizontal direction of the information processing device at the time of standard mounting do not match. Therefore, even if the user intentionally makes a gesture with a left / right swipe, the information processing apparatus cannot detect that the gesture is a left / right swipe.
- the direction of the movement of the user's hand is close to the vertical direction (device directions D1 and D2) of the information processing apparatus at the time of standard mounting. Therefore, in the example of FIG. 4B, the information processing apparatus erroneously detects that the user has swiped up and down.
- a change in the orientation of the information processing device may occur even when the information processing device is a device other than the headphone type.
- the change in the orientation of the information processing device may occur even when the information processing device is an earphone type device. Therefore, when the information processing device is a device other than the headphone type, the same problem as described in this (1-1-2) occurs.
- FIG. 5 is a diagram showing a state in which the detectable range of the gesture is limited to a part of the detection area OR.
- the detection region OR is formed in a certain range around the information processing apparatus 10 when viewed from the Y-axis plus direction.
- the information processing device 10 may be an earphone 10A or a headphone 10B.
- the detection area OR is divided into a detectable range AR and a non-detectable range NR.
- the detectable range AR is the range in which the information processing apparatus 10 can detect the gesture
- the non-detectable range NR is the range in which the information processing apparatus 10 does not detect the gesture. Both the detectable range AR and the non-detectable range NR are within the detection area OR.
- the information processing device 10 corrects the detectable range AR based on the information regarding the gesture of the user U.
- the information about the gesture may be the information of the operation of the user U regarding the gesture. More specifically, the information regarding the gesture may be information on an operation (hereinafter referred to as an approach operation) in which the user U enters the hand H into the detectable range AR.
- the approaching motion may be a motion in which the user U swings the hand H up toward the ear (hereinafter referred to as a swing-up motion).
- the swing-up motion is, for example, a motion in which the user U swings the hand H toward the ear like a pendulum, with the elbow joint as a fulcrum, while keeping the elbow joint stationary or moving it slowly forward.
- the approaching motion is the swinging motion of the hand H of the user U.
- the approach direction AD of the user U's hand H into the detectable range AR is a certain direction due to the structure of the human body. Although there are individual differences and differences between gestures, in many cases, the approach direction AD is assumed to fall within a range of a certain vertical angle from the back direction (X-axis minus direction). In the example of FIG. 5, the user's hand H has entered the detectable range AR from a position slightly downward from the front of the user U. Therefore, in the example of FIG. 5, the approach direction AD is a direction slightly inclined upward from the back surface direction (X-axis minus direction).
- the information processing apparatus 10 can infer where the user U makes the gesture. Therefore, the information processing apparatus 10 corrects the detectable range AR to be narrower than the detection area OR based on the information of the approach direction AD.
- the detectable range AR is set in a state of being biased within the detection area OR so that a part of the edge of the detectable range AR touches a part of the edge of the detection area OR. More specifically, the detectable range AR is biased toward the position where the hand H of the user U is presumed to have entered the detection region OR (hereinafter referred to as the approach position).
- in the example of FIG. 5, the approach position is, for example, the intersection of the arrow indicating the approach direction AD and the broken line indicating the edge of the detection region OR.
- the information on the approaching motion is not limited to the approaching direction AD, and may be, for example, information on the approaching position.
- the information processing apparatus 10 may correct the detectable range AR to be narrower than the detection area OR based on the information of the approach position.
- the information processing apparatus 10 corrects the detectable range of the gesture based on the information of the swinging motion of the hand H of the user U. As a result, it is less likely that an action that the user does not intend as a gesture is detected as a gesture. Even if the information processing apparatus 10 increases the sensitivity of the detection unit and widens the detection area OR, the possibility of erroneous detection is low.
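- The following is a minimal sketch (not taken from the publication) of how a detectable range could be biased toward the estimated approach position inside the detection area. The circular geometry, the shrink factor, and all function names are assumptions made for illustration.

```python
import math

def bias_detectable_range(area_center, area_radius, entry_bearing_rad, shrink=0.6):
    """Place a smaller detectable range AR inside the detection area OR so that
    its edge touches the edge of OR at the estimated approach position.

    entry_bearing_rad: bearing, seen from the center of OR, of the edge point
    where the hand is presumed to have entered (the approach position).
    shrink: assumed ratio of the AR radius to the OR radius.
    """
    ar_radius = area_radius * shrink
    # Approach position on the edge of the detection area OR.
    entry = (area_center[0] + area_radius * math.cos(entry_bearing_rad),
             area_center[1] + area_radius * math.sin(entry_bearing_rad))
    # Pull the AR center back from the approach position by the AR radius so
    # that AR stays inside OR while touching its edge at the approach position.
    ar_center = (entry[0] - ar_radius * math.cos(entry_bearing_rad),
                 entry[1] - ar_radius * math.sin(entry_bearing_rad))
    return ar_center, ar_radius

def in_detectable_range(point, ar_center, ar_radius):
    return math.dist(point, ar_center) <= ar_radius
```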
- the information processing apparatus 10 of the present embodiment corrects the reference direction for gesture detection according to the inclination of the apparatus.
- “detection” in the detection of a gesture is a “detection” in a broad sense including recognition of a gesture.
- the reference direction for gesture detection is the reference direction for the information processing apparatus 10 to detect the gesture. For example, if the information processing apparatus 10 is trying to detect a left / right swipe, the reference direction is, for example, the horizontal direction.
- FIG. 6 is a diagram showing how the information processing apparatus 10 corrects the reference direction.
- B1 to B4 are the corrected reference directions.
- B1 to B4 may be simply referred to as the "reference direction". Since the device directions D1 to D4 are directions with respect to the device, when the information processing device 10 is mounted tilted, they are tilted from the horizontal direction or the vertical direction.
- if the information processing apparatus 10 detects a gesture with the device directions D1 to D4 as reference directions, erroneous detection such as detecting a left / right swipe as an up / down swipe will occur.
- the information processing apparatus 10 corrects the reference direction based on the information about the gesture.
- the information about the gesture may be the information of the operation of the user U regarding the gesture. More specifically, the information about the gesture may be the information of the approaching motion in which the user U enters the hand H into the detectable range AR. If the information processing device 10 is a device worn on the ear of the user U, the approaching motion may be a motion in which the user U swings his / her hand H toward his / her ear (swing-up motion).
- the approach direction AD of the user U's hand H into the detectable range AR is a certain direction.
- the information processing apparatus 10 corrects the reference direction based on the information of the approach direction AD.
- FIG. 7 is a diagram showing how the information processing apparatus 10 corrects the reference direction. Since the device directions D1 to D4 are directions with respect to the device, the information processing device 10 naturally knows about the device directions D1 to D4. The information processing device 10 corrects the reference direction based on the information in the relative device directions D1 to D4 with respect to the approach direction AD. As an example, the information processing apparatus 10 can correct the reference direction by the following method.
- the information processing device 10 calculates an angle R formed by at least one of the device directions D1 to D4 (device direction D2 in the example of FIG. 7) and the approach direction AD.
- the angle formed with the approach direction AD is, for example, the angle at the position C where the user U's hand H enters the detectable range AR.
- the approach direction AD is a direction slightly inclined upward from the back surface direction (X-axis minus direction) of the user U. Therefore, the information processing device 10 knows the angle, relative to the device directions, that the approach direction AD should have when the information processing device 10 is worn in the standard orientation.
- the information processing device 10 can estimate the mounting inclination of the information processing device 10 from the difference between the angle of the approach direction AD that should be when the information processing device 10 is mounted as standard and the angle of the approach direction AD that is actually detected.
- the information processing apparatus 10 may use the difference in angles as the estimation result as it is. Then, the reference direction is corrected to the reference directions B1 to B4 based on the estimation result of the mounting inclination of the information processing apparatus 10.
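- As a rough illustration of the angle-difference idea described above (a sketch under assumed angle conventions, not the publication's actual procedure), the mounting tilt can be taken as the difference between the measured approach-direction angle and the angle expected for standard mounting, and the device directions can then be rotated by that tilt to obtain the corrected reference directions:

```python
def estimate_mount_tilt(observed_approach_deg, expected_approach_deg):
    """Mounting tilt = angle of the approach direction AD actually measured in
    the device frame minus the angle AD should have at standard mounting."""
    return observed_approach_deg - expected_approach_deg

def correct_reference_directions(device_dirs_deg, tilt_deg):
    """Rotate the device directions D1..D4 by the estimated tilt to obtain the
    corrected reference directions B1..B4 (angles in degrees, device frame)."""
    return [(d - tilt_deg) % 360.0 for d in device_dirs_deg]

# Hypothetical numbers: AD should sit about 10 deg above the back direction at
# standard mounting but is measured at 35 deg, so the device is tilted ~25 deg.
tilt = estimate_mount_tilt(35.0, 10.0)
b1_to_b4 = correct_reference_directions([90.0, 270.0, 0.0, 180.0], tilt)
```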
- the reference directions B1 to B4 are in the horizontal direction or the vertical direction, but the reference directions B1 to B4 do not necessarily have to be in the horizontal direction or the vertical direction.
- the reference direction can be any direction that the information processing apparatus 10 uses as a reference for detecting the gesture. For example, if the information processing apparatus 10 detects a swipe in the diagonal 45 ° direction, the diagonal 45 ° direction may be regarded as the reference direction.
- the information processing apparatus 10 does not use the device directions D1 to D4 as the reference direction, but detects the gesture based on the corrected reference directions B1 to B4. This reduces erroneous detection, for example, detecting a left / right swipe as an up / down swipe.
- in the above description, the information processing apparatus 10 corrects the detectable range of the gesture and / or the reference direction for gesture detection based on the information about the gesture of the user U. However, the information processing apparatus 10 can also correct the detectable range and / or the reference direction by using information other than the information regarding the gesture.
- the information processing apparatus 10 may correct at least one of the detectable range and the reference direction based on the information regarding the state of the information processing apparatus 10.
- the information processing apparatus 10 includes a sensor unit that detects the state of the information processing apparatus 10, and corrects at least one of the detectable range and the reference direction based on the information from the sensor unit.
- the sensor unit may be composed of one or a plurality of acceleration sensors, or one or a plurality of gyro sensors.
- the information processing apparatus 10 may correct at least one of the detectable range and the reference direction based on the mounting state of the information processing apparatus 10 on the user U, which is estimated based on the information from the sensor unit.
- the information processing device 10 can estimate the gravitational direction based on the information from the sensor unit.
- the information processing apparatus 10 estimates the inclination of the information processing apparatus 10 from the standard mounted state from the estimated gravity direction. Then, the information processing apparatus 10 corrects at least one of the detectable range and the reference direction based on the estimation result of the inclination.
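- A minimal sketch of this state-based variant (assumed vector conventions and function names): the tilt from the standard mounted state can be estimated as the angle between the gravity direction measured in the device frame and the gravity direction expected at standard mounting.

```python
import numpy as np

def tilt_from_gravity(gravity_measured, gravity_standard):
    """Angle (degrees) between the measured gravity direction and the gravity
    direction expected when the device is worn in the standard orientation."""
    g1 = np.asarray(gravity_measured, dtype=float)
    g2 = np.asarray(gravity_standard, dtype=float)
    cos_a = np.dot(g1, g2) / (np.linalg.norm(g1) * np.linalg.norm(g2))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# e.g. standard mounting expects gravity along -Z of the device frame:
# tilt_from_gravity([0.2, 0.0, -0.98], [0.0, 0.0, -1.0]) -> roughly 11.5 deg
```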
- the information processing apparatus 10 detects the gesture based on the corrected detectable range and / or the reference direction.
- Embodiment 1: The outline of the present embodiment has been described above; the information processing apparatus 10 according to the first embodiment will be described below.
- the information processing device 10 is an information processing terminal that executes various functions based on a user's input operation.
- the information processing device 10 is an acoustic output device that can be worn by a user such as earphones and headphones.
- the information processing device 10 may be a display device that can be worn by a user, such as AR (Augmented Reality) glasses or MR (Mixed Reality) glasses.
- the information processing apparatus 10 is assumed to be a device that can be worn by the user U, and is provided with at least a portion located at the ear of the user U when worn. Then, it is assumed that at least a detection unit for detecting an object is arranged in this portion.
- the information processing apparatus 10 is configured to be able to detect a user's gesture, and executes a function such as music playback based on the user's gesture.
- FIG. 8 is a diagram showing a configuration example of the information processing apparatus 10 according to the embodiment of the present disclosure.
- the information processing apparatus 10 includes an input detection unit 11, a state detection unit 12, an output unit 13, a communication unit 14, a storage unit 15, and a control unit 16.
- the configuration shown in FIG. 8 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the information processing apparatus 10 may be distributed and implemented in a plurality of physically separated configurations.
- the input detection unit 11 is a detection unit that detects a user's input operation.
- the input detection unit 11 is a non-contact type detection unit that detects an object in a non-contact manner.
- the input detection unit 11 includes one or a plurality of proximity sensors (also referred to as non-contact sensors), and detects an object located in the detection area.
- the proximity sensor may be an optical sensor or a capacitance type sensor.
- An optical proximity sensor is a sensor that detects the light reflected by an object.
- the capacitance type proximity sensor is a sensor that detects a change in capacitance that occurs between an object and the sensor.
- when the input detection unit 11 is a non-contact type, the input detection unit 11 may be a 2D sensor or a 3D sensor. When the input detection unit 11 is a 2D sensor, the detection area formed by the input detection unit 11 is a two-dimensional area, and when the input detection unit 11 is a 3D sensor, the detection area formed by the input detection unit 11 is a three-dimensional area.
- the input detection unit 11 is not limited to the non-contact type.
- the input detection unit 11 may be a contact type detection unit.
- the input detection unit 11 includes one or a plurality of touch sensors, and detects an object that comes into contact with a predetermined place of the information processing apparatus 10.
- the predetermined location is, for example, the side surface of the earphone (the surface opposite the speaker).
- the predetermined location is, for example, the side surface of the headphones (the surface opposite the speaker).
- the side surface of the earphone or the side surface of the headphone may be referred to as the housing surface.
- FIG. 9 is a diagram showing a configuration example of the input detection unit 11.
- FIG. 9 shows a state in which the earphone-type information processing apparatus 10 is viewed from the side surface of the earphone.
- the configuration shown in FIG. 9 is merely an example, and the configuration of the input detection unit 11 is not limited to the configuration shown in FIG.
- the input detection unit 11 includes an infrared emitter IR and four photosensors PD1, PD2, PD3, and PD4 as non-contact sensors.
- the infrared emitter IR is arranged in the center of the housing surface, and the four photosensors PD1, PD2, PD3, and PD4 are arranged at equal intervals in the circumferential direction so as to surround the infrared emitter IR.
- the input detection unit 11 detects an object by detecting, with the four photosensors PD1, PD2, PD3, and PD4, infrared light emitted from the infrared emitter IR and reflected on the surface of the object.
- the input detection unit 11 is provided with a touch pad TP as a contact type sensor.
- the touch pad TP is composed of a plurality of touch sensors arranged in a plane.
- the touch pad TP is arranged on the housing surface. The input detection unit 11 detects an object in contact with the surface of the touch pad TP.
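- As an illustration of how such a sensor layout could be used (a sketch only; the sensor positions, coordinate system, and function names are assumptions, not the publication's algorithm), the reflected-IR intensities of the four photosensors can be combined into a rough object position, and the change of that position over time gives a swipe direction:

```python
import math

# Assumed placement of the four photosensors around the infrared emitter,
# expressed in a 2D housing-surface coordinate system (x: front, y: up).
SENSOR_POS = {"PD1": (0.0, 1.0), "PD2": (1.0, 0.0), "PD3": (0.0, -1.0), "PD4": (-1.0, 0.0)}

def reflection_centroid(intensities):
    """Weight each sensor position by its reflected-IR intensity to get a rough
    position of the object over the housing surface."""
    total = sum(intensities.values()) or 1e-9
    x = sum(SENSOR_POS[name][0] * v for name, v in intensities.items()) / total
    y = sum(SENSOR_POS[name][1] * v for name, v in intensities.items()) / total
    return x, y

def swipe_direction_deg(frames):
    """Direction of hand movement, in degrees, from the first to the last
    centroid of a sequence of intensity frames."""
    x0, y0 = reflection_centroid(frames[0])
    x1, y1 = reflection_centroid(frames[-1])
    return math.degrees(math.atan2(y1 - y0, x1 - x0))
```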
- the state detection unit 12 is a sensor unit that detects the state of the information processing device 10.
- the state detection unit 12 is composed of one or a plurality of sensors that detect the state of the information processing device 10.
- the state detection unit 12 includes, for example, one or a plurality of acceleration sensors. Further, the state detection unit 12 may include one or a plurality of gyro sensors. Further, the state detection unit 12 may include one or a plurality of geomagnetic sensors.
- the state detection unit 12 may include a motion sensor in which these plural types of sensors are combined. Further, the state detection unit 12 may include a biosensor for acquiring the user's biometric information, such as a blood pressure sensor or a heartbeat sensor.
- the state detection unit 12 detects the state of the information processing apparatus 10 based on the detection results of these sensors. For example, the state detection unit 12 detects the direction of gravity based on the sensor values of these sensors. Since the information processing device 10 is constantly subjected to gravitational acceleration, the information processing device 10 can detect the gravitational direction by averaging the directions of the accelerations detected by the acceleration sensor over a certain period of time.
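- A minimal sketch of this averaging idea (window size and class name are assumptions): because gravitational acceleration is always present, a moving average of the accelerometer samples converges toward the gravity direction in the device frame.

```python
from collections import deque
import numpy as np

class GravityEstimator:
    """Averages recent accelerometer samples; the normalized mean approximates
    the gravity direction in the device frame."""
    def __init__(self, window=200):
        self.samples = deque(maxlen=window)

    def update(self, accel_xyz):
        self.samples.append(np.asarray(accel_xyz, dtype=float))
        mean = np.mean(self.samples, axis=0)
        norm = np.linalg.norm(mean)
        return mean / norm if norm > 0 else mean
```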
- the output unit 13 is an output interface that outputs information to the user.
- the output unit 13 may be an acoustic device such as a speaker or a buzzer, or may be a vibration device such as a vibration motor.
- the output unit 13 may be a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence) display, or may be a lighting device such as an LED (Light Emitting Diode) lamp.
- the output unit 13 functions as an output means of the information processing apparatus 10.
- the output unit 13 outputs various information to the user according to the control of the control unit 16.
- the communication unit 14 is a communication interface for communicating with other devices.
- the communication unit 14 may be a network interface or a device connection interface. Further, the communication unit 14 may be a wired interface or a wireless interface.
- the communication unit 14 functions as a communication means of the information processing device 10.
- the communication unit 14 communicates with other devices according to the control of the control unit 16. Other devices are, for example, terminal devices such as music players and smartphones.
- the storage unit 15 is a storage device capable of reading and writing data such as a DRAM (Dynamic Random Access Memory), a SRAM (Static Random Access Memory), a flash memory, and a hard disk.
- the storage unit 15 functions as a storage means for the information processing device 10.
- the storage unit 15 stores, for example, the settings related to the detectable range AR of the gesture.
- the control unit 16 is a controller that controls each unit of the information processing device 10.
- the control unit 16 is realized by, for example, a processor such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit).
- the control unit 16 is realized by the processor executing various programs stored in the storage device inside the information processing device 10 using a RAM (Random Access Memory) or the like as a work area.
- the control unit 16 may be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
- the CPU, MPU, ASIC, and FPGA can all be regarded as controllers.
- the control unit 16 includes an acquisition unit 161, a gesture detection unit 162, a command execution unit 163, an output control unit 164, an estimation unit 165, and a correction unit 166.
- Each block (acquisition unit 161 to correction unit 166) constituting the control unit 16 is a functional block indicating the function of the control unit 16.
- These functional blocks may be software blocks or hardware blocks.
- each of the above-mentioned functional blocks may be one software module realized by software (including a microprogram), or may be one circuit block on a semiconductor chip (die).
- each functional block may be one processor or one integrated circuit. The method of configuring the functional block is arbitrary.
- the control unit 16 may be configured in a functional unit different from the above-mentioned functional block. Further, another device may perform a part or all of the operations of each block (acquisition unit 161 to correction unit 166) constituting the control unit 16. For example, a terminal device such as a music player or a smartphone may perform some or all operations of each block constituting the control unit 16. The operation of each block constituting the control unit 16 will be described later.
- FIG. 10 is a diagram showing a correspondence between functions and gestures of the information processing apparatus 10.
- the information processing device 10 has a function related to answering a telephone in addition to a function related to playing music.
- the information processing apparatus 10 has the functions shown in the following (1) to (7).
- "function" can be paraphrased as "command”. (1) Play / Pause (2) AnswerCall / End (3) Next / Prev (4) Vol + / Vol- (5) Candle (6) Quick Attention (7) Ambient Control
- the user's actions constituting one gesture may include a first action that triggers the start of the gesture and a second action following the first action.
- the first operation is, for example, a swing-up operation of the hand H of the user U
- the second operation is, for example, an operation such as a swipe following the first operation. It is also possible to regard each of the first motion and the second motion as one gesture.
- the functions and gestures described here are just examples.
- the functions and gestures of the information processing apparatus 10 are not limited to the functions and gestures shown below.
- the first operation and the second operation are not limited to the operations shown below.
- the combination of the function and the gesture is not limited to the combination shown below.
- the Noise Canceling function may be operated by a gesture.
- the information processing device 10 is assumed to be an earphone-type device worn on the user's ear, but the information processing device 10 is not limited to the earphone-type device.
- the information processing device 10 may be a headphone type device.
- the information processing apparatus 10 may have a portion worn on the user's ear and another portion.
- it is assumed that the information processing apparatus 10 is attached to the user's ear, but the apparatus attached to the user's ear may be a predetermined device other than the information processing apparatus 10.
- the predetermined device may be, for example, an output device connected to the information processing device 10 (for example, an earphone or headphone, a hearing aid, a sound collector, etc., regardless of whether it is wired, wireless, canal type, open-ear type, etc.). If the information processing device 10 is worn on the user's ear, the predetermined device is the information processing device 10 itself.
- the "predetermined device" can also be paraphrased as "predetermined equipment".
- "Play" is a function for playing a song
- "Pause” is a function for pausing the playback of a song.
- the "Play / Pause” is associated with the swing-up motion of the hand H as the first motion, and is associated with the pinch motion as the second motion.
- FIG. 11 is a diagram for explaining a pinch operation. As shown in FIG. 11, the pinch operation is an operation of picking an object with a finger.
- the information processing apparatus 10 can detect that the user U has performed a pinch operation even if the user U has not actually picked up an object (for example, the information processing apparatus 10).
- FIG. 12 is a diagram for explaining the hold operation. As shown in FIG. 12, the hold operation is an operation of grasping an object with the hand H.
- the information processing apparatus 10 can detect that the user U has performed the hold operation even if the user U does not actually grasp an object (for example, the information processing apparatus 10).
- "Next" is a function for cueing the next song
- Prev is a function for cueing the previous song or the song being played.
- the "Next / Prev” is associated with the swinging motion of the hand H as the first motion, and is associated with the left and right swipe as the second motion.
- “Next” is associated with the left swipe (front swipe) of the left and right swipes as the second operation
- "Prev” is associated with the right swipe of the left and right swipes as the second operation (prev). After swipe) is associated.
- “Next / Prev” may be associated with a gesture that performs the second operation without the first operation.
- FIG. 13A and 13B are diagrams for explaining a left-right swipe.
- FIG. 13A is a left swipe and FIG. 13B is a right swipe.
- the left / right swipe is an operation (gesture) involving a movement width, unlike the pinch operation and the hold operation.
- the left swipe is an operation in which the user U slides the hand H in the forward direction (X-axis plus direction) by a predetermined movement width W1 as shown in FIG. 13A.
- the right swipe is an operation in which the user U slides the hand H in the backward direction (X-axis minus direction) by a predetermined movement width W2 as shown in FIG. 13B.
- FIGS. 13A and 13B are diagrams showing how the user U is performing a gesture near the left ear.
- note that when the gesture is performed near the right ear, the right swipe is the action of sliding the hand H forward and the left swipe is the action of sliding the hand H backward, so be careful.
- "Next" is associated with, for example, a right swipe (front swipe) as a second action
- "Prev” is associated with, for example, a left swipe (back swipe) as a second action.
- "Vol+" is a function to raise the volume
- “Vol-” is a function to lower the volume.
- “Vol + / Vol-” is associated with the swinging motion of the hand H as the first motion, and is associated with the up / down swipe as the second motion.
- “Vol +” is associated with the up swipe of the up and down swipes as the second action
- “Vol-” is associated with the down swipe of the up and down swipes as the second action.
- Vol + / Vol- may be associated with a gesture that performs the second operation without the first operation.
- FIG. 14A and 14B are diagrams for explaining the up / down swipe.
- FIG. 14A is an up swipe and FIG. 14B is a down swipe.
- the up / down swipe is an operation (gesture) involving a movement width, unlike the pinch operation and the hold operation.
- the up swipe is an operation in which the user U slides the hand H upward (Z-axis plus direction) by a predetermined movement width W3, as shown in FIG. 14A.
- the down swipe is an operation in which the user U slides the hand H downward (Z-axis minus direction) by a predetermined movement width W4, as shown in FIG. 14B.
- "Vol + / Vol-" is a function for adjusting the volume. Therefore, in order to execute this function, it is desirable that the information processing apparatus 10 can acquire from the user U not only the information on whether to raise or lower the volume but also the information on the amount by which the volume is to be raised or lowered. Therefore, in the present embodiment, "Vol + / Vol-" is assumed to be a function accompanied by an operation amount (in the case of this function, the amount by which the volume is raised or lowered). In this function, for example, the movement amount or movement speed of the hand H when swiping up and down is associated with the operation amount of "Vol + / Vol-". By configuring the information processing apparatus 10 so as to raise or lower the volume by a fixed amount with one input operation, it is also possible to make this function a function without an operation amount.
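- A small sketch of how a movement width could be turned into an operation amount (the thresholds, units, and scaling are assumptions for illustration):

```python
def volume_delta(direction, movement_width_m, max_width_m=0.05, max_step=10):
    """Map an up / down swipe to a Vol+ / Vol- operation amount.

    direction: +1 for an up swipe (Vol+), -1 for a down swipe (Vol-).
    movement_width_m: detected movement width of the hand in meters; widths at
    or above max_width_m saturate at max_step volume steps.
    """
    ratio = min(abs(movement_width_m) / max_width_m, 1.0)
    return direction * round(ratio * max_step)

# e.g. a 3 cm upward swipe: volume_delta(+1, 0.03) == +6 steps
```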
- Candle “Cancel” is a function of canceling the operation performed by the user U.
- the “Cancel” is associated with the swinging motion of the hand H as the first motion, and is associated with the motion of holding the hand as the second motion.
- FIGS. 15A and 15B are diagrams for explaining the operation of holding the hand H.
- FIG. 15A is a view of the user U viewed from the left side
- FIG. 15B is a view of the user U viewed from the front.
- the operation of holding the hand H is an operation in which the user U holds the spread hand H toward the information processing apparatus 10.
- "Quick Attention" is a function for the user U to quickly hear the surrounding sound. Specifically, "Quick Attention" is a function that makes it easier to hear surrounding sounds by quickly lowering or stopping the volume of the sound being output (for example, music, call voice, or ringtone). The "Quick Attention" is associated with the swing-up motion of the hand H as the first motion, and is associated with the touch motion as the second motion.
- FIG. 16 is a diagram for explaining a touch operation. As shown in FIG. 16, the touch operation is an operation in which the user U touches a part of the information processing apparatus 10 (for example, the housing surface).
- Ambient Control is a function for the user U to listen to music while checking the surrounding sounds. Specifically, “Ambient Control” is a function of capturing external sounds during music reproduction.
- the "Ambient Control” is associated with a swing-up motion of the hand H as a first motion, and is associated with a motion of moving the hand H away as a second motion.
- FIG. 17 is a diagram for explaining an operation of moving the hand H away.
- the motion of moving the hand H away is a motion (gesture) with a movement width, unlike the pinch motion, the hold motion, the touch motion, and the motion of holding the hand.
- the motion of moving the hand H away is an operation of moving the hand H in a predetermined separation direction by a predetermined movement width W4 (in the example of FIG. 17, in the Y-axis plus direction).
- the information processing apparatus 10 may correct the detectable range and the reference direction immediately after the detection of the first motion is completed, or may correct the detectable range and the reference direction at the stage of performing the gesture recognition process after the acquisition of the information on the series of movements of the user related to the gesture is completed.
- the information processing apparatus 10 corrects the detectable range based on the information of the first operation after the acquisition of the information of both the first operation and the second operation is completed. Then, the information processing apparatus 10 determines whether or not the second operation has been performed within the detectable range based on the information of the second operation for which the acquisition has been completed. If the second operation is not performed within the detectable range, the information processing apparatus 10 excludes the information of the second operation for which acquisition has been completed from the target of gesture detection.
- the information processing apparatus 10 corrects the reference direction based on the information of the first operation, for example, after the acquisition of the information of both the first operation and the second operation is completed.
- the information processing apparatus 10 interprets the information of the second operation for which the acquisition has been completed, based on the information in the reference direction corrected based on the information of the first operation.
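- The ordering described above can be summarized as the following sketch (the detector object and its methods are hypothetical placeholders for the correction and classification steps, not an API defined in the publication):

```python
def detect_gesture(first_op, second_op, detector):
    """Correct from the first motion, then validate and interpret the second."""
    # 1. Correct the detectable range and the reference direction using the
    #    approach direction observed during the first motion.
    ar = detector.correct_detectable_range(first_op.approach_direction)
    ref = detector.correct_reference_direction(first_op.approach_direction)

    # 2. Exclude the second motion if it was not performed inside the corrected range.
    if not detector.within_range(second_op.trajectory, ar):
        return None

    # 3. Interpret the second motion (e.g. swipe direction) against the
    #    corrected reference direction.
    return detector.classify(second_op.trajectory, reference=ref)
```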
- the "Play / Pause" is associated with the swing-up motion of the hand H as the first motion, and is associated with the pinch motion as the second motion.
- the information processing apparatus 10 limits the detectable range based on, for example, the first operation.
- the information processing device 10 does not have to make corrections in the reference direction. This is because the pinch motion is a gesture without a movement width, so information in the reference direction is not always necessary for detecting the gesture.
- the "Answer Call / End" is associated with a swing-up motion of the hand H as a first motion, and a pinch motion or a hold motion as a second motion.
- the information processing apparatus 10 limits the detectable range based on, for example, the first operation.
- the information processing device 10 does not have to make corrections in the reference direction. This is because the pinch operation and the hold operation are gestures without a movement width, and therefore the information in the reference direction is not always necessary for the gesture detection.
- Next / Prev is associated with the swing-up motion of the hand H as the first motion, and is associated with the left / right swipe as the second motion.
- the information processing apparatus 10 limits the detectable range based on, for example, the first operation. After limiting the detectable range based on the first operation, the information processing apparatus 10 may make a correction to widen the limited detectable range after detecting the start of the operation of the second operation, for example. As a result, the detection accuracy of the second operation is improved. Further, in this function, the information processing apparatus 10 corrects the reference direction based on, for example, the first operation.
- Vol + / Vol- "Vol + / Vol-" is associated with the swinging motion of the hand H as the first motion, and is associated with the up / down swipe as the second motion.
- the information processing apparatus 10 limits the detectable range based on, for example, the first operation. After limiting the detectable range based on the first operation, the information processing apparatus 10 may make a correction to widen the limited detectable range after detecting the start of the operation of the second operation, for example. As a result, the detection accuracy of the second operation is improved. Further, in this function, the information processing apparatus 10 corrects the reference direction based on, for example, the first operation.
- Candle The "Cancel” is associated with the swinging motion of the hand H as the first motion, and is associated with the motion of holding the hand H as the second motion.
- the information processing apparatus 10 sets a detectable range based on, for example, information about the state of the information processing apparatus 10.
- the information processing apparatus 10 sets the detectable range based on the information from the state detection unit 12.
- the information processing apparatus 10 does not necessarily have to perform correction in the reference direction. This is because the movement of holding the hand H is a gesture without a movement width, and therefore information in the reference direction is not always necessary for detecting the gesture.
- the "Quick Attention" is associated with the swinging motion of the hand H as the first motion, and is associated with the touch motion as the second motion.
- the information processing apparatus 10 does not have to correct the detectable range and the reference direction. This is because information on the detectable range and the reference direction is not always necessary for detecting the touch operation.
- Ambient Control is associated with a swing-up motion of the hand H as a first motion, and is associated with a motion of moving the hand H away as a second motion.
- the information processing apparatus 10 limits the detectable range based on, for example, the first operation. After limiting the detectable range based on the first operation, the information processing apparatus 10 may make a correction to widen the limited detectable range after detecting the start of the operation of the second operation, for example. As a result, the detection accuracy of the second operation is improved. Further, in this function, the information processing apparatus 10 corrects the reference direction based on, for example, the first operation.
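- The per-function behavior described above can be condensed into a policy table such as the hedged sketch below (field names are illustrative; "Cancel" takes its detectable range from the device-state information rather than from the first motion):

```python
# limit_range : limit the detectable range based on the first motion
# widen_second: widen the range once the start of the second motion is detected
# correct_ref : correct the reference direction (needed for gestures with a movement width)
CORRECTION_POLICY = {
    "Play/Pause":      dict(limit_range=True,  widen_second=False, correct_ref=False),
    "Answer Call/End": dict(limit_range=True,  widen_second=False, correct_ref=False),
    "Next/Prev":       dict(limit_range=True,  widen_second=True,  correct_ref=True),
    "Vol+/Vol-":       dict(limit_range=True,  widen_second=True,  correct_ref=True),
    "Cancel":          dict(limit_range=False, widen_second=False, correct_ref=False),
    "Quick Attention": dict(limit_range=False, widen_second=False, correct_ref=False),
    "Ambient Control": dict(limit_range=True,  widen_second=True,  correct_ref=True),
}
```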
- FIG. 18 is a flowchart showing a command execution process according to the present embodiment.
- the command execution process is a process in which the information processing apparatus 10 executes a command input by the user U using a gesture.
- the "command” is a command from the inside or the outside of the device for causing the information processing device 10 to execute a function. It is also possible to regard the "command” as a word indicating the function itself of the information processing apparatus 10, such as Play / Pause.
- the description of "command” appearing in the following explanation can be replaced with "function”.
- the command execution process is executed by the control unit 16 of the information processing device 10.
- the command execution process is started, for example, when the information processing apparatus 10 is turned on.
- the acquisition unit 161 of the control unit 16 acquires the sensor value from the input detection unit 11 and the state detection unit 12 (step S101). For example, the acquisition unit 161 acquires information about an object in the detection region OR from a non-contact type sensor such as a proximity sensor. Further, the acquisition unit 161 acquires information regarding contact with the housing surface from a contact-type sensor such as a touch sensor. Further, the acquisition unit 161 acquires information regarding the state of the information processing device 10 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
- the acquisition unit 161 acquires information regarding the gesture of the user U based on the sensor value acquired in step S101 (step S102). For example, the acquisition unit 161 acquires information on the operation of the user U regarding the gesture based on the sensor value from the input detection unit 11. More specifically, the acquisition unit 161 acquires information on the approaching motion in which the user U enters the hand H into the detectable range AR based on the sensor value of a non-contact type sensor such as a proximity sensor. If the information processing device 10 is a device worn on the ear of the user U, the approaching motion may be a swinging motion in which the user U swings the hand H toward the ear. In this case, the information on the approaching motion may be the information on the approaching direction AD of the user U's hand H into the detectable range AR.
- the acquisition unit 161 acquires information regarding the state of the information processing apparatus 10 based on the sensor value acquired in step S101 (step S103). For example, the acquisition unit 161 acquires information about the state of the information processing apparatus 10 based on the sensor value from the state detection unit 12. For example, the acquisition unit 161 acquires information regarding the direction of gravity based on information from an acceleration sensor, a gyro sensor, or a geomagnetic sensor.
- the correction unit 166 of the control unit 16 executes the correction process related to the gesture detection (step S104). For example, the correction unit 166 corrects at least one of the detectable range of the gesture and the reference direction for detecting the gesture.
- the corrections made by the correction unit 166 are roughly classified into (1) corrections based on information on gestures and (2) corrections based on information on device states.
- the correction unit 166 corrects at least one of the detectable range AR and the reference direction based on the information acquired in step S102. For example, the correction unit 166 corrects at least one of the detectable range AR and the reference direction based on the information of the operation of the user U.
- the action of the user U may be the swing-up action of the hand H.
- the swing-up motion may be an approach motion in which the user U enters the hand H into the detectable range AR.
- the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on the information of the approach direction AD of the user U's hand H to the detectable range AR.
- the information on the approaching motion may be the information on the approaching direction AD of the user U's hand H.
- the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on the information of the approach direction AD. More specifically, the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on the information regarding the mounting state of the information processing apparatus 10 on the user U, which is estimated based on the information of the approach direction AD.
- the information regarding the mounting state may be, for example, information regarding the mounting inclination.
- the mounting tilt is, for example, the tilt of the current information processing apparatus 10 from the standard mounting state of the information processing apparatus 10.
- the estimation of the mounting inclination may be performed by the estimation unit 165 of the control unit 16 based on the information of the approach direction AD. Then, the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on the estimation result of the estimation unit 165.
- the user's actions constituting one gesture may include a first action that triggers the start of the gesture and a second action following the first action.
- the first operation is a swing-up operation of the hand H of the user U.
- the correction unit 166 may make the detectable range for detecting the second motion wider than the detectable range AR used when the swing-up motion is detected. More specifically, when the gesture to be detected (the second motion) is a gesture with a movement width, the correction unit 166 may make the detectable range for detecting the second motion wider than the detectable range AR used when the swing-up motion is detected.
- the gesture accompanied by the movement width is, for example, a swipe or an action of moving the hand H away.
- the correction unit 166 corrects at least one of the detectable range AR and the reference direction based on the information acquired in step S103. For example, the correction unit 166 corrects at least one of the detectable range AR and the reference direction based on the information regarding the mounting state of the information processing apparatus 10 on the user U, which is estimated based on the information from the state detection unit 12.
- the mounting tilt is, for example, the tilt of the current information processing apparatus 10 from the standard mounting state of the information processing apparatus 10.
- the estimation of the mounting inclination may be performed by the estimation unit 165 of the control unit 16. Then, the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on the estimation result of the estimation unit 165.
- the correction unit 166 may also correct at least one of the detectable range AR and the reference direction based on the information regarding the posture of the user U estimated based on the information from the state detection unit 12.
- the information regarding the posture of the user U is information indicating that the user is lying on his back or the like.
- the information regarding the posture of the user U may be information in the direction of gravity detected by the acceleration sensor or the gyro sensor.
- the estimation unit 165 of the control unit 16 may estimate the posture of the user U.
- the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on the estimation result of the estimation unit 165.
- the gesture detection unit 162 of the control unit 16 determines whether or not the gesture has been detected (step S105). For example, the gesture detection unit 162 determines whether or not the first operation is detected in the detection area OR and then the second operation is also detected in the detection area OR. When the gesture is not detected (step S105: No), the gesture detection unit 162 returns the process to step S101.
- When the gesture is detected (step S105: Yes), the gesture detection unit 162 determines whether the detected gesture was executed within the detectable range AR (step S106). For example, the gesture detection unit 162 determines whether the first motion is detected in the detectable range AR and then the second motion is also detected in the detectable range AR. If the second motion is a gesture with a movement width, such as a left/right swipe, whether the gesture was executed within the detectable range AR may be determined by whether both the start position and the end position are within the detectable range AR.
- the detectable range AR for detecting the first operation and the detectable range AR for detecting the second operation may have different widths.
- For example, the control unit 16 may make the detectable range for detecting the second motion wider than the detectable range AR in which the swing-up motion was detected.
- At this time, the output control unit 164 may control the output unit 13 to notify the user that gesture detection is being accepted. The notification given by the output control unit 164 may be a sound.
- If the gesture is not executed within the detectable range (step S106: No), the gesture detection unit 162 returns the process to step S101.
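- The determination in step S106 for a gesture with a movement width can be pictured as follows, assuming for illustration that the detectable range AR is a rectangle in the sensor's detection region and that the sensor reports 2-D start and end positions of the swipe.

```python
from typing import Tuple

Point = Tuple[float, float]
Rect = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

def in_rect(p: Point, rect: Rect) -> bool:
    x, y = p
    x_min, y_min, x_max, y_max = rect
    return x_min <= x <= x_max and y_min <= y <= y_max

def swipe_within_detectable_range(start: Point, end: Point,
                                  detectable_range: Rect) -> bool:
    """Step S106 for a movement-width gesture: both the start and the end
    positions of the swipe must lie inside the detectable range AR."""
    return in_rect(start, detectable_range) and in_rect(end, detectable_range)

print(swipe_within_detectable_range((0.2, 0.3), (0.6, 0.3), (0.0, 0.0, 1.0, 1.0)))  # True
```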
- When the hand H of the user U enters the detectable range AR from the non-detectable range NR, the gesture detection unit 162 may refrain from detecting the gesture regardless of whether or not the second motion is performed within the detectable range AR.
- FIGS. 19A and 19B are diagrams showing an example of a case where the gesture detection unit 162 does not detect a gesture.
- the detectable range AR is unevenly arranged in the detection region OR so that a part of the edge of the detectable range AR touches a part of the edge of the detection region OR. More specifically, the detectable range AR is unevenly arranged on the front side of the user U.
- As shown in FIG. 19A, when the hand H of the user U enters the detectable range AR from the non-detectable range NR, it is assumed that the entry into the detectable range AR is not due to the swing-up motion. Therefore, even if the second motion is performed within the detectable range AR as shown in FIG. 19B, the gesture detection unit 162 does not detect the gesture.
- On the other hand, the gesture detection unit 162 detects the gesture on the condition that the hand H of the user U enters the detectable range AR directly, rather than from the non-detectable range NR, and that the second motion is then performed within the detectable range AR.
- FIGS. 20A and 20B are diagrams showing an example of a case where the gesture detection unit 162 detects a gesture.
- As shown in FIG. 20A, the hand H of the user U enters the detectable range AR directly, and then the second motion is performed as shown in FIG. 20B.
- At this time, the control unit 16 may set the detectable range for detecting the second motion to be wider than the detectable range AR at the time the first motion is detected (the detectable range AR shown in FIG. 20A).
- the gesture detection unit 162 detects the gesture.
- In this way, by not detecting the gesture when the hand H of the user U enters the detectable range AR from the non-detectable range NR, and detecting the gesture when the hand H enters the detectable range AR directly, the possibility of erroneous detection of the gesture is further reduced.
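- The behaviour shown in FIGS. 19 and 20 can be summarised as a small state machine: the gesture is accepted only when the hand is first observed inside the detectable range AR (a direct entry such as a swing-up) rather than drifting in from the non-detectable range NR. The sketch below is illustrative; sampling, time-outs, and hysteresis are omitted.

```python
class EntryTracker:
    """Tracks how the hand entered the detectable range AR.

    'direct'  : the hand was first observed inside AR (e.g. after a swing-up)
    'from_nr' : the hand was first observed in the non-detectable range NR
    """

    def __init__(self):
        self.entry = None

    def update(self, hand_visible: bool, inside_ar: bool) -> None:
        if not hand_visible:
            self.entry = None  # hand left the detection region OR entirely
        elif self.entry is None:
            self.entry = "direct" if inside_ar else "from_nr"

    def second_motion_accepted(self, inside_ar: bool) -> bool:
        # The second motion is accepted only for a direct entry into AR.
        return self.entry == "direct" and inside_ar

tracker = EntryTracker()
tracker.update(hand_visible=True, inside_ar=True)       # direct entry
print(tracker.second_motion_accepted(inside_ar=True))   # True
```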
- When the gesture is executed within the detectable range (step S106: Yes), the command execution unit 163 of the control unit 16 executes the command corresponding to the detected gesture (step S107). For example, the command execution unit 163 executes the function of the information processing apparatus 10 associated with the detected gesture.
- Then, the correction unit 166 executes a correction process related to gesture detection (step S108). For example, the correction unit 166 corrects at least one of the detectable range and the reference direction based on the information of the gesture detected by the gesture detection unit 162. For example, the correction unit 166 widens or narrows the detectable range AR based on the information on the start position and the end position of the swipe. Further, the correction unit 166 finely adjusts the reference direction based on the movement direction of the detected left/right swipe or up/down swipe.
- The control unit 16 then returns the process to step S101 and executes the processes of steps S101 to S108 again.
- As described above, since the information processing apparatus 10 corrects the detectable range, it is possible to reduce situations in which a motion that the user does not intend as a gesture is detected as a gesture. Further, since the information processing apparatus 10 corrects the reference direction, the gesture intended by the user is less likely to be mistaken for another gesture.
- Embodiment 2: In the first embodiment, the information processing apparatus 10 detects the movement of the user's hand H and executes a command. However, the device that detects the movement of the user's hand H and the device that executes the command may be different devices. Hereinafter, the information processing system 1 of the second embodiment will be described.
- FIG. 21 is a diagram showing a configuration example of the information processing system 1.
- the information processing system 1 is a system that executes various functions based on the gesture of the user. As shown in FIG. 21, the information processing system 1 includes an output device 20 and a terminal device 30. In the example of FIG. 21, the information processing system 1 includes an output device 20A and an output device 20B. In the example of FIG. 21, the output device 20 and the terminal device 30 are wirelessly connected, but the output device 20 and the terminal device 30 may be configured to be connectable by wire.
- the information processing system 1 may be provided with only one output device 20 or may be provided with a plurality of output devices 20.
- the output device 20 is an earphone
- the information processing system 1 may include a pair of output devices 20 that are wirelessly connected to the terminal device 30 and are attached to the left and right ears of the user U, respectively.
- the output device 20A is an earphone worn on the left ear of the user U
- the output device 20B is an earphone worn on the right ear of the user U.
- One output device 20 does not necessarily have to be one integrated device. A plurality of separate devices that are functionally or practically related can also be regarded as one output device 20. For example, a pair of left and right earphones worn on the left and right ears of the user U may be regarded as one output device 20. Of course, one output device 20 may be one integrated earphone worn on one ear of the user U, or one integrated pair of headphones worn on the left and right ears of the user U.
- The output device 20 is an acoustic output device that can be worn by the user, such as an earphone or headphones.
- The information processing device 10 may be a display device that can be worn by the user, such as AR glasses or MR glasses.
- the output device 20 is a device that can be worn by the user U, and is provided with a portion located at least in the ear of the user U when worn. Then, at least a detection unit for detecting an object is arranged in this portion.
- The detection unit is a functional block corresponding to the input detection unit 11 when the first embodiment is taken as an example. If there are two such portions, located at the left and right ears of the user U, each of the two portions may have a detection unit, or only one of them may have a detection unit.
- FIG. 22 is a diagram showing a configuration example of the output device 20 according to the embodiment of the present disclosure.
- the output device 20 includes an input detection unit 21, a state detection unit 22, an output unit 23, a communication unit 24, and a control unit 26.
- the configuration shown in FIG. 22 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the output device 20 may be distributed and implemented in a plurality of physically separated configurations.
- the input detection unit 21 is a detection unit that detects a user's input operation.
- the state detection unit 22 is a sensor unit that detects the state of the output device 20.
- the output unit 23 is an output interface that outputs information to the user.
- the communication unit 24 is a communication interface for communicating with other devices such as the terminal device 30.
- the control unit 26 is a controller that controls each unit of the output device 20.
- The input detection unit 21, the state detection unit 22, the output unit 23, the communication unit 24, and the control unit 26 have the same configurations as the input detection unit 11, the state detection unit 12, the output unit 13, the communication unit, and the control unit 16 included in the information processing device 10, respectively. In this case, the description of the information processing device 10 may be read as referring to the output device 20 as appropriate.
- Terminal device configuration: Next, the configuration of the terminal device 30 will be described.
- the terminal device 30 is an information processing terminal capable of communicating with the output device 20.
- the terminal device 30 is a kind of information processing device of the present embodiment.
- the terminal device 30 is, for example, a mobile phone, a smart device (smartphone or tablet), a PDA (Personal Digital Assistant), or a personal computer.
- the terminal device 30 may be a device such as a commercial camera equipped with a communication function, or may be a mobile body equipped with a communication device such as an FPU (Field Pickup Unit).
- the terminal device 30 may be an M2M (Machine to Machine) device or an IoT (Internet of Things) device.
- the terminal device 30 controls the output device 20 from the outside of the output device 20 via wire or wireless.
- FIG. 23 is a diagram showing a configuration example of the terminal device 30 according to the embodiment of the present disclosure.
- the terminal device 30 includes an input unit 31, a state detection unit 32, an output unit 33, a communication unit 34, a storage unit 35, and a control unit 36.
- the configuration shown in FIG. 23 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the terminal device 30 may be distributed and implemented in a plurality of physically separated configurations.
- the input unit 31 is an input interface that accepts user input operations.
- the input unit 31 is a button or a touch panel.
- the state detection unit 32 is a sensor unit that detects the state of the terminal device 30.
- the output unit 33 is an output interface that outputs information to the user.
- the communication unit 34 is a communication interface for communicating with other devices such as the output device 20.
- the storage unit 35 is a storage device capable of reading and writing data.
- the storage unit 35 stores, for example, information about the detectable range AR of the gesture.
- the control unit 36 is a controller that controls each unit of the terminal device 30.
- The control unit 36 includes an acquisition unit 361, a gesture detection unit 362, a command execution unit 363, an output control unit 364, an estimation unit 365, and a correction unit 366.
- The acquisition unit 361, the gesture detection unit 362, the command execution unit 363, the output control unit 364, the estimation unit 365, and the correction unit 366 have the same configurations as the acquisition unit 161, the gesture detection unit 162, the command execution unit 163, the output control unit 164, the estimation unit 165, and the correction unit 166 included in the control unit 16 of the information processing apparatus 10, except that information is acquired from the input detection unit 21 and the state detection unit 22 of the output device 20 via communication.
- the description of the information processing device 10 may be appropriately replaced with the output device 20 or the terminal device 30.
- the information processing system 1 can execute a command execution process in the same manner as the information processing apparatus 10 of the first embodiment.
- the command execution process of the information processing system 1 is the same as the command execution process of the first embodiment except that the terminal device 30 acquires the sensor value from the output device 20.
- the command execution process will be described with reference to FIG. 18 as in the first embodiment.
- FIG. 18 is a flowchart showing a command execution process according to the present embodiment.
- the command execution process is executed by the control unit 36 of the terminal device 30.
- the command execution process is executed, for example, when the terminal device 30 establishes communication with the output device 20.
- The acquisition unit 361 of the control unit 36 acquires the sensor values from the input detection unit 21 and the state detection unit 22 of the output device 20 via the communication unit 34 (step S101). For example, the acquisition unit 361 acquires information about an object in the detection region OR from the non-contact type sensor of the output device 20. Further, the acquisition unit 361 acquires information regarding contact with the housing surface of the output device 20 from the contact-type sensor of the output device 20. Further, the acquisition unit 361 acquires information regarding the state of the output device 20 from a sensor such as an acceleration sensor of the output device 20.
- the acquisition unit 361 acquires information regarding the gesture of the user U based on the sensor value acquired in step S101 (step S102). For example, the acquisition unit 361 acquires information on the operation of the user U regarding the gesture based on the sensor value from the input detection unit 21.
- the acquisition unit 361 acquires information regarding the state of the output device 20 based on the sensor value acquired in step S101 (step S103). For example, the acquisition unit 361 acquires information regarding the state of the output device 20 based on the sensor value from the state detection unit 22.
- The correction unit 366 of the control unit 36 executes the correction process related to gesture detection (step S104). For example, the correction unit 366 corrects at least one of the detectable range of the gesture and the reference direction for detecting the gesture.
- the gesture detection unit 362 of the control unit 36 determines whether or not the gesture has been detected (step S105). For example, the gesture detection unit 362 determines whether or not the first operation is detected in the detection area OR and then the second operation is also detected in the detection area OR. When the gesture is not detected (step S105: No), the gesture detection unit 362 returns the process to step S101.
- the gesture detection unit 362 determines whether the detected gesture is executed within the detectable range AR (step S106). For example, the gesture detection unit 362 determines whether the first motion is detected in the detectable range AR and then the second motion is detected in the detectable range AR.
- If the gesture is not executed within the detectable range (step S106: No), the gesture detection unit 362 returns the process to step S101.
- When the gesture is executed within the detectable range (step S106: Yes), the command execution unit 363 of the control unit 36 executes the command corresponding to the detected gesture (step S107). For example, the command execution unit 363 executes the function of the output device 20 associated with the detected gesture.
- Then, the correction unit 366 executes a correction process related to gesture detection (step S108). For example, the correction unit 366 corrects at least one of the detectable range and the reference direction based on the information of the gesture detected by the gesture detection unit 362.
- control unit 36 returns the process to step S101 and executes the processes of steps S101 to S108 again.
- the terminal device 30 since the terminal device 30 corrects the detectable range, it is possible to reduce the situation where even an operation not intended by the user as a gesture is detected as a gesture. Further, since the terminal device 30 corrects the reference direction, it is possible to reduce the situation where the gesture intended by the user U is mistaken for another gesture.
- the terminal device 30 automatically sets the detectable range AR based on the sensor value.
- the terminal device 30 may be configured so that the user U can manually set the detectable range AR.
- FIG. 24 is a diagram showing how the user U manually sets the size of the detectable range AR.
- the terminal device 30 is configured so that the user U can change the setting of the size of the detectable range AR by using the GUI (Graphical User Interface).
- the acquisition unit 361 of the control unit 36 acquires information on the detectable range AR from the storage unit 35.
- For example, the storage unit 35 stores information indicating the current setting of the detectable range AR.
- The acquisition unit 361 acquires the information from, for example, the storage unit 35. If information indicating the current setting of the detectable range AR is stored in another device (for example, the output device 20 or a device on the cloud), the acquisition unit 361 may acquire the information from that device via the communication unit 34.
- the output control unit 364 visualizes the current detectable range AR setting based on the acquired information so that the user can visually recognize it, and displays it on the output unit 33 (step S201).
- a part of the detection area OR is the detectable range AR.
- the acquisition unit 361 acquires the operation information of the user U from the input unit 31 of the terminal device 30 (step S202). For example, the acquisition unit 361 acquires information on the operation of the user U to change the size of the detectable range AR from the input unit 31.
- For example, the terminal device 30 includes a touch panel display as the input unit 31, and the acquisition unit 361 acquires the operation information of the user U for expanding the detectable range AR. More specifically, in the example of FIG. 24, the acquisition unit 361 acquires information indicating that the user U is performing a pinch-out with the hand H on the detectable range AR displayed on the touch panel display.
- the output control unit 364 visualizes the detectable range AR related to the change operation of the user U based on the information acquired in step S202 so that the user U can see it, and displays it on the output unit 33 (step S203).
- When the control unit 36 detects an operation of the user U instructing the change of the detectable range AR (for example, pressing the OK button), the control unit 36 updates the setting of the detectable range AR stored in the storage unit 35 based on the information of the detectable range AR displayed in step S203.
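- A minimal sketch of this setting flow, assuming the detectable range AR is stored as a circle (centre and radius within the detection region) and that the pinch-out gesture is reported as a scale factor; the JSON file stands in for the storage unit 35 and all values are placeholders.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class RangeSetting:
    center_x: float
    center_y: float
    radius: float

def apply_pinch(setting: RangeSetting, scale: float,
                min_radius: float = 0.05, max_radius: float = 1.0) -> RangeSetting:
    """Resize the detectable range AR by the pinch scale factor (>1 widens),
    clamped so that the range stays within a sensible part of the detection
    region OR."""
    new_radius = min(max(setting.radius * scale, min_radius), max_radius)
    return RangeSetting(setting.center_x, setting.center_y, new_radius)

def save_setting(setting: RangeSetting, path: str = "detectable_range.json") -> None:
    """Persist the confirmed setting (the role played by the storage unit 35)."""
    with open(path, "w") as f:
        json.dump(asdict(setting), f)

# Example: the user pinches out by a factor of 1.3 and presses OK.
current = RangeSetting(center_x=0.5, center_y=0.5, radius=0.3)
preview = apply_pinch(current, scale=1.3)
save_setting(preview)
```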
- the terminal device 30 may be configured so that the user U can manually change the position of the detectable range AR in the detection area OR.
- FIG. 25 is a diagram showing how the user U manually sets the position of the detectable range AR.
- the terminal device 30 is configured so that the user U can change the setting of the position of the detectable range AR by using the GUI.
- the processing of the terminal device 30 that enables the manual setting of the position of the detectable range AR will be described with reference to FIG. 25. The following processing is executed by the control unit 36 of the terminal device 30.
- the acquisition unit 361 of the control unit 36 acquires information on the detectable range AR from the storage unit 35. Then, the output control unit 364 visualizes the current detectable range AR setting based on the acquired information so that the user can visually recognize it, and displays it on the output unit 33 (step S301).
- a part of the detection area OR is the detectable range AR.
- the acquisition unit 361 acquires the operation information of the user U from the input unit 31 of the terminal device 30 (step S302). For example, the acquisition unit 361 acquires information on the operation of the user U to change the position of the detectable range AR from the input unit 31.
- For example, the terminal device 30 includes a touch panel display as the input unit 31, and the acquisition unit 361 acquires the operation information of the user U for changing the position of the detectable range AR. More specifically, in the example of FIG. 25, the acquisition unit 361 acquires information indicating that the user U has touched the detectable range AR displayed on the touch panel display and swiped the finger toward the position to which the range is to be moved.
- the output control unit 364 visualizes the detectable range AR related to the change operation of the user U based on the information acquired in step S302 so that the user U can see it, and displays it on the output unit 33 (step S303).
- When the control unit 36 detects an operation of the user U instructing the change of the detectable range AR (for example, pressing the OK button), the control unit 36 updates the setting of the detectable range AR stored in the storage unit 35 based on the information of the detectable range AR displayed in step S303.
- The detectable range AR or the non-detectable range NR may also be adjusted in conjunction with, for example, the voice recognition function of the terminal device 30. For example, when the user's utterance "widen the detectable range AR" is recognized, the detectable range AR may be widened step by step.
- the combination of gesture and function is predetermined.
- the terminal device 30 may be configured so that the user U can manually set the combination of the gesture and the function.
- the terminal device 30 may be configured to store the operation history information related to the gesture of the user U in the storage unit 35. Then, the terminal device 30 may aggregate the gestures frequently used by the user U and their success rates based on the information in the operation history, and display the aggregated results so that the user U can visually recognize them.
- FIG. 26 is a diagram showing how the gestures frequently used by the user U and their success rates are displayed. In the example of FIG. 26, it can be seen that the user U uses the swipe operation frequently, but the success rate of the swipe operation is low.
- the terminal device 30 may be configured so that the user U can change the association between the gesture and the function.
- the terminal device 30 may have a GUI for changing the association between the gesture and the function.
- For example, the user U can use the GUI to reassign the function associated with the swipe operation (for example, Next/Prev) to the pinch operation.
- the terminal device 30 may be configured to propose the detectable range AR to the user based on the information in the operation history. Further, the terminal device 30 may be configured so that the user can change the detectable range AR based on the proposal.
- By allowing the user U to manually set the association between the gesture and the function, the gesture can be associated with the function according to the individuality of the user U. As a result, the user U can associate gestures with high success rates with frequently used functions, so that false detection of gestures is reduced.
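- The reassignment described above could be driven by the aggregated operation history, as in the following sketch; the gesture names, functions, and the 60% success-rate threshold are example values, not values defined in this disclosure.

```python
from collections import defaultdict

class GestureStats:
    """Aggregates the operation history of gestures (attempts and successes)."""

    def __init__(self):
        self.attempts = defaultdict(int)
        self.successes = defaultdict(int)

    def record(self, gesture: str, success: bool) -> None:
        self.attempts[gesture] += 1
        self.successes[gesture] += int(success)

    def success_rate(self, gesture: str) -> float:
        n = self.attempts[gesture]
        return self.successes[gesture] / n if n else 0.0

def suggest_remapping(mapping: dict, stats: GestureStats, threshold: float = 0.6) -> dict:
    """If a used gesture has a low success rate, suggest swapping its function
    with that of the gesture having the highest success rate (the user would
    confirm the suggestion via the GUI)."""
    gestures = [g for g in mapping if stats.attempts[g] > 0]
    if not gestures:
        return dict(mapping)
    worst = min(gestures, key=stats.success_rate)
    best = max(gestures, key=stats.success_rate)
    suggested = dict(mapping)
    if worst != best and stats.success_rate(worst) < threshold:
        suggested[worst], suggested[best] = mapping[best], mapping[worst]
    return suggested

stats = GestureStats()
for ok in [True, False, False, False]:
    stats.record("swipe", ok)
for ok in [True, True, True]:
    stats.record("pinch", ok)
print(suggest_remapping({"swipe": "Next/Prev", "pinch": "Play/Pause"}, stats))
```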
- Embodiment 3: In the first and second embodiments, the non-contact sensor is mainly used to determine the user's operation on the device. In the third embodiment, a contact sensor and a non-contact sensor are combined to determine the user's device operation. Hereinafter, the information processing apparatus 40 of the third embodiment will be described.
- In gesture detection using a non-contact sensor, the range in which gestures can be detected changes according to the orientation and posture of the user's face, so the detection accuracy tends to be low. This is because the position of a gesture is determined absolutely by the location of the sensor, whereas the user recognizes the position of the gesture only relatively, regardless of the posture state. To solve this problem, the detectable range of gestures could be expanded. However, if the detectable range of the gesture is expanded unnecessarily, the information processing apparatus may mistakenly recognize other, easily confused motions as the gesture. Therefore, it is desirable to limit the detectable range as much as possible.
- Device operation by gesture places a small load on the user. However, gestures vary greatly from person to person (that is, the gesture position is not reproducible), so the accuracy of device operation is low. On the other hand, in device operation by physical operation, the operation surface is limited, so the operation area becomes narrow and the commands that can be input are reduced.
- the contact sensor and the non-contact sensor are combined. More specifically, the information processing apparatus 40 of the third embodiment makes a correction related to gesture detection or a correction related to device operation based on the sensor information from the contact sensor and the sensor information from the non-contact sensor.
- the information processing apparatus 40 detects the user's gesture using at least the sensor information from the contact sensor, and estimates the gesture position at that time based on the sensor information from the non-contact sensor. Then, the information processing apparatus 40 corrects the detectable range of the gesture based on the information of the estimated gesture position. As a result, the detectable range of the gesture is updated at any time according to the usage status of the user, so that the detection accuracy of the gesture is improved.
- the information processing device 40 may be configured to detect the user's gesture by using at least the sensor information from the non-contact sensor. Then, the information processing device 40 may correct the detectable range of the gesture based on the sensor information from the non-contact sensor when the user touches the face or the information processing device 40. As a result, the detectable range of the gesture is corrected based on the position of the physical contact, so that the detection accuracy of the gesture can be improved.
- the information processing device 40 may be configured to detect the user's gesture using at least the sensor information from the contact sensor. Then, the information processing apparatus 40 may be configured to execute a command corresponding to the detected gesture.
- the command may include at least a command with an operation amount.
- the information processing apparatus 40 may determine the operation amount based on the sensor information from the non-contact sensor. As a result, the information processing apparatus 40 can determine the operation amount by using the non-contact sensor while realizing highly accurate gesture detection by using the contact sensor.
- FIG. 27 is a diagram showing a configuration example of the information processing apparatus 40 according to the third embodiment.
- the information processing device 40 includes an input detection unit 41, a state detection unit 42, an output unit 43, a communication unit 44, a storage unit 45, and a control unit 46.
- the configuration shown in FIG. 27 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the information processing apparatus 40 may be distributed and implemented in a plurality of physically separated configurations. For example, as shown in the second embodiment, the functions of the information processing apparatus 40 may be distributed and mounted in the output apparatus 20 and the terminal apparatus 30.
- the input detection unit 41 is a detection unit that detects a user's input operation.
- The input detection unit 41 includes a non-contact type detection unit (hereinafter referred to as a non-contact sensor) that detects an object without contact, and a contact type detection unit (hereinafter referred to as a contact sensor) that detects contact with the user's body or with an object.
- the non-contact sensor may be composed of one or a plurality of proximity sensors that detect an object located in the detection area.
- the proximity sensor may be an optical sensor or a capacitance type sensor.
- An optical proximity sensor is a sensor that detects the light reflected by an object.
- the capacitance type proximity sensor is a sensor that detects a change in capacitance that occurs between an object and the sensor.
- the non-contact sensor may be a 2D sensor or a 3D sensor.
- The contact sensor includes, for example, one or a plurality of touch sensors, and detects an object that comes into contact with a predetermined location of the information processing apparatus 40.
- The predetermined location is, for example, the side surface of the earphone (the surface opposite to the speaker).
- When the information processing apparatus 40 is a headphone-type device, the predetermined location is, for example, the side surface of the headphones (the surface opposite to the speaker).
- Hereinafter, the side surface of the earphone or the side surface of the headphones may be referred to as the housing surface.
- For example, the input detection unit 41 is provided with a touch pad TP as a contact-type sensor.
- The touch pad TP is composed of a plurality of touch sensors arranged in a plane.
- The infrared emitter ID is arranged on the housing surface. The input detection unit 41 detects an object in contact with the surface of the touch pad TP.
- The contact sensor is not limited to a sensor that detects contact with the device.
- The contact sensor may be a sensor that detects contact with the body of the user who wears the information processing device 40.
- For example, the contact sensor may be a sensor that detects contact with the face of the user wearing the information processing device 40.
- The contact sensor may be configured to detect contact with the user's face, for example, by detecting the vibration produced when the user touches his or her face with a hand.
- the state detection unit 42 is a sensor unit that detects the state of the information processing device 40.
- the state detection unit 42 may be configured to detect the user's state.
- The control unit 46 includes a first acquisition unit 461A, a second acquisition unit 461B, a gesture detection unit 462, a command execution unit 463, an output control unit 464, an estimation unit 465, and a correction unit 466.
- Each block (first acquisition unit 461A to correction unit 466) constituting the control unit 46 is a functional block indicating a function of the control unit 46.
- These functional blocks may be software blocks or hardware blocks.
- each of the above-mentioned functional blocks may be one software module realized by software (including a microprogram), or may be one circuit block on a semiconductor chip (die).
- each functional block may be one processor or one integrated circuit. The method of configuring the functional block is arbitrary.
- The control unit 46 may be configured in functional units different from the above-mentioned functional blocks. Further, another device may perform a part or all of the operations of each block (first acquisition unit 461A to correction unit 466) constituting the control unit 46. For example, a terminal device such as a music player or a smartphone may perform some or all of the operations of each block constituting the control unit 46. The operation of each block constituting the control unit 46 will be described later.
- The input detection unit 41, the state detection unit 42, the output unit 43, the communication unit 44, and the control unit 46 have the same configurations as the input detection unit 11, the state detection unit 12, the output unit 13, the communication unit, and the control unit 16 included in the information processing device 10, respectively.
- the contact sensor and the non-contact sensor are combined. More specifically, the information processing apparatus 40 makes a correction related to gesture detection or a correction related to device operation based on the sensor information from the contact sensor and the sensor information from the non-contact sensor. Some specific examples are shown below.
- Example 1: First, Example 1 will be described.
- Example 1 assumes a gesture due to physical contact with the user's face or device.
- FIG. 28 is a diagram showing a user performing a gesture by physical contact.
- In the example of FIG. 28, the detectable range A1 of the contact sensor is set to a part of the left cheek of the user, and the detectable range A2 of the non-contact sensor is set to the vicinity of the left cheek, including a part of the left cheek of the user.
- a part of the detectable range A1 and a part of the detectable range A2 overlap.
- the user operates the device by tapping the cheek.
- In Example 1, the information processing apparatus 40 estimates the operation area (gesture position) at the time of contact based on the sensor information from the non-contact sensor. As a result, the information processing apparatus 40 can determine in which area within the detectable range A1 the user performed the gesture.
- Then, the information processing apparatus 40 corrects the detectable range of the gesture based on the information of the estimated gesture position. For example, the information processing apparatus 40 increases the size of the detectable range A1 when the operation area (gesture position) covers a range equal to or larger than a predetermined ratio of the current detectable range A1. Further, when the operation area (gesture position) is located at the edge of the current detectable range A1, the position of the detectable range A1 is corrected so that the operation area comes to its center. As a result, the detectable range A1 of the gesture is updated as needed according to the usage situation of the user, so that the detection accuracy of the gesture is improved.
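- A schematic version of this correction, assuming for illustration that the detectable range A1 and the estimated operation area are modelled as circles on the cheek; the coverage ratio, edge margin, and enlargement factor are placeholder values.

```python
import math
from dataclasses import dataclass

@dataclass
class Circle:
    x: float
    y: float
    r: float

def correct_range_a1(a1: Circle, gesture_area: Circle,
                     coverage_ratio: float = 0.7, edge_margin: float = 0.2) -> Circle:
    """Update the contact-sensor detectable range A1 from the estimated
    operation area: enlarge A1 when the operation area covers most of it,
    and re-center A1 when the operation area sits near its edge."""
    corrected = Circle(a1.x, a1.y, a1.r)
    if gesture_area.r >= coverage_ratio * a1.r:
        corrected.r = a1.r * 1.2                      # operation area almost fills A1 -> widen
    dist = math.hypot(gesture_area.x - a1.x, gesture_area.y - a1.y)
    if dist >= (1.0 - edge_margin) * a1.r:
        corrected.x, corrected.y = gesture_area.x, gesture_area.y   # re-center on the operation area
    return corrected

print(correct_range_a1(Circle(0.0, 0.0, 1.0), Circle(0.9, 0.0, 0.8)))
```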
- the user's actions constituting one gesture may include a first action that triggers the start of the gesture and a second action following the first action.
- the information processing apparatus 40 may narrow the detectable range of the second operation based on the information of the operation area (gesture position) of the first operation. This makes it possible to prevent erroneous detection of gestures due to daily activities.
- the information processing device 40 executes a command corresponding to the detected gesture.
- Further, even if the gestures detected based on the sensor information from the contact sensor are the same, the information processing apparatus 40 may change the command to be executed when the gesture positions are different. That is, the information processing apparatus 40 may determine the command to be executed based on both the gesture detected based on the sensor information from the contact sensor and the gesture position information estimated based on the sensor information from the non-contact sensor. As a result, the information processing apparatus 40 can assign many commands to one gesture while realizing highly accurate gesture detection by physical contact.
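- The idea of letting the same contact gesture trigger different commands depending on where it was performed can be expressed as a lookup keyed on both pieces of information, as in the sketch below; the regions, gestures, and commands are illustrative.

```python
from typing import Optional, Tuple

COMMANDS = {
    ("tap", "cheek_front"): "play_pause",
    ("tap", "cheek_rear"): "next_track",
    ("double_tap", "cheek_front"): "previous_track",
}

def region_of(position: Tuple[float, float], boundary_x: float = 0.5) -> str:
    """Very rough split of the estimated gesture position (from the
    non-contact sensor) into a front and a rear region of the cheek."""
    x, _y = position
    return "cheek_front" if x < boundary_x else "cheek_rear"

def command_for(gesture: str, position: Tuple[float, float]) -> Optional[str]:
    """gesture: detected from the contact sensor; position: estimated from
    the non-contact sensor at the time of contact."""
    return COMMANDS.get((gesture, region_of(position)))

print(command_for("tap", (0.3, 0.4)))   # play_pause
print(command_for("tap", (0.8, 0.4)))   # next_track
```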
- Note that the information processing apparatus 40 can also detect the gesture made by physical contact using the non-contact sensor alone.
- FIG. 29 is a flowchart showing an example of the command execution process according to the third embodiment.
- the command execution process is executed by the control unit 46 of the information processing apparatus 40.
- the command execution process is executed, for example, when the information processing apparatus 40 is turned on.
- The second acquisition unit 461B of the information processing apparatus 40 acquires the sensor value of the non-contact sensor (step S201). Then, the estimation unit 465 of the information processing apparatus 40 estimates the gesture position based on the sensor value of the non-contact sensor (step S202). Further, the first acquisition unit 461A of the information processing apparatus 40 acquires the sensor value of the contact sensor (step S203).
- the gesture detection unit 462 of the information processing apparatus 40 determines whether or not the gesture has been detected based on the sensor value of the contact sensor (step S204). When the gesture is not detected (step S204: No), the gesture detection unit 462 returns the process to step S201.
- When the gesture is detected (step S204: Yes), the estimation unit 465 of the information processing apparatus 40 estimates the device state (step S205). Then, the gesture detection unit 462 of the information processing apparatus 40 determines the gesture based on the estimation result of the device state and the sensor value of the contact sensor (step S206). Then, the correction unit 466 of the information processing apparatus 40 corrects the detectable range of the gesture (the detectable range A1 in the example of FIG. 28) (step S207). After that, the command execution unit 463 executes the command assigned to the determined gesture (step S208). After executing the command, the command execution unit 463 returns the process to step S201.
- Example 2: Next, Example 2 will be described.
- FIG. 30 is a diagram showing a user performing a gesture by aerial motion.
- The detectable range A1 of the contact sensor is set to the information processing device 40 and its surroundings, and the detectable range A2 of the non-contact sensor is set at a position separated to the left from the user's head.
- the detectable range A1 and the detectable range A2 are separated from each other and do not overlap.
- the user operates the device by performing a predetermined aerial operation on the left side of the head.
- The gesture position may vary greatly depending on the user's condition. Therefore, when detection of the non-contact gesture does not succeed because of this variation in the gesture position, the information processing apparatus 40 urges the user to touch his or her own body, an object, or an arbitrary point in space. For example, the information processing device 40 prompts the user to tap the face or to physically touch the information processing device 40.
- Then, the information processing apparatus 40 calibrates the detectable range A2 of the gesture based on this physical touch by the user. For example, the information processing apparatus 40 corrects the detectable range A2 of the gesture based on the sensor information from the non-contact sensor at the time when the user touches a body, an object, or an arbitrary point in space. More specifically, the information processing apparatus 40 detects contact with the user's body, an object, or an arbitrary point in space based on the sensor information from the contact sensor, and corrects the detectable range A2 of the gesture based on the sensor information from the non-contact sensor at that time. As a result, the detectable range A2 of the gesture is corrected based on the position of the physical contact, so that the detection accuracy of the gesture is improved.
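- This calibration step can be pictured as follows: when the contact sensor reports the prompted touch, the hand position seen by the non-contact sensor at that instant becomes the new anchor of the detectable range A2. The fixed offset toward the expected aerial-gesture position is an assumption made only for this sketch.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Range3D:
    x: float
    y: float
    z: float
    radius: float

def calibrate_a2(a2: Range3D,
                 hand_at_contact: Tuple[float, float, float],
                 expected_offset: Tuple[float, float, float] = (0.0, 0.15, 0.0)) -> Range3D:
    """When the user physically touches the face or the device (detected by
    the contact sensor), re-anchor the non-contact detectable range A2 at the
    touch-time hand position plus a nominal offset toward where aerial
    gestures are expected to be performed."""
    hx, hy, hz = hand_at_contact          # hand position from the non-contact sensor
    ox, oy, oz = expected_offset
    return Range3D(hx + ox, hy + oy, hz + oz, a2.radius)

print(calibrate_a2(Range3D(0.0, 0.3, 0.0, 0.2), hand_at_contact=(0.05, 0.10, 0.02)))
```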
- The information processing apparatus 40 may provide feedback to the user by sound or the like when the user's hand is located in the corrected detectable range A2 after detecting the contact with the user's body or an object. Further, when the gesture cannot be detected by the non-contact sensor due to the posture state or context of the user (for example, walking or running), the information processing apparatus 40 may give feedback encouraging a physical operation when it recognizes the user's hand in the detectable range A2 of the non-contact sensor.
- FIG. 31 is a flowchart showing another example of the command execution process according to the third embodiment.
- the command execution process is executed by the control unit 46 of the information processing apparatus 40.
- the command execution process is executed, for example, when the information processing apparatus 40 is turned on.
- the second acquisition unit 461B of the information processing apparatus 40 acquires the sensor value of the non-contact sensor (step S301). Then, the gesture detection unit 462 of the information processing apparatus 40 determines whether or not the gesture has been detected based on the sensor value of the non-contact sensor (step S302).
- When the gesture is not detected (step S302: No), the first acquisition unit 461A of the information processing apparatus 40 acquires the sensor value of the contact sensor (step S303). Then, the correction unit 466 of the information processing apparatus 40 executes an operation related to calibration of the detectable range A2 of the gesture (step S304). For example, the correction unit 466 prompts the user to tap the face or to physically touch the information processing device 40. The correction unit 466 then corrects the detectable range A2 of the gesture based on the sensor information from the non-contact sensor at the time when the user touches his or her body or an object (step S305). When the correction is completed, the correction unit 466 returns the process to step S301.
- When the gesture is detected (step S302: Yes), the estimation unit 465 of the information processing apparatus 40 estimates the device state (step S306). Then, the gesture detection unit 462 of the information processing apparatus 40 determines the gesture based on the estimation result of the device state and the sensor value of the contact sensor (step S307). Then, the command execution unit 463 executes the command assigned to the determined gesture (step S308). After executing the command, the command execution unit 463 returns the process to step S301.
- Example 3: Next, Example 3 will be described.
- FIG. 32 is a diagram showing how a command accompanied by an operation amount is input.
- For example, the command is a volume control command (Vol+/Vol-), and the operation amount is the volume.
- The movement direction of the user's hand corresponds to the change direction of the volume (whether the volume is increased or decreased), and the movement amount or movement speed of the user's hand corresponds to the change amount (input value) of the volume.
- In Example 3, the contact sensor and the non-contact sensor are combined so that not only a command but also an operation amount (a change direction and a change amount) can be input.
- the information processing apparatus 40 detects the user's gesture using at least the sensor information from the contact sensor. Then, the information processing apparatus 40 determines the operation amount based on the sensor information from the non-contact sensor. As a result, the information processing apparatus 40 can determine the operation amount by using the non-contact sensor while realizing highly accurate gesture detection by using the contact sensor.
- the user's actions constituting one gesture may include a first action that triggers the start of the gesture and a second action following the first action.
- the command is a volume control command
- the first action is, for example, a face tap
- the second action is, for example, a hand movement in a predetermined direction (for example, in the front-back direction) from the face tap.
- the information processing apparatus 40 may determine the volume to be changed based on the moving direction, the moving amount, and / or the moving speed of the second operation.
- the information processing apparatus 40 makes corrections related to the second operation based on information on the speed and magnitude of the first operation (for example, the speed of the hand until the face is tapped and the strength of the face tap).
- the information processing device 40 determines the speed of the first operation based on the information from the non-contact sensor. Then, the information processing apparatus 40 corrects a set value (for example, a threshold value) for associating the second operation with the operation amount based on the information of the speed of the first operation. After that, the information processing apparatus 40 determines the speed or the amount of movement of the second operation based on the sensor information from the non-contact sensor. Then, the information processing apparatus 40 determines the operation amount based on the corrected set value and the information of the speed or the movement amount of the second operation. As a result, the information processing apparatus 40 can improve the input accuracy of the user's operation amount.
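- One possible reading of this correction, in sketch form: the speed of the first operation scales the threshold used to convert the movement of the second operation into volume steps. The linear scaling, the nominal speed, and the step conversion are assumptions, not values defined in this disclosure.

```python
def corrected_threshold(base_threshold: float, first_motion_speed: float,
                        nominal_speed: float = 0.3) -> float:
    """A fast first operation (e.g. a quick approach before the face tap)
    suggests a fast user, so the movement threshold for one volume step is
    scaled up proportionally; a slow user gets a smaller threshold."""
    return base_threshold * max(first_motion_speed / nominal_speed, 0.5)

def volume_steps(movement: float, threshold: float) -> int:
    """Convert the signed movement amount of the second operation
    (forward positive) into a number of Vol+ / Vol- steps."""
    return int(movement / threshold)

thr = corrected_threshold(base_threshold=0.05, first_motion_speed=0.45)
print(volume_steps(movement=0.12, threshold=thr))   # -> 1 (one Vol+ step)
```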
- a user with a fast gesture input speed may bring his / her hand to a distant place, so it is desirable that the range in which the gesture can be detected is wide.
- a user with a slow gesture input speed is unlikely to bring his / her hand to a distant place. Therefore, considering false detection, it is desirable that the range in which the gesture can be detected is narrow. Therefore, the information processing apparatus 40 makes a correction regarding the detection of the second operation based on the information of the first operation of the user.
- the information processing device 40 determines the speed of the first operation based on the sensor information from the non-contact sensor. Then, the information processing apparatus 40 corrects the detectable range of the second operation based on the information of the speed of the first operation. As a result, the detection accuracy of the second operation can be made high.
- the information processing apparatus 10 of the first embodiment may change the detectable range AR according to the distance to an obstacle.
- FIG. 33 is a diagram showing how the detectable range AR is changed according to the distance to the obstacle.
- The information processing apparatus 10 includes a pair of input detection units 11R and 11L.
- The obstacle O1 is located in the right direction (Y-axis minus direction) of the input detection unit 11R, and the obstacle O2 is located in the left direction (Y-axis plus direction) of the input detection unit 11L.
- the obstacle may be a structure such as a wall, or may be an object independent of the structure.
- The input detection unit 11R and the input detection unit 11L each detect the distance to an obstacle.
- The distance to the obstacle O1 is d1, and the distance to the obstacle O2 is d2.
- the information processing apparatus 10 adjusts the detectable range AR so that an obstacle does not enter the detectable range AR.
- For example, the information processing apparatus 10 corrects the detectable range ARR of the input detection unit 11R so that its maximum distance in the separation direction (Y-axis minus direction) becomes d3, which is shorter than d1.
- Similarly, the information processing apparatus 10 corrects the detectable range ARL of the input detection unit 11L so that its maximum distance in the separation direction (Y-axis plus direction) becomes d4, which is shorter than d2.
- The information processing apparatus 10 may correct the detectable range ARR and the detectable range ARL so that d3 and d4 are the same distance, or may correct them so that d3 and d4 are different distances. Further, the information processing apparatus 10 may correct only one of the detectable range ARR and the detectable range ARL. If the position of the obstacle and/or the user U is known in advance, the information processing apparatus 10 may correct the detectable range AR based on that position information.
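- A sketch of this obstacle-aware adjustment: the maximum distance of each detectable range is clamped below the measured distance to the obstacle on that side, with a small margin. The default distance, the margin, and the example values for d1 and d2 are placeholders.

```python
def clamp_max_distance(default_max_m: float, obstacle_distance_m: float,
                       margin_m: float = 0.05) -> float:
    """Correct the detectable range so that the obstacle does not enter it:
    the range may not reach farther than the obstacle minus a margin."""
    return min(default_max_m, max(obstacle_distance_m - margin_m, 0.0))

# Right side: obstacle O1 at d1, left side: obstacle O2 at d2 (values illustrative).
d1, d2 = 0.25, 0.60
ar_r_max = clamp_max_distance(default_max_m=0.40, obstacle_distance_m=d1)  # 0.20 (d3)
ar_l_max = clamp_max_distance(default_max_m=0.40, obstacle_distance_m=d2)  # 0.40, already shorter than d2
print(ar_r_max, ar_l_max)
```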
- the information processing apparatus 10 of the first embodiment may include a pair of predetermined parts located in both ears of the user U when worn.
- The predetermined portion may be a portion from which the sound of the earphone or the headphones is output. Then, when the swing-up motion of the user U is detected at both of the pair of predetermined portions, the information processing apparatus 10 may refrain from executing the function associated with the gesture even if the gesture is detected.
- the output device 20 of the second embodiment may include a pair of predetermined parts located in both ears of the user U when worn.
- The predetermined portion may be a portion from which the sound of the earphone or the headphones is output. Then, when the swing-up motion of the user U is detected at both of the pair of predetermined portions, the terminal device 30 of the second embodiment may refrain from executing the function associated with the gesture even if the gesture is detected.
- the information processing device 10 or the terminal device 30 can reduce operations that do not meet the user's intention.
- the function of the information processing apparatus 10 of the first embodiment may include a predetermined function accompanied by an operation amount.
- The predetermined function accompanied by an operation amount may be a function related to the volume operation (Vol+/Vol-) or a function related to the playback speed.
- the predetermined function accompanied by the operation amount is not limited to these, and may be, for example, a function related to fast forward, rewind, and slow playback.
- a gesture with a movement width such as a swipe, may be associated with a predetermined function.
- Then, when the detected gesture is a gesture with a movement width, the information processing apparatus 10 may determine the operation amount of the predetermined function (for example, the amount by which the volume is raised or lowered) based on the relative movement width of the gesture with respect to the size of the detectable range.
- Similarly, the function of the terminal device 30 of the second embodiment may include a predetermined function accompanied by an operation amount, and a gesture with a movement width, such as a swipe, may be associated with the predetermined function. Then, when the gesture detected by the gesture detection unit 362 is a gesture with a movement width, the terminal device 30 may determine the operation amount of the predetermined function (for example, the amount by which the volume is raised or lowered) based on the relative movement width of the gesture with respect to the size of the detectable range.
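- The relative-movement-width idea can be sketched as follows: the swipe length is normalised by the current size of the detectable range, so the same physical swipe corresponds to a larger operation amount when the range is narrower. The step granularity and coordinates are assumed for illustration.

```python
def operation_amount(swipe_start: float, swipe_end: float,
                     range_min: float, range_max: float,
                     max_steps: int = 10) -> int:
    """Map the relative movement width of a swipe (the fraction of the
    detectable range it spans) to an operation amount, such as the number
    of volume steps to raise or lower."""
    range_size = range_max - range_min
    if range_size <= 0:
        return 0
    relative = (swipe_end - swipe_start) / range_size
    return round(relative * max_steps)

print(operation_amount(swipe_start=0.10, swipe_end=0.34, range_min=0.0, range_max=0.6))  # -> 4
```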
- the information processing device 10 and the output device 20 are devices that can be worn by the user (wearable devices), but are not necessarily wearable devices.
- the information processing device 10 and the output device 20 may be devices installed and used in a structure or a moving body such as a television, a car navigation system, a driver's cab, and various operation panels. Further, the information processing device 10 and the output device 20 may be the mobile body itself.
- the mobile body may be a mobile terminal such as a smartphone, a mobile phone, a personal computer, a music player, or a portable TV, or may be a remote controller for operating a device.
- The moving body may be a moving body that moves on land (for example, a vehicle such as a car, a bicycle, a bus, a truck, a motorcycle, a train, or a linear motor car), or a moving body that moves in the ground (for example, a subway moving through a tunnel).
- The moving body may be a moving body that moves on water (for example, a ship such as a passenger ship, a cargo ship, or a hovercraft), or a moving body that moves underwater (for example, a submersible, a submarine, or an unmanned submarine). Further, the moving body may be a moving body that moves in the atmosphere (for example, an aircraft such as an airplane, an airship, or a drone), or a moving body that moves outside the atmosphere (for example, an artificial celestial body such as a space station).
- the structure is, for example, a high-rise building, a house, a steel tower, a station facility, an airport facility, a port facility, a stadium, or the like.
- the concept of structure includes not only buildings but also structures such as tunnels, bridges, dams, walls, and iron pillars, and equipment such as cranes, gates, and windmills.
- the concept of a structure includes not only a structure on land or in the ground, but also a structure on water such as a pier and a mega float, and a structure underwater such as an ocean observation facility.
- the information processing apparatus 40 detects the user's gesture using both the non-contact sensor and the contact sensor, but it is not always necessary to continue using both the non-contact sensor and the contact sensor. For example, the information processing apparatus 40 may decide whether to continue using both the non-contact sensor and the contact sensor according to the remaining battery level, or to use either the non-contact sensor or the contact sensor. Further, the information processing apparatus 40 may change the settings related to the sensor to be used when there is a special environmental change such as wearing a hat.
- the information processing apparatus 40 corrects the detectable range of the sensor based on the sensor value, but the detectable range may be corrected by using information other than the sensor value.
- the information processing apparatus 40 may correct the detectable range according to the type of the device of the information processing apparatus 40 (headphones / earphones / head-mounted display, etc.). For example, when the type of the device of the information processing apparatus 40 is headphones, there is a margin in the area where the user can physically contact, so the correction of the detectable range A1 is made small.
- Further, the information processing apparatus 40 may estimate the state of the user by using another sensor (for example, a biological sensor such as a heart rate sensor), and correct the detectable range based on the estimation result.
- the information processing apparatus 40 corrects the detectable range of the sensor based on the sensor value, but may be configured so that the user can set the detectable range.
- FIGS. 34A and 34B are diagrams showing an example of a GUI (Graphical User Interface) for setting the detectable range.
- The GUI shown in FIG. 34A is a GUI for setting the detectable range in the plane direction (the front-back and left-right directions with respect to the user's ear E), and the GUI shown in FIG. 34B is a GUI for setting the detectable range in the depth direction (the direction away from the user's ear E).
- In FIG. 34A, the area A3 corresponds to, for example, the detectable range A1 of the contact sensor, and the area A4 corresponds to, for example, the detectable range A2 of the non-contact sensor.
- In FIG. 34B, the area A3 corresponds to, for example, the detectable range A2 of the non-contact sensor, and the area A4 corresponds to, for example, the detection area OR of the non-contact sensor.
- the user sets the detectable range of the sensor using the GUI shown in FIGS. 34A and 34B.
- the information processing device 40 may visualize the difference in the operation area between the headphones and the earphones. Further, this GUI may be used as a tutorial when the user switches devices.
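To make the idea of user-configurable detectable ranges concrete, here is a minimal sketch of a settings container such a GUI might populate; all field names and default values are hypothetical assumptions.

```python
# Hypothetical sketch: a container for the detectable range a user sets on a GUI
# such as the ones of FIGS. 34A and 34B. All field names and values are illustrative.
from dataclasses import dataclass

@dataclass
class DetectableRangeSetting:
    front_mm: float   # extent in front of the ear E (plane direction, FIG. 34A)
    back_mm: float    # extent behind the ear E
    up_mm: float
    down_mm: float
    depth_mm: float   # extent away from the ear E (depth direction, FIG. 34B)

def contains(setting: DetectableRangeSetting, x_mm: float, z_mm: float, d_mm: float) -> bool:
    """True if a hand position (x: front/back, z: up/down, d: depth) lies in the user-set range."""
    return (-setting.back_mm <= x_mm <= setting.front_mm
            and -setting.down_mm <= z_mm <= setting.up_mm
            and 0.0 <= d_mm <= setting.depth_mm)

headphone_default = DetectableRangeSetting(60, 60, 50, 70, 40)
print(contains(headphone_default, x_mm=10, z_mm=-20, d_mm=15))  # True
```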
- control device for controlling the information processing device 10, the output device 20, or the terminal device 30 of the present embodiment may be realized by a dedicated computer system or a general-purpose computer system.
- a communication program for executing the above operation is stored and distributed in a computer-readable recording medium such as an optical disk, a semiconductor memory, a magnetic tape, or a flexible disk.
- the control device is configured by installing the program in a computer and executing the above-mentioned processing.
- the control device may be an external device (for example, a personal computer) of the information processing device 10, the output device 20, or the terminal device 30.
- the control device may be an internal device (for example, control unit 16, control unit 26, or control unit 36) of the information processing device 10, the output device 20, or the terminal device 30.
- the above communication program may be stored in a disk device provided in a server device on a network such as the Internet so that it can be downloaded to a computer or the like.
- the above-mentioned functions may be realized by the collaboration between the OS (Operating System) and the application software.
- the part other than the OS may be stored in a medium and distributed, or the part other than the OS may be stored in the server device so that it can be downloaded to a computer or the like.
- each component of each device shown in the figures is a functional concept and does not necessarily have to be physically configured as shown. That is, the specific form of distribution and integration of each device is not limited to the one shown in the figures, and all or part of each device may be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
- the present embodiment can also be implemented as any configuration constituting a device or a system, for example, as a processor as a system LSI (Large Scale Integration), a module using a plurality of processors, a unit using a plurality of modules, or a set in which other functions are further added to a unit (that is, a configuration of a part of a device).
- the system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a device in which a plurality of modules are housed in one housing, are both systems.
- the present embodiment can have a cloud computing configuration in which one function is shared by a plurality of devices via a network and jointly processed.
- the information processing apparatus 10 and the terminal apparatus 30 correct the detectable range of the gesture based on the information regarding the gesture and the information regarding the device state. Therefore, the information processing device 10 and the terminal device 30 can reduce situations in which an operation that the user does not intend as a gesture is detected as a gesture.
- the information processing apparatus 10 and the terminal apparatus 30 correct the reference direction for detecting the gesture based on the information regarding the gesture and the information regarding the device state. Therefore, the information processing device 10 and the terminal device 30 can reduce erroneous detection such as detecting the left / right swipe as the up / down swipe.
- the present technology can also have the following configurations.
- (1) An information processing device comprising:
- a gesture detection unit that detects a user's gesture performed in a detection area of a detection unit that detects an object; and
- a correction unit that corrects at least one of a detectable range of the gesture in the detection area and a reference direction for detecting the gesture, based on at least one of information about the gesture and information about a state of a predetermined device including the detection unit.
- the correction unit corrects at least one of the detectable range and the reference direction based on the information of the user's movement regarding the gesture.
- the information processing apparatus according to (1) above.
- the user's action information regarding the gesture includes information on a first action that causes the user's hand to enter the detectable range.
- the correction unit corrects at least one of the detectable range and the reference direction based on the information of the first operation.
- The information processing device according to (2) above.
- the correction unit corrects the detectable range to be narrower than at least the detection region based on the information of the first operation.
- the information processing apparatus according to (3) above.
- the user's actions constituting one gesture include the first action and a second action following the first action, and after the first action is detected in the detectable range, the correction unit makes the detectable range for detecting the second action wider than the detectable range at the time when the first action was detected. The information processing apparatus according to (4) above.
- the correction unit makes the detectable range for detecting the second motion wider than the detectable range at the time when the first motion is detected. The information processing apparatus according to (5) above.
- the information of the first operation includes information indicating the approach direction of the hand entering the detectable range.
- the correction unit corrects at least one of the detectable range and the reference direction based on the information indicating the approach direction of the hand.
- the information processing apparatus according to any one of (3) to (6).
- the predetermined device can be worn by the user.
- the correction unit corrects at least one of the detectable range and the reference direction based on the information regarding the wearing state of the predetermined device to the user, which is estimated based on the information indicating the approach direction of the hand.
- the information processing apparatus corrects at least one of the detectable range and the reference direction based on the information regarding the mounting inclination of the predetermined device estimated based on the information indicating the approach direction of the hand.
- the information processing apparatus according to (8) above.
- the predetermined device comprises a predetermined portion located in the user's ear when worn.
- the detection unit is arranged at the predetermined portion, and
- the correction unit corrects at least one of the detectable range and the reference direction based on the information of the swinging motion in which the user raises his / her hand toward his / her ear.
- the information processing apparatus according to (8) or (9).
- further comprising an execution unit that executes the function associated with the gesture.
- the predetermined device comprises a pair of the predetermined parts located in both ears of the user when worn. When the swing-up motion is detected in both of the pair of the predetermined parts, the execution unit does not execute the function associated with the gesture even if the gesture is detected.
- the information processing apparatus according to (10) above.
- the detectable range is arranged in the detection area so that a part of the edge of the detectable range is in contact with a part of the edge of the detection area.
- the gesture detection unit does not detect the gesture when the user's hand enters from a range other than the detectable range in the detection area.
- the information processing apparatus according to any one of (3) to (11).
- further comprising an execution unit that executes the function associated with the gesture.
- the above-mentioned functions include at least a predetermined function accompanied by an operation amount.
- a predetermined gesture with a movement width is associated with the predetermined function.
- the execution unit determines the operation amount based on the magnitude of the movement width of the predetermined gesture with respect to the detectable range. The information processing apparatus according to any one of (1) to (12).
- the correction unit corrects at least one of the detectable range and the reference direction based on the information from the state detection unit that detects the state of the predetermined device.
- the information processing apparatus according to any one of (1) to (13).
- the predetermined device can be worn by the user.
- the correction unit corrects at least one of the detectable range and the reference direction based on the information regarding the wearing state of the predetermined device to the user, which is estimated based on the information from the state detection unit.
- the predetermined device can be worn by the user.
- the correction unit corrects at least one of the detectable range and the reference direction based on the information regarding the posture of the user estimated based on the information from the state detection unit.
- the state detection unit includes at least one of an acceleration sensor, a gyro sensor, and a biosensor.
- the correction unit corrects at least one of the detectable range and the reference direction based on the information from one or a plurality of sensors included in the state detection unit.
- the information processing apparatus according to any one of (14) to (16).
- the predetermined device is a headphone or an earphone.
- the information processing apparatus according to any one of (1) to (17).
- the information processing device is the predetermined device, or a device that controls the predetermined device from outside the predetermined device via wire or wirelessly.
- The information processing apparatus according to any one of (1) to (18).
- (20) An information processing method comprising: detecting a user's gesture performed in a detection area of a detection unit that detects an object; and correcting at least one of a detectable range of the gesture in the detection area and a reference direction for detecting the gesture, based on at least one of information about the gesture and information about a state of a predetermined device including the detection unit.
- a first acquisition unit that acquires sensor information from a contact sensor that detects contact with the user's human body or object
- a second acquisition unit that acquires sensor information from the non-contact sensor that detects the user's movement including the user's non-contact movement
- a gesture detection unit that detects a gesture of the user for operating a device based on sensor information from at least one of the contact sensor and the non-contact sensor.
- a correction unit that makes a correction related to the detection of the gesture or a correction related to the operation of the device, and a correction unit.
- the gesture detection unit detects the gesture of the user using at least the sensor information from the contact sensor.
- the correction unit corrects the detectable range of the gesture based on information on the gesture position detected based on the sensor information from the non-contact sensor, the gesture position indicating the position where the user has performed the gesture.
- It is equipped with an execution unit that executes a command corresponding to the detected gesture.
- the execution unit determines a command to be executed based on the gesture detected based on the sensor information from the contact sensor and the gesture position information detected based on the sensor information from the non-contact sensor. , The information processing apparatus according to (22) above.
- the gesture detection unit detects the gesture of the user using at least the sensor information from the non-contact sensor.
- the correction unit corrects the detectable range of the gesture based on the sensor information from the non-contact sensor when the user comes into contact with the human body or the object.
- the information processing apparatus according to (21).
- the correction unit detects the user's contact with the human body or the object based on the sensor information from the contact sensor, and corrects the detectable range of the gesture based on the sensor information from the non-contact sensor when the user's contact with the human body or the object is detected. The information processing apparatus according to (24).
- the information processing apparatus is provided with an output control unit that gives feedback to the user when the user's hand is positioned in the corrected detectable range after the contact of the user with the human body or the object is detected.
- the information processing apparatus according to (25) above.
- It is equipped with a command execution unit that executes a command corresponding to the detected gesture.
- the command includes at least a command with an operation amount.
- the gesture detection unit detects the gesture of the user using at least the sensor information from the contact sensor.
- the command execution unit determines the operation amount based on the sensor information from the non-contact sensor.
- the information processing apparatus according to (21).
- the user's actions constituting one gesture include a first action that triggers the start of the gesture and a second action following the first action.
- the command execution unit determines the operation amount based on the speed or movement amount of the second operation determined based on the sensor information from the non-contact sensor.
- the correction unit changes a set value for associating the second operation with the operation amount based on the information of the speed of the first operation detected based on the sensor information from the non-contact sensor.
- the information processing apparatus according to (27).
- the correction unit corrects the detectable range of the second operation based on the information of the speed of the first operation detected based on the sensor information from the non-contact sensor.
- the information processing apparatus according to (28) above.
- (31) An information processing program for causing a computer to function as: a gesture detection unit that detects a user's gesture performed in a detection area of a detection unit that detects an object; and a correction unit that corrects at least one of a detectable range of the gesture in the detection area and a reference direction for detecting the gesture, based on at least one of information about the gesture and information about a state of a predetermined device including the detection unit.
- (32) An information processing program for causing a computer to function as:
- a first acquisition unit that acquires sensor information from a contact sensor that detects contact with the user's body or an object,
- a second acquisition unit that acquires sensor information from a non-contact sensor that detects the user's movement, including the user's non-contact movement,
- a gesture detection unit that detects a gesture of the user for operating a device based on sensor information from at least one of the contact sensor and the non-contact sensor.
- a correction unit that corrects the detection of the gesture or the operation of the device based on the sensor information from the contact sensor and the sensor information from the non-contact sensor.
- Information processing system 10 40 Information processing device 10A Earphone 10B Headphones 11 , 11 L , 11 R , 21, 41 Input detection unit 12, 22, 32, 42 State detection unit 13, 23, 33, 43 Output unit 14, 24 , 34, 44 Communication unit 15, 35, 45 Storage unit 16, 26, 36, 46 Control unit 20, 20A, 20B Output device 30 Terminal device 31 Input unit 161, 361 Acquisition unit 162, 362, 462 Gesture detection unit 163, 363, 463 Command execution unit 164, 364, 464 Output control unit 165, 365, 465 Guessing unit 166, 366, 466 Correction unit 461A First acquisition unit 461B Second acquisition unit AD Approach direction OR Detection area AR, A1, A2 Detectable range A3, A4 area NR Non-detectable range B1 to B4 Reference direction D1 to D4 Device direction U User H Hand O1, O2 Obstacles
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
This information processing device comprises: a gesture detection unit that detects a user gesture performed in a detection region of a detection unit that detects objects; and a correction unit that, on the basis of information relating to the gesture and/or information relating to the state of a prescribed device comprising the detection unit, corrects a detectable range of the gesture in the detection region and/or a reference direction for detecting the gesture.
Description
本開示は、情報処理装置、及び情報処理方法に関する。
This disclosure relates to an information processing device and an information processing method.
機器操作に関する様々な技術が開発されている。例えば、近年では、ジェスチャによる機器操作を可能にする技術の開発が活発になされている。
Various technologies related to device operation have been developed. For example, in recent years, there has been active development of technology that enables device operation by gestures.
ジェスチャによる機器操作には、ユーザが気軽に機器を操作できるようになるという利点がある一方で、誤検出が発生しやすいという問題がある。例えば、ジェスチャによる機器操作には、ユーザの無意識の動作を誤ってジェスチャとして検出したり、ユーザのジェスチャを当該ユーザが意図するジェスチャとは別のジェスチャとして検出したり、といった問題が発生することがある。
While device operation by gesture has the advantage that the user can easily operate the device, there is a problem that false detection is likely to occur. For example, device operation by gesture may cause problems such as erroneously detecting a user's unconscious movement as a gesture, or detecting a user's gesture as a gesture different from the gesture intended by the user. be.
そこで、本開示では、精度の高いジェスチャ検出を実現可能な情報処理装置、及び情報処理方法を提案する。
Therefore, in this disclosure, we propose an information processing device and an information processing method that can realize highly accurate gesture detection.
上記の課題を解決するために、本開示に係る一形態の情報処理装置は、物体を検出する検出部の検出領域で行われるユーザのジェスチャを検出するジェスチャ検出部と、前記ジェスチャに関する情報、及び前記検出部を備える所定の機器の状態に関する情報、の少なくとも一方の情報に基づいて、前記検出領域内の前記ジェスチャの検出可能範囲、及び前記ジェスチャの検出のための基準方向、の少なくとも一方を補正する補正部と、を備える。
In order to solve the above-mentioned problems, an information processing apparatus according to one embodiment of the present disclosure includes a gesture detection unit that detects a user's gesture performed in the detection area of a detection unit that detects an object, and a correction unit that corrects at least one of the detectable range of the gesture in the detection area and the reference direction for detecting the gesture, based on at least one of information about the gesture and information about the state of a predetermined device including the detection unit.
以下に、本開示の実施形態について図面に基づいて詳細に説明する。なお、以下の各実施形態において、同一の部位には同一の符号を付することにより重複する説明を省略する。
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In each of the following embodiments, the same parts are designated by the same reference numerals, so that overlapping description will be omitted.
また、本明細書及び図面において、実質的に同一の機能構成を有する複数の構成要素を、同一の符号の後に異なるアルファベットを付して区別する場合もある。例えば、実質的に同一の機能構成を有する複数の要素を、必要に応じて出力装置20A、及び20Bのように区別する。ただし、実質的に同一の機能構成を有する複数の要素各々を特に区別する必要がない場合、同一符号のみを付する。例えば、出力装置20A、及び20Bを特に区別する必要が無い場合には、単に出力装置20と称する。
Further, in the present specification and the drawings, a plurality of components having substantially the same functional configuration may be distinguished by adding different alphabets after the same reference numerals. For example, a plurality of elements having substantially the same functional configuration are distinguished as needed, such as output devices 20A and 20B. However, if it is not necessary to distinguish each of a plurality of elements having substantially the same functional configuration, only the same reference numerals are given. For example, when it is not necessary to distinguish between the output devices 20A and 20B, the output devices 20A and 20B are simply referred to as output devices 20.
また、以下に示す項目順序に従って本開示を説明する。
1.本開示の概要
1-1.課題
1-2.解決手段
2.実施形態1
2-1.情報処理装置の構成
2-2.情報処理装置が検出可能なジェスチャ
2-3.情報処理装置の動作
3.実施形態2
3-1.情報処理システムの構成
3-2.出力装置の構成
3-3.端末装置の構成
3-4.情報処理システムの動作
4.実施形態3
4-1.実施形態3の概要
4-2.情報処理装置の構成
4-3.情報処理装置の動作
5.変形例
6.むすび
In addition, the present disclosure will be described according to the order of items shown below.
1. Outline of the present disclosure
1-1. Issues
1-2. Solution
2. Embodiment 1
2-1. Configuration of the information processing device
2-2. Gestures that can be detected by the information processing device
2-3. Operation of the information processing device
3. Embodiment 2
3-1. Configuration of the information processing system
3-2. Configuration of the output device
3-3. Configuration of the terminal device
3-4. Operation of the information processing system
4. Embodiment 3
4-1. Outline of Embodiment 3
4-2. Configuration of the information processing device
4-3. Operation of the information processing device
5. Modification examples
6. Conclusion
<<1.本開示の概要>>
まず、本開示の概要を説明する。 << 1. Summary of the present disclosure >>
First, the outline of the present disclosure will be described.
<1-1.課題>
近年、ジェスチャによる機器操作を可能にする技術の開発が活発になされている。例えば、近年では、ヘッドホンやイヤホン等、ユーザが装着可能な機器へのコマンドの入力操作(例えば、Next/Prevの操作)を、ユーザがジェスチャを使って行うことを可能にする技術が開発されている。 <1-1. Challenges>
In recent years, the development of technology that enables device operation by gesture has been actively carried out. For example, in recent years, a technique has been developed that enables a user to perform a command input operation (for example, a Next / Prev operation) on a device that can be worn by the user, such as headphones or earphones, by using a gesture.
ジェスチャ検出が可能な機器の多くは、物体を非接触で検出可能な検出部(例えば、近接センサ)を備え、検出部の検出結果(例えば、センサ値)に基づいてユーザのジェスチャを検出する。なお、以下の説明では、ジェスチャ検出が可能な機器のことを、単に情報処理装置と呼ぶことがある。
Many devices capable of gesture detection are equipped with a detection unit (for example, a proximity sensor) that can detect an object in a non-contact manner, and detect a user's gesture based on the detection result (for example, a sensor value) of the detection unit. In the following description, a device capable of gesture detection may be simply referred to as an information processing device.
図1及び図2は、ユーザがジェスチャを行う様子を示す図である。図1には、本実施形態の情報処理装置の一例として、イヤホン10Aが示されている。また、図2には、本実施形態の情報処理装置の一例として、ヘッドホン10Bが示されている。ジェスチャによる機器操作は、情報処理装置近くの空間上でユーザが所定のジェスチャを行うことにより行われる。図1及び図2の例では、イヤホン10A及びヘッドホン10Bはユーザの耳に装着されており、ユーザUは、自身の耳の近くの空間上で機器操作のためのジェスチャを行っている。図1の例では、ユーザUは、手Hを前から後ろに移動させるジェスチャを行っており、図2の例では、ユーザUは、手Hを後ろから前に移動させるジェスチャを行っている。
FIGS. 1 and 2 are diagrams showing how the user makes a gesture. FIG. 1 shows an earphone 10A as an example of the information processing apparatus of the present embodiment. Further, FIG. 2 shows headphones 10B as an example of the information processing apparatus of the present embodiment. The device operation by the gesture is performed by the user performing a predetermined gesture in the space near the information processing device. In the examples of FIGS. 1 and 2, the earphone 10A and the headphones 10B are attached to the user's ear, and the user U makes a gesture for operating the device in the space near his / her ear. In the example of FIG. 1, the user U makes a gesture of moving the hand H from the front to the back, and in the example of FIG. 2, the user U makes a gesture of moving the hand H from the back to the front.
なお、本実施形態では、手を移動させる動作には、指を移動させる動作を含むものとする。以下の説明で登場する「手」の記載は、適宜「指」に置き換え可能である。
In the present embodiment, the motion of moving the hand includes the motion of moving the finger. The description of "hand" appearing in the following explanation can be replaced with "finger" as appropriate.
また、以下の説明では、理解を容易にするため、説明にXYZ座標系を用いることがある。ここで、X軸方向、Y軸方向、Z軸方向は、いずれもユーザを基準として定まる方向である。X軸プラス方向はユーザUの正面方向であり、X軸マイナス方向はユーザUの背面方向である。また、Y軸プラス方向はユーザUの左手側方向であり、Y軸マイナス方向は、ユーザUの右手側方向である。また、Z軸プラス方向は上方向であり、Z軸マイナス方向は、下方向である。本実施形態では、ユーザUは立位或いは座位にあるものとし、X軸及びY軸は水平方向、Z軸は鉛直方向(重力方向)であるものとする。なお、以下の説明で登場するX軸方向、Y軸方向、及びZ軸方向は、ユーザの姿勢に合わせて適宜読み替えてもよい。例えば、ユーザが仰向けに寝ている場合は、X軸方向が鉛直方向となり、Y軸及びX軸方向が水平方向となる。
Also, in the following explanation, the XYZ coordinate system may be used for the explanation in order to facilitate understanding. Here, the X-axis direction, the Y-axis direction, and the Z-axis direction are all directions determined with reference to the user. The X-axis plus direction is the front direction of the user U, and the X-axis minus direction is the back direction of the user U. Further, the Y-axis plus direction is the left-hand side direction of the user U, and the Y-axis minus direction is the right-hand side direction of the user U. Further, the Z-axis plus direction is the upward direction, and the Z-axis minus direction is the downward direction. In the present embodiment, the user U is assumed to be in a standing or sitting position, the X-axis and the Y-axis are in the horizontal direction, and the Z-axis is in the vertical direction (gravity direction). The X-axis direction, Y-axis direction, and Z-axis direction appearing in the following description may be appropriately read according to the posture of the user. For example, when the user is lying on his back, the X-axis direction is the vertical direction, and the Y-axis and the X-axis directions are the horizontal direction.
また、以下の説明では、ユーザUが手Hを水平方向(X軸方向)に移動させるジェスチャのことを左右スワイプということがある。また、以下の説明では、ユーザUが手Hを鉛直方向(Z軸方向)に移動させるジェスチャのことを上下スワイプということがある。
Further, in the following explanation, the gesture in which the user U moves the hand H in the horizontal direction (X-axis direction) may be referred to as a left / right swipe. Further, in the following description, the gesture in which the user U moves the hand H in the vertical direction (Z-axis direction) may be referred to as a vertical swipe.
上述したように、情報処理装置は、検出部での物体の検出結果に基づいてユーザのジェスチャを検出する。情報処理装置は、ジェスチャを検出すると、当該ジェスチャに対応する機能(例えば、Next/PrevやVol+/Vol-)を実行する。
As described above, the information processing device detects the user's gesture based on the detection result of the object in the detection unit. When the information processing apparatus detects a gesture, it executes a function corresponding to the gesture (for example, Next / Prev or Vol + / Vol−).
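As a simple illustration of executing the function associated with a detected gesture, the sketch below maps gesture labels to device functions such as Next / Prev and Vol+ / Vol−; the label strings and the particular mapping are assumptions made for illustration only.

```python
# Hypothetical sketch: dispatch a detected gesture to a device function such as
# Next/Prev or Vol+/Vol-. The gesture labels and the mapping are illustrative assumptions.
GESTURE_TO_FUNCTION = {
    "swipe_front_to_back": "next_track",   # e.g., the gesture of FIG. 1 (assumed)
    "swipe_back_to_front": "prev_track",   # e.g., the gesture of FIG. 2 (assumed)
    "swipe_up": "volume_up",
    "swipe_down": "volume_down",
}

def execute_gesture(gesture):
    """Return the name of the function to execute for a detected gesture, or None."""
    return GESTURE_TO_FUNCTION.get(gesture)

print(execute_gesture("swipe_up"))  # volume_up
```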
ジェスチャによる機器操作には、ユーザが気軽に機器を操作できるようになるという利点がある一方で、誤検出が発生しやすいという問題がある。ここで、情報処理装置が、ユーザが装着可能な機器であるとする。この場合、誤検出の原因として、(1-1-1)検出部の検出領域の位置の変化、(1-1-2)情報処理装置の向きの変化、が想定される。以下、これらを原因とした誤検出について詳細に説明する。なお、以下の説明では、ユーザが装着可能な機器のことを、省略して装着可能機器ということがある。
While device operation by gesture has the advantage that the user can easily operate the device, there is a problem that false detection is likely to occur. Here, it is assumed that the information processing device is a device that can be worn by the user. In this case, it is assumed that the causes of the erroneous detection are (1-1-1) a change in the position of the detection area of the detection unit and (1-1-2) a change in the orientation of the information processing apparatus. Hereinafter, erroneous detections caused by these will be described in detail. In the following description, a device that can be worn by the user may be abbreviated as a wearable device.
(1-1-1)検出部の検出領域の位置の変化
情報処理装置が装着可能機器の場合、情報処理装置の装着向きによっては、検出部の検出領域(動作領域ともいう。)の位置が変化する。図3A及び図3Bは、情報処理装置の装着向きによって、検出領域の位置が変化する様子を示す図である。図3Aは、標準的な向きにイヤホン10Aが装着された様子を示す図であり、図3Bは、イレギュラーな向きにイヤホン10Aが装着された様子を示す図である。 (1-1-1) Change in the position of the detection area of the detection unit When the information processing device is a mountable device, the position of the detection area (also referred to as the operating area) of the detection unit may change depending on the mounting direction of the information processing device. Change. 3A and 3B are diagrams showing how the position of the detection region changes depending on the mounting direction of the information processing apparatus. FIG. 3A is a diagram showing a state in which theearphone 10A is attached in a standard orientation, and FIG. 3B is a diagram showing a state in which the earphone 10A is attached in an irregular orientation.
図3Aの例では、標準的な向きにイヤホン10Aが装着されているので、検出領域ORは、Y軸プラス方向から見て、ユーザUの耳の位置を中心とした一定の範囲に位置している。つまり、図3Aの例では、装置の開発者が意図した範囲に検出領域ORが位置している。一方、図3Bの例では、イレギュラーな向きにイヤホン10Aが装着されているので、検出領域ORは、ユーザUの耳の若干下から側頭部にかけた範囲に位置している。つまり、図3Bの例では、装置の開発者が意図した範囲に検出領域ORが位置していない。言い換えると、ユーザが耳の近くでジェスチャをしようと思ったときに、図3Aの例では、ユーザの感覚に合った位置に検出領域ORがあるのに対し、図3Bの例では、ユーザの感覚に合わない位置に検出領域ORがある。
In the example of FIG. 3A, since the earphone 10A is worn in the standard orientation, the detection region OR is located in a certain range centered on the position of the ear of the user U when viewed from the Y-axis plus direction. That is, in the example of FIG. 3A, the detection region OR is located in the range intended by the developer of the apparatus. On the other hand, in the example of FIG. 3B, since the earphone 10A is worn in an irregular orientation, the detection region OR is located in a range extending from slightly below the ear of the user U to the temporal region. That is, in the example of FIG. 3B, the detection region OR is not located in the range intended by the developer of the apparatus. In other words, when the user tries to make a gesture near the ear, the detection region OR in the example of FIG. 3A is at a position that matches the user's sense, whereas the detection region OR in the example of FIG. 3B is at a position that does not match the user's sense.
ユーザの感覚に合わない位置に検出領域ORがある場合、ユーザUがジェスチャとして意図していない動作まで、ジェスチャとして検出されてしまう。例えば、ユーザUが頭を掻こうとして手を側頭部に持っていく動作は、図3Bの例では検出領域ORの中での動作となってしまう可能性が高い。そうなると、当該動作は、場合によっては、情報処理装置にジェスチャとして検出されてしまう。
If the detection area OR is located at a position that does not suit the user's sense, even an operation that the user U does not intend as a gesture will be detected as a gesture. For example, the action of the user U trying to scratch his head and bringing his hand to the temporal region is likely to be an action within the detection region OR in the example of FIG. 3B. In that case, the operation may be detected as a gesture by the information processing apparatus.
また、ユーザUの感覚に合わない位置に検出領域ORがある場合、ユーザUがジェスチャを意図して行った動作が、場合によっては、検出領域ORの外での動作となってしまう。例えば、図3Bの例では、イヤホン10Aのやや下側で行われたジェスチャは、ユーザの耳の近くでの動作であるものの、検出領域ORの外での動作となる可能性が高い。この場合、情報処理装置は、ユーザの動作をジェスチャとして検出できない。ユーザUのジェスチャが検出領域ORの外での動作となることを防ぐため、検出部の感度を高めて検出領域ORを広くすることが想起される。しかし、この場合、ユーザUの耳の下方向はよいものの、他の方向では、ユーザUの感覚に合わない位置にまで検出領域ORが広がることになる。そうなると、上述の問題と同様の問題が発生する。
Further, when the detection area OR is located at a position that does not suit the user U's sense, an operation that the user U performs with the intention of making a gesture may end up being an operation outside the detection area OR. For example, in the example of FIG. 3B, a gesture performed slightly below the earphone 10A is an operation near the user's ear, but is likely to fall outside the detection region OR. In this case, the information processing device cannot detect the user's action as a gesture. In order to prevent the gesture of the user U from being performed outside the detection area OR, one conceivable approach is to increase the sensitivity of the detection unit and widen the detection area OR. However, in this case, although this works below the user U's ear, in the other directions the detection region OR extends to positions that do not suit the user U's sense. In that case, the same problem as described above occurs.
なお、図3A及び図3Bには、イヤホン型の情報処理装置が示されているが、検出部の検出領域の位置の変化は、情報処理装置がイヤホン型以外の装置の場合も発生しうる。例えば、検出部の検出領域の位置の変化は、情報処理装置が例えば図2に示したようなヘッドホン型の装置の場合にも発生しうる。そのため、情報処理装置がイヤホン型以外の装置の場合もこの(1-1-1)で説明したのと同様の問題が発生する。
Although the earphone type information processing device is shown in FIGS. 3A and 3B, the change in the position of the detection area of the detection unit may occur even when the information processing device is a device other than the earphone type. For example, the change in the position of the detection area of the detection unit may occur even when the information processing device is a headphone type device as shown in FIG. 2, for example. Therefore, when the information processing device is a device other than the earphone type, the same problem as described in this (1-1-1) occurs.
(1-1-2)情報処理装置の向きの変化
情報処理装置が装着可能機器の場合、情報処理装置の装着向きによっては、ユーザUの向きに対する情報処理装置の向きが変化する。図4A及び図4Bは、情報処理装置の装着向きによって、情報処理装置の向きが変化する様子を示す図である。図4Aは、標準的な装着向きにヘッドホン10Bが装着された様子を示す図であり、図4Bは、イレギュラーな向きにヘッドホン10Bが装着された様子を示す図である。 (1-1-2) Change in orientation of information processing device When the information processing device is a mountable device, the orientation of the information processing device with respect to the orientation of the user U changes depending on the mounting orientation of the information processing device. 4A and 4B are diagrams showing how the orientation of the information processing apparatus changes depending on the mounting orientation of the information processing apparatus. FIG. 4A is a diagram showing a state in which theheadphones 10B are worn in a standard wearing direction, and FIG. 4B is a diagram showing a state in which the headphones 10B are worn in an irregular direction.
ここで、図中のD1~D4は、装置を基準とした方向である。より具体的には、図中のD1~D4は、標準的な装着向きに情報処理装置が装着されたときの、上方向、下方向、前方向、及び後方向を示している。D1が上方向であり、D2が下方向であり、D3が前方向であり、D4が後方向である。以下の説明では、D1~D4のことを装置方向という。
Here, D1 to D4 in the figure are directions with respect to the device. More specifically, D1 to D4 in the figure indicate the upward, downward, forward, and backward directions when the information processing apparatus is mounted in the standard mounting orientation. D1 is upward, D2 is downward, D3 is forward, and D4 is backward. In the following description, D1 to D4 are referred to as device directions.
図4Aの例では、標準的な向きにヘッドホン10Bが装着されているので、鉛直方向(Z軸方向)と装置方向D1、D2は一致しており、また、水平方向(X軸方向)と装置方向D3、D4も一致している。一方、図4Bの例では、イレギュラーな向きにヘッドホン10Bが装着されているので、鉛直方向(Z軸方向)と装置方向D1、D2は一致しておらず、また、水平方向(X軸方向)と装置方向D3、D4も一致していない。このように、情報処理装置が装着可能機器の場合、ユーザUの向き(例えば、X軸方向、Z軸方向)に対する情報処理装置の向き(例えば、装置方向D3、D4、装置方向D1、D2)が変化する。
In the example of FIG. 4A, since the headphones 10B are worn in the standard orientation, the vertical direction (Z-axis direction) matches the device directions D1 and D2, and the horizontal direction (X-axis direction) matches the device directions D3 and D4. On the other hand, in the example of FIG. 4B, since the headphones 10B are worn in an irregular orientation, the vertical direction (Z-axis direction) does not match the device directions D1 and D2, and the horizontal direction (X-axis direction) does not match the device directions D3 and D4 either. As described above, when the information processing device is a wearable device, the orientation of the information processing device (for example, the device directions D1 to D4) changes with respect to the orientation of the user U (for example, the X-axis direction and the Z-axis direction).
ここで、ユーザが情報処理装置への入力操作として左右スワイプを行ったとする。一般的に、情報処理装置は、標準装着時の水平方向(装置方向D3、D4)を基準に、ユーザが行ったジェスチャが左右スワイプか否かを判断する。図4Aの例では、水平方向(X軸方向)と標準装着時の情報処理装置の水平方向(装置方向D3、D4)とが一致している。そのため、情報処理装置は、問題なく、ユーザのジェスチャが左右スワイプであると検出できる。
Here, it is assumed that the user swipes left or right as an input operation to the information processing device. Generally, the information processing apparatus determines whether or not the gesture performed by the user is a left / right swipe based on the horizontal direction (device directions D3, D4) at the time of standard mounting. In the example of FIG. 4A, the horizontal direction (X-axis direction) and the horizontal direction (device directions D3, D4) of the information processing apparatus at the time of standard mounting coincide with each other. Therefore, the information processing apparatus can detect that the user's gesture is a left / right swipe without any problem.
しかしながら、図4Bの例では、水平方向(X軸方向)と標準装着時の情報処理装置の水平方向(装置方向D3、D4)とが一致していない。そのため、ユーザが左右スワイプを意図してジェスチャを行ったとしても、情報処理装置は、そのジェスチャが左右スワイプであると検出できない。図4Bの例では、ユーザの手の動きの方向は、標準装着時の情報処理装置の鉛直方向(装置方向D1、D2)に近い。そのため、図4Bの例では、情報処理装置は、ユーザが上下スワイプを行ったと誤って検出する。
However, in the example of FIG. 4B, the horizontal direction (X-axis direction) and the horizontal direction of the information processing device at the time of standard mounting (device directions D3 and D4) do not match. Therefore, even if the user intentionally makes a gesture with a left / right swipe, the information processing apparatus cannot detect that the gesture is a left / right swipe. In the example of FIG. 4B, the direction of the movement of the user's hand is close to the vertical direction (device directions D1 and D2) of the information processing apparatus at the time of standard mounting. Therefore, in the example of FIG. 4B, the information processing apparatus erroneously detects that the user has swiped up and down.
なお、図4Bには、ヘッドホン型の情報処理装置が示されているが、情報処理装置の向きの変化は、情報処理装置がヘッドホン型以外の装置の場合にも発生しうる。例えば、情報処理装置の向きの変化は、情報処理装置がイヤホン型の装置の場合にも発生しうる。そのため、情報処理装置がヘッドホン型以外の装置の場合もこの(1-1-2)で説明したのと同様の問題が発生する。
Although a headphone type information processing device is shown in FIG. 4B, a change in the orientation of the information processing device may occur even when the information processing device is a device other than the headphone type. For example, the change in the orientation of the information processing device may occur even when the information processing device is an earphone type device. Therefore, when the information processing device is a device other than the headphone type, the same problem as described in this (1-1-2) occurs.
<1-2.解決手段>
そこで、本実施形態では、以下の手段により上述の(1-1-1)及び(1-1-2)の問題を解決する。解決手段としては、(1-2-1)ジェスチャの検出可能範囲の補正、(1-2-2)ジェスチャの検出のための基準方向の補正、が想起される。以下、これらの解決手段について述べる。なお、情報処理装置は、以下に示す複数の解決手段を並行して或いは組み合わせて実行してもよい。 <1-2. Solution>
Therefore, in the present embodiment, the above-mentioned problems (1-1-1) and (1-1-2) are solved by the following means. As solutions, (1-2-1) correction of the detectable range of the gesture and (1-2-2) correction of the reference direction for detecting the gesture are conceivable. Hereinafter, these solutions will be described. The information processing apparatus may execute a plurality of the solutions shown below in parallel or in combination.
(1-2-1)ジェスチャの検出可能範囲の補正
上述したように、ユーザの感覚に合わない位置に検出領域ORがある場合、ユーザがジェスチャとして意図していない動作まで、情報処理装置にジェスチャとして検出されてしまう。そこで、本実施形態の情報処理装置は、ジェスチャの検出可能範囲を検出領域ORの一部領域に限定する。 (1-2-1) Correction of the detectable range of the gesture As described above, when the detection area OR is located at a position that does not suit the user's sense, the information processing apparatus makes a gesture until the operation that the user does not intend as a gesture. Will be detected as. Therefore, the information processing apparatus of the present embodiment limits the detectable range of the gesture to a part of the detection area OR.
図5は、ジェスチャの検出可能範囲を、検出領域ORの一部領域に限定した様子を示す図である。図5の例では、Y軸プラス方向から見て、情報処理装置10周囲の一定範囲に検出領域ORが形成されている。なお、情報処理装置10は、イヤホン10Aであってもよいし、ヘッドホン10Bであってもよい。図5の例では、検出領域ORが、検出可能範囲ARと非検出可能範囲NRに分けられている。検出可能範囲ARは、情報処理装置10がジェスチャを検出することが可能な範囲であり、非検出可能範囲NRは、情報処理装置10がジェスチャを検出しない範囲である。検出可能範囲AR及び非検出可能範囲NRは、いずれも、検出領域OR内の範囲である。
FIG. 5 is a diagram showing a state in which the detectable range of the gesture is limited to a part of the detection area OR. In the example of FIG. 5, the detection region OR is formed in a certain range around the information processing apparatus 10 when viewed from the Y-axis plus direction. The information processing device 10 may be an earphone 10A or a headphone 10B. In the example of FIG. 5, the detection area OR is divided into a detectable range AR and a non-detectable range NR. The detectable range AR is the range in which the information processing apparatus 10 can detect the gesture, and the non-detectable range NR is the range in which the information processing apparatus 10 does not detect the gesture. Both the detectable range AR and the non-detectable range NR are within the detection area OR.
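The following minimal sketch illustrates the relationship between the detection area OR, the detectable range AR, and the non-detectable range NR by modeling each region as a circle in the plane of FIG. 5; the circular geometry and the numeric values are assumptions for illustration only.

```python
# Hypothetical sketch: the detection area OR and the detectable range AR are modeled
# as circles in the plane of FIG. 5 (viewed from the Y-axis plus direction).
# Geometry and sizes are illustrative assumptions only.
import math

def in_circle(px, pz, cx, cz, radius):
    return math.hypot(px - cx, pz - cz) <= radius

def classify_point(px, pz, or_center=(0.0, 0.0), or_radius=1.0,
                   ar_center=(-0.4, 0.0), ar_radius=0.5):
    """Classify a hand position as outside OR, in the detectable range AR,
    or in the non-detectable range NR (inside OR but outside AR)."""
    if not in_circle(px, pz, *or_center, or_radius):
        return "outside detection area OR"
    if in_circle(px, pz, *ar_center, ar_radius):
        return "detectable range AR"
    return "non-detectable range NR"

print(classify_point(-0.5, 0.1))  # detectable range AR
print(classify_point(0.8, 0.0))   # non-detectable range NR
```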
情報処理装置10は、ユーザUのジェスチャに関する情報に基づいて、検出可能範囲ARを補正する。ここで、ジェスチャに関する情報は、ジェスチャに関するユーザUの動作の情報であってもよい。より具体的には、ジェスチャに関する情報は、ユーザUが手Hを検出可能範囲ARに進入させる動作(以下、進入動作という。)の情報であってもよい。情報処理装置10がユーザUの耳に装着される装置なのであれば、進入動作は、ユーザUが手Hを耳に向けて振り上げる動作(以下、振り上げ動作という。)であってもよい。振り上げ動作は、例えば、ユーザUが、肘関節を静止或いはゆっくり前方に移動させながら、当該肘関節を支点にユーザUの手Hを振り子のように耳に向けて振り上げる動作である。以下の説明では、進入動作は、ユーザUの手Hの振り上げ動作であるものとする。
The information processing device 10 corrects the detectable range AR based on the information regarding the gesture of the user U. Here, the information about the gesture may be information on the motion of the user U related to the gesture. More specifically, the information regarding the gesture may be information on a motion in which the user U moves the hand H into the detectable range AR (hereinafter referred to as an approaching motion). If the information processing device 10 is a device worn on the ear of the user U, the approaching motion may be a motion in which the user U swings the hand H up toward the ear (hereinafter referred to as a swing-up motion). The swing-up motion is, for example, a motion in which the user U swings the hand H up toward the ear like a pendulum with the elbow joint as a fulcrum, while keeping the elbow joint stationary or moving it slowly forward. In the following description, it is assumed that the approaching motion is the swing-up motion of the hand H of the user U.
ユーザUが手Hの振り上げ動作を行った場合、検出可能範囲ARへのユーザUの手Hの進入方向ADは、人体の構造上、ある程度決まった方向となる。個人差やジェスチャ毎の違いはあるものの、多くの場合、進入方向ADは、背面方向(X軸マイナス方向)から上下一定角度の範囲に収まると想定される。図5の例では、ユーザの手Hは検出可能範囲ARにユーザUの正面から若干下側にずれた位置から進入している。そのため、図5の例では、進入方向ADは、背面方向(X軸マイナス方向)から上に若干傾いた方向となっている。
When the user U swings up the hand H, the approach direction AD of the user U's hand H into the detectable range AR is a certain direction due to the structure of the human body. Although there are individual differences and differences between gestures, in many cases, the approach direction AD is assumed to fall within a range of a certain vertical angle from the back direction (X-axis minus direction). In the example of FIG. 5, the user's hand H has entered the detectable range AR from a position slightly downward from the front of the user U. Therefore, in the example of FIG. 5, the approach direction AD is a direction slightly inclined upward from the back surface direction (X-axis minus direction).
進入方向ADが分かれば、情報処理装置10は、ユーザUがどのあたりでジェスチャを行うか推測ができる。そこで、情報処理装置10は、進入方向ADの情報に基づいて、検出可能範囲ARが検出領域ORより狭くなるよう補正する。図5の例では、補正の結果、検出可能範囲ARは、検出領域ORの縁の一部に検出可能範囲ARの縁の一部が接するよう検出領域OR内に偏った状態で設定される。より具体的には、検出可能範囲ARは、ユーザUが手Hを検出領域ORに進入させたと推測される位置(以下、進入位置という。)に偏って配置される。ここで、進入位置は、図5の例を使って説明すると、例えば、進入方向ADを示す矢印と検出領域ORの縁を示す破線との交点である。
If the approach direction AD is known, the information processing apparatus 10 can infer where the user U makes the gesture. Therefore, the information processing apparatus 10 corrects the detectable range AR to be narrower than the detection area OR based on the information of the approach direction AD. In the example of FIG. 5, as a result of the correction, the detectable range AR is set in a state of being biased in the detection area OR so that a part of the edge of the detectable range AR touches a part of the edge of the detection area OR. More specifically, the detectable range AR is biased to a position where the user U is presumed to have entered the detection region OR (hereinafter referred to as an approach position). Here, the approach position will be described using the example of FIG. 5, for example, an intersection of an arrow indicating the approach direction AD and a broken line indicating the edge of the detection region OR.
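As an illustration of placing a narrowed detectable range AR so that it touches the edge of the detection area OR at the presumed entry position, here is a minimal sketch; the circular model, the angle convention, and the shrink factor are assumptions, not the disclosed implementation.

```python
# Hypothetical sketch: place a narrowed detectable range AR so that part of its edge
# touches the edge of the detection area OR at the presumed entry position.
# The circular model and the shrink factor are illustrative assumptions.
import math

def biased_detectable_range(entry_angle_rad, or_radius=1.0, shrink=0.5):
    """Return (center, radius) of a detectable range AR biased toward the entry position.

    entry_angle_rad: angular position, seen from the center of the detection area OR,
    of the point where the hand H is presumed to have entered OR."""
    ar_radius = or_radius * shrink
    # Place the AR center inward along the entry direction so that AR stays inside OR
    # while part of its edge still touches the edge of OR at the entry position.
    center = ((or_radius - ar_radius) * math.cos(entry_angle_rad),
              (or_radius - ar_radius) * math.sin(entry_angle_rad))
    return center, ar_radius

# Example: hand presumed to enter from behind the ear, slightly below horizontal.
print(biased_detectable_range(math.radians(190)))
```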
なお、進入動作の情報は、進入方向ADに限られず、例えば、進入位置の情報であってもよい。この場合、情報処理装置10は、進入位置の情報に基づいて、検出可能範囲ARが検出領域ORより狭くなるよう補正してもよい。
The information on the approaching motion is not limited to the approaching direction AD, and may be, for example, information on the approaching position. In this case, the information processing apparatus 10 may correct the detectable range AR to be narrower than the detection area OR based on the information of the approach position.
本解決手段によれば、情報処理装置10は、ユーザUの手Hの振り上げ動作の情報に基づいてジェスチャの検出可能範囲を補正する。これにより、ユーザがジェスチャとして意図していない動作までジェスチャとして検出されてしまう事態が少なくなる。例え情報処理装置10が検出部の感度を高めて検出領域ORを広くしたとしても、誤検出の可能性は低い。
According to the present solution, the information processing apparatus 10 corrects the detectable range of the gesture based on the information of the swinging motion of the hand H of the user U. As a result, it is less likely that an action that the user does not intend as a gesture is detected as a gesture. Even if the information processing apparatus 10 increases the sensitivity of the detection unit and widens the detection area OR, the possibility of erroneous detection is low.
(1-2-2)ジェスチャの検出のための基準方向の補正
上述したように、情報処理装置10の装着向きによっては、ユーザUの向きに対する情報処理装置10の向きが変化する。そこで、本実施形態の情報処理装置10は、ジェスチャ検出のための基準方向を装置の傾きに合わせて補正する。なお、ジェスチャの検出でいう「検出」は、ジェスチャの認識も含む広義の「検出」である。ここで、ジェスチャ検出のための基準方向とは、情報処理装置10がジェスチャを検出する上で、基準とする方向のことである。例えば、情報処理装置10が左右スワイプを検出しようとしているのであれば、基準方向は例えば水平方向である。 (1-2-2) Correction of reference direction for detecting gesture As described above, the orientation of theinformation processing apparatus 10 with respect to the orientation of the user U changes depending on the mounting orientation of the information processing apparatus 10. Therefore, the information processing apparatus 10 of the present embodiment corrects the reference direction for gesture detection according to the inclination of the apparatus. In addition, "detection" in the detection of a gesture is a "detection" in a broad sense including recognition of a gesture. Here, the reference direction for gesture detection is the reference direction for the information processing apparatus 10 to detect the gesture. For example, if the information processing apparatus 10 is trying to detect a left / right swipe, the reference direction is, for example, the horizontal direction.
図6は、情報処理装置10が基準方向を補正する様子を示す図である。図6の例では、B1~B4は補正された基準方向である。以下の説明では、B1~B4のことを、単に「基準方向」ということがある。装置方向D1~D4は装置を基準とした方向であるので、情報処理装置10が傾いて装着されると、水平方向或いは垂直方向から傾いたものとなる。情報処理装置10が装置方向D1~D4を基準方向としてジェスチャを検出すると、例えば、左右スワイプを上下スワイプとして検出するといった誤検出を発生させることになる。
FIG. 6 is a diagram showing how the information processing apparatus 10 corrects the reference direction. In the example of FIG. 6, B1 to B4 are the corrected reference directions. In the following description, B1 to B4 may be simply referred to as "reference direction". Since the device directions D1 to D4 are directions with respect to the device, when the information processing device 10 is tilted and mounted, it is tilted from the horizontal direction or the vertical direction. When the information processing apparatus 10 detects a gesture with the device directions D1 to D4 as reference directions, erroneous detection such as detecting a left / right swipe as an up / down swipe will occur.
基準方向を補正する方法として、様々な方法が想起される。例えば、情報処理装置10は、ジェスチャに関する情報に基づいて、基準方向を補正する。ここで、ジェスチャに関する情報は、ジェスチャに関するユーザUの動作の情報であってもよい。より具体的には、ジェスチャに関する情報は、ユーザUが手Hを検出可能範囲ARに進入させる進入動作の情報であってもよい。情報処理装置10がユーザUの耳に装着される装置なのであれば、進入動作は、ユーザUが手Hを自身の耳に向けて振り上げる動作(振り上げ動作)であってもよい。
Various methods are conceivable for correcting the reference direction. For example, the information processing apparatus 10 corrects the reference direction based on the information about the gesture. Here, the information about the gesture may be information on the motion of the user U related to the gesture. More specifically, the information about the gesture may be information on the approaching motion in which the user U moves the hand H into the detectable range AR. If the information processing device 10 is a device worn on the ear of the user U, the approaching motion may be a motion in which the user U swings his / her hand H up toward his / her ear (swing-up motion).
上述したように、ユーザUが手Hの振り上げ動作を行った場合の、検出可能範囲ARへのユーザUの手Hの進入方向ADは、ある程度決まった方向となる。情報処理装置10は、進入方向ADの情報に基づいて基準方向を補正する。図7は、情報処理装置10が基準方向を補正する様子を示す図である。装置方向D1~D4は装置を基準とした方向であるので、当然ながら、情報処理装置10は、装置方向D1~D4については分かっている。情報処理装置10は、進入方向ADを基準とした相対的な装置方向D1~D4の方向の情報に基づいて、基準方向を補正する。一例として、情報処理装置10は以下の方法で基準方向を補正可能である。
As described above, when the user U swings up the hand H, the approach direction AD of the user U's hand H into the detectable range AR is a certain direction. The information processing apparatus 10 corrects the reference direction based on the information of the approach direction AD. FIG. 7 is a diagram showing how the information processing apparatus 10 corrects the reference direction. Since the device directions D1 to D4 are directions with respect to the device, the information processing device 10 naturally knows about the device directions D1 to D4. The information processing device 10 corrects the reference direction based on the information in the relative device directions D1 to D4 with respect to the approach direction AD. As an example, the information processing apparatus 10 can correct the reference direction by the following method.
例えば、情報処理装置10は、装置方向D1~D4の少なくとも1つ(図7の例では装置方向D2)と進入方向ADとのなす角度Rを算出する。進入方向ADは、例えば、ユーザUの手Hが検出可能範囲ARへの進入する位置Cでの角度である。上述したように、進入方向ADは、ユーザUの背面方向(X軸マイナス方向)から上に若干傾いた方向となることが想定される。そのため、情報処理装置10は、標準装着時にあるべき進入方向ADの装置方向基準の角度が分かる。情報処理装置10は、標準装着時にあるべき進入方向ADの角度と実際に検出された進入方向ADの角度との差分から、情報処理装置10の装着傾きを推測できる。なお、情報処理装置10は、この角度の差分をそのまま推測結果としてもよい。そして、情報処理装置10の装着傾きの推測結果に基づいて、基準方向を基準方向B1~B4に補正する。
For example, the information processing device 10 calculates an angle R formed by at least one of the device directions D1 to D4 (device direction D2 in the example of FIG. 7) and the approach direction AD. The approach direction AD is evaluated, for example, at the position C where the user U's hand H enters the detectable range AR. As described above, it is assumed that the approach direction AD is a direction slightly inclined upward from the back surface direction (X-axis minus direction) of the user U. Therefore, the information processing device 10 knows the angle, expressed with respect to the device directions, that the approach direction AD should have when the device is worn in the standard orientation. The information processing device 10 can estimate the mounting inclination of the information processing device 10 from the difference between the angle the approach direction AD should have at standard mounting and the angle of the approach direction AD that is actually detected. The information processing apparatus 10 may use this angle difference as the estimation result as it is. Then, the reference direction is corrected to the reference directions B1 to B4 based on the estimation result of the mounting inclination of the information processing apparatus 10.
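The following is a minimal sketch of this angle-difference estimate; the expected approach angle at standard mounting and the way the reference directions B1 to B4 are rotated are illustrative assumptions.

```python
# Hypothetical sketch: estimate the mounting tilt from the difference between the
# approach angle expected at standard mounting and the approach angle actually
# detected, then rotate the reference directions by that tilt.
# The expected angle value and axis convention are illustrative assumptions.
import math

EXPECTED_APPROACH_DEG = 190.0  # angle of AD in device coordinates at standard mounting (assumed)

def estimate_mounting_tilt(detected_approach_deg: float) -> float:
    """Tilt of the device (degrees) estimated from the detected approach direction AD."""
    return detected_approach_deg - EXPECTED_APPROACH_DEG

def corrected_reference_directions(detected_approach_deg: float) -> dict[str, float]:
    """Reference directions B1-B4 (degrees, device coordinates) after correction."""
    tilt = estimate_mounting_tilt(detected_approach_deg)
    device_directions = {"D1_up": 90.0, "D2_down": 270.0, "D3_front": 0.0, "D4_back": 180.0}
    return {name.replace("D", "B", 1): (angle + tilt) % 360.0
            for name, angle in device_directions.items()}

# Example: the hand entered 15 degrees away from the expected direction,
# so the reference directions are rotated by that estimated tilt.
print(corrected_reference_directions(205.0))
```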
なお、図7の例では、基準方向B1~B4は水平方向或いは垂直方向となっているが、基準方向B1~B4は必ずしも水平方向或いは垂直方向でなくてもよい。情報処理装置10がジェスチャを検出するために基準とするあらゆる方向を基準方向とすることができる。例えば、情報処理装置10が、斜め45°方向へのスワイプを検出するのであれば、その斜め45°方向を基準方向とみなしてもよい。
In the example of FIG. 7, the reference directions B1 to B4 are in the horizontal direction or the vertical direction, but the reference directions B1 to B4 do not necessarily have to be in the horizontal direction or the vertical direction. The reference direction can be any direction that the information processing apparatus 10 uses as a reference for detecting the gesture. For example, if the information processing apparatus 10 detects a swipe in the diagonal 45 ° direction, the diagonal 45 ° direction may be regarded as the reference direction.
According to this solution, the information processing apparatus 10 detects gestures not with the device directions D1 to D4 as the reference directions but with the corrected reference directions B1 to B4 as the reference. This reduces erroneous detection such as, for example, detecting a left/right swipe as an up/down swipe.
(1-2-3) Other Solutions
In (1-2-1) and (1-2-2) described above, the detectable range of the gesture and/or the reference direction for detecting the gesture is corrected based on information about the gesture of the user U. However, the information processing apparatus 10 can also correct the detectable range and/or the reference direction using information other than the information about the gesture.
For example, the information processing apparatus 10 may correct at least one of the detectable range and the reference direction based on information about the state of the information processing apparatus 10. For example, the information processing apparatus 10 includes a sensor unit that performs detection related to the state of the information processing apparatus 10, and corrects at least one of the detectable range and the reference direction based on information from the sensor unit.
Here, the sensor unit may be composed of one or more acceleration sensors or one or more gyro sensors. In this case, the information processing apparatus 10 may correct at least one of the detectable range and the reference direction based on the state in which the information processing apparatus 10 is worn by the user U, which is estimated from the information from the sensor unit.
Since gravitational acceleration is constantly applied to the information processing apparatus 10, the information processing apparatus 10 can estimate the direction of gravity based on the information from the sensor unit. From the estimated direction of gravity, the information processing apparatus 10 estimates how far the information processing apparatus 10 is tilted from its standard worn state. The information processing apparatus 10 then corrects at least one of the detectable range and the reference direction based on the estimation result of this tilt. The information processing apparatus 10 detects gestures based on the corrected detectable range and/or reference direction.
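The following is a minimal sketch of this gravity-based correction, assuming that the tilt of interest is a rotation within the housing plane (taken here to be the X-Z plane) and that the direction gravity should take in the standard worn state is known from calibration. The reference vector and the function names are assumptions for illustration.

```python
import numpy as np

# Assumed calibration value: direction of gravity projected onto the housing
# plane (X-Z) when the apparatus is worn in the standard state.
STANDARD_GRAVITY_XZ = np.array([0.0, -1.0])

def in_plane_tilt_deg(gravity_device_xyz: np.ndarray) -> float:
    """Estimate the in-plane tilt by comparing the projection of the estimated
    gravity direction onto the housing plane with the calibrated projection."""
    g = np.array([gravity_device_xyz[0], gravity_device_xyz[2]])  # project to X-Z plane
    g = g / np.linalg.norm(g)
    angle = np.degrees(np.arctan2(g[1], g[0]) -
                       np.arctan2(STANDARD_GRAVITY_XZ[1], STANDARD_GRAVITY_XZ[0]))
    return (angle + 180.0) % 360.0 - 180.0  # signed tilt in (-180, 180]

def correct_reference_direction(reference_deg: float, tilt_deg: float) -> float:
    """Shift a reference direction (e.g. the axis used for horizontal swipes)
    by the estimated tilt."""
    return (reference_deg + tilt_deg) % 360.0
```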
This method, like the means described in (1-2-1) and (1-2-2), can also reduce erroneous detection of the gestures of the user U.
<<2. Embodiment 1>>
The outline of the present embodiment has been described above. The information processing apparatus 10 according to the first embodiment will be described below.
<2-1. Configuration of the information processing apparatus>
First, the configuration of the information processing apparatus 10 will be described.
The information processing apparatus 10 is an information processing terminal that executes various functions based on input operations by the user. For example, the information processing apparatus 10 is a sound output device that can be worn by the user, such as an earphone or headphones. The information processing apparatus 10 may also be a display device that can be worn by the user, such as AR (Augmented Reality) glasses or MR (Mixed Reality) glasses. In the present embodiment, the information processing apparatus 10 is assumed to be a device that the user U can wear, and it has a portion positioned at least at the ear of the user U when worn. At least a detection unit for detecting an object is arranged in this portion. The information processing apparatus 10 is configured to be able to detect gestures of the user, and executes functions such as music playback based on the gestures of the user.
FIG. 8 is a diagram showing a configuration example of the information processing apparatus 10 according to the embodiment of the present disclosure. The information processing apparatus 10 includes an input detection unit 11, a state detection unit 12, an output unit 13, a communication unit 14, a storage unit 15, and a control unit 16. The configuration shown in FIG. 8 is a functional configuration, and the hardware configuration may differ from it. Further, the functions of the information processing apparatus 10 may be distributed over and implemented in a plurality of physically separated components.
The input detection unit 11 is a detection unit that detects input operations by the user. For example, the input detection unit 11 is a non-contact type detection unit that detects an object without contact. The input detection unit 11 includes one or more proximity sensors (also referred to as non-contact sensors) and detects an object located within its detection area. The proximity sensor may be an optical sensor or a capacitance type sensor. An optical proximity sensor is a sensor that detects light reflected by an object. A capacitance type proximity sensor is a sensor that detects a change in the capacitance generated between an object and the sensor.
When the input detection unit 11 is of the non-contact type, the input detection unit 11 may be a 2D sensor or a 3D sensor. When the input detection unit 11 is a 2D sensor, the detection area formed by the input detection unit 11 is a two-dimensional area; when the input detection unit 11 is a 3D sensor, the detection area formed by the input detection unit 11 is a three-dimensional area.
The input detection unit 11 is not limited to the non-contact type. The input detection unit 11 may be a contact type detection unit. In this case, the input detection unit 11 includes one or more touch sensors and detects an object that comes into contact with a predetermined location on the information processing apparatus 10. If the information processing apparatus 10 is an earphone type device, the predetermined location is, for example, the side surface of the earphone (the surface opposite to the speaker surface). If the information processing apparatus 10 is a headphone type device, the predetermined location is, for example, the side surface of the headphones (the surface opposite to the speaker surface). In the following description, the side surface of the earphone or the headphones may be referred to as the housing surface.
FIG. 9 is a diagram showing a configuration example of the input detection unit 11. FIG. 9 shows the earphone type information processing apparatus 10 as viewed from the side surface of the earphone. The configuration shown in FIG. 9 is merely an example, and the configuration of the input detection unit 11 is not limited to the configuration shown in FIG. 9.
In the example of FIG. 9, the input detection unit 11 includes, as non-contact sensors, an infrared emitter IR and four photosensors PD1, PD2, PD3, and PD4. The infrared emitter IR is arranged at the center of the housing surface, and the four photosensors PD1, PD2, PD3, and PD4 are arranged at equal intervals in the circumferential direction so as to surround the infrared emitter IR. The input detection unit 11 detects an object by detecting, with the four photosensors PD1, PD2, PD3, and PD4, the infrared light emitted from the infrared emitter IR and reflected by the surface of the object.
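With such a sensor arrangement, one conceivable way to obtain the approach direction AD mentioned earlier is to look at which photosensor responds first when the hand enters the detectable range. The following sketch assumes that per-sensor onset times are available and that the angular positions of PD1 to PD4 around the emitter are known; those angles and the function name are illustrative assumptions, not details stated in the present disclosure.

```python
# Assumed angular positions of the photosensors around the infrared emitter (degrees).
SENSOR_ANGLES_DEG = {"PD1": 0.0, "PD2": 90.0, "PD3": 180.0, "PD4": 270.0}

def estimate_approach_direction(onset_times: dict) -> float:
    """Rough approach direction: the angular position of the photosensor whose
    reflected-light level rose first when the hand entered the detectable range."""
    first_sensor = min(onset_times, key=onset_times.get)
    return SENSOR_ANGLES_DEG[first_sensor]

# Usage: PD3 reacted first, so the hand is assumed to have entered from around 180 degrees.
approach_deg = estimate_approach_direction({"PD1": 0.212, "PD2": 0.190, "PD3": 0.145, "PD4": 0.201})
```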
The input detection unit 11 also includes a touch pad TP as a contact type sensor. The touch pad TP is composed of a plurality of touch sensors arranged in a planar manner. The infrared emitter IR is arranged on the housing surface. The input detection unit 11 detects an object that comes into contact with the surface of the touch pad TP.
The state detection unit 12 is a sensor unit that performs detection related to the state of the information processing apparatus 10. The state detection unit 12 is composed of one or more sensors that perform detection related to the state of the information processing apparatus 10. The state detection unit 12 includes, for example, one or more acceleration sensors. The state detection unit 12 may also include one or more gyro sensors. The state detection unit 12 may also include one or more geomagnetic sensors. The state detection unit 12 may include a motion sensor that combines these plural types of sensors. The state detection unit 12 may also include a biological sensor that acquires biological information of the user, such as a blood pressure sensor or a heart rate sensor.
The state detection unit 12 performs detection related to the state of the information processing apparatus 10 based on the detection results of these sensors. For example, the state detection unit 12 detects the direction of gravity based on the sensor values of these sensors. For example, since gravitational acceleration is constantly applied to the information processing apparatus 10, the information processing apparatus 10 can detect the direction of gravity by averaging, over a certain period of time, the directions of the accelerations detected by the acceleration sensor.
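A minimal sketch of this averaging is shown below; the window length and the function name are assumptions, and in practice some form of low-pass filtering or outlier rejection would likely be combined with it.

```python
import numpy as np

def estimate_gravity_direction(accel_samples: np.ndarray) -> np.ndarray:
    """Estimate the direction of gravity in device coordinates by averaging
    accelerometer samples (shape: N x 3) over a time window. Gravity is always
    present, while hand and head motion roughly average out over the window."""
    mean_accel = accel_samples.mean(axis=0)
    return mean_accel / np.linalg.norm(mean_accel)

# Usage: samples collected at, e.g., 100 Hz over a few seconds.
g_dir = estimate_gravity_direction(np.array([[0.1, -0.2, -9.7],
                                             [0.0, -0.1, -9.8],
                                             [0.2, -0.3, -9.6]]))
```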
The output unit 13 is an output interface that outputs information to the user. For example, the output unit 13 may be an acoustic device such as a speaker or a buzzer, or a vibration device such as a vibration motor. The output unit 13 may also be a display device such as a liquid crystal display (Liquid Crystal Display) or an organic EL display (Organic Electroluminescence Display), or a lighting device such as an LED (Light Emitting Diode) lamp. The output unit 13 functions as the output means of the information processing apparatus 10. The output unit 13 outputs various kinds of information to the user under the control of the control unit 16.
The communication unit 14 is a communication interface for communicating with other devices. The communication unit 14 may be a network interface or a device connection interface. The communication unit 14 may be a wired interface or a wireless interface. The communication unit 14 functions as the communication means of the information processing apparatus 10. The communication unit 14 communicates with other devices under the control of the control unit 16. The other devices are, for example, terminal devices such as music players and smartphones.
The storage unit 15 is a storage device capable of reading and writing data, such as a DRAM (Dynamic Random Access Memory), an SRAM (Static Random Access Memory), a flash memory, or a hard disk. The storage unit 15 functions as the storage means of the information processing apparatus 10. The storage unit 15 stores, for example, settings related to the detectable range AR of gestures.
The control unit 16 is a controller that controls each unit of the information processing apparatus 10. The control unit 16 is realized by, for example, a processor such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit). For example, the control unit 16 is realized by the processor executing various programs stored in a storage device inside the information processing apparatus 10, using a RAM (Random Access Memory) or the like as a work area. The control unit 16 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array). A CPU, an MPU, an ASIC, and an FPGA can all be regarded as controllers.
As shown in FIG. 8, the control unit 16 includes an acquisition unit 161, a gesture detection unit 162, a command execution unit 163, an output control unit 164, an estimation unit 165, and a correction unit 166. Each block constituting the control unit 16 (the acquisition unit 161 to the correction unit 166) is a functional block indicating a function of the control unit 16. These functional blocks may be software blocks or hardware blocks. For example, each of the above functional blocks may be one software module realized by software (including microprograms) or one circuit block on a semiconductor chip (die). Of course, each functional block may be one processor or one integrated circuit. The method of configuring the functional blocks is arbitrary.
The control unit 16 may be configured in functional units different from the above functional blocks. Further, some or all of the operations of the blocks constituting the control unit 16 (the acquisition unit 161 to the correction unit 166) may be performed by another device. For example, some or all of the operations of the blocks constituting the control unit 16 may be performed by a terminal device such as a music player or a smartphone. The operation of each block constituting the control unit 16 will be described later.
<2-2. Gestures detectable by the information processing apparatus>
The configuration of the information processing apparatus 10 has been described above. Before describing the operation of the information processing apparatus 10 according to the present embodiment in detail, the gestures that the information processing apparatus 10 can detect will be described.
(2-2-1) Correspondence between functions and gestures
FIG. 10 is a diagram showing the correspondence between the functions of the information processing apparatus 10 and gestures. In addition to functions related to music playback, the information processing apparatus 10 has functions related to answering telephone calls. For example, the information processing apparatus 10 has the functions shown in (1) to (7) below. Note that "function" can be rephrased as "command".
(1) Play/Pause
(2) Answer Call/End
(3) Next/Prev
(4) Vol+/Vol-
(5) Cancel
(6) Quick Attention
(7) Ambient Control
The functions (1) to (7) and the gestures associated with each of them are described below. Note that the user motion constituting one gesture may include a first motion that triggers the start of the gesture and a second motion that follows the first motion. The first motion is, for example, a motion of the user U swinging up the hand H, and the second motion is, for example, a motion such as a swipe that follows the first motion. It is also possible to regard each of the first motion and the second motion as one gesture.
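To make the two-stage structure concrete, the following is a minimal sketch of how such a gesture input could be represented in code. The class and field names are illustrative assumptions and do not appear in the present disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GestureInput:
    """One gesture as a pair of motions: a trigger motion and a follow-up motion."""
    first_motion: str             # e.g. "swing_up" (trigger that starts the gesture)
    second_motion: Optional[str]  # e.g. "pinch", "swipe_left", "hold"; None if only
                                  # the first motion has been observed so far

# Usage: a swing-up followed by a left swipe.
gesture = GestureInput(first_motion="swing_up", second_motion="swipe_left")
```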
The functions and gestures described here are merely examples. The functions and gestures of the information processing apparatus 10 are not limited to the functions and gestures shown below. The first motion and the second motion are also not limited to the motions shown below. The combinations of functions and gestures are likewise not limited to the combinations shown below. For example, although it is not listed in (1) to (7) above, a noise canceling function may also be operated by a gesture.
In the following description, the information processing apparatus 10 is assumed to be an earphone type device worn on the ear of the user, but the information processing apparatus 10 is not limited to an earphone type device. For example, the information processing apparatus 10 may be a headphone type device. The information processing apparatus 10 may also have a portion worn on the ear of the user and other portions. Further, in the following description, the information processing apparatus 10 is assumed to be worn on the ear of the user, but the device worn on the ear of the user may be a predetermined device other than the information processing apparatus 10. The predetermined device may be, for example, an output device wirelessly connected to the information processing apparatus 10 (for example, an earphone or headphones, a hearing aid, a sound collector, or the like, regardless of whether it is wired, wireless, canal type, open-ear type, or the like). If the information processing apparatus 10 itself is worn on the ear of the user, the predetermined device is the information processing apparatus 10 itself. The predetermined device can be rephrased as predetermined equipment.
The functions of the information processing apparatus 10 and the gestures associated with those functions are described below with reference to FIG. 10.
(1) Play/Pause
"Play" is a function for playing a song, and "Pause" is a function for pausing playback of a song. "Play/Pause" is associated with the swing-up motion of the hand H as the first motion and with a pinch motion as the second motion. FIG. 11 is a diagram for explaining the pinch motion. As shown in FIG. 11, the pinch motion is a motion of picking an object with the fingers. Note that the information processing apparatus 10 can detect that the user U has performed a pinch motion even if the user U is not actually picking an object (for example, the information processing apparatus 10).
(2) Answer Call/End
"Answer Call" is a function for answering a telephone call, and "Answer End" is a function for ending a telephone call. "Answer Call/End" is associated with the swing-up motion of the hand H as the first motion and with a pinch motion or a hold motion as the second motion. FIG. 12 is a diagram for explaining the hold motion. As shown in FIG. 12, the hold motion is a motion of grasping an object with the hand H. Note that the information processing apparatus 10 can detect that the user U has performed a hold motion even if the user U is not actually grasping an object (for example, the information processing apparatus 10).
(3) Next/Prev
"Next" is a function for cueing the next song, and "Prev" is a function for cueing the previous song or the song currently being played. "Next/Prev" is associated with the swing-up motion of the hand H as the first motion and with a left/right swipe as the second motion. Here, "Next" is associated with the left swipe (forward swipe) of the left/right swipes as the second motion, and "Prev" is associated with the right swipe (backward swipe) of the left/right swipes as the second motion. Note that "Next/Prev" may be associated with a gesture in which the second motion is performed without the first motion.
FIGS. 13A and 13B are diagrams for explaining the left/right swipe. FIG. 13A shows the left swipe, and FIG. 13B shows the right swipe. Unlike the pinch motion and the hold motion, the left/right swipe is a motion (gesture) accompanied by a movement width. Here, as shown in FIG. 13A, the left swipe is a motion in which the user U slides the hand H forward (in the X-axis plus direction) by a predetermined movement width W1. As shown in FIG. 13B, the right swipe is a motion in which the user U slides the hand H backward (in the X-axis minus direction) by a predetermined movement width W2.
Note that FIGS. 13A and 13B both show the user U performing a gesture near the left ear. Unlike the examples of FIGS. 13A and 13B, when the user U performs a gesture near the right ear, the right swipe is the motion of sliding the hand H forward and the left swipe is the motion of sliding the hand H backward, so care is required. In this case, "Next" is associated with, for example, the right swipe (forward swipe) as the second motion, and "Prev" is associated with, for example, the left swipe (backward swipe) as the second motion.
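The following is a minimal sketch of how a swipe could be classified from a displacement expressed along the corrected reference directions, taking into account which ear the device is worn on. The threshold, the coordinate convention, and the function names are assumptions for illustration.

```python
def classify_swipe(dx_forward: float, dz_up: float, worn_on_left_ear: bool,
                   min_width: float = 0.03) -> str:
    """Classify a swipe from its displacement along the corrected reference axes.
    dx_forward: displacement along the user's forward direction (metres, assumed).
    dz_up: displacement along the user's upward direction (metres, assumed)."""
    if max(abs(dx_forward), abs(dz_up)) < min_width:
        return "none"  # movement width too small to count as a swipe
    if abs(dx_forward) >= abs(dz_up):
        # Horizontal swipe: a forward slide is a "left swipe" near the left ear
        # and a "right swipe" near the right ear (see the note above).
        if dx_forward > 0:
            return "swipe_left" if worn_on_left_ear else "swipe_right"
        return "swipe_right" if worn_on_left_ear else "swipe_left"
    return "swipe_up" if dz_up > 0 else "swipe_down"

# Usage: a 5 cm forward slide near the left ear -> left swipe (Next).
label = classify_swipe(0.05, 0.0, worn_on_left_ear=True)
```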
(4) Vol+/Vol-
"Vol+" is a function for raising the volume, and "Vol-" is a function for lowering the volume. "Vol+/Vol-" is associated with the swing-up motion of the hand H as the first motion and with an up/down swipe as the second motion. Here, "Vol+" is associated with the up swipe of the up/down swipes as the second motion, and "Vol-" is associated with the down swipe of the up/down swipes as the second motion. Note that "Vol+/Vol-" may be associated with a gesture in which the second motion is performed without the first motion.
FIGS. 14A and 14B are diagrams for explaining the up/down swipe. FIG. 14A shows the up swipe, and FIG. 14B shows the down swipe. Unlike the pinch motion and the hold motion, the up/down swipe is a motion (gesture) accompanied by a movement width. Here, as shown in FIG. 14A, the up swipe is a motion in which the user U slides the hand H upward (in the Z-axis plus direction) by a predetermined movement width W3. As shown in FIG. 14B, the down swipe is a motion in which the user U slides the hand H downward (in the Z-axis minus direction) by a predetermined movement width W4.
"Vol+/Vol-" is a function for adjusting the volume. Therefore, to execute this function, it is desirable for the information processing apparatus 10 to acquire from the user U not only the information on whether to raise or lower the volume but also the information on how much to raise or lower it (the amount by which the volume is raised or lowered). In the present embodiment, therefore, "Vol+/Vol-" is treated as a function accompanied by an operation amount (in the case of this function, the amount by which the volume is raised or lowered). In this function, for example, the movement amount or movement speed of the hand H during the up/down swipe is associated with the operation amount of "Vol+/Vol-". Note that, by configuring the information processing apparatus 10 so that the volume is raised or lowered by a fixed amount per input operation, this function can also be made a function without an operation amount.
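A minimal sketch of mapping the swipe movement to a volume change is shown below; the gain constant, the clamping range, and the function name are assumptions for illustration.

```python
def volume_delta_from_swipe(dz_up: float, gain_per_metre: float = 200.0,
                            max_step: int = 10) -> int:
    """Map the vertical movement amount of an up/down swipe (metres, assumed,
    positive = upward) to a signed volume step. A variant could use the movement
    speed instead of, or in addition to, the movement amount."""
    step = int(round(dz_up * gain_per_metre))
    return max(-max_step, min(max_step, step))

# Usage: a 4 cm upward swipe -> raise the volume by 8 steps (with these assumed gains).
delta = volume_delta_from_swipe(0.04)
```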
(5) Cancel
"Cancel" is a function for canceling an operation performed by the user U. "Cancel" is associated with the swing-up motion of the hand H as the first motion and with a motion of holding the hand over the apparatus as the second motion. FIGS. 15A and 15B are diagrams for explaining the motion of holding the hand H over the apparatus. FIG. 15A shows the user U as viewed from the left side, and FIG. 15B shows the user U as viewed from the front. As shown in FIGS. 15A and 15B, the motion of holding the hand H over the apparatus is a motion in which the user U holds the opened hand H over the information processing apparatus 10.
(6) Quick Attention
"Quick Attention" is a function that allows the user U to quickly hear surrounding sounds. Specifically, "Quick Attention" is a function that makes surrounding sounds easier to hear by quickly lowering the volume of, or stopping, the sound being output (for example, music, call audio, or a ringtone). "Quick Attention" is associated with the swing-up motion of the hand H as the first motion and with a touch motion as the second motion. FIG. 16 is a diagram for explaining the touch motion. As shown in FIG. 16, the touch motion is a motion in which the user U touches a part of the information processing apparatus 10 (for example, the housing surface).
(7) Ambient Control
"Ambient Control" is a function that allows the user U to listen to music while being aware of surrounding sounds. Specifically, "Ambient Control" is a function that takes in external sounds while music is being played. "Ambient Control" is associated with the swing-up motion of the hand H as the first motion and with a motion of moving the hand H away as the second motion. FIG. 17 is a diagram for explaining the motion of moving the hand H away. Unlike the pinch motion, the hold motion, the touch motion, and the motion of holding the hand over the apparatus, the motion of moving the hand H away is a motion (gesture) accompanied by a movement width. Here, as shown in FIG. 17, the motion of moving the hand H away is a motion in which the user U holds the hand H over the information processing apparatus 10 and then moves the hand H by a predetermined movement width W4 in a predetermined separation direction (the Y-axis plus direction in the example of FIG. 17).
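The correspondence between commands and gesture pairs described in (1) to (7), which FIG. 10 summarizes, could be held in the apparatus as a simple lookup table, for example as sketched below. The table keys and the motion labels are illustrative assumptions; the exact contents would follow FIG. 10 and would also depend on context (for example, a pinch answers a call while a call is ringing and otherwise toggles playback).

```python
# A hypothetical sketch of the FIG. 10 correspondence: (first motion, second motion) -> command.
GESTURE_TO_COMMAND = {
    ("swing_up", "pinch"):       "Play/Pause",
    ("swing_up", "hold"):        "Answer Call/End",
    ("swing_up", "swipe_left"):  "Next",
    ("swing_up", "swipe_right"): "Prev",
    ("swing_up", "swipe_up"):    "Vol+",
    ("swing_up", "swipe_down"):  "Vol-",
    ("swing_up", "palm_over"):   "Cancel",
    ("swing_up", "touch"):       "Quick Attention",
    ("swing_up", "move_away"):   "Ambient Control",
}

def lookup_command(first_motion: str, second_motion: str) -> str:
    """Return the command associated with a detected gesture pair, if any."""
    return GESTURE_TO_COMMAND.get((first_motion, second_motion), "none")
```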
(2-2-2) Correspondence between functions and correction processing
The correspondence between functions and gestures has been described above. Next, the correspondence between functions and correction processing is described. FIG. 10, shown earlier, shows not only the correspondence between functions and gestures but also the correspondence between functions and correction processing. Here, the correction processing is the processing of correcting the detectable range of the gesture and the processing of correcting the reference direction for detecting the gesture. In the example shown in FIG. 10, the correction processing differs for each function or for each gesture.
Note that the information processing apparatus 10 may correct the detectable range and the reference direction immediately after the detection of the first motion is completed, or may correct the detectable range and the reference direction at the stage of performing the gesture recognition processing, after the acquisition of the information on the series of movements of the user related to the gesture is completed.
For example, after the acquisition of the information on both the first motion and the second motion is completed, the information processing apparatus 10 corrects the detectable range based on the information on the first motion. Then, based on the already acquired information on the second motion, the information processing apparatus 10 determines whether the second motion was performed within the detectable range. If the second motion was not performed within the detectable range, the information processing apparatus 10 excludes the already acquired information on the second motion from the targets of gesture detection.
Further, for example, after the acquisition of the information on both the first motion and the second motion is completed, the information processing apparatus 10 corrects the reference direction based on the information on the first motion. The information processing apparatus 10 interprets the already acquired information on the second motion based on the reference direction corrected using the information on the first motion.
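A minimal sketch of this deferred correction is shown below. The half-width of the narrowed detectable range, the expected approach angle, and the function names are assumptions for illustration.

```python
def ang_diff(a: float, b: float) -> float:
    """Smallest signed angular difference a - b in degrees, in (-180, 180]."""
    return (a - b + 180.0) % 360.0 - 180.0

def process_gesture(approach_deg: float, second_motion_angles,
                    half_width_deg: float = 60.0, expected_approach_deg: float = 190.0):
    """Deferred correction sketch, run after both motions have been acquired:
    (1) narrow the detectable range around the approach direction of the first motion,
    (2) discard the second motion if it fell outside that range,
    (3) otherwise return the reference-direction shift (estimated mounting tilt)
        to apply before interpreting the second motion."""
    if any(abs(ang_diff(a, approach_deg)) > half_width_deg for a in second_motion_angles):
        return None  # second motion outside the narrowed range: excluded from detection
    return ang_diff(approach_deg, expected_approach_deg)  # tilt to correct the reference by

# Usage: approach at 210 degrees, second motion observed around 200-230 degrees.
tilt_or_none = process_gesture(210.0, [200.0, 215.0, 230.0])
```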
The correspondence between the functions of the information processing apparatus 10 and the correction processing is described below with reference to FIG. 10.
(1) Play/Pause
"Play/Pause" is associated with the swing-up motion of the hand H as the first motion and with a pinch motion as the second motion. For this function, the information processing apparatus 10, for example, limits the detectable range based on the first motion. The information processing apparatus 10 does not have to correct the reference direction. This is because the pinch motion is a gesture without a movement width, and information on the reference direction is therefore not necessarily needed to detect the gesture.
(2) Answer Call/End
"Answer Call/End" is associated with the swing-up motion of the hand H as the first motion and with a pinch motion or a hold motion as the second motion. For this function, the information processing apparatus 10, for example, limits the detectable range based on the first motion. The information processing apparatus 10 does not have to correct the reference direction. This is because the pinch motion and the hold motion are gestures without a movement width, and information on the reference direction is therefore not necessarily needed to detect the gesture.
(3) Next/Prev
"Next/Prev" is associated with the swing-up motion of the hand H as the first motion and with a left/right swipe as the second motion. For this function, the information processing apparatus 10, for example, limits the detectable range based on the first motion. After limiting the detectable range based on the first motion, the information processing apparatus 10 may, for example, make a correction that widens the limited detectable range once it detects the start of the second motion. This improves the detection accuracy of the second motion. For this function, the information processing apparatus 10 also corrects the reference direction based on, for example, the first motion.
(4) Vol+/Vol-
"Vol+/Vol-" is associated with the swing-up motion of the hand H as the first motion and with an up/down swipe as the second motion. For this function, the information processing apparatus 10, for example, limits the detectable range based on the first motion. After limiting the detectable range based on the first motion, the information processing apparatus 10 may, for example, make a correction that widens the limited detectable range once it detects the start of the second motion. This improves the detection accuracy of the second motion. For this function, the information processing apparatus 10 also corrects the reference direction based on, for example, the first motion.
(5) Cancel
"Cancel" is associated with the swing-up motion of the hand H as the first motion and with the motion of holding the hand H over the apparatus as the second motion. For this function, the information processing apparatus 10 sets the detectable range based on, for example, information about the state of the information processing apparatus 10. For example, the information processing apparatus 10 sets the detectable range based on information from the state detection unit 12. For this function, the information processing apparatus 10 does not necessarily have to correct the reference direction. This is because the motion of holding the hand H over the apparatus is a gesture without a movement width, and information on the reference direction is therefore not necessarily needed to detect the gesture.
(6) Quick Attention
"Quick Attention" is associated with the swing-up motion of the hand H as the first motion and with a touch motion as the second motion. For this function, the information processing apparatus 10 does not have to correct the detectable range or the reference direction. This is because information on the detectable range and the reference direction is not necessarily needed to detect the touch motion.
(7) Ambient Control
"Ambient Control" is associated with the swing-up motion of the hand H as the first motion and with the motion of moving the hand H away as the second motion. For this function, the information processing apparatus 10, for example, limits the detectable range based on the first motion. After limiting the detectable range based on the first motion, the information processing apparatus 10 may, for example, make a correction that widens the limited detectable range once it detects the start of the second motion. This improves the detection accuracy of the second motion. For this function, the information processing apparatus 10 also corrects the reference direction based on, for example, the first motion.
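The per-function correction policy described in (1) to (7) above could be held as a small table consulted before gesture recognition, for example as sketched below; the dictionary keys and flag names are illustrative assumptions summarizing the description above and FIG. 10.

```python
# Hypothetical summary of the correction policy per command (see FIG. 10 and (1)-(7) above).
CORRECTION_POLICY = {
    "Play/Pause":      {"limit_range": True,  "widen_on_second": False, "correct_reference": False},
    "Answer Call/End": {"limit_range": True,  "widen_on_second": False, "correct_reference": False},
    "Next/Prev":       {"limit_range": True,  "widen_on_second": True,  "correct_reference": True},
    "Vol+/Vol-":       {"limit_range": True,  "widen_on_second": True,  "correct_reference": True},
    "Cancel":          {"limit_range": False, "widen_on_second": False, "correct_reference": False,
                        "range_from_device_state": True},
    "Quick Attention": {"limit_range": False, "widen_on_second": False, "correct_reference": False},
    "Ambient Control": {"limit_range": True,  "widen_on_second": True,  "correct_reference": True},
}
```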
<2-3. Operation of the information processing apparatus>
The gestures that the information processing apparatus 10 can detect have been described above. Next, the operation of the information processing apparatus 10 that detects such gestures will be described.
FIG. 18 is a flowchart showing command execution processing according to the present embodiment. The command execution processing is processing in which the information processing apparatus 10 executes a command that the user U has input using a gesture. Here, a "command" is an instruction from inside or outside the apparatus for causing the information processing apparatus 10 to execute a function. It is also possible to regard a "command" as a word referring to a function of the information processing apparatus 10 itself, such as Play/Pause. The word "command" in the following description can be replaced with "function".
The command execution processing is executed by the control unit 16 of the information processing apparatus 10. The command execution processing is started, for example, when the information processing apparatus 10 is powered on.
First, the acquisition unit 161 of the control unit 16 acquires sensor values from the input detection unit 11 and the state detection unit 12 (step S101). For example, the acquisition unit 161 acquires information about an object within the detection area OR from a non-contact type sensor such as a proximity sensor. The acquisition unit 161 also acquires information about contact with the housing surface from a contact type sensor such as a touch sensor. The acquisition unit 161 further acquires information about the state of the information processing apparatus 10 from sensors such as an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
Subsequently, the acquisition unit 161 acquires information about the gesture of the user U based on the sensor values acquired in step S101 (step S102). For example, the acquisition unit 161 acquires information on the motion of the user U related to the gesture based on the sensor values from the input detection unit 11. More specifically, the acquisition unit 161 acquires, based on the sensor values of a non-contact type sensor such as a proximity sensor, information on the approach motion in which the user U causes the hand H to enter the detectable range AR. If the information processing apparatus 10 is a device worn on the ear of the user U, the approach motion may be a swing-up motion in which the user U swings the hand H up toward the ear. In this case, the information on the approach motion may be information on the approach direction AD of the hand H of the user U into the detectable range AR.
Subsequently, the acquisition unit 161 acquires information about the state of the information processing apparatus 10 based on the sensor values acquired in step S101 (step S103). For example, the acquisition unit 161 acquires information about the state of the information processing apparatus 10 based on the sensor values from the state detection unit 12. For example, the acquisition unit 161 acquires information about the direction of gravity based on information from an acceleration sensor, a gyro sensor, or a geomagnetic sensor.
Next, the correction unit 166 of the control unit 16 executes a correction process related to gesture detection (step S104). For example, the correction unit 166 corrects at least one of the detectable range of the gesture and the reference direction for detecting the gesture. The corrections made by the correction unit 166 are roughly classified into two types: (1) correction based on information on gestures, and (2) correction based on information on the device state.
(1) Correction based on information on gestures
First, correction based on information on gestures will be described.
The correction unit 166 corrects at least one of the detectable range AR and the reference direction based on the information acquired in step S102. For example, the correction unit 166 corrects at least one of the detectable range AR and the reference direction based on information about the action of the user U. As described in <1-2. Solution>, the action of the user U may be a swing-up action of the hand H. Also as described in <1-2. Solution>, the swing-up action may be an approach action in which the user U moves the hand H into the detectable range AR. In this case, the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on information about the approach direction AD of the user U's hand H into the detectable range AR.
The information about the approach action may be information about the approach direction AD of the user U's hand H. In this case, the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on the information about the approach direction AD. More specifically, the correction unit 166 may correct at least one of the detectable range AR and the reference direction based on information about the mounting state of the information processing apparatus 10 on the user U, which is estimated from the information about the approach direction AD. The information about the mounting state may be, for example, information about the mounting tilt. The mounting tilt is, for example, the tilt of the information processing apparatus 10 in its current state relative to a standard mounting state. The mounting tilt may be estimated by the estimation unit 165 of the control unit 16 based on the information about the approach direction AD. The correction unit 166 may then correct at least one of the detectable range AR and the reference direction based on the estimation result of the estimation unit 165.
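As a minimal sketch of this correction, assuming the approach direction AD and the reference direction are both expressed as angles in the sensor plane; the expected approach angle for the standard mounting state is an illustrative assumption, not a value defined by the embodiment.

    EXPECTED_APPROACH_DEG = -90.0  # assumed approach angle for a standard mounting state

    def estimate_mounting_tilt(approach_deg: float) -> float:
        """Estimate the mounting tilt as the deviation of the observed approach
        direction AD from the approach direction expected in the standard state."""
        tilt = approach_deg - EXPECTED_APPROACH_DEG
        # normalize to (-180, 180]
        return (tilt + 180.0) % 360.0 - 180.0

    def correct_reference_direction(reference_deg: float, approach_deg: float) -> float:
        """Rotate the reference direction for gesture detection by the estimated tilt."""
        return reference_deg + estimate_mounting_tilt(approach_deg)

    # Example: the hand approached at -60 degrees, so the device is assumed to be
    # tilted by +30 degrees and the reference direction is rotated accordingly.
    print(correct_reference_direction(0.0, -60.0))  # -> 30.0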
Note that the user actions constituting one gesture may include a first action that triggers the start of the gesture and a second action that follows the first action. Here, the first action is the swing-up action of the user U's hand H. After the swing-up action is detected in the detectable range AR, the correction unit 166 may make the detectable range for detecting the second action wider than the detectable range AR at the time the swing-up action was detected. More specifically, when the gesture to be detected (the second action) is a gesture that involves a movement width, the correction unit 166 may make the detectable range for detecting the second action wider than the detectable range AR at the time the swing-up action was detected. A gesture that involves a movement width is, for example, a swipe or an action of moving the hand H away.
This makes it possible to improve the accuracy of gesture detection while suppressing false detection.
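As a minimal sketch of the widening of the detectable range for the second action, the detectable range AR is modeled below as a circle and the widening factor is an illustrative assumption.

    from dataclasses import dataclass

    @dataclass
    class DetectableRange:
        """Detectable range AR modeled as a circle in the sensor plane (assumption)."""
        cx: float
        cy: float
        radius: float

    def range_for_second_action(base: DetectableRange, second_action: str) -> DetectableRange:
        # Gestures with a movement width (e.g. a swipe, or moving the hand away)
        # get a wider range than the one used to detect the swing-up action.
        if second_action in ("swipe", "move_away"):
            return DetectableRange(base.cx, base.cy, base.radius * 1.5)
        return base

    base_ar = DetectableRange(cx=0.0, cy=0.0, radius=1.0)
    print(range_for_second_action(base_ar, "swipe").radius)  # -> 1.5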
(2) Correction based on information on the device state
Next, correction based on information on the device state will be described.
The correction unit 166 corrects at least one of the detectable range AR and the reference direction based on the information acquired in step S103. For example, the correction unit 166 corrects at least one of the detectable range AR and the reference direction based on information about the mounting state of the information processing apparatus 10 on the user U, which is estimated from the information from the state detection unit 12. The mounting tilt is, for example, the tilt of the information processing apparatus 10 in its current state relative to a standard mounting state. The mounting tilt may be estimated by the estimation unit 165 of the control unit 16. The correction unit 166 may then correct at least one of the detectable range AR and the reference direction based on the estimation result of the estimation unit 165.
Further, the correction unit 166 corrects at least one of the detectable range AR and the reference direction based on information about the posture of the user U, which is estimated from the information from the state detection unit 12. The information about the posture of the user U is, for example, information indicating that the user is lying on his or her back. The information about the posture of the user U may be information about the direction of gravity detected by an acceleration sensor or a gyro sensor. The posture of the user U may be estimated by the estimation unit 165 of the control unit 16. The correction unit 166 may then correct at least one of the detectable range AR and the reference direction based on the estimation result of the estimation unit 165.
This makes it possible to improve the accuracy of gesture detection while suppressing false detection.
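A minimal sketch of the posture-based correction follows, assuming the gravity direction is obtained as an acceleration vector and the reference direction is an angle in the sensor plane; the gravity angle expected when the user is upright is an illustrative assumption.

    import math
    from typing import Tuple

    def gravity_angle_deg(accel: Tuple[float, float, float]) -> float:
        """Angle of the gravity direction projected onto the sensor (x, y) plane."""
        ax, ay, _ = accel
        return math.degrees(math.atan2(ay, ax))

    def correct_for_posture(reference_deg: float,
                            accel: Tuple[float, float, float],
                            upright_gravity_deg: float = -90.0) -> float:
        """Rotate the reference direction by how far the measured gravity direction
        deviates from the gravity direction expected when the user is upright."""
        deviation = gravity_angle_deg(accel) - upright_gravity_deg
        deviation = (deviation + 180.0) % 360.0 - 180.0
        return reference_deg + deviation

    # Example: the user is lying down, so gravity points along +x instead of -y,
    # and the reference direction is rotated by the resulting 90-degree deviation.
    print(correct_for_posture(0.0, (1.0, 0.0, 0.0)))  # -> 90.0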
Next, the gesture detection unit 162 of the control unit 16 determines whether a gesture has been detected (step S105). For example, the gesture detection unit 162 determines whether, after the first action was detected in the detection region OR, the second action was also detected in the detection region OR. When no gesture is detected (step S105: No), the gesture detection unit 162 returns the process to step S101.
Subsequently, when a gesture is detected (step S105: Yes), the gesture detection unit 162 determines whether the detected gesture was executed within the detectable range AR (step S106). For example, the gesture detection unit 162 determines whether, after the first action was detected in the detectable range AR, the second action was also detected in the detectable range AR. When the second action is a gesture that involves a movement width, such as a left/right swipe, whether the gesture was executed within the detectable range AR may be determined by whether both its start position and end position fall within the detectable range AR.
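The determination in step S106 could be sketched as follows; modeling the detectable range AR as a rectangle and representing the gesture by its start and end positions are assumptions made only for illustration.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class RectRange:
        """Detectable range AR modeled as an axis-aligned rectangle (assumption)."""
        x_min: float
        y_min: float
        x_max: float
        y_max: float

        def contains(self, p: Tuple[float, float]) -> bool:
            x, y = p
            return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

    def gesture_within_range(ar: RectRange,
                             start: Tuple[float, float],
                             end: Tuple[float, float],
                             has_movement_width: bool) -> bool:
        # For a gesture with a movement width (e.g. a left/right swipe), both the
        # start position and the end position must fall within the detectable range AR.
        if has_movement_width:
            return ar.contains(start) and ar.contains(end)
        return ar.contains(start)

    ar = RectRange(0.0, 0.0, 10.0, 10.0)
    print(gesture_within_range(ar, (1.0, 5.0), (9.0, 5.0), True))   # -> True
    print(gesture_within_range(ar, (1.0, 5.0), (12.0, 5.0), True))  # -> False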
Note that the detectable range AR for detecting the first action and the detectable range AR for detecting the second action may differ in size. For example, when the second action is a gesture that involves a movement width, the control unit 16 may make the detectable range for detecting the second action wider than the detectable range AR at the time the swing-up action was detected.
Further, in order to make it easier for the user U to perform an input operation by gesture, when the first action is detected in the detectable range AR, the output control unit 164 may control the output unit 13 to notify the user that gesture detection is being accepted. The notification given by the output control unit 164 at this time may be a sound.
If the gesture is not executed within the detectable range (step S106: No), the gesture detection unit 162 returns the process to step S101.
Note that, when the hand H of the user U enters the detectable range AR from the non-detectable range NR, the gesture detection unit 162 may refrain from detecting the gesture, regardless of whether the second action is performed within the detectable range AR.
FIGS. 19A and 19B are diagrams showing an example of a case where the gesture detection unit 162 does not detect a gesture. As shown in FIGS. 19A and 19B, the detectable range AR is arranged unevenly within the detection region OR so that a part of the edge of the detectable range AR touches a part of the edge of the detection region OR. More specifically, the detectable range AR is arranged so as to be biased toward the front side of the user U. As shown in FIG. 19A, when the hand H of the user U enters the detectable range AR from the non-detectable range NR, the entry of the hand H into the detectable range AR is assumed not to be the result of a swing-up action. Therefore, even if the second action is performed within the detectable range AR as shown in FIG. 19B, the gesture detection unit 162 does not detect the gesture.
On the other hand, when the hand H of the user U enters the detectable range AR directly, not from the non-detectable range NR, the gesture detection unit 162 detects the gesture on the condition that the second action is performed within the detectable range AR.
FIGS. 20A and 20B are diagrams showing an example of a case where the gesture detection unit 162 detects a gesture. As shown in FIG. 20A, the user U moves the hand H directly into the detectable range AR, and then performs the second action as shown in FIG. 20B. In the example of FIG. 20B, since the second action is a gesture that involves a movement width, the control unit 16 makes the detectable range for detecting the second action wider than the detectable range AR at the time the first action was detected (the detectable range AR shown in FIG. 20A). In the example of FIG. 20B, since the second action is performed within the detectable range AR, the gesture detection unit 162 detects the gesture.
In this way, when the hand H of the user U enters the detectable range AR from the non-detectable range NR, the gesture detection unit 162 does not detect the gesture, and when the hand H of the user U enters the detectable range AR directly, not from the non-detectable range NR, the gesture detection unit 162 detects the gesture. This further reduces the possibility of erroneous gesture detection.
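The entry-path condition illustrated in FIGS. 19A to 20B could be sketched as follows; the track of hand positions and the helper that classifies where the hand first appeared are assumptions introduced only for explanation.

    from typing import List, Tuple

    Point = Tuple[float, float]

    def first_region(track: List[Point], in_ar, in_or) -> str:
        """Classify where the hand was first observed: in AR, in NR (inside the
        detection region OR but outside AR), or outside the detection region."""
        for p in track:
            if in_ar(p):
                return "AR"
            if in_or(p):
                return "NR"
        return "outside"

    def accept_gesture(track: List[Point], second_action_in_ar: bool, in_ar, in_or) -> bool:
        # A gesture is rejected when the hand entered AR by way of the non-detectable
        # range NR, because that entry is assumed not to be the result of a swing-up
        # action; a direct entry into AR is accepted on the condition that the second
        # action is performed within AR.
        return first_region(track, in_ar, in_or) == "AR" and second_action_in_ar

    in_ar = lambda p: 0.0 <= p[0] <= 5.0 and 0.0 <= p[1] <= 5.0
    in_or = lambda p: -5.0 <= p[0] <= 5.0 and -5.0 <= p[1] <= 5.0

    print(accept_gesture([(-3.0, 1.0), (2.0, 1.0)], True, in_ar, in_or))  # via NR -> False
    print(accept_gesture([(2.0, 1.0), (4.0, 1.0)], True, in_ar, in_or))   # direct  -> True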
Returning to FIG. 18, when the gesture is executed within the detectable range (step S106: Yes), the command execution unit 163 of the control unit 16 executes the command corresponding to the detected gesture (step S107). For example, the command execution unit 163 executes the function of the information processing apparatus 10 associated with the detected gesture.
Next, the correction unit 166 executes a correction process related to gesture detection (step S108). For example, the correction unit 166 corrects at least one of the detectable range and the reference direction based on information about the gesture detected by the gesture detection unit 162. For example, the correction unit 166 widens or narrows the detectable range AR based on information about the start position and end position of a swipe. The correction unit 166 also finely adjusts the reference direction based on the movement direction of a detected left/right swipe or up/down swipe.
When the correction process is completed, the control unit 16 returns the process to step S101 and executes the process of steps S101 to S108 again.
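The correction of step S108 might look like the following sketch; the rule for growing or shrinking the range and the blending weight applied to the reference direction are illustrative assumptions.

    from typing import Tuple

    def adjust_range_from_swipe(radius: float,
                                start: Tuple[float, float],
                                end: Tuple[float, float],
                                margin: float = 0.1) -> float:
        """Widen the detectable range if the detected swipe barely fit inside it,
        and narrow it slightly otherwise (step S108, range correction)."""
        reach = max(abs(start[0]), abs(start[1]), abs(end[0]), abs(end[1]))
        if reach > radius * (1.0 - margin):
            return radius * 1.1   # the swipe grazed the edge: widen
        return radius * 0.98      # the swipe fit comfortably: shrink a little

    def adjust_reference_direction(reference_deg: float,
                                   observed_swipe_deg: float,
                                   expected_swipe_deg: float,
                                   weight: float = 0.2) -> float:
        """Fine-tune the reference direction toward the movement direction
        actually observed for a left/right or up/down swipe."""
        error = (observed_swipe_deg - expected_swipe_deg + 180.0) % 360.0 - 180.0
        return reference_deg + weight * error

    print(adjust_range_from_swipe(1.0, (0.0, 0.0), (0.97, 0.0)))  # widened
    print(adjust_reference_direction(0.0, 10.0, 0.0))             # -> 2.0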
According to the present embodiment, since the information processing apparatus 10 corrects the detectable range, situations in which a movement the user does not intend as a gesture is detected as a gesture can be reduced. Further, since the information processing apparatus 10 corrects the reference direction, situations in which a gesture intended by the user is mistaken for another gesture are also reduced.
<<3. Embodiment 2>>
In the first embodiment, the information processing apparatus 10 detected the movement of the user's hand H and executed a command. However, the device that detects the movement of the user's hand H and the device that executes the command may be different devices. Hereinafter, the information processing system 1 of the second embodiment will be described.
<3-1. Information processing system configuration>
FIG. 21 is a diagram showing a configuration example of the information processing system 1. The information processing system 1 is a system that executes various functions based on the user's gestures. As shown in FIG. 21, the information processing system 1 includes an output device 20 and a terminal device 30. In the example of FIG. 21, the information processing system 1 includes an output device 20A and an output device 20B. In the example of FIG. 21, the output device 20 and the terminal device 30 are connected wirelessly, but the output device 20 and the terminal device 30 may be configured to be connectable by wire.
The information processing system 1 may include only one output device 20 or may include a plurality of output devices 20. For example, if the output device 20 is an earphone, the information processing system 1 may include a pair of output devices 20 that are wirelessly connected to the terminal device 30 and are worn on the left and right ears of the user U, respectively. As an example, the output device 20A is an earphone worn on the left ear of the user U, and the output device 20B is an earphone worn on the right ear of the user U.
Note that one output device 20 does not necessarily have to be a single integrated device. A plurality of separate devices that are functionally or practically related can also be regarded as one output device 20. For example, a pair of left and right earphones worn on the left and right ears of the user U may be regarded as one output device 20. Of course, one output device 20 may be a single integrated earphone worn on one ear of the user U, or a single integrated pair of headphones worn on the left and right ears of the user U.
<3-2. Output device configuration>
First, the configuration of the output device 20 will be described.
The output device 20 is an acoustic output device that can be worn by the user, such as an earphone or headphones. The output device 20 may also be a display device that can be worn by the user, such as AR glasses or MR glasses. In the present embodiment, the output device 20 is assumed to be a device that can be worn by the user U and to have a portion that is located at least at the ear of the user U when worn. At least a detection unit for detecting an object is arranged in this portion. Here, taking the first embodiment as an example, the detection unit is a functional block corresponding to the input detection unit 11. If there are two portions, left and right, located at the ears of the user U, each of the two portions may have a detection unit, or only one of the portions may have a detection unit.
FIG. 22 is a diagram showing a configuration example of the output device 20 according to the embodiment of the present disclosure. The output device 20 includes an input detection unit 21, a state detection unit 22, an output unit 23, a communication unit 24, and a control unit 26. The configuration shown in FIG. 22 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the output device 20 may be distributed and implemented in a plurality of physically separated configurations.
The input detection unit 21 is a detection unit that detects a user's input operation. The state detection unit 22 is a sensor unit that detects the state of the output device 20. The output unit 23 is an output interface that outputs information to the user. The communication unit 24 is a communication interface for communicating with other devices such as the terminal device 30. The control unit 26 is a controller that controls each unit of the output device 20.
In other respects, the configurations of the input detection unit 21, the state detection unit 22, the output unit 23, the communication unit 24, and the control unit 26 are the same as those of the input detection unit 11, the state detection unit 12, the output unit 13, the communication unit 14, and the control unit 16 included in the information processing apparatus 10. In this case, the description of the information processing apparatus 10 may be replaced with the output device 20 as appropriate.
<3-3. Terminal device configuration>
Next, the configuration of the terminal device 30 will be described.
The terminal device 30 is an information processing terminal capable of communicating with the output device 20. The terminal device 30 is a kind of information processing apparatus of the present embodiment. The terminal device 30 is, for example, a mobile phone, a smart device (smartphone or tablet), a PDA (Personal Digital Assistant), or a personal computer. The terminal device 30 may also be a device such as a professional camera equipped with a communication function, or a mobile body equipped with a communication device such as an FPU (Field Pickup Unit). Further, the terminal device 30 may be an M2M (Machine to Machine) device or an IoT (Internet of Things) device. The terminal device 30 controls the output device 20 from outside the output device 20 via a wired or wireless connection.
FIG. 23 is a diagram showing a configuration example of the terminal device 30 according to the embodiment of the present disclosure. The terminal device 30 includes an input unit 31, a state detection unit 32, an output unit 33, a communication unit 34, a storage unit 35, and a control unit 36. The configuration shown in FIG. 23 is a functional configuration, and the hardware configuration may be different from this. Further, the functions of the terminal device 30 may be distributed and implemented in a plurality of physically separated configurations.
The input unit 31 is an input interface that accepts user input operations. For example, the input unit 31 is a button or a touch panel. The state detection unit 32 is a sensor unit that detects the state of the terminal device 30. The output unit 33 is an output interface that outputs information to the user. The communication unit 34 is a communication interface for communicating with other devices such as the output device 20. The storage unit 35 is a storage device capable of reading and writing data. The storage unit 35 stores, for example, information about the detectable range AR of gestures. The control unit 36 is a controller that controls each unit of the terminal device 30. The control unit 36 includes an acquisition unit 361, a gesture detection unit 362, a command execution unit 363, an output control unit 364, an estimation unit 365, and a correction unit 366.
In other respects, the configurations of the input detection unit 21, the state detection unit 22, the output unit 23, the communication unit 24, and the control unit 26 are the same as those of the input detection unit 11, the state detection unit 12, the output unit 13, the communication unit 14, and the control unit 16 included in the information processing apparatus 10. Further, the configurations of the acquisition unit 361, the gesture detection unit 362, the command execution unit 363, the output control unit 364, the estimation unit 365, and the correction unit 366 are the same as those of the acquisition unit 161, the gesture detection unit 162, the command execution unit 163, the output control unit 164, the estimation unit 165, and the correction unit 166 included in the control unit 16 of the information processing apparatus 10, except that the acquisition unit 361 acquires information from the input detection unit 21 and the state detection unit 22 via communication. In this case, the description of the information processing apparatus 10 may be replaced with the output device 20 or the terminal device 30 as appropriate.
<3-4. Information processing system operation>
The configuration of the information processing system 1 has been described above. Next, the operation of the information processing system 1 having such a configuration will be described.
(Command execution process)
The information processing system 1 can execute the command execution process in the same manner as the information processing apparatus 10 of the first embodiment. The command execution process of the information processing system 1 is the same as the command execution process of the first embodiment, except that the terminal device 30 acquires sensor values from the output device 20. Hereinafter, the command execution process will be described with reference to FIG. 18, as in the first embodiment.
FIG. 18 is a flowchart showing a command execution process according to the present embodiment. The command execution process is executed by the control unit 36 of the terminal device 30. The command execution process is executed, for example, when the terminal device 30 establishes communication with the output device 20.
First, the acquisition unit 361 of the control unit 36 acquires sensor values from the input detection unit 21 and the state detection unit 22 of the output device 20 via the communication unit 34 (step S101). For example, the acquisition unit 361 acquires information about an object in the detection region OR from the non-contact sensor of the output device 20. The acquisition unit 361 also acquires information about contact with the housing surface of the output device 20 from the contact sensor of the output device 20. Further, the acquisition unit 361 acquires information about the state of the output device 20 from sensors of the output device 20, such as an acceleration sensor.
Subsequently, the acquisition unit 361 acquires information regarding the gesture of the user U based on the sensor value acquired in step S101 (step S102). For example, the acquisition unit 361 acquires information on the operation of the user U regarding the gesture based on the sensor value from the input detection unit 21.
Subsequently, the acquisition unit 361 acquires information regarding the state of the output device 20 based on the sensor value acquired in step S101 (step S103). For example, the acquisition unit 361 acquires information regarding the state of the output device 20 based on the sensor value from the state detection unit 22.
Next, the correction unit 366 of the control unit 36 executes a correction process related to gesture detection (step S104). For example, the correction unit 366 corrects at least one of the detectable range of the gesture and the reference direction for detecting the gesture.
Next, the gesture detection unit 362 of the control unit 36 determines whether or not the gesture has been detected (step S105). For example, the gesture detection unit 362 determines whether or not the first operation is detected in the detection area OR and then the second operation is also detected in the detection area OR. When the gesture is not detected (step S105: No), the gesture detection unit 362 returns the process to step S101.
Subsequently, when the gesture is detected (step S105: Yes), the gesture detection unit 362 determines whether the detected gesture is executed within the detectable range AR (step S106). For example, the gesture detection unit 362 determines whether the first motion is detected in the detectable range AR and then the second motion is detected in the detectable range AR.
If the gesture is not executed within the detectable range (step S106: No), the gesture detection unit 362 returns the process to step S101.
On the other hand, when the gesture is executed within the detectable range (step S106: Yes), the command execution unit 363 of the control unit 36 executes the command corresponding to the detected gesture (step S107). For example, the command execution unit 363 executes the function of the output device 20 associated with the detected gesture.
Next, the correction unit 366 executes a correction process related to gesture detection (step S108). For example, the correction unit 366 corrects at least one of the detectable range and the reference direction based on information about the gesture detected by the gesture detection unit 362.
When the correction process is completed, the control unit 36 returns the process to step S101 and executes the processes of steps S101 to S108 again.
Also in this embodiment, since the terminal device 30 corrects the detectable range, situations in which a movement the user does not intend as a gesture is detected as a gesture can be reduced. Further, since the terminal device 30 corrects the reference direction, situations in which a gesture intended by the user U is mistaken for another gesture can also be reduced.
(Manual setting of the detectable range)
In the above embodiments, the terminal device 30 automatically sets the detectable range AR based on sensor values. However, the terminal device 30 may be configured so that the user U can manually set the detectable range AR.
FIG. 24 is a diagram showing how the user U manually sets the size of the detectable range AR. The terminal device 30 is configured so that the user U can change the setting of the size of the detectable range AR by using a GUI (Graphical User Interface). Hereinafter, the processing of the terminal device 30 that enables manual setting of the size of the detectable range AR will be described with reference to FIG. 24. The following processing is executed by the control unit 36 of the terminal device 30.
First, the acquisition unit 361 of the control unit 36 acquires information about the detectable range AR from the storage unit 35. For example, the storage unit 35 stores information indicating the current setting of the detectable range AR. The acquisition unit 361 acquires this information from, for example, the storage unit 35. If information indicating the current setting of the detectable range AR is stored in another device (for example, the output device 20 or a device on a cloud), the acquisition unit 361 may acquire the information from that device via the communication unit 34. Then, the output control unit 364 renders the current setting of the detectable range AR as an image based on the acquired information so that the user can view it, and displays it on the output unit 33 (step S201).
In the example of FIG. 24, a part of the detection region OR is the detectable range AR. The acquisition unit 361 acquires operation information of the user U from the input unit 31 of the terminal device 30 (step S202). For example, the acquisition unit 361 acquires, from the input unit 31, information about an operation of the user U to change the size of the detectable range AR. In the example of FIG. 24, the terminal device 30 includes a touch panel display as the input unit 31, and the acquisition unit 361 acquires information about an operation of the user U to widen the detectable range AR. More specifically, in the example of FIG. 24, the acquisition unit 361 acquires information indicating that the user U is pinching out with the hand H on the detectable range AR displayed on the touch panel display.
The output control unit 364 visualizes the detectable range AR related to the change operation of the user U based on the information acquired in step S202 so that the user U can see it, and displays it on the output unit 33 (step S203).
Then, when the control unit 36 detects an operation by the user U instructing a change of the detectable range AR (for example, pressing an OK button), the control unit 36 updates the setting of the detectable range AR stored in the storage unit 35 based on the information of the detectable range AR displayed in step S203.
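A minimal sketch of how the pinch-out operation of FIG. 24 could be converted into a new size for the detectable range AR and then persisted; the JSON file standing in for the storage unit 35 and the proportional scaling rule are assumptions made only for illustration.

    import json
    from pathlib import Path

    SETTINGS_PATH = Path("detectable_range.json")  # stands in for the storage unit 35 (assumption)

    def load_range() -> dict:
        if SETTINGS_PATH.exists():
            return json.loads(SETTINGS_PATH.read_text())
        return {"cx": 0.0, "cy": 0.0, "radius": 1.0}  # default setting

    def apply_pinch(setting: dict, pinch_start_dist: float, pinch_end_dist: float) -> dict:
        # A pinch-out (fingers moving apart) enlarges the displayed range by the
        # same ratio; a pinch-in shrinks it.
        scale = pinch_end_dist / pinch_start_dist
        return {**setting, "radius": setting["radius"] * scale}

    def confirm_and_save(setting: dict) -> None:
        # Called when the user presses the OK button: the displayed setting
        # replaces the stored setting of the detectable range AR.
        SETTINGS_PATH.write_text(json.dumps(setting))

    current = load_range()
    preview = apply_pinch(current, pinch_start_dist=40.0, pinch_end_dist=60.0)
    confirm_and_save(preview)
    print(preview["radius"])  # -> 1.5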
The terminal device 30 may be configured so that the user U can manually change the position of the detectable range AR in the detection area OR.
FIG. 25 is a diagram showing how the user U manually sets the position of the detectable range AR. The terminal device 30 is configured so that the user U can change the setting of the position of the detectable range AR by using the GUI. Hereinafter, the processing of the terminal device 30 that enables the manual setting of the position of the detectable range AR will be described with reference to FIG. 25. The following processing is executed by the control unit 36 of the terminal device 30.
First, the acquisition unit 361 of the control unit 36 acquires information on the detectable range AR from the storage unit 35. Then, the output control unit 364 visualizes the current detectable range AR setting based on the acquired information so that the user can visually recognize it, and displays it on the output unit 33 (step S301).
In the example of FIG. 25, a part of the detection region OR is the detectable range AR. The acquisition unit 361 acquires operation information of the user U from the input unit 31 of the terminal device 30 (step S302). For example, the acquisition unit 361 acquires, from the input unit 31, information about an operation of the user U to change the position of the detectable range AR. In the example of FIG. 25, the terminal device 30 includes a touch panel display as the input unit 31, and the acquisition unit 361 acquires information about an operation of the user U to change the position of the detectable range AR. More specifically, in the example of FIG. 25, the acquisition unit 361 acquires information indicating that the user U touches the detectable range AR displayed on the touch panel display and swipes the finger toward the position to which the user U wants to move it.
The output control unit 364 visualizes the detectable range AR related to the change operation of the user U based on the information acquired in step S302 so that the user U can see it, and displays it on the output unit 33 (step S303).
Then, when the control unit 36 detects an operation by the user U instructing a change of the detectable range AR (for example, pressing an OK button), the control unit 36 updates the setting of the detectable range AR stored in the storage unit 35 based on the information of the detectable range AR displayed in step S303.
By allowing the user U to manually set the size and position of the detectable range AR, the detectable range AR can be set to suit the individual characteristics of the user U. As a result, erroneous gesture detection is reduced. In addition to the GUI (Graphical User Interface), the detectable range AR or the non-detectable range NR may be adjustable in conjunction with, for example, the voice recognition function of the terminal device 30. For example, the detectable range AR may be changed step by step when the user's utterance "widen the detectable range AR" is recognized.
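The voice-driven adjustment could be sketched as follows, assuming the voice recognition result is available as a plain text string; the step size and the recognized phrases are illustrative assumptions.

    STEP = 0.2  # how much the radius changes per recognized utterance (assumption)

    def adjust_by_voice(radius: float, utterance: str) -> float:
        """Change the detectable range AR step by step based on a recognized utterance."""
        text = utterance.strip().lower()
        if "widen" in text:
            return radius + STEP
        if "narrow" in text:
            return max(STEP, radius - STEP)
        return radius

    r = 1.0
    r = adjust_by_voice(r, "widen the detectable range AR")
    print(r)  # -> 1.2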
(Manual setting of gesture assignment)
In the above embodiments, the combinations of gestures and functions are predetermined. However, the terminal device 30 may be configured so that the user U can manually set the combinations of gestures and functions.
In this case, the terminal device 30 may be configured to store information on the operation history related to the gestures of the user U in the storage unit 35. The terminal device 30 may then aggregate the gestures frequently used by the user U and their success rates based on the operation history information, and display the aggregated results so that the user U can view them. FIG. 26 is a diagram showing how the gestures frequently used by the user U and their success rates are displayed. In the example of FIG. 26, it can be seen that the user U uses the swipe action frequently, but the success rate of the swipe action is low.
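The aggregation of the operation history into the usage counts and success rates shown in FIG. 26 could be sketched as follows; the format of the history records is an assumption introduced only for illustration.

    from collections import defaultdict
    from typing import Dict, List, Tuple

    # Each history entry is (gesture name, whether the detection succeeded) - assumed format.
    History = List[Tuple[str, bool]]

    def aggregate_history(history: History) -> Dict[str, Dict[str, float]]:
        counts = defaultdict(lambda: {"uses": 0, "successes": 0})
        for gesture, ok in history:
            counts[gesture]["uses"] += 1
            if ok:
                counts[gesture]["successes"] += 1
        return {
            g: {"uses": c["uses"], "success_rate": c["successes"] / c["uses"]}
            for g, c in counts.items()
        }

    history = [("swipe", False), ("swipe", True), ("swipe", False), ("pinch", True)]
    print(aggregate_history(history))
    # -> swipe used 3 times with a low success rate, pinch used once with success rate 1.0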
The terminal device 30 may be configured so that the user U can change the association between gestures and functions. For example, the terminal device 30 may have a GUI for changing the association between gestures and functions. In the example of FIG. 26, the user U uses the GUI, for example, to change the association so that the function associated with the swipe action (for example, Next/Prev) is associated with the pinch action instead.
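The reassignment performed on the GUI could be modeled as a simple update of a gesture-to-function table, as in the sketch below; the table contents are illustrative assumptions.

    from typing import Dict

    def reassign(mapping: Dict[str, str], function: str, new_gesture: str) -> Dict[str, str]:
        """Move a function (e.g. Next/Prev) from whatever gesture currently
        triggers it to the gesture chosen by the user on the GUI."""
        updated = {g: f for g, f in mapping.items() if f != function}
        updated[new_gesture] = function
        return updated

    mapping = {"swipe": "Next/Prev", "tap": "Play/Pause"}
    print(reassign(mapping, "Next/Prev", "pinch"))
    # -> {'tap': 'Play/Pause', 'pinch': 'Next/Prev'}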
The terminal device 30 may be configured to propose the detectable range AR to the user based on the information in the operation history. Further, the terminal device 30 may be configured so that the user can change the detectable range AR based on the proposal.
By allowing the user U to manually set the association between gestures and functions, the gestures and functions can be associated in a way that suits the individual characteristics of the user U. This allows the user U to associate gestures with high success rates with frequently used functions, so erroneous gesture detection is reduced.
<<4. Embodiment 3>>
In the first and second embodiments, a non-contact sensor was mainly used to determine the user's operation on the device. In the third embodiment, a contact sensor and a non-contact sensor are combined to determine the user's device operation. Hereinafter, the information processing apparatus 40 of the third embodiment will be described.
<4-1. Outline of Embodiment 3>
In gesture detection using a non-contact sensor, the detectable range of a gesture changes depending on the orientation of the user's face and the user's posture. For this reason, gesture detection using a non-contact sensor tends to have low detection accuracy. This is because, although the position of the gesture is determined absolutely by the placement of the sensor, the user perceives the gesture position relatively, regardless of his or her posture. To solve this problem, widening the detectable range of the gesture is conceivable. However, if the detectable range of the gesture is widened unnecessarily, the information processing apparatus may erroneously recognize other, easily confused movements as the gesture. Therefore, it is desirable to limit the detectable range as much as possible.
Device operation by gesture (aerial operation) places only a small load on the user. However, because gestures vary greatly from person to person (that is, the gesture position is not reproducible), the accuracy of device operation is low. On the other hand, device operation by contact with the device or the human body (physical operation) offers high accuracy. However, in physical operation, the operation surface is limited, so the operation area is narrow and the number of commands that can be input is small.
Therefore, in the third embodiment, a contact sensor and a non-contact sensor are combined. More specifically, the information processing apparatus 40 of the third embodiment performs a correction related to gesture detection or a correction related to device operation based on sensor information from the contact sensor and sensor information from the non-contact sensor.
For example, the information processing apparatus 40 detects the user's gesture using at least the sensor information from the contact sensor, and estimates the gesture position at that time based on the sensor information from the non-contact sensor. The information processing apparatus 40 then corrects the detectable range of the gesture based on the information of the estimated gesture position. As a result, the detectable range of the gesture is updated as needed according to the user's usage situation, so the gesture detection accuracy is improved.
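This combination could be sketched as follows, assuming the non-contact sensor reports the hand position at the moment the contact sensor detects the gesture; the running-average update of the range center is an illustrative choice, not the method defined by the embodiment.

    from typing import Optional, Tuple

    class GestureRangeLearner:
        """Re-centers the detectable range around where gestures actually happen."""

        def __init__(self, cx: float, cy: float, radius: float, alpha: float = 0.3):
            self.cx, self.cy, self.radius = cx, cy, radius
            self.alpha = alpha  # how strongly one observation moves the range (assumption)

        def on_contact_gesture(self, hand_pos: Optional[Tuple[float, float]]) -> None:
            # The gesture itself was detected by the contact sensor; the non-contact
            # sensor supplies the hand position used to correct the detectable range.
            if hand_pos is None:
                return
            x, y = hand_pos
            self.cx += self.alpha * (x - self.cx)
            self.cy += self.alpha * (y - self.cy)

    learner = GestureRangeLearner(cx=0.0, cy=0.0, radius=1.0)
    learner.on_contact_gesture((1.0, 0.0))
    print(round(learner.cx, 2))  # -> 0.3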
The information processing apparatus 40 may also be configured to detect the user's gesture using at least the sensor information from the non-contact sensor. The information processing apparatus 40 may then correct the detectable range of the gesture based on the sensor information from the non-contact sensor at the time the user touches the face or the information processing apparatus 40. As a result, the detectable range of the gesture is corrected with reference to the position of the physical contact, so the gesture detection accuracy can be improved.
The information processing apparatus 40 may be configured to detect the user's gesture using at least the sensor information from the contact sensor. The information processing apparatus 40 may then be configured to execute a command corresponding to the detected gesture. The commands may include at least a command accompanied by an operation amount. In this case, the information processing apparatus 40 may determine the operation amount based on the sensor information from the non-contact sensor. This allows the information processing apparatus 40 to determine the operation amount using the non-contact sensor while realizing highly accurate gesture detection using the contact sensor.
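A command with an operation amount could be sketched as follows, assuming a tap detected by the contact sensor triggers a volume change whose amount is taken from the change in hand distance reported by the non-contact sensor; the mapping from distance to volume steps is an illustrative assumption.

    def volume_steps_from_distance(start_dist_cm: float, end_dist_cm: float,
                                   cm_per_step: float = 2.0) -> int:
        """Turn the change in hand distance measured by the non-contact sensor
        into a number of volume steps (the operation amount)."""
        return round((end_dist_cm - start_dist_cm) / cm_per_step)

    def execute_volume_command(tap_detected: bool, start_dist_cm: float, end_dist_cm: float,
                               current_volume: int) -> int:
        # The command itself is triggered by the contact sensor (a tap on the housing);
        # the operation amount comes from the non-contact sensor.
        if not tap_detected:
            return current_volume
        return max(0, min(100, current_volume + volume_steps_from_distance(start_dist_cm, end_dist_cm)))

    print(execute_volume_command(True, 4.0, 10.0, 50))  # -> 53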
<4-2. Information processing device configuration>
The outline of the third embodiment has been described above. Next, the configuration of the information processing apparatus 40 will be described.
FIG. 27 is a diagram showing a configuration example of the information processing apparatus 40 according to the third embodiment. The information processing apparatus 40 includes an input detection unit 41, a state detection unit 42, an output unit 43, a communication unit 44, a storage unit 45, and a control unit 46. The configuration shown in FIG. 27 is a functional configuration, and the hardware configuration may differ from this. The functions of the information processing apparatus 40 may also be distributed and implemented in a plurality of physically separated configurations. For example, the functions of the information processing apparatus 40 may be distributed and implemented in the output device 20 and the terminal device 30, as in the second embodiment.
The input detection unit 41 is a detection unit that detects a user's input operation. The input detection unit 41 includes a non-contact type detection unit (hereinafter referred to as a non-contact sensor) that detects an object without contact, and a contact type detection unit (hereinafter referred to as a contact sensor) that detects contact with the user's body or with an object.
The non-contact sensor may be composed of one or a plurality of proximity sensors that detect an object located in the detection area. The proximity sensor may be an optical sensor or a capacitance type sensor. An optical proximity sensor is a sensor that detects the light reflected by an object. The capacitance type proximity sensor is a sensor that detects a change in capacitance that occurs between an object and the sensor. The non-contact sensor may be a 2D sensor or a 3D sensor.
The contact sensor includes one or a plurality of touch sensors, and detects an object that touches a predetermined location of the information processing apparatus 40. If the information processing apparatus 40 is an earphone-type device, the predetermined location is, for example, the side surface of the earphone (the surface opposite the speaker surface). If the information processing apparatus 40 is a headphone-type device, the predetermined location is, for example, the side surface of the headphones (the surface opposite the speaker surface). In the following description, the side surface of the earphone or the headphones may be referred to as the housing surface.
また、入力検出部11は、接触式のセンサとして、タッチパッドTPを備えている。タッチパッドTPは、面状に配置された複数のタッチセンサで構成される。赤外発光体IDは、ハウジング面に配置されている。入力検出部11は、タッチパッドTPの表面に接触した物体を検出する。
Further, the input detection unit 11 is provided with a touch pad TP as a contact type sensor. The touch pad TP is composed of a plurality of touch sensors arranged in a plane. The infrared emitter ID is arranged on the housing surface. The input detection unit 11 detects an object in contact with the surface of the touch pad TP.
なお、非接触センサは、機器への接触を検出するセンサに限定されない。例えば、非接触センサは、情報処理装置40を装着するユーザの人体への接触を検出するセンサであってもよい。例えば、情報処理装置40がイヤホン型、或いはヘッドホン型の装置なのであれば、非接触センサは、情報処理装置40を装着するユーザの顔への接触を検出するセンサであってもよい。この場合、非接触センサは、例えば、ユーザが自身の顔へ手を接触させた際の振動を検出することにより、ユーザの顔への接触を検出するよう構成されていてもよい。
The non-contact sensor is not limited to the sensor that detects contact with the device. For example, the non-contact sensor may be a sensor that detects contact with the human body of the user who wears the information processing device 40. For example, if the information processing device 40 is an earphone type or headphone type device, the non-contact sensor may be a sensor that detects contact with the face of the user wearing the information processing device 40. In this case, the non-contact sensor may be configured to detect contact with the user's face, for example, by detecting vibration when the user touches his / her face with his / her hand.
The state detection unit 42 is a sensor unit that performs detection related to the state of the information processing apparatus 40. The state detection unit 42 may also be configured to detect the state of the user.
As shown in FIG. 27, the control unit 46 includes a first acquisition unit 461A, a second acquisition unit 461B, a gesture detection unit 462, a command execution unit 463, an output control unit 464, a guessing unit 465, and a correction unit 466. Each block constituting the control unit 46 (the first acquisition unit 461A to the correction unit 466) is a functional block representing a function of the control unit 46. These functional blocks may be software blocks or hardware blocks. For example, each of the above functional blocks may be one software module realized by software (including microprograms), or may be one circuit block on a semiconductor chip (die). Of course, each functional block may be one processor or one integrated circuit. The method of configuring the functional blocks is arbitrary.
The control unit 46 may be configured in functional units different from the above functional blocks. Further, some or all of the operations of the blocks constituting the control unit 46 (the first acquisition unit 461A to the correction unit 466) may be performed by another device. For example, a terminal device such as a music player or a smartphone may perform some or all of the operations of the blocks constituting the control unit 46. The operation of each block constituting the control unit 46 will be described later.
Other than the above, the configurations of the input detection unit 41, the state detection unit 42, the output unit 43, the communication unit 44, and the control unit 46 are the same as those of the input detection unit 11, the state detection unit 12, the output unit 13, the communication unit 14, and the control unit 16 included in the information processing apparatus 10.
<4-3. Information processing device operation>
The configuration of the information processing apparatus 40 has been described above. Next, the operation of the information processing apparatus 40 having such a configuration will be described.
As described above, in the third embodiment, the contact sensor and the non-contact sensor are combined. More specifically, the information processing apparatus 40 performs a correction related to gesture detection or a correction related to device operation based on the sensor information from the contact sensor and the sensor information from the non-contact sensor. Some specific examples are given below.
<4-3-1. Example 1>
First, Example 1 will be described.
(Contents of Example 1)
Example 1 assumes a gesture made by physical contact with the user's face or with the device. FIG. 28 is a diagram showing the user performing a gesture by physical contact. In the example of FIG. 28, the detectable range A1 of the contact sensor is set to a part of the user's left cheek, and the detectable range A2 of the non-contact sensor is set to the vicinity of the left cheek, including a part of the left cheek. In the example of FIG. 28, a part of the detectable range A1 and a part of the detectable range A2 overlap. In the example of FIG. 28, the user operates the device by tapping the cheek.
In existing implementations, a single operation area is set for contact operation, so the same gesture is recognized no matter where on the face or the device the user taps. The number of available commands may therefore be limited. Therefore, in Example 1, the information processing apparatus 40 estimates the operation area (gesture position) at the time of contact based on the sensor information from the non-contact sensor. As a result, the information processing apparatus 40 can determine in which region within the detectable range A1 the user performed the gesture.
The size and position of the operation area vary from user to user. Therefore, the information processing apparatus 40 corrects the detectable range of the gesture based on the information on the estimated gesture position. For example, when the operation area (gesture position) covers a range equal to or larger than a predetermined ratio of the current detectable range A1, the information processing apparatus 40 enlarges the detectable range A1. When the operation area (gesture position) is located at an edge of the current detectable range A1, the position of the detectable range A1 is corrected so that the operation area comes to its center. As a result, the detectable range A1 of the gesture is updated as needed according to the user's usage, so that the gesture detection accuracy is improved.
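As a concrete illustration of this correction, the following is a minimal Python sketch of how the detectable range A1 might be updated from observed gesture positions. The rectangular range model, the class and function names, and the numeric thresholds are assumptions made for illustration and are not taken from the disclosure.

    from dataclasses import dataclass

    @dataclass
    class DetectableRange:
        cx: float      # center of range A1 (x, device coordinates)
        cy: float      # center of range A1 (y)
        width: float
        height: float

    def update_range(a1, taps, ratio_threshold=0.8, grow_factor=1.2):
        """Adjust the detectable range A1 from observed gesture positions (illustrative only)."""
        if not taps:
            return a1
        xs = [p[0] for p in taps]
        ys = [p[1] for p in taps]
        # If the observed operation area covers most of A1, enlarge A1.
        if (max(xs) - min(xs)) >= ratio_threshold * a1.width or \
           (max(ys) - min(ys)) >= ratio_threshold * a1.height:
            a1 = DetectableRange(a1.cx, a1.cy, a1.width * grow_factor, a1.height * grow_factor)
        # If the operation area sits near an edge of A1, re-center A1 on it.
        mean_x = sum(xs) / len(xs)
        mean_y = sum(ys) / len(ys)
        if abs(mean_x - a1.cx) > a1.width / 4 or abs(mean_y - a1.cy) > a1.height / 4:
            a1 = DetectableRange(mean_x, mean_y, a1.width, a1.height)
        return a1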
Note that the user's motion constituting one gesture may include a first motion that triggers the start of the gesture and a second motion that follows the first motion. In this case, the information processing apparatus 40 may narrow the detectable range of the second motion based on the information on the operation area (gesture position) of the first motion. This makes it possible to prevent erroneous detection of gestures caused by everyday movements.
The information processing apparatus 40 executes the command corresponding to the detected gesture. At this time, even if the gestures detected based on the sensor information from the contact sensor are the same, the information processing apparatus 40 may change the command to be executed when the gesture positions differ. That is, the information processing apparatus 40 may determine the command to be executed based on the gesture detected from the sensor information of the contact sensor and the gesture position detected from the sensor information of the non-contact sensor. As a result, the information processing apparatus 40 can assign many commands to a single gesture while achieving highly accurate gesture detection by physical contact.
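The command selection described above can be pictured as a lookup keyed by both the detected gesture and the estimated gesture region. In the sketch below, the gesture names, the front/rear split, and the command assignments are hypothetical examples, not taken from the disclosure.

    # Hypothetical command table keyed by (gesture, region).
    COMMAND_TABLE = {
        ("tap", "front"): "play_pause",
        ("tap", "rear"): "next_track",
        ("double_tap", "front"): "volume_up",
        ("double_tap", "rear"): "volume_down",
    }

    def resolve_command(gesture, gesture_x, a1_center_x):
        """Pick a command from the contact-sensor gesture and the non-contact gesture position."""
        region = "front" if gesture_x < a1_center_x else "rear"
        return COMMAND_TABLE.get((gesture, region))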
When a part of the detectable range A1 and a part of the detectable range A2 overlap as shown in FIG. 28, the information processing apparatus 40 can also detect the gesture made by physical contact with the non-contact sensor alone.
(Processing example)
FIG. 29 is a flowchart showing an example of the command execution process according to the third embodiment. The command execution process is executed by the control unit 46 of the information processing apparatus 40. The command execution process is executed, for example, when the information processing apparatus 40 is powered on.
First, the second acquisition unit 461B of the information processing apparatus 40 acquires the sensor value of the non-contact sensor (step S201). Then, the guessing unit 465 of the information processing apparatus 40 estimates the gesture position based on the sensor value of the non-contact sensor (step S202). Further, the first acquisition unit 461A of the information processing apparatus 40 acquires the sensor value of the contact sensor (step S203).
Next, the gesture detection unit 462 of the information processing apparatus 40 determines whether a gesture has been detected based on the sensor value of the contact sensor (step S204). When no gesture is detected (step S204: No), the gesture detection unit 462 returns the process to step S201.
On the other hand, when a gesture is detected (step S204: Yes), the guessing unit 465 of the information processing apparatus 40 estimates the device state (step S205). Then, the gesture detection unit 462 of the information processing apparatus 40 determines the gesture based on the estimation result of the device state and the sensor value of the contact sensor (step S206). Then, the correction unit 466 of the information processing apparatus 40 corrects the detectable range of the gesture (the detectable range A1 in the example of FIG. 28) (step S207). After that, the command execution unit 463 executes the command assigned to the determined gesture (step S208). After executing the command, the command execution unit 463 returns the process to step S201.
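The loop below is a rough Python rendering of the flow of FIG. 29. The interfaces of the acquisition, guessing, detection, correction, and command execution units are assumptions made for illustration.

    def command_execution_loop(contact, noncontact, guesser, detector, corrector, executor):
        """Sketch of the FIG. 29 command execution process (interfaces assumed)."""
        while True:
            nc_value = noncontact.read()                          # S201: non-contact sensor value
            gesture_pos = guesser.estimate_position(nc_value)     # S202: estimate gesture position
            c_value = contact.read()                              # S203: contact sensor value
            if not detector.detected(c_value):                    # S204: gesture detected?
                continue
            device_state = guesser.estimate_device_state()        # S205: estimate device state
            gesture = detector.classify(device_state, c_value)    # S206: determine the gesture
            corrector.correct_range_a1(gesture_pos)               # S207: correct detectable range A1
            executor.run(gesture, gesture_pos)                    # S208: execute the assigned command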
<4-3-2. Example 2>
Next, Example 2 will be described.
(Contents of Example 2)
Example 2 assumes a non-contact gesture by the user. FIG. 30 is a diagram showing the user performing a gesture by an aerial motion. In the example of FIG. 30, the detectable range A1 of the contact sensor is set on the information processing apparatus 40 and its surroundings, and the detectable range A2 of the non-contact sensor is set at a position spaced to the left of the user's head. In the example of FIG. 30, the detectable range A1 and the detectable range A2 are separated from each other and do not overlap. In the example of FIG. 30, the user operates the device by performing a predetermined aerial motion to the left of the head.
When gestures are detected mainly with the non-contact sensor, the gesture position may vary greatly depending on the user's state. Therefore, when non-contact gestures cannot be detected well because of variations in the gesture position, the information processing apparatus 40 prompts the user to touch the human body, an object, or an arbitrary point in space. For example, the information processing apparatus 40 prompts the user to tap the face or to physically touch the information processing apparatus 40.
Then, the information processing apparatus 40 calibrates the detectable range A2 of the gesture based on the user's physical touch motion. For example, the information processing apparatus 40 corrects the detectable range A2 of the gesture based on the sensor information from the non-contact sensor at the time the user touches the human body, the object, or the arbitrary point in space. More specifically, the information processing apparatus 40 detects the user's contact with the human body, the object, or the arbitrary point in space based on the sensor information from the contact sensor, and corrects the detectable range A2 of the gesture based on the sensor information from the non-contact sensor at the time that contact is detected. As a result, the detectable range A2 of the gesture is corrected with reference to the position of the physical contact, so that the gesture detection accuracy is improved.
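As a rough illustration, the calibration can be expressed as re-centering A2 on the hand position reported by the non-contact sensor at the moment the contact is detected. The function below is a sketch under that assumption.

    def calibrate_a2(a2_center, hand_position_at_touch):
        """Return a new center for the detectable range A2 (illustrative).
        hand_position_at_touch is the non-contact sensor reading captured when the
        contact sensor detected the touch on the body, object, or space point."""
        if hand_position_at_touch is None:
            return a2_center                  # nothing observed; keep the current range
        return hand_position_at_touch         # the touch posture becomes the new center of A2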
After detecting the user's contact with the human body or the object, the information processing apparatus 40 may give feedback to the user, for example by sound, when the user's hand is located within the corrected detectable range A2. Further, when gestures cannot be detected with the non-contact sensor because of the user's posture or context (for example, walking or running), the information processing apparatus 40 may give feedback prompting a physical operation when it recognizes the user's hand within the detectable range A2 of the non-contact sensor.
(Processing example)
FIG. 31 is a flowchart showing another example of the command execution process according to the third embodiment. The command execution process is executed by the control unit 46 of the information processing apparatus 40. The command execution process is executed, for example, when the information processing apparatus 40 is powered on.
First, the second acquisition unit 461B of the information processing apparatus 40 acquires the sensor value of the non-contact sensor (step S301). Then, the gesture detection unit 462 of the information processing apparatus 40 determines whether a gesture has been detected based on the sensor value of the non-contact sensor (step S302).
When no gesture is detected (step S302: No), the first acquisition unit 461A of the information processing apparatus 40 acquires the sensor value of the contact sensor (step S303). Then, the correction unit 466 of the information processing apparatus 40 executes an operation related to calibration of the detectable range A2 of the gesture (step S304). For example, the correction unit 466 prompts the user to tap the face or to physically touch the information processing apparatus 40. The correction unit 466 then corrects the detectable range A2 of the gesture based on the sensor information from the non-contact sensor at the time the user touches the human body or the object (step S305). When the correction is completed, the correction unit 466 returns the process to step S301.
On the other hand, when a gesture is detected (step S302: Yes), the guessing unit 465 of the information processing apparatus 40 estimates the device state (step S306). Then, the gesture detection unit 462 of the information processing apparatus 40 determines the gesture based on the estimation result of the device state and the sensor value of the non-contact sensor (step S307). Then, the command execution unit 463 executes the command assigned to the determined gesture (step S308). After executing the command, the command execution unit 463 returns the process to step S301.
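A rough Python rendering of the FIG. 31 flow, including the calibration branch, might look as follows; the unit interfaces are assumptions made for illustration.

    def command_execution_loop_example2(contact, noncontact, guesser, detector,
                                        corrector, executor, ui):
        """Sketch of the FIG. 31 command execution process (interfaces assumed)."""
        while True:
            nc_value = noncontact.read()                            # S301
            if not detector.detected(nc_value):                     # S302: gesture detected?
                c_value = contact.read()                            # S303
                ui.prompt_physical_touch()                          # S304: ask for a face tap or device touch
                corrector.calibrate_a2(nc_value, c_value)           # S305: correct detectable range A2
                continue
            device_state = guesser.estimate_device_state()          # S306
            gesture = detector.classify(device_state, nc_value)     # S307
            executor.run(gesture)                                   # S308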
<4-3-3. Example 3>
Next, Example 3 will be described.
(Contents of Example 3)
Some commands involve an operation amount. FIG. 32 is a diagram showing how a command accompanied by an operation amount is input. For example, if the command is a volume control command (Vol+/Vol-), the operation amount is the volume. In the case of the volume control command, the movement direction of the user's hand corresponds to the direction of the volume change (whether the volume is increased or decreased), and the movement amount or movement speed of the user's hand corresponds to the amount of change (input value) of the volume.
In Example 3, the contact sensor and the non-contact sensor are combined to allow not only a command but also an operation amount (change direction and change amount) to be input. For example, the information processing apparatus 40 detects the user's gesture using at least the sensor information from the contact sensor. Then, the information processing apparatus 40 determines the operation amount based on the sensor information from the non-contact sensor. As a result, the information processing apparatus 40 can determine the operation amount using the non-contact sensor while achieving highly accurate gesture detection using the contact sensor.
Note that the user's motion constituting one gesture may include a first motion that triggers the start of the gesture and a second motion that follows the first motion. If the command is a volume control command, the first motion is, for example, a face tap, and the second motion is, for example, a hand movement from the face tap in a predetermined direction (for example, the front-back direction). In this case, the information processing apparatus 40 may determine the volume change from the movement direction, movement amount, and/or movement speed of the second motion.
The size and speed of gestures differ from person to person. Therefore, for a command involving an operation amount, the input operation amount may become inaccurate. To address this, the information processing apparatus 40 performs a correction related to the second motion based on information on the speed and magnitude of the first motion (for example, the speed of the hand until the face is tapped, or the strength of the face tap).
For example, the information processing apparatus 40 determines the speed of the first motion based on the information from the non-contact sensor. Then, the information processing apparatus 40 corrects a set value (for example, a threshold value) for associating the second motion with the operation amount based on the information on the speed of the first motion. After that, the information processing apparatus 40 determines the speed or movement amount of the second motion based on the sensor information from the non-contact sensor. Then, the information processing apparatus 40 determines the operation amount based on the corrected set value and the information on the speed or movement amount of the second motion. As a result, the information processing apparatus 40 can input the user's operation amount with high accuracy.
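The following sketch shows one possible way to map the second motion to a number of volume steps while scaling the set value by the speed of the first motion. The function name, the normalization, and all numeric values are illustrative assumptions.

    def volume_steps(first_motion_speed, second_motion_displacement_mm, base_step_mm=20.0):
        """Convert the second motion into signed volume steps (illustrative).
        A faster first motion enlarges the per-step threshold, so fast users need
        larger hand movements per volume step."""
        scale = max(0.5, min(2.0, first_motion_speed))        # normalized first-motion speed
        step_mm = base_step_mm * scale                        # corrected set value (threshold)
        return int(second_motion_displacement_mm / step_mm)   # sign encodes Vol+ / Vol-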
A user whose gesture input speed is fast may bring the hand to a distant position, so a wide gesture detectable range is desirable. On the other hand, a user whose gesture input speed is slow is unlikely to bring the hand to a distant position, so a narrower detectable range is desirable in view of false detection. Therefore, the information processing apparatus 40 performs a correction related to the detection of the second motion based on the information on the user's first motion.
For example, the information processing apparatus 40 determines the speed of the first motion based on the sensor information from the non-contact sensor. Then, the information processing apparatus 40 corrects the detectable range of the second motion based on the information on the speed of the first motion. As a result, the detection accuracy of the second motion can be increased.
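A corresponding sketch for scaling the detectable range of the second motion by the speed of the first motion might look like the following; the clamping limits are illustrative.

    def second_motion_range(base_range_mm, first_motion_speed):
        """Widen the detectable range of the second motion for fast first motions
        and narrow it for slow ones (illustrative scaling rule)."""
        scale = max(0.7, min(1.5, first_motion_speed))
        return base_range_mm * scale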
<<5. Modification examples>>
Each of the above embodiments shows an example, and various modifications and applications are possible.
(Modification example related to gesture detection)
For example, the information processing apparatus 10 of the first embodiment may change the detectable range AR according to the distance to an obstacle. FIG. 33 is a diagram showing how the detectable range AR is changed according to the distance to an obstacle. For example, the information processing apparatus 10 includes a pair of input detection units 11R and 11L. As shown in FIG. 33, an obstacle O1 is located to the right of the input detection unit 11R (Y-axis minus direction), and an obstacle O2 is located to the left of the input detection unit 11L (Y-axis plus direction). The obstacle may be a structure such as a wall, or an object independent of a structure.
The input detection unit 11R and the input detection unit 11L each detect the distance to the obstacle. In the example of FIG. 33, the distance to the obstacle O1 is d1, and the distance to the obstacle O2 is d2. Based on this information, the information processing apparatus 10 adjusts the detectable range AR so that no obstacle enters the detectable range AR. In the example of FIG. 33, the information processing apparatus 10 corrects the detectable range ARR of the input detection unit 11R so that the maximum distance in the separation direction (Y-axis minus direction) becomes d3, which is shorter than d1. Further, the information processing apparatus 10 corrects the detectable range ARL of the input detection unit 11L so that the maximum distance in the separation direction (Y-axis plus direction) becomes d4, which is shorter than d2.
The information processing apparatus 10 may correct the detectable range ARR and the detectable range ARL so that d3 and d4 are the same distance, or so that d3 and d4 are different distances. Further, the information processing apparatus 10 may correct only one of the detectable range ARR and the detectable range ARL. If the positions of the obstacle and/or the user U are known in advance, the information processing apparatus 10 may correct the detectable range AR based on that position information.
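As an illustration, this adjustment can be expressed as clamping each side's maximum detection distance to the measured obstacle distance minus a margin. The margin value and function name below are assumptions.

    def clamp_range(current_max_mm, obstacle_distance_mm, margin_mm=30.0):
        """Limit the detectable range so that a detected obstacle stays outside it (illustrative)."""
        return min(current_max_mm, max(0.0, obstacle_distance_mm - margin_mm))

    # For FIG. 33, each side would be clamped independently:
    # d3 = clamp_range(current_AR_R, d1)
    # d4 = clamp_range(current_AR_L, d2)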
This reduces situations in which the information processing apparatus 10 cannot detect a gesture because of an obstacle, so the gesture detection accuracy is increased. This modification is also applicable to the information processing system 1 of the second embodiment.
(Modification example related to function execution)
The information processing apparatus 10 of the first embodiment may include a pair of predetermined parts located at both ears of the user U when worn. In this case, the predetermined part may be the part from which the sound of the earphone or headphone is output. When the swing-up motion of the user U is detected at both of the pair of predetermined parts, the information processing apparatus 10 may refrain from executing the function associated with the gesture even if the gesture is detected.
Similarly, the output device 20 of the second embodiment may include a pair of predetermined parts located at both ears of the user U when worn. In this case, the predetermined part may be the part from which the sound of the earphone or headphone is output. When the swing-up motion of the user U is detected at both of the pair of predetermined parts, the terminal device 30 of the second embodiment may refrain from executing the function associated with the gesture even if the gesture is detected.
When the swing-up motion is detected on both the left and right sides at the same time, it can be assumed that the user U is not performing a gesture but is removing the device such as the earphones or headphones. By not executing the function in such a case, the information processing apparatus 10 or the terminal device 30 can reduce operations that do not match the user's intention.
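A minimal sketch of this suppression rule, assuming per-ear swing-up flags evaluated over the same short time window:

    def should_execute(gesture_detected, swing_up_left, swing_up_right):
        """Suppress the gesture command when a swing-up is seen at both ears,
        which is taken to mean the user is removing the device (illustrative)."""
        if swing_up_left and swing_up_right:
            return False
        return gesture_detected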
(Modification example related to operation amount)
The functions of the information processing apparatus 10 of the first embodiment may include a predetermined function accompanied by an operation amount. The predetermined function accompanied by an operation amount may be a function related to volume operation (Vol+/Vol-) or a function related to playback speed. The predetermined function accompanied by an operation amount is not limited to these, and may be, for example, a function related to fast forward, rewind, or slow playback. A gesture involving a movement width, such as a swipe, may be associated with the predetermined function. When the gesture detected by the gesture detection unit 162 is a gesture involving a movement width, the information processing apparatus 10 may determine the operation amount of the predetermined function (for example, the amount by which the volume is raised or lowered) based on the size of the movement width of the gesture relative to the size of the detectable range.
Similarly, the functions of the terminal device 30 of the second embodiment may include a predetermined function accompanied by an operation amount, and a gesture involving a movement width, such as a swipe, may be associated with the predetermined function. When the gesture detected by the gesture detection unit 362 is a gesture involving a movement width, the terminal device 30 may determine the operation amount of the predetermined function (for example, the amount by which the volume is raised or lowered) based on the size of the movement width of the gesture relative to the size of the detectable range.
This makes it possible to input the operation amount with a single gesture, which improves the usability of the information processing apparatus 10 or the terminal device 30.
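The relative-width rule could be sketched as follows; the maximum step count and the clamping are illustrative assumptions.

    def operation_amount(swipe_width_mm, detectable_range_mm, max_steps=10):
        """Derive an operation amount (e.g., volume steps) from the swipe width
        measured relative to the detectable range (illustrative)."""
        if detectable_range_mm <= 0:
            return 0
        ratio = max(-1.0, min(1.0, swipe_width_mm / detectable_range_mm))
        return round(ratio * max_steps)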
(Modification example related to device form)
In the above embodiments, the information processing apparatus 10 and the output device 20 are devices that can be worn by the user (wearable devices), but they do not necessarily have to be wearable devices. For example, the information processing apparatus 10 and the output device 20 may be devices installed and used in a structure or a mobile body, such as a television, a car navigation system, a driver's cab, or various operation panels. Further, the information processing apparatus 10 and the output device 20 may be the mobile body itself.
Here, the mobile body may be a mobile terminal such as a smartphone, a mobile phone, a personal computer, a music player, or a portable television, or it may be a remote controller for operating a device. The mobile body may be one that moves on land (for example, a vehicle such as an automobile, bicycle, bus, truck, motorcycle, train, or linear motor car) or one that moves underground (for example, a subway moving through a tunnel). The mobile body may be one that moves on water (for example, a ship such as a passenger ship, cargo ship, or hovercraft) or one that moves underwater (for example, a submersible, submarine, or unmanned underwater vehicle). Further, the mobile body may be one that moves within the atmosphere (for example, an aircraft such as an airplane, airship, or drone) or one that moves outside the atmosphere (for example, an artificial space object such as a space station).
The structure is, for example, a building such as a high-rise building, a house, a steel tower, a station facility, an airport facility, a port facility, or a stadium. The concept of a structure includes not only buildings but also non-building structures such as tunnels, bridges, dams, walls, and steel poles, as well as equipment such as cranes, gates, and windmills. Further, the concept of a structure includes not only structures on land or underground, but also structures on water, such as piers and mega-floats, and underwater structures, such as ocean observation facilities.
(Modification example related to the sensors used)
In the third embodiment, the information processing apparatus 40 detects the user's gesture using both the non-contact sensor and the contact sensor, but it does not necessarily have to keep using both. For example, the information processing apparatus 40 may decide, according to the remaining battery level, whether to keep using both the non-contact sensor and the contact sensor or to use only one of them. Further, the information processing apparatus 40 may change the settings related to the sensors to be used when there is a particular environmental change, such as the user wearing a hat.
(Modification example related to correction)
In the third embodiment, the information processing apparatus 40 corrects the detectable range of the sensor based on sensor values, but the detectable range may be corrected using information other than sensor values. For example, the information processing apparatus 40 may correct the detectable range according to the device type of the information processing apparatus 40 (headphones, earphones, head-mounted display, and so on). For example, when the device type of the information processing apparatus 40 is headphones, there is more room in the area the user can physically touch, so the correction of the detectable range A1 is kept small. Further, another sensor (for example, a biometric sensor such as a heart rate sensor) may be used to estimate the user's state, and the detectable range may be corrected based on the estimation result.
(Setting of the detectable range by the user)
In the third embodiment, the information processing apparatus 40 corrects the detectable range of the sensor based on sensor values, but it may also be configured so that the user can set the detectable range. FIGS. 34A and 34B are diagrams showing an example of a GUI (Graphical User Interface) for setting the detectable range. The GUI shown in FIG. 34A is for setting the detectable range in the planar direction (the front-back and left-right directions with respect to the user's ear E), and the GUI shown in FIG. 34B is for setting the detectable range in the depth direction (the direction away from the user's ear E). The area A3 corresponds, for example, to the detectable range A1 of the contact sensor, and the area A4 corresponds, for example, to the detectable range A2 of the non-contact sensor. Alternatively, the area A3 may correspond, for example, to the detectable range A2 of the non-contact sensor, and the area A4 to the detection region OR of the non-contact sensor. The user sets the detectable range of the sensor using the GUI shown in FIGS. 34A and 34B. The information processing apparatus 40 may visualize the difference in the operation area between headphones and earphones. Further, the information processing apparatus 40 may present a tutorial on the GUI, for example when the device is switched.
(Other modification examples)
The control device that controls the information processing apparatus 10, the output device 20, or the terminal device 30 of the present embodiment may be realized by a dedicated computer system or by a general-purpose computer system.
For example, a communication program for executing the above operations is stored and distributed on a computer-readable recording medium such as an optical disc, a semiconductor memory, a magnetic tape, or a flexible disk. Then, for example, the control device is configured by installing the program on a computer and executing the above-described processing. At this time, the control device may be a device external to the information processing apparatus 10, the output device 20, or the terminal device 30 (for example, a personal computer). Alternatively, the control device may be a device inside the information processing apparatus 10, the output device 20, or the terminal device 30 (for example, the control unit 16, the control unit 26, or the control unit 36).
Further, the above communication program may be stored in a disk device provided in a server device on a network such as the Internet so that it can be downloaded to a computer. Further, the above-described functions may be realized by cooperation between an OS (Operating System) and application software. In this case, the part other than the OS may be stored on a medium and distributed, or the part other than the OS may be stored in a server device so that it can be downloaded to a computer.
Of the processes described in the above embodiments, all or part of the processes described as being performed automatically can also be performed manually, and all or part of the processes described as being performed manually can also be performed automatically by known methods. In addition, the processing procedures, specific names, and information including various data and parameters shown in the above description and drawings can be changed arbitrarily unless otherwise specified. For example, the various pieces of information shown in each figure are not limited to the illustrated information.
Further, each component of each illustrated device is a functional concept and does not necessarily have to be physically configured as illustrated. That is, the specific form of distribution and integration of each device is not limited to the illustrated form, and all or part of it can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
Further, the above-described embodiments can be combined as appropriate as long as the processing contents do not contradict each other. Further, the order of the steps shown in the flowcharts of the above-described embodiments can be changed as appropriate.
Further, for example, the present embodiment can be implemented as any configuration constituting a device or a system, for example, a processor as a system LSI (Large Scale Integration), a module using a plurality of processors, a unit using a plurality of modules, or a set in which other functions are further added to a unit (that is, a configuration of a part of a device).
In the present embodiment, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
Further, for example, the present embodiment can adopt a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
<<6. Conclusion>>
As described above, according to an embodiment of the present disclosure, the information processing apparatus 10 and the terminal device 30 correct the detectable range of a gesture based on information about the gesture and information about the device state. Therefore, the information processing apparatus 10 and the terminal device 30 can reduce situations in which a motion that the user does not intend as a gesture is detected as a gesture.
Further, according to an embodiment of the present disclosure, the information processing apparatus 10 and the terminal device 30 correct the reference direction for gesture detection based on information about the gesture and information about the device state. Therefore, the information processing apparatus 10 and the terminal device 30 can reduce erroneous detections such as detecting a left-right swipe as an up-down swipe.
Although the embodiments of the present disclosure have been described above, the technical scope of the present disclosure is not limited to the above-described embodiments as they are, and various modifications can be made without departing from the gist of the present disclosure. In addition, components of different embodiments and modification examples may be combined as appropriate.
Further, the effects of the embodiments described in the present specification are merely examples and are not limiting, and other effects may be obtained.
なお、本技術は以下のような構成も取ることができる。
(1)
物体を検出する検出部の検出領域で行われるユーザのジェスチャを検出するジェスチャ検出部と、
前記ジェスチャに関する情報、及び前記検出部を備える所定の機器の状態に関する情報、の少なくとも一方の情報に基づいて、前記検出領域内の前記ジェスチャの検出可能範囲、及び前記ジェスチャの検出のための基準方向、の少なくとも一方を補正する補正部と、
を備える情報処理装置。
(2)
前記補正部は、前記ジェスチャに関する前記ユーザの動作の情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(1)に記載の情報処理装置。
(3)
前記ジェスチャに関する前記ユーザの動作の情報には、前記ユーザが手を前記検出可能範囲に進入させる第1の動作の情報が含まれ、
前記補正部は、前記第1の動作の情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(2)に記載の情報処理装置。
(4)
前記補正部は、前記第1の動作の情報に基づいて、前記検出可能範囲が少なくとも前記検出領域よりも狭くなるよう補正する、
前記(3)に記載の情報処理装置。
(5)
1つの前記ジェスチャを構成する前記ユーザの動作には、前記第1の動作と、該第1の動作に続く第2動作と、が含まれ、
前記補正部は、前記第1の動作が前記検出可能範囲で検出された後、前記第2動作を検出するための前記検出可能範囲を、前記第1の動作を検出したときの前記検出可能範囲よりも広くする、
前記(4)に記載の情報処理装置。
(6)
前記補正部は、検出対象の前記ジェスチャが移動幅を伴うジェスチャである場合に、前記第2動作を検出するための前記検出可能範囲を、前記第1の動作を検出したときの前記検出可能範囲よりも広くする、
前記(5)に記載の情報処理装置。
(7)
前記第1の動作の情報には、前記検出可能範囲に進入する前記手の進入方向を示す情報が含まれ、
前記補正部は、前記手の進入方向を示す情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(3)~(6)のいずれか1つに記載の情報処理装置。
(8)
前記所定の機器は、前記ユーザが装着可能であり、
前記補正部は、前記手の進入方向を示す情報に基づき推測される前記所定の機器の前記ユーザへの装着状態に関する情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(7)に記載の情報処理装置。
(9)
前記補正部は、前記手の進入方向を示す情報に基づき推測される前記所定の機器の装着傾きに関する情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(8)に記載の情報処理装置。
(10)
前記所定の機器は、装着時に前記ユーザの耳に位置する所定の部位を備え、
前記所定の部位には、前記検出部が配置されており、
前記補正部は、前記ユーザが手を耳に向けて振り上げる振り上げ動作の情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(8)又は(9)に記載の情報処理装置。
(11)
前記ジェスチャが検出された場合に、該ジェスチャが関連付けられた機能を実行する実行部、を備え、
前記所定の機器は、装着時に前記ユーザの双方の耳に位置する一対の前記所定の部位を備えており、
前記実行部は、一対の前記所定の部位の双方で前記振り上げ動作が検出された場合には、前記ジェスチャが検出された場合であっても、該ジェスチャに関連付けられた前記機能を実行しない、
前記(10)に記載の情報処理装置。
(12)
前記検出可能範囲は、前記検出領域の縁の一部に前記検出可能範囲の縁の一部が接するよう前記検出領域内に配置されており、
前記ジェスチャ検出部は、前記ユーザの手が前記検出領域内の前記検出可能範囲以外の範囲から進入した場合には、前記ジェスチャを検出しない、
前記(3)~(11)のいずれか1つに記載の情報処理装置。
(13)
前記ジェスチャが検出された場合に、該ジェスチャが関連付けられた機能を実行する実行部、を備え、
前記機能には、少なくとも、操作量を伴う所定の機能が含まれており、
前記所定の機能には、移動幅を伴う所定のジェスチャが関連付けられており、
前記実行部は、前記ジェスチャ検出部で検出されたジェスチャが前記所定のジェスチャの場合には、前記検出可能範囲に対する前記所定のジェスチャの相対的な移動幅の大きさに基づいて前記操作量を判別する、
前記(1)~(12)のいずれか1つに記載の情報処理装置。
(14)
前記補正部は、前記所定の機器の状態に関する検出を行う状態検出部からの情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(1)~(13)のいずれか1つに記載の情報処理装置。
(15)
前記所定の機器は、前記ユーザが装着可能であり、
前記補正部は、前記状態検出部からの情報に基づき推定される前記所定の機器の前記ユーザへの装着状態に関する情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(14)に記載の情報処理装置。
(16)
前記所定の機器は、前記ユーザが装着可能であり、
前記補正部は、前記状態検出部からの情報に基づき推定される前記ユーザの姿勢に関する情報に基づいて、前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(14)又は(15)に記載の情報処理装置。
(17)
前記状態検出部は、加速度センサ、ジャイロセンサ、生体センサの少なくともいずれかの種類のセンサを備え、
前記補正部は、前記状態検出部が備える1又は複数のセンサからの情報に基づいて前記検出可能範囲及び前記基準方向の少なくとも一方を補正する、
前記(14)~(16)のいずれか1つに記載の情報処理装置。
(18)
前記所定の機器は、ヘッドホン又はイヤホンである、
前記(1)~(17)のいずれか1つに記載の情報処理装置。
(19)
前記情報処理装置は、前記所定の機器、又は前記所定の機器を該所定の機器の外部から有線又は無線を介して制御する装置である、
前記(1)~(18)のいずれか1つに記載の情報処理装置。
(20)
物体を検出する検出部の検出領域で行われるユーザのジェスチャを検出し、
前記ジェスチャに関する情報、及び前記検出部を備える所定の機器の状態に関する情報、の少なくとも一方の情報に基づいて、前記検出領域内の前記ジェスチャの検出可能範囲、及び前記ジェスチャの検出のための基準方向、の少なくとも一方を補正する、
情報処理方法。
(21)
ユーザの人体又は物体への接触を検出する接触センサからのセンサ情報を取得する第1の取得部と、
前記ユーザの非接触での動きを含む前記ユーザの動きを検出する非接触センサからのセンサ情報を取得する第2の取得部と、
前記接触センサ及び前記非接触センサの少なくとも一方からのセンサ情報に基づいて、機器操作のための前記ユーザのジェスチャを検出するジェスチャ検出部と、
前記接触センサからのセンサ情報及び前記非接触センサからのセンサ情報に基づいて、前記ジェスチャの検出に関する補正又は前記機器操作に関する補正を行う補正部と、
を備える情報処理装置。
(22)
前記ジェスチャ検出部は、少なくとも前記接触センサからのセンサ情報を使って前記ユーザの前記ジェスチャを検出し、
前記補正部は、前記ユーザが前記ジェスチャを行った位置を示すジェスチャ位置であって前記非接触センサからのセンサ情報に基づき検出される前記ジェスチャ位置の情報に基づいて、前記ジェスチャの検出可能範囲を補正する、
前記(21)に記載の情報処理装置。
(23)
検出した前記ジェスチャに対応するコマンドを実行する実行部、を備え、
前記実行部は、前記接触センサからのセンサ情報に基づき検出されたジェスチャと、前記非接触センサからのセンサ情報に基づき検出される前記ジェスチャ位置の情報と、に基づいて、実行するコマンドを判別する、
前記(22)に記載の情報処理装置。
(24)
前記ジェスチャ検出部は、少なくとも前記非接触センサからのセンサ情報を使って前記ユーザの前記ジェスチャを検出し、
前記補正部は、前記ユーザが前記人体又は前記物体へ接触したときの前記非接触センサからのセンサ情報に基づいて、前記ジェスチャの検出可能範囲を補正する、
前記(21)に記載の情報処理装置。
(25)
前記補正部は、前記接触センサからのセンサ情報に基づいて前記ユーザの前記人体又は前記物体への接触を検出し、前記ユーザの前記人体又は前記物体への接触を検出したときの前記非接触センサからのセンサ情報に基づいて、前記ジェスチャの検出可能範囲を補正する、
前記(24)に記載の情報処理装置。
(26)
前記ユーザの前記人体又は前記物体への接触を検出した後、前記補正した前記検出可能範囲に前記ユーザの手が位置した場合に、前記ユーザにフィードバックを行う出力制御部、を備える、
前記(25)に記載の情報処理装置。
(27)
検出した前記ジェスチャに対応するコマンドを実行するコマンド実行部、を備え、
前記コマンドには、少なくとも、操作量を伴うコマンドが含まれており、
前記ジェスチャ検出部は、少なくとも前記接触センサからのセンサ情報を使って前記ユーザの前記ジェスチャを検出し、
前記コマンド実行部は、前記非接触センサからのセンサ情報に基づいて前記操作量を判別する、
前記(21)に記載の情報処理装置。
(28)
1つのジェスチャを構成するユーザの動作には、ジェスチャの開始のトリガとなる第1動作と、第1動作に続く第2動作とが含まれ、
前記コマンド実行部は、前記非接触センサからのセンサ情報に基づいて判別される前記第2動作の速度又は移動量に基づいて前記操作量を判別し、
前記補正部は、前記非接触センサからのセンサ情報に基づき検出される前記第1動作の速さの情報に基づいて、前記第2動作と前記操作量とを対応付けるための設定値を変更する、
前記(27)に記載の情報処理装置。
(29)
前記補正部は、前記非接触センサからのセンサ情報に基づき検出される前記第1動作の速さの情報に基づいて、前記第2動作の検出可能範囲を補正する、
前記(28)に記載の情報処理装置。
(30)
ユーザの人体又は物への接触を検出する接触センサからのセンサ情報を取得し、
前記ユーザの非接触での動きを含む前記ユーザの動きを検出する非接触センサからのセンサ情報を取得し、
前記接触センサ及び前記非接触センサの少なくとも一方からのセンサ情報に基づいて、機器操作のための前記ユーザのジェスチャを検出し、
前記接触センサからのセンサ情報及び前記非接触センサからのセンサ情報に基づいて、前記ジェスチャの検出に関する補正又は前記機器操作に関する補正を行う、
情報処理方法。
(31)
コンピュータを、
物体を検出する検出部の検出領域で行われるユーザのジェスチャを検出するジェスチャ検出部、
前記ジェスチャに関する情報、及び前記検出部を備える所定の機器の状態に関する情報、の少なくとも一方の情報に基づいて、前記検出領域内の前記ジェスチャの検出可能範囲、及び前記ジェスチャの検出のための基準方向、の少なくとも一方を補正する補正部、
として機能させるための情報処理プログラム。
(32)
コンピュータを、
ユーザの人体又は物への接触を検出する接触センサからのセンサ情報を取得する第1の取得部、
前記ユーザの非接触での動きを含む前記ユーザの動きを検出する非接触センサからのセンサ情報を取得する第2の取得部、
前記接触センサ及び前記非接触センサの少なくとも一方からのセンサ情報に基づいて、機器操作のための前記ユーザのジェスチャを検出するジェスチャ検出部、
前記接触センサからのセンサ情報及び前記非接触センサからのセンサ情報に基づいて、前記ジェスチャの検出に関する補正又は前記機器操作に関する補正を行う補正部、
として機能させるための情報処理プログラム。
The present technology can also have the following configurations.
(1)
A gesture detection unit that detects a gesture of a user performed in a detection area of a detection unit that detects an object; and
a correction unit that corrects at least one of a detectable range of the gesture within the detection area and a reference direction for detecting the gesture, based on at least one of information regarding the gesture and information regarding a state of a predetermined device including the detection unit;
An information processing apparatus comprising the above.
(2)
The correction unit corrects at least one of the detectable range and the reference direction based on information on an action of the user relating to the gesture.
The information processing apparatus according to (1) above.
(3)
The information on the action of the user relating to the gesture includes information on a first action in which the user moves a hand into the detectable range, and
the correction unit corrects at least one of the detectable range and the reference direction based on the information on the first action.
The information processing apparatus according to (2) above.
(4)
The correction unit corrects the detectable range, based on the information on the first action, so that the detectable range becomes at least narrower than the detection area.
The information processing apparatus according to (3) above.
(5)
The actions of the user constituting one gesture include the first action and a second action following the first action, and
after the first action is detected within the detectable range, the correction unit makes the detectable range for detecting the second action wider than the detectable range used when the first action was detected.
The information processing apparatus according to (4) above.
(6)
The correction unit makes the detectable range for detecting the second action wider than the detectable range used when the first action was detected, in a case where the gesture to be detected is a gesture involving a movement width.
The information processing apparatus according to (5) above.
(7)
The information on the first action includes information indicating an approach direction of the hand entering the detectable range, and
the correction unit corrects at least one of the detectable range and the reference direction based on the information indicating the approach direction of the hand.
The information processing apparatus according to any one of (3) to (6) above.
(8)
The predetermined device is wearable by the user, and
the correction unit corrects at least one of the detectable range and the reference direction based on information regarding a wearing state of the predetermined device on the user, the wearing state being estimated from the information indicating the approach direction of the hand.
The information processing apparatus according to (7) above.
(9)
The correction unit corrects at least one of the detectable range and the reference direction based on information regarding a wearing inclination of the predetermined device, the wearing inclination being estimated from the information indicating the approach direction of the hand.
The information processing apparatus according to (8) above.
(10)
The predetermined device includes a predetermined part positioned at an ear of the user when worn,
the detection unit is arranged at the predetermined part, and
the correction unit corrects at least one of the detectable range and the reference direction based on information on a swing-up motion in which the user raises a hand toward the ear.
The information processing apparatus according to (8) or (9) above.
(11)
An execution unit that executes, when the gesture is detected, a function associated with the gesture is provided,
the predetermined device includes a pair of the predetermined parts positioned at both ears of the user when worn, and
the execution unit does not execute the function associated with the gesture when the swing-up motion is detected at both of the pair of predetermined parts, even if the gesture is detected.
The information processing apparatus according to (10) above.
(12)
The detectable range is arranged within the detection area such that a part of an edge of the detectable range is in contact with a part of an edge of the detection area, and
the gesture detection unit does not detect the gesture when the hand of the user enters from a range within the detection area other than the detectable range.
The information processing apparatus according to any one of (3) to (11) above.
(13)
An execution unit that executes, when the gesture is detected, a function associated with the gesture is provided,
the functions include at least a predetermined function involving an operation amount,
a predetermined gesture involving a movement width is associated with the predetermined function, and
when the gesture detected by the gesture detection unit is the predetermined gesture, the execution unit determines the operation amount based on the magnitude of the movement width of the predetermined gesture relative to the detectable range.
The information processing apparatus according to any one of (1) to (12) above.
(14)
The correction unit corrects at least one of the detectable range and the reference direction based on information from a state detection unit that performs detection regarding the state of the predetermined device.
The information processing apparatus according to any one of (1) to (13) above.
(15)
The predetermined device is wearable by the user, and
the correction unit corrects at least one of the detectable range and the reference direction based on information regarding a wearing state of the predetermined device on the user, the wearing state being estimated from the information from the state detection unit.
The information processing apparatus according to (14) above.
(16)
The predetermined device is wearable by the user, and
the correction unit corrects at least one of the detectable range and the reference direction based on information regarding a posture of the user, the posture being estimated from the information from the state detection unit.
The information processing apparatus according to (14) or (15) above.
(17)
The state detection unit includes at least one of an acceleration sensor, a gyro sensor, and a biological sensor, and
the correction unit corrects at least one of the detectable range and the reference direction based on information from one or more sensors included in the state detection unit.
The information processing apparatus according to any one of (14) to (16) above.
(18)
The predetermined device is a headphone or an earphone.
The information processing apparatus according to any one of (1) to (17) above.
(19)
The information processing apparatus is the predetermined device, or a device that controls the predetermined device from outside the predetermined device via a wired or wireless connection.
The information processing apparatus according to any one of (1) to (18) above.
(20)
Detecting a gesture of a user performed in a detection area of a detection unit that detects an object; and
correcting at least one of a detectable range of the gesture within the detection area and a reference direction for detecting the gesture, based on at least one of information regarding the gesture and information regarding a state of a predetermined device including the detection unit;
An information processing method comprising the above.
(21)
A first acquisition unit that acquires sensor information from a contact sensor that detects contact of a user with a human body or an object;
a second acquisition unit that acquires sensor information from a non-contact sensor that detects a movement of the user including a non-contact movement of the user;
a gesture detection unit that detects a gesture of the user for device operation based on sensor information from at least one of the contact sensor and the non-contact sensor; and
a correction unit that performs correction relating to the detection of the gesture or correction relating to the device operation, based on the sensor information from the contact sensor and the sensor information from the non-contact sensor;
An information processing apparatus comprising the above.
(22)
The gesture detection unit detects the gesture of the user using at least the sensor information from the contact sensor, and
the correction unit corrects the detectable range of the gesture based on information on a gesture position that indicates where the user performed the gesture and that is detected based on the sensor information from the non-contact sensor.
The information processing apparatus according to (21) above.
(23)
An execution unit that executes a command corresponding to the detected gesture is provided, and
the execution unit determines the command to be executed based on the gesture detected from the sensor information from the contact sensor and the information on the gesture position detected from the sensor information from the non-contact sensor.
The information processing apparatus according to (22) above.
(24)
The gesture detection unit detects the gesture of the user using at least the sensor information from the non-contact sensor, and
the correction unit corrects the detectable range of the gesture based on the sensor information from the non-contact sensor obtained when the user comes into contact with the human body or the object.
The information processing apparatus according to (21) above.
(25)
The correction unit detects the contact of the user with the human body or the object based on the sensor information from the contact sensor, and corrects the detectable range of the gesture based on the sensor information from the non-contact sensor obtained when the contact of the user with the human body or the object is detected.
The information processing apparatus according to (24) above.
(26)
An output control unit that provides feedback to the user when the hand of the user is positioned within the corrected detectable range after the contact of the user with the human body or the object has been detected is provided.
The information processing apparatus according to (25) above.
(27)
A command execution unit that executes a command corresponding to the detected gesture is provided,
the commands include at least a command involving an operation amount,
the gesture detection unit detects the gesture of the user using at least the sensor information from the contact sensor, and
the command execution unit determines the operation amount based on the sensor information from the non-contact sensor.
The information processing apparatus according to (21) above.
(28)
The actions of the user constituting one gesture include a first action that triggers the start of the gesture and a second action following the first action,
the command execution unit determines the operation amount based on a speed or a movement amount of the second action determined from the sensor information from the non-contact sensor, and
the correction unit changes a setting value for associating the second action with the operation amount, based on information on the speed of the first action detected from the sensor information from the non-contact sensor.
The information processing apparatus according to (27) above.
(29)
The correction unit corrects the detectable range of the second action based on the information on the speed of the first action detected from the sensor information from the non-contact sensor.
The information processing apparatus according to (28) above.
(30)
Acquiring sensor information from a contact sensor that detects contact of a user with a human body or an object;
acquiring sensor information from a non-contact sensor that detects a movement of the user including a non-contact movement of the user;
detecting a gesture of the user for device operation based on sensor information from at least one of the contact sensor and the non-contact sensor; and
performing correction relating to the detection of the gesture or correction relating to the device operation, based on the sensor information from the contact sensor and the sensor information from the non-contact sensor;
An information processing method comprising the above.
(31)
An information processing program for causing a computer to function as:
a gesture detection unit that detects a gesture of a user performed in a detection area of a detection unit that detects an object; and
a correction unit that corrects at least one of a detectable range of the gesture within the detection area and a reference direction for detecting the gesture, based on at least one of information regarding the gesture and information regarding a state of a predetermined device including the detection unit.
(32)
An information processing program for causing a computer to function as:
a first acquisition unit that acquires sensor information from a contact sensor that detects contact of a user with a human body or an object;
a second acquisition unit that acquires sensor information from a non-contact sensor that detects a movement of the user including a non-contact movement of the user;
a gesture detection unit that detects a gesture of the user for device operation based on sensor information from at least one of the contact sensor and the non-contact sensor; and
a correction unit that performs correction relating to the detection of the gesture or correction relating to the device operation, based on the sensor information from the contact sensor and the sensor information from the non-contact sensor.
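The configurations above are claim language; as a purely illustrative aid that is not part of the publication, the following minimal Python sketch shows one way a correction unit along the lines of configurations (7) to (9) might re-center the detectable range on the observed approach direction of the hand and estimate a wearing inclination from it. All identifiers, angles, and thresholds are assumptions made for this example.

```python
from dataclasses import dataclass

@dataclass
class DetectableRange:
    center_deg: float  # direction the range is centered on, relative to the sensor
    width_deg: float   # angular width of the range

FULL_DETECTION_AREA_DEG = 180.0  # assumed width of the sensor's full detection area

def correct_from_approach(approach_deg: float,
                          width_deg: float = 90.0) -> DetectableRange:
    """Hypothetical correction: re-center the detectable range (and hence the
    reference direction) on the observed approach direction of the hand, while
    keeping it narrower than the full detection area (cf. (4) and (7))."""
    width = min(width_deg, FULL_DETECTION_AREA_DEG - 1.0)
    return DetectableRange(center_deg=approach_deg % 360.0, width_deg=width)

def estimate_wearing_tilt(approach_deg: float, expected_deg: float = 0.0) -> float:
    """Hypothetical estimate of the wearing inclination of the device: the signed
    offset between the expected and the observed approach direction (cf. (8), (9))."""
    return ((approach_deg - expected_deg + 180.0) % 360.0) - 180.0

if __name__ == "__main__":
    # Hand swung up toward the ear from roughly 30 degrees below the expected direction.
    print(correct_from_approach(approach_deg=-30.0))
    print(estimate_wearing_tilt(approach_deg=-30.0))
```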
1 Information processing system
10, 40 Information processing device
10A Earphone
10B Headphones
11, 11L, 11R, 21, 41 Input detection unit
12, 22, 32, 42 State detection unit
13, 23, 33, 43 Output unit
14, 24, 34, 44 Communication unit
15, 35, 45 Storage unit
16, 26, 36, 46 Control unit
20, 20A, 20B Output device
30 Terminal device
31 Input unit
161, 361 Acquisition unit
162, 362, 462 Gesture detection unit
163, 363, 463 Command execution unit
164, 364, 464 Output control unit
165, 365, 465 Estimation unit
166, 366, 466 Correction unit
461A First acquisition unit
461B Second acquisition unit
AD Approach direction
OR Detection area
AR, A1, A2 Detectable range
A3, A4 Area
NR Non-detectable range
B1 to B4 Reference direction
D1 to D4 Device direction
U User
H Hand
O1, O2 Obstacle
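As another purely illustrative aid, the sketch below mimics the two-stage detection of configurations (5) and (6): the first action is accepted only inside a narrowed detectable range, after which the range is temporarily widened so that a second action with a movement width fits. The class name and the angular values are assumptions, not taken from the publication.

```python
class TwoStageDetector:
    """Hypothetical two-stage gesture detection (cf. configurations (5) and (6)):
    the first action is accepted only inside a narrow range, after which the
    range is widened so that a wide second action (e.g. a long swipe) fits."""

    NARROW_DEG = 60.0   # assumed range while waiting for the first action
    WIDE_DEG = 120.0    # assumed range while tracking the second action

    def __init__(self) -> None:
        self.detectable_deg = self.NARROW_DEG
        self.first_action_seen = False

    def observe(self, hand_angle_deg: float, center_deg: float = 0.0) -> str:
        offset = abs(hand_angle_deg - center_deg)
        if not self.first_action_seen:
            if offset <= self.detectable_deg / 2:
                self.first_action_seen = True
                self.detectable_deg = self.WIDE_DEG  # widen for the second action
                return "first action detected"
            return "ignored"
        return "second action sample" if offset <= self.detectable_deg / 2 else "out of range"

if __name__ == "__main__":
    det = TwoStageDetector()
    print(det.observe(10.0))   # first action detected (inside the narrow range)
    print(det.observe(55.0))   # second action sample (inside the widened range)
```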
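Configuration (13) and claim 12 determine an operation amount from the movement width of a gesture relative to the detectable range. The following short sketch illustrates one possible mapping under assumed numbers; the function name and the scaling to a discrete amount are hypothetical and do not come from the publication.

```python
def operation_amount(gesture_span_deg: float,
                     detectable_range_deg: float,
                     max_amount: int = 10) -> int:
    """Hypothetical mapping from the movement width of a swipe, measured relative
    to the detectable range, to a discrete operation amount (cf. (13) / claim 12)."""
    if detectable_range_deg <= 0:
        return 0
    ratio = max(0.0, min(1.0, gesture_span_deg / detectable_range_deg))
    return round(ratio * max_amount)

# A swipe covering 45 degrees of a 90-degree detectable range -> half of the maximum amount.
assert operation_amount(45.0, 90.0) == 5
```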
Claims (26)
- An information processing apparatus comprising:
a gesture detection unit that detects a gesture of a user performed in a detection area of a detection unit that detects an object; and
a correction unit that corrects at least one of a detectable range of the gesture within the detection area and a reference direction for detecting the gesture, based on at least one of information regarding the gesture and information regarding a state of a predetermined device including the detection unit.
- The information processing apparatus according to claim 1, wherein
the correction unit corrects at least one of the detectable range and the reference direction based on information on an action of the user relating to the gesture.
- The information processing apparatus according to claim 2, wherein
the information on the action of the user relating to the gesture includes information on a first action in which the user moves a hand into the detectable range, and
the correction unit corrects at least one of the detectable range and the reference direction based on the information on the first action.
- The information processing apparatus according to claim 3, wherein
the correction unit corrects the detectable range, based on the information on the first action, so that the detectable range becomes at least narrower than the detection area.
- The information processing apparatus according to claim 4, wherein
the actions of the user constituting one gesture include the first action and a second action following the first action, and
after the first action is detected within the detectable range, the correction unit makes the detectable range for detecting the second action wider than the detectable range used when the first action was detected.
- The information processing apparatus according to claim 5, wherein
the correction unit makes the detectable range for detecting the second action wider than the detectable range used when the first action was detected, in a case where the gesture to be detected is a gesture involving a movement width.
- The information processing apparatus according to claim 3, wherein
the information on the first action includes information indicating an approach direction of the hand entering the detectable range, and
the correction unit corrects at least one of the detectable range and the reference direction based on the information indicating the approach direction of the hand.
- The information processing apparatus according to claim 7, wherein
the predetermined device is wearable by the user, and
the correction unit corrects at least one of the detectable range and the reference direction based on information regarding a wearing state of the predetermined device on the user and information regarding a wearing inclination of the predetermined device, both estimated from the information indicating the approach direction of the hand.
- The information processing apparatus according to claim 8, wherein
the predetermined device includes a predetermined part positioned at an ear of the user when worn,
the detection unit is arranged at the predetermined part, and
the correction unit corrects at least one of the detectable range and the reference direction based on information on a swing-up motion in which the user raises a hand toward the ear.
- The information processing apparatus according to claim 9, further comprising an execution unit that executes, when the gesture is detected, a function associated with the gesture, wherein
the predetermined device includes a pair of the predetermined parts positioned at both ears of the user when worn, and
the execution unit does not execute the function associated with the gesture when the swing-up motion is detected at both of the pair of predetermined parts, even if the gesture is detected.
- The information processing apparatus according to claim 3, wherein
the detectable range is arranged within the detection area such that a part of an edge of the detectable range is in contact with a part of an edge of the detection area, and
the gesture detection unit does not detect the gesture when the hand of the user enters from a range within the detection area other than the detectable range.
- The information processing apparatus according to claim 1, further comprising an execution unit that executes, when the gesture is detected, a function associated with the gesture, wherein
the functions include at least a predetermined function involving an operation amount,
a predetermined gesture involving a movement width is associated with the predetermined function, and
when the gesture detected by the gesture detection unit is the predetermined gesture, the execution unit determines the operation amount based on the magnitude of the movement width of the predetermined gesture relative to the detectable range.
- The information processing apparatus according to claim 1, wherein
the correction unit corrects at least one of the detectable range and the reference direction based on information from a state detection unit that performs detection regarding the state of the predetermined device.
- The information processing apparatus according to claim 13, wherein
the predetermined device is wearable by the user, and
the correction unit corrects at least one of the detectable range and the reference direction based on information regarding a wearing state of the predetermined device on the user or information regarding a posture of the user, each estimated from the information from the state detection unit.
- The information processing apparatus according to claim 13, wherein
the state detection unit includes at least one of an acceleration sensor, a gyro sensor, and a biological sensor, and
the correction unit corrects at least one of the detectable range and the reference direction based on information from one or more sensors included in the state detection unit.
- An information processing method comprising:
detecting a gesture of a user performed in a detection area of a detection unit that detects an object; and
correcting at least one of a detectable range of the gesture within the detection area and a reference direction for detecting the gesture, based on at least one of information regarding the gesture and information regarding a state of a predetermined device including the detection unit.
- An information processing apparatus comprising:
a first acquisition unit that acquires sensor information from a contact sensor that detects contact of a user with a human body or an object;
a second acquisition unit that acquires sensor information from a non-contact sensor that detects a movement of the user including a non-contact movement of the user;
a gesture detection unit that detects a gesture of the user for device operation based on sensor information from at least one of the contact sensor and the non-contact sensor; and
a correction unit that performs correction relating to the detection of the gesture or correction relating to the device operation, based on the sensor information from the contact sensor and the sensor information from the non-contact sensor.
- The information processing apparatus according to claim 17, wherein
the gesture detection unit detects the gesture of the user using at least the sensor information from the contact sensor, and
the correction unit corrects the detectable range of the gesture based on information on a gesture position that indicates where the user performed the gesture and that is detected based on the sensor information from the non-contact sensor.
- The information processing apparatus according to claim 18, further comprising an execution unit that executes a command corresponding to the detected gesture, wherein
the execution unit determines the command to be executed based on the gesture detected from the sensor information from the contact sensor and the information on the gesture position detected from the sensor information from the non-contact sensor.
- The information processing apparatus according to claim 17, wherein
the gesture detection unit detects the gesture of the user using at least the sensor information from the non-contact sensor, and
the correction unit corrects the detectable range of the gesture based on the sensor information from the non-contact sensor obtained when the user comes into contact with the human body or the object.
- The information processing apparatus according to claim 20, wherein
the correction unit detects the contact of the user with the human body or the object based on the sensor information from the contact sensor, and corrects the detectable range of the gesture based on the sensor information from the non-contact sensor obtained when the contact of the user with the human body or the object is detected.
- The information processing apparatus according to claim 21, further comprising an output control unit that provides feedback to the user when the hand of the user is positioned within the corrected detectable range after the contact of the user with the human body or the object has been detected.
- The information processing apparatus according to claim 17, further comprising a command execution unit that executes a command corresponding to the detected gesture, wherein
the commands include at least a command involving an operation amount,
the gesture detection unit detects the gesture of the user using at least the sensor information from the contact sensor, and
the command execution unit determines the operation amount based on the sensor information from the non-contact sensor.
- The information processing apparatus according to claim 23, wherein
the actions of the user constituting one gesture include a first action that triggers the start of the gesture and a second action following the first action,
the command execution unit determines the operation amount based on a speed or a movement amount of the second action determined from the sensor information from the non-contact sensor, and
the correction unit changes a setting value for associating the second action with the operation amount, based on information on the speed of the first action detected from the sensor information from the non-contact sensor.
- The information processing apparatus according to claim 24, wherein
the correction unit corrects the detectable range of the second action based on the information on the speed of the first action detected from the sensor information from the non-contact sensor.
- An information processing method comprising:
acquiring sensor information from a contact sensor that detects contact of a user with a human body or an object;
acquiring sensor information from a non-contact sensor that detects a movement of the user including a non-contact movement of the user;
detecting a gesture of the user for device operation based on sensor information from at least one of the contact sensor and the non-contact sensor; and
performing correction relating to the detection of the gesture or correction relating to the device operation, based on the sensor information from the contact sensor and the sensor information from the non-contact sensor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022536400A JPWO2022014609A1 (en) | 2020-07-14 | 2021-07-13 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020120870 | 2020-07-14 | ||
JP2020-120870 | 2020-07-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022014609A1 true WO2022014609A1 (en) | 2022-01-20 |
Family
ID=79555547
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/026355 WO2022014609A1 (en) | 2020-07-14 | 2021-07-13 | Information processing device and information processing method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2022014609A1 (en) |
WO (1) | WO2022014609A1 (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016122295A (en) * | 2014-12-24 | 2016-07-07 | 富士通株式会社 | Input method, input program, and input device |
WO2018143313A1 (en) * | 2017-02-01 | 2018-08-09 | コニカミノルタ株式会社 | Wearable electronic device |
WO2020008559A1 (en) * | 2018-07-04 | 2020-01-09 | マクセル株式会社 | Head-mounted display and setting method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022014609A1 (en) | 2022-01-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11954816B2 (en) | Display control device, display control method, and recording medium | |
US11188187B2 (en) | Information processing apparatus, information processing method, and recording medium | |
US11373054B2 (en) | Object recognition method and mobile terminal | |
EP2713359B1 (en) | Method and apparatus for controlling screen brightness corresponding to variation of illumination | |
KR102582863B1 (en) | Electronic device and method for recognizing user gestures based on user intention | |
CN108153422B (en) | Display object control method and mobile terminal | |
WO2016089939A1 (en) | Master device for using connection attribute of electronic accessories connections to facilitate locating an accessory | |
US20170144042A1 (en) | Mobile terminal, training management program and training management method | |
KR102622564B1 (en) | Electronic device and method for controlling display operation in electronic device | |
US20130197916A1 (en) | Terminal device, speech recognition processing method of terminal device, and related program | |
US11170539B2 (en) | Information processing device and information processing method | |
US20200321018A1 (en) | Information processing device, information processing method, and program | |
KR20170111450A (en) | Hearing aid apparatus, portable apparatus and controlling method thereof | |
WO2021004281A1 (en) | State display method and terminal device | |
CN110933452A (en) | Method and device for displaying lovely face gift and storage medium | |
WO2021031844A1 (en) | Icon display method and terminal | |
US10642575B2 (en) | Information processing device and method of information processing for notification of user speech received at speech recognizable volume levels | |
CN111124206B (en) | Position adjusting method and electronic equipment | |
US20220141390A1 (en) | Photographing method, device, and system, and computer-readable storage medium | |
KR20150131816A (en) | Mobile terminal and method for controlling the same | |
US10575130B2 (en) | Mobile electronic device, control method, and recording medium | |
KR20160101572A (en) | Image display apparatus and power saving method thereof | |
US10133966B2 (en) | Information processing apparatus, information processing method, and information processing system | |
WO2022014609A1 (en) | Information processing device and information processing method | |
WO2022019085A1 (en) | Information processing device and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21843374; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2022536400; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21843374; Country of ref document: EP; Kind code of ref document: A1 |