WO2018079383A1 - Information processing apparatus, control method, program, and recording medium - Google Patents

Information processing apparatus, control method, program, and recording medium

Info

Publication number
WO2018079383A1
WO2018079383A1 (PCT/JP2017/037732)
Authority
WO
WIPO (PCT)
Prior art keywords
hand
information
finger
pose
detection result
Prior art date
Application number
PCT/JP2017/037732
Other languages
French (fr)
Japanese (ja)
Inventor
Yoichi Nishimaki (西牧 洋一)
Yasushi Okumura (奥村 泰史)
Original Assignee
Sony Interactive Entertainment Inc. (株式会社ソニー・インタラクティブエンタテインメント)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc.
Priority to JP2018547603A (granted as JP6687749B2)
Publication of WO2018079383A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the present invention relates to an information processing apparatus, a control method, a program, and a recording medium.
  • Controllers for controlling the movement of characters in a virtual space are widely used in information processing apparatuses such as home game machines. In recent years, it has been considered that the movement of the user's hand is reflected in the movement of the finger of the character in the virtual space.
  • A finger sign may carry a particular meaning depending on the region or the scene, and when the movement of a character's fingers is controlled in this way, the character may unintentionally present a sign that the user did not intend and that is inappropriate in an online game or the like.
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide an information processing apparatus, a control method, a program, and a recording medium that can prevent an inappropriate sign from being presented.
  • The present invention for solving the problem of the conventional example described above is an information processing apparatus connected to a device fixed to a user's hand, the device including a finger sensor that detects the bending/extension state of at least some of the fingers and an angle sensor that detects the angle of the hand to which the device is fixed.
  • The information processing apparatus comprises: holding means for holding, for at least one hand pose, information specifying the pose in association with information specifying a process to be performed; receiving means, connected to the holding means, for receiving from the device the detection result of the finger bending/extension state from the finger sensor and the detection result of the hand angle; determining means for determining whether the received detection results match any of the hand poses specified by the information held in the holding means and,
  • when they match, acquiring the process specifying information associated with the information specifying the matched pose; and execution means for executing the process specified by the acquired process specifying information.
  • the information processing apparatus 1 is a computer device such as a home game machine, and a controller device 20 is connected to the information processing apparatus 1.
  • the controller device 20 is attached and fixed to the left or right hand of the user.
  • When the controller devices need to be distinguished, the controller device 20 fixed to the user's left hand is referred to as the controller device 20L and the one fixed to the user's right hand as the controller device 20R, appending the reference letters L and R respectively.
  • As illustrated in FIG. 1, the controller device 20 includes a device main body 210 and a fixture 220 fixed to the device main body 210. As shown in FIG. 2, the controller device 20 also includes a control unit 21, a storage unit 22, an operation interface 23, a sensor unit 24, and a communication unit 25. In the present embodiment, this circuit unit is housed in the device main body 210.
  • In the following description, the longitudinal direction of the device main body 210 is taken as the Z axis (the direction crossing the palm when the device main body 210 is fixed to the hand, with the side reached by the user's thumb as the positive direction);
  • the depth direction connecting the front side (palm side) and the back side of the hand as seen when looking at the palm is taken as the Y axis;
  • and the left-right direction, i.e. the direction from the wrist side toward the fingertip side when the hand to which the device main body 210 is fixed is opened (fingers extended), is taken as the X axis (FIG. 1).
  • The device main body 210 may have the same shape whether it is attached to the user's left hand or to the right hand.
  • The fixture 220 is an annular, flexible belt. The user passes the fingers from the index finger to the little finger through the fixture 220, whereby the device main body 210 is fixed at a position where it contacts the base of those fingers (the position corresponding to the MP joints of the index finger through the little finger).
  • The size of the controller device 20 is set so that, when the user naturally grips the device main body 210, one end is positioned slightly beyond the position reached by the tip of the user's thumb, and the other end protrudes slightly from the base of the little finger (the position corresponding to its MP joint). Even if the user opens his or her hand while wearing the controller device 20, the controller device 20 does not fall, because it is fixed to the user's hand by the fixture 220.
  • the device main body 210 includes a gripping part 211 gripped by a user, an operation part 212, and a position presentation part 213.
  • the gripping portion 211 has a substantially polygonal column shape.
  • the operation unit 212 is formed continuously from the grip unit 211 and includes a button operation unit 231 including a plurality of buttons that can be operated with the thumb of the user in the example of FIG.
  • Finger sensors 241 of the sensor unit 24, described later, are provided on the side surface of the device main body 210 (the surface facing the fingers when the back surface of the device main body 210 is in contact with the user's palm) and on the button operation unit 231.
  • The position presentation unit 213 is arranged on the back side of the device main body 210 (on the thumb side when the device is fixed to the user's hand) and includes at least one light emitting element such as an LED. During operation of the controller device 20, the position presentation unit 213 emits light of a predetermined color unique to each controller device 20. The position presentation unit 213 need only allow the position of each controller device 20 to be detected from outside, for example by a color marker unique to each controller device 20, and therefore need not be a light emitting element.
  • The finger sensor 241 is an optical sensor including a light emitting unit that emits infrared light and a light receiving unit that detects infrared light;
  • it measures the proportion of the light emitted from the light emitting unit that is reflected by an object and enters the light receiving unit, and thereby detects the distance to an obstacle in the direction of infrared emission.
  • The finger sensor 241 provided in the button operation unit 231 detects the distance to the thumb that operates the buttons of the button operation unit 231.
  • The finger sensors 241 provided on the side surface of the device main body 210, corresponding to the fingers from the index finger to the little finger, each detect the distance to the corresponding finger (the obstacle in the direction in which that finger should be). Each finger sensor 241 then outputs the detected distance as detection result information.
  • For example, when the user moves the hand from a state in which the device main body 210 is fixed to the hand with the fingers from the index finger to the little finger extended, to a shape gripping the device main body 210, the finger sensor 241 detects, while a finger is extended, the distance to an object in the direction of that fingertip (for example, the floor surface if the user points the finger downward).
  • the finger sensor 241 functions as a sensor that detects the bending / extending state of the finger.
  • the sensor for detecting the bending / extending state of the finger is not limited to the optical finger sensor as described here as long as the state of bending / extending (bending or extending) of each finger can be detected.
  • control unit 21 is a program control device such as a CPU, and operates according to a program stored in the storage unit 22.
  • the control unit 21 receives input of information representing the content of the operation performed by the user on the operation unit 212 from the operation interface 23 and outputs the information to the information processing apparatus 1 via the communication unit 25.
  • the control unit 21 outputs information output from the sensor included in the sensor unit 24 to the information processing apparatus 1 via the communication unit 25.
  • the storage unit 22 is a memory device or the like, and holds a program executed by the control unit 21.
  • This program may be provided stored on a computer-readable non-transitory recording medium and copied to the storage unit 22, or may be provided via a communication means such as a network and stored in the storage unit 22.
  • the storage unit 22 also operates as a work memory for the control unit 21.
  • the operation interface 23 outputs information representing the content of the operation performed by the user at the operation unit 212 to the control unit 21.
  • the sensor unit 24 includes at least one sensor and outputs information output from the sensor to the control unit 21.
  • the sensor unit 24 may include an inclination sensor 242 and an acceleration sensor 243 that detect the inclination of the device main body 210 of the controller device 20 in addition to the finger sensor 241 described above.
  • The tilt sensor 242 detects information on the angle of the longitudinal Z axis of the device main body 210 with respect to the direction of gravity, and computes the elevation angle φ of the Z axis from the horizontal plane.
  • The tilt sensor 242 also detects information on the rotation angle θ around the Z axis, and outputs detection result information including the rotation angle θ around the Z axis and the elevation angle φ of the Z axis from the horizontal plane.
  • the acceleration sensor 243 detects the acceleration of the controller device 20 in the X, Y, and Z axis directions, and outputs detection result information representing the detection result.
  • The acceleration sensor 243 is also used to detect an impact when the hand (including the fingers) wearing the controller device 20 strikes another object (the user's other hand, another person's hand, a wall, or the like): when the time change of the acceleration shows a pulse-shaped increase, the sensor detects the impact and outputs detection result information indicating that an impact has been detected.
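The pulse-shaped acceleration change described in the bullet above can be illustrated with a simple threshold on the frame-to-frame change in acceleration magnitude. The following Python sketch is not part of the specification; the sample format and the 30 m/s² threshold are illustrative assumptions.

```python
import math

# Hypothetical impact detector: flags a pulse-shaped jump in acceleration.
# The threshold (in m/s^2 per sample) is an illustrative assumption.
IMPACT_THRESHOLD = 30.0

def detect_impact(samples):
    """samples: list of (ax, ay, az) tuples sampled at a fixed rate.
    Returns True if any frame-to-frame change in acceleration
    magnitude exceeds the threshold (a pulse-shaped increase)."""
    prev = None
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if prev is not None and mag - prev > IMPACT_THRESHOLD:
            return True
        prev = mag
    return False

# A sudden spike (hand hitting a wall) triggers detection;
# steady readings do not.
calm = [(0.0, 0.0, 9.8)] * 10
hit = calm + [(0.0, 0.0, 60.0)] + calm
```

In this sketch the detector looks only at the magnitude delta between consecutive samples, which is the simplest reading of "the acceleration increases in a pulse shape"; a real implementation would likely also debounce and subtract gravity.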
  • The communication unit 25 is a wired interface such as a USB interface or a wireless interface such as Bluetooth (registered trademark), and outputs various kinds of information to the information processing apparatus 1 in accordance with instructions input from the control unit 21.
  • the information processing apparatus 1 includes a control unit 11, a storage unit 12, an interface unit 13, an imaging unit 14, and an output unit 15.
  • the control unit 11 is a program control device such as a CPU, and operates according to a program stored in the storage unit 12.
  • The control unit 11 is connected to a pose database that associates, for at least one hand pose, pose specifying information for specifying the pose with process specifying information indicating a process to be performed.
  • The control unit 11 also receives, from the controller device 20, detection result information representing the detection result of the finger bending/extension state, the detection result of the hand angle, and the detection result of an impact.
  • When the received detection results match a pose registered in the pose database, the control unit 11 acquires the process specifying information registered in association with the pose specifying information of that pose, and executes the process specified by the process specifying information. The detailed operation of the control unit 11 will be described later.
  • the storage unit 12 is a memory device, a disk device, or the like, and holds a program executed by the control unit 11.
  • the program may be provided by being stored in a computer-readable non-transitory recording medium and stored in the storage unit 12. Further, it may be provided via a communication means such as a network and stored in the storage unit 12.
  • the storage unit 12 also operates as a work memory for the control unit 11.
  • The storage unit 12 holds the pose database, which associates, for at least one hand pose, pose specifying information (P) for specifying the pose with process specifying information (Q) indicating the process to be performed.
  • The pose specifying information (P) is a determination condition using at least one of: information on the bending/extension state of each finger from the thumb onward detected by the finger sensor 241 (whether each finger is bent or extended), information on the range of the hand angle (rotation angle around the Z axis), and information on the elevation angle of the Z axis with respect to the horizontal plane.
  • For example, the pose specifying information identifying a so-called thumbs-up pose can be described as the condition "the thumb is extended; the index finger, middle finger, ring finger, and little finger are bent; and the Z axis points vertically upward".
  • Similarly, the pose in which the index finger and middle finger form a V-shaped sign can be described as "the thumb, ring finger, and little finger are bent; the index finger and middle finger are extended; and the Z axis is within ±10 degrees of the horizontal direction".
  • The process specifying information (Q) specifies, for example, a process related to drawing the character's hand.
  • For example, the process specifying information (Q) may specify a process of discarding the received detection result information (in this case, each finger of the character remains in the immediately preceding hand pose).
  • The interface unit 13 is connected to the controller device 20 wirelessly or by wire; it receives from the controller device 20 information representing the content of the user's operation and the distance information detected by the finger sensor 241 (when finger sensors 241 are provided corresponding to the respective fingers, the plurality of distances detected by them), and outputs the information to the control unit 11.
  • The imaging unit 14 is a camera or the like installed so that the range in which the user is positioned falls within its imaging range; it repeatedly captures images including the user at predetermined timings and outputs the image data to the control unit 11.
  • the output unit 15 has an interface for outputting video and the like to a home television, such as an HDMI (registered trademark) interface.
  • the output unit 15 outputs information on a video to be displayed in accordance with an instruction input from the control unit 11.
  • the control unit 11 functionally includes a receiving unit 111, a determination unit 112, and a process execution unit 113 as illustrated in FIG.
  • the receiving unit 111 receives the detection result of the bending / extending state of the finger and the detection result of the hand angle at every predetermined timing (for example, every time a certain time elapses) from the controller device 20. In addition, when the controller device 20 detects an impact, the receiving unit 111 receives information indicating that the impact has been detected.
  • The determination unit 112 determines the hand pose based on the information received by the receiving unit 111. Specifically, the determination unit 112 determines whether each finger is bent or extended based on the detection result information of the finger bending/extension state received by the receiving unit 111. As described above, in the detection result information output from the controller device 20, when a finger is bent, the detection result information (distance to the finger) of the finger sensor 241 corresponding to that finger takes a value comparatively close to "0". Conversely, when the finger is extended, the detection result information of the corresponding finger sensor 241 takes a comparatively large value.
  • Accordingly, the determination unit 112 compares the detection result information of each finger with a predetermined threshold value; if the detection result information is larger than the threshold value, it determines that the corresponding finger is extended, and if it is equal to or less than the threshold value, that the corresponding finger is bent.
  • In this way, the determination unit 112 determines, for each finger of the user's hand to which the controller device 20 is fixed, whether the finger is extended (hereinafter "extension") or bent (hereinafter "bending").
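The threshold comparison performed by the determination unit 112 can be sketched as follows. This is an illustrative reconstruction, not code from the specification; the 5 cm threshold and the use of centimetres are assumptions.

```python
# Classify each finger as extended ("extension") or bent ("bending")
# by comparing the finger sensor's distance reading to a threshold.
# The 5 cm threshold is an illustrative assumption.
BEND_THRESHOLD_CM = 5.0

def classify_fingers(distances_cm):
    """distances_cm: distance readings for the thumb through the little
    finger. A reading above the threshold means the finger is extended
    (the sensor sees a far object); at or below means bent (the sensor
    sees the finger itself, close by)."""
    return ["extension" if d > BEND_THRESHOLD_CM else "bending"
            for d in distances_cm]

# Thumbs-up-like reading: thumb sensor sees a far object (extended),
# the other sensors see the nearby bent fingers.
states = classify_fingers([40.0, 1.2, 0.8, 0.9, 1.1])
```

The resulting list of per-finger states is exactly the shape of input the pose conditions in the pose database are matched against.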
  • The determination unit 112 then refers to the pose database held in the storage unit 12, using the bending/extension state of each finger determined from the detection result information received by the receiving unit 111 and the received hand angle,
  • and searches for pose specifying information matching those detection results (that is, pose specifying information specifying the pose determined by the detection results).
  • When matching pose specifying information (P) is found, the determination unit 112 reads out the process specifying information (Q) associated with it and outputs the process specifying information to the process execution unit 113.
  • For example, when the information received by the receiving unit 111 satisfies the condition of the pose specifying information corresponding to the so-called thumbs-up pose ("the thumb is extended; the index finger, middle finger, ring finger, and little finger are bent; and the Z axis points vertically upward"), the determination unit 112 reads out the process specifying information associated with the thumbs-up pose specifying information and outputs it to the process execution unit 113.
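The pose database lookup described above, matching pose specifying information (P) and returning the associated process specifying information (Q), might look like the following sketch. The entry layout, pose names, elevation ranges, and process identifiers are illustrative assumptions rather than the specification's actual data format.

```python
# Each entry pairs pose specifying information (P) - per-finger states
# plus an allowed Z-axis elevation range in degrees - with process
# specifying information (Q). All names and ranges are assumptions.
POSE_DATABASE = [
    {"name": "thumbs_down",
     "fingers": ["extension", "bending", "bending", "bending", "bending"],
     "elevation_deg": (-90.0, -80.0),   # Z axis pointing vertically down
     "process": "replace_with_neutral_hand"},
    {"name": "v_sign",
     "fingers": ["bending", "extension", "extension", "bending", "bending"],
     "elevation_deg": (-10.0, 10.0),    # Z axis near horizontal
     "process": "blur_hand_region"},
]

def find_process(finger_states, elevation_deg):
    """Return the process specifying information (Q) of the first pose
    whose condition (P) matches, or None when no pose matches."""
    for entry in POSE_DATABASE:
        lo, hi = entry["elevation_deg"]
        if entry["fingers"] == finger_states and lo <= elevation_deg <= hi:
            return entry["process"]
    return None
```

With this database, a thumbs-up reading (thumb extended, Z axis pointing up at +90 degrees) matches no entry and returns None, i.e. the "no corresponding pose" outcome.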
  • When the pose database held in the storage unit 12 contains no pose specifying information whose condition is satisfied by the information received by the receiving unit 111 (no matching pose specifying information is found by the search), the determination unit 112 outputs information indicating that there is no corresponding pose.
  • the process executing unit 113 executes a process specified by the process specifying information output from the determination unit 112.
  • The process executed here is, for example, a process of controlling the character so that the pose instructed by the user is not reflected on it, or a process of concealing the image portion in which the character's hand is displayed by a predetermined method such as applying a Gaussian blur;
  • various other processes, such as a process of generating a sound or a process of compositing a predetermined image, may also be used.
  • When information indicating that there is no corresponding pose is output, the process execution unit 113 executes a predetermined process: for example, the process execution unit 113 controls the pose of the character in the virtual space so that the character's fingers and hand take the same bending/extension state and hand angle as those received by the receiving unit 111.
  • In the example described so far, the determination unit 112 searches for pose specifying information matching the detection result of the finger bending/extension state and the detection result of the hand angle received by the receiving unit 111, and, as soon as matching pose specifying information (P) is found,
  • immediately reads out the process specifying information (Q) associated with it and outputs it to the process execution unit 113.
  • However, the present embodiment is not limited to this.
  • The determination unit 112 may instead require that the pose specifying information determined to match the detection results received by the receiving unit 111 remains the same for a predetermined time (during which the detection results are received a plurality of times);
  • that is, only when the detection result information received a plurality of times during the predetermined time matches the same hand pose among the hand poses specified by the pose specifying information stored in the pose database does the determination unit 112 acquire the process specifying information associated with the pose specifying information of the matched hand pose and output it to the process execution unit 113.
  • This prevents the process specified by the process specifying information from being executed when a pose is matched only transiently while the user is changing the character's pose, so that extraneous processing is avoided.
  • As a specific example, suppose the user makes a thumbs-up pose. The plurality of finger sensors 241 of the controller device 20 output detection result information representing the distances to the corresponding fingers from the thumb to the little finger.
  • The tilt sensor 242 of the sensor unit 24 outputs detection result information indicating, as the hand angle, that the Z-axis direction (the longitudinal direction of the controller device 20, on the side where the user's thumb is located) points vertically upward.
  • The control unit 11 of the information processing apparatus 1 maintains a variable p representing the previously retrieved pose specifying information, initially initialized to "" (an empty character string), and a counter variable c, initially initialized to "0", and executes the process illustrated in FIG. 6. The control unit 11 first accepts the detection result information output from the controller device 20 (S11).
  • From the accepted detection result information, the control unit 11 generates information representing the bending/extension state of each finger ("extended, bent, bent, bent, bent" for the thumb through the little finger in this example) and information on the hand angle (S12).
  • The control unit 11 searches the pose database for pose specifying information whose condition is satisfied by the information representing the bending/extension state and the information on the hand angle among the detection results received in step S11 (S13).
  • In this example, the pose database does not contain the condition "the thumb is extended; the index finger, middle finger, ring finger, and little finger are bent; and the Z axis points vertically upward". The control unit 11 therefore determines that no pose specifying information is found by the search (S13: No), and executes a predetermined process.
  • Specifically, the control unit 11 initializes the variable p to "" (an empty character string) and the counter variable c to "0", and, as an example, controls the pose of the character in the virtual space so that the character's fingers and hand take the same bending/extension state and hand angle as those received in step S11 (S14), then returns to step S11 and continues processing.
  • Next, suppose the user makes a thumbs-down pose. The plurality of finger sensors 241 of the controller device 20 output detection result information representing the distances to the corresponding fingers from the thumb to the little finger.
  • The tilt sensor 242 of the sensor unit 24 outputs detection result information indicating, as the hand angle, that the Z-axis direction (the longitudinal direction of the controller device 20, on the side where the user's thumb is located) points vertically downward.
  • In this case, the control unit 11 generates, in step S12, information indicating that the finger bending/extension state is "extended, bent, bent, bent, bent". Then, in step S13, since the pose database contains the condition "the thumb is extended; the index finger, middle finger, ring finger, and little finger are bent; and the Z axis points vertically downward", the control unit 11 determines that pose specifying information identifying the so-called thumbs-down pose has been retrieved (S13: Yes).
  • The control unit 11 then determines whether the previously retrieved pose specifying information (represented by the variable p) matches the pose specifying information retrieved this time in step S13 (S15). If they match (S15: Yes), the counter variable c is incremented by "1" (S16), and it is determined whether the counter variable c exceeds a predetermined threshold value (count threshold) (S17). If it is determined that the counter variable c exceeds the predetermined count threshold (S17: Yes), the pose specifying information determined to match the detection results received over the predetermined time (received a plurality of times within that time) has remained the same, so the control unit 11 acquires the process specifying information (Q) stored in the pose database in association with the retrieved pose specifying information (represented by the variable p) (S18).
  • The control unit 11 executes the process specified by the process specifying information acquired here (S19):
  • for example, it sets the shape of the character's hand to a predetermined shape (such as a shape with all fingers extended), then returns to step S11 and continues processing.
  • In step S15, if the previously retrieved pose specifying information does not match the pose specifying information retrieved this time in step S13 (S15: No), the variable p is set to
  • the pose specifying information retrieved this time, the counter variable c is set to "1" (S20), and the process returns to step S11 and continues.
  • In step S17, when the counter variable c does not exceed the predetermined count threshold (S17: No), the process proceeds to step S14 and continues.
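The flow of steps S11 to S20 above, which requires the same pose to be retrieved on more than a threshold number of consecutive detections before its process is executed, can be condensed into the following sketch. The callable used for step S13 and the yielded values are illustrative assumptions.

```python
def run_pose_loop(detections, match_pose, count_threshold):
    """detections: iterable of detection results (S11).
    match_pose: returns a pose id, or None when no pose matches (S13).
    Yields the pose id each time the same pose has been matched more
    than count_threshold consecutive times (S17: Yes -> S18/S19);
    otherwise yields None (the mirror-the-hand path, S14)."""
    p = ""   # previously retrieved pose specifying information
    c = 0    # consecutive-match counter
    for detection in detections:
        pose = match_pose(detection)          # S13
        if pose is None:                      # S13: No
            p, c = "", 0
            yield None                        # S14: mirror the hand
        elif pose == p:                       # S15: Yes
            c += 1                            # S16
            if c > count_threshold:           # S17: Yes
                yield pose                    # S18/S19: run process Q
            else:
                yield None                    # S17: No -> S14
        else:                                 # S15: No
            p, c = pose, 1                    # S20
            yield None
```

With count_threshold=2, a pose must be matched on more than two consecutive detections before its process fires, matching the debouncing behaviour described in the walkthrough.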
  • For example, suppose the controller device 20 detects the finger bending/extension state, the hand angle, and so on, and outputs detection result information a plurality of times per 0.5 seconds,
  • and the count threshold is set to the number of times detection result information is input in 0.5 seconds.
  • Then the character takes the thumbs-down pose for at most 0.5 seconds; after that, the character is changed to the predetermined pose, and a pose that is inappropriate in the scene (here, the thumbs-down pose) is prevented from continuing to be presented.
  • the controller device 20 connected to the information processing apparatus 1 can determine the tilt of the finger for at least some fingers (a part of the finger sensor 241 is not limited to the distance to the object).
  • the pose may be determined using not only the information on the bending / stretching state but also information on the tilt of the finger.
  • for example, the pose specifying information for a pose in which a V-shaped sign is formed by the index finger and the middle finger may be the condition "the thumb, ring finger, and little finger are bent, the index finger and middle finger are extended, and the index finger is tilted away from the middle finger". This allows the control unit 11 to distinguish it from a state in which the index finger and the middle finger extend in parallel (a state in which they do not form a V shape) and to execute the corresponding process.
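As a hedged sketch of the V-sign condition above, assuming the finger sensors report a per-finger lateral tilt angle; the field names and the 15-degree spread threshold are illustrative, not values from the text:

```python
def is_v_sign(bent, tilt_deg, spread_threshold=15.0):
    """bent: dict finger -> True if bent; tilt_deg: dict finger -> lateral
    tilt of the finger in degrees. Returns True for a V-shaped sign."""
    fingers_bent = all(bent[f] for f in ("thumb", "ring", "little"))
    fingers_extended = not bent["index"] and not bent["middle"]
    # The index finger must lean away from the middle finger to form the V,
    # distinguishing the sign from two parallel extended fingers.
    spread = abs(tilt_deg["index"] - tilt_deg["middle"])
    return fingers_bent and fingers_extended and spread >= spread_threshold
```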
  • the detection result information of the acceleration sensor 243 of the controller device 20 may also be used. For example, if the pose specifying information indicating a high touch (high five) pose is the condition "the thumb through the little finger are extended, the Z axis is within ±10 degrees of the horizontal direction, and the acceleration sensor 243 has detected an impact", it becomes possible to execute processing corresponding to a pose such as a high touch.
  • the information processing apparatus 1 may also obtain information on the movement of the user's hand using the detection result information from the acceleration sensor 243 and the tilt sensor 242. Since methods for acquiring information on the movement of the user's hand from the detection result information of the acceleration sensor 243 and the tilt sensor 242 of the controller device 20 are widely known, a detailed description is omitted here.
  • in this case, if the pose specifying information indicating the high touch pose is the condition "the thumb through the little finger are extended, the Z axis is within ±10 degrees of the horizontal direction, the user's hand has been raised, and the acceleration sensor 243 has detected an impact", the information processing apparatus 1 can execute the processing corresponding to a pose such as a high touch more reliably.
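A minimal sketch of this combined condition, with assumed field names for the detection result information:

```python
def is_high_touch(detection):
    """detection: dict with 'bent' (finger -> bool), 'elevation_deg' (Z-axis
    elevation from horizontal), and 'impact' (bool from the acceleration
    sensor). Field names are assumptions for illustration."""
    all_extended = not any(detection["bent"].values())
    z_near_horizontal = abs(detection["elevation_deg"]) <= 10.0
    return all_extended and z_near_horizontal and detection["impact"]
```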
  • further, the position of the hand may be detected and used to determine the pose. That is, the information processing apparatus 1 according to the present embodiment receives the image data output by the imaging unit 14 and detects, from the image data, the position presentation unit 213 of the controller device 20 fixed to the user's hand. Since such methods of detecting the position presentation unit are widely known, a detailed description is omitted here.
  • in this example, if the pose specifying information indicating the high touch pose is the condition "the thumb through the little finger are extended, the Z axis is within ±10 degrees of the horizontal direction, the user's hand is higher than the user's head, and the acceleration sensor 243 has detected an impact", the information processing apparatus 1 can execute the processing corresponding to a pose such as a high touch still more reliably.
  • the information processing apparatus 1 may also execute an application program and, in accordance with instructions from the application program, register in or delete from the pose database pose specifying information and the processing specifying information associated with it.
  • for example, pose specifying information indicating the high touch may be associated with processing specifying information that keeps the character from performing the high touch action (for example, sets the character to a predetermined posture such as a predetermined upright posture).
  • further, the pose database may hold, in association with each other, the pose specifying information, a condition for executing a process when the pose specified by that pose specifying information is detected (hereinafter referred to as an execution condition), and the processing specifying information.
  • the execution conditions are, for example, conditions relating to the rating information (age restriction information) of the application program being executed, the region in which the application program is being executed (or the language selected for message display, etc.), and so on; for instance, a condition such as "the rating is 'A' or 'B'" or "the language selected for message display is 'Greek'".
  • as a specific example, pose specifying information representing a pose that presents a sign, such as "the thumb, ring finger, and little finger are bent, the index finger and middle finger are extended, and the Z axis is within ±10 degrees of the horizontal direction", may be registered in association with the execution condition "the language selected for message display is 'Greek' and the game rating is other than 'Z'" and with processing specifying information that keeps the character from presenting the pose (for example, sets the character to a predetermined posture such as a predetermined upright posture).
  • in this case, the information processing apparatus 1 acquires information on the languages selected for message display by the users (including users of other information processing apparatuses 1, such as opponents in an online game), and when "Greek" is included in the acquired language information, it determines that the execution condition is satisfied.
  • as a result, even if the user forms a V-shaped sign with the hand to which the controller device 20 is fixed, the V-shaped sign is no longer presented by the user's character, and an opponent who has selected the Greek language is not inadvertently made uncomfortable.
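The execution-condition mechanism described above might be sketched as follows; the record layout, context fields, and process names are all assumptions for illustration:

```python
# Hypothetical pose-database record carrying an execution condition next to
# the pose and processing specifying information.
POSE_DB = [
    {
        "pose": "v_sign",
        # Execution condition: message languages include Greek and the
        # running game's rating is anything other than "Z".
        "condition": lambda ctx: "Greek" in ctx["languages"] and ctx["rating"] != "Z",
        "process": "neutral_posture",   # substitute process: do not present the sign
    },
]

def process_for(pose, ctx, default="render_pose"):
    """Return the substitute process when the matched pose's execution
    condition holds in the given context; otherwise render the pose as-is."""
    for rec in POSE_DB:
        if rec["pose"] == pose and rec["condition"](ctx):
            return rec["process"]
    return default
```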
  • the count threshold compared in process S17 may differ for each piece of pose specifying information. In this case, the count threshold may be registered in the pose database in association with the pose specifying information.
  • the control unit 11 then sets the variable p in process S20 to the pose specifying information found this time in process S13, sets the counter variable c to 1, reads the count threshold registered in association with that pose specifying information, and uses it in process S17.
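A per-pose count threshold could be held in the pose database record itself, falling back to a global default when none is registered (a sketch with assumed names):

```python
DEFAULT_THRESHOLD = 5  # assumed global default count threshold

def count_threshold_for(record):
    """Return the count threshold registered with a pose-database record,
    or the global default when the record carries none."""
    return record.get("count_threshold", DEFAULT_THRESHOLD)
```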
  • when the user wears another device in addition to the controller device 20 (for example, a display device such as a head-mounted display worn on the head), the position of the controller device 20 may be detected based on its position relative to that other device.
  • in this example, the information processing apparatus 1 detects the position of the head-mounted display (such as the angle of the head-mounted display with respect to the vertical direction) based on the image captured by the imaging unit 14, and at the same time detects the position of the controller device 20.
  • the information processing apparatus 1 then takes the detected position of the head-mounted display as a reference position, expressed as coordinates on three axes (the optical axis direction of the imaging unit 14, the vertical direction (the direction of gravity), and the direction orthogonal to it in the plane whose normal is the optical axis direction), and obtains the coordinates of the position of the controller device 20 relative to this reference position. In the following, the vertically upward direction is taken as the positive direction.
  • since the head-mounted display is mounted on the user's head, the information processing apparatus 1 can assume that the vertical component of the position coordinates of the head-mounted display represents the height of the user's head. Therefore, when the value obtained by subtracting the vertical component of the position coordinates of the head-mounted display from the vertical component of the position coordinates of the controller device 20 exceeds a predetermined threshold and is positive (the controller device 20 is higher than the head-mounted display), the information processing apparatus 1 can determine that the user's hand is higher than the user's head.
  • likewise, when the absolute value of the difference obtained by subtracting the vertical component of the position coordinates of the head-mounted display from the vertical component of the position coordinates of the controller device 20 exceeds a predetermined threshold and the difference is negative (the controller device 20 is lower than the head-mounted display), the information processing apparatus 1 can determine that the user has lowered his or her hand.
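The vertical comparison in the two bullets above can be sketched as follows; the threshold value and function names are assumptions:

```python
RAISE_THRESHOLD = 0.10  # metres; assumed value, not from the text

def hand_state(controller_y, hmd_y, threshold=RAISE_THRESHOLD):
    """Compare the vertical components of the controller and head-mounted
    display positions (vertically upward positive, as in the text)."""
    diff = controller_y - hmd_y
    if diff > threshold:
        return "raised"    # controller above the head: hand raised
    if diff < -threshold:
        return "lowered"   # controller well below the head: hand lowered
    return "neutral"
```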
  • alternatively, the head-mounted display may have a camera, and the position of the tip of the hand (or the controller device 20) relative to the head-mounted display may be detected.
  • in this case, when the image captured by the camera of the head-mounted display shows the controller device 20 located vertically above the head-mounted display at a distance exceeding a predetermined threshold, it can be determined that the user has raised the hand (the user's hand is higher than the user's head).
  • although a head-mounted display is taken as the example of the other device here, the present embodiment is not limited to this; the movement of the user's hand may be determined by detecting the position of the controller device 20 relative to various devices, such as headphones or a headband worn on the user's head, or a wristwatch-type device worn on the user's arm.
  • DESCRIPTION OF REFERENCE SYMBOLS: 1 information processing apparatus, 11 control unit, 12 storage unit, 13 interface unit, 14 imaging unit, 15 output unit, 20 controller device, 21 control unit, 22 storage unit, 23 operation interface, 24 sensor unit, 25 communication unit, 111 accepting unit, 112 determination unit, 113 processing execution unit, 210 device main body, 211 gripping portion, 212 operation unit, 213 position presentation unit, 220 fixture, 231 button operation unit, 241 finger sensor, 242 tilt sensor, 243 acceleration sensor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

This information processing apparatus: receives the result of detecting the hand angle and the result of detecting the bending/stretching state of the fingers by the finger sensors of a device; determines, on the basis of the received detection results, whether they match any of the hand poses identified by information stored in advance; acquires, when they are determined to match one of the hand poses, the process identification information associated with the information identifying the matching hand pose; and executes the process identified by the acquired process identification information.

Description

Information processing apparatus, control method, program, and recording medium
 The present invention relates to an information processing apparatus, a control method, a program, and a recording medium.
 Controllers for controlling the movement of characters in a virtual space are widely used with information processing apparatuses such as home game machines. In recent years, controllers that reflect the movement of the user's hand in the movement of a character's fingers in the virtual space have also been considered.
 However, a sign made with the fingers may carry a particular meaning depending on the region or the scene, and when the movement of a character's fingers is controlled in this way, the character may be made to present a sign that is inappropriate in a setting such as an online game without the user intending it.
 The present invention has been made in view of the above circumstances, and one of its objects is to provide an information processing apparatus, a control method, a program, and a recording medium that can prevent an inappropriate sign from being presented.
 The present invention, which solves the problems of the conventional example described above, is an information processing apparatus connected to a device fixed to a user's hand, wherein the device includes a finger sensor that detects the bending/stretching state of at least some of the fingers and an angle sensor that detects the angle of the hand to which the device is fixed. The information processing apparatus is connected to holding means that holds, for at least one hand pose, information specifying the hand pose in association with processing specifying information representing a process to be performed, and includes: accepting means for receiving, from the device, the detection result of the bending/stretching state of the fingers by the finger sensor and the detection result of the hand angle; determination means for determining, based on the received detection results, whether they match any of the hand poses specified by the information held in the holding means and, when they are determined to match one of them, acquiring the processing specifying information associated with the information specifying the matching hand pose; and execution means for executing the process specified by the processing specifying information acquired by the determination means.
 According to the present invention, the presentation of an inappropriate sign can be prevented.
FIG. 1 is a schematic perspective view showing an example of a controller device connected to an information processing apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram showing an example of the configuration of the controller device connected to the information processing apparatus according to the embodiment of the present invention.
FIG. 3 is a block diagram showing an example of the configuration of the information processing apparatus according to the embodiment of the present invention.
FIG. 4 is a functional block diagram showing an example of the information processing apparatus according to the embodiment of the present invention.
FIG. 5 is an explanatory diagram showing an example of the contents of a pose database used by the information processing apparatus according to the embodiment of the present invention.
FIG. 6 is a flowchart showing an example of processing of the information processing apparatus according to the embodiment of the present invention.
 Embodiments of the present invention will be described with reference to the drawings. In the following description, the sizes, ratios, and arrangements of the respective parts are examples, and the present embodiment is not limited to the illustrated sizes, ratios, and arrangements.
 An information processing apparatus 1 according to an embodiment of the present invention is a computer device such as a home game machine, and a controller device 20 is connected to this information processing apparatus 1.
 The controller device 20 is attached and fixed to the user's left or right hand. In the following description, when it is necessary to distinguish between them, the controller device 20 fixed to the user's left hand is denoted as controller device 20L and the controller device 20 fixed to the user's right hand as controller device 20R, distinguishing them with the suffixes L and R.
 As outlined in FIG. 1, an example of this controller device 20 includes a device main body 210 and a fixture 220 fixed to the device main body 210. As shown in FIG. 2, the controller device 20 also includes a control unit 21, a storage unit 22, an operation interface 23, a sensor unit 24, and a communication unit 25. In the present embodiment, this circuit unit is housed in the device main body 210.
 In the following description, the longitudinal direction of the device main body 210 is taken as the Z axis (when the device main body 210 is fixed to the hand, the direction crossing the palm, with the side reached by the user's thumb as the positive direction); the direction (depth direction) connecting the front side (the near side when the device main body 210 is fixed to the hand and the palm is viewed) and the back side is taken as the Y axis; and the left-right direction (the direction from the wrist side toward the fingertip side when the hand is opened with the fingers extended) is taken as the X axis (FIG. 1).
 The device main body 210 may have the same shape whether it is attached to the user's left hand or right hand. In the example of the present embodiment, the fixture 220 is a flexible belt formed into a ring; the user's fingers from the index finger to the little finger are passed through this fixture 220, and the device main body 210 is fixed at a position where it abuts the base of the user's thumb (the position corresponding to the MP joints of the index finger through the little finger).
 The size of the controller device 20 is such that, when the user naturally grips the device main body 210, one end is located slightly beyond the position reached by the tip of the user's thumb, while the other end slightly protrudes from the base of the little finger (the position corresponding to its MP joint). Even if the user opens the hand while wearing the controller device 20, it remains fixed to the user's hand by the fixture 220 and therefore does not fall.
 The device main body 210 includes a gripping portion 211 gripped by the user, an operation unit 212, and a position presentation unit 213. In an example of the present embodiment, the gripping portion 211 has a substantially polygonal column shape. The operation unit 212 is formed continuously with the gripping portion 211 and, in the example of FIG. 1, includes a button operation unit 231 having a plurality of buttons operable with the user's thumb. Finger sensors 241 of the sensor unit 24, described later, are provided on the side surface of the device main body 210 (the surface facing the fingers when the back of the device main body 210 is in contact with the user's palm) and on the button operation unit 231.
 The position presentation unit 213 is arranged on the back of the upper side of the device main body 210 (the thumb side when fixed to the user's hand) and includes at least one light-emitting element such as an LED. While the controller device 20 is operating, the position presentation unit 213 emits light of a pre-designated color unique to each controller device 20. The position presentation unit 213 need not necessarily be a light-emitting element; it may be anything, such as a marker of a color unique to each controller device 20, that allows the position of each controller device 20 to be detected from outside.
 In an example of the present embodiment, the finger sensor 241 is an optical sensor having a light-emitting part that radiates infrared light and a light-receiving part that detects infrared light; it detects the distance to an obstacle in the radiation direction by measuring the proportion of light radiated from the light-emitting part that is reflected by the object and enters the light-receiving part. Specifically, the finger sensor 241 provided on the button operation unit 231 detects the distance to the thumb operating the buttons of the button operation unit 231, and the finger sensors 241 provided on the side surface of the device main body 210, one corresponding to each finger from the index finger to the little finger, each detect the distance to the corresponding finger (an obstacle in the direction where that finger should be). Each finger sensor 241 outputs the detected distance information as detection result information.
 In this example, when the user fixes the device main body 210 to the hand and moves the hand from a state in which the index finger through the little finger are extended to a state in which the device main body 210 is gripped, a finger sensor 241 whose finger is extended detects the distance to an object lying in the direction of the fingertip (for example, the floor surface if the user extends the fingers downward).
 When the user's finger is bent, the distance to the surface of the second or third joint of the bent finger is detected, and when the user grips the device main body 210, the surface of the finger touches the finger sensor 241, so the detected distance becomes "0". In the present embodiment, the finger sensor 241 thus functions as a sensor that detects the bending/stretching state of the fingers. However, the sensor for detecting the bending/stretching state of the fingers is not limited to the optical finger sensor described here, as long as it can detect whether each finger is bent or extended.
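As a hedged illustration, the distance reported by such an optical finger sensor could be mapped to a bent/extended state with a simple threshold; the 30 mm value is an assumption, not from the text:

```python
BEND_DISTANCE_MM = 30.0  # assumed cutoff between "bent" and "extended"

def finger_is_bent(distance_mm, threshold=BEND_DISTANCE_MM):
    """An extended finger leaves a large distance to whatever lies beyond
    the fingertip; a curled finger brings its surface close to the sensor
    (0 mm when the finger touches it in a full grip)."""
    return distance_mm < threshold
```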
 In the present embodiment, the control unit 21 is a program-controlled device such as a CPU, and operates according to a program stored in the storage unit 22. The control unit 21 receives, from the operation interface 23, information representing the content of operations performed by the user on the operation unit 212, and outputs that information to the information processing apparatus 1 via the communication unit 25. The control unit 21 also outputs information output by the sensors included in the sensor unit 24 to the information processing apparatus 1 via the communication unit 25.
 The storage unit 22 is a memory device or the like and holds the program executed by the control unit 21. This program may be provided stored on a computer-readable, non-transitory recording medium and copied into the storage unit 22, or may be provided via communication means such as a network and stored in the storage unit 22. The storage unit 22 also operates as a work memory for the control unit 21.
 The operation interface 23 outputs information representing the content of operations performed by the user on the operation unit 212 to the control unit 21. The sensor unit 24 includes at least one sensor and outputs the information output by that sensor to the control unit 21. In an example of the present embodiment, in addition to the finger sensors 241 already described, the sensor unit 24 may include a tilt sensor 242 that detects the tilt of the device main body 210 of the controller device 20 and an acceleration sensor 243.
 The tilt sensor 242 detects information on the angle of the longitudinal Z axis of the device main body 210 with respect to the direction of gravity, and computes information on the elevation angle φ of the Z axis from the horizontal plane. The tilt sensor 242 also detects information on the rotation angle θ around the Z axis, and outputs detection result information including the rotation angle θ around the Z axis and the elevation angle φ of the Z axis from the horizontal plane. The acceleration sensor 243 detects the acceleration of the controller device 20 in the X, Y, and Z axis directions and outputs detection result information representing the detection results. When the hand (including the fingers) wearing the controller device 20 strikes another object (the other hand, another person's hand, a wall, and so on) and an impact occurs (the acceleration, examined over time, rises in a pulse), the acceleration sensor 243 detects the impact and outputs detection result information indicating that an impact has been detected.
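The pulse-shaped impact detection attributed to the acceleration sensor 243 might be sketched as follows; the threshold value and the sample format are assumptions for illustration:

```python
import math

IMPACT_G = 3.0  # assumed threshold, in units of g

def detect_impact(samples, threshold=IMPACT_G):
    """samples: iterable of (ax, ay, az) accelerations in g. Reports an
    impact when any sample's magnitude spikes above the threshold,
    approximating a pulse-shaped rise in acceleration."""
    return any(math.sqrt(ax * ax + ay * ay + az * az) > threshold
               for ax, ay, az in samples)
```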
 The communication unit 25 is a wired interface such as a USB interface or a wireless interface such as Bluetooth (registered trademark), and outputs various information to the information processing apparatus 1 in accordance with instructions input from the control unit 21.
 As illustrated in FIG. 3, the information processing apparatus 1 includes a control unit 11, a storage unit 12, an interface unit 13, an imaging unit 14, and an output unit 15. The control unit 11 is a program-controlled device such as a CPU, and operates according to a program stored in the storage unit 12.
 In the present embodiment, the control unit 11 is connected so as to be able to access a pose database that associates, for at least one hand pose, pose specifying information that specifies the hand pose with processing specifying information representing a process to be performed. The control unit 11 also receives, from the controller device 20, detection result information representing the detection result of the bending/stretching state of the fingers, the detection result of the hand angle, and the detection result of an impact. When the hand pose represented by the received detection result information matches one of the hand poses specified by the pose specifying information registered in advance in the pose database, the control unit 11 acquires the processing specifying information registered in association with the matching pose specifying information and executes the process specified by it. The detailed operation of the control unit 11 will be described later.
 The storage unit 12 is a memory device, a disk device, or the like, and holds the program executed by the control unit 11. This program may be provided stored on a computer-readable, non-transitory recording medium and stored in the storage unit 12, or may be provided via communication means such as a network and stored in the storage unit 12. The storage unit 12 also operates as a work memory for the control unit 11.
 In the present embodiment, as illustrated in FIG. 4, the storage unit 12 holds a pose database that associates, for at least one hand pose, pose specifying information (P) that specifies the hand pose with processing specifying information (Q) representing a process to be performed. Here, the pose specifying information (P) is a judgment condition using at least one of information on the bending/stretching state (bent or extended) of each finger from the thumb to the little finger as detected by the finger sensors 241, and information on the hand angle (the range of rotation angles around the Z axis and the elevation angle of the Z axis with respect to the horizontal plane).
 For example, the pose specifying information for a so-called thumbs-up pose can be described as the condition "the thumb is extended, the index finger, middle finger, ring finger, and little finger are bent, and the Z axis points vertically upward". The pose in which the index finger and the middle finger form a V-shaped sign can be described as the condition "the thumb, ring finger, and little finger are bent, the index finger and middle finger are extended, and the Z axis is within ±10 degrees of the horizontal direction".
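The two example conditions can be encoded directly as predicates over a detection result; the field names and the 80-degree "near vertical" cutoff are illustrative assumptions:

```python
def thumbs_up(det):
    """Thumbs-up: thumb extended, other fingers bent, Z axis near vertical."""
    bent = det["bent"]
    return (not bent["thumb"]
            and all(bent[f] for f in ("index", "middle", "ring", "little"))
            and det["elevation_deg"] > 80.0)  # Z axis points roughly upward

def v_sign(det):
    """V-sign: thumb/ring/little bent, index and middle extended,
    Z axis within ±10 degrees of horizontal."""
    bent = det["bent"]
    return (all(bent[f] for f in ("thumb", "ring", "little"))
            and not bent["index"] and not bent["middle"]
            and abs(det["elevation_deg"]) <= 10.0)
```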
The process specifying information (Q) specifies, for example, a process related to drawing the shape of the character's hand. As an example, the process specifying information (Q) may be information specifying a process of discarding the received detection result information (in which case each finger of the character remains in the immediately preceding hand pose).
The interface unit 13 is connected to the controller device 20 wirelessly or by wire, receives from the controller device 20 information representing the contents of the user's operations and information on the distances detected by the finger sensors 241 (when there are a plurality of finger sensors 241, one per finger, information on the plurality of distances detected by the finger sensors 241 corresponding to the respective fingers), and outputs them to the control unit 11.
The imaging unit 14 is a camera or the like installed so that the range where the user is located falls within its imaging range; it repeatedly captures images including the user at predetermined timings and outputs the data of those images to the control unit 11. The output unit 15 has an interface, such as an HDMI (registered trademark) interface, for outputting video and the like to a home television set. The output unit 15 outputs information on the video to be displayed in accordance with instructions input from the control unit 11.
Next, the operation of the control unit 11 of the present embodiment will be described. As illustrated in FIG. 5, the control unit 11 of the present embodiment functionally includes a receiving unit 111, a determination unit 112, and a process execution unit 113.
The receiving unit 111 receives, from the controller device 20 at predetermined timings (for example, every time a fixed period elapses), the detection results of the bent/extended state of the fingers and the detection result of the hand angle. The receiving unit 111 also receives, when the controller device 20 detects an impact, information indicating that an impact has been detected.
The determination unit 112 determines the hand pose based on the information received by the receiving unit 111. Specifically, the determination unit 112 determines, based on the detection result information on the bent/extended state of the fingers received by the receiving unit 111, whether each finger is bent or extended. As already described, in the detection result information output by the controller device 20, when a finger is bent, the detection result information (the distance to the finger) of the finger sensor 241 corresponding to that bent finger takes a value relatively close to "0". Conversely, when a finger is extended, the detection result information (the distance to the finger) of the corresponding finger sensor 241 takes a relatively large value. The determination unit 112 therefore compares the detection result information of each finger with a predetermined threshold value: when the detection result information exceeds the threshold value, it determines that the corresponding finger is extended, and when the detection result information falls below the threshold value, it determines that the corresponding finger is bent.
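The per-finger threshold test described above can be sketched as follows. This is a minimal illustration only: the distance values, finger order, and the threshold are assumptions for the example, not values taken from the embodiment.

```python
# Hypothetical sketch of the threshold comparison performed by the
# determination unit 112: a distance above the threshold means the
# finger is extended, below it means the finger is bent.
BEND_THRESHOLD = 20.0  # assumed distance threshold (arbitrary units)

def classify_fingers(distances):
    """Map each finger-sensor distance to 'extended' or 'bent'."""
    return ["extended" if d > BEND_THRESHOLD else "bent" for d in distances]

# Order: thumb, index, middle, ring, little finger.
print(classify_fingers([55.0, 3.2, 1.0, 0.5, 2.8]))
# → ['extended', 'bent', 'bent', 'bent', 'bent']
```

The example output corresponds to the thumbs-up finger state used in the operation example below.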
In this way, the determination unit 112 determines whether each finger of the hand of the user to whom the controller device 20 is fixed is extended (hereinafter "extended") or bent (hereinafter "bent").
The determination unit 112 also refers to the pose database held in the storage unit 12 and searches for pose specifying information that matches the bent/extended state of each finger, determined from the detection result information received by the receiving unit 111, and the detection result information on the hand angle (that is, pose specifying information identifying the pose determined from each piece of detection result information). When matching pose specifying information is found by the search, the determination unit 112 reads the process specifying information (Q) associated with that pose specifying information (P) and outputs it to the process execution unit 113.
As an example, when it is determined from the detection result information on the bent/extended state of the fingers received by the receiving unit 111 that the thumb, index finger, middle finger, ring finger, and little finger are, respectively, "extended, bent, bent, bent, bent," and the detection result information representing the hand angle indicates that the Z axis points vertically upward, the determination unit 112 determines that the condition of the pose specifying information corresponding to the so-called thumbs-up pose is satisfied (that is, the condition "the thumb is extended; the index finger, middle finger, ring finger, and little finger are bent; and the Z axis points vertically upward" is met). It then reads the process specifying information associated with the thumbs-up pose specifying information and outputs it to the process execution unit 113.
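The database lookup that pairs pose specifying information (P) with process specifying information (Q) can be sketched as below. The encoding of a condition as a tuple of finger states plus a Z-axis label, and the process name, are illustrative assumptions.

```python
# Hypothetical sketch of the pose-database search by the determination
# unit 112. Each entry pairs pose specifying information (P) with
# process specifying information (Q).
POSE_DATABASE = [
    ({"fingers": ("extended", "bent", "bent", "bent", "bent"),
      "z_axis": "up"},                       # P: thumbs-up condition
     "discard_detection_result"),            # Q: assumed process name
]

def find_process(fingers, z_axis):
    """Return the process for the first matching pose, or None."""
    for pose, process in POSE_DATABASE:
        if pose["fingers"] == tuple(fingers) and pose["z_axis"] == z_axis:
            return process
    return None  # no corresponding pose in the database

print(find_process(["extended", "bent", "bent", "bent", "bent"], "up"))
# → discard_detection_result
```

When `find_process` returns `None`, this corresponds to the case described next, in which no matching pose specifying information is found.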
When the determination unit 112 determines that the pose database held in the storage unit 12 contains no pose specifying information whose condition is satisfied by the information received by the receiving unit 111 (that is, no matching pose specifying information is found by the search), it outputs information to the effect that the pose database contains no corresponding pose.
The process execution unit 113 executes the process specified by the process specifying information output by the determination unit 112. The process executed here may be any of various processes: for example, a process of controlling the character so that the pose indicated by the user is not reflected in the character, a process of concealing the image portion in which the character's hand is displayed by a predetermined method such as applying a Gaussian blur, a process of sounding audio, or a process of compositing a predetermined image.
When the determination unit 112 outputs information to the effect that the pose database contains no corresponding pose, the process execution unit 113 executes a predetermined process. For example, the process execution unit 113 controls the pose of the character in the virtual space so that the character assumes the same bent/extended finger state and hand angle as those received by the receiving unit 111.
[Consideration of elapsed time]
In the description so far, the determination unit 112 searches for pose specifying information matching the detection results of the bent/extended state of the fingers and of the hand angle received by the receiving unit 111, and, when matching pose specifying information is found, immediately reads the process specifying information (Q) associated with that pose specifying information (P) and outputs it to the process execution unit 113; however, the present embodiment is not limited to this.
That is, the determination unit 112 may instead acquire the process specifying information associated with a matched hand pose, and output it to the process execution unit 113, only when the pose specifying information determined to match the detection results received by the receiving unit 111 over a predetermined period (during which detection results are received a plurality of times) is the same every time: in other words, only when every piece of detection result information received a plurality of times by the receiving unit 111 during the predetermined period matches the same hand pose among the hand poses identified by the pose specifying information held in the pose database (satisfies the condition of the same pose specifying information).
According to this example, when a pose identified by some piece of pose specifying information is formed only momentarily, for example while the user is changing hand poses, the process specified by the process specifying information associated with that pose specifying information is prevented from being executed, so that unnecessary processing is not performed while the character's pose is changing.
[Operation]
Next, a basic operation example of the information processing apparatus 1 according to the embodiment of the present invention will be described. In the following example of the present embodiment, it is assumed that the pose database of the storage unit 12 stores pose specifying information identifying a pose described by the condition "the thumb is extended; the index finger, middle finger, ring finger, and little finger are bent; and the Z axis points vertically downward" (a so-called thumbs-down pose), in association with process specifying information specifying a process of not setting the fingers of the character to the hand pose indicated by the user (the bent/extended state of the user's fingers detected by the finger sensors 241) but instead continuing the immediately preceding hand pose as if no instruction had been given.
When a user who has fixed a controller device 20 to each of the left and right hands, for example, extends the thumb of the right hand, bends the remaining fingers from the index finger to the little finger to grip the controller device 20, and points the thumb vertically upward, the plurality of finger sensors 241 of the controller device 20 output detection result information representing the distances to the corresponding fingers from the thumb to the little finger. In addition, the tilt sensor 242 of the sensor unit 24 outputs detection result information indicating that, as the hand angle, the Z-axis direction (the longitudinal direction of the controller device 20, on the side of the user's thumb) points vertically upward.
The control unit 11 of the information processing apparatus 1 executes the processing illustrated in FIG. 6 while maintaining a variable p, representing the previously retrieved pose specifying information and initialized to "" (the empty string), and a counter variable c, initialized to "0", and accepts the detection result information output by the controller device 20 (S11). Based on the detection result information of the finger sensors 241 among the detection result information accepted in step S11, the control unit 11 generates information representing the bent/extended state of each finger (S12). In this case, the control unit 11 generates information indicating the bent/extended state "extended, bent, bent, bent, bent."
The control unit 11 then searches the pose database for pose specifying information containing a condition satisfied by this information representing the bent/extended state and by the hand-angle information among the detection results accepted in step S11, and determines whether any was found (S13). In this example, the pose database does not contain the condition "the thumb is extended; the index finger, middle finger, ring finger, and little finger are bent; and the Z axis points vertically upward," so the control unit 11 determines that no pose specifying information was found by the search (S13: No) and executes a predetermined process. Here, the control unit 11 initializes the variable p to "" (the empty string) and the counter variable c to "0" and, as one example, controls the pose of the character in the virtual space so that it assumes the same bent/extended finger state and hand angle as those accepted in step S11 (S14); it then returns to step S11 and continues processing.
Next, when the user, with the thumb of the right hand extended and the remaining fingers from the index finger to the little finger bent to grip the controller device 20, points the thumb vertically downward, the plurality of finger sensors 241 of the controller device 20 output detection result information representing the distances to the corresponding fingers from the thumb to the little finger. In addition, the tilt sensor 242 of the sensor unit 24 outputs detection result information indicating that, as the hand angle, the Z-axis direction (the longitudinal direction of the controller device 20, on the side of the user's thumb) points vertically downward.
In this case, based on the detection result information accepted in step S11, the control unit 11 generates in step S12 information indicating that the fingers are in the bent/extended state "extended, bent, bent, bent, bent." Then, in step S13, since the pose database contains the condition "the thumb is extended; the index finger, middle finger, ring finger, and little finger are bent; and the Z axis points vertically downward," the control unit 11 determines that the pose specifying information identifying this so-called thumbs-down pose was found (S13: Yes).
The control unit 11 then determines whether the previously retrieved pose specifying information (represented by the variable p) matches the pose specifying information retrieved this time in step S13 (S15). If the previously retrieved pose specifying information matches the pose specifying information retrieved this time in step S13 (S15: Yes), the control unit 11 increments the counter variable c by "1" (S16) and determines whether the counter variable c has exceeded a predetermined threshold value (the count threshold value) (S17). When it is determined that the counter variable c has exceeded the predetermined count threshold value (S17: Yes), the pose specifying information determined to match the detection results accepted over the predetermined period (accepted a plurality of times within that period) has been the same every time, so the control unit 11 acquires the process specifying information (Q) stored in the pose database in association with the retrieved pose specifying information (represented by the variable p) (S18).
The control unit 11 then executes the process specified by the process specifying information acquired here (S19). For example, here the shape of the character's hand is set to a predetermined shape (for example, a shape with all fingers extended), and processing returns to step S11 and continues.
If, in step S15, the previously retrieved pose specifying information does not match the pose specifying information retrieved this time in step S13 (S15: No), the control unit 11 sets the variable p to the pose specifying information retrieved this time in step S13, sets the counter variable c to "1" (S20), and returns to step S11 to continue processing.
If, in step S17, the counter variable c has not exceeded the predetermined count threshold value (S17: No), processing moves to step S14 and continues.
In this example, the controller device 20 detects the bent/extended state of the fingers, the hand angle, and so on, and outputs detection result information, a plurality of times per 0.5 seconds, and the count threshold value is set to, for example, the number of times detection result information is input in 0.5 seconds. In this way, the character assumes the thumbs-down pose for at most 0.5 seconds, after which it is changed to a predetermined pose, preventing a pose deemed inappropriate for the scene (here, the thumbs-down pose) from being held.
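The loop of FIG. 6 (steps S11 through S20) can be sketched as follows. Only the debouncing logic using the variable p and the counter c follows the embodiment; the pose-search function, the sample stream, and the pose/process names are illustrative stand-ins.

```python
# Hypothetical sketch of the FIG. 6 loop. `search_pose` plays the role
# of steps S12/S13 (pose-database search); the associated process is
# modeled simply as recording the pose name.
def run_loop(detections, search_pose, count_threshold):
    """detections: iterable of (finger_states, hand_angle) samples."""
    p, c = "", 0                      # previous pose and counter (initialized)
    executed = []
    for fingers, angle in detections:      # S11/S12: accept a sample
        pose = search_pose(fingers, angle)  # S13: search the pose database
        if pose is None:                    # S13: No
            p, c = "", 0                    # S14 (the embodiment also mirrors
            continue                        #      the user's pose here)
        if pose == p:                       # S15: Yes
            c += 1                          # S16
            if c > count_threshold:         # S17: Yes
                executed.append(pose)       # S18/S19: run the process
        else:                               # S15: No
            p, c = pose, 1                  # S20
    return executed

# The thumbs-down pose must persist past the count threshold before the
# associated process runs.
search = lambda f, a: "thumbs_down" if a == "down" else None
samples = [(["extended"] + ["bent"] * 4, "down")] * 5
print(run_loop(samples, search, count_threshold=3))
# → ['thumbs_down', 'thumbs_down']
```

With five identical samples and a threshold of 3, the process fires only from the fourth sample onward, illustrating how a momentary pose never triggers it.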
[When the spread of the fingers can be identified]
In the present embodiment, when the controller device 20 connected to the information processing apparatus 1 can determine the tilt of at least some of the fingers (when some of the finger sensors 241 can determine not only the distance to a target object but also the direction of a relatively close target object), the pose may be determined using not only the information on the bent/extended state but also information on the tilt of the fingers.
As an example, when the tilt of the index finger can be determined, it becomes possible to distinguish whether the index finger and middle finger are extended parallel to each other (held together) or are extended so as to form a V shape. In this case, the pose specifying information for a pose in which the index finger and middle finger form a V sign can be described as the condition "the thumb, ring finger, and little finger are bent; the index finger and middle finger are extended, with the index finger extended in a direction away from the middle finger; and the Z axis is within ±10 degrees of the horizontal direction," and the control unit 11 can execute a process distinguished from the state in which the index finger and middle finger are extended in parallel (not forming a V shape).
[Hand contact determination]
Furthermore, in one example of the present embodiment, the detection result information of the acceleration sensor 243 of the controller device 20 may be used. For example, if, using the detection result information of the acceleration sensor 243, the pose specifying information is set to a condition representing a high-five pose, such as "the fingers from the first finger to the little finger are extended; the Z axis is within ±10 degrees of the horizontal direction; and the acceleration sensor 243 has detected an impact," it becomes possible to execute a process corresponding to a pose such as a high five.
[Hand movement information]
Furthermore, the information processing apparatus 1 of the present embodiment may obtain information on the movement of the user's hand using the detection result information of the acceleration sensor 243 and the tilt sensor 242. Since widely known methods exist for acquiring information on the movement of the user's hand from the detection result information of the acceleration sensor 243 and the tilt sensor 242 of the controller device 20, a detailed description is omitted here.
When information on the movement of the user's hand can be acquired in this way, setting the pose specifying information to a condition representing a high-five pose, such as "the fingers from the first finger to the little finger are extended; the Z axis is within ±10 degrees of the horizontal direction; the user's hand has risen; and the acceleration sensor 243 has detected an impact," enables the information processing apparatus 1 to execute a process corresponding to a pose such as a high five more reliably.
[Hand position determination]
In another example of the present embodiment, the position of the hand may be detected and used to determine the pose. That is, the information processing apparatus 1 of the present embodiment detects the position presentation unit 213 of the controller device 20 from the image data output by the imaging unit 14.
Specifically, in this device position detection process, the image data output by the imaging unit 14 is received, and the position presentation unit 213 of the controller device 20 fixed to the user's hand is detected from that image data. Since such methods of detecting a position presentation unit are widely known, a detailed description is omitted here.
When information on the position of the user's hand can be acquired in this way, setting the pose specifying information to a condition representing a high-five pose, such as "the fingers from the first finger to the little finger are extended; the Z axis is within ±10 degrees of the horizontal direction; the user's hand is at a position higher than the user's head; and the acceleration sensor 243 has detected an impact," enables the information processing apparatus 1 to execute a process corresponding to a pose such as a high five more reliably.
[Pattern registration by an application]
The information processing apparatus 1 of the present embodiment may execute an application program and, in accordance with instructions from the application program, perform processing to register pose specifying information and the process specifying information associated with it in the pose database, or to delete them from it.
According to this, for example in a scene in a game application where an expression of joy such as a high five is inappropriate, registering pose specifying information representing the high five in association with process specifying information that prevents the character from performing the high-five action (for example, setting the character to a predetermined posture such as a predetermined upright posture) makes it possible to prevent an inappropriate pose from being presented, depending on the scene.
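The register/delete operations an application program might issue can be sketched as below. The dict-based database, entry identifiers, and process name are illustrative assumptions only; the embodiment does not prescribe a concrete API.

```python
# Hypothetical sketch of an application program registering and later
# deleting a pose entry in the pose database, as scene conditions change.
pose_database = {}

def register_pose(pose_id, pose_condition, process):
    """Associate pose specifying information with process specifying info."""
    pose_database[pose_id] = (pose_condition, process)

def delete_pose(pose_id):
    """Remove the entry if present (no error if already absent)."""
    pose_database.pop(pose_id, None)

# While a scene in which a high five is inappropriate is active, map the
# high-five pose to a default upright posture instead of the user's pose.
register_pose("high_five",
              {"fingers": ("extended",) * 5, "z_axis": "horizontal"},
              "set_character_upright")
print("high_five" in pose_database)   # → True
delete_pose("high_five")              # scene ends: restore normal behavior
print("high_five" in pose_database)   # → False
```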
[Processing in consideration of game rating and region]
Considering that the appropriateness of a hand pose also depends on the user's age, region of residence, and so on, the pose database may hold pose specifying information, a condition for executing a process when the pose identified by that pose specifying information is indicated (hereinafter referred to as an execution condition), and process specifying information, all associated with one another.
Here, an execution condition is, for example, a condition relating to the rating information (age restriction information) of the application program being executed, or to the region in which it is being executed (or the language selected for message display and the like); as examples, it may be a condition such as "the game rating is 'A' or 'B'" or a condition such as "the language selected for displaying messages is 'Greek'."
In the pose database, then, considering that in Greece and elsewhere a hand pose in which the index finger and middle finger are extended, the other fingers are bent, and the palm faces the other party is regarded as socially inappropriate, the pose specifying information representing the pose presenting a V sign, namely "the thumb, ring finger, and little finger are bent; the index finger and middle finger are extended; and the Z axis is within ±10 degrees of the horizontal direction," is registered in association with the execution condition "the language selected for displaying messages is 'Greek' and the game rating is other than 'Z'" and with process specifying information whose specified process is to prevent the character from presenting the indicated pose (for example, setting the character to a predetermined posture such as a predetermined upright posture).
According to this example, while an online game whose rating is "A" is being executed, the information processing apparatus 1 acquires information on the languages selected for message display by the users (including not only the user of the information processing apparatus 1 but also online-game opponents), and when "Greek" is included in the acquired language information, the execution condition is deemed satisfied; even if one of the users (including not only the user of the information processing apparatus 1 but also online-game opponents) forms a V sign with the hand to which the controller device 20 is fixed so as to assume the pose presenting a V sign, that user's character does not present the V sign, and an opponent who has selected Greek is not inadvertently offended.
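Evaluating an execution condition of this kind can be sketched as follows. The condition fields and sample values are illustrative assumptions; the embodiment only requires that the condition be tested before the associated process is applied.

```python
# Hypothetical sketch of testing an execution condition attached to a
# pose entry: the associated process runs only when the condition holds.
def condition_satisfied(cond, selected_languages, rating):
    """True when any user selected the listed language and the rating
    is not among the excluded ratings."""
    return (cond["language"] in selected_languages
            and rating not in cond["excluded_ratings"])

# Execution condition for the V-sign pose from the example above.
v_sign_condition = {"language": "Greek", "excluded_ratings": {"Z"}}

# An opponent has selected Greek and the game is rated "A": suppress the pose.
print(condition_satisfied(v_sign_condition, {"English", "Greek"}, "A"))
# → True
# Game rated "Z": the condition is not satisfied, the pose is shown as-is.
print(condition_satisfied(v_sign_condition, {"English", "Greek"}, "Z"))
# → False
```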
[Modification]
In the present embodiment illustrated in FIG. 6, the count threshold value compared in step S17 may differ for each piece of pose specifying information. In this case, the count threshold values may be registered in the pose database in association with the pose specifying information. In this example, in step S20 the control unit 11 sets the variable p to the pose specifying information retrieved this time in step S13 and sets the counter variable c to "1", and also reads the count threshold value registered in association with the pose specifying information retrieved this time in step S13 and uses it in step S17.
[Relative position with respect to other devices]
 In the description so far, as the method of detecting the position of the controller device 20 (the position of the hand of the user wearing the controller device 20), a method in which the imaging unit 14 of the information processing apparatus 1 captures an image of the controller device 20 has been described. In the present embodiment, however, the position of the controller device 20 may be detected by another method.
 As an example, when the user wears another device in addition to the controller device 20 (for example, a display device such as a head mounted display worn on the head), the position of the controller device 20 may be detected on the basis of its position relative to that other device.
 Specifically, if the other device is a head mounted display, the information processing apparatus 1 detects the position of the head mounted display on the basis of the image captured by the imaging unit 14 (posture information, such as the angle of the head mounted display with respect to the vertical direction, may also be detected), and also detects the position of the controller device 20.
 The information processing apparatus 1 then takes the detected position of the head mounted display as a reference position (its position along each of three axes: the optical axis direction of the imaging unit 14, and, within the plane whose normal is that optical axis direction, the vertical direction (the direction of gravity) and the direction orthogonal to it), and obtains the coordinates of the position of the controller device 20 relative to this reference position. In the following, vertically upward is taken as the positive direction.
 Since the head mounted display is worn on the user's head, the vertical component of its position coordinates can be assumed to represent the height of the user's head. Accordingly, when the value obtained by subtracting the vertical component of the position coordinates of the head mounted display from the vertical component of the position coordinates of the controller device 20 exceeds a predetermined threshold and is positive (the controller device 20 is higher than the head mounted display), the information processing apparatus 1 can determine that the user's hand is higher than the user's head.
 Conversely, when the value obtained by subtracting the vertical component of the position coordinates of the head mounted display from the vertical component of the position coordinates of the controller device 20 is negative and its magnitude exceeds the predetermined threshold (the controller device 20 is lower than the head mounted display), the information processing apparatus 1 can determine that the user has lowered his or her hand.
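The head-relative height check of the two preceding paragraphs reduces to one signed comparison. The following is an illustrative Python sketch, not part of the original disclosure; the function name and the threshold value are assumptions.

```python
# Illustrative sketch of the vertical-component comparison described above:
# the controller's height is compared with the head mounted display's height.
# The function name and the threshold value are hypothetical.

THRESHOLD = 0.15  # metres; the "predetermined threshold" (assumed value)

def hand_relative_to_head(controller_z, hmd_z, threshold=THRESHOLD):
    """controller_z, hmd_z: vertical coordinates, upward positive.

    Returns "raised" when the hand is above the head by more than the
    threshold, "lowered" when it is below by more than the threshold,
    and None when the difference is within the threshold.
    """
    diff = controller_z - hmd_z  # positive: controller above the HMD
    if diff > threshold:
        return "raised"
    if diff < -threshold:
        return "lowered"
    return None
```

The same function applies unchanged when the relative position is obtained from a camera on the head mounted display rather than from the imaging unit 14, since only the vertical component of the relative coordinates is used.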
 Further, in one example of the present embodiment, the head mounted display may have a camera, and the position of the tip of the hand (or of the controller device 20) relative to the head mounted display may be detected.
 In this case, when the image captured by the camera of the head mounted display shows that the controller device 20 is vertically above the head mounted display and separated from it by a distance exceeding a predetermined threshold, it can be determined that the user is raising a hand (the user's hand is higher than the user's head).
 Furthermore, although a head mounted display has been taken here as an example of the other device, the present embodiment is not limited to this; the movement of the user's hand may be determined by detecting the position of the controller device 20 relative to any of various other devices, such as headphones or a headband worn on the user's head, or a wristwatch-type device worn on the user's arm.
DESCRIPTION OF REFERENCE SYMBOLS 1 information processing apparatus, 11 control unit, 12 storage unit, 13 interface unit, 14 imaging unit, 15 output unit, 20 controller device, 21 control unit, 22 storage unit, 23 operation interface, 24 sensor unit, 25 communication unit, 111 acceptance unit, 112 determination unit, 113 processing execution unit, 210 device main body, 211 grip unit, 212 operation unit, 213 position presentation unit, 220 fixture, 231 button operation unit, 241 finger sensor, 242 tilt sensor, 243 acceleration sensor.

Claims (6)

  1.  An information processing apparatus connected to a device fixed to a user's hand, wherein
     the device includes a finger sensor that detects a bending/extension state of at least some of the fingers, and
     an angle sensor that detects an angle of the hand to which the device is fixed, and
     the information processing apparatus is connected to holding means that holds, for at least one hand pose, information specifying the hand pose in association with process specifying information representing a process to be performed, and includes:
     accepting means that accepts, from the device, a detection result of the bending/extension state of the fingers by the finger sensor and a detection result of the angle of the hand;
     determining means that determines, on the basis of the accepted detection results, whether the detection results match any of the hand poses specified by the information held in the holding means, and, when determining that they match one of the hand poses, acquires the process specifying information associated with the information specifying the matching hand pose; and
     executing means that executes the process specified by the process specifying information acquired by the determining means.
  2.  The information processing apparatus according to claim 1, wherein
     the accepting means repeatedly accepts, at predetermined timings, the detection result of the bending/extension state of the fingers by the finger sensor and the detection result of the angle of the hand from the device, and
     the determining means acquires the process specifying information associated with the information specifying a matching hand pose when determining that all of the detection results repeatedly accepted over a predetermined time match the same hand pose among the hand poses specified by the information held in the holding means.
  3.  The information processing apparatus according to claim 1 or 2, wherein
     the process specifying information includes process specifying information indicating that the accepted detection result information is to be discarded.
  4.  A control method using an information processing apparatus that is connected to a device fixed to a user's hand, the device including a finger sensor that detects a bending/extension state of at least some of the fingers and an angle sensor that detects an angle of the hand to which the device is fixed, the information processing apparatus being connected to holding means that holds, for at least one hand pose, information specifying the hand pose in association with process specifying information representing a process to be performed, the method comprising:
     a step in which acquiring means acquires, from among the detection result of the bending/extension state of the fingers by the finger sensor, the detection result of the angle of the hand, a detection result of the position of the hand, and a detection result of the acceleration of the device, detection results including at least the detection result of the bending/extension state of the fingers and the detection result of the angle of the hand;
     a step in which determining means determines, on the basis of the acquired detection results, whether the detection results match any of the hand poses specified by the information held in the holding means, and, when determining that they match one of the hand poses, acquires the process specifying information associated with the information specifying the matching hand pose; and
     a step in which executing means executes the process specified by the acquired process specifying information.
  5.  A program for causing an information processing apparatus, which is connected to a device fixed to a user's hand, the device including a finger sensor that detects a bending/extension state of at least some of the fingers and an angle sensor that detects an angle of the hand to which the device is fixed, and which is connected to holding means that holds, for at least one hand pose, information specifying the hand pose in association with process specifying information representing a process to be performed, to function as:
     accepting means that accepts, from the device, a detection result of the bending/extension state of the fingers by the finger sensor and a detection result of the angle of the hand;
     determining means that determines, on the basis of the accepted detection results, whether the detection results match any of the hand poses specified by the information held in the holding means, and, when determining that they match one of the hand poses, acquires the process specifying information associated with the information specifying the matching hand pose; and
     executing means that executes the process specified by the process specifying information acquired by the determining means.
  6.  A computer-readable recording medium storing a program for causing an information processing apparatus, which is connected to a device fixed to a user's hand, the device including a finger sensor that detects a bending/extension state of at least some of the fingers and an angle sensor that detects an angle of the hand to which the device is fixed, and which is connected to holding means that holds, for at least one hand pose, information specifying the hand pose in association with process specifying information representing a process to be performed, to function as:
     accepting means that accepts, from the device, a detection result of the bending/extension state of the fingers by the finger sensor and a detection result of the angle of the hand;
     determining means that determines, on the basis of the accepted detection results, whether the detection results match any of the hand poses specified by the information held in the holding means, and, when determining that they match one of the hand poses, acquires the process specifying information associated with the information specifying the matching hand pose; and
     executing means that executes the process specified by the process specifying information acquired by the determining means.
PCT/JP2017/037732 2016-10-28 2017-10-18 Information processing apparatus, control method, program, and recording medium WO2018079383A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018547603A JP6687749B2 (en) 2016-10-28 2017-10-18 Information processing apparatus, control method, program, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016212079 2016-10-28
JP2016-212079 2016-10-28

Publications (1)

Publication Number Publication Date
WO2018079383A1 true WO2018079383A1 (en) 2018-05-03

Family

ID=62024744

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/037732 WO2018079383A1 (en) 2016-10-28 2017-10-18 Information processing apparatus, control method, program, and recording medium

Country Status (2)

Country Link
JP (1) JP6687749B2 (en)
WO (1) WO2018079383A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102549168B1 (en) * 2021-05-25 2023-06-30 주식회사 아이팝 Apparatus for recognizing hand gesture using flex sensor and method thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008065860A (en) * 2007-11-26 2008-03-21 Olympus Corp Operation input device
JP2012175282A (en) * 2011-02-18 2012-09-10 Sharp Corp Manipulation apparatus, playback device and television receiver
JP2013533537A (en) * 2010-06-02 2013-08-22 マイクロソフト コーポレーション Avatar / gesture display restrictions
JP2014102674A (en) * 2012-11-20 2014-06-05 Alpine Electronics Inc Gesture input device
WO2016038953A1 (en) * 2014-09-10 2016-03-17 ソニー株式会社 Detection device, detection method, control device, and control method
JP2016081286A (en) * 2014-10-16 2016-05-16 株式会社東芝 Terminal operation support apparatus and terminal operation support method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2020031271A1 (en) * 2018-08-07 2021-04-30 株式会社ソニー・インタラクティブエンタテインメント Controller device
US11648464B2 (en) 2018-08-07 2023-05-16 Sony Interactive Entertainment Inc. Controller device

Also Published As

Publication number Publication date
JPWO2018079383A1 (en) 2019-03-28
JP6687749B2 (en) 2020-04-28

Similar Documents

Publication Publication Date Title
US10969867B2 (en) Information processing system, controller device, controller device control method and program
CN109690447B (en) Information processing method, program for causing computer to execute the information processing method, and computer
JP6710285B2 (en) Information processing apparatus, control method, program, and storage medium
US20080132335A1 (en) Method of determining operation input using game controller including acceleration detector
JP6834614B2 (en) Information processing equipment, information processing methods, and programs
US10512833B2 (en) Presentation method, swing analysis apparatus, swing analysis system, swing analysis program, and recording medium
JP6716028B2 (en) Control device, information processing system, control method, and program
WO2018079383A1 (en) Information processing apparatus, control method, program, and recording medium
KR20200051938A (en) Method for controlling interaction in virtual reality by tracking fingertips and VR system using it
JP2016077346A (en) Motion support system, motion support method, and motion support program
JP2017191426A (en) Input device, input control method, computer program, and storage medium
JP6661783B2 (en) Information processing system, information processing apparatus, control method, and program
JP3138145U (en) Brain training equipment
US10948978B2 (en) Virtual object operating system and virtual object operating method
JP2015018485A (en) Electronic control device, control method, and control program
TW202144983A (en) Method of interacting with virtual creature in virtual reality environment and virtual object operating system
EP3813018A1 (en) Virtual object operating system and virtual object operating method
TW202122970A (en) Behavior-based configuration method and behavior-based configuration system

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018547603

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17863923

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17863923

Country of ref document: EP

Kind code of ref document: A1