System and method for enhanced gesture-based interaction

Info

Publication number: EP2613223A1
Application number: EP20120000116
Authority: EP
Grant status: Application
Legal status: Pending
Prior art keywords: control, device, system, user, hand
Other languages: German (de), French (fr)
Inventor: Gilles Pinault
Current Assignee: Softkinetic Software
Original Assignee: Softkinetic Software

Classifications

    • G06F3/017: Gesture-based interaction, e.g. based on a set of recognized hand gestures
    • A63F13/06: Accessories using player-operated means for controlling the position of a specific area display
    • A63F13/213: Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/28: Output arrangements for video game devices responding to control signals for affecting ambient conditions, e.g. vibrating players' seats
    • G06F3/005: Input arrangements through a video camera
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/016: Input arrangements with force or tactile feedback as computer-generated output to the user
    • G06F3/0304: Detection arrangements using opto-electronic means
    • G06F3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers
    • A63F13/31: Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • A63F2300/1012: Input arrangements involving biosensors worn by the player, e.g. for measuring heart beat or limb activity
    • A63F2300/1031: Interface with the game device using a wireless connection, e.g. Bluetooth or infrared
    • A63F2300/1037: Control signals converted into a haptic signal, e.g. using force feedback
    • A63F2300/1043: Input arrangements characterised by constructional details
    • A63F2300/1056: Input involving pressure-sensitive buttons
    • A63F2300/1081: Input via voice recognition
    • A63F2300/1087: Input comprising photodetecting means, e.g. a camera
    • A63F2300/1093: Photodetecting input using visible light
    • G06F2203/0381: Multimodal input, e.g. voice plus gesture

Abstract

Described herein is a wireless remote control device (100) which can be worn on the hand (150) of a user to provide hardware-based control signals which can be combined with gesture-based control signals in enhanced gesture recognition systems. The device (100) comprises a housing (110) containing a sensing unit with at least one control button (120, 130, 140) which is capable of generating a control signal for an associated computerised system. The housing (110) comprises an adjustable band which is ergonomically designed for use on the hand (150) of the user. The device (100) also comprises a communication module for transmitting the generated control signal to the computerised system and for receiving signals from the computerised system. A feedback module is also provided for providing contextual feedback to the user, through the housing (110), in accordance with a signal determined by an application running on the computerised system. The control device (100) includes a controller which controls the sensing module, the communication module and the feedback module in accordance with each generated control signal. The computerised system utilises information obtained from the control device (100) together with information obtained from a gesture recognition system, and contextually manages signals provided by, and feedback signals provided to, the control device with respect to an application running on the computerised system.

Description

    Field of the Invention
  • [0001]
The present invention relates to a system and a method for human-to-machine interaction, and is more particularly, although not exclusively, concerned with wireless remote control devices for enhanced gesture-based interaction with an interactive computerised system.
  • Background of the Invention
  • [0002]
In the human-to-machine interaction domain, device controllers have been used for many years as interfaces between human users and hardware platforms. Through such interfaces, the user has been able to provide commands to a computerised system having means for receiving, processing, and executing actions in accordance with those commands. Device controllers may be wired or wireless and include mouse devices for computers, computer keyboards, TV remote controllers, game controllers, pads, virtual reality gloves, etc. All of these device controller systems share one common characteristic: the use of known sensing devices, such as trigger buttons, multi-directional pads, inertial sensor units, etc.
  • [0003]
Furthermore, with the advance of technology in the virtual reality domain as well as in the consumer electronics domain, several systems have been developed that use such devices or a combination thereof. In particular, it has now become necessary for the user to carry, wear or handle device controllers in order to improve immersion into a virtual reality environment or in order to improve the user experience for interactions with computerised systems.
  • [0004]
For example, in US-A-2010/0304868, a multi-positional three-dimensional handled controller is described for interactivity with computerised systems. The controller provides control for a variety of applications and simulations, whilst providing an intuitive interface to interact with both two-dimensional and three-dimensional scenarios generated as a result of a computer program being executed through the computerised system of a game client. Such game clients can be a dedicated games console which can execute a locally stored program, or can be connected to the internet so that a user can interact with another user at a different location. The controller is configured for use in multiple positions in order to provide flexibility in relation to the manipulation and handling of the controller. The controller comprises a handle having sensors that detect movement of input features relative to the handle, for example, a gyroscope, accelerometers, buttons, a joystick and a trackball. In one embodiment, the controller is hand-held and can operate in a "lollipop" mode where only the motion of the hand of a user provides wireless signals for interaction with an associated games console using a wireless protocol falling within the IEEE 802.11 specification. In another embodiment, the controller is again hand-held but operated in conjunction with a 2D camera that tracks a coloured ball on the controller to detect gestures and which provides additional wireless signals to be processed by the games console. In a further embodiment, the controller can be operated with two hands. The ergonomics of the controller described in US-A-2010/0304868 require that the user actively and voluntarily holds the controller with a hand, and do not allow free natural movements, especially finger movements, for gesture recognition within a gesture-based interaction system.
  • [0005]
In US-A-2011/0269544, a hand-held computer input device is described for interacting with a games console. The device comprises a body on which protrusions are formed, the fingers and thumb of the hand of the user engaging with respective ones of the protrusions. Each protrusion is configured to detect movement relative to one or more of the fingers of the user. Movement of the hand of the user, as well as of his/her thumb and fingers, is sensed by the games console, and tactile feedback can be provided to the user in response to commands transmitted to the games console. A pair of devices may be utilised, but these are handed so that one device fits only the right hand and the other only the left hand. The hand-held interactive device described in US-A-2011/0269544 may be considered ergonomic from the point of view of fitting the hand and fingers of a user, but it requires the user to grip the device for operation and does not allow free movement of the hands and/or fingers for gesture recognition within a natural gesture-based computerised system.
  • [0006]
More recently, gesture recognition technologies based on imaging sensing signal analysis have created interactive systems which allow the user to provide commands simply by using predetermined gestures to interact with the computerised system controlling and running such interactive systems. Such gesture recognition technologies do not make use of any devices worn by the user, and make use of either full-body movement analysis or hands-only movement analysis.
  • [0007]
However, when using either independent hardware-based controllers or natural gesture-based movements, there are limitations in the number of predefined interactions that can be performed. A hardware remote controller device must, for example, be handled, and it does not support finger tracking or reliable hand gesture recognition. Similarly, a full-body natural gesture recognition system does not allow a simple click action to be performed in the same way as a button activated by a finger on a remote controller would be, as it requires a specific movement to be performed exclusively of other movements. Moreover, when using gesture recognition systems, it is difficult to perform a simple click action while simultaneously moving a hand around a control screen. One reason is that a click action tends to be performed by either detecting a still pose of a hand for a predetermined duration at a specific location, or detecting a forward and backward movement along a predefined axis, for example, the Z or depth axis, within a predetermined period of time. It is therefore not possible to make use of one single hand to perform the two actions at the same time.
  • [0008]
Furthermore, the combination of a natural gesture recognition system with an existing remote control device is limited, as the existing apparatus has to be handled, introducing strong constraints on the kind and number of gestures that can be performed and recognised.
  • Summary of the Invention
  • [0009]
It is an object of the present invention to provide a control device whose ergonomics are specifically suited for use both standalone and in combination with a gesture analysis system for interacting with a computerised system.
  • [0010]
    It is another object of the present invention to provide a control device that can be used within a natural gesture recognition system providing additional functionalities and/or extra information for the computerised system.
  • [0011]
    It is yet another object of the present invention to provide a control device that can be used for providing additional and refined information or data relating to the position and the movements or gestures of a user that can be tracked by an imaging system associated with the computerised system.
  • [0012]
    It is a further object of the present invention to provide a control device that can be used within a natural gesture recognition system providing additional functionality in the form of bi-directional exchange of information between the user and the computerised system.
  • [0013]
    It is yet a further object of the present invention to provide a method of controlling and using the control device in a computerised system to provide enhanced user experiences by allowing combination of natural gesture based and hardware device based interactions with a computerized system, and without limiting the individual features provided by either the device itself or the gesture recognition system.
  • [0014]
    The term "control button" as used herein is intended to include buttons, switches, directional pads and joysticks as well as any other suitable device through which a user can input a control signal. Such other suitable device, namely "extra control" may be, for example, and not exclusively a microphone for providing voice commands to the computerised system, an accelerometer, a gyroscope.
  • [0015]
    In accordance with a first aspect of the present invention, there is provided a user-operated control device specifically designed for use in combination with a three-dimensional imaging system forming part of an interactive computerised system, the three-dimensional imaging system including a three-dimensional camera, the computerised system including means for processing three-dimensional information of at least a part of a user present within the camera frustum, the control device comprising:-
    • a housing;
    • a power supply unit;
    • a sensing module including at least one control button or an extra control mounted in the housing, each control button or extra control, namely a "control means", being capable of generating a control signal for the computerised system when operated;
    • a communication module located within the housing and through which each control signal generated by a control means is transmitted to the computerised system;
    • a feedback module mounted in the housing for providing contextual feedback to the user in accordance with a signal generated by an application running on the computerised system; and
    • a controller mounted in the housing, the controller controlling the sensing module, the communication module and the feedback module in accordance with each generated control signal to be sent to, or received from, the computerised system;
    • characterised in that the housing comprises an ergonomic adjustable band mountable on an extremity of a user and allowing natural gesture-based interactions with the interactive computerised system whilst situated in the frustum of the three-dimensional imaging system.
  • [0016]
    As the control device comprises an adjustable band, it can easily be placed on and removed from the extremity, for example, the hand of the user. This means that in multi-player situations, each user can quickly pass the control device to the next user or from one hand to the other. This also means that the use of the control device is easy and intuitive and is not left- or right-handed specific.
  • [0017]
    The control device of the present invention furthermore has the advantage that information can be provided to the computerised system while allowing free movement of the hand and fingers which can be used at the same time in an appropriate gesture recognition system.
  • [0018]
In addition, by having at least one control button, the device provides additional features which are useful when interacting, using gesture recognition, with an object in a virtual environment; for example, the object can be grabbed by depressing a button on the device, moved to another location using a gesture, and released at that other location by releasing the button. Moreover, the operation of one or more control buttons does not interfere with the ability of the user to use natural gestures at the same time, as required by a gesture recognition system.
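As a hedged illustration of this grab-move-release interaction (a minimal sketch in Python; all names are hypothetical and the patent does not prescribe any implementation), button events from the device can be combined with hand positions reported by the gesture recognition system as follows:

```python
import math

# Minimal sketch: a button on the band grabs the nearest virtual object,
# a hand gesture carries it, and releasing the button drops it.
class DragAndDrop:
    def __init__(self, scene_objects):
        self.scene_objects = scene_objects  # {name: (x, y, z)} virtual objects
        self.held = None                    # name of the object being carried

    def on_button_down(self, hand_pos):
        # Grab the virtual object nearest to the tracked hand position.
        self.held = min(self.scene_objects,
                        key=lambda n: math.dist(self.scene_objects[n], hand_pos))

    def on_hand_moved(self, hand_pos):
        # While the button stays depressed, the object follows the gesture.
        if self.held is not None:
            self.scene_objects[self.held] = hand_pos

    def on_button_up(self, hand_pos):
        # Release: drop the object at the hand's current virtual location.
        if self.held is not None:
            self.scene_objects[self.held] = hand_pos
            self.held = None

dnd = DragAndDrop({"cube": (0.0, 0.0, 2.0), "ball": (1.0, 0.5, 1.5)})
dnd.on_button_down((0.1, 0.0, 1.9))   # grabs the cube
dnd.on_hand_moved((0.5, 0.2, 1.0))    # cube follows the hand
dnd.on_button_up((0.5, 0.2, 1.0))     # cube dropped at the new location
```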
  • [0019]
    Preferably, the adjustable band has a first surface which defines a palm side. Each control button is mounted on the palm side of the adjustable band and is operated by at least one finger of the user.
  • [0020]
The adjustable band also has a second surface which defines a back side; the back side may have at least one control button to be operated by other fingers of the user. It will be appreciated that this may be a finger from the other hand of the user.
  • [0021]
    In addition, another portion may be defined between the first surface and the second surface, the portion including at least one control button to be operated by a thumb of the user.
  • [0022]
    In one embodiment, the feedback module of the adjustable band may also comprise a vibrator element. The vibrator element is located adjacent to a surface of the adjustable band which is closer to the user when worn. The vibrator element may be located on either the palm side of the adjustable band or on another part of the adjustable band, but preferably, at a location where it does not interfere with any of the other units that generate control information, such as, the accelerometer(s) or the gyroscope.
  • [0023]
    In addition, the feedback module may further comprise at least one of: a speaker module; a display screen; a light-emitting diode device; and a matrix touch feedback module.
  • [0024]
    At least a portion of the adjustable band comprises a flexible material. In one embodiment, the whole of the band may comprise flexible material. The flexible material may comprise an elastic material.
  • [0025]
    The provision of a flexible material enables the band to be easily fitted to the hand of a user, regardless of his/her hand size and gender, and is both comfortable and adjustable.
  • [0026]
    In one embodiment, the band is simply slipped on and off the hand permitting easy hand-over between users.
  • [0027]
In one embodiment, the band is adjustable to fit different sizes of hand and comprises a two-part adjustable fastening, each part of the two-part adjustable fastening being located at respective ends of the adjustable band. A preferred two-part adjustable fastening comprises a hook and loop fastener, but it will be appreciated that other two-part fastenings can also be used, for example, magnetic- or mechanical-based closure systems.
  • [0028]
    In accordance with another aspect of the present invention, there is provided a method of controlling an interactive computerised system as defined by the claims.
  • [0029]
Step e) may comprise providing information relating to the part of the user to which the control device is attached.
  • [0030]
In addition, the provided information relating to the user part may be a position, velocity, acceleration and/or identification. The information can be provided by the device itself, or refined by combining data coming from the device with other data determined by a gesture analysis system. In a preferred embodiment, the provided information relating to the portion of the user is used to resolve ambiguities in information measured by the three-dimensional imaging system.
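One plausible way to combine the two data sources, shown here purely as a sketch under stated assumptions (the patent does not specify a fusion algorithm), is a simple complementary filter blending the camera-measured hand position with a position predicted from the device's accelerometer:

```python
# Assumed complementary-filter sketch: alpha weights the camera measurement,
# (1 - alpha) weights the inertial prediction. Gravity is assumed already
# removed from the accelerometer signal.
class HandEstimator:
    def __init__(self, pos=(0.0, 0.0, 0.0), vel=(0.0, 0.0, 0.0), alpha=0.8):
        self.pos, self.vel, self.alpha = pos, vel, alpha

    def step(self, accel, camera_pos, dt=1 / 60):
        # Inertial prediction from the band's accelerometer.
        self.vel = tuple(v + a * dt for v, a in zip(self.vel, accel))
        predicted = tuple(p + v * dt for p, v in zip(self.pos, self.vel))
        # Correct the prediction with the 3D camera measurement.
        self.pos = tuple(self.alpha * c + (1 - self.alpha) * p
                         for c, p in zip(camera_pos, predicted))
        return self.pos

est = HandEstimator()
print(est.step(accel=(0.0, 0.0, 0.1), camera_pos=(0.01, 0.0, 0.0)))
```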
  • [0031]
    In accordance with a further aspect of the present invention, there is provided a gesture-based interactive computerised system comprising a three-dimensional imaging system; a gesture recognition system; a control device as described above; and a computerised system associated with the three-dimensional imaging system and the gesture recognition system, the computerised system processing images captured by the three-dimensional imaging system and gestures recognised by the gesture recognition system, the computerised system using data from the control device in conjunction with the gestures recognised by the gesture recognition system.
  • [0032]
In particular, this gesture-based interactive system allows the input of control signals using gestures performed by the hand as well as hardware-based control signals generated by a user interacting with the control device. Moreover, the control device may be used to provide information relating to a predetermined action or event, or particular data or parameters, that cannot be generated by a natural gesture interaction in the context defined by the interactive computerised system. An example of this is performing a click action while the hand is continuously scrolling a hand-controlled cursor on a display screen, or performing a drag and drop operation, as will be described in more detail below.
  • Brief Description of the Drawings
  • [0033]
    For a better understanding of the present invention, reference will now be made, by way of example only, to the accompanying drawings in which:-
    • Figure 1 illustrates a schematic perspective back view of a control device in accordance with one simple embodiment of the present invention;
    • Figure 2 illustrates a schematic perspective front view of the control device shown in Figure 1; and
    • Figure 3 illustrates a block diagram illustrating the components of the control device shown in Figures 1 and 2 and the controller with which it interacts.
  • Description of the Invention
  • [0034]
The present invention will be described with respect to some particular embodiments and with reference to certain drawings, but the invention is not limited thereto. The drawings described are only schematic and are non-limiting. In the drawings, the size and exact position of some of the elements may be exaggerated and not drawn to scale for illustrative purposes.
  • [0035]
The present invention relates to the human-to-machine interaction domain, and in particular, to a system and a method using a wireless control device designed to be used in combination with a natural gesture recognition system in order to enhance the user experience when interacting. In particular, the control device of the present invention provides additional features to natural gesture recognition based computerised systems without limiting the features of the said gesture recognition system. It is furthermore able to provide feedback information to the user at interaction time so as to enhance the user experience in comparison to any computerised system using only one of either the gesture recognition or the hardware device control systems.
  • [0036]
The controller devices described in US-A-2010/0304868 and US-A-2011/0269544 require the user to be active in holding the control devices in one or both hands in order to operate the controls and provide control signals for the computerised system which forms part of the associated games console. However, some users tend to struggle with a single size of device, for example, children, whose hands are considerably smaller than those of adults. Naturally, a solution to this problem of different hand sizes would be to provide more than one size of device so that different sizes of hands can be accommodated, albeit at increased cost to the user.
  • [0037]
When designing marker-less gesture-based interactions for full-body applications, it is often challenging to obtain the required data due to the lack of detailed information provided by a 3D camera, for example, when tracking the most precise tool of the human body, the hand, or, even more complex, the fingers. This is particularly a challenge when the user is at a substantial distance, typically more than 3m, from the 3D camera, and the computerised system is tracking parts of users within a room. One limitation when tracking a hand gesture or hand movements is that it is difficult to determine whether the hand is in an active state, for example, grabbing an item, or in a passive state, for example, just passing over the object.
  • [0038]
Another limitation when tracking a hand gesture or hand movements is the lack of feedback to the user in gesture-based recognition systems, as they tend to be based on the display of information on a screen. In such systems, it is often difficult for the user to estimate when a virtual representation of his/her hand is in contact with a virtual object. The use of visual feedback tends to clutter the display or screen and does not always provide a sufficiently natural and/or immersive feel for the user.
  • [0039]
A further limitation of marker-less gesture recognition systems is the inability to recognise hand orientations precisely enough, due to the lack of spatial resolution, ambiguity, occlusion by an object in the field of view of the 3D camera, potential self-occlusions, or the tracked object being at a substantial distance, typically greater than 3m, from the 3D camera.
  • [0040]
    The present invention provides a remote control device that overcomes the above-mentioned limitations and which takes into account suitable functionality, ergonomics and ease of use of the device. The main features provided by the remote control device include:-
    • (i) providing information relative to the hand status, including at least two reliable states (and preferably more);
    • (ii) providing additional data to complement or refine those determined from a 3D camera based gesture recognition system;
    • (iii) providing additional feedback to the user for balancing the transparency and the usability of the system; and
    • (iv) providing ergonomic quality suited to complement marker-less full-body natural gesture interactions with an application.
  • [0041]
    The device of the present invention comprises a band that can be worn over the hand and which incorporates at least one control button on the palm-side which can be operated by fingers of the same hand, the fingers of the user still being able to move freely in the same way they would without the presence of the control device. The device is used within a three-dimensional (3D) imaging system (including a 3D camera) forming part of a computerised system in a games console so that both gestures and direct control signals from the band can be processed by the system to enhance the user experience whilst interacting with a game running and being controlled by the computerised system.
  • [0042]
In Figures 1 and 2, a controller device 100 in accordance with the present invention is shown. The device 100 comprises an adjustable band 110 on which are mounted three control buttons 120, 130, 140, as is shown more clearly in Figure 2. As shown, the band 110 fits over a hand 150 of a user and is held in position on the hand by a fastening 160. The fastening 160 provides one position where the band 110 can be adjusted to fit the hand 150 of the user. Typically, the fastening 160 comprises a two-part fastening that can be adjusted to fit the hand, where one part of the fastening is fixed to one end 170 of the band and the other part of the fastening is fixed to the other end 180 of the band, so that when the two ends 170, 180 overlap, the band 110 is secured around the hand 150. One such two-part fastener is a "hook and loop" fastener, for example, Velcro ™ (trademark of The Velcro Company, Manchester, NH, USA). However, it will be appreciated that other forms of two-part fasteners can also be used, for example, hooks and eyes; a magnet and a metal plate; press-studs etc.
  • [0043]
    The band 110 includes at least a two-axis accelerometer, preferably a three-axis accelerometer, and a gyroscope (not shown as it is embedded inside the band) for measuring orientation angles and the direction of gravity, as well as movement speed and acceleration in any direction. This has the advantage that the size of the device 100 may be considerably smaller than any hand-held control device. The accelerometer may then be able to provide the equivalent of two control axes of a standard joystick device merely by controlling pitch and/or roll inclination of the hand 150.
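As a sketch of how two joystick-like control axes might be derived from the gravity direction reported by a three-axis accelerometer (the axis conventions and the 45-degree scaling are assumptions, not taken from the patent):

```python
import math

def tilt_to_axes(ax, ay, az, max_tilt_deg=45.0):
    """Map hand pitch/roll, computed from the gravity vector, to axes in [-1, 1]."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    clamp = lambda v: max(-1.0, min(1.0, v / max_tilt_deg))
    return clamp(pitch), clamp(roll)

# Hand tilted forward by about 20 degrees: like pushing a joystick forward.
print(tilt_to_axes(ax=-3.35, ay=0.0, az=9.2))
```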
  • [0044]
In addition, the accelerometer provides a reliable and precise value for the acceleration of the hand 150 when wearing the device 100. This reliable and precise value can be used to validate what a computerised system (not shown) presumes to be the hand. This presumption may be obtained by using an inverse kinematic-based user skeleton fitting process to identify parts of the user, and in particular, to identify the position of the hands over time. The spatio-temporal properties of the tracked hand may then be compared to the device data provided with respect to the position of the wearing hand over time in order to refine identification and accuracy of the data. In particular, the acceleration value provided by the device may be used to extrapolate the position of the hand when it is no longer visible to a 3D camera forming part of the computerised system, due to occlusion by another object or part of the body, or because the hand is out of the field of view of the 3D camera.
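A minimal sketch of this extrapolation, assuming gravity-compensated acceleration from the band and hypothetical names (double integration drifts quickly, so a real system would re-anchor on the camera as soon as tracking resumes):

```python
# Sketch: dead-reckon the hand position from device acceleration while the
# 3D camera cannot see the hand (occlusion or out of field of view).
class OcclusionTracker:
    def __init__(self):
        self.pos = None
        self.vel = (0.0, 0.0, 0.0)

    def update(self, camera_pos, accel, dt=1 / 60):
        if camera_pos is not None:
            if self.pos is not None:
                # Velocity estimated from successive camera measurements.
                self.vel = tuple((c - p) / dt for c, p in zip(camera_pos, self.pos))
            self.pos = camera_pos
        else:
            # Hand not visible: integrate the band's acceleration instead.
            self.vel = tuple(v + a * dt for v, a in zip(self.vel, accel))
            self.pos = tuple(p + v * dt for p, v in zip(self.pos, self.vel))
        return self.pos

t = OcclusionTracker()
t.update(camera_pos=(0.00, 1.0, 2.0), accel=(0.0, 0.0, 0.0))
t.update(camera_pos=(0.01, 1.0, 2.0), accel=(0.0, 0.0, 0.0))
print(t.update(camera_pos=None, accel=(0.5, 0.0, 0.0)))  # extrapolated position
```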
  • [0045]
Furthermore, accelerometer information that is passed to the computerised system may be combined with the identification and tracking of parts of the user so as to automatically load functions and/or features for each control button according to the hand on which the control device is located. For example, in a two-device interactive application, the device on the right hand may be already configured for the particular application and the device on the left hand may also be configured in accordance with accelerometer information provided from the control device on the left hand of the user. When using two devices, the control means, that is, the buttons of each device, may be configured according to the hand on which they are located. This is performed automatically when switching devices from the left hand to the right hand by comparing accelerometer and/or gyroscopic data from the device with spatio-temporal data for the left and right hands determined by the gesture recognition system.
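One simple way such an assignment could be computed, offered as an assumption-laden sketch rather than the patent's actual algorithm, is to correlate the magnitude of the device's acceleration with the acceleration of each camera-tracked hand over a short time window:

```python
def correlation(xs, ys):
    # Pearson correlation of two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def assign_device(device_accel, left_hand_accel, right_hand_accel):
    """Return the hand whose tracked motion best matches the device's motion."""
    left = correlation(device_accel, left_hand_accel)
    right = correlation(device_accel, right_hand_accel)
    return "left" if left > right else "right"

device = [0.1, 0.9, 0.2, 1.4, 0.3]         # |a| from the band, per frame
left = [0.2, 1.0, 0.1, 1.5, 0.2]           # |a| of tracked left hand
right = [0.0, 0.1, 0.1, 0.2, 0.1]          # |a| of tracked right hand
print(assign_device(device, left, right))  # -> 'left'
```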
  • [0046]
A transceiver unit (also not shown) is also embedded inside the band 110 for transmitting control signals generated by the control buttons 120, 130, 140 to the associated computerised system. In addition, signals may also be received from the computerised system. Electromagnetic radiation in a suitable frequency band can be used for transmitting signals between the controller device 100 and its associated computerised system.
  • [0047]
As shown in Figure 1, the fastening 160 is preferably located on the back of the hand 150 and, as shown in Figure 2, the control buttons 120, 130, 140 are located in the palm of the hand 150. It will be appreciated, however, that it may not be necessary to have a fastening 160 if the band 110 is made of at least one suitable material that allows the band to be readily and easily placed onto and removed from the hand. In addition to the control buttons 120, 130, 140, a feedback element (not shown) may also be located on the palm side of the device 100 to mitigate the lack of tactile sensation due to the contactless nature of the interaction with the 3D camera. The feedback element may comprise a vibrator that buzzes or vibrates as required, for example, to confirm that an object has been contacted with a virtual representation of the hand of a user in a virtual environment of a game running on the computerised system. Additionally, the feedback element may comprise an acoustic device that provides an audio feedback signal for the user. In addition, other forms of feedback can be implemented, either as alternatives or as additions. These may include thermal, electrical and/or contact arrangements.
  • [0048]
In addition to having control buttons 120, 130, 140 located on the palm side of the band 110, as shown in Figure 2, other control means or buttons (not shown) may be provided on the back side of the band for operation, for example, by the other hand. Other control means or buttons may also be provided between the back and the palm side of the device for operation, for example, with the thumb.
  • [0049]
Each control button 120, 130, 140 may be an analogue button having a number of states which are controlled by the pressure of a finger of a user on the button. Alternatively, each control button 120, 130, 140 may comprise a binary state button which operates between an "on" state and an "off" state when depressed by a finger of a user. The control buttons 120, 130, 140 may also provide continuous signals representing pressure and/or distance as described below with reference to Figure 3. In addition, biometric sensors may be implemented on the band, for example, to monitor heart rate and galvanic skin response.
  • [0050]
As an alternative to discrete control buttons, a continuous sensing surface may be provided, for example, a control pad. Such a control pad (not shown) may be provided for operation by the thumb of the user, the control pad being located in a position on the band that can be accessed by the thumb whether the user is right- or left-handed.
  • [0051]
    In addition, the buttons may be pressure-, flex-, capacitive- and/or distance-sensors which provide continuous data instead of needing to be operated by a user to provide data.
  • [0052]
    In addition to the control means (that is, the buttons, accelerometers etc.), the device 100 also includes electronic parts, for example, a battery, a microcontroller, and a wireless module as will be described in more detail with reference to Figure 3 below. All parts not directly used for the interface with the user are preferably located on the back of the band 110 so that they do not clutter the palm area which is used for providing the interaction with the computerised system.
  • [0053]
    In addition, input/output ports may be provided which permit the connection of additional sensors and/or actuators to the control device. For example, an add-on flex sensor may be provided to fit around the first or second finger which provides additional control data or an additional resistance to provide additional contextual feedback.
  • [0054]
The band 110 may comprise silicone which is flexible and readily conforms to the shape of the hand 150. In addition, the band 110 may be shaped so that it can only be worn in one particular orientation, so that the buttons, if their functions are predefined, can be used whether the band is placed on the right hand or the left hand. More generally, the band 110 is preferably symmetrical about at least one plane so that the band can be worn on either the right or the left hand to provide ambidextrous use and/or to allow the user to have a band on each hand for more complex interactions. One advantage of using a type of silicone material for the band is that it provides grip with the hand of the user so that the control buttons remain in the palm of the hand for easier operation.
  • [0055]
    Alternatively, the band 110 may be made of any other suitable material which can house the control means, the feedback means and any other necessary electronics for the operation of the device 100. The band 110 may comprise different materials utilised for their specific properties, for example, a portion of the band 110 to be located in the palm of the hand 150 may be of silicone to provide enhanced grip, a portion of the band 110 adjacent the silicone may be elastic so that the fit can be adjusted, and the fastener 160 may comprise Velcro so that the fit can be adjusted and for ease of locating the device 100 on the hand 150.
  • [0056]
    In use, a user places the device 100 over his/her hand 150 and adjusts it for a comfortable fit using the two-part fastener 160. Ideally, the fit of the device 100 on the hand 150 is a balance between the user holding the device and wearing it, that is, the user has to exert no effort in maintaining the device 100 in a position where the buttons 120, 130, 140 can easily be operated. Moreover, it will be appreciated that the device 100 is not a glove and does not need to be held for operation. Furthermore, the device can readily be applied to and removed from the hand.
  • [0057]
Wearing the device on one hand, the user may interact with a computerised system associated with gesture recognition means including a 3D imaging system, such as a depth-sensing 3D camera, and an application for providing contextual instructions and feedback information. In one embodiment, the user can be represented by an avatar on a screen in the centre of a virtual environment, and is free to act and move within the frustum of the 3D camera. Movement and gestures of the user are recognised by software operating in the computerised system. Feedback is provided to the user, for example, in the form of a vibration to indicate when the hand of the user has touched a virtual object in the virtual environment as displayed on a screen.
  • [0058]
    In its simplest embodiment, the band comprises a single button which, when depressed by a user, automatically sends a signal to the associated computerised system. In this particular case, it is possible to adjust the position of the button relative to the palm of the hand for operation by a preferred finger of the user, for example, one of the first or second fingers.
  • [0059]
    In another embodiment, two long buttons may be provided so that they can be operated by different or by the same fingers according to the preference of the user. Whilst the hardware set up of the device may be fixed, due to the ergonomics of the device, the two buttons may be located in positions where they can be operated naturally by the user moving one or two of the three most used fingers, namely, the first and second fingers and the thumb. For example, the first button, that is, the one that is nearest the first finger, may be used by the thumb and the other button may be used by the first or second finger.
  • [0060]
    In a more advanced embodiment where there is more than one button, it is possible to align the buttons for fingers of the user. Here, it will be appreciated that more than three buttons may be provided as described above with reference to Figures 1 and 2. In addition, there may be more than one row of buttons.
  • [0061]
It will be appreciated that, apart from the simplest embodiment where the function of the button is predefined, the function of each button may not be fixed and may be user- or computer-system-definable. In a more preferred embodiment, the computer system may include means for automatically and dynamically pre-setting the control means, especially the buttons, according to the application operated by the computerised system, or according to one particular context in the application being run by the computerised system.
  • [0062]
    In another embodiment, the accelerometer can be used to determine the vertical direction due to gravity and the functionality of each button is assigned accordingly.
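A minimal sketch of this gravity-based assignment, under the purely illustrative assumption of a two-button band whose layout is mirrored when the palm faces up rather than down:

```python
def assign_buttons(gravity_z):
    """gravity_z: gravity component along the band's palm-normal axis (m/s^2)."""
    if gravity_z < 0:   # palm facing down
        return {"button_1": "select", "button_2": "back"}
    else:               # palm facing up: mirror the layout
        return {"button_1": "back", "button_2": "select"}

print(assign_buttons(-9.8))  # palm down -> default layout
```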
  • [0063]
    In a preferred embodiment, the device may be used in a virtual environment in which a user can be represented by an avatar standing in the scene. The user can act and move freely within the frustum of the 3D camera associated with the computerised system so that his/her movement and/or gestures can be recognised and tracked by the software run in the computerised system, and then associated with the control device 100 inputs in accordance with the present invention. To make the avatar move and interact within the scene, the user uses the features provided by the control device 100 and also those provided by gesture recognition software. Some features that can be implemented by the buttons 120, 130, 140 include:-
    • When a first button is depressed, the orientation of the device 100 is registered with the associated computerised system and, from there, changes in direction control the speed of translation, forwards or backwards, and the rotation speed to the left or right, as if the device 100 were a joystick in the hand of the user. When the button is released, the navigational control provided by the device 100 is paused so that movements of the user are used to interact directly with the virtual environment.
    • When a second button is depressed, using a similar mechanism to the first button, the user can control the rotation of the 3D camera around the avatar in the virtual environment.
    • When a third button is depressed, a "drag and drop" function can be implemented. Here, the hand of the user on which the device 100 is placed moves towards the nearest "physical" object, that is, the nearest object in the virtual representation, and grabs it, the virtual object following the movements and direction of the hand as long as the button is not released. When the button is released, the object is dropped at the location of the hand of the user, with respect to the virtual environment, at the time the button is released.
  • [0064]
    This particular function provides the ability to pick up, carry, move and launch virtual objects within the virtual environment.
  • [0065]
    When a collision occurs between an object in the scene and the avatar representation of the user, the vibrator on the palm of the device vibrates with an intensity which may be proportional to the intensity of the virtual shock or proportional to the hardness of the material with which the object is supposed to have collided. Similarly, increased vibration can be used to relay information to the user about the weight of a virtual object.
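A hedged sketch of how such contextual haptic feedback might be encoded as a command for the band (the field names and scaling constants are assumptions for illustration only):

```python
def collision_feedback(impulse, hardness, max_impulse=10.0):
    """Map a virtual collision to a vibration intensity in [0, 1] and a duration."""
    intensity = min(1.0, (impulse / max_impulse) * hardness)
    duration_ms = int(50 + 200 * intensity)  # harder impacts buzz longer
    return {"command": "vibrate", "intensity": intensity, "duration_ms": duration_ms}

print(collision_feedback(impulse=6.0, hardness=0.8))
```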
  • [0066]
It will be appreciated that it is important that the hand on which the control device 100 is placed can be detected within the scene. This can be done using any skeletal representation of a user having a portion that corresponds to the hand. The location of the hand in virtual space can be improved using the output from the three-axis accelerometer and, additionally, the output of a gyroscope. Instead of using a skeletal representation, the movement of any feature of the scene identified by means of image processing, for example, extremities, coloured objects, specific shapes, body parts etc., all being within the frustum of the 3D camera, can be compared with the acceleration of the device 100 in order to evaluate the probability that the device is attached to that identified feature.
  • [0067]
In another preferred embodiment, a gyroscope may be provided within the device 100 to provide information that improves orientation detection of the body part, and in particular the hand, to which the device is attached.
  • [0068]
In another preferred embodiment, the gyroscope may be replaced by a compass in order to provide absolute orientation measurements instead of relative orientation measurements. These absolute measurements are likewise used for improving orientation detection of the body part, and in particular the hand, to which the device is attached. Alternatively, the compass may be provided in addition to the gyroscope.
  • [0069]
In addition, multiple control devices can be used at the same time. For example, a single user may use a control device on each hand, or multiple users may each use a control device on one hand. In these cases, the communication protocol used needs to be able to distinguish the identity or individual address of each control device.
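The patent does not specify a wire format, but the following sketch shows one way a packet could carry a device identifier so that the computerised system can tell several control devices apart (the layout is an assumption for illustration):

```python
import struct

# Assumed layout: [device_id: u8][seq: u16][buttons: u8 bitmask][ax, ay, az: 3 x i16]
PACKET = struct.Struct("<BHB3h")

def encode(device_id, seq, buttons, accel_mg):
    return PACKET.pack(device_id, seq, buttons, *accel_mg)

def decode(raw):
    device_id, seq, buttons, ax, ay, az = PACKET.unpack(raw)
    return {"device_id": device_id, "seq": seq,
            "buttons": buttons, "accel_mg": (ax, ay, az)}

pkt = encode(device_id=2, seq=1001, buttons=0b101, accel_mg=(12, -980, 40))
print(decode(pkt))  # the device_id field distinguishes bands/players
```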
  • [0070]
Turning now to Figure 3, a block diagram of a control system 300 for an interactive games console or computer system (not shown) is shown. The control system 300 comprises a computerised system 310 which is housed in the interactive games console and which comprises an electronics module 320. The electronics module 320 comprises a microcontroller 330 and a transceiver 340. The transceiver 340 operates using one or more of Bluetooth ™ (trademark of the Bluetooth Special Interest Group), Wi-Fi ™ (trademark of the Wi-Fi Alliance), ZigBee ™ (trademark of the ZigBee Alliance), or other radio frequency (RF) bands. [Bluetooth operates in the Industrial, Scientific & Medical (ISM) band of 2.4 to 2.48 GHz; Wi-Fi operates in accordance with IEEE 802.11; ZigBee operates in accordance with IEEE 802.15.4.] Naturally, the transceiver 340 may operate using other communication systems, for example, visible light or infrared radiation. It will be appreciated that the transceiver 340 may operate using more than one type of radiation and/or RF band, in which case dedicated transmit and receive modules are provided for each type of radiation and/or RF band.
  • [0071]
    The transceiver 340 is connected to and controlled by the microcontroller 330 for two-way communication, as indicated by arrow 335, and transmits signals to, and receives signals from, a control device 350 in accordance with the present invention. The control device 350 corresponds to the device 100 described above with reference to Figures 1 and 2.
  • [0072]
    The control device 350 comprises three groups of units, namely, the sensing unit 360, the driving unit 370, and the feedback unit 380. The sensing unit 360 comprises the control means, that is, an inertial sensing unit 400, a binary states pad 410, an analogue pad 420, a binary joystick 430, an analogue joystick 440, a binary states button 450 and an analogue button 460. Although only one of each of these elements is shown, it will be appreciated that more than one may be present in the control device 350 according to one particular configuration.
  • [0073]
    The inertial sensing unit 400 may include at least a two-axis accelerometer. In a preferred embodiment, the inertial sensing unit may include a single three-axis accelerometer as described above, or three individual accelerometers, each of which is aligned with a respective one of the x-, y- and z-axes. Alternatively or additionally, the inertial sensing unit 400 may also comprise gyroscopic elements aligned with respective ones of the x-, y- and z-axes.
  • [0074]
    The binary states pad 410 and the analogue pad 420 are essentially pressure-sensitive input devices having at least one sensitive direction in a plane, for example, a north, a south, an east and a west direction, with a central push or binary state. Such pads are known as directional pads, joypads or D-pads.
  • [0075]
    In the case of the binary states pad 410, the pad may comprise a continuous flat-form button in which the data value provided is digital and corresponds to the direction chosen by the user; it can also provide a simple "on" or "off" state.
  • [0076]
    In the case of the analogue pad 420, the pad comprises a continuous multi-directional 2D flat-form button in which the data value provided is analogue and again proportional to the direction chosen by the user, the data value being continuous in each direction depending on how far away the finger of the user is from the central position in a particular direction.
  • [0077]
    In addition, the binary states pad 410 and the analogue pad 420 may be cross-faders or multi-directional pads in which intermediate states are provided.
  • [0078]
    The binary joystick 430 and the analogue joystick 440 operate in a similar way to the pads described above, but each comprises a surface having a protruding element, for example, a stick, which is perpendicular to the surface. As described above with reference to the binary states pad 410, the binary joystick 430 provides a data value which is digital and corresponds to the direction set by the user. Similarly, the analogue joystick 440 provides a data value which is analogue and proportional to the distance from a central position in a particular direction as set by the user.
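    As a sketch of how such proportional readings might be normalised in firmware (the 10-bit ADC range and the deadzone width are illustrative assumptions):

        def read_analogue_axis(raw, raw_min=0, raw_max=1023, deadzone=0.1):
            # Convert a raw ADC sample into a value in [-1.0, 1.0],
            # proportional to the displacement from the central position.
            centre = (raw_min + raw_max) / 2.0
            span = (raw_max - raw_min) / 2.0
            value = (raw - centre) / span
            if abs(value) < deadzone:          # suppress jitter around centre
                return 0.0
            sign = 1.0 if value > 0 else -1.0  # rescale so full deflection
            return sign * (abs(value) - deadzone) / (1.0 - deadzone)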
  • [0079]
    The binary states button 450 comprises at least one button which can be operated between an "on" state and an "off" state by a finger of the user. One or more of the buttons 120, 130, 140 described above with reference to Figure 2 may be a binary states button.
  • [0080]
    The analogue button 460 comprises at least one button which can be operated through a number of different states by pressure applied by a finger of a user. As before, one or more of the buttons 120, 130, 140 described above with reference to Figure 2 may be an analogue button.
  • [0081]
    Connected to the sensing unit 360 is the driving unit 370. The driving unit 370 includes a power supply 470 which is connected to an electronics module 480 as indicated by arrow 475. The power supply 470 provides power to components within the sensing unit 360, as indicated by arrow 490, and also to components within the feedback unit 380, as indicated by arrow 495, as will be described below. The power supply 470 may be a battery or other portable electrical supply which may be rechargeable, for example, using induction, by means of a direct connection to a suitable mains supply, or by means of a USB connection to the computerised system or other suitable computer device. The battery may also be recharged by movement of the user or by connection to a photovoltaic cell. In one embodiment, the control device 350 may also include a photovoltaic cell (not shown) through which the battery is constantly charged provided the light levels are sufficiently high.
  • [0082]
    The electronics module 480 includes a microcontroller 500, a transceiver 510, and an input/output (I/O) module 520. The microcontroller 500 is connected to both the transceiver 510 and the I/O module 520 as indicated by respective arrows 515 and 525 as shown. These connections are two-way connections so that information can be passed from the transceiver 510 and the I/O module 520 to the microcontroller 500 and from the microcontroller 500 to the transceiver 510 and the I/O module 520 as required. The microcontroller 500 is also connected, as indicated by arrows 365, 385 to the sensing unit 360 and the feedback unit 380 via suitable interfaces (not shown) within the sensing and feedback units 360, 380. Each interface connects to a bus (also not shown) for transmitting signals to and from each of the components 400, 410, 420, 430, 440, 450, 460 within the sensing unit 360 and the components within the feedback unit 380 as will be described in more detail below.
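    The following sketch suggests how the microcontroller 500 might tie these connections together in a polling loop; the sensing, transceiver and feedback objects and their methods are assumptions standing in for the interfaces and buses described above:

        import json, time

        def control_loop(sensing, transceiver, feedback, device_id):
            while True:
                # Poll every control element on the sensing-unit bus.
                state = {
                    "id": device_id,
                    "accel": sensing.read_accelerometer(),  # (x, y, z)
                    "buttons": sensing.read_buttons(),      # bit mask of states
                    "pad": sensing.read_pad(),              # (x, y) in [-1, 1]
                }
                transceiver.send(json.dumps(state).encode())

                # Apply any pending feedback command from the computerised
                # system, e.g. a vibration level or an LED colour.
                command = transceiver.poll()
                if command is not None:
                    feedback.apply(command)

                time.sleep(0.01)  # ~100 Hz polling rate (illustrative)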
  • [0083]
    The I/O module 520 may include a USB port for connecting the device 350 directly to the computerised system 310. It may also include a charging point for charging the power supply 470 if the power supply comprises a rechargeable battery unit. In addition, other sensors may be attached to the I/O module 520 to provide further information relating to the movement and/or position of the hand of the user.
  • [0084]
    The feedback unit 380 comprises one or more of: a speaker module 550, a vibrator module 560, a display screen module 570, a light-emitting diode (LED) module 580, and a matrix touch feedback module 590. It will be appreciated that the number of feedback components provided on the device 100 will depend on the types of feedback to be provided. As described above, the vibrator module 560 is a preferred component for providing tactile feedback to a user.
  • [0085]
    The speaker module 550 comprises at least one speaker element for providing audio feedback to a user. The display screen module 570 and the LED module 580 provide visual feedback for the user, the display screen module 570 having the ability to display more detailed information than the LED module 580. In one preferred embodiment, the LED may provide power supply status information, changing colour from green (fully charged) to yellow (50% charge, for example) to red (in need of recharging).
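    A minimal sketch of that indication logic follows (the 20% threshold for red is an assumption; the text only specifies 50% for yellow):

        def battery_led_colour(charge):
            # charge is the remaining fraction of capacity, 0.0 to 1.0
            if charge > 0.5:
                return "green"
            if charge > 0.2:
                return "yellow"
            return "red"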
  • [0086]
    The matrix touch feedback module 590 comprises an array of actuators, for example, vibrating elements or electrodes, which provides feedback in the form of a tactile sensation. In the control device 100 described above with reference to Figures 1 and 2, this module is positioned in an area where it rests against the skin of the user to stimulate tactile sensations.
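    As an illustrative sketch, a simple "pressure point" pattern for such an actuator array might be computed as follows (the grid geometry and linear falloff are assumptions):

        def radial_tactile_pattern(rows, cols, cx, cy, radius):
            # Per-actuator drive levels in [0.0, 1.0]: strongest at the
            # centre (cx, cy), fading linearly to zero at the given radius.
            pattern = []
            for r in range(rows):
                row = []
                for c in range(cols):
                    dist = ((r - cy) ** 2 + (c - cx) ** 2) ** 0.5
                    row.append(max(0.0, 1.0 - dist / radius))
                pattern.append(row)
            return pattern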
  • [0087]
    Whilst the control device of the present invention has been described with reference to specific embodiments, it will be appreciated that other implementations of the control device are also possible.

Claims (24)

  1. A user-operated control device (100; 350) for a three-dimensional imaging system forming part of an interactive computerised system (310), the three-dimensional imaging system including a three-dimensional camera and means for processing three-dimensional information of at least part of a user present within its frustum, the control device comprising:-
    a housing (110);
    at least one sensing module (360) including at least one control button (120, 130, 140; 410, 420, 430, 440, 450, 460) mounted in the housing (110), each control button (120, 130, 140; 410, 420, 430, 440, 450, 460) being capable of generating a control signal for the computerised system (310) when operated;
    a feedback module (380, 550, 560, 570, 580, 590) mounted in the housing (110) for providing contextual feedback to the user in accordance with an application operating on the computerised system (310);
    a communication module (510) located within the housing (110) and through which each control signal is transmitted to, and each feedback signal is received from, the computerised system (310); and
    a controller module (500) mounted in the housing (110), the controller (500) controlling the sensing module (360), the communication module (510) and the feedback module (380, 550, 560, 570, 580, 590) in accordance with each generated control signal;
    characterised in that the housing (110) comprises an ergonomic body having an adjustable band mountable on an extremity of the user and allowing gesture interactions with the interactive computerised system (310) whilst in the frustum of the three-dimensional imaging system.
  2. A control device according to claim 1, wherein the adjustable band (110) has a first surface which defines a palm side, each control button being mounted on the palm side of the adjustable band and operated by at least one finger of the user.
  3. A control device according to claim 1 or 2, wherein the adjustable band (110) has a second surface which defines a back side, the back side having at least one control button to be operated by other fingers of the user.
  4. A control device according to claim 3, wherein a portion is defined between the first surface and the second surface, the portion including at least one control button to be operated by a thumb of the user.
  5. A control device according to any one of the preceding claims, wherein the feedback module (380) comprises a vibrator element (560).
  6. A control device according to claim 5, wherein the vibrator element is located adjacent a surface of the adjustable band (110) which is closer to the user when worn.
  7. A control device according to claim 5 or 6, wherein the feedback module (380) further comprises at least one of: a speaker module (550); a display screen (570); a light-emitting diode device (580); and a matrix touch feedback module (590).
  8. A control device according to any one of the preceding claims, wherein at least a portion of the adjustable band (110) comprises a flexible material.
  9. A control device according to claim 8, wherein the flexible material comprises an elastic material.
  10. A control device according to any one of the preceding claims, wherein the adjustable band (110) comprises a two-part adjustable fastening (160), each part of the two-part adjustable fastening (160) being located at respective ends (170, 180) of the adjustable band (110).
  11. A control device according to claim 10, wherein the two-part adjustable fastening (160) comprises a hook and loop fastener.
  12. A control device according to claim 10, wherein the two-part adjustable fastening (160) comprises one of: a magnetic closure and a mechanical closure.
  13. A control device according to any one of the preceding claims, wherein the sensing module (360) further comprises at least one of: a microphone for providing voice commands to the computerised system; an inertial sensing unit (400), a binary states pad (410), an analogue pad (420), a binary joystick (430), an analogue joystick (440), a binary states button (450), an analogue button (460), an optical sensor and a compass.
  14. A control device according to any one of the preceding claims, wherein the device is ergonomic and symmetrical about a plane, allowing both hand gesture and finger gesture interaction whilst being ambidextrous-compliant.
  15. A control device according to claim 14, wherein the ergonomics and design of the device do not require it to be actively and voluntarily held with a hand, while still allowing free natural movements.
  16. A method of controlling an interactive computerised system (310) having a three-dimensional imaging system, the three-dimensional imaging system including a three-dimensional camera for providing three-dimensional information of at least one part of a user present within its frustum, the method comprising the steps of:
    a) providing a control device (100; 350) for use by the user, the control device being in accordance with any one of the preceding claims;
    b) generating control signals from at least one control button (120, 130, 140; 410, 420, 430, 440, 450, 460) on the control device; and
    c) using the generated control signals to control the interactive computerised system (310).
  17. A method according to claim 16, further comprising the step of:
    d) providing contextual feedback to the user in accordance with an application running on the computerised system (310).
  18. A method according to claim 16 or 17, further comprising the step of:
    e) using the control device (100; 350) to provide information relating to at least the part of the user with which the control device (100; 350) is associated within the frustum of the three-dimensional camera.
  19. A method according to claim 18, wherein step e) comprises providing spatio-temporal related information relating to the part of the user with which the control device (100; 350) is associated.
  20. A method according to claim 19, wherein the spatio-temporal related information comprises at least one of: status, acceleration, orientation, and position of the part of the user with which the control device (100; 350) is associated.
  21. A method according to any one of claims 16 to 20, further comprising the step of using the information provided by the device and relating to the portion of the user to resolve ambiguities in information measured by the three-dimensional imaging system.
  22. A gesture-based interactive computerised system comprising:-
    a three-dimensional imaging system;
    a gesture recognition system;
    a control device (100; 350) according to any one of claims 1 to 15; and
    a computerised system (310) associated with the three-dimensional imaging system and the gesture recognition system, the computerised system (310) processing images captured by the three-dimensional imaging system and gestures recognised by the gesture recognition system, the computerised system (310) using data from the control device (100; 350) in conjunction with the gestures recognised by the gesture recognition system.
  23. A method of controlling an interactive computerised system (310) comprising the steps of:
    a) providing a control device (100; 350) for use by a user, the control device being in accordance with any one of claims 1 to 15;
    b) generating control signals from at least one control button (120, 130, 140; 410, 420, 430, 440, 450, 460) on the control device; and
    c) using the generated control signals to control the interactive computerised system (310).
  24. A method according to claim 23, further comprising a step of:
    d) providing contextual feedback to the user in accordance with an application running on the computerised system (310).
EP20120000116 2012-01-09 2012-01-09 System and method for enhanced gesture-based interaction Pending EP2613223A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP20120000116 EP2613223A1 (en) 2012-01-09 2012-01-09 System and method for enhanced gesture-based interaction

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
EP20120000116 EP2613223A1 (en) 2012-01-09 2012-01-09 System and method for enhanced gesture-based interaction
CN 201380004925 CN104094196B (en) 2012-01-09 2013-01-09 Systems and methods for enhanced gesture-based interactions
PCT/EP2013/050318 WO2013104681A1 (en) 2012-01-09 2013-01-09 System and method for enhanced gesture-based interaction
KR20147020592A KR101666096B1 (en) 2012-01-09 2013-01-09 System and method for enhanced gesture-based interaction
JP2014550738A JP5969626B2 (en) 2012-01-09 2013-01-09 System and method for enhanced gesture-based interactions
US14370590 US9360944B2 (en) 2012-01-09 2013-01-09 System and method for enhanced gesture-based interaction

Publications (1)

Publication Number Publication Date
EP2613223A1

Family

ID=47563483

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20120000116 Pending EP2613223A1 (en) 2012-01-09 2012-01-09 System and method for enhanced gesture-based interaction

Country Status (6)

Country Link
US (1) US9360944B2 (en)
JP (1) JP5969626B2 (en)
KR (1) KR101666096B1 (en)
CN (1) CN104094196B (en)
EP (1) EP2613223A1 (en)
WO (1) WO2013104681A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9886093B2 (en) * 2013-09-27 2018-02-06 Apple Inc. Band with haptic actuators
US20150220145A1 (en) * 2014-01-07 2015-08-06 Nod Inc. Methods and Apparatus for Recognition of Start and/or Stop Portions of a Gesture Using an Auxiliary Sensor
WO2015105919A3 (en) * 2014-01-07 2015-10-01 Nod, Inc. Methods and apparatus recognition of start and/or stop portions of a gesture using an auxiliary sensor and for mapping of arbitrary human motion within an arbitrary space bounded by a user's range of motion
US9606584B1 (en) 2014-07-01 2017-03-28 D.R. Systems, Inc. Systems and user interfaces for dynamic interaction with two- and three-dimensional medical image data using hand gestures
CN104503589A (en) * 2015-01-05 2015-04-08 京东方科技集团股份有限公司 Somatosensory recognition system and recognition method
CN105045388A (en) * 2015-07-07 2015-11-11 安徽瑞宏信息科技有限公司 Gesture and action based interactive control method
CN104991650A (en) * 2015-07-24 2015-10-21 贺杰 Gesture controller and virtual reality system
CN205427764U (en) * 2015-10-19 2016-08-03 北京蚁视科技有限公司 Handle type gesture recognition device
WO2017184785A1 (en) * 2016-04-19 2017-10-26 Scott Summit Virtual reality haptic system and apparatus

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1084385A (en) 1913-06-05 1914-01-13 Noiseless Typewriter Co Type-writing machine.
EP1127303A2 (en) * 1998-10-28 2001-08-29 ViA, Inc. Flexible user interface device and method
JP2000298544A (en) * 1999-04-12 2000-10-24 Matsushita Electric Ind Co Ltd Input/output device and its method
JP3487237B2 (en) * 1999-08-27 2004-01-13 日本電気株式会社 Pointing device and a computer system using the same
JP2001142605A (en) * 1999-11-10 2001-05-25 Shimadzu Corp Input system
WO2001069365A1 (en) * 2000-03-13 2001-09-20 Ab In Credoble Gesture recognition system
US7042442B1 (en) * 2000-06-27 2006-05-09 International Business Machines Corporation Virtual invisible keyboard
WO2003025734A1 (en) * 2001-09-14 2003-03-27 Digityper Ab A portable unit for inputting signals to a peripheral unit, and use of such a unit
US7102615B2 (en) * 2002-07-27 2006-09-05 Sony Computer Entertainment Inc. Man-machine interface using a deformable device
JP4058494B2 (en) * 2003-07-14 2008-03-12 株式会社クリエイション・ルネ With magnet band
WO2006078604B1 (en) * 2005-01-18 2007-06-21 Rallypoint Inc Sensing input actions
JP2006209652A (en) * 2005-01-31 2006-08-10 Alps Electric Co Ltd Input device and electronic equipment
JP2006312346A (en) * 2005-05-06 2006-11-16 Nissan Motor Co Ltd Command input device
EP2347320A1 (en) * 2008-10-27 2011-07-27 Sony Computer Entertainment Inc. Control device for communicating visual information
US20100177039A1 (en) * 2009-01-10 2010-07-15 Isaac Grant Finger Indicia Input Device for Computer
JP5263833B2 (en) * 2009-05-18 2013-08-14 国立大学法人 奈良先端科学技術大学院大学 Ring interface, the interface device for use in the wearable computer, and interface methods
JP2011186693A (en) * 2010-03-08 2011-09-22 Brother Industries Ltd Information input apparatus
JP2011239279A (en) * 2010-05-12 2011-11-24 Hitachi Consumer Electronics Co Ltd Remote control device and remote control method
US20130207890A1 (en) * 2010-10-22 2013-08-15 Joshua Michael Young Methods devices and systems for creating control signals
US20130069931A1 (en) * 2011-09-15 2013-03-21 Microsoft Corporation Correlating movement information received from different sources

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110269544A1 (en) 2002-07-18 2011-11-03 Sony Computer Entertainment Inc. Hand-held computer interactive device
US20070279380A1 (en) * 2006-05-31 2007-12-06 Bruno Rafael Murillo Computer input device
US20080084385A1 (en) * 2006-10-06 2008-04-10 Microsoft Corporation Wearable computer pointing device
US20110202306A1 (en) * 2008-08-25 2011-08-18 Universitat Zurich Prorektorat Mnw Adjustable Virtual Reality System
US20100144436A1 (en) * 2008-12-05 2010-06-10 Sony Computer Entertainment Inc. Control Device for Communicating Visual Information
US20100304868A1 (en) 2009-05-29 2010-12-02 Sony Computer Entertainment America Inc. Multi-positional three-dimensional controller
US20110310002A1 (en) * 2010-06-22 2011-12-22 Microsoft Corporation Free space directional force feedback apparatus

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BRANDON SHREWSBURY: "Providing Haptic Feedback Using the Kinect", DUNDEE, SCOTLAND, UK, 26 October 2011 (2011-10-26), XP055026739, Retrieved from the Internet <URL:http://delivery.acm.org/10.1145/2050000/2049628/p321-shrewsbury.pdf?ip=145.64.134.241&acc=ACTIVE SERVICE&CFID=81283363&CFTOKEN=20492987&__acm__=1336574869_34ff99db206e9e7fbc69b1290d64c957> [retrieved on 20120509], DOI: 10.1080/0963828 *
FRATI V ET AL: "Using Kinect for hand tracking and rendering in wearable haptics", WORLD HAPTICS CONFERENCE (WHC), 2011 IEEE, IEEE, 21 June 2011 (2011-06-21), pages 317 - 321, XP032008619, ISBN: 978-1-4577-0299-0, DOI: 10.1109/WHC.2011.5945505 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9864433B2 (en) 2012-07-13 2018-01-09 Softkinetic Software Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand
WO2015025277A1 (en) * 2013-08-21 2015-02-26 Navico Holding As Controlling marine electronics device
US9615562B2 (en) 2013-08-21 2017-04-11 Navico Holding As Analyzing marine trip data
US9439411B2 (en) 2013-08-21 2016-09-13 Navico Holding As Fishing statistics display
US9572335B2 (en) 2013-08-21 2017-02-21 Navico Holding As Video recording system and methods
US9596839B2 (en) 2013-08-21 2017-03-21 Navico Holding As Motion capture while fishing
US9507562B2 (en) 2013-08-21 2016-11-29 Navico Holding As Using voice recognition for recording events
EP2891950A1 (en) 2014-01-07 2015-07-08 Softkinetic Software Human-to-computer natural three-dimensional hand gesture based navigation method
WO2016116722A1 (en) * 2015-01-19 2016-07-28 Kurv Music Ltd. A hand-held controller for a computer, a control system for a computer and a computer system
US9836129B2 (en) 2015-08-06 2017-12-05 Navico Holding As Using motion sensing for controlling a display
EP3214529A1 (en) * 2016-03-04 2017-09-06 HTC Corporation Wireless control device, position calibrator and accessory

Also Published As

Publication number Publication date Type
KR20140128305A (en) 2014-11-05 application
JP5969626B2 (en) 2016-08-17 grant
CN104094196A (en) 2014-10-08 application
KR101666096B1 (en) 2016-10-13 grant
WO2013104681A1 (en) 2013-07-18 application
US9360944B2 (en) 2016-06-07 grant
CN104094196B (en) 2017-07-11 grant
US20140368428A1 (en) 2014-12-18 application
JP2015507803A (en) 2015-03-12 application

Similar Documents

Publication Publication Date Title
Templeman et al. Virtual locomotion: Walking in place through virtual environments
US20090122146A1 (en) Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US20070021208A1 (en) Obtaining input for controlling execution of a game program
US20080080789A1 (en) Object detection using video input combined with tilt angle information
US20090183193A1 (en) Gesture cataloging and recognition
US7683883B2 (en) 3D mouse and game controller based on spherical coordinates system and system for use
US20060264260A1 (en) Detectable and trackable hand-held controller
US20100004896A1 (en) Method and apparatus for interpreting orientation invariant motion
US20060282873A1 (en) Hand-held controller having detectable elements for tracking purposes
US20080174550A1 (en) Motion-Input Device For a Computing Terminal and Method of its Operation
US8743052B1 (en) Computing interface system
US20060287086A1 (en) Scheme for translating movements of a hand-held controller into inputs for a system
US20060287087A1 (en) Method for mapping movements of a hand-held controller to game commands
US20090048021A1 (en) Inertia sensing input controller and receiver and interactive system using thereof
Guo et al. Exploring the use of tangible user interfaces for human-robot interaction: a comparative study
US8310656B2 (en) Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
US20120038582A1 (en) Systems and Methods for Providing Haptic Feedback to Touch-Sensitive Input Devices
US20080098448A1 (en) 2008-04-24 Controller configured to track user's level of anxiety and other mental and physical attributes
US20080291160A1 (en) System and method for recognizing multi-axis gestures based on handheld controller accelerometer outputs
US20080084385A1 (en) Wearable computer pointing device
US20110234488A1 (en) Portable engine for entertainment, education, or communication
Perng et al. Acceleration sensing glove (ASG)
US20070149282A1 (en) Interactive gaming method and apparatus with emotion perception ability
US20070265075A1 (en) Attachable structure for use with hand-held controller having tracking ability
US20120127070A1 (en) Control signal input device and method using posture recognition

Legal Events

Date Code Title Description
AK Designated contracting states:

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent to

Extension state: BA ME

RBV Designated contracting states (correction):

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

17P Request for examination filed

Effective date: 20131231

17Q First examination report

Effective date: 20150306

RAP1 Transfer of rights of an ep published application

Owner name: SOFTKINETIC SOFTWARE