WO2003001340A2 - Gesture recognition system and method - Google Patents

Gesture recognition system and method

Info

Publication number
WO2003001340A2
WO2003001340A2 · PCT/US2002/020119
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
gestures
inertial data
class
inertial
Prior art date
Application number
PCT/US2002/020119
Other languages
French (fr)
Inventor
Kirill Mosttov
John Vermes
Original Assignee
Motion Sense Corporation
Priority date
Filing date
Publication date
Priority to US30058801P priority Critical
Priority to US60/300,588 priority
Application filed by Motion Sense Corporation filed Critical Motion Sense Corporation
Publication of WO2003001340A2 publication Critical patent/WO2003001340A2/en

Classifications

    • G06K9/6282 Tree-organised sequential classifiers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06K9/00335 Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; lip-reading
    • G06K9/224 Image acquisition using hand-held instruments generating sequences of position coordinates corresponding to handwriting, in three dimensions
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer

Abstract

A gesture-recognition system that has a plurality of inertial sensors to generate inertial data, a gesture discriminator, and two or more gesture interpreters. The gesture discriminator receives the inertial data and determines whether the inertial data represents a gesture from a first class of gestures. The first gesture interpreter receives the inertial data if the gesture discriminator determines that the inertial data represents a gesture from the class of gestures, whereas the second gesture interpreter receives the inertial data if the gesture discriminator does not determine that the inertial data represents a gesture from the class of gestures.

Description

GESTURE RECOGNITION SYSTEM AND METHOD

BACKGROUND

The present invention concerns gesture recognition technology, particularly for mobile microprocessor-based devices with motion-sensitive input and control mechanisms.

As mobile devices shrink, the surface area on the devices available for user input and control mechanisms also shrinks. Consequently, there is a growing need for control systems that are accurate and reliable, but do not require significant surface area on the device. One area of development is motion-sensing technology, such as gesture recognition technology.

In general, two techniques have been developed for motion-sensitive input or control devices: "external referenced" and "internal referenced". In an "internal referenced" device, the sensors that permit motion-awareness are fastened, integrated, or otherwise mechanically coupled to the device so that both the sensors and the device become a single inertial platform. As such, these devices sense their motion over time with respect to a reference frame that coincides with the reference frame of the device at some arbitrary start time. In contrast, in an "external referenced" device, at least part of the sensing apparatus is fixed outside of the device's inertial frame. As such, these devices typically measure their position or velocity relative to a reference frame outside the device, e.g., a fixed external point.

Internal referencing is useful for mobile devices such as cellular telephones, personal digital assistants (PDAs), handheld games, and cameras, since a basic requirement of these applications is location independence. In contrast, external referencing is acceptable for input devices that are connected to a desktop device, such as desktop game controllers.

Tilt sensors and inertial sensors (e.g., accelerometers and gyroscopes) are useful for mobile devices using an "internal referenced" control system. Such sensors have been decreasing in size and cost due to the increasing availability of micro-electromechanical systems (MEMS). Together with software and additional hardware, such sensors permit the user to employ gestures to control mobile electronic devices. A gesture in this case is a pre-determined motion of the entire device that activates a function of the application or provides it with control information.

It is well known in the art that reliable recognition of gestures is difficult. In addition, gesture recognition becomes increasingly difficult as the number of permissible gestures increases. Furthermore, an additional, qualitative difficulty arises when there are multiple classes or degrees of freedom of permissible gestures, for example planar vs. 3-D gestures. It should be understood that additional degrees of gesture freedom dramatically increase the difficulty of developing a robust recognition engine.

A straightforward response to this problem is to limit the universe of permissible gestures, in number and especially in the number of gesture classes. For example, the permissible motions could be limited to one dimension of tilt, or to motions associated with a specific activity, e.g., musical conducting.

More recently, attempts have been made to recognize motions from more than one class. For example, U.S. Patent No. 6,151,208 discloses a device that recognizes two classes of gestures: positional rotations of the hand as distinct from gestural rotations of the hand. Since a gestural rotation can be confused with multiple positional rotations, the device is implemented with "speculative execution of commands." With this solution, when the beginning of a rotation is detected, the device's state is saved. Commands called for by positional rotations are executed. If it is subsequently discovered that the rotation was actually a gesture, the device's state is rolled back and the function called for by the gesture is executed instead.

The speculative execution solution described has the weakness that redundant processing is involved. In addition, the requirement of being able to roll back state is a constraint on the types of functions that can be initiated. Still further, speculative execution becomes increasingly less suitable as the number of motion classes increases. There is a clear requirement in the art for a method that improves the recognition of inertial gestures when such gestures can be from different classes.

SUMMARY

In a first aspect, the invention is directed to a gesture-recognition system that has a plurality of inertial sensors to generate inertial data, a gesture discriminator, a first gesture interpreter and a second gesture interpreter. The gesture discriminator receives the inertial data and determines whether the inertial data represents a gesture from a first class of gestures. The first gesture interpreter receives the inertial data if the gesture discriminator determines that the inertial data represents a gesture from the first class of gestures, whereas the second gesture interpreter receives the inertial data if the gesture discriminator does not determine that the inertial data represents a gesture from the first class of gestures. The first gesture interpreter is configured to identify the inertial data as a particular gesture from the first class of gestures, and the second gesture interpreter is configured to identify the inertial data as a particular gesture from a second class of gestures.

Implementations of the invention may include one or more of the following features. The first class of gestures may be linear motions, such as linear reciprocal motions, or planar motions. If the first class of gestures is planar motions, the first gesture interpreter may comprise a handwriting recognition system. The second class of gestures may be tilt motions. The inertial sensors may comprise at least one gyroscope, accelerometer, or magneto-resistive sensor.

In another aspect, the invention is directed to a gesture-recognition system that has a plurality of inertial sensors to generate inertial data, a first gesture discriminator, a first gesture interpreter, a second gesture discriminator and a second gesture interpreter. The first gesture discriminator receives the inertial data and determines whether the inertial data represents a gesture from a first class of gestures. The first gesture interpreter receives the inertial data if the first gesture discriminator determines that the inertial data represents a gesture from the first class of gestures, and is configured to identify the inertial data as a particular gesture from the first class of gestures. The second gesture discriminator receives the inertial data if the first gesture discriminator does not determine that the inertial data represents a gesture from the first class of gestures, and determines whether the inertial data represents a gesture from a second class of gestures. The second gesture interpreter receives the inertial data if the second gesture discriminator determines that the inertial data represents a gesture from the second class of gestures, and is configured to identify the inertial data as a particular gesture from the second class of gestures.

Implementations of the invention may include one or more of the following features.
A third gesture discriminator may receive the inertial data if the second gesture discriminator does not determine that the inertial data represents a gesture from the second class of gestures, and may determine whether the inertial data represents a gesture from a third class of gestures. A third gesture interpreter may receive the inertial data if the third gesture discriminator determines that the inertial data represents a gesture from the third class of gestures, and may be configured to identify the inertial data as a particular gesture from the third class of gestures. A fourth gesture interpreter may receive the inertial data if the third gesture discriminator does not determine that the inertial data represents a gesture from the third class of gestures, and may be configured to identify the inertial data as a particular gesture from a fourth class of gestures.

In another aspect, the invention is directed to a gesture-recognition system with at least one discriminator to use inertial data to select one of a plurality of classes of gestures and a plurality of gesture interpreters. Each gesture interpreter is configured to identify a particular gesture from one of the plurality of classes of gestures.

In another aspect, the invention is directed to a gesture-controlled electronic device that has a plurality of inertial sensors to generate inertial data, a gesture- recognition system, and an application that receives a token from the gesture-recognition system. The gesture recognition system includes a gesture discriminator that receives the inertial data and determines whether the inertial data represents a gesture from a first class of gestures, a first gesture interpreter that receives the inertial data if the gesture discriminator determines that the inertial data represents a gesture from the first class of gestures, and a second gesture interpreter that receives the inertial data if the gesture discriminator does not determine that the inertial data represents a gesture from the first class of gestures. The first gesture interpreter is configured to identify the inertial data as a particular gesture from the first class of gestures, whereas the second gesture interpreter is configured to identify the inertial data as a particular gesture from a second class of gestures.

In another aspect, the invention is directed to a method of controlling a hand-held electronic device that runs an application. In the method, inertial data is generated with a plurality of inertial sensors embedded in the electronic device, a particular gesture is identified from the inertial data and a corresponding token is generated, and the token is sent to the application. The identifying step includes selecting one class of gestures from a plurality of classes of gestures based on the inertial data, and identifying a particular gesture from the selected class of gestures based on the inertial data.

In another aspect, the invention is directed to a method of recognizing a gesture by an electronic device. In the method, inertial data is generated with a plurality of inertial sensors embedded in the electronic device, one class of gestures is selected from a plurality of classes of gestures based on the inertial data, and a particular gesture from the selected class of gestures is identified based on the inertial data.

In another aspect, the invention is directed to a method of recognizing a gesture by an electronic device. The method includes generating inertial data with a plurality of inertial sensors embedded in the electronic device, and determining whether the inertial data was generated by a gesture from a first class of gestures. If so, the inertial data is matched to a particular gesture from the first class of gestures, and if not, the inertial data is matched to a particular gesture from a second class of gestures.

The present invention facilitates the recognition of complex inertial gestures in the form of intentional movements of a mobile microprocessor-controlled device.

BRIEF DESCRIPTION OF DRAWINGS

FIG. 1 illustrates a mobile microprocessor-based electronic device.

FIG. 2 is a schematic block diagram illustrating information flow through a mobile electronic device.

FIG. 3 is a block diagram illustrating the components of a parser according to the present invention.

FIG. 4 is a flow-chart representing the method performed by the parser of FIG. 3.

FIG. 5 is a schematic diagram of a gesture recognition system for a mobile electronic device.

FIG. 6 is a block diagram illustrating the components of a parser according to an alternative implementation of the present invention.

DETAILED DESCRIPTION

In the prior art, sensor data is presented directly to a recognition system block for processing. In contrast, in the present invention, a classification step takes place first, dividing the recognition problem into two tasks:

(1) Assign gesture data according to broad criteria to one of a plurality of predetermined classes.

(2) Interpret said gesture using the recognizer specifically designed for the assigned gesture class.

In the gesture classification step, the incoming stream of sensor data is examined for gross criteria, for example planarity or linearity, rather than for gesture identity. It should be noted that such criteria are not limited to these examples, but could refer to any criteria that differentiate gesture classes, rather than gestures themselves, from one another. The task of recognition is thus simplified, since recognition is carried out on pre-classified gestures by specialized recognizers specific to a given gesture's class. Also, advantageously, improvement of such specialized gesture interpreters is easier to carry out than improvement of the general recognizers present in the prior art.
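As a minimal sketch of this two-step scheme (the function names, the planarity threshold, and the class labels are illustrative assumptions, not taken from the patent), a discriminator might test a gross criterion such as confinement to a plane and then dispatch to a class-specific recognizer:

```python
def classify(samples):
    # samples: list of (ax, ay, az) acceleration triples.
    # Gross criterion (illustrative): if the out-of-plane (Z) component
    # barely varies, assign the gesture to the "planar" class.
    z_range = max(s[2] for s in samples) - min(s[2] for s in samples)
    return "planar" if z_range < 0.1 else "spatial"

def recognize_planar(samples):
    # Placeholder for a recognizer specialized for planar gestures,
    # e.g. a handwriting recognition engine.
    return "planar-gesture"

def recognize_spatial(samples):
    # Placeholder for a recognizer for general 3-D gestures.
    return "spatial-gesture"

def recognize(samples):
    # Task (1): classify by broad criteria; task (2): interpret with
    # the recognizer specific to the assigned class.
    recognizers = {"planar": recognize_planar, "spatial": recognize_spatial}
    return recognizers[classify(samples)](samples)
```

Because the classifier only inspects a gross feature of the data stream, each specialized recognizer can be improved independently without disturbing the others.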

A microprocessor-based mobile electronic device 10 in which the invention can be implemented is illustrated in FIG. 1. The present invention provides a method and a system for recognizing gestures performed by the user of the electronic device. The device can be a personal digital assistant (PDA) as shown, or it can be some other hand-held device, such as a cellular phone, one-way or two-way pager, hand-held game or camera. Of course, if the device is a cellular phone, it can send and receive radio-frequency telephone signals, and if the device is a PDA, it can be connected to a docking station for synchronization with a desktop computer. However, beyond these types of situations, the device is generally mobile and, for ordinary operation, need not be physically connected to or communicate with a stationary computer system or sensor system.

The electronic device 10 includes a gesture recognition system that permits the user to gesture with the device to input commands or data. A gesture in this case is a predetermined motion of the entire device (as opposed to manipulation of a control piece, such as a button). The gesture can activate a function of an application or provide it with control information. Such gestures are intended to provide the electronic device with user interface input similar in purpose to input from conventional user interface devices, such as buttons, keys, styli/digitizers, and the like.

The information flow in the gesture-controlled electronic device 10 is illustrated in FIG. 2. The electronic device 10 includes one or more inertial sensors 12 that generate analog or digital signals 14 representing the motion or position of the electronic device 10. The inertial sensors can be fastened, integrated, or otherwise mechanically coupled to the device so that both the sensors and the device become a single inertial platform. As such, these devices sense their motion with respect to the reference frame of the device itself. In contrast to a position sensor that measures a position with respect to a fixed reference sensor system, the inertial sensors 12 detect the motion of the device (e.g., velocity, acceleration, angular velocity, or angular acceleration). In addition, the inertial sensors can be entirely internal to the electronic device 10, so that the inertial data can be generated without requiring data from an external sensor or the exchange of data with an external processor.

The sensors can be micromachined electromechanical systems (MEMS), such as accelerometers, gyroscopes, or magneto-resistive sensors. Assuming that accelerometers are used, the sensors can measure the acceleration of the device with respect to a reference frame that coincides with the reference frame of the device at some arbitrary start time. Similarly, assuming that gyroscopes are used, the sensors can measure the angular acceleration of the device with respect to a reference frame that coincides with the reference frame of the device at some arbitrary start time. On the other hand, assuming that magneto-resistive sensors are used, the sensors can measure the angular position of the device with respect to the local magnetic field.

In one implementation, the electronic device includes three sensors (e.g., accelerometers) to measure motion along three perpendicular linear axes (the X, Y and Z axes), and three sensors (e.g., gyroscopes or magneto-resistive sensors) that generate data from which motion of the device about three perpendicular spin axes may be deduced. Typically, the three linear axes are parallel to the three rotational axes. In addition, as shown by FIG. 1, the sensors can be oriented so that the measurement axes generally align with the gross shape of the electronic device. For example, a first linear axis and a first rotational axis may be aligned with the primary longitudinal axis of the electronic device, and a second linear axis and a second rotational axis may be aligned with the secondary longitudinal axis of the electronic device.

Returning to FIG. 2, assuming that the inertial sensors 12 generate analog signals, these analog signals are fed to a transducer 16, such as an analog to digital converter (ADC), to generate digital signals 18.

The digital signals are then fed to gesture recognition system 15 that includes a signal conditioning mechanism 20 and a parser 24. The signal conditioning mechanism performs data "cleaning" to reduce noise in the signal, generating a cleaner digital signal 22.

The digital signal 22 is fed to the parser 24, discussed in further detail below, that extracts semantic information from the inertial data. The parser 24 can identify gestures in a variety of classes from the inertial data. The gestures can be either simple, e.g., circular motion or shaking, or complex, e.g., tracing of letters or numbers. Once the parser 24 identifies the gesture, it generates a token 26 that is directed to an application 28 running on the electronic device 10. The tokens generated by parser 24 represent specific gestures, but the command or data represented by the tokens are left undefined. Instead, the tokens are interpreted by the application 28 (although the tokens might have default interpretations given by an operating system or by the parser, e.g., an "x-gesture" is interpreted as the keystroke "x"). This permits different applications to assign different actions or meanings to the tokens. For example, one application can assign a token for a shaking motion to a command to close the application, whereas another application can assign a token for a shaking motion to a data input, such as a letter or number. In an event-driven environment, the token 26 can be considered an event, much like a mouse-click or a keystroke, with the response controlled by the methods of the objects in the application. Optionally, if the gestures belong to a subset of the classes, or the class of gestures cannot be classified by the parser, then the sensor data may be fed directly to one of the applications for use by the application.
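The token mechanism described above can be illustrated with a small sketch (the token names, the `Application` class, and the particular bindings are hypothetical examples, not defined by the patent): the parser emits an opaque token, and each application binds its own meaning to it.

```python
# Hypothetical gesture tokens emitted by the parser.
SHAKE, X_GESTURE = "shake", "x-gesture"

class Application:
    """Toy event-driven application: tokens are events, like keystrokes."""

    def __init__(self, bindings):
        self.bindings = bindings          # token -> handler callable

    def on_token(self, token):
        # Dispatch the token to this application's own handler, if any.
        handler = self.bindings.get(token)
        return handler() if handler else None

# Two applications assign different meanings to the same shake token.
editor = Application({SHAKE: lambda: "undo", X_GESTURE: lambda: "insert 'x'"})
game = Application({SHAKE: lambda: "roll dice"})
```

Here `editor.on_token(SHAKE)` triggers an undo while `game.on_token(SHAKE)` rolls dice, mirroring the patent's point that the token itself carries no fixed command or data.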

The treatment of tokens by the electronic device can be contrasted with electronic devices in which positional or inertial data is fed directly into an application. For example, in conventional game control devices, the application is programmed to map the incoming data to a scalar control parameter, e.g., a position on a display, or to perform a preset action in response to a preset sequence of inertial data. In contrast, the tokens generated by parser 24 represent specific gestures, but the command or data represented by the tokens are left undefined.

The basic functional blocks of one implementation of the parser 24 are illustrated in FIG. 3. The parser includes a discriminator 30 that decides which class of gesture is represented by the inertial data, and two interpreters 32, 34 that match the inertial data to a particular gesture in the class that was selected by the discriminator 30. Possible classes of gestures include tilting, linear motions, reciprocal motions and planar motions.

The equivalent method performed by the parser is illustrated in FIG. 4. First, the parser determines what class of gesture is represented by the inertial data. If the parser determines that the inertial data represents a gesture from a first class, the parser matches the inertial data to a particular gesture from the first class. Similarly, if the parser determines that the inertial data represents a gesture from a second class, the parser matches the inertial data to a particular gesture from the second class. As noted in the background, one problem in prior art inertial input devices is that, given multiple degrees of freedom and the possible complexity of gestures, matching the inertial data to a particular gesture is difficult and unreliable. However, by initially dividing the inertial data into different classes with the discriminator 30, each interpreter 32, 34 can be optimized for the particular class of gestures. This improves the reliability and speed of the gesture-recognition process. Consequently, the discriminator and interpreter can make a final determination that the inertial data represents a particular gesture from a particular class. Thus, the device need not save a series of states, or use a complex rollback system to undo a state change resulting from an incorrect classification of the gesture.

Referring to FIG. 5, in one implementation, the mobile device includes three accelerometers 40 and three magneto-resistive (MR) sensors 42. The accelerometers are positioned such that their sense axes are mutually perpendicular. Likewise, the MR sensors are positioned with mutually perpendicular sensitive axes. Analog signals from each sensor 40 and 42 are directed to an associated ADC 44. Digital signals from the ADC 44 are directed to a preprocessor 46 for preliminary data processing of the signal, such as calibration, filtering and scaling.

The processed data from the preprocessor 46 is held by a sensor data memory 48. For example, the data memory can be a FIFO buffer, and the data from the preprocessor 46 can be sampled at regular intervals and placed into the FIFO buffer. Other portions of the gesture recognition system can request data and be presented with sextuples of digital data, with each of the six parameters corresponding to an output of one of the six sensors.
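A sketch of such a sensor data memory (the class name, buffer depth, and interface are assumptions made for illustration) might use a bounded FIFO of sextuples:

```python
from collections import deque

class SensorBuffer:
    """Illustrative FIFO sensor data memory holding sextuples, one value
    per sensor (three accelerometers, three MR sensors)."""

    def __init__(self, depth=64):
        # A bounded deque: the oldest samples drop off when the buffer fills.
        self.fifo = deque(maxlen=depth)

    def push(self, sextuple):
        # Called at regular sampling intervals with one value per sensor.
        assert len(sextuple) == 6
        self.fifo.append(tuple(sextuple))

    def read(self, n):
        # Present the n most recent sextuples to the recognition system.
        return list(self.fifo)[-n:]
```

Other portions of the gesture recognition system would call `read` to obtain the recent motion history for classification.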

In one embodiment, the parser is configured to identify gestures from two classes of permissible gestures:

(a) translation motion embedded in a plane without angular motion, and

(b) angular motion without translation, for example a tilting motion.

To perform the gesture recognition, data from the memory 48 is directed to a linear or planar motion discriminator 50. If the discriminator 50 detects linear or planar motion, the motion data is transferred to a planar gesture recognizer 52, such as a conventional handwriting recognition system. On the other hand, if the discriminator 50 does not detect linear or planar motion, the motion data is transferred to a tilt gesture recognizer that determines the direction and degree of tilt of the device.
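One hedged sketch of such a discriminator (the thresholds and the use of mean signal magnitudes are illustrative assumptions; the patent does not specify the test): compare the in-plane linear acceleration against the angular rate, routing translation-dominated data to the planar recognizer and rotation-dominated data to the tilt recognizer.

```python
import math

def discriminate(samples):
    # samples: list of sextuples (ax, ay, az, wx, wy, wz) from the
    # sensor data memory. Mean in-plane linear acceleration versus
    # mean angular rate decides the gesture class.
    lin = sum(math.hypot(s[0], s[1]) for s in samples) / len(samples)
    ang = sum(abs(s[3]) + abs(s[4]) + abs(s[5]) for s in samples) / len(samples)
    return "planar" if lin > ang else "tilt"
```

A production discriminator would of course use calibrated units and tuned thresholds; the point is only that a coarse, cheap test selects the downstream recognizer.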

In a second implementation, the parser is configured to identify gestures from two classes of permissible gestures:

(a) reciprocal motion in three dimensions, and

(b) angular motion without translation, for example a tilting motion.

In the second implementation, if the discriminator 50 identifies a linear or reciprocal motion, the inertial data can be fed to the first interpreter 52. The first interpreter 52 identifies the particular reciprocal motion, e.g., a motion along one of the six linear semi-axes (+X, -X, +Y, -Y, +Z, -Z). Since the interpreter 52 is dedicated to a limited type of motion, it can be designed to identify these reciprocal motions with greater accuracy.
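Identifying which of the six semi-axes a reciprocal motion follows might look like the following sketch (the decision rule, summing acceleration over the outbound half of the stroke, is an assumption made for illustration):

```python
def reciprocal_axis(samples):
    # samples: list of (ax, ay, az) triples covering the whole stroke.
    # Over the outbound half of a reciprocal motion, acceleration is
    # dominated by one axis; its sign selects the semi-axis.
    half = samples[:max(1, len(samples) // 2)]
    totals = [sum(s[i] for s in half) for i in range(3)]
    i = max(range(3), key=lambda k: abs(totals[k]))
    sign = "+" if totals[i] >= 0 else "-"
    return sign + "XYZ"[i]
```

Because the interpreter only has six possible answers, even this crude rule can be made robust, which is the advantage of dedicating an interpreter to a narrow motion class.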

On the other hand, if the discriminator 50 does not identify a reciprocal motion, the inertial data can be fed to the second interpreter 54. The second interpreter 54 identifies a direction and degree of "tilt", e.g., the amount by which the electronic device is tilted away from the gravitational vector along one of the six semi-axes.
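Computing a direction and degree of tilt from a quasi-static accelerometer reading can be sketched as follows (the axis convention and the angle formula are assumptions; the patent does not specify them):

```python
import math

def tilt(ax, ay, az):
    # (ax, ay, az): quasi-static accelerometer reading, i.e. the gravity
    # vector in device coordinates. Direction is the semi-axis with the
    # larger horizontal component; degree is the angle between the
    # measured vector and the device's vertical (Z) axis.
    if abs(ay) > abs(ax):
        axis = "+Y" if ay >= 0 else "-Y"
    else:
        axis = "+X" if ax >= 0 else "-X"
    degree = math.degrees(math.atan2(math.hypot(ax, ay), az))
    return axis, degree
```

A device held flat yields a degree near zero; tipping it toward one edge grows the degree and selects the corresponding semi-axis, which an application could map to cursor movement.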

Different classes of gestures can be used for different types of functions by the applications. For example, reciprocal motions can be interpreted as commands, e.g., to open or close an application, whereas tilt motions can be interpreted as cursor control data, e.g., selection of an item from a list or positioning of a cursor on a screen.

The discriminator 50 can identify linear motion, particularly "reciprocal motions", i.e., short motions of the electronic device along an axis and then back to its starting point. The identification of the gesture is passed from the interpreter 52 or 54 to a token ID generator 56 which generates a token identifying the type of gesture. The token from the generator 56 is then passed to a processor which interprets the token as a control command or a character (digits, letters, special characters).

Referring to FIG. 6, in another implementation, a parser 24' is organized with a hierarchy of discriminators 60, 62, 64. The first discriminator 60 determines whether the gesture belongs to a first class of motions. If the first discriminator recognizes the gesture, the inertial data is fed to a first interpreter 70 that is configured to identify particular gestures from the first class. If the motion does not belong to the first class, then the inertial data is fed to the second discriminator 62. The second discriminator 62 determines whether the gesture belongs to a second class of motions. If the second discriminator recognizes the gesture, the inertial data is fed to a second interpreter 72 that is configured to identify particular gestures from the second class. If the motion does not belong to the second class, then the inertial data is fed to the third discriminator 64. The third discriminator 64 determines whether the gesture belongs to a third class of motions. If the third discriminator recognizes the gesture, the inertial data is fed to a third interpreter 74 that is configured to identify particular gestures from the third class. If the motion does not belong to the third class, then the inertial data is fed to a generic fourth interpreter 76 that is configured to identify gestures that do not belong to the other three classes. For example, the first class of gestures can be reciprocal linear motions, the second class of gestures can be tilting motions, the third class of gestures can be planar motions, and the fourth class of gestures can be arbitrary three-dimensional motions. Of course, the ordering of these classes of gestures can be changed.
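The cascade of FIG. 6 can be sketched as a chain of (discriminator, interpreter) pairs with a generic fallback. This is an illustrative structure only; the class and method names are assumptions:

```python
class Parser:
    """Hierarchy of (discriminator, interpreter) stages with a generic
    fallback, mirroring discriminators 60/62/64 and interpreters
    70/72/74/76 of FIG. 6.

    The first discriminator that claims the inertial data routes it to
    its paired interpreter; data no stage claims falls through to the
    generic interpreter.
    """

    def __init__(self, stages, fallback):
        self.stages = stages      # ordered list of (discriminate, interpret)
        self.fallback = fallback  # generic interpreter (e.g. 76)

    def parse(self, inertial_data):
        for discriminate, interpret in self.stages:
            if discriminate(inertial_data):
                return interpret(inertial_data)
        return self.fallback(inertial_data)
```

Reordering `stages` changes the priority of the gesture classes, which matches the patent's remark that the ordering of the classes can be changed; adding or removing pairs gives the two-discriminator and four-or-more-discriminator variants described next.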

Of course, the parser 24 can include just two discriminators, or four or more discriminators. However, in general, each discriminator is configured to recognize a specific class of gestures, and each discriminator is associated with an interpreter that identifies specific gestures in the class. With this functional organization, each discriminator can be narrowly tailored to accurately identify the specific class of gestures. Similarly, each interpreter can be narrowly tailored to accurately identify gestures in a specific class. Because the discriminators and interpreters are narrowly tailored, they can be more accurate than general-purpose template matching algorithms.

The discriminators and interpreters can be implemented in hardware, software, or firmware, or in any combination thereof.

What is claimed is:

Claims

1. A gesture-recognition system, comprising:
a plurality of inertial sensors to generate inertial data;
a gesture discriminator that receives the inertial data and determines whether the inertial data represents a gesture from a first class of gestures;
a first gesture interpreter that receives the inertial data if the gesture discriminator determines that the inertial data represents a gesture from the first class of gestures, the first gesture interpreter configured to identify the inertial data as a particular gesture from the first class of gestures; and
a second gesture interpreter that receives the inertial data if the gesture discriminator does not determine that the inertial data represents a gesture from the first class of gestures, the second gesture interpreter configured to identify the inertial data as a particular gesture from a second class of gestures.
2. The system of claim 1, wherein the first class of gestures comprises planar motions.
3. The system of claim 2, wherein the first gesture interpreter comprises a handwriting recognition system.
4. The system of claim 1, wherein the first class of gestures comprises linear reciprocal motions.
5. The system of claim 1, wherein the second class of gestures comprises a tilt motion.
6. The system of claim 1, wherein the inertial sensors comprise at least one gyroscope, accelerometer, or magneto-resistive sensor.
7. A gesture-recognition system, comprising:
a plurality of inertial sensors to generate inertial data;
a first gesture discriminator that receives the inertial data and determines whether the inertial data represents a gesture from a first class of gestures;
a first gesture interpreter that receives the inertial data if the first gesture discriminator determines that the inertial data represents a gesture from the first class of gestures, the first gesture interpreter configured to identify the inertial data as a particular gesture from the first class of gestures;
a second gesture discriminator that receives the inertial data if the first gesture discriminator does not determine that the inertial data represents a gesture from the first class of gestures and determines whether the inertial data represents a gesture from a second class of gestures; and
a second gesture interpreter that receives the inertial data if the second gesture discriminator determines that the inertial data represents a gesture from the second class of gestures, the second gesture interpreter configured to identify the inertial data as a particular gesture from the second class of gestures.
8. The system of claim 7, further comprising: a third gesture discriminator that receives the inertial data if the second gesture discriminator does not determine that the inertial data represents a gesture from the second class of gestures and determines whether the inertial data represents a gesture from a third class of gestures.
9. The system of claim 8, further comprising: a third gesture interpreter that receives the inertial data if the third gesture discriminator determines that the inertial data represents a gesture from the third class of gestures, the third gesture interpreter configured to identify the inertial data as a particular gesture from the third class of gestures.
10. The system of claim 9, further comprising: a fourth gesture interpreter that receives the inertial data if the third gesture discriminator does not determine that the inertial data represents a gesture from the third class of gestures, the fourth gesture interpreter configured to identify the inertial data as a particular gesture from a fourth class of gestures.
11. A gesture-recognition system, comprising:
at least one discriminator to use inertial data to select one of a plurality of classes of gestures; and
a plurality of gesture interpreters, each gesture interpreter configured to identify a particular gesture from one of the plurality of classes of gestures.
12. A gesture-controlled electronic device, comprising:
a plurality of inertial sensors to generate inertial data;
a gesture-recognition system that includes a gesture discriminator that receives the inertial data and determines whether the inertial data represents a gesture from a first class of gestures, a first gesture interpreter that receives the inertial data if the gesture discriminator determines that the inertial data represents a gesture from the first class of gestures, the first gesture interpreter configured to identify the inertial data as a particular gesture from the first class of gestures, and a second gesture interpreter that receives the inertial data if the gesture discriminator does not determine that the inertial data represents a gesture from the first class of gestures, the second gesture interpreter configured to identify the inertial data as a particular gesture from a second class of gestures, and generates a token representing the particular gesture; and
an application that receives the token from the gesture-recognition system.
13. A method of controlling a hand-held electronic device that runs an application, comprising:
generating inertial data with a plurality of inertial sensors embedded in the electronic device;
identifying a particular gesture from the inertial data and generating a corresponding token, the identifying step including selecting one class of gestures from a plurality of classes of gestures based on the inertial data, and identifying a particular gesture from the selected class of gestures based on the inertial data; and
passing the token to the application.
14. A method of recognizing a gesture by an electronic device, comprising:
generating inertial data with a plurality of inertial sensors embedded in the electronic device;
selecting one class of gestures from a plurality of classes of gestures based on the inertial data; and
identifying a particular gesture from the selected class of gestures based on the inertial data.
15. A method of recognizing a gesture by an electronic device, comprising:
generating inertial data with a plurality of inertial sensors embedded in the electronic device;
determining whether the inertial data was generated by a gesture from a first class of gestures;
if so, matching the inertial data to a particular gesture from the first class of gestures; and
if not, matching the inertial data to a particular gesture from a second class of gestures.
PCT/US2002/020119 2001-06-22 2002-06-24 Gesture recognition system and method WO2003001340A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US30058801P true 2001-06-22 2001-06-22
US60/300,588 2001-06-22

Publications (1)

Publication Number Publication Date
WO2003001340A2 true WO2003001340A2 (en) 2003-01-03

Family

ID=23159736

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/020119 WO2003001340A2 (en) 2001-06-22 2002-06-24 Gesture recognition system and method

Country Status (1)

Country Link
WO (1) WO2003001340A2 (en)

Cited By (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10051298B2 (en) 1999-04-23 2018-08-14 Monkeymedia, Inc. Wireless seamless expansion and video advertising player
US7351925B2 (en) 2000-10-02 2008-04-01 Apple Inc. Method and apparatus for detecting free fall
US7688306B2 (en) 2000-10-02 2010-03-30 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US7541551B2 (en) 2000-10-02 2009-06-02 Apple Inc. Method and apparatus for detecting free fall
US8698744B2 (en) 2000-10-02 2014-04-15 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US7307228B2 (en) 2000-10-02 2007-12-11 Apple Inc. Method and apparatus for detecting free fall
US9921666B2 (en) 2000-10-02 2018-03-20 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US9829999B2 (en) 2000-10-02 2017-11-28 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US9575569B2 (en) 2000-10-02 2017-02-21 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US8723793B2 (en) 2003-05-01 2014-05-13 Thomson Licensing Multimedia user interface
WO2005062705A2 (en) * 2003-12-23 2005-07-14 Koninklijke Philips Electronics N.V. Method of controlling a portable user device
WO2005062705A3 (en) * 2003-12-23 2005-10-20 Koninkl Philips Electronics Nv Method of controlling a portable user device
WO2005076114A1 (en) * 2004-02-06 2005-08-18 Nokia Corporation Gesture control system
US8819596B2 (en) 2004-02-06 2014-08-26 Nokia Corporation Gesture control system
WO2005093550A3 (en) * 2004-03-01 2006-04-13 Apple Computer Methods and apparatuses for operating a portable device based on an accelerometer
US7280096B2 (en) 2004-03-23 2007-10-09 Fujitsu Limited Motion sensor engagement for a handheld device
US7180502B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited Handheld device with preferred motion selection
US7180501B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited Gesture based navigation of a handheld user interface
US7180500B2 (en) 2004-03-23 2007-02-20 Fujitsu Limited User definable gestures for motion controlled handheld devices
US7301527B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Feedback based user interface for motion controlled handheld devices
US7301526B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Dynamic adaptation of gestures for motion controlled handheld devices
US7301529B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Context dependent gesture response
US7301528B2 (en) 2004-03-23 2007-11-27 Fujitsu Limited Distinguishing tilt and translation motion components in handheld devices
US7176886B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Spatial signatures
US7176888B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Selective engagement of motion detection
US7365735B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Translation controlled cursor
US7365736B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Customizable gesture mappings for motion controlled handheld devices
US7365737B2 (en) 2004-03-23 2008-04-29 Fujitsu Limited Non-uniform gesture precision
US7176887B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Environmental modeling for motion controlled handheld devices
KR100853605B1 (en) * 2004-03-23 2008-08-22 후지쯔 가부시끼가이샤 Distinguishing tilt and translation motion components in handheld devices
WO2005103863A3 (en) * 2004-03-23 2006-01-26 Fujitsu Ltd Distinguishing tilt and translation motion components in handheld devices
US8692764B2 (en) 2004-03-23 2014-04-08 Fujitsu Limited Gesture based user interface supporting preexisting symbols
US7173604B2 (en) 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
US7990365B2 (en) 2004-03-23 2011-08-02 Fujitsu Limited Motion controlled remote controller
US7903084B2 (en) 2004-03-23 2011-03-08 Fujitsu Limited Selective engagement of motion input modes
EP1586978A3 (en) * 2004-03-31 2010-01-27 NEC Corporation Portable device with action shortcut function
EP1586978A2 (en) * 2004-03-31 2005-10-19 NEC Corporation Portable device with action shortcut function
US8773260B2 (en) 2004-04-06 2014-07-08 Symbol Technologies, Inc. System and method for monitoring a mobile computing product/arrangement
EP1804472A1 (en) * 2004-10-19 2007-07-04 Vodafone K.K. Function control method, and terminal device
GB2419433A (en) * 2004-10-20 2006-04-26 Glasgow School Of Art Automated Gesture Recognition
WO2006090546A2 (en) * 2005-02-23 2006-08-31 Matsushita Electric Works, Ltd. Input device for a computer and environmental control system using the same
WO2006090546A3 (en) * 2005-02-23 2007-07-12 Matsushita Electric Works Ltd Input device for a computer and environmental control system using the same
US7492367B2 (en) 2005-03-10 2009-02-17 Motus Corporation Apparatus, system and method for interpreting and reproducing physical motion
WO2007018631A1 (en) 2005-07-29 2007-02-15 Sony Ericsson Mobile Communications Ab Methods, devices and computer program products for operating mobile devices responsive to user input through movement thereof
US8046030B2 (en) 2005-07-29 2011-10-25 Sony Ericsson Mobile Communications Ab Methods, devices and computer program products for operating mobile devices responsive to user input through movement thereof
US8761840B2 (en) 2005-07-29 2014-06-24 Sony Corporation Methods, devices and computer program products for operating mobile devices responsive to user input through movement thereof
US8139030B2 (en) 2006-02-01 2012-03-20 Memsic, Inc. Magnetic sensor for use with hand-held devices
US7667686B2 (en) 2006-02-01 2010-02-23 Memsic, Inc. Air-writing and motion sensing input for portable devices
US7536201B2 (en) 2006-03-29 2009-05-19 Sony Ericsson Mobile Communications Ab Motion sensor character generation for mobile device
WO2007126433A3 (en) * 2006-03-29 2008-05-08 Gregory A Dunko Motion sensor character generation for mobile device
JP2009531962A (en) * 2006-03-29 2009-09-03 ソニー エリクソン モバイル コミュニケーションズ, エービー Character generation by the motion sensor for the mobile communication device
US8423076B2 (en) 2008-02-01 2013-04-16 Lg Electronics Inc. User interface for a mobile device
US8195220B2 (en) 2008-02-01 2012-06-05 Lg Electronics Inc. User interface for mobile devices
US8392340B2 (en) 2009-03-13 2013-03-05 Apple Inc. Method and apparatus for detecting conditions of a peripheral device including motion, and determining/predicting temperature(S) wherein at least one temperature is weighted based on detected conditions
WO2011039283A1 (en) 2009-09-29 2011-04-07 Movea S.A System and method for recognizing gestures
FR2954533A1 (en) * 2009-12-21 2011-06-24 Air Liquide Portable terminal i.e. hand-held computer, for transmitting information or orders in industrial area, has gyroscope measuring orientation parameters of case, and accelerometer measuring acceleration parameters of case during movements
WO2011119499A3 (en) * 2010-03-23 2011-12-22 Bump Technologies, Inc. Bump suppression
WO2011119499A2 (en) * 2010-03-23 2011-09-29 Bump Technologies, Inc. Bump suppression
CN102480553A (en) * 2010-11-24 2012-05-30 上海华勤通讯技术有限公司 3G intelligent mobile phone with call for help and method for realizing call for help
CN104135911A (en) * 2012-01-09 2014-11-05 因文森斯公司 Activity classification in a multi-axis activity monitor device
US9579586B2 (en) 2012-06-29 2017-02-28 Monkeymedia, Inc. Remote controlled vehicle with a handheld display device
US9563202B1 (en) 2012-06-29 2017-02-07 Monkeymedia, Inc. Remote controlled vehicle with a head-mounted display apparatus
US9919233B2 (en) 2012-06-29 2018-03-20 Monkeymedia, Inc. Remote controlled vehicle with augmented reality overlay
US9656168B1 (en) 2012-06-29 2017-05-23 Monkeymedia, Inc. Head-mounted display for navigating a virtual environment
US9782684B2 (en) 2012-06-29 2017-10-10 Monkeymedia, Inc. Remote controlled vehicle with a handheld display device
US9791897B2 (en) 2012-06-29 2017-10-17 Monkeymedia, Inc. Handheld display device for navigating a virtual environment
US9612627B2 (en) 2012-06-29 2017-04-04 Monkeymedia, Inc. Head-mounted display apparatus for navigating a virtual environment
US9658617B1 (en) 2012-06-29 2017-05-23 Monkeymedia, Inc. Remote controlled vehicle with a head-mounted display
US9164588B1 (en) 2013-02-05 2015-10-20 Google Inc. Wearable computing device with gesture recognition
US9426193B2 (en) 2014-10-14 2016-08-23 GravityNav, Inc. Multi-dimensional data visualization, navigation, and menu systems


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642