US20220300107A1 - Gesture Detection - Google Patents
Info
- Publication number
- US20220300107A1 (U.S. application Ser. No. 17/832,147)
- Authority
- US
- United States
- Prior art keywords
- gesture
- command
- processing system
- output signals
- data points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
Definitions
- Gesture detection is common. Many set-top boxes, remote controls, and mobile devices may be controlled using physical gestures. Gestures may even be used to control an automotive environment, such as power windows. In conventional gesture control, a user places her finger on a gesture surface and performs some gesture.
- The features, aspects, and advantages of the exemplary embodiments are better understood when the following Detailed Description is read with reference to the accompanying drawings, wherein:
- FIGS. 1 and 2 are simplified schematics illustrating an environment in which exemplary embodiments may be implemented;
- FIG. 3 is a more detailed block diagram illustrating the operating environment, according to exemplary embodiments;
- FIGS. 4-5 are schematics illustrating a gesture detector, according to exemplary embodiments;
- FIGS. 6-7 are more simplified schematics illustrating another exemplary operating environment;
- FIGS. 8-9 are more detailed illustrations of the gesture detector, according to exemplary embodiments;
- FIGS. 10-11 are more detailed schematics of the gesture detector, according to exemplary embodiments;
- FIGS. 12-13 are diagrams illustrating a curvilinear arrangement of the gesture detector, according to exemplary embodiments;
- FIG. 14 is another schematic illustrating the gesture detector, according to exemplary embodiments;
- FIGS. 15-17 are schematics illustrating a learning mode of operation, according to exemplary embodiments;
- FIGS. 18-20 are schematics illustrating output sampling, according to exemplary embodiments;
- FIG. 21 is a schematic illustrating an aftermarket gesture detector, according to exemplary embodiments; and
- FIGS. 22-23 are schematics illustrating other operating environments for additional aspects of the exemplary embodiments.
- It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first device could be termed a second device, and, similarly, a second device could be termed a first device without departing from the teachings of the disclosure.
- FIGS. 1-2 are simplified schematics illustrating an environment in which exemplary embodiments may be implemented. FIG. 1 illustrates an automotive interior 20 having a center console 22. The automotive interior 20 has many buttons, switches, and other conventional controls for driving a vehicle, so the details need not be explained. However, FIG. 1 also illustrates a gesture detector 24. The gesture detector 24 is illustrated as being located on the center console 22, but the gesture detector 24 may be placed at any location within the automotive interior 20 (as later paragraphs will explain). Wherever the gesture detector 24 is located, the gesture detector 24 senses hand gestures that are performed to control the vehicle. FIG. 2, for example, illustrates a driver's human hand 26 performing a hand gesture 28 in a vicinity of the gesture detector 24. The gesture detector 24 is enlarged for clarity. As the driver's hand performs the gesture 28, the gesture detector 24 senses a capacitance 30 between the driver's hand 26 and the gesture detector 24. The gesture detector 24 then generates an output signal 32 that is proportional to the capacitance 30. The output signal 32 is analyzed (such as by a controller 34) to execute a command 36. The driver's hand 26, for example, may perform the hand gesture 28 to lock the car doors. Another gesture may open a sunroof. Still another gesture may turn on the headlights. Even more gestures may select a radio station, answer a hands-free call, or apply the brakes. Whatever the gesture 28, exemplary embodiments interpret the gesture 28 and execute the corresponding command 36. Indeed, the user may associate any gesture to any action, as later paragraphs will explain.
- Exemplary embodiments thus greatly improve gesture detection. Conventional gesture detection utilizes infrared vision systems and/or environmental markers (such as motion capture suits). Infrared detection, though, is poor in bright environments, where ambient light typically washes out the infrared spectrum. Indeed, automotive interiors often have large solar glass expanses that make infrared detection infeasible. Exemplary embodiments, instead, detect gestures using the capacitance 30. The gesture detector 24 thus does not rely on the infrared spectrum, so the gesture detector 24 recognizes gestures even in external environments where current sensor technologies fail. The gesture detector 24 may thus be dispersed throughout the automotive interior 20 for detection and interpretation of driver and passenger gestures.
- Exemplary embodiments thus greatly increase safety. Conventional automotive interiors have knobs, buttons, and stalks that must be physically manipulated to control a vehicle. Exemplary embodiments, instead, recognize gesture inputs that do not require physical contact with automotive controls. The driver's hand and/or fingers may make movements without removing the driver's eyes from the road. Exemplary embodiments recognize the gesture 28 and safely execute the corresponding command 36. The gesture detector 24 recognizes simple snaps and swipes, more complex geometric shapes, and even alphanumeric characters. Whatever the gesture 28, exemplary embodiments allow safe and complete control of the automotive environment.
- The gesture 28 may be touchless. Conventional gesture detectors require contact between the hand 26 and some gesture surface. Indeed, many vehicles have conventional touch screens that allow the driver's fingers to scroll or swipe among selections of items and tap to select. FIGS. 1 and 2, though, require no contact between the driver's hand 26 or fingers and the gesture detector 24. Exemplary embodiments, instead, utilize contactless, touchless gestures to execute the command 36. That is, the driver's hand 26 performs any two- or three-dimensional gesture 28 that need not contact some touch-sensing surface. As the driver's hand 26 performs the gesture 28, the capacitance 30 between the driver's hand 26 and the gesture detector 24 changes. Exemplary embodiments use the capacitance 30 to determine which command 36 is executed. So, again, the driver need not be distracted when trying to find and touch the gesture detector 24. The driver need only perform the gesture 28 to execute the corresponding command 36.
- FIG. 3 is a more detailed block diagram illustrating the operating environment, according to exemplary embodiments. FIG. 3 illustrates the gesture detector 24 interfacing with the controller 34. The gesture detector 24 senses the capacitance 30 and generates the output signal 32. If the output signal 32 has an analog form, digital conversion 40 may be required. When the controller 34 receives the output signal 32, the controller 34 interprets the output signal 32. The controller 34 has a processor 42 and a memory 44. The processor 42 may be a microprocessor (“µP”), an application specific integrated circuit (ASIC), or other component that executes a gesture algorithm 46 stored in the memory 44. The gesture algorithm 46 includes instructions, code, operations, and/or programs that cause the processor 42 to interpret any gesture input sensed by the gesture detector 24. When the gesture (illustrated as reference numeral 28 in FIG. 2) is performed, the gesture detector 24 measures the capacitance 30 and generates the output signal 32. The gesture algorithm 46 instructs the processor 42 to determine the corresponding command 36.
- The processor 42 consults a database 50 of gestures. When the output signal 32 is received, the processor 42 queries the database 50 of gestures. FIG. 3 illustrates the database 50 of gestures as a table 52 that is locally stored in the memory 44 of the controller 34. The database 50 of gestures, however, may be remotely stored, queried, or retrieved from any location, such as in a controller area network (“CAN”) or other communications network. Regardless, the database 50 of gestures maps, associates, or relates different output signals 32 to their corresponding commands 36. The processor 42, for example, compares the output signal 32 to the entries stored in the database 50 of gestures. Should a match be found, the processor 42 retrieves the corresponding command 36. The processor 42 then executes the command 36 in response to the output signal 32, which is generated by the gesture detector 24 in response to the gesture 28.
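- To make the lookup concrete, the following is a minimal Python sketch of the database-of-gestures matching described above. The table contents, the tolerance, and the function names are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the database-of-gestures lookup: compare a sensed
# output signal against stored signatures and retrieve the closest command.

GESTURE_DATABASE = {
    # stored output-signal signature (tuple of sampled values) -> command
    (0.82, 0.61, 0.40, 0.22, 0.10): "LOCK_DOORS",
    (0.10, 0.25, 0.47, 0.66, 0.90): "OPEN_SUNROOF",
}

def match_command(signal, tolerance=0.05):
    """Return the command whose stored signature best matches the signal."""
    best_command, best_error = None, float("inf")
    for signature, command in GESTURE_DATABASE.items():
        if len(signature) != len(signal):
            continue  # signatures of a different length cannot match
        error = max(abs(a - b) for a, b in zip(signature, signal))
        if error < best_error:
            best_command, best_error = command, error
    return best_command if best_error <= tolerance else None

print(match_command([0.80, 0.63, 0.41, 0.20, 0.11]))  # -> LOCK_DOORS
```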
- FIGS. 4-5 are more schematics illustrating the gesture detector 24, according to exemplary embodiments. FIG. 4 illustrates the gesture detector 24 located on or in an instrument panel 60, while FIG. 5 illustrates the gesture detector 24 located on or in an interior door panel 62. Indeed, the gesture detector 24 may be located in front seats, back seats, or any other location in which gesture detection is desired.
- FIGS. 6-7 are more simplified schematics illustrating another exemplary operating environment. Here the gesture detector 24 detects gestures performed in the vicinity of any electronic device 70. The electronic device 70, for simplicity, is illustrated as a smartphone 72. The electronic device 70, however, may be any processor-controlled device, as later paragraphs will explain. Regardless, the smartphone 72 may also have the processor 42 executing the gesture algorithm 46 stored in the memory 44. When a user's hand performs the gesture (illustrated, respectively, as reference numerals 26 and 28 in FIG. 2), the gesture detector 24 senses the capacitance 30 and generates the output signal 32. FIG. 6 illustrates the gesture detector 24 on a front face 74 of the smartphone 72, while FIG. 7 illustrates the gesture detector 24 on a backside 76 of the smartphone 72. Wherever the gesture detector 24 is located, the processor 42 queries for and retrieves the matching command 36. The processor 42 then executes the command 36 in response to the output signal 32. So, even though the smartphone 72 may have a touch-sensing screen 78, the gesture detector 24 senses touchless gestures performed by the user's hand 26. The user may thus perform touchless gestures to access web pages, answer calls, compose texts, and execute any other commands or actions.
- Exemplary embodiments may thus be deployed throughout homes and businesses. The gesture detector 24 may be installed within cars, where ambient, dynamic lighting conditions degrade conventional optical recognition techniques. The gesture detector 24, however, may also be installed in communications devices, toys, fixtures, and any other electronic device 70. Because the gesture detector 24 does not rely on light, the gesture detector 24 is unaffected by lighting conditions. The gesture detector 24 may thus be deployed throughout homes and businesses to detect and interpret our gestures. The gesture detector 24 may even be combined with or augmented by voice recognition techniques to reduce, or even eliminate, manual activation of controls.
- FIGS. 8-9 are more detailed illustrations of the gesture detector 24, according to exemplary embodiments. FIG. 8 illustrates the gesture detector 24 having an electrically conductive plate 90 of area S (illustrated as reference numeral 92). As the user's hand 26 performs the contactless gesture 28, the user's hand 26 is separated by a distance d (illustrated as reference numeral 94) from the plate 90, and the movement of the user's hand 26 causes electrical charges 96 to distribute. Because human skin and tissue are electrically conductive, the electrical charges 96 distribute on the user's skin. The electrical charges 96 also distribute on a surface of the plate 90. For simplicity, only a few electrical charges 96 are illustrated, and they are grossly enlarged for clarity; in practice, the electrical charges 96 will distribute all over the user's hand 26 and all over the plate 90. FIG. 8 illustrates the electrical charges 96 on the user's hand 26 as negatively charged, while the electrical charges 96 on the plate 90 are positively charged. The polarity of the electrical charges 96, however, may be reversed. Regardless, if a voltage difference V (illustrated as reference numeral 98) exists between the user's hand 26 and the plate 90, then an electric field E (illustrated as reference numeral 100) is generated.
- FIG. 9 illustrates a simplified schematic. The user's hand 26 is separated by the distance d from the conductive plate 90. Because the user's hand 26 is electrically conductive, this gesture arrangement may be simplified and electrically modeled as a parallel plate capacitor. The voltage difference V is V = Qd/(ϵS), where Q is the charge and ϵ is the permittivity of the air between the user's hand 26 and the plate 90. Knowing the relationship for the capacitance C as C = Q/V, the capacitance C may be rewritten as C = ϵS/d. The reader may notice that the capacitance C (illustrated as reference numeral 30) has no dependence on the voltage difference V, nor is the capacitance C dependent on the electrical charge Q (illustrated as reference numeral 96). The reader may also notice that the capacitance C is inversely proportional to the separation distance d. As the user's hand 26 approaches the plate 90, the separation distance d decreases, causing the capacitance C to increase. Conversely, as the user's hand 26 moves away from the plate 90, the separation distance d increases, causing the capacitance C to decrease.
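- As a worked example of the C = ϵS/d relation, the short Python sketch below computes the model capacitance for an assumed 10 cm × 10 cm plate as the separation distance d shrinks; the dimensions are illustrative only, not taken from the disclosure.

```python
# Illustrative numbers only: a 10 cm x 10 cm plate with the hand approaching.
EPSILON_0 = 8.854e-12  # permittivity of free space, F/m (air is ~1.0006x)

def plate_capacitance(area_m2, distance_m):
    """Parallel-plate model C = eps * S / d used in the text."""
    return EPSILON_0 * area_m2 / distance_m

for d in (0.10, 0.05, 0.025):  # hand approaching: as d halves, C doubles
    print(f"d = {d:5.3f} m  ->  C = {plate_capacitance(0.01, d) * 1e12:.2f} pF")
```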
- The output signal 32 also changes. As the user's hand 26 vertically moves with respect to the plate 90, the capacitance C changes. Once the electrical charges 96 develop, the electric field E (illustrated as reference numeral 100 in FIG. 8) charges the gesture detector 24 as a capacitor. The gesture detector 24 may then be discharged, through a resistor 102, according to the RC time constant τ = RC, where R is the resistance (in Ohms) of the resistor 102 and C is the capacitance 30. The output signal 32 will thus decay with time according to V(t) = V_o e^(−t/τ). Because the capacitance C changes as the user's hand 26 performs the gesture, the time constant τ = RC will also change, causing the output signal 32 to change with the gesture. If the output signal 32 is analog, the output signal 32 may be converted by the analog-to-digital converter 40 before being interpreted by the processor 42. The processor 42 receives the output signal 32, queries the database 50 of gestures, and executes the corresponding command 36, as earlier paragraphs explained.
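- The decay relation can likewise be made concrete. This sketch assumes a hypothetical 1 MΩ discharge resistor and a 5 V initial level, and shows how a larger capacitance (hand closer to the plate) slows the decay of the output signal.

```python
import math

R = 1_000_000.0  # assumed 1 MOhm discharge resistor (reference numeral 102)

def output_voltage(t, c, v0=5.0):
    """Decay V(t) = V0 * exp(-t / (R*C)) described in the text."""
    return v0 * math.exp(-t / (R * c))

# A larger capacitance (hand closer to the plate) decays more slowly:
for c in (1e-12, 2e-12, 4e-12):
    print(f"C = {c * 1e12:.0f} pF -> V(2 us) = {output_voltage(2e-6, c):.3f} V")
```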
- FIGS. 10-11 are more detailed schematics of the gesture detector 24, according to exemplary embodiments. Here the gesture detector 24 may have multiple plates 90 for sensing different capacitances during performance of the gesture 28. As FIG. 10 illustrates, the gesture detector 24 may have a co-planar, linear arrangement of individual plates 90. As the user's hand 26 performs the gesture 28, the capacitance C (illustrated as reference numeral 30) changes. Each plate 90 may individually generate its corresponding output signal 32 in response to the capacitance C. Multiple output signals 32 may be individually received by the processor 42 for interpretation. The multiple output signals 32, however, may be combined in any way. The multiple output signals 32, for example, may be summed to yield a summed output signal. The multiple output signals 32 may be multiplexed according to time to yield a multiplexed output signal. The multiple output signals 32 may be averaged according to time to yield an averaged output signal. However the multiple output signals 32 are combined, the processor 42 interprets the output signals 32 and executes the corresponding command 36.
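- A minimal sketch of the three combination strategies named above (summing, time-multiplexing, and averaging); the list-of-lists signal representation is an assumption for illustration.

```python
def summed(signals):
    """Sum aligned samples from each plate into one output signal."""
    return [sum(step) for step in zip(*signals)]

def averaged(signals):
    """Average aligned samples from each plate."""
    return [sum(step) / len(step) for step in zip(*signals)]

def time_multiplexed(signals):
    """Interleave plate samples in time: plate 1, plate 2, ... then repeat."""
    return [sample for step in zip(*signals) for sample in step]

plates = [[0.1, 0.2, 0.3], [0.3, 0.2, 0.1]]  # two plates, three samples each
print(summed(plates), averaged(plates), time_multiplexed(plates))
```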
- FIG. 11 illustrates an array 110 of the plates 90. Here the gesture detector 24 may have the multiple plates 90 arranged as a co-planar grid of rows and columns. As the user's hand 26 performs the gesture 28, the capacitance C (illustrated as reference numeral 30) changes. The processor 42 may interpret each individual output signal 32 or any combination of the multiple output signals 32. The processor 42 then executes the corresponding command 36.
- FIGS. 12-13 are diagrams illustrating a curvilinear arrangement of the gesture detector 24, according to exemplary embodiments. Here the gesture detector 24 may have the multiple plates 90, but the plates 90 need not lie in the same plane. Some of the plates 90 may lie in the same plane, while other plates 90 may be arranged or oriented in one or more different planes. Recalling the automotive interior illustrated in FIGS. 1 and 4-5, the plates 90 may be installed on curved or curvilinear surfaces of the center console 22, the instrument panel 60, and the door panel 62. Likewise, the plates 90 may be arranged on the sleek, curved surfaces of the electronic device 70 illustrated in FIGS. 6-7. Indeed, the plates 90 may have many different orientations to each other. FIG. 13, in particular, illustrates a flexible substrate 112 on which the plates 90 may be printed, using conductive ink, in the grid or array 110. While FIGS. 12-13 only illustrate a few or several plates 90, in practice the array 110 may contain hundreds, perhaps thousands or millions, of plates 90 using semiconductor micro- or nanotechnology manufacturing. The convex, curvilinear arrangement of the plates 90 increases sensitivity of the gesture detector 24 to the gesture 28. As the user's hand 26 performs the contactless gesture 28, the electric field E (illustrated as reference numeral 100) is everywhere perpendicular to each plate 90. As the multiple plates 90 may be curvilinearly arranged, each different plate 90 produces a different output signal 32. The different output signals 32 thus allow exemplary embodiments to detect proximity to the user's hand 26 using many different vector representations of many different electric fields E. Conventional two-dimensional planar arrangements yield an identical vector representation, providing little data for differentiating the user's different gestures 28. The curvilinear, three-dimensional arrangement, in contradistinction, generates many different output signals 32, each normal to its plate 90, providing much more data. Indeed, exemplary embodiments provide volumetric data describing the user's hand 26 performing each different gesture 28, thus increasing sensitivity to different gestures. The gesture detector 24 may thus be any arrangement of three-dimensional capacitive plates 90 for sensing the capacitance 30 during the gesture 28. The multiple plates 90, however, may also be curvilinearly concave in arrangement, depending on the aesthetic design of the underlying interior (such as the center console 22).
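- The volumetric idea can be sketched with a toy geometric model: each plate carries a position and an outward unit normal, and a crude per-plate proximity signal grows when the hand is near and roughly along that normal. The model and all numbers below are illustrative assumptions, not the patent's method.

```python
import math

# Toy model: each plate has a position (meters) and an outward unit normal.
plates = [
    {"pos": (0.00, 0.0, 0.00), "normal": (0.0, 0.0, 1.0)},  # flat plate
    {"pos": (0.05, 0.0, 0.01), "normal": (0.6, 0.0, 0.8)},  # tilted plate
]

def per_plate_signal(hand, plate):
    """Crude proximity signal: stronger when the hand is near the plate and
    roughly along its normal (the field is normal to each plate)."""
    d = [h - p for h, p in zip(hand, plate["pos"])]
    dist = math.sqrt(sum(x * x for x in d)) or 1e-9
    alignment = max(0.0, sum(a * b for a, b in zip(d, plate["normal"])) / dist)
    return alignment / dist**2

hand = (0.02, 0.0, 0.08)
# Differently oriented plates yield different signals for the same hand pose:
print([round(per_plate_signal(hand, p), 1) for p in plates])
```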
- FIG. 14 is another schematic illustrating the gesture detector 24, according to exemplary embodiments. Here the gesture detector 24 has three (3) orthogonal plates 90. As the user's hand 26 performs the gesture 28, each plate 90 measures its corresponding capacitance 30. The gesture detector 24 may thus sense the capacitance 30 in three dimensions, plus time. The processor 42 may interpret each individual output signal 32 or any combination of the multiple output signals 32. The processor 42 then executes the corresponding command 36.
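- A possible record layout for such three-axis, time-stamped readings; the field names are hypothetical, not drawn from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class CapacitanceSample:
    """One reading from the three orthogonal plates, plus a timestamp."""
    t: float   # seconds since the gesture began
    cx: float  # capacitance sensed by the X-facing plate, farads
    cy: float  # Y-facing plate
    cz: float  # Z-facing plate

trace = [CapacitanceSample(0.0, 1.0e-12, 2.0e-12, 5.0e-12),
         CapacitanceSample(0.2, 1.5e-12, 2.0e-12, 4.0e-12)]
print(trace[0])
```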
- FIGS. 15-17 are schematics illustrating a learning mode 120 of operation, according to exemplary embodiments. Before the processor 42 can interpret the user's gestures, the processor 42 may be taught to recognize the gestures. FIG. 15 thus illustrates a baseline capacitance C_Base (again illustrated as the reference numeral 30). The gesture detector 24 may first measure the baseline capacitance 30 of the ambient environment. Even though the user's hand (illustrated as reference numeral 26 in FIGS. 12-14) may not be near the gesture detector 24, ambient electrical charges 96 may still cause the gesture detector 24 to sense some ambient, baseline capacitance 30. Stray electrical charges in the air and/or on the surface of the plate 90, for example, may create the baseline capacitance 30. The processor 42 may thus receive the output signal 32 generated from the ambient conditions.
- FIG. 16 illustrates a graphical user interface 122. The graphical user interface 122 may be displayed on any display device (such as in the center console 22 of the automotive environment 20, illustrated in FIG. 1). The graphical user interface 122, of course, may be displayed on any other apparatus, such as the mobile smartphone (illustrated as reference numeral 72 in FIG. 6). Regardless, here the user trains the processor 42 to recognize particular touchless gestures performed above the gesture detector 24. When the user wishes to store a gesture for later recognition, the user may first select the learning mode 120 of operation. As FIG. 16 illustrates, the graphical user interface 122 may display a visual prompt 124 asking the user to perform a gesture above the gesture detector 24. The user then performs the desired two-dimensional or even three-dimensional movement. As the gesture is performed, the gesture detector 24 senses the capacitance 30 and generates the output signal 32. The gesture algorithm 46 causes the processor 42 to read and store the output signal 32 in the memory 44. Once the gesture is complete, the user selects the completion icon 124.
- Baseline comparisons may then be made. As the user performs the gesture, exemplary embodiments may compare the baseline capacitance C_Base to the output signal 32. That is, exemplary embodiments may compare the output signal 32 to the baseline measurements of the ambient environment. Any change may then be used to retrieve the corresponding command 36.
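- A minimal sketch of that baseline comparison, assuming the signals are already sampled into lists; the baseline values are illustrative only.

```python
BASELINE = [0.21, 0.21, 0.22, 0.21, 0.21]  # assumed ambient signature C_Base

def delta_from_baseline(signal, baseline=BASELINE):
    """Subtract the ambient baseline so only the gesture-induced change
    in capacitance is compared against the stored entries."""
    return [round(s - b, 2) for s, b in zip(signal, baseline)]

print(delta_from_baseline([0.30, 0.45, 0.60, 0.44, 0.29]))
```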
- FIG. 17 illustrates a menu 130 of the commands 36. The menu 130 is stored in and retrieved from the memory (discussed and illustrated above as reference numeral 44). The menu 130 is processed for display, thus allowing the user to select the command 36 that corresponds to the just-performed gesture. Once the user confirms completion of the gesture, the user may then associate one of the commands 36 to the gesture. The menu 130 thus contains a selection of different commands 36 from which the user may choose. FIG. 17 only illustrates a few popular commands 36 for the automotive environment. In practice, though, the menu 130 may be a much fuller listing of commands for any operating environment. The user touches or selects the command 36 that she wishes to associate to the gesture (e.g., the output signal 32). Once the user makes her selection, the processor 42 adds a new entry to the database 50 of gestures. The database 50 of gestures is thus updated to associate the output signal 32 to the command 36 selected from the menu 130. The user may thus continue performing different gestures, and associating different commands, to populate the database 50 of gestures.
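- The learning-mode bookkeeping reduces to adding one association per enrolled gesture. A hedged sketch follows, with hypothetical names and a plain dictionary standing in for the database 50 of gestures.

```python
gesture_database = {}  # signature (tuple of sampled values) -> command

def learn_gesture(samples, command):
    """Learning mode: store the just-performed gesture's sampled output
    signal and associate it with the command chosen from the menu."""
    gesture_database[tuple(samples)] = command

learn_gesture([0.09, 0.26, 0.48, 0.64, 0.91], "OPEN_SUNROOF")
print(gesture_database)
```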
- The database 50 of gestures may also be prepopulated. As the gesture detector 24 may be adapted to any electronic device or environment, a manufacturer or retailer may preload the database 50 of gestures. Gestures may be predefined to invoke or call commands, functions, or any other action. The user may then learn the predefined gestures, such as by viewing training tutorials. The user may also download entries or updates to the database 50 of gestures. A server, accessible from the Internet, may store predefined associations that are downloaded and stored to the memory 44.
- FIGS. 18-20 are schematics illustrating output sampling, according to exemplary embodiments. When the gesture is performed, the gesture detector (discussed and illustrated above as reference numeral 24) generates the output signal 32. FIG. 18 illustrates a graph of the output signal 32 for an exemplary gesture having a one second (1 sec.) duration. Even though the gesture is only one second in duration, the output signal 32 may be too complex for quick and efficient processing. The processor 42, in other words, may require more time than desired to process the output signal 32.
- FIG. 19 illustrates sampling of the output signal 32. Exemplary embodiments may sample the output signal 32 to produce discrete data points 140 according to some sampling rate 142. Here the sampling rate 142 is assumed to be 0.2 seconds, which may be adequate for human gestures. So, when the user performs the gesture having the one second duration, the output signal 32 may be sampled every 0.2 seconds to yield five (5) data points 140.
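- A small sketch of that sampling step, using a toy ramp function in place of a real sensor reading; the 0.2-second period matches the example above.

```python
def sample(signal, duration_s=1.0, period_s=0.2):
    """Reduce a continuous output signal (here a callable of time) to
    discrete data points taken every `period_s` seconds."""
    n = int(duration_s / period_s)
    return [signal(period_s * (i + 1)) for i in range(n)]

# Toy ramp instead of a real sensor: yields five data points for one second.
points = sample(lambda t: 1.0 - t, duration_s=1.0, period_s=0.2)
print([round(p, 2) for p in points])
```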
- FIG. 20 again illustrates the database 50 of gestures. Because the output signal 32 may be sampled, the database 50 of gestures need only store the discrete data points 140 sampled from the output signal 32. FIG. 20 thus illustrates each sampled output signal 32 as a collection or set of the discrete data points 140. When matching a gesture, exemplary embodiments need only match the sampled values; exemplary embodiments need not match an entire, continuous capacitance, voltage, or current signal. The burden on the processor 42 is thus reduced, yielding a quicker response to the user's gesture input.
- FIG. 21 is a schematic illustrating an aftermarket gesture detector 24, according to exemplary embodiments. Here the gesture detector 24 may be a self-contained, aftermarket component that interprets gestures into their corresponding commands 36. The gesture detector 24 may thus include the processor and memory (not shown for simplicity). The gesture detector 24 is thus preferably a small component that may be purchased to add gesture detection to an existing system. FIG. 21 illustrates the gesture detector 24 as a computer-like tactile mouse, perhaps even having one or more control buttons. A surface of the computer-like mouse may have the plates 90 printed onto, or affixed to, that surface. The gesture detector 24, for example, may interface with the driver's vehicle, computer, television, or any other electronic device 70. The gesture detector 24 thus has an interface 150 for sending the determined command 36 to the existing system. The gesture detector 24, for example, may physically plug into a vehicle's on-board diagnostic (“OBD”) system 152 and send the command 36 to the vehicle's intelligence for execution. Alternatively, the gesture detector 24 may utilize the vehicle's BLUETOOTH® interface for sending the command 36 to the vehicle's intelligence for execution. Likewise, the gesture detector 24 may have the interface 150 to a computer 154, television, or any other audio-video component. The gesture detector 24 may thus be laid on a table or counter to interpret gestures into commands for an HVAC system 156 or appliance 158.
- Exemplary embodiments may also be applied to jewelry and other adornment. As wearable devices become common, jewelry will evolve as a computing platform. An article of jewelry, for example, may be instrumented with the gesture detector 24, thus enabling inputs across a surface of the jewelry. Indeed, exemplary embodiments may be applied or retrofitted to heirloom pieces and other existing jewelry, thus transforming older adornment to modern, digital usage.
- FIG. 22 is a schematic illustrating still more exemplary embodiments. FIG. 22 is a generic block diagram illustrating the gesture algorithm 46 operating within a processor-controlled device 300. That is, the gesture algorithm 46 may operate in any processor-controlled device 300. FIG. 22 illustrates the gesture algorithm 46 stored in a memory subsystem of the processor-controlled device 300. One or more processors communicate with the memory subsystem and execute the gesture algorithm 46. Because the processor-controlled device 300 illustrated in FIG. 22 is well known to those of ordinary skill in the art, no detailed explanation is needed.
- FIG. 23 depicts other possible operating environments for additional aspects of the exemplary embodiments. FIG. 23 illustrates the gesture algorithm 46 operating within various other devices 400. FIG. 23, for example, illustrates that the gesture algorithm 46 may entirely or partially operate within a set-top box (“STB”) 402, a personal/digital video recorder (PVR/DVR) 404, a Global Positioning System (GPS) device 408, an interactive television 410, a tablet computer 412, or any computer system, communications device, or processor-controlled device utilizing the processor 50 and/or a digital signal processor (DP/DSP) 414. The device 400 may also include watches, radios, vehicle electronics, clocks, printers, gateways, mobile/implantable medical devices, and other apparatuses and systems. Because the architecture and operating principles of the various devices 400 are well known, the hardware and software componentry of the various devices 400 are not further shown and described.
- Exemplary embodiments may be physically embodied on or in a computer-readable storage medium. This computer-readable medium may include CD-ROM, DVD, tape, cassette, floppy disk, memory card, and large-capacity disks. This computer-readable medium, or media, could be distributed to end-subscribers, licensees, and assignees. These types of computer-readable media, and other types not mentioned here, are considered within the scope of the exemplary embodiments. A computer program product comprises processor-executable instructions for detecting gestures, as explained above.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 16/010,855, filed Jun. 18, 2018, which is a continuation of U.S. patent application Ser. No. 14/078,982, filed Nov. 13, 2013 (now U.S. Pat. No. 10,025,431). All sections of the aforementioned application(s) and/or patent(s) are incorporated herein by reference in their entirety.
- A portion of the disclosure of this patent document and its attachments contain material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyrights whatsoever.
- The exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings. The exemplary embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the exemplary embodiments to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).
- Thus, for example, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating the exemplary embodiments. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named manufacturer.
- As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes,” “comprises,” “including,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, “connected” or “coupled” as used herein may include wirelessly connected or coupled. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
- It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first device could be termed a second device, and, similarly, a second device could be termed a first device without departing from the teachings of the disclosure.
-
FIGS. 1-2 are simplified schematics illustrating an environment in which exemplary embodiments may be implemented.FIG. 1 illustrates anautomotive interior 20 having acenter console 22. Theautomotive interior 20 has many buttons, switches, and other conventional controls for driving a vehicle, so the details need not be explained. However,FIG. 1 also illustrates agesture detector 24. Thegesture detector 24 is illustrated as being located on thecenter console 22, but thegesture detector 24 may be placed at any location within the automotive interior 20 (as later paragraphs will explain). Wherever thegesture detector 24 is located, thegesture detector 24 senses hand gestures that are performed to control the vehicle.FIG. 2 , for example, illustrates a driver'shuman hand 26 performing ahand gesture 28 in a vicinity of thegesture detector 24. Thegesture detector 24 is enlarged for clarity. As the driver's hand performs thegesture 28, thegesture detector 24 senses acapacitance 30 between the driver'shand 26 and thegesture detector 24. Thegesture detector 24 then generates anoutput signal 32 that is proportional to thecapacitance 30. Theoutput signal 32 is analyzed (such as by a controller 34) to execute acommand 36. The driver'shand 26, for example, may perform thehand gesture 28 to lock the car doors. Another gesture may open a sunroof. Still another gesture may turn on the headlights. Even more gestures may select a radio station, answer a hands-free call, or apply the brakes. Whatever thegesture 28, exemplary embodiments interpret thegesture 28 and execute thecorresponding command 36. Indeed, the user may associate any gesture to any action, as later paragraphs will explain. - Exemplary embodiments thus greatly improve gesture detection. Conventional gesture detection utilizes infrared vision systems and/or environmental markers (such as motion capture suits). Infrared detection, though, is poor in bright environments, where ambient light typically washes out the infrared spectrum. Indeed, automotive interiors often have large solar glass expanses that make infrared detection infeasible. Exemplary embodiments, instead, detect gestures using the
capacitance 30. Thegesture detector 24 thus does not rely on the infrared spectrum, so thegesture detector 24 recognizes gestures even in external environments where current sensor technologies fail. Thegesture detector 24 may thus be dispersed throughout theautomotive interior 20 for detection and interpretation of driver and passenger gestures. - Exemplary embodiments thus greatly increase safety. Conventional automotive interiors have knobs, buttons, and stalks that must be physically manipulated to control a vehicle. Exemplary embodiments, instead, recognize gesture inputs that do not require physical contact with automotive controls. The driver's hand and/or fingers may make movements without removing the driver's eye from the road. Exemplary embodiments recognize the
gesture 28 and safely execute thecorresponding command 36. Thegesture detector 24 recognizes simple snaps and swipes, more complex geometric shapes, and even alphanumeric characters. Whatever thegesture 28, exemplary embodiments allow safe and complete control of the automotive environment. - The
gesture 28 may be touch less. Conventional gesture detectors require contact between thehand 26 and some gesture surface. Indeed, many vehicles have conventional touch screens that allow the driver's fingers to scroll or swipe among selections of items and tap to select.FIGS. 1 and 2 , though, require no contact between the driver'shand 26 or fingers and thegesture detector 24. Exemplary embodiments, instead, utilize contactless, touch less gestures to execute thecommand 36. That is, the driver'shand 26 performs any two- or three-dimensional gesture 28 that need not contact some touch-sensing surface. As the driver'shand 26 performs thegesture 28, thecapacitance 30 between the driver'shand 26 and thegesture detector 24 changes. Exemplary embodiments use thecapacitance 30 to determine which command 36 is executed. So, again, the driver need not be distracted when trying to find and touch thegesture detector 24. The driver need only perform thegesture 28 to execute thecorresponding command 36. -
FIG. 3 is a more detailed block diagram illustrating the operating environment, according to exemplary embodiments.FIG. 3 illustrates thegesture detector 24 interfacing with thecontroller 30. Thegesture detector 24 senses thecapacitance 30 and generates theoutput signal 32. If theoutput signal 32 has an analog form,digital conversion 40 may be required. When thecontroller 30 receives theoutput signal 32, thecontroller 30 interprets theoutput signal 32. Thecontroller 30 has aprocessor 42 and amemory 44. Theprocessor 42 may be a microprocessor (“μP”), an application specific integrated circuit (ASIC), or other component that executes agesture algorithm 46 stored in thememory 44. Thegesture algorithm 46 includes instructions, code, operations, and/or programs that cause theprocessor 42 to interpret any gesture input sensed by thegesture detector 24. When the gesture (illustrated asreference numeral 28 inFIG. 2 ) is performed, thegesture detector 24 measures thecapacitance 30 and generates theoutput signal 32. Thegesture algorithm 46 instructs theprocessor 42 to determine thecorresponding command 36. - The
processor 42 consults adatabase 50 of gestures. When theoutput signal 32 is received, theprocessor 42 queries thedatabase 50 of gestures.FIG. 3 illustrates thedatabase 50 of gestures as a table 52 that is locally stored in thememory 44 of thecontroller 30. Thedatabase 50 of gestures, however, may be remotely stored, queried, or retrieved from any location, such as in a controller area network (or “CAN”) or other communications network. Regardless, thedatabase 50 of gestures maps, associates, or relatesdifferent output signals 32 to their corresponding commands 36. Theprocessor 42, for example, compares theoutput signal 32 to the entries stored in thedatabase 50 of gestures. Should a match be found, theprocessor 42 retrieves the correspondingcommand 36. Theprocessor 42 then executes thecommand 36 in response to theoutput signal 32, which is generated by thegesture detector 24 in response to thegesture 28. -
FIGS. 4-5 are more schematics illustrating thegesture detector 24, according to exemplary embodiments.FIG. 4 illustrates thegesture detector 24 located on or in aninstrument panel 60, whileFIG. 5 illustrates thegesture detector 24 located on or in an interior door panel 62. Indeed, thegesture detector 24 may be located in front seats, back seats, or any other location in which gesture detection is desired. -
FIGS. 6-7 are more simplified schematics illustrating another exemplary operating environment. Here thegesture detector 24 detects gestures performed in the vicinity of anyelectronic device 70. Theelectronic device 70, for simplicity, is illustrated as asmartphone 72. Theelectronic device 70, however, may be any processor-controlled device, as later paragraphs will explain. Regardless, thesmartphone 72 may also have theprocessor 42 executing thegesture algorithm 46 stored in thememory 44. When a user's hand performs the gesture (illustrated, respectively, asreference numerals FIG. 2 ), thegesture detector 24 senses thecapacitance 30 and generates theoutput signal 32.FIG. 6 illustrates thegesture detector 24 on afront face 74 of thesmartphone 72, whileFIG. 7 illustrates thegesture detector 24 on abackside 76 of thesmartphone 72. Wherever thegesture detector 24 is located, theprocessor 42 queries for and retrieves the matchingcommand 36. Theprocessor 42 then executes thecommand 36 in response to theoutput signal 32. So, even though thesmartphone 72 may have a touch-sensing screen 78, thegesture detector 24 senses touch less gestures performed by the user'shand 26. The user may thus perform touch less gestures to access web pages, answer calls, compose texts, and any other commands or actions. - Exemplary embodiments may thus be deployed throughout homes and businesses. The
gesture detector 24 may be installed within cars where ambient, dynamic lighting conditions degrade conventional optical recognition techniques. Thegesture detector 24, however, may also be installed in communications devices, toys, fixtures, and any otherelectronic device 70. Because thegesture detector 24 does not rely on light, thegesture detector 24 is thus unaffected by lighting conditions. Thegesture detector 24 may thus be deployed throughout homes and businesses to detect and interpret our gestures. Thegesture detector 24 may even be combined with or augmented by voice recognition techniques to reduce, or even eliminate, manual activation of controls. -
FIGS. 8-9 are more detailed illustrations of thegesture detector 24, according to exemplary embodiments.FIG. 8 illustrates thegesture detector 24 having an electricallyconductive plate 90 of area S (illustrated as reference numeral 92). As the user'shand 26 performs thegesture 28, the user'shand 26 is separated by a distance d (illustrated as reference numeral 94) from theplate 90. As the user'shand 26 performs thecontactless gesture 28, the movement of the user'shand 26 causeselectrical charges 96 to distribute. The electrical charges are grossly enlarged for clarity. Because human skin and tissue are electrically conductive, theelectrical charges 96 distribute on the user's skin. Theelectrical charges 96 also distribute on a surface of theplate 90. For simplicity, only a fewelectrical charges 96 are illustrated. In practice, though, theelectrical charges 96 will distribute all over the user'shand 26, while the electrical charges will distribute all over theplate 90.FIG. 8 illustrates theelectrical charges 96 on the user'shand 26 as negatively charged, while theelectrical charges 96 on theplate 90 are positively charged. The polarity of theelectrical charges 96, however, may be reversed. Regardless, if a voltage difference V (illustrated as reference numeral 98) exists between the user'shand 26 and theplate 90, then an electric field E (illustrated as reference numeral 100) is generated. -
FIG. 9 illustrates a simplified schematic. The user'shand 26 is separated by the distance d from theconductive plate 90. Because the user'shand 26 is electrically conductive, this gesture arrangement may be simplified and electrically modeled as a parallel plate capacitor. The voltage difference V is -
- where Q is the charge and ϵ is the permittivity of the air between the user's
hand 26 and theplate 90. Knowing the relationship for the capacitance C as -
- the capacitance C may be rewritten as
-
- The reader may notice that the capacitance C (illustrated as reference numeral 30) has no dependence on the voltage difference V, nor is the capacitance C dependent on the electrical charge Q (illustrated as reference numeral 96). The reader may also notice that the capacitance C is inversely proportional to the separation distance d. As the user's
hand 26 approaches theplate 90, the separation distance d decreases, causing the capacitance C to increase. Conversely, as the user'shand 26 moves away from theplate 90, the separation distance d increases, causing the capacitance C to decrease. - The
output signal 32 also changes. As the user'shand 26 vertically moves with respect to theplate 90, the capacitance C changes. Once theelectrical charges 96 develop, the electric field E (illustrated asreference numeral 100 inFIG. 8 ) charges thegesture detector 24 as a capacitor. Thegesture detector 24 may then be discharged, through aresistor 102, according to the RC time constant τ=RC, where R is the resistance (in Ohms) of theresistor 102 and C is thecapacitance 30. Theoutput signal 32 will thus decay with time according to -
V(t)=V o(e −t/τ). - Because the capacitance C changes as the user's
hand 26 performs the gesture, the time constant τ=RC will also change, causing theoutput signal 32 to change with the same gesture. So, as the user'shand 26 performs thegesture 28, the capacitance C changes and theoutput signal 32 also changes. If theoutput signal 32 is analog, theoutput signal 32 may be converted by the analog-to-digital converter 40 before being interpreted by theprocessor 42. Theprocessor 42 receives theoutput signal 32, queries thedatabase 50 of gestures, and executes the correspondingcommand 36, as earlier paragraphs explained. -
FIGS. 10-11 are more detailed schematics of thegesture detector 24, according to exemplary embodiments. Here thegesture detector 24 may havemultiple plates 90 for sensing difference capacitances during performance of thegesture 28. AsFIG. 10 illustrates, thegesture detector 24 may have a co-planar, linear arrangement ofindividual plates 90. As the user'shand 26 performs thegesture 28, the capacitance C (illustrated as reference numeral 30) changes. Eachplate 90 may individually generate itscorresponding output signal 32 in response to the capacitance C. Multiple output signals 32 may be individually received by theprocessor 42 for interpretation. The multiple output signals 32, however, may be combined in any way. The multiple output signals 32, for example, may be summed to yield a summed output signal. Themultiple output signals 32 may be multiplexed according to time to yield a multiplexed output signal. Themultiple output signals 32 may be averaged according to time to yield an averaged output signal. However themultiple output signals 32 are combined, theprocessor 42 interprets the output signals 32 and executes the correspondingcommand 36. -
FIG. 11 illustrates anarray 110 of theplates 90. Here thegesture detector 24 may have themultiple plates 90 arranged as a co-planar grid of rows and columns. As the user'shand 26 performs thegesture 28, the capacitance C (illustrated as reference numeral 30) changes. Theprocessor 42 may interpret eachindividual output signal 32 or any combination of the multiple output signals 32. Theprocessor 42 then executes the correspondingcommand 36. -
FIGS. 12-13 are diagrams illustrating a curvilinear arrangement of thegesture detector 24, according to exemplary embodiments. Here thegesture detector 24 may have themultiple plates 90, but theplates 90 need not lie in the same plane. Some of theplates 90 may lie in the same plane, whileother plates 90 may be arranged or oriented in one or more different planes. Recalling the automotive interior illustrated inFIGS. 1 and 4-5 , theplates 90 may be installed on curved or curvilinear surfaces of thecenter console 22, theinstrument panel 60, and the door panel 62. Likewise, theplates 90 may be arranged on the sleek, curved surfaces of theelectronic device 70 illustrated inFIGS. 5-6 . Indeed, theplates 90 may have many different orientations to each other.FIG. 13 , in particular, illustrates aflexible substrate 112 on which theplates 90 may be printed, using conductive ink, in the grid orarray 110. WhileFIGS. 12-13 only illustrate a few orseveral plates 90, in practice thearray 110 may contain hundreds, perhaps thousands or millions, ofplates 90 using semiconductor micro or nanotechnology manufacturing. The convex, curvilinear arrangement of theplates 90 increases sensitivity of thegesture detector 24 to thegesture 28. As the user'shand 26 performs thecontactless gesture 28, the electric field E (illustrated as reference numeral 100) is everywhere perpendicular to eachplate 90. As themultiple plates 90 may be curvilinearly arranged, eachdifferent plate 90 produces adifferent output signal 32. Thedifferent output signals 32 thus allow exemplary embodiments to detect proximity to the user'shand 26 using many different vector representations of many different electric fields E. Conventional two-dimensional planar arrangements yield an identical vector representation, providing little data for differentiating the user's different gestures 28. The curvilinear, three-dimensional arrangement, in contradistinction, generates manydifferent output signals 32, albeit normal to eachplate 90, that provides much more data. Indeed, exemplary embodiments provide volumetric data describing the user'shand 26 performing eachdifferent gesture 28, thus increasing sensitivity of different gestures. Thegesture detector 24 may thus be any arrangement of three-dimensional capacitive plates 90 for sensing thecapacitance 30 during thegesture 28. Themultiple plates 90, however, may also be curvilinearly concave in arrangement, depending on the atheistic design of the underlying interior (such as the center console 22). -
FIG. 14 is another schematic illustrating thegesture detector 24, according to exemplary embodiments. Here thegesture detector 24 has three (3)orthogonal plates 90. As the user'shand 26 performs thegesture 28, eachplate 90 measures itscorresponding capacitance 30. Thegesture detector 24 may thus sense thecapacitance 30 in three dimensions, plus time. Theprocessor 42 may interpret eachindividual output signal 32 or any combination of the multiple output signals 32. Theprocessor 42 then executes the correspondingcommand 36. -
FIGS. 15-17 are schematics illustrating a learning mode 120 of operation, according to exemplary embodiments. Before the processor 42 can interpret the user's gestures, the processor 42 may be taught to recognize the gestures. FIG. 15 thus illustrates a baseline capacitance CBase (again illustrated as the reference numeral 30). The gesture detector 24 may first measure the baseline capacitance 30 of the ambient environment. Even though the user's hand (illustrated as reference numeral 26 in FIGS. 12-14) may not be near the gesture detector 24, ambient electrical charges 96 may still cause the gesture detector 24 to sense some ambient, baseline capacitance 30. Stray electrical charges in the air and/or on the surface of the plate 90, for example, may create the baseline capacitance 30. The processor 42 may thus receive the output signal 32 generated from the ambient conditions.
FIG. 16 illustrates a graphical user interface 122. The graphical user interface 122 may be displayed on any display device (such as in the center console 22 of the automotive environment 20, illustrated in FIG. 1). The graphical user interface 122, of course, may be displayed on any other apparatus, such as the mobile smartphone (illustrated as reference numeral 72 in FIG. 6). Regardless, here the user trains the processor 42 to recognize particular touchless gestures performed above the gesture detector 24. When the user wishes to store a gesture for later recognition, the user may first select the learning mode 120 of operation. As FIG. 16 illustrates, the graphical user interface 122 may visually prompt 124 the user to perform a gesture above the gesture detector 24. The user then performs the desired two-dimensional or even three-dimensional movement. As the gesture is performed, the gesture detector 24 senses the capacitance 30 and generates the output signal 32. The gesture algorithm 46 causes the processor 42 to read and store the output signal 32 in the memory 44. Once the gesture is complete, the user selects the completion icon 124.

Baseline comparisons may then be made. As the user performs the gesture, exemplary embodiments may compare the baseline capacitance CBase to the
output signal 32. That is, exemplary embodiments may compare the output signal 32 to the baseline measurements of the ambient environment. Any change may then be used to retrieve the corresponding command 36.
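One plausible reading of this baseline comparison, sketched below: average a few idle readings to estimate CBase, then subtract it from each gesture sample so that only the change caused by the hand 26 remains. All names are illustrative assumptions, not items from the specification.

```python
# Sketch only: measure the ambient baseline first, then keep only the
# deviation it causes in each gesture sample.

def measure_baseline(samples):
    """Average several idle readings to estimate the ambient baseline."""
    return sum(samples) / len(samples)

def remove_baseline(gesture_samples, baseline):
    """Keep only the change relative to ambient conditions."""
    return [s - baseline for s in gesture_samples]

idle = [0.101, 0.099, 0.100, 0.100]        # readings with no hand present
baseline = measure_baseline(idle)          # approximately 0.100
gesture = [0.100, 0.130, 0.180, 0.140, 0.105]
print(remove_baseline(gesture, baseline))  # deviations used for matching
```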
FIG. 17 illustrates a menu 130 of the commands 36. The menu 130 is stored in and retrieved from the memory (discussed and illustrated above as reference numeral 44). The menu 130 is processed for display, thus allowing the user to select the command 36 that corresponds to the just-performed gesture. Once the user confirms completion of the gesture, the user may then associate one of the commands 36 with the gesture. The menu 130 thus contains a selection of different commands 36 from which the user may choose. FIG. 17 only illustrates a few popular commands 36 for the automotive environment. In practice, though, the menu 130 may be a much fuller listing of commands for any operating environment. The user touches or selects the command 36 that she wishes to associate with the gesture (e.g., the output signal 32). Once the user makes her selection, the processor 42 adds a new entry to the database 50 of gestures. The database 50 of gestures is thus updated to associate the output signal 32 with the command 36 selected from the menu 130. The user may thus continue performing different gestures, and associating different commands, to populate the database 50 of gestures.
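A minimal sketch of this association step follows, with a plain dictionary standing in for the database 50 of gestures; the example command strings are assumptions, not entries from FIG. 17.

```python
# Sketch only: each learned gesture's sampled output signal is stored
# alongside the command the user chose from the menu 130.

gesture_database = {}  # maps a tuple of sampled data points -> command

def associate(sampled_points, command):
    """Add a new entry associating an output signal with a command."""
    gesture_database[tuple(sampled_points)] = command

associate([0.00, 0.03, 0.08, 0.04, 0.01], "increase cabin temperature")
associate([0.00, 0.07, 0.02, 0.06, 0.00], "next radio preset")
```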
The database 50 of gestures may also be prepopulated. As the gesture detector 24 may be adapted to any electronic device or environment, a manufacturer or retailer may preload the database 50 of gestures. Gestures may be predefined to invoke or call commands, functions, or any other action. The user may then learn the predefined gestures, such as by viewing training tutorials. The user may also download entries or updates to the database 50 of gestures. A server, accessible from the Internet, may store predefined associations that are downloaded and stored to the memory 44.
FIGS. 18-20 are schematics illustrating output sampling, according to exemplary embodiments. Whatever gesture the user performs, the gesture detector (discussed and illustrated above as reference numeral 24) generates the output signal 32. The output signal 32 may be the capacitance 30, the time constant τ = RC, a decaying voltage measurement, or a decaying current measurement, depending on the circuit design. Regardless, the output signal 32 may be too complex for fast processing. For example, FIG. 18 illustrates a graph of the output signal 32 for an exemplary gesture having a one second (1 sec.) duration. Even though the gesture is only one second in duration, the output signal 32 may be too complex for quick and efficient processing. The processor 42, in other words, may require more time than desired to process the output signal 32.
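Where the output signal 32 is a decaying voltage, the capacitance follows from the familiar RC relation V(t) = V0·e^(−t/(RC)), so τ = RC and C = τ/R. The sketch below assumes a single decay observation through a known resistance R; the helper name is illustrative.

```python
# Sketch only: estimate C from one exponential-decay observation,
# assuming the plate discharges through a known resistance R so that
# v_t = v0 * exp(-t / (R * C)).
import math

def capacitance_from_decay(v0, v_t, t, r):
    """Solve v_t = v0*exp(-t/(r*C)) for C via the time constant tau."""
    tau = -t / math.log(v_t / v0)   # tau = R*C
    return tau / r

# Example: 5.0 V decays to 1.84 V (about v0/e) in 1 ms through 10 kOhm,
# so tau is about 1 ms and C is about 100 nF.
print(capacitance_from_decay(5.0, 1.84, 1e-3, 10e3))
```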
FIG. 19 illustrates sampling of the output signal 32. Exemplary embodiments may sample the output signal 32 to produce discrete data points 140 according to some sampling rate 142. For mathematical simplicity, the sampling rate 142 is assumed to be one sample every 0.2 seconds, which may be adequate for human gestures. So, when the user performs the gesture having the one second duration, the output signal 32 may be sampled every 0.2 seconds to yield five (5) data points 140.
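The sampling step itself is straightforward; the sketch below reduces a continuous signal, modeled as a function of time, to the five discrete data points 140 of the example (one sample every 0.2 seconds over one second). The stand-in waveform is an assumption for illustration.

```python
# Sketch only: reduce a continuous output signal to discrete data points 140.

def sample(signal, duration_s, interval_s):
    """Sample signal(t) every interval_s seconds over duration_s."""
    n = int(duration_s / interval_s)
    return [signal(interval_s * (k + 1)) for k in range(n)]

# A one-second gesture sampled every 0.2 s yields five data points.
gesture_signal = lambda t: t * (1.0 - t)      # stand-in waveform
points = sample(gesture_signal, 1.0, 0.2)
print(len(points), points)                    # 5 data points
```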
FIG. 20 again illustrates the database 50 of gestures. Because the output signal 32 may be sampled, the database 50 of gestures need only store the discrete data points 140 sampled from the output signal 32. FIG. 20 thus illustrates each sampled output signal 32 as a collection or set of the discrete data points 140. When the database 50 of gestures is queried, exemplary embodiments need only match the sampled values. Exemplary embodiments need not match an entire, continuous capacitance, voltage, or current signal. The burden on the processor 42 is thus reduced, yielding a quicker response to the user's gesture input.
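One plausible matching strategy, sketched below, compares the live data points 140 against each stored set by sum of squared differences and returns the best command 36 within a tolerance; the tolerance value and distance metric are assumptions, as the specification does not prescribe a particular matching algorithm.

```python
# Sketch only: query the database of sampled data points for the
# closest stored gesture and return its associated command.

def match_gesture(points, database, tolerance=0.01):
    """Return the command whose stored points best match, or None."""
    best_command, best_distance = None, float("inf")
    for stored_points, command in database.items():
        distance = sum((a - b) ** 2 for a, b in zip(stored_points, points))
        if distance < best_distance:
            best_command, best_distance = command, distance
    return best_command if best_distance <= tolerance else None

database = {
    (0.00, 0.03, 0.08, 0.04, 0.01): "increase cabin temperature",
    (0.00, 0.07, 0.02, 0.06, 0.00): "next radio preset",
}
print(match_gesture([0.00, 0.04, 0.08, 0.04, 0.01], database))
```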
FIG. 21 is a schematic illustrating an aftermarket gesture detector 24, according to exemplary embodiments. Here the gesture detector 24 may be a self-contained, aftermarket component that interprets gestures into their corresponding commands 36. The gesture detector 24 may thus include the processor and memory (not shown for simplicity). The gesture detector 24 is thus preferably a small component that may be purchased to add gesture detection to an existing system. FIG. 21, for example, illustrates the gesture detector 24 as a computer-like tactile mouse, even one having control buttons. A surface of the computer-like mouse may have the plates 90 printed onto, or affixed to, the surface. The gesture detector 24, for example, may interface with the driver's vehicle, computer, television, or any other electronic device 70. The gesture detector 24 thus has an interface 150 for sending the determined command 36 to the existing system. The gesture detector 24, for example, may physically plug into a vehicle's on-board diagnostic ("OBD") system 152 and send the command 36 to the vehicle's intelligence for execution. Alternatively, the gesture detector 24 may utilize the vehicle's BLUETOOTH® interface for sending the command 36 to the vehicle's intelligence for execution. Similarly, the gesture detector 24 may have the interface 150 to a computer 154, television, or any other audio-video component. The gesture detector 24 may thus be laid on a table or counter to interpret gestures into commands for an HVAC system 156 or appliance 158.
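The interface 150 might be abstracted as below, so the same determined command 36 can be sent over whichever transport (OBD plug or BLUETOOTH® link) is fitted; both transport classes are stand-ins for illustration and imply no real OBD or Bluetooth library API.

```python
# Sketch only: one abstraction for the interface 150, with stand-in
# transports. No real OBD or Bluetooth API is implied.

class CommandInterface:
    def send(self, command: str) -> None:
        raise NotImplementedError

class ObdInterface(CommandInterface):
    def send(self, command: str) -> None:
        print(f"[OBD] forwarding to vehicle: {command}")  # stand-in for the OBD link

class BluetoothInterface(CommandInterface):
    def send(self, command: str) -> None:
        print(f"[BT] forwarding to vehicle: {command}")   # stand-in for the wireless link

def execute(interface: CommandInterface, command: str) -> None:
    """Send the determined command over whichever interface is fitted."""
    interface.send(command)

execute(ObdInterface(), "increase cabin temperature")
```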
Exemplary embodiments may also be applied to jewelry and other adornment. As wearable devices become common, jewelry will evolve as a computing platform. An article of jewelry, for example, may be instrumented with the gesture detector 24, thus enabling inputs across a surface of the jewelry. Moreover, as the gesture detector 24 may be small and adhesively attached, exemplary embodiments may be applied or retrofitted to heirloom pieces and other existing jewelry, thus transforming older adornment to modern, digital usage.
FIG. 22 is a schematic illustrating still more exemplary embodiments. FIG. 22 is a generic block diagram illustrating the gesture algorithm 46 operating within a processor-controlled device 300. As the above paragraphs explained, the gesture algorithm 46 may operate in any processor-controlled device 300. FIG. 22, then, illustrates the gesture algorithm 46 stored in a memory subsystem of the processor-controlled device 300. One or more processors communicate with the memory subsystem and execute the gesture algorithm 46. Because the processor-controlled device 300 illustrated in FIG. 22 is well known to those of ordinary skill in the art, no detailed explanation is needed.
FIG. 23 depicts other possible operating environments for additional aspects of the exemplary embodiments. FIG. 23 illustrates the gesture algorithm 46 operating within various other devices 400. FIG. 23, for example, illustrates that the gesture algorithm 46 may entirely or partially operate within a set-top box ("STB") 402, a personal/digital video recorder (PVR/DVR) 404, a Global Positioning System (GPS) device 408, an interactive television 410, a tablet computer 412, or any computer system, communications device, or processor-controlled device utilizing the processor 50 and/or a digital signal processor (DP/DSP) 414. The device 400 may also include watches, radios, vehicle electronics, clocks, printers, gateways, mobile/implantable medical devices, and other apparatuses and systems. Because the architecture and operating principles of the various devices 400 are well known, the hardware and software componentry of the various devices 400 is not further shown and described.

Exemplary embodiments may be physically embodied on or in a computer-readable storage medium. This computer-readable medium may include CD-ROM, DVD, tape, cassette, floppy disk, memory card, and large-capacity disks. This computer-readable medium, or media, could be distributed to end-subscribers, licensees, and assignees. These types of computer-readable media, and other types not mentioned here, are considered within the scope of the exemplary embodiments. A computer program product comprises processor-executable instructions for detecting gestures, as explained above.
While the exemplary embodiments have been described with respect to various features, aspects, and embodiments, those of ordinary skill in the art will recognize that the exemplary embodiments are not so limited. Other variations, modifications, and alternative embodiments may be made without departing from the spirit and scope of the exemplary embodiments.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/832,147 US20220300107A1 (en) | 2013-11-13 | 2022-06-03 | Gesture Detection |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/078,982 US10025431B2 (en) | 2013-11-13 | 2013-11-13 | Gesture detection |
US16/010,855 US11379070B2 (en) | 2013-11-13 | 2018-06-18 | Gesture detection |
US17/832,147 US20220300107A1 (en) | 2013-11-13 | 2022-06-03 | Gesture Detection |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/010,855 Continuation US11379070B2 (en) | 2013-11-13 | 2018-06-18 | Gesture detection |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220300107A1 (en) | 2022-09-22 |
Family
ID=53043389
Family Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/078,982 Active 2034-08-30 US10025431B2 (en) | 2013-11-13 | 2013-11-13 | Gesture detection |
US16/010,855 Active US11379070B2 (en) | 2013-11-13 | 2018-06-18 | Gesture detection |
US17/832,147 Abandoned US20220300107A1 (en) | 2013-11-13 | 2022-06-03 | Gesture Detection |
Family Applications Before (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/078,982 Active 2034-08-30 US10025431B2 (en) | 2013-11-13 | 2013-11-13 | Gesture detection |
US16/010,855 Active US11379070B2 (en) | 2013-11-13 | 2018-06-18 | Gesture detection |
Country Status (1)
Country | Link |
---|---|
US (3) | US10025431B2 (en) |
Also Published As
Publication number | Publication date |
---|---|
US10025431B2 (en) | 2018-07-17 |
US11379070B2 (en) | 2022-07-05 |
US20150130743A1 (en) | 2015-05-14 |
US20180307347A1 (en) | 2018-10-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: AT&T INTELLECTUAL PROPERTY I, L.P., GEORGIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: LI, KEVIN; REEL/FRAME: 060180/0022. Effective date: 20131112 |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |