GB2423808A - Gesture controlled system for controlling vehicle accessories - Google Patents
- Publication number
- GB2423808A (application GB0504485A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- control system
- gesture
- vehicle
- hand
- electrode
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B32—LAYERED PRODUCTS
- B32B—LAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
- B32B17/00—Layered products essentially comprising sheet glass, or glass, slag, or like fibres
- B32B17/06—Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material
- B32B17/10—Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material of synthetic resin
- B32B17/10005—Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material of synthetic resin laminated safety glass or glazing
- B32B17/10009—Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material of synthetic resin laminated safety glass or glazing characterized by the number, the constitution or treatment of glass sheets
- B32B17/10036—Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material of synthetic resin laminated safety glass or glazing characterized by the number, the constitution or treatment of glass sheets comprising two outer glass sheets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A control system 10, for controlling remotely one or more devices, comprises a sensitised area 11 for detecting the proximity of a hand at a plurality of points over the area, a gesture library memory, a gesture recognition module for recognising and identifying the hand's motions among gestures stored in the gesture library, and a controller 14 to control the operation of one of the devices according to the hand motions of an operator applied anywhere over the sensitised area 11. The sensitised area 11 may comprise an array of capacitive electrodes 12a-n arranged on a dielectric component 13, with each electrode 12a-n connected to a threshold circuit 21a-n so as to independently adjust the sensitivity of each electrode 12a-n. The dielectric component 13 may be a laminated glazing panel and may be a windscreen. There may be provided a transmitting electrode (40, figure 4) placed in the vicinity of a steering wheel. There may be provided a head-up display or display screen 15 to show the gesture to control each device. There may be provided a speaker 16 to provide audible feedback. The device may be a car air conditioner, radio or navigation system.
Description
Control system for controlling one or more devices

The present invention relates to a control system for controlling remotely one or more devices, in particular a control system for controlling vehicle accessories such as an air conditioner, lights, a radio or a navigation system.
To operate such comfort or entertainment features, the driver is required to take his eyes off the road in order to physically locate the control, generally taking the form of a switch, before moving one hand off the steering wheel to grasp and move the switch in one direction. This operating procedure causes driver distraction and can lead to traffic accidents.
It is an object of this invention to provide a control system suitable for use in a motor vehicle that achieves a control function for operating one or more vehicle accessories in a simple and easy way, overcoming the aforementioned disadvantage.
According to a first aspect of the invention there is provided a control system for controlling remotely one or more devices, comprising a sensitised area for detecting the proximity of a hand at a plurality of points over this area, a gesture library memory, a gesture recognition module for recognising and identifying the hand's motions among gestures stored in the gesture library, and a controller to control the operation of one of the devices according to the hand motions of an operator applied anywhere over the sensitised area.
Preferably, the sensitised area includes an array of capacitive electrodes arranged on a dielectric component.
Also preferably, each electrode is connected to a threshold circuit so as to independently adjust the sensitivity of each electrode.
The dielectric component may be a laminated glazing panel. In such case the electrodes are preferably embedded within the laminated glazing panel.
In a preferred embodiment the control system further includes a transmitting electrode so that the user can switch on or off the gesture recognition. This transmitting electrode may be placed in a vicinity of a steering wheel.
The control system may include a head-up display system and/or a display screen so as to show the gesture to control each vehicle device available. It may also include a loudspeaker so as to give audible feedback to the operator.
According to a second aspect of the invention there is provided a method for operating one or more devices by a control system having a sensitised area over which an operator can move his hand, the method comprising the steps of:
- a) determining a gesture according to the hand motion of the operator by detecting the proximity of a hand at each of a plurality of points in the sensitised area,
- b) comparing this gesture against other gestures stored in a library of gestures, where each gesture in the library is associated with an output command of a device, and
- c) activating the output command associated with the gesture matched in the library.
In a preferred embodiment, step a) includes the steps of:
- d) monitoring a change in capacitance over time at the plurality of points,
- e) memorising co-ordinates for each point whose capacitance has changed, and
- f) in step b), determining whether the co-ordinates memorised in step e) represent a recognised gesture.
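Steps d) to f) can be sketched as follows. This is a minimal illustration, not taken from the patent: the function names, the direction-based matching rule and the example commands in the library are all assumptions made for clarity.

```python
# Illustrative sketch (assumed, not from the patent): matching a memorised
# trace of electrode co-ordinates against a small gesture library by the
# dominant direction of the hand's motion.

def dominant_direction(trace):
    """Classify a list of (row, col) co-ordinates by its net displacement."""
    if len(trace) < 2:
        return None
    dr = trace[-1][0] - trace[0][0]  # net row (vertical) displacement
    dc = trace[-1][1] - trace[0][1]  # net column (horizontal) displacement
    if abs(dc) >= abs(dr):
        return "right" if dc > 0 else "left"
    return "down" if dr > 0 else "up"

# Each gesture in the library is associated with an output command (step b);
# the command names here are hypothetical.
GESTURE_LIBRARY = {
    "left": "SELECT_NAVIGATION",   # e.g. leftward motion -> navigation system
    "right": "SELECT_RADIO",
    "up": "SELECT_AIR_CONDITIONER",
}

def recognise(trace):
    """Return the output command for a matched gesture, or None (step c)."""
    return GESTURE_LIBRARY.get(dominant_direction(trace))
```

A leftward sweep such as `[(3, 5), (3, 3), (3, 1)]` would match the "left" entry, whereas a single point yields no gesture.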
According to a third aspect of the invention there is provided a vehicle incorporating a control system according to the first aspect of the invention.
The invention will now be described by way of example with reference to the accompanying drawings, of which:
- Fig. 1 is a schematic diagram of a control system according to an embodiment of the present invention for implementation inside a motor vehicle to allow a user to control vehicle accessories, for example a radio, an air conditioner and a navigation system,
- Fig. 2 is a simplified schematic view of a portion of the control system of Fig. 1 illustrating its use by an operator,
- Fig. 3 is a flow chart illustrating operation of the control system shown in block form in Fig. 1, and
- Fig. 4 is a schematic cross-section of a windscreen of a motor vehicle in which the control system shown in Fig. 1 is implemented according to a second embodiment.
Referring to Fig. 1, there is shown a control system 10 for controlling vehicle accessories by the hand motions of an operator. The control system 10 includes a sensitised area 11. The sensitised area 11 comprises a plurality of electrodes 12a-n arranged in an array, in this example 7x7. Each electrode 12a-n is formed from conductive material, such as copper traces or conductive ink, coated on a dielectric component 13, in this case a panel located on a dashboard between the driver seat and the passenger seat. Each electrode 12a-n is connected to an Electronic Control Unit (ECU) 14. A visual display unit 15, a loudspeaker 16 and a controller 17 are also connected to the ECU 14. The controller 17 is connected to a bus line 18, e.g. a CAN bus, which allows it to operate and communicate with each vehicle accessory 19 available on the CAN bus, in this case a radio, a navigation system and an air conditioner.
A detection circuit 20 is disposed in the ECU 14. In a conventional manner, the detection circuit 20 includes, for each electrode 12a-n, a threshold circuit 21a-n to control the sensitivity of that electrode. The sensitivity of each electrode is defined as the threshold of the signal from the electrode which must be taken into account as an indication of a hand gesture, in other words the minimal distance between a hand and the electrode that must be considered. The detection circuit 20 further includes a multiplexer and amplifier (not shown) which enable the detection circuit to sequentially read values from the electrodes 12a-n.
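The sequential, per-electrode thresholded read can be sketched as below. This is an assumed illustration, not the patent's implementation: `read_raw`, the array dimensions being a plain nested loop, and the list of touched co-ordinates are all hypothetical choices.

```python
# Illustrative sketch (assumed): sequentially reading a 7x7 electrode array
# through a multiplexer, applying an independent threshold per electrode as
# the threshold circuits 21a-n do.

ROWS = COLS = 7

def scan(read_raw, thresholds):
    """Return co-ordinates of electrodes whose signal exceeds their
    individual threshold, i.e. electrodes near the operator's hand.

    read_raw(r, c) is a hypothetical stand-in for the multiplexed,
    amplified reading of one electrode; thresholds[r][c] is the
    sensitivity setting of that electrode's threshold circuit.
    """
    touched = []
    for r in range(ROWS):
        for c in range(COLS):
            # Each electrode has its own sensitivity setting.
            if read_raw(r, c) > thresholds[r][c]:
                touched.append((r, c))
    return touched
```

Because each entry of `thresholds` is independent, the sensitivity of any single electrode can be adjusted without affecting its neighbours, which the second embodiment exploits for the sloping windscreen.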
The ECU 14 is further provided with a gesture recognition module in order to perform gesture recognition on the movement of the operator's hand over the sensitised area 11. The gesture recognition module includes a memory in which a library of gestures is stored.
In operation, once the power supply of the vehicle is turned on, the display 15 shows the main menu of all the vehicle accessories available, with gesture symbols providing reminders of the specific gestures to assist the operator. The operator can therefore call up the accessory that he wishes to operate.
To do this, the operator moves one hand over the sensitised area 11 according to the gesture shown on the display unit 15 for the accessory that he wishes to operate, for instance a leftward motion of the hand to control the navigation system. As the hand moves over the sensitised area, current is pulled from the electrodes near the hand into the body; this is detected by the ECU, which generates output signals to the gesture recognition module. The gesture recognition module in turn interprets these output signals and issues an appropriate output command 22 to the controller 17 to execute the operation. It will be appreciated that the operator may also slide his finger on the sensitised panel, which will likewise pull current from the electrodes touched by the finger into the body, thus generating output signals to the gesture recognition module.
These output signals will be interpreted by the gesture recognition module in order to issue an appropriate output command 22 as hereinbefore described.
The gesture recognition principle over the sensitised area 11 is described as follows with reference to the flow chart of Fig. 3. As stated above, when the engine of the vehicle is switched on at step 101, the display unit 15 is turned on, displaying the functions of the vehicle accessories (Fig. 2), at step 102. The gesture recognition module, through the detection circuit 20, scans all the electrodes 12a-n and tracks, in real time, changes in the capacitance of the electrodes, at step 103. When the operator moves his hand closer to the sensitised area 11 and starts a gesture, the detection circuit 20 detects a change in capacitance of a first electrode, at step 104. The gesture recognition module, which is provided with a mathematical model, i.e. finite state machine algorithms, then memorises the co-ordinates within the array of the electrode whose capacitance has changed, at step 105. In order to track the gesture more quickly, the detection circuit scans the electrodes in the vicinity of the latest electrode whose capacitance has changed, at step 106. The next electrode whose capacitance is altered is likewise memorised by its co-ordinates, at step 107, and so on, until the detection circuit does not detect any change within the array for a predetermined period of time, which is interpreted as the end of the gesture command, i.e. the operator's hand has moved away from the sensitised area. Meanwhile, the gesture recognition module starts to compare the memorised co-ordinates, or digital gesture representation, against the gestures stored in the library of gestures, at step 108. Each gesture stored in the library is associated with an output command to operate a vehicle accessory or to select a sub-menu of a vehicle accessory.
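The tracking loop of steps 103 to 107 can be sketched as follows. This is an assumed illustration only: the sample representation, the one-cell neighbourhood rule and the timeout value are hypothetical, and the real module uses finite state machine algorithms rather than this simplified loop.

```python
# Illustrative sketch (assumed) of the tracking loop in Fig. 3: after a
# first detection, only the neighbourhood of the latest active electrode
# is considered (step 106), and the gesture ends when no change is seen
# for a timeout number of scan cycles.

def track_gesture(samples, timeout=3):
    """samples: iterable of sets of (row, col) electrodes whose capacitance
    changed in each scan cycle (a hypothetical stand-in for the real-time
    readings). Returns the memorised co-ordinate trace.
    """
    trace = []
    idle = 0
    for active in samples:
        if trace:
            # Step 106: keep only electrodes adjacent to the latest
            # memorised one, to track the gesture more quickly.
            last = trace[-1]
            active = {p for p in active
                      if abs(p[0] - last[0]) <= 1 and abs(p[1] - last[1]) <= 1}
        if active:
            trace.append(sorted(active)[0])  # steps 105/107: memorise
            idle = 0
        else:
            idle += 1
            if idle >= timeout:              # end of the gesture command
                break
    return trace
```

The resulting trace is what step 108 compares against the library of gestures.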
At step 109, if the gesture is not recognised, the ECU sends a tone through the loudspeaker to give the operator audible feedback that the gesture has not been recognised, and the process returns to step 102. If the gesture is recognised at step 109, the ECU sends an output command to the controller and an acceptance tone through the loudspeaker to give audible feedback to the operator, at step 110.
In a second embodiment, described with reference to Fig. 4, the electrode array 11 is embedded within a laminated glazing panel 30 forming a windscreen in a vehicle passenger compartment, and a head-up display (HUD) system 33 (shown as a dotted line in Fig. 1) is connected to the ECU 14. It will be appreciated that the vehicle is depicted only by the windscreen 30.
The windscreen usually comprises two glass sheets 30a, 30b and an interlayer sheet 30c positioned between the glass sheets. The electrodes 11a are in the form of small transparent conductive surfaces coated on the interlayer sheet 30c. Such an arrangement allows the driver to control the vehicle accessories easily without looking for the location of the sensitised area. Furthermore, in order to avoid the driver having to look at the display unit to find out which gestures must be applied to operate a vehicle accessory, the information from the display screen is repeated in front of the driver in a small area by the HUD system 33. This is particularly useful if the user is driving the vehicle, as it means that he can control the vehicle accessories without taking his eyes off the road ahead.
It will be appreciated that the sensitivity of each electrode 12a-n is adjusted inside its threshold circuit according to its position, so as to compensate for the sloping face of the windscreen 30 and allow a gesture to be recognised in one plane A. That is, the signal thresholds of the electrodes located in the lower part of the windscreen 30 will be less than those of the electrodes located in the upper part of the windscreen 30.
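This grading of thresholds by row can be sketched as below. The linear interpolation and the numeric values are assumptions; the patent only requires that lower rows have lower thresholds than upper rows.

```python
# Illustrative sketch (assumed): grading the signal thresholds so that
# electrodes in the lower, more distant part of the sloping windscreen are
# more sensitive (lower threshold) than those in the upper part, keeping
# gesture recognition in one plane A.

ROWS = 7

def graded_thresholds(upper, lower):
    """Linearly interpolate a per-row threshold from the top row (row 0,
    value `upper`) down to the bottom row (value `lower`)."""
    return [upper + (lower - upper) * r / (ROWS - 1) for r in range(ROWS)]
```

For example, `graded_thresholds(1.0, 0.4)` yields a monotonically decreasing threshold per row, so the same hand distance triggers detection across the whole sloped panel.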
It may be found desirable, in particular in this embodiment, for the driver to be able to turn off the control system, to avoid all his hand motions over the windscreen being interpreted as commands.
To do this, the control system 10 is provided with a transmitting electrode 40 mounted in the vicinity of the steering wheel 41 so as to be within reach of one hand of the driver. This transmitting electrode is connected to a frequency generator 42. In this embodiment, the electrode array 12 acts as a receiving electrode array.
In operation, the generator 42 supplies a signal to the transmitting electrode 40, which transmits an electromagnetic signal inside the vehicle compartment. The driver needs to touch the transmitting electrode 40 so that the electromagnetic signal passes through the driver and is received by the receiving electrode array 12. The signal received by the electrode array 12 is interpreted as described in the first embodiment. Hence, the driver can easily turn off the gesture recognition by moving his hand away from the transmitting electrode.
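The enabling logic amounts to gating recognition on whether the injected carrier is seen at the receiving array. The sketch below is an assumption for illustration: the carrier level as a single amplitude and the threshold value are hypothetical.

```python
# Illustrative sketch (assumed): gesture recognition is only enabled while
# the carrier injected by the transmitting electrode 40 is coupled through
# the driver's body and received at the electrode array 12.

CARRIER_THRESHOLD = 0.5  # assumed minimum received carrier amplitude

def recognition_enabled(carrier_level):
    """True while the driver touches the transmitting electrode, i.e. the
    generator's signal is received by the electrode array."""
    return carrier_level >= CARRIER_THRESHOLD
```

Releasing the electrode drops the received carrier below the threshold and thereby disables recognition, matching the behaviour described above.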
Although the gesture-recognition control system has been described herein in relation to the control of vehicle accessories within the compartment of a vehicle, it will be appreciated that the sensitised area 11 may be located within other windows of the vehicle.
The sensitised area may be operated from outside the vehicle to allow a user to act remotely on a vehicle mechanism, e.g. for locking/unlocking the locks of the openable panels of the vehicle. In this case the electrodes are arranged in such a way that they are only operable from outside the vehicle. However, in order to ensure for security reasons that only the driver is able to operate the control system, the control system is interfaced with a user-identification means such as the keyless entry system, so that it is only enabled when the driver is detected by the keyless entry system, whereupon the control system responds in a suitable manner as described above in relation to the first embodiment. Furthermore, it will be appreciated that the number, the size and the sensitivity of the electrodes may vary for different applications and different vehicles.
Claims (22)
1. A control system for controlling remotely one or more devices, the control system comprising: a sensitised area for detecting the proximity of a hand at a plurality of points over this area, a gesture library memory, a gesture recognition module for recognising and identifying the hand's motions among gestures stored in the gesture library, and a controller to control the operation of one of the devices according to the hand motions of an operator applied anywhere over the sensitised area.
2. A control system as claimed in claim 1 wherein the sensitised area includes an array of capacitive electrodes arranged on a dielectric component.
3. A control system as claimed in claim 2 in which each electrode is connected to a threshold circuit so as to independently adjust the sensitivity of each electrode.
4. A control system as claimed in claim 2 or claim 3 wherein the dielectric component is a laminated glazing panel.
5. A control system as claimed in claim 4 wherein the electrodes are embedded within the laminated glazing panel.
6. A control system as claimed in claim 5 which further includes a transmitting electrode.
7. A control system as claimed in claim 6 wherein the transmitting electrode is in use placed in the vicinity of a steering wheel.
8. A control system as claimed in any preceding claim which includes a head-up display system so as to show the gesture to control each vehicle device available.
9. A control system as claimed in any preceding claim which includes a display screen so as to show the gesture to control each vehicle device available.
10. A control system as claimed in any preceding claim which includes a loudspeaker so as to give audible feedback to the operator.
11. A control system as claimed in any preceding claim wherein the devices are vehicle accessories such as an air conditioner, radio or navigation system.
12. A control system as claimed in claim 1 wherein the devices are vehicle mechanisms.
13. A control system as claimed in claim 1 wherein the control system is adapted to operate in association with a user identification means.
14. A method for operating one or more devices by a control system having a sensitised area over which an operator can move his hand, the method comprising the steps of:
- a) determining a gesture according to the hand motion of the operator by detecting the proximity of a hand at each of a plurality of points in the sensitised area,
- b) comparing this gesture against other gestures stored in a library of gestures, where each gesture in the library is associated with an output command of a device, and
- c) activating the output command associated with the gesture matched in the library.
15. A method as claimed in claim 14 wherein said step a) includes the steps of:
- d) monitoring a change in capacitance over time at the plurality of points,
- e) memorising co-ordinates for each point whose capacitance has changed, and
- f) in step b), determining whether the co-ordinates memorised in step e) represent a recognised gesture.
16. A method as claimed in claim 15 wherein the plurality of points are an array of points.
17. A method as claimed in claim 14 or claim 15 wherein said step a) further includes the step of transmitting an electromagnetic signal from a transmitting electrode.
18. A vehicle incorporating a control system for controlling one or more vehicle devices, the control system being in accordance with any one of claims 1 to 13.
19. A vehicle as claimed in claim 18 when the control system is according to claim 5, in which the laminated glazing panel is a windscreen in the vehicle passenger compartment.
20. A control system as described herein with reference to Figures 1, 2 and 3 of the accompanying drawings.
21. A method for operating one or more vehicle devices substantially as described herein with reference to Figure 4 of the accompanying drawings.
22. A vehicle as described herein with reference to Figure 4 of the accompanying drawings.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0504485A GB2423808B (en) | 2005-03-04 | 2005-03-04 | Motor vehicle control system for controlling one or more vehicle devices |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0504485A GB2423808B (en) | 2005-03-04 | 2005-03-04 | Motor vehicle control system for controlling one or more vehicle devices |
Publications (3)
Publication Number | Publication Date |
---|---|
GB0504485D0 (en) | 2005-04-13 |
GB2423808A true GB2423808A (en) | 2006-09-06 |
GB2423808B GB2423808B (en) | 2010-02-17 |
Family
ID=34451799
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0504485A Expired - Fee Related GB2423808B (en) | 2005-03-04 | 2005-03-04 | Motor vehicle control system for controlling one or more vehicle devices |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2423808B (en) |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2063350A1 (en) * | 2007-11-20 | 2009-05-27 | Samsung Electronics Co., Ltd. | Method and apparatus for interfacing between devices in home network |
US7770136B2 (en) | 2007-01-24 | 2010-08-03 | Microsoft Corporation | Gesture recognition interactive feedback |
EP2268005A3 (en) * | 2009-03-09 | 2011-01-12 | Samsung Electronics Co., Ltd. | Display apparatus for providing a user menu and method for providing user interface (ui) applicable thereto |
US7899772B1 (en) | 2006-07-14 | 2011-03-01 | Ailive, Inc. | Method and system for tuning motion recognizers by a user using a set of motion signals |
WO2011023856A1 (en) * | 2009-08-26 | 2011-03-03 | Valtion Teknillinen Tutkimuskeskus | Gesture sensor arrangement and a method for producing it |
US7917455B1 (en) | 2007-01-29 | 2011-03-29 | Ailive, Inc. | Method and system for rapid evaluation of logical expressions |
WO2011092628A1 (en) * | 2010-01-26 | 2011-08-04 | Nokia Corporation | Method for controlling an apparatus using gestures |
US8251821B1 (en) | 2007-06-18 | 2012-08-28 | Ailive, Inc. | Method and system for interactive control using movable controllers |
WO2012126586A1 (en) * | 2011-03-23 | 2012-09-27 | Daimler Ag | Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element |
WO2013174494A1 (en) | 2012-05-22 | 2013-11-28 | Audi Ag | System and method for controlling at least one vehicle system by means of gestures performed by a driver |
US8655622B2 (en) | 2008-07-05 | 2014-02-18 | Ailive, Inc. | Method and apparatus for interpreting orientation invariant motion |
ES2451849A1 (en) * | 2012-09-28 | 2014-03-28 | Fundación Para La Promoción De La Innovación, Invest. Y Desarrollo Tecnológico En La Industria De Automoción De Galicia | Procedure and system for gestural interaction with a vehicle (Machine-translation by Google Translate, not legally binding) |
WO2014085277A1 (en) * | 2012-11-27 | 2014-06-05 | Neonöde Inc. | Light-based touch controls on a steering wheel and dashboard |
WO2014172904A1 (en) * | 2013-04-27 | 2014-10-30 | Rong Weihua | Hand gesture recognition apparatus, hand gesture recognition method, and related vehicle-mounted apparatus |
US8924076B2 (en) | 2007-03-16 | 2014-12-30 | Pilkington Group Limited | Interactive vehicle glazing |
US9092093B2 (en) | 2012-11-27 | 2015-07-28 | Neonode Inc. | Steering wheel user interface |
US9261968B2 (en) | 2006-07-14 | 2016-02-16 | Ailive, Inc. | Methods and systems for dynamic calibration of movable game controllers |
DE102015217179A1 (en) | 2014-11-03 | 2016-05-04 | Ifm Electronic Gmbh | Sensor arrangement for an access system of a vehicle |
DE102014222410A1 (en) | 2014-11-03 | 2016-05-04 | Ifm Electronic Gmbh | Access system for a vehicle |
US9389710B2 (en) | 2009-02-15 | 2016-07-12 | Neonode Inc. | Light-based controls on a toroidal steering wheel |
WO2016116372A1 (en) * | 2015-01-20 | 2016-07-28 | Saint-Gobain Glass France | Composite pane with capacitive switching region |
US9628844B2 (en) | 2011-12-09 | 2017-04-18 | Microsoft Technology Licensing, Llc | Determining audience state or interest using passive sensor data |
US9788032B2 (en) | 2012-05-04 | 2017-10-10 | Microsoft Technology Licensing, Llc | Determining a future portion of a currently presented media program |
FR3062352A1 (en) * | 2017-02-02 | 2018-08-03 | Eolane Combree | INTERIOR LIGHTING DEVICE OF A VEHICLE |
US10322741B2 (en) | 2014-12-09 | 2019-06-18 | Continental Automotive France | Method of interaction from the steering wheel between a user and an onboard system embedded in a vehicle |
US10331222B2 (en) | 2011-05-31 | 2019-06-25 | Microsoft Technology Licensing, Llc | Gesture recognition techniques |
US11429230B2 (en) | 2018-11-28 | 2022-08-30 | Neonode Inc | Motorist user interface sensor |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8620113B2 (en) | 2011-04-25 | 2013-12-31 | Microsoft Corporation | Laser diode modes |
US8635637B2 (en) | 2011-12-02 | 2014-01-21 | Microsoft Corporation | User interface presenting an animated avatar performing a media reaction |
JP5858850B2 (en) * | 2012-04-02 | 2016-02-10 | 三菱電機株式会社 | Air conditioner indoor unit |
US8898687B2 (en) | 2012-04-04 | 2014-11-25 | Microsoft Corporation | Controlling a media program based on a media reaction |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2355055A (en) * | 1999-10-09 | 2001-04-11 | Rover Group | A control system for a vehicle |
WO2002015560A2 (en) * | 2000-08-12 | 2002-02-21 | Georgia Tech Research Corporation | A system and method for capturing an image |
US6519607B1 (en) * | 1999-10-28 | 2003-02-11 | Hewlett-Packard Company | Image driven operating system |
US20040141634A1 (en) * | 2002-10-25 | 2004-07-22 | Keiichi Yamamoto | Hand pattern switch device |
US20040161132A1 (en) * | 1998-08-10 | 2004-08-19 | Cohen Charles J. | Gesture-controlled interfaces for self-service machines and other applications |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5543591A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
US6204839B1 (en) * | 1997-06-27 | 2001-03-20 | Compaq Computer Corporation | Capacitive sensing keyboard and pointing device |
US6392636B1 (en) * | 1998-01-22 | 2002-05-21 | Stmicroelectronics, Inc. | Touchpad providing screen cursor/pointer movement control |
KR100595924B1 (en) * | 1998-01-26 | 2006-07-05 | 웨인 웨스터만 | Method and apparatus for integrating manual input |
Cited By (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9405372B2 (en) | 2006-07-14 | 2016-08-02 | Ailive, Inc. | Self-contained inertial navigation system for interactive control using movable controllers |
US7899772B1 (en) | 2006-07-14 | 2011-03-01 | Ailive, Inc. | Method and system for tuning motion recognizers by a user using a set of motion signals |
US8051024B1 (en) | 2006-07-14 | 2011-11-01 | Ailive, Inc. | Example-based creation and tuning of motion recognizers for motion-controlled applications |
US9261968B2 (en) | 2006-07-14 | 2016-02-16 | Ailive, Inc. | Methods and systems for dynamic calibration of movable game controllers |
US7770136B2 (en) | 2007-01-24 | 2010-08-03 | Microsoft Corporation | Gesture recognition interactive feedback |
US7917455B1 (en) | 2007-01-29 | 2011-03-29 | Ailive, Inc. | Method and system for rapid evaluation of logical expressions |
US8924076B2 (en) | 2007-03-16 | 2014-12-30 | Pilkington Group Limited | Interactive vehicle glazing |
US8251821B1 (en) | 2007-06-18 | 2012-08-28 | Ailive, Inc. | Method and system for interactive control using movable controllers |
EP2063350A1 (en) * | 2007-11-20 | 2009-05-27 | Samsung Electronics Co., Ltd. | Method and apparatus for interfacing between devices in home network |
US8655622B2 (en) | 2008-07-05 | 2014-02-18 | Ailive, Inc. | Method and apparatus for interpreting orientation invariant motion |
US9389710B2 (en) | 2009-02-15 | 2016-07-12 | Neonode Inc. | Light-based controls on a toroidal steering wheel |
US10007422B2 (en) | 2009-02-15 | 2018-06-26 | Neonode Inc. | Light-based controls in a toroidal steering wheel |
EP2268005A3 (en) * | 2009-03-09 | 2011-01-12 | Samsung Electronics Co., Ltd. | Display apparatus for providing a user menu and method for providing user interface (ui) applicable thereto |
WO2011023856A1 (en) * | 2009-08-26 | 2011-03-03 | Valtion Teknillinen Tutkimuskeskus | Gesture sensor arrangement and a method for producing it |
WO2011092628A1 (en) * | 2010-01-26 | 2011-08-04 | Nokia Corporation | Method for controlling an apparatus using gestures |
US9335825B2 (en) | 2010-01-26 | 2016-05-10 | Nokia Technologies Oy | Gesture control |
US8886414B2 (en) | 2011-03-23 | 2014-11-11 | Daimler Ag | Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element |
CN103443754A (en) * | 2011-03-23 | 2013-12-11 | Daimler AG | Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element |
WO2012126586A1 (en) * | 2011-03-23 | 2012-09-27 | Daimler Ag | Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element |
CN103443754B (en) * | 2011-03-23 | 2016-06-29 | Daimler AG | Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element |
US10331222B2 (en) | 2011-05-31 | 2019-06-25 | Microsoft Technology Licensing, Llc | Gesture recognition techniques |
US9628844B2 (en) | 2011-12-09 | 2017-04-18 | Microsoft Technology Licensing, Llc | Determining audience state or interest using passive sensor data |
US10798438B2 (en) | 2011-12-09 | 2020-10-06 | Microsoft Technology Licensing, Llc | Determining audience state or interest using passive sensor data |
US9788032B2 (en) | 2012-05-04 | 2017-10-10 | Microsoft Technology Licensing, Llc | Determining a future portion of a currently presented media program |
DE102012018685A1 (en) | 2012-05-22 | 2013-11-28 | Audi Ag | System and method for controlling at least one vehicle system by means of gestures carried out by a driver |
DE102012018685B4 (en) * | 2012-05-22 | 2017-08-03 | Audi Ag | System and method for controlling at least one vehicle system by means of gestures carried out by a driver |
WO2013174494A1 (en) | 2012-05-22 | 2013-11-28 | Audi Ag | System and method for controlling at least one vehicle system by means of gestures performed by a driver |
ES2451849A1 (en) * | 2012-09-28 | 2014-03-28 | Fundación Para La Promoción De La Innovación, Invest. Y Desarrollo Tecnológico En La Industria De Automoción De Galicia | Procedure and system for gestural interaction with a vehicle (Machine-translation by Google Translate, not legally binding) |
US9092093B2 (en) | 2012-11-27 | 2015-07-28 | Neonode Inc. | Steering wheel user interface |
US9710144B2 (en) | 2012-11-27 | 2017-07-18 | Neonode Inc. | User interface for curved input device |
WO2014085277A1 (en) * | 2012-11-27 | 2014-06-05 | Neonode Inc. | Light-based touch controls on a steering wheel and dashboard |
US10719218B2 (en) | 2012-11-27 | 2020-07-21 | Neonode Inc. | Vehicle user interface |
US10254943B2 (en) | 2012-11-27 | 2019-04-09 | Neonode Inc. | Autonomous drive user interface |
US11650727B2 (en) | 2012-11-27 | 2023-05-16 | Neonode Inc. | Vehicle user interface |
WO2014172904A1 (en) * | 2013-04-27 | 2014-10-30 | Rong Weihua | Hand gesture recognition apparatus, hand gesture recognition method, and related vehicle-mounted apparatus |
DE102014222410A1 (en) | 2014-11-03 | 2016-05-04 | Ifm Electronic Gmbh | Access system for a vehicle |
DE102014222410B4 (en) | 2014-11-03 | 2018-05-17 | Ifm Electronic Gmbh | Access system for a vehicle |
DE102015217179A1 (en) | 2014-11-03 | 2016-05-04 | Ifm Electronic Gmbh | Sensor arrangement for an access system of a vehicle |
US10322741B2 (en) | 2014-12-09 | 2019-06-18 | Continental Automotive France | Method of interaction from the steering wheel between a user and an onboard system embedded in a vehicle |
US10525673B2 (en) | 2015-01-20 | 2020-01-07 | Saint-Gobain Glass France | Composite pane with a capacitive switching zone |
EA034011B1 (en) * | 2015-01-20 | 2019-12-18 | Сэн-Гобэн Гласс Франс | Composite pane with a capacitive switching zone |
WO2016116372A1 (en) * | 2015-01-20 | 2016-07-28 | Saint-Gobain Glass France | Composite pane with capacitive switching region |
EP3357755A1 (en) * | 2017-02-02 | 2018-08-08 | Eolane Combree | Interior lighting device for a vehicle |
FR3062352A1 (en) * | 2017-02-02 | 2018-08-03 | Eolane Combree | INTERIOR LIGHTING DEVICE OF A VEHICLE |
US11429230B2 (en) | 2018-11-28 | 2022-08-30 | Neonode Inc | Motorist user interface sensor |
Also Published As
Publication number | Publication date |
---|---|
GB0504485D0 (en) | 2005-04-13 |
GB2423808B (en) | 2010-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2423808A (en) | Gesture controlled system for controlling vehicle accessories | |
KR102312455B1 (en) | Apparatus and method for supporting an user before operating a switch for electromotive adjustment of a part of a vehicle | |
US10474357B2 (en) | Touch sensing display device and method of detecting user input from a driver side or passenger side in a motor vehicle | |
CN107150643B (en) | Vehicle and control method thereof | |
US10220806B2 (en) | Monitoring and alerting vehicle occupants for ignition systems | |
US20160091978A1 (en) | Gesture recognition apparatus, vehicle having the same and method for controlling the same | |
US20160098088A1 (en) | Human machine interface apparatus for vehicle and methods of controlling the same | |
JP2007106353A (en) | Vehicular information display device, and vehicular information display system | |
KR20130063911A (en) | System for preventing the driver's eyes from leaving the road while driving |
KR101946746B1 (en) | Positioning of non-vehicle objects in the vehicle | |
EP3799609B1 (en) | Power window sync switch | |
US20140172186A1 (en) | Capacitive steering wheel switches with audible feedback | |
US20130054048A1 (en) | Vehicle control system | |
JP6564280B2 (en) | Get-off warning device for vehicle electronic key system | |
JP2018501998A (en) | System and method for controlling automotive equipment | |
KR102441515B1 (en) | Input apparatus for vehicle | |
KR102270367B1 (en) | Automotive convenience apparatus using touch panel | |
KR102441509B1 (en) | Terminal apparatus, vehicle and method for controlling the terminal apparatus | |
JP6626282B2 (en) | Alighting detection device of electronic key system for vehicles | |
KR101556520B1 (en) | Terminal, vehicle having the same and method thereof | |
CN216210968U (en) | Induction device and vehicle | |
US20230256902A1 (en) | Input device | |
EP4349665A1 (en) | Safety belt device and motor vehicle | |
JP3232217U (en) | Flexible-board capacitive-sensing warning device for vehicle door opening |
JP2017013713A (en) | Alighting detection device of electronic key system for vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 732E | Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977) | |
20190208 | 746 | Register noted 'licences of right' (sect. 46/1977) | |
20200304 | PCNP | Patent ceased through non-payment of renewal fee | |