GB2423808A - Gesture controlled system for controlling vehicle accessories - Google Patents

Gesture controlled system for controlling vehicle accessories

Info

Publication number
GB2423808A
GB2423808A GB0504485A
Authority
GB
United Kingdom
Prior art keywords
control system
gesture
vehicle
hand
step
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0504485A
Other versions
GB0504485D0 (en)
GB2423808B (en)
Inventor
Carl Anthony Pickering
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC
Priority to GB0504485A priority Critical patent/GB2423808B/en
Publication of GB0504485D0 publication Critical patent/GB0504485D0/en
Publication of GB2423808A publication Critical patent/GB2423808A/en
Application granted granted Critical
Publication of GB2423808B publication Critical patent/GB2423808B/en
Application status: Active legal-status Critical
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B32LAYERED PRODUCTS
    • B32BLAYERED PRODUCTS, i.e. PRODUCTS BUILT-UP OF STRATA OF FLAT OR NON-FLAT, e.g. CELLULAR OR HONEYCOMB, FORM
    • B32B17/00Layered products essentially comprising sheet glass, or glass, slag, or like fibres
    • B32B17/06Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material
    • B32B17/10Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material of synthetic resin
    • B32B17/10009Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material of synthetic resin characterized by the number, the constitution or treatment of glass sheets
    • B32B17/10036Layered products essentially comprising sheet glass, or glass, slag, or like fibres comprising glass as the main or only constituent of a layer, next to another layer of a specific material of synthetic resin characterized by the number, the constitution or treatment of glass sheets comprising two outer glass sheet
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

A control system 10, for controlling remotely one or more devices, comprises a sensitised area 11 for detecting the proximity of a hand at a plurality of points over the area, a gesture library memory, a gesture recognition module for recognising and identifying the hand's motions among gestures stored in the gesture library, and a controller 14 to control the operation of one of the devices according to the hand motions of an operator applied anywhere over the sensitised area 11. The sensitised area 11 may comprise an array of capacitive electrodes 12a-n arranged on a dielectric component 13 with each electrode 12a-n connected to a threshold circuit 21a-n so as to independently adjust the sensitivity of each electrode 12a-n. The dielectric component 13 may be a laminated glazing panel and may be a windscreen. There may be provided a transmitting electrode (40, figure 4) placed in the vicinity of a steering wheel. There may be provided a head up display or display screen 15 to show the gesture to control each device. There may be provided a speaker 16 to provide audible feedback. The device may be a car air conditioner, radio or navigation system.

Description

Control system for controlling one or more devices

The present invention relates to a control system for controlling remotely one or more devices, in particular a control system for controlling vehicle accessories such as an air conditioner, lights, a radio or a navigation system.

To operate such comfort or entertainment features, the driver is required to take his eyes off the road in order to physically locate the control, generally taking the form of a switch, before moving one hand off the steering wheel to grasp and move the switch in one direction. This operating procedure causes driver distraction and can lead to traffic accidents.

It is an object of this invention to provide a control system suitable for use in a motor vehicle that achieves a control function for operating one or more vehicle accessories in a simple and easy way to overcome the aforementioned disadvantage.

According to a first aspect of the invention there is provided a control system for controlling remotely one or more devices, comprising a sensitised area for detecting the proximity of a hand at a plurality of points over this area, a gesture library memory, a gesture recognition module for recognizing and identifying the hand's motions among gestures stored in the gesture library, and a controller to control the operation of one of the devices according to the hand's motions of an operator applied anywhere over the sensitised area.

Preferably, the sensitised area includes an array of capacitive electrodes arranged on a dielectric component.

Also preferably, each electrode is connected to a threshold circuit so as to independently adjust the sensitivity of each electrode.

The dielectric component may be a laminated glazing panel. In such case the electrodes are preferably embedded within the laminated glazing panel.

In a preferred embodiment the control system further includes a transmitting electrode so that the user can switch the gesture recognition on or off. This transmitting electrode may be placed in the vicinity of a steering wheel.

The control system may include a Head up display system and/or a display screen so as to show the gesture to control each vehicle device available. It may also include a loud speaker so as to give an audible feedback to the operator.

According to a second aspect of the invention there is provided a method for operating one or more devices by a control system having a sensitised area over which an operator can move his hand, the method comprising the steps of: - a) determining a gesture according to the hand's motion of the operator by detecting the proximity of a hand at each of a plurality of points in the sensitised area, - b) comparing this gesture against other gestures stored in a library of gestures, where each gesture in the library is associated with an output command of a device, and - c) activating the output command associated with the gesture matched in the library.

In a preferred embodiment, step a) includes the steps of: - d) monitoring a change in capacitance over time at the plurality of points, - e) memorising co-ordinates for each point whose capacitance has been changed, and - f) in step b), determining whether the co-ordinates memorised in step e) correspond to a recognised gesture.
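As an illustration only (the patent specifies no implementation; the function names, frame representation and threshold value below are all hypothetical), steps a) to f) might be sketched as:

```python
# Illustrative sketch of method steps a)-f); all names are hypothetical.

def detect_gesture(capacitance_frames, baseline, threshold):
    """Steps d)/e): monitor capacitance changes over time and memorise
    the co-ordinates of each point whose capacitance has changed."""
    trace = []
    for frame in capacitance_frames:          # one frame per scan of the array
        for coord, value in frame.items():
            if abs(value - baseline[coord]) > threshold:
                trace.append(coord)
    return trace

def match_gesture(trace, library):
    """Steps b)/f) and c): compare the memorised trace against the
    gesture library and return the associated output command, if any."""
    for gesture_trace, command in library:
        if trace == gesture_trace:
            return command
    return None
```

Here the library is simply a list of (trace, command) pairs, e.g. a leftward sweep mapped to a navigation command; a real recogniser would tolerate noise rather than require an exact match.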

According to a third aspect of the invention there is provided a vehicle incorporating a control system according to the first aspect of the invention.

The invention will now be described by way of example with reference to the accompanying drawings, of which: - Fig.1 is a schematic diagram of a control system according to an embodiment of the present invention for implementation inside a motor vehicle to allow a user to control vehicle accessories, for example a radio, an air conditioner and a navigation system, Fig.2 is a simplified schematic view of a portion of the control system of Fig.1 illustrating its use by an operator, Fig.3 is a flow chart illustrating operation of the control system shown in block form in Fig.1, and Fig.4 is a schematic cross-section of a windscreen of a motor vehicle in which a control system as shown in Fig.1 is implemented according to a second embodiment.

Referring to Fig.1, there is shown a control system 10 for controlling vehicle accessories by the hand motions of an operator. The control system 10 includes a sensitised area 11. The sensitised area 11 comprises a plurality of electrodes 12a-n arranged in an array, in this example 7x7. Each electrode 12a-n is formed from conductive material, such as copper traces or conductive ink, coated on a dielectric component 13, in this case a panel located on a dashboard between a driver seat and a passenger seat. Each electrode 12a-n is connected to an Electronic Control Unit (ECU) 14. A visual display unit 15, a loud speaker 16, and a controller 17 are also connected to the ECU 14. The controller 17 is connected to a bus line 18, e.g. a CAN bus, which allows it to operate and communicate with each vehicle accessory 19 available on the CAN bus, in this case a radio, a navigation system and an air conditioner.
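The controller 17's role of dispatching commands to the accessories 19 over the bus line 18 could be sketched as follows; the CAN identifiers and single-byte payload format here are purely illustrative assumptions, not taken from the patent:

```python
# Hypothetical CAN identifiers for the accessories 19 on the bus line 18.
CAN_IDS = {"radio": 0x2A0, "navigation": 0x2A1, "air_conditioner": 0x2A2}

def build_can_frame(accessory, command_byte):
    """Build a minimal (identifier, payload) pair for an accessory;
    a real controller would use a proper CAN stack and frame format."""
    return (CAN_IDS[accessory], bytes([command_byte]))
```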

In the ECU 14, a detection circuit 20 is disposed. In a conventional manner, the detection circuit 20 includes, for each electrode 12a-n, a threshold circuit 21a-n to control the sensitivity of each electrode 12a-n. The sensitivity of each electrode is defined as the threshold above which the signal from the electrode must be taken into account as an indication of a hand gesture; in other words, the minimal distance between a hand and the electrode that must be considered. The detection circuit 20 further includes a multiplexer and amplifier (not shown) which enable the detection circuit to sequentially read values from the electrodes 12a-n.
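By way of a sketch (the patent describes the threshold circuits 21a-n and the multiplexer only at block level, so the names and values below are assumptions), the detection step might look like:

```python
def exceeds_threshold(signal, threshold):
    """A signal from an electrode counts as an indication of a hand
    gesture only when it passes that electrode's individually
    adjustable threshold (circuits 21a-n)."""
    return signal > threshold

def scan_array(read_electrode, coords):
    """Sequentially read each electrode through the multiplexer and
    return one frame of {co-ordinate: raw value}."""
    return {coord: read_electrode(coord) for coord in coords}
```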

The ECU 14 is further provided with a gesture recognition module in order to implement gesture recognition on the movement of the operator's hand over the sensitised area 13. The gesture recognition module includes a memory in which a library of gestures is stored.

In operation, once the supply power of the vehicle is turned on, the display 15 will show the main menu of all the vehicle accessories available, with gesture symbols providing reminders of specific gestures to assist the operator. The operator can therefore call up the accessory that he wishes to operate.

To do this, the operator moves one hand over the sensitised area 13 according to the gesture shown on the display unit 15 for the accessory that he wishes to operate, for instance a leftward motion of the hand to control the navigation system. As the hand moves over the sensitised area, current is pulled from the electrodes near the hand into the body; this is detected by the ECU, which generates output signals to the gesture recogniser module. The gesture recogniser module will in turn interpret these output signals and issue an appropriate output command 22 to the controller 17 to execute the operation. It will be appreciated that the operator may also slide his finger on the sensitised panel, which will likewise pull current from the electrodes touched by the finger into the body, thus generating output signals to the gesture recognition module.

These output signals will be interpreted by the gesture recognition module in order to issue an appropriate output command 22 as hereinbefore described.

A gesture recognition principle over the sensitised area 13 is described as follows with reference to the flow chart of Fig. 3. As stated above, when the engine of the vehicle is switched on at step 101, the display unit 15 is turned on, displaying the functions of the vehicle accessories (Fig. 2), at step 102. The gesture recognition module, through the detection circuit 20, scans all output electrodes 12a-n and tracks, in real time, changes in the capacitance of the electrodes, at step 103. When the operator moves his hand closer to the sensitised area 13 and starts a gesture, the detection circuit 20 will detect a change in capacitance of a first electrode, at step 104. Then the gesture recognition module, which is provided with a mathematical model, i.e. Finite State Machine algorithms, will memorise the co-ordinates within the array of the selected electrode whose capacitance has changed, at step 105. In order to track the gesture more quickly, the detection circuit scans electrodes in the vicinity of the latest electrode whose capacitance has changed, at step 106. The next electrode whose capacitance is altered will also be memorised by its co-ordinates, at step 107, and so on until the detection circuit does not detect any changes within the array after a predetermined period of time, at step 107, which will be interpreted as the end of the gesture command, i.e. the operator's hand has moved away from the sensitised area. Meanwhile, the gesture recognition module will start to compare the memorised co-ordinates, or digital gesture representation, against other gestures stored in the library of gestures, at step 108. Each gesture stored in the library is associated with an output command to operate a vehicle accessory or to select a sub-menu of a vehicle accessory.
At step 109, if the gesture is not recognised, the ECU sends a tone through the loud speaker so as to give an audible feedback to the operator that the gesture has not been recognised, and the process returns to step 102. At step 109, if the gesture is recognised, the ECU sends an output command to the controller and an acceptance tone through the loud speaker so as to give an audible feedback to the operator, at step 110.
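Steps 103 to 107 (scan, neighbourhood tracking, end-of-gesture timeout) can be sketched as follows; the frame representation and the timeout of three empty scans are illustrative assumptions, as the patent only names a "predetermined period of time":

```python
def neighbours(coord, n=7):
    """Step 106: co-ordinates adjacent to the latest changed electrode,
    scanned first so the gesture is tracked more quickly (7x7 array)."""
    r, c = coord
    return [(r + dr, c + dc)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0) and 0 <= r + dr < n and 0 <= c + dc < n]

def track_gesture(frames, baseline, threshold, timeout_frames=3):
    """Steps 103-107: memorise the co-ordinates of electrodes whose
    capacitance changes, until no change is detected for
    `timeout_frames` consecutive scans (end of the gesture)."""
    trace, idle = [], 0
    for frame in frames:
        changed = [c for c, v in frame.items()
                   if abs(v - baseline.get(c, 0.0)) > threshold]
        if changed:
            trace.extend(changed)
            idle = 0
        else:
            idle += 1
            if idle >= timeout_frames:
                break
    return trace
```

The resulting trace would then be compared against the gesture library (step 108) as in the earlier sketch; the patent's Finite State Machine model would replace the exact-match comparison.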

In a second embodiment, to be described with reference to Fig. 4, the electrode array 11 is embedded within a laminated glazing panel 30 forming a windscreen in a vehicle passenger compartment, and a Head up Display (HUD) system 33 (shown as a dotted line in Fig.1) is connected to the ECU 14. It will be appreciated that the vehicle is only depicted by the windscreen 30.

The windscreen usually comprises two glass sheets 30a, 30b and an interlayer sheet 30c positioned between the glass sheets. The electrodes 11a are in the form of a small transparent conductive surface which is coated on the interlayer sheet 30c. Such an arrangement allows the driver to control the vehicle accessories easily without looking for the location of the sensitised area 13. Furthermore, in order to avoid the driver having to look at the display unit to find out which gestures must be applied to operate a vehicle accessory, the information from the display screen is repeated in front of the driver in a small area by the HUD system 33. This is particularly useful if the user is driving the vehicle, as it means that he can control the vehicle accessories without taking his eyes off the road ahead.

It will be appreciated that the sensitivity of each electrode 12a-n inside the threshold circuit is adjusted according to its position so as to compensate for the sloping face of the windscreen 30 and to allow a gesture to be recognised in one plane A. That is, the signal threshold of the electrodes located in the lower part of the windscreen 30 will be less than that of the electrodes located in the upper part of the windscreen 30.
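The position-dependent threshold adjustment described above might be sketched as a simple linear interpolation; the row convention (row 0 at the top of the windscreen) and all numeric values are assumptions for illustration:

```python
def row_threshold(row, n_rows=7, upper=0.30, lower=0.20):
    """Illustrative slope compensation: electrodes lower on the sloping
    windscreen (higher row index) get a smaller signal threshold, so a
    gesture is recognised in the single plane A despite the varying
    hand-to-glass distance."""
    return upper - (upper - lower) * row / (n_rows - 1)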

It may be found desirable, in particular in this embodiment, for the driver to be able to turn off the control system, in order to avoid all of his hand's motions over the windscreen being interpreted as commands.

To do this, the control system 10 is provided with a transmitting electrode 40 which is mounted in the vicinity of the steering wheel 41 in order to be reached by one hand of the driver. This transmitting electrode is connected to a frequency generator 42. In this embodiment, the electrode array 12 acts as a receiving electrode array.

In operation, the generator 42 supplies a signal to the transmitting electrode 40. The transmitting electrode 40 transmits an electromagnetic signal inside the vehicle compartment. The driver needs to touch the transmitting electrode 40 so that the electromagnetic signal passes through the driver and is received by the receiving electrode array 12. The signal received by the electrode array 12 will be interpreted as described in the first embodiment. Hence, the driver can easily turn off the gesture recognition by moving his hand away from the transmitting electrode.
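The enable/disable behaviour via the transmitting electrode might be sketched as a simple gate; the carrier frequency and amplitude threshold are illustrative assumptions, since the patent does not quantify the signal from generator 42:

```python
CARRIER_HZ = 100_000  # hypothetical frequency for the generator 42

def recognition_enabled(received_amplitude, threshold=0.1):
    """Gesture recognition stays active only while the driver touches
    the transmitting electrode 40, coupling the carrier through his
    body to the receiving electrode array 12."""
    return received_amplitude > threshold
```

The gesture recogniser would run its scanning loop only while this gate returns True, so that moving the hand away from the steering wheel electrode disables interpretation of motions over the windscreen.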

Although the control system by gesture recognition has been described herein in relation to the control of vehicle accessories within the compartment of a vehicle it will be appreciated that the sensitised area 11 may be located within other windows of the vehicle.

The sensitised area may be operated from outside the vehicle to allow a user to act remotely on a vehicle mechanism, e.g. for locking/unlocking the locks of the openable panels of the vehicle. In this case the electrodes are arranged in such a way that they are only operable from outside the vehicle. However, in order to ensure for security reasons that only the driver is able to operate the control system, the control system is interfaced with a user identification means such as the keyless entry system, so that it is only enabled when the driver is detected by the keyless entry system; the control system will then respond in a suitable manner as described above in relation to the first embodiment. Furthermore, it will be appreciated that the number, the size and the sensitivity of the electrodes may vary for different applications and different vehicles.

Claims (22)

1. A control system for controlling remotely one or more devices, the control system comprising: a sensitised area for detecting the proximity of a hand at a plurality of points over this area, a gesture library memory, a gesture recognition module for recognizing and identifying the hand's motions among gestures stored in the gesture library, and a controller to control the operation of one of the devices according to the hand's motions of an operator applied anywhere over the sensitised area.
2. A control system as claimed in claim 1 wherein the sensitised area includes an array of capacitive electrodes arranged on a dielectric component.
3. A control system as claimed in claim 2 in which each electrode is connected to a threshold circuit so as to independently adjust the sensitivity of each electrode.
4. A control system as claimed in claim 2 or claim 3 wherein the dielectric component is a laminated glazing panel.
5. A control system as claimed in claim 2 or claim 3 wherein the electrodes are embedded within the laminated glazing panel.
6. A control system as claimed in claim 5 which further includes a transmitting electrode.
7. A control system as claimed in claim 6 wherein the transmitting electrode is in use placed in the vicinity of a steering wheel.
8. A control system as claimed in any preceding claim which includes a Head up display system so as to show the gesture to control each vehicle device available.
9. A control system as claimed in any preceding claim which includes a display screen so as to show the gesture to control each vehicle device available.
10. A control system as claimed in any preceding claim which includes a loud speaker so as to give an audible feedback to the operator.
11. A control system as claimed in any preceding claim wherein the devices are vehicle accessories such as an air conditioner, radio or navigation system.
12. A control system as claimed in claim 1 wherein the devices are vehicle mechanisms.
13. A control system as claimed in claim 1 wherein the control system is adapted to operate in association with a user identification means.
14. A method for operating one or more devices by a control system having a sensitised area over which an operator can move his hand, the method comprising the steps of: - a) determining a gesture according to the hand's motion of the operator by detecting the proximity of a hand at each of a plurality of points in the sensitised area, - b) comparing this gesture against other gestures stored in a library of gestures, where each gesture in the library is associated with an output command of a device, and - c) activating the output command associated with the gesture matched in the library.
15. A method as claimed in claim 14 wherein said step a) includes the steps of: - d) monitoring a change in capacitance over time at the plurality of points, - e) memorising co-ordinates for each point whose capacitance has been changed, and - f) in step b), determining whether the co-ordinates memorised in step e) correspond to a recognised gesture.
16. A method as claimed in claim 15 wherein the plurality of points are an array of points.
17. A method as claimed in claim 14 or claim 15 wherein said step a) further includes the step of transmitting an electromagnetic signal from a transmitting electrode.
18. A vehicle incorporating a control system for controlling one or more vehicle devices, the control system being in accordance with any one of claims 1 to 13.
19. A vehicle as claimed in claim 18 when the control system is according to claim 5, in which the laminated glazing panel is a windscreen in the vehicle passenger compartment.
20. A control system as described herein with reference to Figures 1, 2 and 3 of the accompanying drawings.
21. A method for operating one or more vehicle devices substantially as described herein with reference to Figure 4 of the accompanying drawings.
22. A vehicle as described herein with reference to Figure 4 of the accompanying drawings.
GB0504485A 2005-03-04 2005-03-04 Motor vehicle control system for controlling one or more vehicle devices Active GB2423808B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0504485A GB2423808B (en) 2005-03-04 2005-03-04 Motor vehicle control system for controlling one or more vehicle devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0504485A GB2423808B (en) 2005-03-04 2005-03-04 Motor vehicle control system for controlling one or more vehicle devices

Publications (3)

Publication Number Publication Date
GB0504485D0 (en) 2005-04-13
GB2423808A (en) 2006-09-06
GB2423808B GB2423808B (en) 2010-02-17

Family

ID=34451799

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0504485A Active GB2423808B (en) 2005-03-04 2005-03-04 Motor vehicle control system for controlling one or more vehicle devices

Country Status (1)

Country Link
GB (1) GB2423808B (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2063350A1 (en) * 2007-11-20 2009-05-27 Samsung Electronics Co., Ltd. Method and apparatus for interfacing between devices in home network
US7770136B2 (en) 2007-01-24 2010-08-03 Microsoft Corporation Gesture recognition interactive feedback
EP2268005A3 (en) * 2009-03-09 2011-01-12 Samsung Electronics Co., Ltd. Display apparatus for providing a user menu and method for providing user interface (ui) applicable thereto
US7899772B1 (en) 2006-07-14 2011-03-01 Ailive, Inc. Method and system for tuning motion recognizers by a user using a set of motion signals
WO2011023856A1 (en) * 2009-08-26 2011-03-03 Valtion Teknillinen Tutkimuskeskus Gesture sensor arrangement and a method for producing it
US7917455B1 (en) 2007-01-29 2011-03-29 Ailive, Inc. Method and system for rapid evaluation of logical expressions
WO2011092628A1 (en) * 2010-01-26 2011-08-04 Nokia Corporation Method for controlling an apparatus using gestures
US8251821B1 (en) 2007-06-18 2012-08-28 Ailive, Inc. Method and system for interactive control using movable controllers
WO2012126586A1 (en) * 2011-03-23 2012-09-27 Daimler Ag Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element
WO2013174494A1 (en) 2012-05-22 2013-11-28 Audi Ag System and method for controlling at least one vehicle system by means of gestures performed by a driver
US8655622B2 (en) 2008-07-05 2014-02-18 Ailive, Inc. Method and apparatus for interpreting orientation invariant motion
ES2451849A1 (en) * 2012-09-28 2014-03-28 Fundación Para La Promoción De La Innovación, Invest. Y Desarrollo Tecnológico En La Industria De Automoción De Galicia Method and system for gestural interaction with a vehicle
WO2014085277A1 (en) * 2012-11-27 2014-06-05 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
WO2014172904A1 (en) * 2013-04-27 2014-10-30 Rong Weihua Hand gesture recognition apparatus, hand gesture recognition method, and related vehicle-mounted apparatus
US8924076B2 (en) 2007-03-16 2014-12-30 Pilkington Group Limited Interactive vehicle glazing
US9092093B2 (en) 2012-11-27 2015-07-28 Neonode Inc. Steering wheel user interface
US9261968B2 (en) 2006-07-14 2016-02-16 Ailive, Inc. Methods and systems for dynamic calibration of movable game controllers
DE102015217179A1 (en) 2014-11-03 2016-05-04 Ifm Electronic Gmbh Sensor arrangement for an access system of a vehicle
DE102014222410A1 (en) 2014-11-03 2016-05-04 Ifm Electronic Gmbh Access system for a vehicle
US9389710B2 (en) 2009-02-15 2016-07-12 Neonode Inc. Light-based controls on a toroidal steering wheel
WO2016116372A1 (en) * 2015-01-20 2016-07-28 Saint-Gobain Glass France Composite pane with capacitive switching region
US9628844B2 (en) 2011-12-09 2017-04-18 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US9788032B2 (en) 2012-05-04 2017-10-10 Microsoft Technology Licensing, Llc Determining a future portion of a currently presented media program
FR3062352A1 (en) * 2017-02-02 2018-08-03 Eolane Combree Device for interior lighting in a vehicle
US10254943B2 (en) 2017-07-12 2019-04-09 Neonode Inc. Autonomous drive user interface

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
JP5858850B2 (en) * 2012-04-02 2016-02-10 三菱電機株式会社 The air conditioner indoor unit
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2355055A (en) * 1999-10-09 2001-04-11 Rover Group A control system for a vehicle
WO2002015560A2 (en) * 2000-08-12 2002-02-21 Georgia Tech Research Corporation A system and method for capturing an image
US6519607B1 (en) * 1999-10-28 2003-02-11 Hewlett-Packard Company Image driven operating system
US20040141634A1 (en) * 2002-10-25 2004-07-22 Keiichi Yamamoto Hand pattern switch device
US20040161132A1 (en) * 1998-08-10 2004-08-19 Cohen Charles J. Gesture-controlled interfaces for self-service machines and other applications

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5543591A (en) * 1992-06-08 1996-08-06 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US6204839B1 (en) * 1997-06-27 2001-03-20 Compaq Computer Corporation Capacitive sensing keyboard and pointing device
US6392636B1 (en) * 1998-01-22 2002-05-21 Stmicroelectronics, Inc. Touchpad providing screen cursor/pointer movement control
EP1717679B1 (en) * 1998-01-26 2016-09-21 Apple Inc. Method for integrating manual input

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040161132A1 (en) * 1998-08-10 2004-08-19 Cohen Charles J. Gesture-controlled interfaces for self-service machines and other applications
GB2355055A (en) * 1999-10-09 2001-04-11 Rover Group A control system for a vehicle
US6519607B1 (en) * 1999-10-28 2003-02-11 Hewlett-Packard Company Image driven operating system
WO2002015560A2 (en) * 2000-08-12 2002-02-21 Georgia Tech Research Corporation A system and method for capturing an image
US20040141634A1 (en) * 2002-10-25 2004-07-22 Keiichi Yamamoto Hand pattern switch device

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8051024B1 (en) 2006-07-14 2011-11-01 Ailive, Inc. Example-based creation and tuning of motion recognizers for motion-controlled applications
US9405372B2 (en) 2006-07-14 2016-08-02 Ailive, Inc. Self-contained inertial navigation system for interactive control using movable controllers
US7899772B1 (en) 2006-07-14 2011-03-01 Ailive, Inc. Method and system for tuning motion recognizers by a user using a set of motion signals
US9261968B2 (en) 2006-07-14 2016-02-16 Ailive, Inc. Methods and systems for dynamic calibration of movable game controllers
US7770136B2 (en) 2007-01-24 2010-08-03 Microsoft Corporation Gesture recognition interactive feedback
US7917455B1 (en) 2007-01-29 2011-03-29 Ailive, Inc. Method and system for rapid evaluation of logical expressions
US8924076B2 (en) 2007-03-16 2014-12-30 Pilkington Group Limited Interactive vehicle glazing
US8251821B1 (en) 2007-06-18 2012-08-28 Ailive, Inc. Method and system for interactive control using movable controllers
EP2063350A1 (en) * 2007-11-20 2009-05-27 Samsung Electronics Co., Ltd. Method and apparatus for interfacing between devices in home network
US8655622B2 (en) 2008-07-05 2014-02-18 Ailive, Inc. Method and apparatus for interpreting orientation invariant motion
US10007422B2 (en) 2009-02-15 2018-06-26 Neonode Inc. Light-based controls in a toroidal steering wheel
US9389710B2 (en) 2009-02-15 2016-07-12 Neonode Inc. Light-based controls on a toroidal steering wheel
EP2268005A3 (en) * 2009-03-09 2011-01-12 Samsung Electronics Co., Ltd. Display apparatus for providing a user menu and method for providing a user interface (UI) applicable thereto
WO2011023856A1 (en) * 2009-08-26 2011-03-03 Valtion Teknillinen Tutkimuskeskus Gesture sensor arrangement and a method for producing it
WO2011092628A1 (en) * 2010-01-26 2011-08-04 Nokia Corporation Method for controlling an apparatus using gestures
US9335825B2 (en) 2010-01-26 2016-05-10 Nokia Technologies Oy Gesture control
CN103443754B (en) * 2011-03-23 2016-06-29 戴姆勒股份公司 Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element
US8886414B2 (en) 2011-03-23 2014-11-11 Daimler Ag Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element
CN103443754A (en) * 2011-03-23 2013-12-11 戴姆勒股份公司 Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element
WO2012126586A1 (en) * 2011-03-23 2012-09-27 Daimler Ag Method for detecting an actuating motion for an actuator of a motor vehicle equipment element and actuator of a motor vehicle equipment element
US9628844B2 (en) 2011-12-09 2017-04-18 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US9788032B2 (en) 2012-05-04 2017-10-10 Microsoft Technology Licensing, Llc Determining a future portion of a currently presented media program
WO2013174494A1 (en) 2012-05-22 2013-11-28 Audi Ag System and method for controlling at least one vehicle system by means of gestures performed by a driver
DE102012018685B4 (en) * 2012-05-22 2017-08-03 Audi Ag System and method for controlling at least one vehicle system by means of gestures performed by a driver
DE102012018685A1 (en) 2012-05-22 2013-11-28 Audi Ag System and method for controlling at least one vehicle system by means of gestures performed by a driver
ES2451849A1 (en) * 2012-09-28 2014-03-28 Fundación Para La Promoción De La Innovación, Invest. Y Desarrollo Tecnológico En La Industria De Automoción De Galicia Method and system for gestural interaction with a vehicle
WO2014085277A1 (en) * 2012-11-27 2014-06-05 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
US9710144B2 (en) 2012-11-27 2017-07-18 Neonode Inc. User interface for curved input device
US9092093B2 (en) 2012-11-27 2015-07-28 Neonode Inc. Steering wheel user interface
WO2014172904A1 (en) * 2013-04-27 2014-10-30 Rong Weihua Hand gesture recognition apparatus, hand gesture recognition method, and related vehicle-mounted apparatus
DE102014222410A1 (en) 2014-11-03 2016-05-04 Ifm Electronic Gmbh Access system for a vehicle
DE102014222410B4 (en) 2014-11-03 2018-05-17 Ifm Electronic Gmbh Access system for a vehicle
DE102015217179A1 (en) 2014-11-03 2016-05-04 Ifm Electronic Gmbh Sensor arrangement for an access system of a vehicle
WO2016116372A1 (en) * 2015-01-20 2016-07-28 Saint-Gobain Glass France Composite pane with capacitive switching region
FR3062352A1 (en) * 2017-02-02 2018-08-03 Eolane Combree Device for interior lighting in a vehicle
EP3357755A1 (en) * 2017-02-02 2018-08-08 Eolane Combree Interior lighting device for a vehicle
US10254943B2 (en) 2017-07-12 2019-04-09 Neonode Inc. Autonomous drive user interface

Also Published As

Publication number Publication date
GB0504485D0 (en) 2005-04-13
GB2423808B (en) 2010-02-17

Similar Documents

Publication Publication Date Title
EP2305508B1 (en) User configurable vehicle user interface
EP1286861B1 (en) Head-up display based safety devices for use in motor vehicles
US8928336B2 (en) Proximity switch having sensitivity control and method therefor
US7295904B2 (en) Touch gesture based interface for motor vehicle
EP1530526B1 (en) Circuit for selectively producing switching signals
US7447575B2 (en) Operator control system for an automobile
US9551590B2 (en) Gesture-based information and command entry for motor vehicle
EP2295277A1 (en) Vehicle operator control input assistance
US20040227625A1 (en) Motor vehicle roof with a control means for electrical motor vehicle components and process for operating electrical motor vehicle components
US7649278B2 (en) Operating device for on-vehicle equipment
EP1579412B1 (en) Accessory system for vehicle
CN102820876B (en) Cognitive proximity switch and method having sensitivity
US9475390B2 (en) Method and device for providing a user interface in a vehicle
US7385308B2 (en) Advanced automotive control switches
US7248151B2 (en) Virtual keypad for vehicle entry control
US9143126B2 (en) Proximity switch having lockout control for controlling movable panel
CN103085734B (en) Having error feedback touch proximity switch
US20120041648A1 (en) Device for adjusting position of headrest and method for adjusting position of headrest
US20180201190A1 (en) Vision system with door mounted exterior mirror and display module
JP5216829B2 (en) Adaptive vehicle user interface
US7217894B2 (en) Switch apparatus for use in vehicles
JP5376203B2 (en) Electric seat system for a vehicle
US20090174682A1 (en) Instrumentation Module For A Vehicle
EP1228917A1 (en) A control arrangement
US6982632B2 (en) Vehicle engine starting apparatus

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
746 Register noted 'licences of right' (sect. 46/1977)

Effective date: 20190208