EP3025223A1 - Method and device for remote-controlling a function of a vehicle - Google Patents

Method and device for remote-controlling a function of a vehicle

Info

Publication number
EP3025223A1
EP3025223A1 (application EP14727709.9A)
Authority
EP
European Patent Office
Prior art keywords
vehicle
gesture
function
device
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14727709.9A
Other languages
German (de)
French (fr)
Inventor
Christophe Bonnet
Andreas Hiller
Gerhard Kuenzel
Martin Moser
Heiko Schiemenz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Daimler AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to DE102013012394.1A (DE 10 2013 012 394 A1)
Application filed by Daimler AG
Priority to PCT/EP2014/001374 (WO 2015/010752 A1)
Publication of EP3025223A1
Application status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20Means to switch the anti-theft system on or off
    • B60R25/2045Means to switch the anti-theft system on or off by hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/06Automatic manoeuvring for parking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/10Interpretation of driver requests or demands
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/10Input devices or features thereof
    • B60K2370/12Input devices or input features
    • B60K2370/146Input by gesture
    • B60K2370/14643D-gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/70Arrangements of instruments in the vehicle
    • B60K2370/77Arrangements of instruments in the vehicle characterised by locations other than the dashboard
    • B60K2370/797At the vehicle exterior
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062Adapting control system settings
    • B60W2050/0063Manual parameter input, manual setting means, manual initialising or calibrating means
    • B60W2050/0064Manual parameter input, manual setting means, manual initialising or calibrating means using a remote, e.g. cordless, transmitter or receiver unit, e.g. remote keypad or mobile phone

Abstract

The invention relates to a method and a device for remote-controlling a function of a vehicle (1). The device comprises a portable operator-control device (2) having a touch-sensitive display and operator control face (4), a gesture capture means (5), and a wireless communication link (3) between the portable operator-control device (2) and the vehicle (1). A gesture carried out by a user is captured by means of the portable operator-control device (2), and a wireless communication link (3) is set up between the portable operator-control device (2) and the vehicle (1). Predefined gestures are assigned and stored for the purpose of controlling the function of the vehicle (1). The captured gesture is transmitted to the vehicle (1) by means of the communication link (3). The captured gesture is compared with the predefined gestures in the vehicle (1), and when there is correspondence the function of the vehicle (1) which is assigned to the gesture is carried out. Furthermore, the invention relates to a device (12) for inputting the gesture to be carried out, with a graphic guide (13). In this context, the graphic guide (13) comprises a plurality of end positions (14, 15, 16), wherein the end positions (14, 15, 16) symbolise the gear positions of the transmission of a vehicle (1).

Description

 Method and device for remote control of a function of a vehicle

The present invention relates to a method and a corresponding device for remote control of a function of a vehicle by means of a portable or mobile operating device.

The invention further relates to a device for inputting a gesture along a graphical guide, wherein a function of a vehicle is controlled by means of the gesture.

DE 102004004302 A1 describes a mobile device for remote control of the air conditioning of a vehicle. The mobile device communicates with a control device in the vehicle. Actuation commands or actuating instructions can be entered via a touch-sensitive display (touch panel display) on the mobile device. Upon receipt of a corresponding operating instruction from the mobile device, either ventilation of the passenger compartment or air conditioning of the passenger compartment is performed.

In DE 102009019910 A1, gesture recognition is described by processing a temporal sequence of position inputs received via a touch sensor, such as a capacitive or resistive touch sensor. Here, a state machine gesture recognition algorithm is indicated for interpreting coordinate streams output from a touch sensor.

DE 112006003515 T5 describes a method for controlling an electronic device with a touch-sensitive display device. The method includes: detecting a contact with the touch-sensitive display device while the device is in a locked state of a user interface; moving an image corresponding to an unlocked state of the user interface of the device in accordance with the contact; transferring the device to the unlocked state of the user interface when the detected contact corresponds to a predefined gesture; and maintaining the device in the locked state of the user interface if the detected contact does not correspond to the predefined gesture.

From EP 1 249 379 A2 a method for moving a motor vehicle into a target position is known. In this case, the motor vehicle is brought into a starting position near the desired target position. After a first driver-side activation, the environment of the motor vehicle is scanned continuously and the current vehicle position is determined continuously. On the basis of the determined environmental and position information, a trajectory to the target position is calculated. To follow the trajectory, control information for moving the motor vehicle into the target position is generated. After a second driver-side activation, control commands dependent on this control information are delivered to the drive train, the brake system and the steering of the motor vehicle. As a result, the motor vehicle moves into the target position independently of the driver. The driver-side activation can take place outside the motor vehicle.

In DE 10 2009 041 587 A1 a driver assistance device is described. Its control device outputs control signals to a drive and steering device of the motor vehicle and thereby causes an autonomous parking operation to be performed. By means of a remote control, commands can be issued to the control device from outside the vehicle. After receiving a predetermined interrupt command, an already started parking operation of the motor vehicle can be interrupted. At least one camera is coupled to the control device and obtains image data about an area surrounding the motor vehicle. The control device sends the image data obtained from the camera, or image data calculated therefrom, to the remote control. The remote control displays these image data using complex display and control units.

As shown in the prior art, touch-sensitive display and control panels (also known as "touch screens") are becoming increasingly popular as display and user-input devices on portable devices such as smartphones or tablet PCs. They display graphics and text and provide a user interface through which a user can interact with the devices. A touch-sensitive display and control surface detects and reacts to a touch on the control surface. A device may display one or more virtual buttons, menus, and other user-interface objects on the control panel. A user may interact with the device by touching the touch-sensitive display and control surface at positions corresponding to the user-interface objects with which he wishes to interact. In this way, for example, an application running on such a device can be started. In addition, various gestures can be used for unambiguous operation, such as unlocking by a swipe gesture or a special unlocking gesture. Besides combined touch-sensitive display and control surfaces, there are also touch-sensitive control surfaces remote from the display, such as the touchpads in laptops.

Triggering or activating a function of the vehicle from outside the vehicle, such as locking or unlocking or opening and closing vehicle doors, switching the vehicle's air-conditioning system on or off, activating radio or navigation systems, etc., is today mainly performed using special-purpose devices. The reason for this is the high safety requirement in the remote control of vehicle functions. If the device has a touch-sensitive display and control surface, the main problem is the inadvertent activation or deactivation of functions by unintentional contact with the control surface. In addition, the operation provides only weak haptic feedback to the user. To be sure whether a certain function is triggered or not, the user must constantly look at the display and control surface. Monitoring the execution of the vehicle function is thus difficult because of the constant eye contact required with the operating surface. For example, when performing a parking operation from outside the vehicle, the user should always keep the moving vehicle in the field of view in order to be able to bring the vehicle to a standstill in an emergency.

If the user uses his own mobile device, such as a mobile phone, to control vehicle functions, safe operation is even harder to guarantee. Such electronic devices are consumer electronics and are not designed, in terms of safety, to operate vehicle functions. Their functionality is susceptible to interference, and the communication with the vehicle can easily be manipulated. It is therefore an object of the present invention to provide a method, improved over the prior art, and improved devices for remote control of a function of a vehicle and for inputting a gesture along a graphical guide.

This object is achieved by the subject matter with the features of claim 1, claim 6 and claim 8.

In order to control a function of a vehicle from outside the vehicle, various gestures are assigned to the different vehicle functions. This means that a vehicle function can only be activated or triggered if the corresponding gesture has been executed by the user. These gestures are called "predefined gestures".

The associations between the gestures and the vehicle functions are stored in the vehicle, e.g. in a storage unit in the vehicle. Both the raw data of the gestures and certain decision criteria for each gesture are filed there. The assignment between the gestures and the vehicle function can also be stored. Likewise, this assignment or the decision criteria of the gestures can also be stored in the portable operating device. The vehicle functions can only be activated or triggered if the corresponding gesture has been carried out by the user or if the executed gesture fulfils the corresponding criteria.

The predefined gestures can be fixed or dynamically changeable gestures which fulfil, for example, the following criteria: the gesture must have a certain shape, the gesture must take place at a specific point on the control panel, or the shape or position of the gesture is predetermined by the vehicle. Likewise, the vehicle itself can generate a gesture. For example, to activate a vehicle function, a sequence of numbers (e.g. 3547) is stored in the memory unit, and this sequence of numbers must be entered by means of a known gesture (e.g. by swiping in one stroke or by tapping the fields sequentially). In addition, the shape or location of the gesture to be performed may change with each operation. In this case, the current state of the vehicle can be taken into account and the "predefined gestures" to be executed adapted accordingly. Depending on the vehicle function, the complexity of the gestures may vary. For example, more complex gestures or patterns may be assigned to safety-critical functions such as "granting access or driving authorization", and simple gestures to functions such as "switching the air conditioning on and off".

To control a vehicle function, a user performs a gesture on the portable operating device (the "executed gesture"), for example by a specific movement rather than by merely touching the touch-sensitive operating surface of the portable operating device. This "executed gesture" is captured by means of the portable operating device (as the "detected gesture"), e.g. with an integrated gesture detection. In the process, the courses of various parameters such as the position, the pressure or the movement of a conductive object, e.g. a finger, on the control surface are recorded during an expected gesture.

When capturing the gesture, different capture technologies can be used. The most commonly used techniques are passive and active capacitive sensing. Here, the position of a finger on the control surface is determined based on the electrical capacitance. Other detection technologies are techniques based on resistive displays, usually operated with a stylus, or based on ultrasound or other acoustic techniques, or techniques based on total internal reflection or other optical effects. All of these techniques can be used to detect the gesture in the present invention.

Generally, so-called raw data of the gesture (e.g. the position, the pressure or the movement of a conductive object such as a finger) are recorded and stored by the gesture detection. The raw data of the gesture include, for example, the coordinates (x, y) of the touch position of a finger on the operating surface, the course of the touch, the speed of the touch progression, or changes in the direction of the touch path.
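The raw data described above can be pictured as a time-stamped series of touch samples from which the course, speed and direction changes are derived. The following is a minimal sketch of that idea; the type and function names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class TouchSample:
    t: float          # timestamp in seconds
    x: float          # touch position on the control surface
    y: float
    pressure: float   # contact pressure, if the sensor reports it

def path_speeds(samples):
    """Derive the speed of the touch progression between consecutive samples."""
    speeds = []
    for a, b in zip(samples, samples[1:]):
        dt = b.t - a.t
        if dt > 0:
            speeds.append(hypot(b.x - a.x, b.y - a.y) / dt)
    return speeds

# A short trace: three samples recorded 0.1 s apart
trace = [TouchSample(0.0, 10, 10, 0.5),
         TouchSample(0.1, 13, 14, 0.6),
         TouchSample(0.2, 19, 22, 0.6)]
print(path_speeds(trace))
```

From the same sample list, the direction changes of the touch path could be derived analogously from consecutive displacement vectors.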

In order to identify the executed gesture, the captured raw data are transmitted to the vehicle via the wireless communication link and analyzed and evaluated in the vehicle. For the evaluation, the predefined gestures, the raw data of the predefined gestures, or decision criteria are stored in a storage unit in the vehicle or in a control unit of the vehicle. Different gestures are assigned to control different vehicle functions. For gesture recognition in the vehicle, the transmitted raw data of the detected gesture are compared with the stored predefined gestures or evaluated against the decision criteria. If the transmitted raw data correspond to a stored gesture or to the decision criteria, the corresponding vehicle function is executed.
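The vehicle-side evaluation can be sketched as follows: transmitted raw data are compared against a stored assignment of predefined gestures to functions, and a function identifier is returned only on correspondence. All names, the toy matching criterion and the stored template are illustrative assumptions, not the patent's actual method.

```python
def resample(points, n=8):
    """Crude normalisation: pick up to n evenly spaced points from a trace."""
    step = max(1, (len(points) - 1) // (n - 1))
    return points[::step][:n]

def matches(raw, template, tol=5.0):
    """Toy decision criterion: every resampled point lies near the template."""
    a, b = resample(raw), resample(template)
    if len(a) != len(b):
        return False
    return all(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
               for p, q in zip(a, b))

# Assumed stored assignment between predefined gestures and vehicle functions
PREDEFINED = {
    "unlock_doors": [(0, 0), (10, 0), (20, 0), (30, 0)],  # swipe to the right
}

def evaluate(raw):
    """Return the assigned function if the raw data match a stored gesture."""
    for function, template in PREDEFINED.items():
        if matches(raw, template):
            return function   # correspondence: this function is executed
    return None               # no correspondence: nothing happens
```

A slightly wobbly rightward swipe such as `evaluate([(0, 1), (10, 2), (20, 1), (30, 0)])` would match, while a vertical trace would not.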

Detecting a gesture performed by a user, rather than just a touch, greatly reduces the risk of inadvertent activation of vehicle functions. Unintentional contact with the control surface cannot trigger a function. Thus, higher operating safety is achievable.

By transmitting the detected gesture from the portable operating device to the vehicle, the operability of the portable operating device can be monitored. The raw data of the gestures can only be transferred as long as the communication connection between the portable operating device and the vehicle is established.

The comparison of the detected gesture with the predefined gestures in the vehicle is independent of the gesture recognition of the portable operating device. Thus, simple operating devices can also be used for controlling the vehicle function; in the operating device, only gesture detection, not gesture recognition, is required.

Preferably, gesture recognition may be performed both by the portable operating device and by the vehicle. The detected gesture can then be recognized independently and assigned to a gesture movement. By means of the gesture recognition integrated in a smartphone or tablet, for example, tapping, dragging, pushing, long-press and variable dragging gestures can be detected and recognized. The executed gestures are assigned a gesture movement (so-called swiping, pushing, rotating, zooming, etc.). When transmitting to the vehicle by means of the communication connection, not only the raw data of the detected gesture but also the result of the gesture recognition, i.e. the associated gesture movement, is transmitted. In the vehicle, the transmitted raw data of the detected gesture are evaluated, for example by means of an on-board gesture recognition. The result of this vehicle-side gesture recognition, i.e. the gesture movement determined in the vehicle, is compared with the transmitted gesture movement. If both the raw data of the recognized gestures and the gesture movements match, the corresponding vehicle function is executed.
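The double check just described can be sketched as follows: the vehicle re-runs its own recognition on the transmitted raw data and executes the function only if its result agrees with the gesture movement recognised and transmitted by the operating device. The classifier and all names are deliberately simplistic, illustrative assumptions.

```python
def classify(raw):
    """Toy on-board gesture recognition: label the dominant swipe direction
    from the first and last raw-data points (x, y)."""
    dx = raw[-1][0] - raw[0][0]
    dy = raw[-1][1] - raw[0][1]
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

def verify(raw, device_label, expected_label="swipe_right"):
    """Execute the function only if the device-side and vehicle-side
    recognitions agree on the expected gesture movement."""
    vehicle_label = classify(raw)
    return vehicle_label == device_label == expected_label

# The device recognised "swipe_right"; the vehicle confirms it independently
raw = [(0, 0), (15, 2), (30, 3)]
print(verify(raw, "swipe_right"))
```

If an attacker injected a fabricated gesture label without matching raw data (or vice versa), the two recognition results would disagree and the function would not be executed.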

By transmitting both the detected gesture and the gesture movement from the portable operating device to the vehicle, a multiple verification of the gesture recognition takes place. The operability of the portable operating device can be ensured via the communication between the operating device and the vehicle. Tampering with either the portable operating device or the vehicle can be prevented.

Preferably, the wireless communication link between the portable operating device and the vehicle is a Bluetooth connection. Other wireless connections may be any radio connection, e.g. a Wi-Fi connection or a mobile communication connection. Any radio communication standard can be used. Depending on the availability of the communication connection, the system can thus easily be adapted.

The following data are preferably transmitted when transmitting the detected gesture to the vehicle: the coordinate course of the detected gesture, or the speeds of the touch progression, or the changes in direction of the touch path on the touch-sensitive display and control surface, or the gesture movement recognized by the portable operating device.

Thus, different information can be transmitted depending on the type of gesture. This simplifies the transmission, since e.g. only the information characterizing the properties of the gesture is transmitted.

Also, various pieces of information about the same gesture can be transmitted simultaneously. This information can be evaluated independently. The redundancy increases the reliability of the remote control of the vehicle function. Safe gesture recognition is possible, especially for complex gestures.

Depending on the nature of the vehicle function, the function is performed either after the gesture has been completed and recognized, or while the gesture is being performed. For a vehicle function that is to be monitored by a user during its execution, e.g. opening a convertible roof or moving the vehicle forwards and backwards, the function of the vehicle is carried out only as long as the executed gesture is detected by means of the operating device. In this case, e.g. a continuous movement must be carried out on the operating surface. This continuous movement may be circular, a back-and-forth swiping without lifting the finger, a swiping in one direction with lifting of the finger, etc. Continuous pressing on the control surface during operation, as with a dead man's switch, is not sufficient. This is especially important for performing functions relevant to driving safety, such as parking the vehicle in and out. The function of the vehicle is only carried out if a continuous or constant movement is detected on the control surface. In addition, the gesture has to be transmitted to the vehicle and checked in real time ("real-time-like").

The risk of unintentional incorrect operation is thus effectively prevented. The operating safety and the monitoring are ensured by a simple operating gesture.
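The continuous-movement criterion above can be sketched as a simple check over the most recent touch samples: the function is permitted only while fresh samples keep moving, so a finger merely resting on the surface (constant position, as with a dead man's switch) does not suffice. Thresholds and names are illustrative assumptions.

```python
def allow_function(samples, min_step=1.0, window=3):
    """Permit the vehicle function only if each of the last `window`
    consecutive sample-to-sample steps moved at least `min_step`
    on the control surface (Manhattan distance, for simplicity)."""
    recent = samples[-(window + 1):]
    if len(recent) < window + 1:
        return False     # not enough fresh samples: stop the function
    return all(abs(b[0] - a[0]) + abs(b[1] - a[1]) >= min_step
               for a, b in zip(recent, recent[1:]))

moving  = [(0, 0), (2, 0), (4, 1), (6, 1), (8, 2)]   # finger keeps circling
resting = [(0, 0), (2, 0), (4, 1), (4, 1), (4, 1)]   # finger merely presses
print(allow_function(moving), allow_function(resting))
```

In a real system this check would run on every incoming sample of the real-time stream, so that the vehicle stops the function as soon as the movement ceases or the transmission stalls.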

The device for remotely controlling a function of a vehicle comprises a portable operating device and a wireless communication link between the portable operating device and the vehicle. The portable operating device comprises a touch-sensitive display and control surface and a gesture detection. Likewise, the portable operating device may include gesture recognition. The gesture detection and the gesture recognition can be implemented separately or in an integrated manner. A gesture executed by the user on the touch-sensitive display and operating surface can then be detected or recognized by means of an integrated gesture recognition of the operating device.

Such gesture recognition has been used for many years in operating devices with a touch-sensitive control surface. Early examples are the character recognition in PDAs and the dragging finger movement and the single and double tap on the touchpad of a notebook. More recently, gesture recognition has been integrated into smartphones and tablets. Tapping, dragging, pushing and swiping gestures (wiping, pushing, rotating, zooming, etc.) are recognized by recording various parameters such as the position, pressure or motion of a conductive object, e.g. a finger, on the control panel during an expected gesture.

Likewise, the display and control surface can be a so-called "multi-touch pad". Such a control surface can be operated with several fingers at the same time, so that several touch contacts and movements can be detected simultaneously. A function can thus be configured to be executed only if several operating gestures are carried out at the same time.

The portable operating device is located outside the vehicle, so that a user can comfortably control the vehicle or a vehicle function from the outside. The operating device may be a hand-held computer, a tablet, a mobile phone, a media player, a personal digital assistant (PDA) or a wireless remote-control device.

The vehicle contains a storage unit in which the predefined gestures assigned to control the function of the vehicle are stored. Furthermore, the vehicle has a control unit that can execute the function of the vehicle. The assignment between the gestures and the vehicle function, the raw data of the gestures, or specific decision criteria for a gesture can be stored in the storage unit. These data are dynamically changeable. An algorithm for generating or changing certain decision criteria for a gesture may also be stored in the storage unit.

Not only can the assignment between the gestures and the vehicle function be changed; the "predefined gestures" to be executed can also be changed. The vehicle can generate its own gesture, and the shape or position of the gesture to be executed can change with each operation. The current state of the vehicle can be taken into account and the operation adjusted accordingly. The user can define his own gesture. A flexible assignment is thus feasible, and the operation can be designed in a user-friendly way.

Preferably, the storage unit in the vehicle or in a control unit of the vehicle, in which the predefined gestures for controlling the function of the vehicle are stored, can be secured. Preferably, access to the storage unit in the vehicle is only possible with an authorization from the vehicle manufacturer. The predefined gestures can also be stored in a separate secure area in the vehicle or in the control unit of the vehicle, so that access is only possible with appropriate authorization. This allows a high level of security and at the same time high availability of the operating data for controlling the function of the vehicle. The association between the predefined gestures and the vehicle functions may be transferred, e.g., from a database server of the vehicle manufacturer to the vehicle or to the control unit of the vehicle. With appropriate authorization, the assignment can be changed. On the one hand, flexible administration is possible; on the other hand, the control of the vehicle function is protected against unauthorized manipulation.

The gesture detected by the portable operating device is transmitted to the vehicle by means of the communication link. There, the detected gesture is recognized with an on-board gesture recognition.

For gesture recognition in the vehicle, the transmitted raw data of the detected gesture are evaluated, e.g. in an evaluation unit. Like the memory unit, the evaluation unit can be protected against unauthorized access. By comparing the raw data of the detected gesture with the stored predefined gesture or with the stored decision criteria, it is determined whether the detected gesture is valid for controlling a vehicle function. If so, the corresponding vehicle function is performed by means of the control unit of the vehicle.

A gesture is recognized as a valid gesture if certain decision criteria are met. These criteria may be, for example, that the gesture has a certain shape, or is performed at a certain point of the control surface, or changes with each operation, or corresponds to a continuous movement, or is vehicle-specific. The above components (gesture detection, gesture recognition, memory unit, control unit) may be implemented in hardware, software, or a combination of both.

To enter the gestures which are assigned to control the function of the vehicle, a graphical guide can be displayed on the touch-sensitive display and control surface. The function of the vehicle is only executed if the executed gesture is detected within the graphical guide and corresponds to the predefined gesture for this vehicle function.

An example is an input field with numbers from 0 to 9. To activate a vehicle function, a 4-digit number sequence (e.g. 3569) is to be entered. If a gesture such as a swipe gesture is executed in a single stroke from 3 to 9, the vehicle function is activated. This path can be displayed in color on the control surface. A gesture such as successively tapping the fields can also be guided in this way. The graphical guide can also change with the current state of the vehicle and adapt accordingly.
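The numeric-field example above can be sketched as reducing a swipe trace to the sequence of digit fields it passes through and comparing that sequence to the predefined code. This is a hypothetical illustration; the 3x4 keypad layout, cell size and all names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: map each touch coordinate of a swipe to the digit
# field it falls into, collapse repeats, and activate the function only if
# the resulting sequence equals the predefined number sequence ("3569").
CELL = 50  # assumed width/height of one digit field in pixels
KEYPAD = [["1", "2", "3"],
          ["4", "5", "6"],
          ["7", "8", "9"],
          [None, "0", None]]

def digit_at(x, y):
    """Map a touch coordinate to the digit field it falls into, if any."""
    col, row = int(x // CELL), int(y // CELL)
    if 0 <= row < len(KEYPAD) and 0 <= col < 3:
        return KEYPAD[row][col]
    return None

def digits_of_trace(trace):
    """Collapse a touch trace into the ordered digits it crosses."""
    seq = []
    for x, y in trace:
        d = digit_at(x, y)
        if d is not None and (not seq or seq[-1] != d):
            seq.append(d)
    return "".join(seq)

def sequence_entered(trace, code="3569"):
    return digits_of_trace(trace) == code
```

A stroke passing through the fields 3, 5, 6 and 9 in order would activate the function; a stroke skipping fields would not.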

The pictorial representation of the graphical guide makes the operation user-friendly: the user can easily perform the required gesture. The recording is limited to the area of the graphical guide. Unintentional touches on the control surface outside the graphical guide are not detected and thus not transmitted to the vehicle, so that no unnecessary data are transmitted.

The device for inputting the graphically guided gesture on the touch-sensitive display and control surface includes a graphical guide with multiple end positions. The end positions are connected via connection paths.

An example of this is the choice of gear position. The end positions reflect the transmission settings of a shift gate. When a gesture is executed from one end position to another end position along the connection paths, the corresponding gear position is set in the vehicle. The graphical guide in this case is a symbolized shift gate. Preferably, a shift lever of the vehicle transmission is shown as a picture element, for example a movable point on the touch-sensitive display and control surface. The picture element can display the current switching state. When a gesture is executed along a connection path between two end positions, the picture element moves with it. The corresponding gear position is set in the vehicle when the picture element has reached the corresponding end position. Thus, the "gear change" function can be controlled by means of the portable operating device, with the picture element displaying the current state of the vehicle.
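The shift-gate behaviour described above can be sketched as a small state machine: a gear change is only carried out when the gesture reaches another end position along an allowed connection path, otherwise the picture element snaps back. This is a hypothetical illustration; the class, method names and path set are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the symbolized shift gate: three end positions
# ("P", "D", "R") joined by connection paths, and a gear change that is
# only carried out when the gesture reaches another end position.
ALLOWED_PATHS = {("P", "D"), ("D", "P"), ("P", "R"), ("R", "P"),
                 ("D", "R"), ("R", "D")}

class ShiftGate:
    def __init__(self, gear="P"):
        self.gear = gear      # current gear, mirrored by the picture element
        self.pending = None   # end position the finger is moving toward

    def begin_move(self, target):
        """Finger starts moving along a connection path toward target."""
        if (self.gear, target) in ALLOWED_PATHS:
            self.pending = target

    def release(self, reached_end_position):
        """Finger lifted: change gear only if an end position was reached;
        otherwise the picture element snaps back to the last valid gear."""
        if reached_end_position and self.pending is not None:
            self.gear = self.pending
        self.pending = None
        return self.gear
```

Releasing the finger mid-path leaves the gear unchanged, matching the snap-back behaviour of the picture element.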

Thus, this device is not only user-friendly, intuitive and easy to use, but also ensures safety for vehicle-specific operation.

There are now various possibilities for designing and developing the teaching of the present invention in an advantageous manner. Reference should be made, on the one hand, to the subordinate claims and, on the other hand, to the following explanation of the embodiments. Advantageous embodiments resulting from any combination of the subclaims are also included.

The teachings of the present invention are not limited to the remote control of a vehicle function; the corresponding devices and methods can also be used for the remote control of any machinery and equipment.

The present invention will be explained below with reference to several embodiments and the accompanying drawings. It should be understood that the drawings illustrate preferred embodiments of the invention but do not limit it.

The drawings each show in a schematic representation:

Fig. 1 basic structure of a device for remote control of a vehicle function according to an embodiment of the present invention

Fig. 2 flowchart of a method for the remote control of a vehicle function after execution of a complete gesture according to an embodiment of the present invention

Fig. 3 flowchart of a method for the remote control of a vehicle function during execution of a gesture according to an embodiment of the present invention

Fig. 4 device for inputting a graphically guided gesture according to an embodiment of the present invention

FIG. 1 illustrates a device 6 for remote control of a function of a vehicle 1 according to an embodiment of the invention. The device 6 comprises the vehicle 1, a portable operating device 2 and a wireless communication link 3 between the portable operating device 2 and the vehicle 1.

The portable operating device 2, here formed e.g. as a mobile phone, is located outside the vehicle 1, so that a user can conveniently control the vehicle 1 and a vehicle function from the outside.

In order to be able to communicate with the vehicle 1, the mobile telephone 2 has a wireless communication interface 9, e.g. a Bluetooth interface 9. Via this interface 9, the mobile phone 2 communicates with the Bluetooth interface 10 of the vehicle 1. Data can be sent, transmitted and received via the Bluetooth connection 3. Furthermore, the functionality of the mobile phone 2 can be monitored via the data transmission over the Bluetooth connection 3.

The mobile phone 2 has a display and control surface 4 for operating the remote control. Here, the display and control surface 4 is a touch-sensitive flat display ("touch screen") through which the control commands for controlling the vehicle function are input. For example, the mobile phone user makes a gesture on the touch screen 4 with his finger. The executed gesture is detected by means of a gesture detection 5 integrated in the mobile telephone 2. In this case, so-called raw data of the gesture are recorded, stored in a memory in the gesture detection 5, and then evaluated. The raw data of the gesture can be, e.g., the course of the coordinates (x, y) of the touch of the finger on the touch screen 4. Both the raw data and the evaluation result of the mobile phone 2 are transmitted to the vehicle 1 by means of the Bluetooth connection 3 and evaluated in the vehicle 1.

For the evaluation in the vehicle 1, predefined gestures or the raw data of the predefined gestures are stored in a memory unit 7 in the vehicle 1 and in a control unit 8 of the vehicle 1. Different gestures are assigned to different vehicle functions. Preferably, access to the memory unit 7 in the vehicle 1, in which the predefined gestures for controlling the function of the vehicle 1 are stored, is secured. For example, the memory unit 7 may only be written and read with authorization by the vehicle manufacturer. This memory unit 7 can also be located in a separate secure area in the vehicle 1 or in the control unit 8 of the vehicle 1, so that access is possible only with appropriate authorization.

For gesture recognition in the vehicle, the transmitted raw data are evaluated in an evaluation unit 11. Like the memory unit 7, the evaluation unit 11 can be secured against unauthorized access. By comparing the raw data of the executed gesture with the raw data of the stored predefined gesture, it is determined whether the executed gesture is valid for controlling a vehicle function. If the data match, the corresponding vehicle function is carried out by means of the control unit 8 of the vehicle 1.

A gesture is recognized as a valid gesture if certain criteria are met. These criteria may be, e.g., that the gesture corresponds to a particular shape, is carried out at a certain point of the control surface, changes with each operation, corresponds to a continuous movement, or is vehicle-specific.

To enter the gesture for the control of the vehicle function, a corresponding operating display can be displayed on the touch screen 4 of the mobile phone 2.

Fig. 2 shows a flow chart of a method for remote control of a vehicle function after execution of a complete gesture according to an embodiment of the present invention. The vehicle function is only started if a corresponding gesture was completely executed, for example on the touch screen 4 of the mobile phone 2.

In a step not shown here, a user selects an application such as "engine start" on his mobile phone 2. The corresponding application program is started.

In step S1, an operation display for inputting certain gestures for controlling the vehicle function "engine start" appears on the touch screen 4 of the mobile phone 2. This display may be presented in textual or pictorial form as a graphical guide on the touch screen 4. For this embodiment, the display may be shown, e.g., as the text "Please enter numbers 9541" or as an image on the touch screen.

At the same time, a wireless communication link 3, here a Bluetooth connection, is established between the mobile phone 2 and the vehicle 1. Thus, the control commands for controlling the vehicle function, or the executed gesture that the mobile phone user has performed with his finger on the touch screen 4, can be transmitted to the vehicle 1.

In step S2, it is determined whether a touch on the touch screen 4 is detected or not. If no touch is detected, which corresponds to a "no" answer in step S2, the processing flow goes to step S3. If a touch is detected, which corresponds to a "yes" answer in step S2, the processing flow goes to step S4.

In step S3, it is determined whether or not a predetermined abort condition is satisfied. The predetermined abort condition may be, for example, that no gesture has been detected on the touch screen 4 for a predetermined period T1. If the predetermined abort condition is satisfied, that is, no touch is detected within the time period T1, which corresponds to a "yes" answer in step S3, the process is aborted and terminated. If the abort condition is not met, which corresponds to a "no" answer in step S3, the processing flow returns to step S2. The corresponding operating display for the input of certain gestures continues to be displayed on the mobile phone 2, and the user can continue his gesture or enter it again.

In step S4, the so-called raw data of the gesture, e.g. the course of the coordinates of the executed touch, are recorded and evaluated by the gesture detection 5 in the mobile phone 2.

In step S5, the raw data of the executed gesture are transmitted to the vehicle 1 via the Bluetooth connection 3. Likewise, the evaluation result of the gesture detection 5 of the mobile phone 2 can also be transmitted.

In step S6, it is determined whether or not the raw data of the executed gesture are valid. That is, the raw data are evaluated in the vehicle 1, independently of the gesture detection 5 in the mobile phone 2, by the vehicle's own gesture recognition 11. In doing so, it is checked, e.g., whether the evaluation result matches the stored predefined gestures. If "yes", the associated vehicle function is executed in step S7. If "no", the process is aborted and terminated.

In step S7, it is determined whether the executed gesture is complete. If so, the processing flow goes to step S8 and the vehicle function is activated; here, the engine of the vehicle is started. If not, the processing flow returns to step S2. That is, as long as a movement of the touch on the control surface is detected, the coordinates of the raw data are recorded in step S4, transmitted to the vehicle in step S5, and checked for validity, until the executed gesture is complete.
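The loop of steps S2 to S8 can be sketched as follows. This is a hypothetical simulation, not part of the disclosure: `touch_samples` stands in for the touch screen, `is_valid` for the vehicle-side gesture recognition, and `activate` for the control unit; mapping the loop exhaustion to gesture completeness is an assumption for illustration.

```python
# Hypothetical sketch of the flow of Fig. 2 (steps S2-S8): raw data are
# recorded and checked for validity sample by sample, and the vehicle
# function is only activated once the complete gesture has been executed.
def run_remote_control(touch_samples, is_valid, activate):
    """touch_samples: iterable of (x, y) or None (no touch, i.e. abort)."""
    raw_data = []
    for sample in touch_samples:           # S2: touch detected?
        if sample is None:                 # S3: abort condition met
            return False
        raw_data.append(sample)            # S4: record raw data
        if not is_valid(raw_data):         # S5/S6: transmit and validate
            return False
    activate(raw_data)                     # S7/S8: gesture complete, start
    return True
```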

Fig. 3 shows a flow chart of a method of remotely controlling a vehicle function while executing a gesture in accordance with an embodiment of the present invention. In this case, the vehicle function is only executed as long as a corresponding gesture is being carried out, for example on the touch screen 4 of the mobile phone 2. This is necessary for carrying out an operation for which monitoring is important. In a step not shown here, a user selects an application such as "opening a convertible roof" or "driving" on his mobile phone 2. The corresponding application program is started.

In step S1, an operation display for inputting certain gestures for controlling an operation of the vehicle, such as "opening a convertible roof" or "driving", appears on the touch screen 4 of the mobile phone 2. This display may be presented in textual or pictorial form as a graphical guide on the touch screen 4. For this embodiment, the display may be shown, e.g., as the text "Please execute a continuous circular motion". A circular image can also be displayed on the touch screen as an indicator to make clear to the user that the touch screen should now be operated with a circular motion. The circular movement can be carried out, for example, in one direction, circling with a change of direction, or even, for example, in the form of a figure eight. The user must operate the touch screen without lifting the finger.
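Recognizing such a continuous circular motion can be sketched by accumulating the signed turning angle between successive movement segments of the trace. This is a hypothetical illustration, not part of the disclosure; the function name and the full-turn threshold are assumptions.

```python
# Hypothetical sketch: the signed turning angle between successive trace
# segments is accumulated; the gesture counts as a circular motion once at
# least one full turn (2*pi) has been swept.
from math import atan2, pi

def is_circular(trace, min_turn=2 * pi):
    """Return True if the (x, y) trace sweeps at least one full turn."""
    total = 0.0
    for i in range(2, len(trace)):
        (x0, y0), (x1, y1), (x2, y2) = trace[i - 2], trace[i - 1], trace[i]
        a1 = atan2(y1 - y0, x1 - x0)
        a2 = atan2(y2 - y1, x2 - x1)
        d = a2 - a1
        # wrap the angle difference into (-pi, pi]
        while d <= -pi:
            d += 2 * pi
        while d > pi:
            d -= 2 * pi
        total += d
    return abs(total) >= min_turn
```

Circling with a change of direction or a figure eight would need a variant that tracks direction reversals separately; the sketch covers only the single-direction case.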

Similar to Fig. 2, the same steps S1 to S6 are performed. A Bluetooth connection is established between the mobile phone 2 and the vehicle 1 for transmitting the detected gesture. As long as a touch on the touch screen 4 is detected, the so-called raw data of the gesture are detected, evaluated, and transferred to the vehicle 1 via the Bluetooth connection 3.

In step S6, it is determined whether or not the raw data of the executed gesture are valid. If "yes", the associated vehicle function is executed in step S9. If "no", the processing flow goes to step S11.

While the vehicle function is being executed, it is determined in step S10 whether or not a further movement of the touch on the touch screen 4 is detected. If a movement is detected, which corresponds to a "yes" answer in step S10, the processing flow returns to step S4. That is, as long as a movement of the touch on the control surface is detected, the coordinates of the raw data are recorded in step S4, transmitted to the vehicle in step S5, and checked for validity. In addition, the vehicle can give the driver feedback via the mobile phone, for example by an acoustic or haptic signal. Furthermore, the vehicle and the mobile phone can evaluate the data of the gesture independently of each other. The results of the evaluation of the mobile phone are transmitted to the vehicle. Only if the evaluation results match is the vehicle function associated with the executed gesture carried out.

If no movement of the touch is detected in step S10, which corresponds to a "no" answer in step S10, the processing flow goes to step S11. The vehicle function is stopped and the procedure aborted.

Thus, the recording and transmission of the raw data of the gesture are started when a touch on the touch screen 4 is detected, and are not stopped until no more contact is detected. In this case, the transmitted raw data are checked for validity in step S6 in the vehicle 1. The corresponding associated vehicle function is performed as long as the transmitted raw data are valid. If the raw data are not valid, the process is aborted immediately. For the example of controlling the vehicle function "opening a convertible roof" by means of a continuous circular motion on the touch screen 4, the opening of the convertible roof is only performed while the associated gesture is being executed as a circular motion on the touch screen 4. The process is stopped immediately if no temporal change of the touch coordinates on the touch screen 4 is detected, i.e. when the finger is lifted off or merely rests on the touch screen 4 without moving. In this way, opening the roof is stopped as with a dead man's switch.
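The dead-man's-switch behaviour can be sketched as a filter over the touch coordinates: the function may only run while the coordinates keep changing. This is a hypothetical illustration, not part of the disclosure; the generator name and the unchanged-sample timeout are assumptions.

```python
# Hypothetical sketch of the dead man's switch: the vehicle function stays
# active only while the touch coordinates keep changing; once no temporal
# change is detected for `timeout` consecutive samples, it must stop.
def dead_man_filter(samples, timeout=3):
    """samples: successive (x, y) touch coordinates.
    Yields True while the function may run, False once it must stop."""
    last, unchanged = None, 0
    for pos in samples:
        if pos == last:
            unchanged += 1
        else:
            unchanged = 0
        last = pos
        if unchanged >= timeout:
            yield False   # no movement detected: stop immediately
            return
        yield True
```

In the convertible-roof example, each `True` would let the roof keep moving, and the first `False` would halt it.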

For the "driving" function, the driver executes the rotational movement; the data of the executed gesture, in this case the course of the movement, are transmitted to the vehicle via the wireless communication link, and the vehicle steers and drives. If the driver stops the circular motion, the vehicle stops. If the driver resumes the circular movement, the vehicle continues to drive. In doing so, the vehicle can recognize obstacles independently via the on-board sensors and react accordingly. If an obstacle is detected, the vehicle brakes and comes to a stop at a distance from the obstacle. The vehicle can give feedback to the vehicle user by means of an acoustic or haptic signal. The intensity of the signal can be varied with the distance to the obstacle.

Fig. 4 shows a device for inputting a graphically guided gesture. By means of the graphically guided gesture, for example, a gear change can be performed in a vehicle 1 via a mobile telephone 2.

In this case, the device comprises a graphical guide 13 with three end positions (14, 15, 16) and a picture element 17. The end positions (14, 15, 16) can be reached via three connection paths (18, 19, 20).

The three end positions (14, 15, 16) correspond to the three transmission settings of a shift gate. The end position 14 with "D" stands for "Drive" and corresponds to the vehicle function "engage forward gear". The end position 15 with "P" stands for "Park" and corresponds to the vehicle function "engage parking brake". The end position 16 with "R" stands for "Reverse" and corresponds to the vehicle function "engage reverse gear".

The end positions (14, 15, 16) can be arranged at a distance from each other such that "D" 14 is located at the upper end of the mobile phone display 4, "R" 16 vertically below "D" 14 at the lower end of the display, and "P" 15 at a right angle at the middle of the distance between "R" 16 and "D" 14.

 The picture element (17) is formed in this embodiment as a filled circle. It corresponds to a shift lever of the vehicle transmission. The position of the circle (17) indicates the current switching state of the vehicle (1) on the mobile phone display (4). The circle (17) can be moved by means of a finger along the three connection paths (18, 19, 20) to the three end positions (14, 15, 16). The position of the circle (17) corresponds to the current position of the finger within the graphic guide (13).

The corresponding function of the vehicle is activated only when the gesture has been executed from one end position to the other end position within the graphical guide (13) and the circle (17) has reached the corresponding end position. Otherwise, the circle (17) is returned to its last valid end position and no gear change takes place in the vehicle (1). In order to enable a gear change, the picture element (17) is moved. The movement takes place via a gesture of the vehicle user from one end position, for example 15, to another end position, for example 14. The movement of the picture element (17) must be carried out along a connection path (18, 19, 20) and without the finger being lifted off.

The connection paths can be right-angled (18, 19) or rectilinear (20). For example, if the vehicle is to be moved forward from the parking position (15), the vehicle user must make a gesture in a single stroke from "P" (15) at a right angle to "D" (14). If the finger is lifted off before an end position is reached, the picture element (17) jumps back to its initial position, in this case the parking position (15), and a gear change in the vehicle (1) does not take place.

Claims
1. Method for remote control of a function of a vehicle (1),
 - wherein a gesture executed by a user is detected by means of a portable operating device (2), and
 - a wireless communication link (3) between the portable operating device (2) and the vehicle (1) is established,
characterized in that
 - predefined gestures are assigned and stored to control the function of the vehicle (1),
 - the detected gesture for controlling the function of the vehicle (1) is transmitted to the vehicle (1) by means of the communication link (3),
 - in the vehicle (1) the detected gesture is compared with the gestures stored in the vehicle (1), and
 - in the case of a match, the function of the vehicle (1) which is associated with the gesture is executed.
2. Method according to claim 1,
characterized in that
 - a gesture movement is assigned to the gesture by the portable operating device (2) and by the vehicle (1) independently of one another,
 - in the vehicle (1) the gesture movement assigned by the portable operating device (2) and the gesture movement assigned by the vehicle (1) are compared with each other, and
 - in the case of coincidence of the two gesture movements, the function of the vehicle (1) which is associated with the gesture is executed.
3. Method according to one of the preceding claims,
characterized in that
the wireless communication link (3) between the portable operating device (2) and the vehicle (1) is a Bluetooth communication or a WLAN communication.
4. Method according to one of the preceding claims,
characterized in that
when transmitting the detected gesture to the vehicle (1),
 - the touch history specified on a touch-sensitive display and control surface (4), or
 - the speeds of the touch history, or
 - the change in direction of the touch history, or
 - the gesture movement assigned by the portable operating device (2)
is transmitted from the portable operating device (2) to the vehicle (1).
5. Method according to claim 4,
characterized in that
the function of the vehicle (1) is performed as long as the executed gesture is detected by the portable operating device (2), or after the executed gesture has been detected by the portable operating device (2).
6. Device (6) for remote control of a function of a vehicle (1), comprising
 - a portable operating device (2), wherein the portable operating device (2) is operable by means of a touch-sensitive display and control surface (4) and comprises a gesture detection (5),
 - a communication device for establishing a wireless communication link (3) between the portable operating device (2) and the vehicle (1),
characterized in that
 - the vehicle (1) comprises a memory unit (7) and a control unit (8),
 - predefined gestures are stored in the memory unit (7), and
 - the control unit (8) compares the detected gesture with a predefined gesture.
7. Device (6) according to claim 6, characterized in that
 - the portable operating device (2) comprises a graphical guide (13) for inputting the gestures, and
 - a function of the vehicle (1) is executable when an executed gesture is entered in the graphical guide (13).
8. Device (12) for inputting a graphically guided gesture with a graphical guide (13),
 characterized in that
 - the graphical guide (13) comprises a plurality of end positions (14, 15, 16), wherein the end positions (14, 15, 16) are connected via connection paths (18, 19, 20), and
 - the end positions (14, 15, 16) symbolize the gear positions of a vehicle (1).
9. Device according to claim 8,
 characterized in that
 - the graphical guide (13) represents a symbolized shift gate.
10. Device according to claim 8 or 9,
 characterized in that
 - the device comprises a movable picture element (17), wherein the picture element (17) symbolizes a shift lever of the vehicle transmission.
EP14727709.9A 2013-07-26 2014-05-21 Method and device for remote-controlling a function of a vehicle Withdrawn EP3025223A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE102013012394.1A DE102013012394A1 (en) 2013-07-26 2013-07-26 Method and device for remote control of a function of a vehicle
PCT/EP2014/001374 WO2015010752A1 (en) 2013-07-26 2014-05-21 Method and device for remote-controlling a function of a vehicle

Publications (1)

Publication Number Publication Date
EP3025223A1 true EP3025223A1 (en) 2016-06-01

Family

ID=50884335

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14727709.9A Withdrawn EP3025223A1 (en) 2013-07-26 2014-05-21 Method and device for remote-controlling a function of a vehicle

Country Status (6)

Country Link
US (1) US20160170494A1 (en)
EP (1) EP3025223A1 (en)
JP (1) JP2016538780A (en)
CN (1) CN105408853A (en)
DE (1) DE102013012394A1 (en)
WO (1) WO2015010752A1 (en)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014011802B4 (en) * 2014-08-09 2019-04-18 Audi Ag Safe activation of a partially autonomous function of a motor vehicle via a portable communication device
US10457327B2 (en) * 2014-09-26 2019-10-29 Nissan North America, Inc. Method and system of assisting a driver of a vehicle
DE102015200044A1 (en) * 2015-01-06 2016-07-07 Ford Global Technologies, Llc Method and device for supporting a maneuvering process of a motor vehicle
US9965042B2 (en) * 2015-03-30 2018-05-08 X Development Llc Methods and systems for gesture based switch for machine control
CN106292552A (en) * 2015-05-14 2017-01-04 中兴通讯股份有限公司 A kind of method of remote-control car and device, terminal and automobile
DE102015213807B4 (en) * 2015-07-22 2017-02-16 Volkswagen Aktiengesellschaft Activating a vehicle action by means of a mobile device
CN105564427B (en) * 2015-07-31 2018-05-15 宇龙计算机通信科技(深圳)有限公司 One kind is parked method, terminal and system
JP6414696B2 (en) * 2015-09-24 2018-10-31 株式会社デンソー Key lock-in prevention device
DE102015222234B4 (en) * 2015-11-11 2019-03-21 Volkswagen Aktiengesellschaft Method for triggering a safety-relevant function of a system and system
CN105644502A (en) * 2016-02-23 2016-06-08 大斧智能科技(常州)有限公司 Gesture unlocking controller for electric vehicle
JP2017185886A (en) * 2016-04-05 2017-10-12 株式会社東海理化電機製作所 Change gear operation device
US10372121B2 (en) 2016-04-26 2019-08-06 Ford Global Technologies, Llc Determination of continuous user interaction and intent through measurement of force variability
DE102016212723A1 (en) 2016-07-13 2018-01-18 Volkswagen Aktiengesellschaft A method of providing a deadman button function by means of a touch screen of an operating device, operating device and system
DE102016224528A1 (en) 2016-12-08 2018-06-14 Volkswagen Aktiengesellschaft Remote assisted maneuvering of a team
DE102016224529A1 (en) 2016-12-08 2018-06-28 Volkswagen Aktiengesellschaft Functional protection of a remote-controlled Anhängerrangierens
RU2652665C1 (en) * 2016-12-12 2018-04-28 Акционерное общество "Лаборатория Касперского" System and method of vehicle control
US10369988B2 (en) 2017-01-13 2019-08-06 Ford Global Technologies, Llc Autonomous parking of vehicles inperpendicular parking spots
US10234868B2 (en) 2017-06-16 2019-03-19 Ford Global Technologies, Llc Mobile device initiation of vehicle remote-parking
DE102017007119A1 (en) 2017-07-27 2019-01-31 Daimler Ag Method for remote control of a function of a vehicle
US10281921B2 (en) 2017-10-02 2019-05-07 Ford Global Technologies, Llc Autonomous parking of vehicles in perpendicular parking spots
US10336320B2 (en) 2017-11-22 2019-07-02 Ford Global Technologies, Llc Monitoring of communication for vehicle remote park-assist
US10507868B2 (en) 2018-02-22 2019-12-17 Ford Global Technologies, Llc Tire pressure monitoring for vehicle park-assist
US10493981B2 (en) 2018-04-09 2019-12-03 Ford Global Technologies, Llc Input signal management for vehicle park-assist
US10232673B1 (en) 2018-06-01 2019-03-19 Ford Global Technologies, Llc Tire pressure monitoring with vehicle park-assist
US10384605B1 (en) 2018-09-04 2019-08-20 Ford Global Technologies, Llc Methods and apparatus to facilitate pedestrian detection during remote-controlled maneuvers
US10529233B1 (en) 2018-09-24 2020-01-07 Ford Global Technologies Llc Vehicle and method for detecting a parking space via a drone

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10117650A1 (en) 2001-04-09 2002-10-10 Daimler Chrysler Ag Bringing vehicle to target position, involves outputting control commands to drive train, braking system and/or steering so vehicle can be steered to target position independently of driver
DE102004004302A1 (en) 2003-02-03 2004-08-12 Denso Corp., Kariya Vehicle remote control air conditioning system has a control unit that activates only a ventilation or climate control component of the air conditioning system in order to reduce battery power consumption
JP2005165733A (en) * 2003-12-03 2005-06-23 Sony Corp Information processing system, remote control device and method, controller and method, program, and recording medium
CN101243382B (en) * 2005-09-15 2013-01-30 苹果公司 System and method for processing raw data of track pad device
EP1848626B1 (en) * 2005-02-18 2017-04-19 Bayerische Motoren Werke Aktiengesellschaft Device for bringing a motor vehicle to a target position
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
DE102009019910A1 (en) 2008-05-01 2009-12-03 Atmel Corporation, San Jose Touch sensor device e.g. capacitive touch sensor device, for e.g. personal computer, has gesture unit analyzing time series data to distinguish gesture inputs, where unit is coded with gesture recognition code having linked state modules
DE102008051982A1 (en) * 2008-10-16 2009-06-10 Daimler Ag Vehicle e.g. hybrid vehicle, maneuvering method, involves releasing parking brake, transferring forward- or backward driving position in automatic transmission, and regulating speed of vehicle by parking brake
DE102009041587A1 (en) 2009-09-15 2011-03-17 Valeo Schalter Und Sensoren Gmbh A driver assistance device for a motor vehicle and method for assisting a driver in monitoring an autonomous parking operation of a motor vehicle
EP2418123B1 (en) * 2010-08-11 2012-10-24 Valeo Schalter und Sensoren GmbH Method and system for supporting a driver of a vehicle in manoeuvring the vehicle on a driving route and portable communication device
US8918230B2 (en) * 2011-01-21 2014-12-23 Mitre Corporation Teleoperation of unmanned ground vehicle
US8958929B2 (en) * 2011-01-31 2015-02-17 Toyota Jidosha Kabushiki Kaisha Vehicle control apparatus
JP2015039084A (en) * 2011-02-28 2015-02-26 シャープ株式会社 Image display device set
JP5793463B2 (en) * 2011-07-19 2015-10-14 日本電信電話株式会社 Information selection apparatus, method, and program
JP6035717B2 (en) * 2011-08-30 2016-11-30 アイシン精機株式会社 Vehicle control apparatus and computer program for portable information terminal
DE102011112447A1 (en) * 2011-09-03 2013-03-07 Volkswagen Aktiengesellschaft Method and arrangement for providing a graphical user interface, in particular in a vehicle
DE102012008858A1 (en) * 2012-04-28 2012-11-08 Daimler Ag Method for performing autonomous parking process of motor vehicle e.g. passenger car, involves storing target position and/or last driven trajectory of vehicle in suitable device prior to start of autonomous vehicle parking operation
JP2014088730A (en) * 2012-10-31 2014-05-15 Mitsubishi Electric Corp Portable communication apparatus and door control device
CN103067630A (en) * 2012-12-26 2013-04-24 刘义柏 Method of generating wireless control command through gesture movement of mobile phone
US9122916B2 (en) * 2013-03-14 2015-09-01 Honda Motor Co., Ltd. Three dimensional fingertip tracking

Also Published As

Publication number Publication date
US20160170494A1 (en) 2016-06-16
WO2015010752A1 (en) 2015-01-29
CN105408853A (en) 2016-03-16
DE102013012394A1 (en) 2015-01-29
JP2016538780A (en) 2016-12-08

Similar Documents

Publication Publication Date Title
US9411424B2 (en) Input device for operating graphical user interface
JP2017076408A (en) Inputting of information and command based on gesture for motor vehicle
EP2881878B1 (en) Vehicle control by means of gestural input on external or internal surface
US9733752B2 (en) Mobile terminal and control method thereof
EP2513760B1 (en) Method and apparatus for changing operating modes
US8760432B2 (en) Finger pointing, gesture based human-machine interface for vehicles
KR20080108970A (en) Interactive operating device and method for operating the interactive operating device
KR20110076921A (en) Display and control system in a motor vehicle having user-adjustable representation of displayed objects, and method for operating such a display and control system
JP2006072854A (en) Input device
KR101872426B1 (en) Depth-based user interface gesture control
US20130204457A1 (en) Interacting with vehicle controls through gesture recognition
US20140223384A1 (en) Systems, methods, and apparatus for controlling gesture initiation and termination
EP2451672B1 (en) Method and device for providing a user interface in a vehicle
CN103294366B (en) A kind of screen unlock method and electronic equipment
WO2009006221A1 (en) Virtual keypad systems and methods
US20110169750A1 (en) Multi-touchpad multi-touch user interface
EP3009799A1 (en) Method for operating a motor vehicle employing a touch screen
WO2006025891A2 (en) Touch gesture based interface for motor vehicle
US10025388B2 (en) Touchless human machine interface
US20140282161A1 (en) Gesture-based control systems and methods
KR101561917B1 (en) Vehicle control apparatus and method thereof
EP2328061A2 (en) Input apparatus
EP2750915B1 (en) Method and system for providing a graphical user interface, in particular in a vehicle
JP2010184600A (en) Onboard gesture switch device
KR20100106203A (en) Multi-telepointer, virtual object display device, and virtual object control method

Legal Events

Date Code Title Description
17P Request for examination filed

Effective date: 20160205

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent to:

Extension state: BA ME

DAX Request for extension of the european patent (to any country) (deleted)
17Q First examination report despatched

Effective date: 20161124

18W Application withdrawn

Effective date: 20170119