EP3298475A1 - Bediensystem und verfahren zum betreiben eines bediensystems für ein kraftfahrzeug - Google Patents
Bediensystem und verfahren zum betreiben eines bediensystems für ein kraftfahrzeugInfo
- Publication number
- EP3298475A1 EP3298475A1 EP16722047.4A EP16722047A EP3298475A1 EP 3298475 A1 EP3298475 A1 EP 3298475A1 EP 16722047 A EP16722047 A EP 16722047A EP 3298475 A1 EP3298475 A1 EP 3298475A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- operating system
- detection
- control
- gesture
- operating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 title claims abstract description 12
- 238000001514 detection method Methods 0.000 claims abstract description 114
- 230000011664 signaling Effects 0.000 claims abstract description 36
- 230000008859 change Effects 0.000 claims description 5
- 230000003287 optical effect Effects 0.000 description 11
- 230000008901 benefit Effects 0.000 description 8
- 238000012790 confirmation Methods 0.000 description 6
- 230000009471 action Effects 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 238000011156 evaluation Methods 0.000 description 2
- 230000003213 activating effect Effects 0.000 description 1
- 230000006978 adaptation Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 238000005286 illumination Methods 0.000 description 1
- 230000035515 penetration Effects 0.000 description 1
- 230000008569 process Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- B60K35/10—
-
- B60K35/65—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
- G06V40/113—Recognition of static hand signs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- B60K2360/146—
-
- B60K2360/1464—
Definitions
- the invention relates to an operating system for a motor vehicle.
- the operating system comprises a detection device which is designed to detect at least a body part of a user when it is arranged in a detection region of the detection device.
- the operating system comprises a control device, which is designed to control a signaling device of the operating system.
- Operating systems are known from the prior art, which are controlled by gestures of a user within a detection area.
- at least one system boundary of the detection area can be signaled or displayed to the user by a light cone of the detection area or by a light beam.
- DE 10 2013 009 567 A1 discloses a method for operating a gesture recognition device in a motor vehicle.
- the gesture recognition device performs gesture recognition only if a gesticulating hand is located in a predetermined partial volume of the vehicle interior. Further, the operator is assisted in locating the partial volume by a light source radiating a light beam into the sub-volume. The light beam thus hits the hand of the operator when it is arranged in the sub-volume.
- US 2014/0361989 A1 describes a method for operating functions in a vehicle using gestures executed in three-dimensional space. In one method step, it is determined whether or not a first gesture performed in the three-dimensional space is detected by an image-based detection process. In another step, it is determined whether or not the first gesture is a gesture associated with activating control of a function. When the corresponding gestures are detected, the function associated with the gesture is executed.
- DE 10 2013 012 466 A1 discloses an operating system for one or more vehicle-mounted device(s).
- the operating system comprises a sensor device for detecting image data of a hand located in the detection range of the sensor device and for transmitting the acquired image data to an evaluation device.
- the operating system includes a storage device in which actions associated with hand gestures and hand gestures are stored, an evaluation device that evaluates the image data, and a controller that executes the action associated with a hand gesture.
- the object of the present invention is to support a user particularly well in the operation of a gesture-controlled operating system and thereby to increase the reliability of the gesture operation.
- This object is achieved by an operating system for a motor vehicle and a method for operating an operating system according to the independent patent claims.
- Advantageous embodiments of the invention can be found in the dependent claims.
- this object is achieved by an operating system for a motor vehicle.
- the operating system comprises a detection device which is designed to detect at least a body part of a user when it is arranged in a detection region of the detection device.
- by detection area is meant a spatial area in the environment of the detection device which can be captured by the detection device.
- the detection device may comprise a camera, in particular a 2D camera and / or a 3D camera.
- the detection range of the camera can be determined, for example, by the viewing angle of the camera.
- the operating system has a control device which is designed to control a signaling device of the operating system.
- the operating system is characterized in that the detection device is designed to check whether the body part detected in the detection range is located in an operating space, which forms a portion of the detection area.
- the operating space is thus arranged within the detection range of the detection device.
- by operating space is meant, for example, a subspace within the spatial area that can be captured by the detection device.
- the detection device can thus be located in a predetermined environment and detect in it a predetermined spatial area, the so-called detection area, in which in turn an operating space defined by the detection device is arranged as a subspace of the spatial area.
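The nesting of detection area and operating space described above can be illustrated with a minimal sketch. The axis-aligned box model, the coordinates, and all names below are assumptions for illustration only, not part of the patent:

```python
# Minimal sketch (assumed, not from the patent) of the nested spatial
# regions: the operating space is a sub-volume of the detection area,
# and a detected point can be tested for containment in each.
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned spatial region given by min/max corners (metres)."""
    x_min: float
    y_min: float
    z_min: float
    x_max: float
    y_max: float
    z_max: float

    def contains(self, x: float, y: float, z: float) -> bool:
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.z_min <= z <= self.z_max)

# Illustrative dimensions: the detection area spans the camera's field
# of view; the operating space is a smaller box defined inside it.
detection_area = Box(0.0, 0.0, 0.0, 1.2, 1.0, 0.8)
operating_space = Box(0.3, 0.2, 0.1, 0.9, 0.8, 0.5)

hand = (0.5, 0.5, 0.3)                                   # detected hand position
in_area = detection_area.contains(*hand)                 # visible to the camera
in_space = in_area and operating_space.contains(*hand)   # inside the operating space
```

A point outside the smaller box but inside the larger one would correspond to the case discussed later, where the body part is within the detection range but outside the operating space.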
- the operating system is also distinguished by the fact that the control device is designed to control the signaling device in such a way that feedback is output by means of the signaling device outside the operating space when it is detected by the detection device that the body part is located within the operating space.
- the control device can control the signaling device accordingly, which can then output a feedback.
- the data transmission can be wireless or wired.
- the signaling device may comprise a display device and / or an acoustic output device.
- the display device may have, for example, one or more lights and / or a display, so that optical signals can be displayed by means of the display device.
- the acoustic output device may, for example, have one or more loudspeakers, so that acoustic signals can be output by means of the acoustic output device.
- a feedback can be, for example, the illumination of the luminaire in the form of an optical signal and / or an acoustic signal from a loudspeaker.
- the object is also achieved by a method for operating an operating system for a motor vehicle.
- in the method for operating an operating system for a motor vehicle, first of all a body part of a user is detected by means of a detection device when it is arranged in a detection area of the detection device. Furthermore, it is checked whether the body part detected in the detection area is located in an operating space, which forms a partial area of the detection area. Subsequently, a signaling device of the operating system is controlled by means of a control device such that feedback is output by means of the signaling device outside the operating space if it is detected by the detection device that the body part is located within the operating space.
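The three method steps (detecting a body part, checking the operating-space containment, controlling the signaling device) could be sketched as follows. All classes are illustrative stubs, and a plain list stands in for the signaling device; none of the names come from the patent:

```python
# Assumed sketch of the three claimed method steps: detect a body part,
# check whether it lies in the operating space, and trigger the feedback.
class DetectionDevice:
    def __init__(self, operating_space):
        # operating_space: ((x_lo, y_lo, z_lo), (x_hi, y_hi, z_hi)),
        # a sub-volume of the detection area
        self.operating_space = operating_space

    def detect_body_part(self, frame):
        """Step 1: return a detected 3D hand position, or None."""
        return frame.get("hand")

    def in_operating_space(self, position):
        """Step 2: check containment in the operating-space sub-volume."""
        lo, hi = self.operating_space
        return all(a <= p <= b for p, a, b in zip(position, lo, hi))

class ControlDevice:
    def __init__(self, signaling_device):
        self.signaling_device = signaling_device

    def process(self, detection_device, frame):
        """Step 3: output feedback when the body part is in the operating space."""
        hand = detection_device.detect_body_part(frame)
        if hand is not None and detection_device.in_operating_space(hand):
            self.signaling_device.append("feedback")

signals = []                                    # stand-in signaling device
device = DetectionDevice(operating_space=((0, 0, 0), (1, 1, 1)))
ControlDevice(signals).process(device, {"hand": (0.5, 0.5, 0.5)})
```

Run once per camera frame, this loop emits feedback exactly while the hand is inside the operating space, mirroring the claimed behavior.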
- the detection device is further adapted to detect a gesture in the operating space and to check whether the detected gesture matches at least one predetermined gesture. Thus, not only the presence of a body part in the operating space is detected, but also the gesture itself. The gesture can, for example, be evaluated on the basis of video data in the case of a camera. This results in the advantage that the detection device not only outputs direct feedback when detecting a body part, but is at the same time designed to evaluate gestures.
- the control device is designed to control the signaling device such that a confirmation signal is output by means of the signaling device if the detected gesture coincides with the at least one predetermined gesture.
- the confirmation signal can be output as an optical and/or acoustic signal by means of the signaling device. If, for example, as already mentioned, the hand of the user is detected in the operating space, the light turns yellow. If it is subsequently detected that the user is making a predetermined gesture with his hand, the color of the luminaire can change from yellow to green as a confirmation signal. The user thus knows that his currently executed gesture has been recognized as an operator gesture.
- by a predetermined gesture is meant, for example, a gesture stored in a memory device of the detection device. That is, the predetermined gestures can be stored in a memory device of the detection device, so that the video data captured by the camera can be compared with the stored video data. If the captured video data match the stored video data, a gesture is recognized as a predetermined gesture. This results in the advantage that the user of the operating system receives direct feedback in a particularly simple manner when operating the operating system.
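The described comparison of captured data against stored predetermined gestures might be sketched as follows. The 2D trajectory representation, tolerance, and gesture names are invented for illustration; a real system would apply a more robust similarity measure to video data:

```python
# Assumed sketch of the described matching: captured gesture data is
# compared against stored reference data. A simple point-by-point
# tolerance comparison on 2D trajectories stands in for the video-data
# comparison an actual system would perform.
STORED_GESTURES = {
    "swipe_right": [(0.1, 0.5), (0.4, 0.5), (0.8, 0.5)],
    "swipe_left":  [(0.8, 0.5), (0.4, 0.5), (0.1, 0.5)],
}

def match_gesture(captured, stored=STORED_GESTURES, tol=0.1):
    """Return the name of the first stored gesture whose sample points all
    lie within `tol` of the captured trajectory, else None (misoperation)."""
    for name, template in stored.items():
        if len(template) == len(captured) and all(
            abs(cx - tx) <= tol and abs(cy - ty) <= tol
            for (cx, cy), (tx, ty) in zip(captured, template)
        ):
            return name
    return None
```

A match corresponds to the confirmation signal described above; a `None` result corresponds to the warning case treated next.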
- the control device is adapted to control the signaling device such that a warning message is issued by means of the signaling device if the detected gesture deviates from the at least one predetermined gesture.
- the warning signal can be output as an optical and/or acoustic signal by means of the signaling device. If, for example, a gesture of the user, such as gesturing with his hand, is detected but no predetermined gesture is recognized, the light of the signaling device can change from yellow to red in the case of a faulty operation, i.e. a non-predetermined gesture, instead of from yellow to green as for a predetermined gesture.
- the video data captured by the camera can be compared with stored video data. If the captured video data do not match the stored video data, a gesture is detected as a misoperation. This has the advantage that a user can react to it directly if he has operated the operating system incorrectly.
- the signaling device may also comprise two display devices in the form of two lights, which may be arranged side by side.
- the first light can, for example, display optical feedback when the body part is detected in the operating space, that is, the light would shine yellow.
- the second light may output, for example, an optical confirmation signal and/or an optical warning signal. If the user reaches, for example, into the operating space, this is indicated by the first light, which lights up yellow at this moment and as long as the user is in the operating space with his hand. For example, if the user subsequently makes a gesture, and the gesture is captured as a predetermined gesture, the second light turns green.
- the signaling device may include, for example, a lamp and a speaker, so that the optical signal of the lamp is amplified by an acoustic signal.
- the signaling device may preferably have a display with an illuminated frame.
- a translucent frame with a light source in the form of a light guide can be arranged around an outer contour of the display. This allows the user to track his actual operations on the display.
- by means of the illuminable frame, the user can, as already described in the example of the lamp, follow his detected operating action, i.e. whether he is, for example, in the operating space and whether his gesture was executed correctly or incorrectly.
- the control device is configured to control the signaling device in such a way that a detection signal is output by means of the signaling device if it is detected by the detection device that the body part is within the detection range but outside the operating space.
- the detection signal can be output as an optical and / or acoustic signal by means of the signaling device.
- the user is shown by means of the signaling device that, while he is in the detection area of the detection device, he is not yet in the operating space if he wishes to operate the operating system. This results in the advantage that a user can easily check the position of his hand when operating the operating system.
- the stability and reliability of the gesture recognition can be increased thereby, since the detection device can already detect and track a body part of the user, for example an arm or a hand or a finger, outside the operating space and can thereby detect the entry of the body part into the operating space more reliably.
- the control device is designed to change the extent and/or arrangement of the operating space depending on at least one predetermined criterion.
- a predetermined criterion can be, for example, a stored user profile, e.g. on the vehicle key.
- the stored user profile may include, for example, the height of the driver and/or a preferred seating position.
- the extent of the operating space can be adapted to the seat position of the user in the motor vehicle.
- the operating space is adapted to the arm length of the user so that he can reach into the operating space when extending his arm. If the user changes and the seat position is shifted from the first seat position away from the operating space into a second seat position, the user might no longer be able to reach the operating space in the second seat position.
- the extent of the operating space can be adapted, in this case increased, as a function of the seating position. That is, the extent of the operating space may increase the farther the seat is, for example, from the steering wheel. This results in the advantage that the operating space can be adapted individually for each user of the operating system. Together with the simultaneous support of the driver during operation, this results in a comfortable operating system for every user.
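The described seat-position-dependent adaptation could be sketched as a simple function. The linear scaling, the gain, and the clamping value are invented values; the clamp stands in for the system limits of the detection area mentioned elsewhere in the text:

```python
# Assumed illustration of the seat-position-dependent adaptation: the
# operating-space depth grows linearly with the seat's rearward offset
# and is clamped to the system limit of the detection area.
def adapt_operating_space_depth(base_depth_m: float,
                                seat_offset_m: float,
                                gain: float = 0.5,
                                max_depth_m: float = 1.0) -> float:
    """Return the adapted operating-space depth for a given seat offset
    (metres rearward of a reference position)."""
    return min(base_depth_m + gain * seat_offset_m, max_depth_m)
```

For a seat pushed 0.2 m back from its reference position, this sketch would enlarge a 0.4 m deep operating space to 0.5 m, but never beyond the assumed 1.0 m detection limit.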
- Figure 1 is a schematic representation of the operating system in an interior of a motor vehicle.
- Fig. 2 is a schematic representation of a detection device and the detection area captured by it, with an operating space arranged therein.
- an operating system 14 is shown schematically in an interior 12 of a motor vehicle 10.
- the motor vehicle 10 may in particular be a passenger car.
- the mode of operation of the operating system 14 will now be explained below in conjunction with FIG. 2, which schematically represents a detection device 16 of the operating system from FIG. 1.
- the operating system 14 comprises the detection device 16, which is designed to detect a body part 18 of a user and a gesture of the body part 18, indicated in FIG. 1 by the double arrow.
- the detection device 16 may comprise, for example, a camera.
- the camera can be arranged, for example, on a ceiling of the motor vehicle 10 directly above a center console in the interior 12 of the motor vehicle 10.
- the camera spans a detection area 24, in which an object and / or a body part 18 can be detected.
- by detection area 24 is meant a spatial area in the vicinity of the detection device 16, i.e. in the interior 12 of the motor vehicle 10, which can be captured by the detection device 16.
- an operating space 26 is provided within the detection area 24.
- the operating space 26 is thus a subspace of the detection area 24.
- the detection device 16 is located in the interior 12 of the motor vehicle 10 and captures therein a predetermined spatial area, the detection area 24, in which in turn an operating space 26 defined by the detection device 16 is arranged as a subspace of the detection area 24.
- the operating space 26 can, as shown in FIG.
- the operating system 14 further comprises a control device 20 and a signaling device 22. Depending on where and what the detection device 16 detects within the detection region 24, the control device 20 controls the signaling device 22 differently.
- the detection device 16, the control device 20 and the signaling device 22 are coupled or connected wirelessly or wired accordingly.
- the detection device 16, for example the camera, is designed in a manner known per se to detect an object and/or a body part 18 of a user, for example a hand.
- gestures, for example the gesturing of a hand, can also be detected by the detection device 16.
- the detection device 16 can detect, for example, by image data and / or video data, ie a sequence of image data, gestures and / or a body part 18.
- the control device 20 can control the signaling device 22 differently, depending on where and what is detected within the detection region 24 by the detection device 16.
- the signaling device 22 may comprise, for example, a display device and / or an acoustic output device.
- the display device may have, for example, one or more lights and/or a display, which is arranged for example in an instrument cluster of the motor vehicle 10, so that optical signals can be displayed by means of the display device.
- the acoustic output device can, for example, have one or more loudspeakers, so that acoustic signals can be output by means of the acoustic output device.
- if the signaling device 22 comprises, for example, a display device, then the display device is preferably arranged in the field of view of the user.
- the control device 20 distinguishes between four different display modes. Depending on where and what the detection device 16 detects within the detection range 24, the signaling device 22 is controlled differently by means of the control device 20. If the detection device 16 detects a body part 18 of the user in the detection area 24 but outside the operating space 26, the signaling device 22 is operated in a first display mode. The signaling device 22 is operated in a second display mode if the detection device 16 detects the body part 18 of the user in the operating space 26. If the detection device 16 detects a predetermined gesture of the user in the operating space 26, the signaling device 22 is operated in a third display mode. For this purpose, the detection device 16 compares, for example, stored video data with recorded video data.
- if the captured video data match the stored video data, the gesture made by the user is a predetermined gesture. If the detection device 16 detects in the operating space 26 a gesture deviating from a predetermined operating gesture, i.e. a misoperation by the user, the signaling device 22 is operated in a fourth display mode. A deviating gesture or incorrect operation is detected if the captured video data deviate from the stored video data.
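The four display modes can be summarized as a simple selection function. The mode numbering and the colors in the comments follow the text; the function itself is an assumed illustration, not the patent's implementation:

```python
# Assumed sketch of the four display modes: mode selection depends on
# where the body part is and whether a gesture matched.
def display_mode(in_detection_area: bool,
                 in_operating_space: bool,
                 gesture_matched: "bool | None") -> "int | None":
    """gesture_matched is True for a predetermined gesture, False for a
    deviating gesture (misoperation), None if no gesture was made yet."""
    if not in_detection_area:
        return None                        # nothing detected: no signal
    if not in_operating_space:
        return 1                           # detection signal (e.g. yellow)
    if gesture_matched is None:
        return 2                           # feedback (e.g. orange)
    return 3 if gesture_matched else 4     # confirmation (green) / warning (red)
```

Each mode maps onto one of the signals discussed above: detection signal, feedback, confirmation signal, and warning message.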
- the control device 20 controls the signal device 22 in such a way that a detection signal is output by means of the signal device 22.
- the user is shown that he is indeed with his hand in the detection area 24 of the detection device 16, but not yet in the operating space 26, if he wants to operate the operating system 14.
- if the signaling device 22 is, for example, a luminaire, then the luminaire can light up yellow as a detection signal.
- the control device 20 activates the signaling device 22 in such a way that a feedback message is output by means of the signaling device 22 when the hand is in the operating space 26.
- the light in the form of an optical signal can light up orange.
- the control device 20 activates the signal device 22 such that an acknowledgment signal is output by means of the signal device 22.
- this is the case when a gesture is detected in the operating space 26 and it coincides with the at least one predetermined gesture.
- the control device 20 controls the signaling device 22 such that a warning message is output by means of the signaling device 22.
- the light of the signaling device 22 can change from yellow to red in the event of a faulty operation, i.e. a non-predetermined gesture, instead of from yellow to green as for a predetermined gesture.
- the signaling device 22 may also include three lights, which may be arranged side by side in the field of view of the user.
- the first lamp may output the detection signal in the first display mode.
- the second luminaire may, for example, output the visual feedback in the second display mode.
- the third light may, for example, output a confirmation signal or a warning signal. If the user reaches, for example, into the detection area 24 with his hand, this is indicated by the first light (first display mode), which lights up yellow at this moment and as long as the user is in the detection area 24 with his hand.
- if the user reaches into the operating space 26 with his hand, this is indicated by the second light (second display mode), which lights up orange at this moment and as long as the user is in the operating space 26 with his hand.
- if the user executes a predetermined gesture, this is indicated by the third light (third display mode), which lights up green, for example. If the user executes a faulty operation instead of the predetermined gesture, this is also signaled to him by the third light, in that the color of the light changes from green to red (fourth display mode). If the user leaves the operating space 26 and the detection area 24 with his hand, all three lights go out one after the other.
- the signal device 22 may comprise, for example, a loudspeaker.
- the feedback and / or the confirmation signal and / or the warning message and / or the detection signal can be output as an acoustic signal.
- the adaptation of the operating space 26 is indicated schematically in Fig. 2 by the arrows around the operating space 26.
- the operating space 26 can only be "extended" to the extent that the system limits of the detection area 24 (indicated by the dashed lines) permit. The extent of the operating space 26 cannot be increased beyond the limits of the detection area 24.
- a gesture-based control of the operating system 14 is considerably simplified.
- a user is given a visual and / or audible feedback as to whether, for example, he has positioned his hand correctly in order to be able to carry out a gesture operation at all.
- the user is also given a visual and / or audible feedback as to whether he has just made a gesture with his hand that has been recognized by the operating system 14 or not.
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102015006613.7A DE102015006613A1 (de) | 2015-05-21 | 2015-05-21 | Bediensystem und Verfahren zum Betreiben eines Bediensystems für ein Kraftfahrzeug |
PCT/EP2016/000617 WO2016184539A1 (de) | 2015-05-21 | 2016-04-15 | Bediensystem und verfahren zum betreiben eines bediensystems für ein kraftfahrzeug |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3298475A1 true EP3298475A1 (de) | 2018-03-28 |
Family
ID=55967203
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16722047.4A Withdrawn EP3298475A1 (de) | 2015-05-21 | 2016-04-15 | Bediensystem und verfahren zum betreiben eines bediensystems für ein kraftfahrzeug |
Country Status (5)
Country | Link |
---|---|
US (1) | US10599226B2 (de) |
EP (1) | EP3298475A1 (de) |
CN (1) | CN107454948A (de) |
DE (1) | DE102015006613A1 (de) |
WO (1) | WO2016184539A1 (de) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102017201236B4 (de) * | 2017-01-26 | 2023-09-07 | Volkswagen Aktiengesellschaft | Verfahren zum Betreiben eines Bediensystems, Bediensystem und Fahrzeug mit einem Bediensystem |
KR102272309B1 (ko) | 2019-03-15 | 2021-07-05 | 엘지전자 주식회사 | 차량 제어 장치 |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6266061B1 (en) * | 1997-01-22 | 2001-07-24 | Kabushiki Kaisha Toshiba | User interface apparatus and operation range presenting method |
DE10039432C1 (de) * | 2000-08-11 | 2001-12-06 | Siemens Ag | Bedieneinrichtung |
US8745541B2 (en) * | 2003-03-25 | 2014-06-03 | Microsoft Corporation | Architecture for controlling a computer using hand gestures |
US9311528B2 (en) * | 2007-01-03 | 2016-04-12 | Apple Inc. | Gesture learning |
US8726194B2 (en) * | 2007-07-27 | 2014-05-13 | Qualcomm Incorporated | Item selection using enhanced control |
JP4318056B1 (ja) * | 2008-06-03 | 2009-08-19 | 島根県 | 画像認識装置および操作判定方法 |
US8547327B2 (en) * | 2009-10-07 | 2013-10-01 | Qualcomm Incorporated | Proximity object tracker |
JP5617581B2 (ja) * | 2010-12-08 | 2014-11-05 | オムロン株式会社 | ジェスチャ認識装置、ジェスチャ認識方法、制御プログラム、および、記録媒体 |
DE102012000263A1 (de) | 2012-01-10 | 2013-07-11 | Daimler Ag | Verfahren und Vorrichtung zum Bedienen von Funktionen in einem Fahrzeug unter Verwendung von im dreidimensionalen Raum ausgeführten Gesten sowie betreffendes Computerprogrammprodukt |
DE102012216193B4 (de) | 2012-09-12 | 2020-07-30 | Continental Automotive Gmbh | Verfahren und Vorrichtung zur Bedienung einer Kraftfahrzeugkomponente mittels Gesten |
DE102012021220A1 (de) | 2012-10-27 | 2014-04-30 | Volkswagen Aktiengesellschaft | Bedienanordnung für ein Kraftfahrzeug |
DE102013000081B4 (de) | 2013-01-08 | 2018-11-15 | Audi Ag | Bedienschnittstelle zum berührungslosen Auswählen einer Gerätefunktion |
US20140267004A1 (en) * | 2013-03-13 | 2014-09-18 | Lsi Corporation | User Adjustable Gesture Space |
DE102013009567B4 (de) | 2013-06-07 | 2015-06-18 | Audi Ag | Verfahren zum Betreiben einer Gestenerkennungseinrichtung sowie Kraftfahrzeug mit räumlich beschränkter Gestenerkennung |
CN103303224B (zh) | 2013-06-18 | 2015-04-15 | 桂林电子科技大学 | 车载设备手势控制系统及其使用方法 |
DE102013012466B4 (de) | 2013-07-26 | 2019-11-07 | Audi Ag | Bediensystem und Verfahren zum Bedienen einer fahrzeugseitigen Vorrichtung |
US9785243B2 (en) * | 2014-01-30 | 2017-10-10 | Honeywell International Inc. | System and method for providing an ergonomic three-dimensional, gesture based, multimodal interface for use in flight deck applications |
WO2015133057A1 (ja) * | 2014-03-05 | 2015-09-11 | 株式会社デンソー | 検知装置及びジェスチャ入力装置 |
JP6426025B2 (ja) * | 2015-02-20 | 2018-11-21 | クラリオン株式会社 | 情報処理装置 |
-
2015
- 2015-05-21 DE DE102015006613.7A patent/DE102015006613A1/de active Pending
-
2016
- 2016-04-15 US US15/574,781 patent/US10599226B2/en active Active
- 2016-04-15 CN CN201680017384.5A patent/CN107454948A/zh active Pending
- 2016-04-15 EP EP16722047.4A patent/EP3298475A1/de not_active Withdrawn
- 2016-04-15 WO PCT/EP2016/000617 patent/WO2016184539A1/de active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN107454948A (zh) | 2017-12-08 |
US10599226B2 (en) | 2020-03-24 |
US20180150141A1 (en) | 2018-05-31 |
DE102015006613A1 (de) | 2016-11-24 |
WO2016184539A1 (de) | 2016-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2673156B1 (de) | Verfahren, vorrichtung und computerprogrammprodukt zum ansteuern einer funktionseinheit eines fahrzeugs | |
DE102017122866A1 (de) | Parkunterstützungsvorrichtung | |
DE102012021249B4 (de) | Verfahren zum Absichern eines Ausstiegsvorgangs einer Person aus einem Fahrzeug | |
WO2014206543A1 (de) | Verfahren und vorrichtung zur fernsteuerung einer funktion eines fahrzeugs | |
DE102017216837A1 (de) | Gestik- und Mimiksteuerung für ein Fahrzeug | |
DE102011054848B4 (de) | Steuer- und Überwachungseinrichtung für Fahrzeuge | |
EP1830244A2 (de) | Verfahren und Vorrichtung zum Betreiben von zumindest zwei Funktionskomponenten eines Systems, insbesondere eines Fahrzeugs | |
WO2014040930A1 (de) | Verfahren und vorrichtung zur bedienung einer kraftfahrzeugkomponente mittels gesten | |
DE102005023697A1 (de) | Einrichtung zur Steuerung der Innenbeleuchtung eines Kraftfahrzeugs | |
WO2019201382A1 (de) | Verfahren, vorrichtung und fortbewegungsmittel für ein automatisiertes anfahren eines fortbewegungsmittels an einer lichtsignalanlage | |
DE102012022321A1 (de) | Verfahren zum Betreiben eines Fahrzeugs und Fahrzeug | |
EP3230828A1 (de) | Erfassungsvorrichtung zum erkennen einer geste und/oder einer blickrichtung eines insassen eines kraftfahrzeugs durch synchrone ansteuerung von leuchteinheiten | |
DE102017103391A1 (de) | Verfahren zur Verbesserung der Benutzerfreundlichkeit eines Fahrzeugs | |
EP3298475A1 (de) | Bediensystem und verfahren zum betreiben eines bediensystems für ein kraftfahrzeug | |
DE102019206696A1 (de) | Verfahren zur geführten Fahrzeugübergabe bei automatisiertem Valet Parken | |
DE102017204916A1 (de) | Verfahren zum Durchführen eines automatischen Fahrvorgangs eines Kraftfahrzeugs unter Verwendung einer Fernbedienung | |
DE10307477A1 (de) | Außenspiegelsteuerung | |
WO2021001258A1 (de) | Vorrichtung und verfahren zur ermittlung von bilddaten der augen, augenpositionen oder einer blickrichtung eines fahrzeugnutzers | |
DE102016011016A1 (de) | Verfahren zum Betrieb eines Assistenzsystems | |
DE102018208402A1 (de) | Funktionsüberprüfungssystem | |
DE102016011141A1 (de) | Verfahren zur Überwachung eines Zustandes zumindest eines Insassen eines Fahrzeuges | |
DE102011080198A1 (de) | Rückblickspiegel | |
WO2020224986A1 (de) | Verfahren zum durchführen eines rangiervorgangs eines kraftfahrzeugs mit kamerabasierter bestimmung eines abstands zum kraftfahrzeug, sowie rangierassistenzsystem | |
DE10126238A1 (de) | Vorrichtung und Verfahren zur Einstellung mindestens eines Rückspiegels | |
WO2014187905A2 (de) | Verfahren zur bedienung eines schienenfahrzeugs und führerstand eines schienenfahrzeugs |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20171221 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: DI FRANCO, ONOFRIO Inventor name: SPRICKMANN KERKERINCK, PAUL |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
17Q | First examination report despatched |
Effective date: 20190308 |
|
18W | Application withdrawn |
Effective date: 20190326 |