EP3230828A1 - Detection device for recognizing a gesture and/or a viewing direction of a vehicle occupant by synchronous control of lighting units - Google Patents
- Publication number
- EP3230828A1 (application EP15805454.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- occupant
- sensor device
- light
- gesture
- motor vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K2360/146—Instrument input by gesture
- B60K2360/1464—3D-gesture
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of camera system used, using multiple cameras
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of image processing
- G02B27/0093—Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
- G06F3/013—Eye tracking input arrangements
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/038—Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
- G06V40/107—Static hand or arm
- G06V40/19—Sensors for eye characteristics, e.g. of the iris
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Definitions
- The present invention relates to a detection device for detecting a gesture and/or a viewing direction of an occupant of a motor vehicle, comprising a first sensor device and at least one second sensor device, each of the sensor devices having a lighting unit for emitting light, a receiving unit for receiving the light reflected by the occupant, and a computing unit for driving the lighting unit and for detecting the gesture and/or the viewing direction on the basis of the reflected light.
- The invention also relates to an operating arrangement and a motor vehicle.
- Finally, the present invention relates to a method for detecting a gesture and/or a viewing direction of an occupant of a motor vehicle.
- In present-day motor vehicles, a plurality of detection devices is installed, with which, for example, an operating action of a vehicle occupant can be detected.
- Such a detection device comprises, for example, a sensor device in the form of a camera, by means of which the operating gesture of the occupant is detected; an operating signal can then be generated with which a functional device of the motor vehicle can be controlled.
- Detection devices are also known from the prior art with which a viewing direction of the driver can be detected. Thus, it can be recognized, for example, whether the driver is directing his view to the road. In addition, it can be recognized on the basis of the driver's line of sight, for example, which functional device of the motor vehicle he wishes to operate.
- Such detection devices for detecting a gesture and/or the viewing direction of a vehicle occupant usually have an optical sensor with an active lighting unit.
- The detection devices are designed, for example, as cameras, wherein the lighting units emit light in the visible wavelength range or in the infrared wavelength range.
- With a camera, for example, an image of a portion of the occupant, such as the hand, can be recorded.
- Furthermore, 3D cameras are known, with which a pictorial representation of distances can be provided.
- These include what are known as TOF cameras (TOF - Time of Flight), which emit light with the lighting unit and detect the light reflected by the occupant with a receiving unit. Based on the time of flight of the light, the distance between the detection device and at least a part of the occupant can then be deduced.
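- The time-of-flight principle described above can be illustrated with the following minimal sketch: the measured quantity is the round-trip time of the light, so the one-way distance is half of the travelled path. The function name and the example delay value are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of the time-of-flight (TOF) distance calculation:
# the light travels from the lighting unit to the occupant and back,
# so the one-way distance is half of the round-trip path.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # speed of light in vacuum

def tof_distance(round_trip_time_s: float) -> float:
    """Return the distance in metres for a measured round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

if __name__ == "__main__":
    # Example: a reflection arriving 4 ns after emission corresponds to
    # roughly 0.6 m between the sensor device and the occupant's hand.
    print(f"{tof_distance(4e-9):.3f} m")
```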
- DE 10 2008 048 325 A1 describes an operation input unit which comprises an image recording device.
- The operation input unit includes a hand-area detecting device which detects the area of a human hand from a moving image taken with the image recording device.
- Furthermore, a manual operation determining device is provided, which determines a manual operation from the shape and movement of the detected hand area.
- In addition, a menu selection display device is provided. The operation input unit may also have a plurality of image recording devices.
- The prior art further describes a method for inputting a control command for a component of a motor vehicle.
- In this method, an image sequence of an input object guided by a user is generated in a predetermined detection area by means of an imaging device.
- A posture change of the input object is recognized on the basis of the image sequence, and a control command is issued to a component of the motor vehicle based on the detected posture change.
- The imaging device may have at least one infrared-sensitive camera.
- Furthermore, a device for controlling vehicle functions is known which has at least one interface with which a user interacts through static and/or dynamic user gestures.
- A detection unit is provided which detects the static and dynamic user gestures.
- A user gesture is detected with a control unit and a corresponding vehicle function is activated.
- The detection unit may comprise one or more cameras, which are designed in particular as TOF cameras. With these, preferably, the movement of a part of the head, in particular the eyes or the nose, can be detected.
- In addition, the detection unit may be designed to recognize parallel gestures of a user and/or the gestures of several people simultaneously and/or sequentially.
- It is the object of the present invention to show how a detection device of the type mentioned, for detecting a gesture and/or a viewing direction of an occupant of a motor vehicle, can be operated more reliably.
- A detection device according to the invention is used to detect a gesture and/or a viewing direction of an occupant of a motor vehicle.
- The detection device comprises a first sensor device and at least one second sensor device.
- Each of the sensor devices has a lighting unit for emitting light.
- Moreover, each of the sensor devices comprises a receiving unit for receiving the light reflected by the occupant.
- Furthermore, each of the sensor devices comprises a computing unit for driving the lighting unit and for detecting the gesture and/or the viewing direction on the basis of the reflected light.
- The computing units of the sensor devices are designed to actuate the lighting units synchronously in dependence on a synchronization signal.
- The detection device can be used in a motor vehicle. With the detection device, a gesture of an occupant of the motor vehicle and/or a viewing direction of the occupant can be detected.
- For this purpose, the detection device comprises at least two sensor devices, which can, for example, be distributed in the interior of the motor vehicle.
- The sensor devices may comprise an optical sensor or a camera.
- Each of the sensor devices has a lighting unit.
- This lighting unit can be controlled by means of the computing unit.
- In particular, an illumination time, i.e. the time duration during which light is emitted with the respective lighting unit, can be controlled.
- The sensor devices thus have active illumination.
- The light emitted by the respective lighting unit strikes a part, for example a body part, of the occupant, is reflected by the occupant, and in turn reaches the receiving unit of the respective sensor device.
- Depending on the reflected light, the computing unit of the respective sensor device can then recognize the gesture and/or the viewing direction of the occupant.
- The present invention is based on the finding that, when at least two sensor devices with active illumination are used, the sensor devices can influence one another.
- In particular, interference may occur due to the light emitted by the lighting units.
- The lighting units of the at least two sensor devices can thus be controlled simultaneously or in a specific sequence.
- Owing to the synchronous operation of the respective lighting units of the sensor devices, mutual interference of the sensor devices via their lighting units can be avoided. Furthermore, optimal illumination of the respective detection regions of the sensor devices can be made possible.
- In one embodiment, the computing unit of the first sensor device is adapted to provide the synchronization signal and to transmit it to the computing unit of the at least one second sensor device.
- In other words, the computing unit of the first sensor device provides a corresponding synchronization signal.
- In dependence on this synchronization signal, the computing unit of the first sensor device can control the lighting unit of the first sensor device.
- The synchronization signal may also include information as to when the lighting unit of the second sensor device is to be controlled.
- This synchronization signal is then transmitted by the computing unit of the first sensor device to the computing unit of the second sensor device.
- In the synchronization signal, a time delay in the transmission from the computing unit of the first sensor device to the computing unit of the second sensor device can also be taken into account.
- Thus, the synchronization signal can be provided by a computing unit of one of the sensor devices itself.
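- A minimal sketch of this master-side scheduling, including a correction for the transmission delay just mentioned, is given below. All function names, the offset, and the delay value are assumptions made for illustration; the patent does not specify concrete figures or a message layout.

```python
import time

# Illustrative, assumed timing values (not specified in the patent):
TRANSMISSION_DELAY_S = 200e-6  # estimated transfer time of the sync signal
SECOND_UNIT_OFFSET_S = 1e-3    # intended offset of the second lighting unit

def provide_sync_signal(now_s: float) -> dict:
    """The computing unit of the first sensor device schedules its own
    lighting unit and tells the second sensor device how long to wait after
    receiving the signal before activating its lighting unit."""
    first_unit_fire_s = now_s + 2e-3  # activation time of the first lighting unit
    # Waiting time for the second computing unit: the intended offset minus
    # the assumed transmission delay, so the overall timing is preserved.
    second_unit_wait_s = (first_unit_fire_s + SECOND_UNIT_OFFSET_S) - (now_s + TRANSMISSION_DELAY_S)
    return {"first_unit_fire_s": first_unit_fire_s,
            "second_unit_wait_s": second_unit_wait_s}

if __name__ == "__main__":
    print(provide_sync_signal(now_s=time.monotonic()))
```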
- In a further embodiment, the computing units of the first and the at least one second sensor device are connected via a data line for transmitting the synchronization signal.
- With the data line, for example, a direct data connection between the computing units of the two sensor devices can be provided.
- In this way, a reliable transmission of the synchronization signal between the sensor devices can be guaranteed.
- Moreover, a time delay in transmitting the synchronization signal can be prevented or reduced by the direct connection.
- In a further embodiment, the data line is a data bus of the motor vehicle, in particular a CAN bus.
- In this case, the transmission of the synchronization signal can take place via the existing data bus of the motor vehicle.
- Since each of the sensor devices receives the synchronization signal, in response to which the respective lighting units are controlled, mutual disturbance of the sensor devices can be reliably avoided.
- As a rule, the time duration for processing a data frame which is transmitted via the data bus is significantly longer than the illumination duration during which a lighting unit is activated.
- Thus, reliable operation of the detection device can be enabled.
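- The timing argument above can be pictured with rough numbers, as in the sketch below. The bit rate, frame length, and illumination duration are assumptions chosen only to illustrate the relation the patent relies on; none of them are taken from the document.

```python
# Rough, assumed figures to illustrate the timing-margin argument; none of
# these numbers are taken from the patent.
CAN_BITRATE_BPS = 500_000         # typical high-speed CAN bit rate
CAN_FRAME_BITS = 130              # classical CAN frame with 8 data bytes,
                                  # roughly including worst-case stuff bits
ILLUMINATION_DURATION_S = 100e-6  # assumed active time of one lighting unit

frame_time_s = CAN_FRAME_BITS / CAN_BITRATE_BPS  # ~260 us on the bus

print(f"data frame transmission time: {frame_time_s * 1e6:.0f} us")
print(f"illumination duration:        {ILLUMINATION_DURATION_S * 1e6:.0f} us")

# The argument: because handling one data frame takes longer than one
# illumination window, a lighting unit triggered by the current frame cannot
# overlap with an illumination triggered by the previous frame.
assert ILLUMINATION_DURATION_S < frame_time_s
```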
- Preferably, the first and/or the at least one second sensor device is designed to determine a distance to at least a part of the occupant based on a transit time of the emitted light and the reflected light.
- In other words, the first and/or the second sensor device is designed as a so-called 3D camera or TOF camera.
- In this way, at least a part of the occupant, for example a hand of the occupant, can be spatially detected.
- Thus, corresponding gestures of the vehicle occupant can be detected and evaluated by means of the computing unit.
- For this purpose, a comparison with predetermined gestures, which are stored, for example, in the computing unit, can take place.
- Depending on the detected gesture, a corresponding operating signal can then be transmitted to a functional device of the motor vehicle.
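- One way to picture the comparison with stored, predetermined gestures is a nearest-template match over feature vectors, as in the sketch below. The feature representation, gesture names, and threshold are illustrative assumptions; the patent does not prescribe a particular matching algorithm.

```python
import math

# Hypothetical stored gesture templates (feature vectors), e.g. normalized
# hand-trajectory descriptors produced by the computing unit.
STORED_GESTURES = {
    "swipe_left":  [-1.0, 0.0, 0.1],
    "swipe_right": [ 1.0, 0.0, 0.1],
    "push":        [ 0.0, 0.0, 1.0],
}

MATCH_THRESHOLD = 0.5  # assumed maximum distance for a valid match

def classify_gesture(features):
    """Return the best-matching stored gesture, or None if nothing is close."""
    best_name, best_dist = None, float("inf")
    for name, template in STORED_GESTURES.items():
        dist = math.dist(features, template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= MATCH_THRESHOLD else None

if __name__ == "__main__":
    detected = [0.9, 0.05, 0.12]          # e.g. derived from the TOF point data
    gesture = classify_gesture(detected)
    if gesture is not None:
        print(f"operating signal: {gesture}")  # would be sent to a functional device
```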
- In a further embodiment, the first and/or the at least one second sensor device is designed to capture an image of at least a part of the occupant on the basis of the reflected light.
- In other words, the first and/or the at least one second sensor device is designed as a camera with which an image of a body part of the occupant can be captured.
- Preferably, the first and/or the second sensor device is designed to detect a viewing direction of the occupant on the basis of the captured image.
- Thus, it can be detected, for example, whether the driver or the occupant is directing his gaze at the road.
- It can also be recognized, for example, whether the driver has his eyes open or closed.
- It can further be provided that it is recognized by means of the viewing direction which functional device of the motor vehicle the occupant wishes to operate. On the basis of the detected viewing direction, a corresponding operating signal can be output.
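- The selection of a functional device from the detected viewing direction can be pictured as a lookup of angular regions in the interior, as sketched below. The region boundaries and device names are assumptions for illustration only.

```python
# Hypothetical angular regions (horizontal gaze angle in degrees, measured
# from straight ahead) associated with functional devices in the interior.
GAZE_REGIONS = [
    ((-10.0, 10.0), "road ahead"),
    ((10.0, 35.0), "infotainment display"),
    ((35.0, 60.0), "air conditioning controls"),
    ((-60.0, -35.0), "exterior mirror, left"),
]

def device_for_gaze(angle_deg: float):
    """Return the functional device the occupant is looking at, if any."""
    for (lo, hi), device in GAZE_REGIONS:
        if lo <= angle_deg < hi:
            return device
    return None

if __name__ == "__main__":
    print(device_for_gaze(20.0))  # -> "infotainment display"
```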
- In a further embodiment, the computing unit of the first sensor device and the computing unit of the at least one second sensor device are designed to control the lighting units alternately.
- In this case, the two sensor devices for detecting the gesture are operated alternately. In this way, mutual influencing of the sensor devices when detecting the gesture can be prevented.
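- A minimal sketch of such alternating control: in each synchronization cycle only one of the two lighting units is active, so their measurement windows never coincide. The cycle count and period are assumed for illustration.

```python
def alternating_schedule(cycles: int, period_s: float = 0.010):
    """Yield (start_time_s, active_unit) pairs for two alternately driven
    lighting units; at any time at most one unit is active."""
    for i in range(cycles):
        yield i * period_s, "lighting unit 1" if i % 2 == 0 else "lighting unit 2"

if __name__ == "__main__":
    for start, unit in alternating_schedule(4):
        print(f"t = {start * 1e3:4.0f} ms : {unit} active")
```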
- An operating arrangement according to the invention for a motor vehicle comprises a detection device according to the invention.
- Moreover, the operating arrangement comprises a functional device which can be activated as a function of the gesture and/or viewing direction detected by the detection device.
- If the gesture and/or the viewing direction is detected with the detection device, a corresponding operating signal can be output and transmitted to the functional device.
- The functional device can then be controlled in dependence on the operating signal.
- The functional device of the motor vehicle can be designed, for example, as an infotainment system, a navigation system or the like. It can also be provided that the functional device is a window regulator, an adjusting device for adjusting the exterior mirrors or the like.
- The operating arrangement may also be part of a driver assistance system of the motor vehicle, which performs, for example, an intervention in the steering and/or the brake system. This is particularly advantageous if it is recognized by means of the detection device that the driver is turning his gaze away from the road and a dangerous situation is imminent.
- A motor vehicle according to the invention comprises an operating arrangement according to the invention.
- The motor vehicle is designed in particular as a passenger car.
- A method according to the invention is used to detect a gesture and/or a viewing direction of an occupant of a motor vehicle. For this purpose, a first sensor device and at least one second sensor device are provided.
- With each of the sensor devices, a lighting unit for emitting light is actuated by means of a computing unit, the light reflected by the occupant is received by means of a receiving unit, and the gesture and/or the viewing direction is recognized by means of the computing unit on the basis of the reflected light.
- In this method, the lighting units are controlled synchronously in response to a synchronization signal by means of the computing units of the sensor devices.
- FIG. 1 shows a motor vehicle according to an embodiment of the present invention in a schematic representation;
- FIG. 2 shows a detection device of the motor vehicle, by means of which a gesture and/or a viewing direction of an occupant is detected; and
- FIG. 3 shows the detection device of FIG. 2 in a further embodiment, in which each of the first and the second sensor device has its own sensor board and communication interface.
- FIG. 1 shows a motor vehicle 1 according to one embodiment of the present invention in a schematic representation.
- The motor vehicle 1 is presently designed as a passenger car.
- The motor vehicle 1 comprises an operating arrangement 2.
- The operating arrangement 2 in turn comprises a detection device 3. With the detection device 3 - as explained in more detail below - a gesture and/or a viewing direction of an occupant 13 of the motor vehicle 1 can be detected.
- Depending on the detected gesture and/or viewing direction, a corresponding operating signal can be transmitted to a functional device 4 of the motor vehicle 1 by the detection device 3.
- The functional device 4 of the motor vehicle 1 may, for example, be an infotainment system or a navigation system.
- The functional device 4 may also be a corresponding adjusting device for opening and/or closing the windows, an adjusting device for adjusting the exterior mirrors, an adjusting device for opening and/or closing a sunroof or a convertible top, an adjusting device for adjusting the seats, or the like.
- The functional device 4 may also be part of a driver assistance system of the motor vehicle 1.
- Fig. 2 shows an embodiment of the detection device 3 in a schematic representation.
- The detection device 3 comprises a first sensor device 5 and a second sensor device 6. It can also be provided that the detection device 3 comprises more than two sensor devices 5, 6.
- The sensor devices 5, 6 can be distributed in the interior of the motor vehicle 1.
- Each of the sensor devices 5, 6 comprises a lighting unit 7, with which light can be emitted.
- The lighting unit 7 can be designed, for example, to emit light in the visible wavelength range or light in the infrared wavelength range.
- The emitted light 11 is reflected by a part of the occupant 13.
- The reflected light 12 reaches a receiving unit 8 of the respective sensor device 5, 6.
- One or both of the sensor devices 5, 6 can be designed as cameras which, depending on the reflected light 12, can capture an image of at least part of the occupant 13.
- One or both of the sensor devices 5, 6 can be designed as so-called 3D cameras or TOF cameras. With these, the spatial position of a part of the occupant 13 can be detected on the basis of the reflected light 12. Thus, for example, a gesture executed with a hand 15 of the occupant 13 can be recognized. Furthermore, for example, a position, an inclination and/or a rotation of a head 14 of the occupant 13 can be detected.
- One or both of the sensor devices 5, 6 can also be designed to detect a viewing direction of the occupant 13, for example based on the position of the eyes of the occupant 13. This is shown here by way of example by the arrow 16.
- Each of the sensor devices 5, 6 comprises a computing unit 9.
- The respective computing units 9 serve to control the lighting units 7 of the sensor devices 5, 6.
- Furthermore, the computing units 9 are designed to detect the gesture or the viewing direction of the occupant 13 on the basis of the reflected light 12. For this purpose, for example, corresponding image processing can be performed, by means of which the gesture and/or viewing direction is recognized.
- Moreover, the sensor devices 5, 6 are connected via a data line 10 for data transmission.
- In particular, the computing units 9 of the respective sensor devices 5, 6 are connected to the data line 10.
- The data line 10 may be formed, for example, by a data bus of the motor vehicle 1, for example the CAN bus. Via the data bus, a corresponding synchronization signal can be transmitted, as a function of which the respective computing units 9 control the lighting units 7 synchronously.
- Fig. 3 shows the detection device 3 in a further embodiment.
- Here, each of the sensor devices 5, 6 has a corresponding sensor board 17, on which the respective computing unit 9 is arranged.
- Furthermore, each of the sensor devices 5, 6 has a communication interface 19 which is connected to the data line 10. In addition, a direct connection between the communication interface 19 and the computing unit 9 is provided via a data line 20.
- Within the detection device 3, corresponding data frames can be transmitted via the data line 10, in particular the CAN bus.
- The computing unit 9 of the first sensor device 5 can be defined as a master, and the computing unit 9 of the second sensor device 6 can be defined as a slave.
- On this basis, a corresponding synchronization signal can then be provided.
- This synchronization signal may be provided by any data frame transmitted over the data line 10.
- In dependence on the synchronization signal, the computing units 9 can then control the respective lighting units 7.
- The lighting units 7 can be controlled simultaneously or offset in time. Owing to the synchronous operation of the lighting units 7, mutual influencing of the sensor devices 5, 6 can in particular be avoided.
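- The master/slave arrangement of Fig. 3 can be pictured with the small simulation below: both computing units trigger their lighting units relative to the reception of the same data frame on the shared data line, either simultaneously (zero offset) or with a defined time offset. The frame period, the offset, and the class names are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class SensorUnit:
    name: str
    offset_s: float  # activation delay relative to the synchronization frame

    def on_sync_frame(self, frame_time_s: float) -> float:
        """Return the time at which this unit activates its lighting unit."""
        return frame_time_s + self.offset_s

if __name__ == "__main__":
    # Master and slave computing units; a zero offset means simultaneous
    # activation, a non-zero offset a time-shifted (alternating) activation.
    master = SensorUnit("sensor device 5 (master)", offset_s=0.0)
    slave = SensorUnit("sensor device 6 (slave)", offset_s=0.005)

    frame_period_s = 0.010  # assumed period of the synchronization data frames
    for k in range(3):
        frame_time = k * frame_period_s
        for unit in (master, slave):
            t = unit.on_sync_frame(frame_time)
            print(f"frame {k}: {unit.name} fires at t = {t * 1e3:5.1f} ms")
```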
Abstract
The invention relates to a detection device (3) for recognizing a gesture and/or a viewing direction of an occupant (13) of a motor vehicle (1), comprising a first sensor device (5) and at least one second sensor device (6). Each of the sensor devices (5, 6) has a lighting unit (7) for emitting light (11), a receiving unit (8) for receiving light (12) reflected by the occupant (13), and a computing unit (9) for controlling the lighting unit (7) and for recognizing the gesture and/or the viewing direction from the reflected light (12). The computing units (9) of the sensor devices (5, 6) are adapted to control the lighting units (7) synchronously in dependence on a synchronization signal.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102014118387.8A DE102014118387A1 (de) | 2014-12-12 | 2014-12-12 | Erfassungsvorrichtung zum Erkennen einer Geste und/oder einer Blickrichtung eines Insassen eines Kraftfahrzeugs durch synchrone Ansteuerung von Leuchteinheiten, Bedienanordnung, Kraftfahrzeug sowie Verfahren |
PCT/EP2015/078623 WO2016091736A1 (fr) | 2014-12-11 | 2015-12-04 | Dispositif de détection pour connaître un geste et/ou une direction du regard d'un occupant d'un véhicule par commande synchrone d'unités d'éclairage |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3230828A1 true EP3230828A1 (fr) | 2017-10-18 |
Family
ID=54834808
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15805454.4A Withdrawn EP3230828A1 (fr) | 2014-12-12 | 2015-12-04 | Dispositif de détection pour connaître un geste et/ou une direction du regard d'un occupant d'un véhicule par commande synchrone d'unités d'éclairage |
Country Status (6)
Country | Link |
---|---|
US (1) | US20170323165A1 (fr) |
EP (1) | EP3230828A1 (fr) |
JP (1) | JP2018503899A (fr) |
CN (1) | CN107209558A (fr) |
DE (1) | DE102014118387A1 (fr) |
WO (1) | WO2016091736A1 (fr) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6333921B2 (ja) * | 2016-11-09 | 2018-05-30 | ファナック株式会社 | 撮像装置及び撮像方法 |
JP7219041B2 (ja) * | 2018-10-05 | 2023-02-07 | 現代自動車株式会社 | 注視検出装置及びその輻輳制御方法 |
JP6894880B2 (ja) | 2018-11-07 | 2021-06-30 | 矢崎総業株式会社 | 監視システム |
DE102019204481A1 (de) * | 2019-03-29 | 2020-10-01 | Deere & Company | System zur Erkennung einer Bedienabsicht an einer von Hand betätigbaren Bedieneinheit |
DE102021113811A1 (de) | 2021-05-28 | 2022-12-01 | Bayerische Motoren Werke Aktiengesellschaft | System und Verfahren zur Innenraumüberwachung eines Fahrzeugs |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005138755A (ja) * | 2003-11-07 | 2005-06-02 | Denso Corp | 虚像表示装置およびプログラム |
JP4760445B2 (ja) * | 2006-02-27 | 2011-08-31 | トヨタ自動車株式会社 | 車両用警報装置 |
JP2008294662A (ja) * | 2007-05-23 | 2008-12-04 | Toyota Motor Corp | 通信装置、通信システム |
JP5228439B2 (ja) | 2007-10-22 | 2013-07-03 | 三菱電機株式会社 | 操作入力装置 |
CN102265261B (zh) * | 2008-12-22 | 2014-07-02 | 丰田自动车株式会社 | 车辆用电子控制系统、车辆用电子控制单元、车辆用控制同步方法 |
US8384534B2 (en) * | 2010-01-14 | 2013-02-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | Combining driver and environment sensing for vehicular safety systems |
WO2012157793A1 (fr) * | 2011-05-17 | 2012-11-22 | Lg Electronics Inc. | Procédé et appareil de reconnaissance de geste |
US20120312956A1 (en) * | 2011-06-11 | 2012-12-13 | Tom Chang | Light sensor system for object detection and gesture recognition, and object detection method |
DE102011089195A1 (de) * | 2011-06-30 | 2013-01-03 | Johnson Controls Gmbh | Vorrichtung und Verfahren zur berührungslosen Erfassung von Gegenständen und/oder Personen und von diesen ausgeführten Gesten und/oder Bedienvorgängen |
US8830302B2 (en) * | 2011-08-24 | 2014-09-09 | Lg Electronics Inc. | Gesture-based user interface method and apparatus |
CN103782255B (zh) * | 2011-09-09 | 2016-09-28 | 泰利斯航空电子学公司 | 交通工具娱乐系统的眼动追踪控制 |
WO2013176265A1 (fr) * | 2012-05-25 | 2013-11-28 | 国立大学法人静岡大学 | Procédé de détection de pupille, procédé de détection de réflexe cornéen, procédé de détection de position du visage, et procédé de suivi de pupille |
JP5743221B2 (ja) * | 2012-06-29 | 2015-07-01 | カシオ計算機株式会社 | 無線同期システム、無線装置、センサ装置、無線同期方法、及びプログラム |
WO2014032822A2 (fr) | 2012-08-29 | 2014-03-06 | Johnson Controls Gmbh | Dispositif et procédé de commande de fonctions d'un véhicule par détection des interactions d'un utilisateur |
DE102012110460A1 (de) | 2012-10-31 | 2014-04-30 | Audi Ag | Verfahren zum Eingeben eines Steuerbefehls für eine Komponente eines Kraftwagens |
- 2014-12-12: DE application DE102014118387.8A filed (DE102014118387A1, not_active Withdrawn)
- 2015-12-04: EP application EP15805454.4A filed (EP3230828A1, not_active Withdrawn)
- 2015-12-04: US application US15/535,177 filed (US20170323165A1, not_active Abandoned)
- 2015-12-04: PCT application PCT/EP2015/078623 filed (WO2016091736A1, active Application Filing)
- 2015-12-04: JP application JP2017531284A filed (JP2018503899A, active Pending)
- 2015-12-04: CN application CN201580073953.3A filed (CN107209558A, active Pending)
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2016091736A1 * |
Also Published As
Publication number | Publication date |
---|---|
JP2018503899A (ja) | 2018-02-08 |
DE102014118387A1 (de) | 2016-06-16 |
WO2016091736A8 (fr) | 2017-12-14 |
CN107209558A (zh) | 2017-09-26 |
WO2016091736A1 (fr) | 2016-06-16 |
US20170323165A1 (en) | 2017-11-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102013012466B4 (de) | Bediensystem und Verfahren zum Bedienen einer fahrzeugseitigen Vorrichtung | |
WO2016091736A1 (fr) | Dispositif de détection pour connaître un geste et/ou une direction du regard d'un occupant d'un véhicule par commande synchrone d'unités d'éclairage | |
DE102014101208A9 (de) | Montagemodul | |
DE102016220979A1 (de) | Verfahren und Vorrichtung zum Steuern eines Betriebs eines Seiten- und Hecküberwachungs-Kamerasystems | |
DE102016206126A1 (de) | Verfahren und Vorrichtung zum Überwachen oder Regeln einer Fahraufgabe-Übergabe in einem selbstfahrenden Fahrzeug und System für eine Fahraufgabe-Übergabe in einem selbstfahrenden Fahrzeug | |
DE102016113808A1 (de) | Fahrspurerkennungsvorrichtung | |
EP3790768B1 (fr) | Dispositif et procédé servant à faire fonctionner une identification d'objets pour l'habitacle d'un véhicule automobile, et véhicule automobile | |
DE102013019210A1 (de) | Beleuchtungsvorrichtung für den Fahrgastraum eines Kraftfahrzeugs und Verfahren zum Steuern der Beleuchtungsvorrichtung | |
DE102018118849A1 (de) | Verstellbare gestapelte Filter für Fahrzeugkameras | |
DE102019005448B4 (de) | Vorrichtung und Verfahren zur Steuerung eines Beifahrersitzes in einem Fahrzeug | |
EP3254172B1 (fr) | Détermination d'une position d'un objet étranger à un véhicule dans un véhicule | |
DE102009002979A1 (de) | Projektionsanzeigevorrichtung für Fahrzeuge | |
DE102013020950B4 (de) | Verfahren zum Betreiben eines Rückansichtskamerasystems eines Kraftfahrzeugs, Rückansichtskamerasystem und Kraftfahrzeug | |
WO2018206213A1 (fr) | Procédé et dispositif de détection à résolution spatiale d'un objet externe à un véhicule à l'aide d'un capteur intégré dans un véhicule | |
WO2016067082A1 (fr) | Procédé et dispositif de commande gestuelle dans un véhicule | |
DE102014224484A1 (de) | Verfahren zur Anpassung des äußeren Erscheinungsbilds eines Kraftfahrzeugs und Kraftfahrzeug mit einem anpassbaren äußeren Erscheinungsbild | |
DE102012018685B4 (de) | System und Verfahren zur Steuerung von zumindest einem Fahrzeugsystem mittels von einem Fahrer durchgeführter Gesten | |
DE10257963A1 (de) | Verfahren und Vorrichtung zur Bestimmung der 3D-Position von PKW-Insassen | |
DE102015010421A1 (de) | Dreidimensionale Erfassung des Fahrzeuginnenraums | |
DE102016211495A1 (de) | Steuerungseinrichtung für ein Kraftfahrzeug | |
EP3583488B1 (fr) | Activation automatisée d'un système d'assistance visuelle | |
DE102019004692B3 (de) | Vorrichtung und Verfahren zur Ermittlung von Bilddaten der Augen, von Augenpositionen und/oder einer Blickrichtung eines Fahrzeugnutzers in einem Fahrzeug | |
DE112020003760T5 (de) | Signalverarbeitungsvorrichtung, signalverarbeitungsverfahren und bildgebungseinrichtung | |
DE102019132460A1 (de) | Einstellen eines Rückspiegels an einem Fahrzeug | |
DE102016011016A1 (de) | Verfahren zum Betrieb eines Assistenzsystems |
Legal Events
- PUAI: Public reference made under article 153(3) EPC to a published international application that has entered the European phase (ORIGINAL CODE: 0009012)
- 17P: Request for examination filed (Effective date: 2017-06-09)
- AK: Designated contracting states (Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR)
- AX: Request for extension of the European patent (Extension state: BA ME)
- DAV: Request for validation of the European patent (deleted)
- DAX: Request for extension of the European patent (deleted)
- 17Q: First examination report despatched (Effective date: 2019-09-10)
- STAA: Information on the status of an EP patent application or granted EP patent (STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN)
- 18D: Application deemed to be withdrawn (Effective date: 2020-01-21)