GB2568512A - Vehicle controller - Google Patents
Vehicle controller
- Publication number
- GB2568512A GB2568512A GB1719071.1A GB201719071A GB2568512A GB 2568512 A GB2568512 A GB 2568512A GB 201719071 A GB201719071 A GB 201719071A GB 2568512 A GB2568512 A GB 2568512A
- Authority
- GB
- United Kingdom
- Prior art keywords
- hand
- control device
- dependence
- illumination source
- distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q3/00—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
- B60Q3/10—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors for dashboards
- B60Q3/16—Circuits; Control arrangements
- B60Q3/18—Circuits; Control arrangements for varying the light intensity
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q3/00—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
- B60Q3/10—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors for dashboards
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q3/00—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
- B60Q3/10—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors for dashboards
- B60Q3/12—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors for dashboards lighting onto the surface to be illuminated
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q3/00—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
- B60Q3/70—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors characterised by the purpose
- B60Q3/76—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors characterised by the purpose for spotlighting, e.g. reading lamps
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q3/00—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
- B60Q3/80—Circuits; Control arrangements
- B60Q3/85—Circuits; Control arrangements for manual control of the light, e.g. of colour, orientation or intensity
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/141—Activation of instrument input devices by approaching fingers or pens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/21—Optical features of instruments using cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/33—Illumination features
- B60K2360/334—Projection means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/33—Illumination features
- B60K2360/345—Illumination of controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q2500/00—Special features or arrangements of vehicle interior lamps
- B60Q2500/20—Special features or arrangements of vehicle interior lamps associated with air conditioning arrangements
Abstract
A method of controlling an illumination source associated with a control device located in a vehicle. The method comprises detecting a hand of a vehicle occupant within a volume of space within the vehicle cabin; determining the relative position of the detected hand with respect to the control device; and controlling the illumination source in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
Description
The present disclosure relates to a vehicle controller. Aspects of the invention relate to a controller for controlling an illumination source associated with a control device located in a vehicle cabin, to a vehicle comprising the controller, to a system for controlling an illumination source associated with the control device located in a vehicle cabin, and to a method of controlling an illumination source associated with a control device located in a vehicle.
BACKGROUND
Current vehicles, in particular automobiles, are often configured to illuminate control devices, such as switches and other controls located within the vehicle cabin, during poor lighting conditions, for example at night. This assists the vehicle occupants to locate and operate the control devices. In order to aid operation, it is common for each control device located within the vehicle cabin to be provided with an illumination source configured to automatically illuminate the control device in response to a control signal. In some known applications the control signal may be generated in response to the vehicle head lights being activated. Activation of the vehicle head lights is taken to be indicative of the vehicle being operated in poor lighting conditions. Similarly, it is known to automatically activate the control device’s illumination source in dependence on an ambient light sensor indicating that the vehicle is being operated in poor lighting conditions.
A disadvantage associated with the aforementioned prior art solutions is that either all of the control devices are illuminated or none of them are illuminated. Accordingly, the power demands associated with the prior art solutions are significant, and present a significant drain on battery resources.
It is an aim of at least certain embodiments of the present invention to ameliorate disadvantages associated with the prior art, and in particular to reduce the power requirements for illuminating control devices within the vehicle cabin.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a method, a controller, a system, a vehicle, a computer program product and a computer readable data carrier as claimed in the appended claims.
According to an aspect of the present invention there is provided a method of controlling an illumination source associated with a control device located in a vehicle. The method may comprise detecting a hand of a vehicle occupant within a volume of space within the vehicle cabin; determining the relative position of the detected hand with respect to the control device; and controlling the illumination source in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
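The threshold test at the heart of the claimed method can be illustrated with a short sketch. This is a minimal, hypothetical illustration only: the point representation (3D coordinates in metres in a common cabin frame), the function names and the threshold value are all assumptions, not part of the patent.

```python
import math

THRESHOLD_M = 0.10  # assumed predefined threshold distance, in metres


def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


def should_illuminate(hand_points, control_pos, threshold=THRESHOLD_M):
    """Trigger illumination if ANY detected portion of the hand lies at a
    distance less than or equal to the threshold from the control device,
    matching the 'at least a portion of the hand' wording of the method."""
    return any(distance(p, control_pos) <= threshold for p in hand_points)
```

Checking every detected hand point, rather than only a centroid, reflects the claim language that a single portion of the hand within the threshold distance suffices.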
In this way it is possible to selectively illuminate individual control devices in dependence on a vehicle occupant’s hand being located within a predefined threshold distance of the desired control device. This avoids the problem of having to maintain all control devices illuminated, and reduces battery power consumption. A further benefit is that the amount of glare within the vehicle cabin is reduced, which is particularly beneficial when driving in poorly illuminated environments, such as at night.
The relative position of the detected hand may be determined with respect to the control device directly or indirectly, for example by determining the position of the detected hand with respect to a designated origin.
The method may comprise determining the relative position of the hand with respect to a control proximity boundary associated with the control device, the control proximity boundary defining a boundary offset from the control device by the predefined threshold distance, and controlling the illumination source in dependence on the position of at least a portion of the hand intersecting the control proximity boundary. The proximity boundary may define a volume of space adjacent to the control device, such that the illumination source may be controlled in dependence on the vehicle occupant’s hand being at or within the proximity boundary.
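One plausible realisation of the proximity boundary, sketched here purely for illustration, is the control device's bounding box inflated by the threshold offset; the axis-aligned-box formulation and all names are assumptions.

```python
def intersects_proximity_boundary(hand_points, control_min, control_max, offset):
    """Return True if any hand point lies at or within the proximity
    boundary, modelled as the control device's axis-aligned bounding box
    (control_min..control_max, per axis) inflated by `offset` metres."""
    lo = tuple(c - offset for c in control_min)
    hi = tuple(c + offset for c in control_max)
    return any(
        all(l <= p <= h for p, l, h in zip(pt, lo, hi))
        for pt in hand_points
    )
```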
The illumination source may comprise an ON-state in which the illumination source is active, and an OFF-state in which the illumination source is not active. The method may comprise activating the illumination source, when the illumination source is in the OFF-state, in dependence on the distance between at least a portion of the hand and the control device being less than or equal to the predefined threshold distance; and/or deactivating the illumination source, when the illumination source is in the ON-state, in dependence on the distance between at least a portion of the hand and the control device being greater than the predefined threshold distance. In this way, advantageously, the control device’s illumination source may either be activated or deactivated, as required, without requiring any physical contact with the control device. In this way it is also possible to activate or deactivate, as the case may be, the illumination source, without the driver having to divert their line of sight from the road.
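The ON/OFF behaviour described above amounts to a two-state machine driven by the hand distance; the following sketch is a hypothetical implementation (class and method names assumed).

```python
class IlluminationSource:
    """Minimal ON/OFF model: activate when the hand comes within the
    threshold distance, deactivate when it moves beyond it."""

    def __init__(self):
        self.on = False  # OFF-state initially

    def update(self, hand_distance, threshold):
        if not self.on and hand_distance <= threshold:
            self.on = True   # OFF -> ON
        elif self.on and hand_distance > threshold:
            self.on = False  # ON -> OFF
        return self.on
```

In practice one might use a slightly larger deactivation threshold than the activation threshold (hysteresis) so that a hand hovering near the boundary does not make the lamp flicker; the patent text itself only requires the single threshold shown.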
In certain embodiments the method may comprise monitoring the relative position of the hand with respect to the control device, and varying an intensity of illumination associated with the illumination source in dependence on the relative position of the hand with respect to the control device. For example, this may comprise increasing the intensity of illumination in dependence on a decrease of a distance associated with the relative position of the hand with respect to the control device. Similarly, this may comprise decreasing the intensity of illumination in dependence on an increase of a distance associated with the relative position of the hand with respect to the control device. This provides a convenient way of varying the intensity of illumination without requiring physical contact with the control device or any other device for that matter.
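A simple linear ramp is one plausible mapping from hand distance to intensity, sketched below; the patent only requires intensity to increase as the hand approaches and decrease as it withdraws, so the linear form and names here are assumptions.

```python
def intensity(hand_distance, threshold, max_intensity=1.0):
    """Ramp intensity from 0 at the threshold distance up to
    max_intensity at contact (hand_distance == 0)."""
    if hand_distance >= threshold:
        return 0.0
    return max_intensity * (1.0 - hand_distance / threshold)
```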
The method may comprise determining if the hand is the vehicle occupant’s left or right hand, and controlling the illumination source in dependence on whether the hand is the vehicle occupant’s left or right hand. This may be achieved by determining if the hand is oriented palm upwards or downwards within the volume of space relative to the control device, and determining if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand. In this way it is possible to control operation of the illumination source in dependence on whether the hand is a left or right hand.
The method may comprise determining if the hand is a hand of a first occupant of the vehicle or of a second occupant of the vehicle, and controlling the illumination source in dependence on whom the hand belongs to.
The method may comprise determining if the hand is a hand of a driver of the vehicle or of a passenger of the vehicle, and controlling the illumination source in dependence on whom the hand belongs to. In this way, advantageously, it is possible to selectively restrict control of the illumination source in dependence on which vehicle occupant the hand belongs to. This may be particularly advantageous where the driver of a vehicle may not want a vehicle occupant interfering with the illumination source of a specific control device.
The method may comprise determining a direction of entry of the hand into the volume of space relative to the control device. The direction of entry of the hand may be indicative of which vehicle occupant the hand belongs to, and therefore provides a convenient way of identifying whom the hand belongs to, which in turn may be used for selectively restricting control of the illumination source to specific vehicle occupants.
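The direction-of-entry heuristic can be sketched as follows. This is an assumed, illustrative rule (a hand entering the monitored volume from the driver's side is attributed to the driver); the seat layout, function names and the restriction policy are not specified by the patent.

```python
def hand_owner(entry_side, driver_side="right"):
    """Attribute a hand to an occupant from its direction of entry into
    the monitored volume: entry from the driver's side of the volume is
    taken to mean the hand belongs to the driver."""
    return "driver" if entry_side == driver_side else "passenger"


def is_authorised(owner, restricted_to=("driver",)):
    """Optionally restrict control of a given illumination source to
    specific occupants (here, by default, the driver only)."""
    return owner in restricted_to
```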
In certain embodiments, the method may comprise obtaining image data of the hand within the volume of space; receiving a reflectance signal reflected from the hand; determining a distance of the hand from a designated origin in dependence on the reflectance signal; and determining the relative position of the detected hand with respect to the control device in dependence on the distance of the hand relative to the designated origin, the obtained image data, and a known distance of the control device relative to the designated origin. For example, the time taken for the reflectance signal to be measured by a receiver is proportional to the distance of the hand from the sensor, and therefore provides a convenient way for distance information associated with the position of the hand to be obtained. In this way it is possible to determine the distance of the hand from the control device on the basis of a two-dimensional image of the hand relative to the control device, and distance information of the hand. This significantly simplifies the hardware required to carry out the method, and in particular obviates the need for using a complex system of two or more cameras, each configured to capture different perspective images of the hand relative to the control device, from which the distance of the hand relative to the control device may be determined.
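The distance calculation follows from the round trip of the reflectance signal: it travels to the hand and back, so the one-way distance is c·t/2. Combining that depth with the pixel position of the hand then recovers a 3D point; the pinhole camera model and intrinsic parameters (fx, fy, cx, cy) used below are assumptions for illustration.

```python
C = 299_792_458.0  # speed of light, m/s


def tof_distance(time_of_return_s):
    """Round-trip time of the reflected signal to one-way distance."""
    return C * time_of_return_s / 2.0


def backproject(u, v, depth, fx, fy, cx, cy):
    """Recover a 3D point from pixel (u, v) and ToF depth, assuming a
    pinhole camera with focal lengths fx, fy and principal point (cx, cy).
    The single camera plus depth replaces a two-camera stereo setup."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```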
In certain embodiments the designated origin may be coincident with the position of an image capture device.
According to a further aspect of the invention there is provided a controller for controlling an illumination source associated with a control device located in a vehicle cabin. The controller may comprise an input configured to receive image data obtained by an image capture device, a processor and an output. The processor may be arranged in use to: recognise a hand of a vehicle occupant from the image, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device; and determine a position of the hand with respect to the control device. The output may be arranged in use to output a control signal to the illumination source in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance. The present controller benefits from the same advantages as set out in respect of the preceding aspect of the invention.
The processor may be arranged in use to determine the position of the hand with respect to a control proximity boundary associated with the control device, the control proximity boundary defining a boundary offset from the control device by the predefined threshold distance. The output may be arranged in use to output the control signal in dependence on the position of at least a portion of the hand intersecting the control proximity boundary.
The processor may be arranged in use to recognise the hand in image data associated with a sequence of images captured by the image capture device, the sequence of images comprising two or more image frames; and determine if the position of the hand with respect to the control device varies in the sequence of images. The output may be arranged in use to output a control signal configured to vary an intensity of illumination of the illumination source in dependence on the determined variation of the position of the hand with respect to the control device. The output may be arranged in use to output a control signal configured to increase the intensity of illumination in dependence on a decrease in a distance associated with the position of the hand with respect to the control device. The output may be arranged in use to output a control signal configured to decrease the intensity of illumination in dependence on an increase in a distance associated with the position of the hand with respect to the control device.
The processor may be arranged in use to determine if the hand is the vehicle occupant’s left or right hand; and the output may be arranged in use to output the control signal in dependence on whether the hand is the vehicle occupant’s left or right hand.
The processor may be arranged in use to determine if the hand is oriented palm upwards or downwards within the volume of space relative to the control device; and determine if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
The processor may be arranged in use to determine if the hand is a hand of a first occupant of the vehicle or of a second occupant of the vehicle, and the output may be arranged in use to output the control signal in dependence on whom the hand belongs to.
The processor may be arranged in use to determine if the hand is a hand of a driver or of a passenger of the vehicle; and the output may be arranged in use to output the control signal in dependence on whom the hand belongs to.
The processor may be arranged in use to determine a direction of entry of the hand into the volume of space relative to the control device.
In certain embodiments the input may be configured to receive data from a 3D mapping device configured to generate a three-dimensional model of the vehicle occupant’s hand located within a volume of space within the vehicle cabin. The processor may be configured to determine the relative position of the hand with respect to the control device from the three-dimensional model.
In certain embodiments the input may be configured to receive data from a time-of-flight (ToF) image capture device comprising a sensor. The ToF image capture device may be arranged in use to obtain image data of the hand, to illuminate the hand, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor. The data may comprise the image data and the time of return of the reflected illumination signal. The processor may be arranged in use to determine the relative position of the hand with respect to the control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the control device relative to the sensor, wherein the distance of the hand from the sensor is determined in dependence on the time of return of the reflected illumination signal. The ToF image capture device provides a convenient means for obtaining image data associated with image object distance data, and therefore simplifies determining the distance of the hand from the control device.
In accordance with yet a further aspect of the invention, there is provided a system for controlling an illumination source associated with a control device located in a vehicle cabin. The system may comprise the aforementioned controller and an image capture device. Optionally, the image capture device may comprise a time of flight (ToF) image capture device.
In certain embodiments the system may comprise the illumination source. The illumination source may be arranged to provide a backlighting function to the control device.
In accordance with yet a further aspect of the invention, there is provided a vehicle comprising the aforementioned controller, or the aforementioned system.
In accordance with yet a further aspect of the invention, there is provided a vehicle configured to carry out the aforementioned method.
In accordance with yet a further aspect of the invention, there is provided a computer program product comprising instructions for carrying out the aforementioned method.
The computer program product may comprise instructions, which when executed on a processor, configure the processor to: detect a hand of a vehicle occupant within a volume of space within the vehicle cabin; determine the relative position of the detected hand with respect to the control device; and control the illumination source in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
In accordance with yet a further aspect of the invention, there is provided a computer readable data carrier having stored thereon instructions for carrying out the aforementioned method. Optionally, the computer readable data carrier comprises a non-transitory computer readable data carrier.
The data carrier may comprise instructions, which when executed on a processor, configure the processor to: detect a hand of a vehicle occupant within a volume of space within the vehicle cabin; determine the relative position of the detected hand with respect to the control device; and control the illumination source in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic cut-away illustration of a front portion of a vehicle cabin having a camera having a field of view arranged to obtain image data of a vehicle occupant’s hand within a volume of space within a vehicle cabin;
Figure 2 is a schematic, magnified illustration of a portion of the control panel of the vehicle cabin of Figure 1, showing control devices that are illuminated in dependence on a relative distance between at least a portion of the vehicle occupant’s hand and the control devices;
Figure 3 is a schematic of a controller configured to enable operation of an illumination source associated with a control device located in the vehicle cabin of Figure 1;
Figure 4 is a process flow chart outlining a method for activating an illumination source associated with a control device within a vehicle cabin, in dependence on the proximity of a hand to the control device, using the camera of Figure 1;
Figure 5 is a schematic illustration highlighting the principle of operation of a Time-of-Flight (ToF) camera, which may be used to determine the position of a vehicle occupant’s hand within the vehicle cabin of Figure 1;
Figures 6a and 6b are schematic illustrations showing a three-dimensional point cloud of a vehicle occupant’s hand generated using the ToF camera of Figure 5; and
Figure 7 is a schematic illustration of a vehicle comprising the camera of Figure 1 and the controller of Figure 3.
DETAILED DESCRIPTION
Figure 1 is a cut-away perspective view of a portion of the vehicle cabin 1, and in particular shows the driver 3 sitting in the driver’s seat 5. An image capture device in the form of a camera 7, having a field of view 9 delineated in Figure 1 by lines 11, is shown located in the cabin roof. Optionally, the camera 7 may comprise a ToF camera. The camera 7 is arranged to image objects located within the camera’s field of view 9. The field of view defines a volume of space within the vehicle cabin within which objects are imaged by the camera 7. The camera 7 is arranged such that a control panel 13 of the vehicle 1 lies within the camera’s field of view 9. The control panel 13 comprises a plurality of different control devices 15, which may relate to, but are not limited to: air ventilation switches; air conditioning switches; vehicle infotainment system controls; air circulation switches; and any other control device configured to operate a control system of the vehicle. Each control device 15 may be provided with its own illumination source 16 (see Figure 2) arranged to illuminate the control device, to facilitate operation in poor ambient lighting conditions. For example, in many instances the illumination source may be configured to provide a backlighting function. The vehicle cabin 1 may be comprised in the vehicle 43 of Figure 7.
Figure 2 provides a magnified perspective view of a portion of the control panel 13 of the vehicle 1, in which the illumination sources 16 associated with three control devices 15 are activated to provide a backlighting function. Activation of each illumination source 16 is dependent on a relative distance between at least a portion of a vehicle occupant’s hand 17 and the associated control device 15. This is explained in further detail in the ensuing description.
In certain embodiments, the camera 7 may be operatively coupled to a controller 19 (shown in Figure 3), which is configured to receive image data obtained by the camera 7 and to output a control signal to the illumination source 16 of a specific control device 15 in dependence on an analysis of the received image data. This enables selective operation of the associated illumination source.
Figure 3 provides a functional overview of the controller 19. The controller 19 may be functionally embedded into an existing electronic control unit of the vehicle 1. The controller 19 may be provided with an input 21 and an output 23. The input 21 may be configured to receive image data obtained by the camera 7, and the output 23 may be configured to output a control signal to the illumination source 16 associated with a given control device 15. In addition, the controller 19 comprises a processor 25 arranged to analyse image data received from the camera 7, to identify image objects such as the hand 17 of a vehicle occupant within the obtained image data, and to generate control signals for controlling operation of the illumination sources 16 of associated control devices 15 in dependence on the relative position of the hand 17 with respect to the control devices 15.
In use, as a vehicle occupant’s hand 17 is captured by the camera 7, and its position relative to a control device 15 is determined by the controller 19, typically by the processor 25 of the controller 19, the illumination source 16 of the associated control device 15 may be automatically activated by the controller 19 via a control signal output to the illumination source 16 associated with the desired control device 15. For example, the illumination source 16 associated with the control device 15 with respect to which the imaged vehicle occupant’s hand 17 is determined to be closest may be activated in this way. This means that the illumination sources 16 associated with the control devices 15 that the vehicle occupant is interested in operating are selectively and individually operable.
In certain embodiments, the controller 19 may be configured in use to output the control signal in dependence on a distance between at least a portion of a vehicle occupant’s hand 17 and the desired control device 15 being less than or equal to a predefined threshold distance. For example, as the camera 7 captures image data of a vehicle occupant’s hand, such as the driver’s hand 17, the controller 19 may be configured to identify the image of the hand within the received image data. The relative position of the imaged hand with respect to a desired control device 15 may then be determined from the obtained image data. In order to identify an image of a hand, the controller 19, and specifically the processor 25, may be configured with image recognition software arranged to identify a vehicle occupant’s hand 17 located within the camera’s field of view 9 from obtained image data.
Figure 4 is a process flow chart outlining the method used, in accordance with certain embodiments of the invention, to control operation of the illumination source 16 associated with the control device 15 in dependence on the proximity of a vehicle occupant’s hand to the control device 15, using the camera 7 in operative communication with the controller 19. The method is initiated by the camera 7 obtaining image data within the vehicle cabin, at step 301. In certain embodiments the camera 7 may be configured to obtain image data continuously, or periodically at a predefined frequency. The obtained image data may be forwarded to the controller 19 for analysis where, at step 303, it is analysed to identify a vehicle occupant’s hand 17 within the obtained image data. As mentioned previously, this may comprise the use of image recognition software. Once a vehicle occupant’s hand 17 has been identified within the obtained image data, the position of the hand 17 relative to the control device 15 is determined, at step 305. Where the vehicle control panel 13 comprises a plurality of different control devices 15, step 305 may comprise determining the position of the hand 17 relative to the nearest control device 15. The position of the hand 17 relative to the control device 15 may be determined by the processor 25. At step 307 it is determined whether at least a portion of the hand 17 lies at a distance that is less than or equal to a predefined threshold distance from the control device 15. If no portion of the hand lies within the predefined threshold distance, the processor 25 continues to analyse newly obtained image data, and the method returns to step 303.
If instead it is determined by the processor 25 that at least a portion of the identified hand lies within the predefined threshold distance of the control device 15, then the processor generates a control signal for output to the relevant control device’s illumination source, at step 308. Upon receipt of the control signal, at step 310, the illumination source 16 of the associated control device 15 is activated.
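The flow of steps 305 to 310 can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: it assumes the detected hand is represented as a set of 3D points expressed in the same coordinate frame as the control device, and the threshold value is a hypothetical example.

```python
import math

THRESHOLD_CM = 5.0  # hypothetical predefined threshold distance

def hand_within_threshold(hand_points, device_pos, threshold=THRESHOLD_CM):
    """Step 307: True if any point of the detected hand lies at a distance
    less than or equal to the threshold from the control device."""
    return any(math.dist(p, device_pos) <= threshold for p in hand_points)

def control_step(hand_points, device_pos):
    """One pass of steps 305-310: returns 'ACTIVATE' when the hand is close
    enough (step 308, control signal to the illumination source), otherwise
    None, in which case the method loops back to step 303."""
    if hand_points and hand_within_threshold(hand_points, device_pos):
        return "ACTIVATE"
    return None
```

In a real controller this step would run on each frame of image data received from the camera, with the hand points produced by the image recognition stage.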
In certain embodiments the predefined threshold distance may relate to a few centimetres, for example any distance in the range of 1 cm to 10 cm inclusive. In certain embodiments the predefined threshold may delineate a control proximity boundary surrounding and offset from the control device 15 by the predefined threshold distance, which control proximity boundary, when intersected by at least a portion of the vehicle occupant’s hand, causes the controller 19 to generate the control signal for output to the relevant control device’s illumination source.
The control proximity boundary may take any geometric shape; for example, it may be box-shaped or spherical. Effectively, the control proximity boundary relates to a volume of space offset from the control device by the predefined threshold distance. In dependence on any portion of the control proximity boundary being intersected by at least a portion of the vehicle occupant’s hand, the controller generates the control signal for activating the associated control device’s illumination source. It is to be appreciated that not all portions of the control proximity boundary need be offset from the control device by the predefined threshold distance. For example, where the control proximity boundary is box-shaped (e.g. cube-shaped), some faces of the cube may not be offset from the control device by the predefined threshold distance.
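A box-shaped control proximity boundary reduces to an axis-aligned containment test. The following sketch assumes the boundary is given by two opposite corners in the same coordinate frame as the hand points; the representation is an illustrative assumption, not taken from the source.

```python
def intersects_box_boundary(hand_points, box_min, box_max):
    """True if at least one hand point lies inside or on the box-shaped
    control proximity boundary defined by opposite corners box_min/box_max."""
    return any(
        all(lo <= p[i] <= hi for i, (lo, hi) in enumerate(zip(box_min, box_max)))
        for p in hand_points
    )
```

A spherical boundary would instead compare each point's distance from the device centre against a radius, as in the threshold test above.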
In certain embodiments, in order to enable the position of the hand 17 to be determined relative to the control device 15, the camera 7 may relate to a 3D mapping device arranged to generate a 3D model of the hand within the field of view 9. For example, in certain embodiments the camera 7 may relate to a Time-of-Flight (ToF) camera, in which each captured image pixel is associated with a distance on the basis of a time of return of a reflected illumination signal. To achieve this, the ToF camera may be configured with an illumination source arranged to illuminate the camera’s field of view. The incident illumination signal is subsequently reflected by objects present in the camera’s field of view, and the time of return of the reflected illumination signal is measured. In this way it is possible to associate a distance measurement with each imaged object. The illumination signal may relate to any electromagnetic signal, and need not be comprised in the visible spectrum. For example, in certain embodiments the illumination signal may operate in the infrared spectrum.
In those embodiments where the camera 7 comprises a ToF camera, the controller 19, and specifically the input 21 may be configured to receive both camera image data and image object distance information data from the ToF camera. This enables the controller 19, and more specifically the processor 25 to determine the position of the vehicle occupant’s hand 17 relative to a control device 15 from the received data.
Figure 5 is a schematic diagram illustrating the principle of operation of a ToF camera 27. A modulated illumination source 29 is used to illuminate a desired target 31. The incident illumination 33 is reflected by the target 31 and captured on a sensor 35 comprising an array of pixels. Whilst capturing the reflected modulated light 37, the pixels of the sensor 35 also capture visible light reflected from the target. Since the illumination signal 33 is modulated, it may be distinguished from the visible light reflected from the target 31, which enables the time of flight of the modulated illumination signal to be measured, that is, the time taken for the signal to travel to the target 31 and be reflected back to the sensor 35. In this way, each captured image pixel may be associated with a distance of the corresponding image object on the basis of the measured time of flight of the reflected modulated illumination signal 37. More specific details regarding the operation of ToF cameras are widely available in the art, and a more detailed discussion is therefore unnecessary for present purposes.
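The underlying distance relation can be illustrated in a few lines. This sketch assumes a direct pulsed time-of-return; practical ToF sensors of the kind described typically recover the round-trip time indirectly from the phase shift of the modulated signal, but the distance relation is the same.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(time_of_return_s):
    """Convert the measured round-trip time of the reflected illumination
    signal into a one-way distance: the signal travels to the target and
    back, hence the division by two."""
    return C * time_of_return_s / 2.0
```

At cabin scales the round-trip times are on the order of nanoseconds (about 6.7 ns per metre of distance), which is why modulated-phase measurement is the usual practical approach.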
Where the camera 7 of Figure 1 comprises a ToF camera 27, it is possible to generate a three-dimensional point cloud of the vehicle occupant’s hand located within the camera’s field of view 9. Figures 6a and 6b illustrate an example of a three-dimensional point cloud 39 of the vehicle occupant’s hand 17, generated using the ToF camera 27. In certain embodiments the controller 19 may be configured to generate the three-dimensional point cloud using the image data and image object distance information received from the ToF camera 27. Figure 6a shows a point cloud 39 of the vehicle occupant’s hand 17 as it is approaching a rectangular-shaped control proximity boundary 41. In Figure 6b a portion of the point cloud 39 of the vehicle occupant’s hand 17 is intersecting a portion of the control proximity boundary 41. In this event, and as mentioned previously, the controller 19 is configured to generate a control signal for activating the control device’s illumination source.
In order to enable the position of the vehicle occupant’s hand 17 to be determined relative to a control device 15, the position of the control device 15 relative to the ToF camera may be determined. Again, this may be done using image recognition software. Since the position of the control device 15 relative to the ToF camera 27 is known, and the position of the vehicle occupant’s hand 17 relative to the ToF camera 27 is known, the position of the vehicle occupant’s hand 17 relative to the control device 15 may be determined using basic trigonometry. In certain embodiments, and in order to facilitate computation during use, the controller 19 may be provided with distance information of each control device 15 relative to the ToF camera 27 during an initial configuration of the ToF camera 27. This distance information may be stored and accessed for subsequent use when required. This facilitates subsequent computation of the position of the hand relative to the control device, since only the distance of the vehicle occupant’s hand 17 with respect to the ToF camera 27, and hence its position relative to the known position of the control device 15, requires calculation.
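With the hand and the control device both located in the ToF camera’s coordinate frame, the trigonometry mentioned above reduces to a vector subtraction. A sketch, assuming all positions are 3D coordinates in metres relative to the camera:

```python
import math

def hand_to_device_distance(hand_pos_cam, device_pos_cam):
    """Both positions are expressed in the ToF camera's coordinate frame
    (the device position having been stored at initial configuration), so
    the hand-to-device offset is the difference of the two vectors, and
    the separation is its Euclidean norm."""
    offset = tuple(h - d for h, d in zip(hand_pos_cam, device_pos_cam))
    return offset, math.hypot(*offset)
```

The returned separation is what would be compared against the predefined threshold distance.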
In certain embodiments the controller 19 may be configured to deactivate an activated illumination source associated with a control device 15 in dependence on the distance between at least a portion of the vehicle occupant’s hand 17 and the control device 15 being less than or equal to the predefined threshold distance, in a similar manner as previously described. In such embodiments it is envisaged that the vehicle occupant may deactivate an activated illumination source by again moving their hand to a distance less than or equal to the predefined threshold distance.
In alternative embodiments the controller 19 may be configured to deactivate an activated illumination source associated with a control device 15 in dependence on the distance between at least a portion of the vehicle occupant’s hand 17 and the control device 15 being greater than or equal to the predefined threshold distance. For example, just as the illumination source associated with the control device 15 is activated by the vehicle occupant positioning their hand such that at least a portion of the hand lies at or within the predefined threshold distance of the control device 15, the activated illumination source may be deactivated by the vehicle occupant retracting their hand to a position that lies at a distance from the control device 15 greater than the predefined threshold distance.
In certain embodiments it is envisaged that once the illumination source has been activated it remains activated for a predefined period of time irrespective of the position of the vehicle occupant’s hand relative to the control device. For example, should the vehicle occupant retract their hand such that it no longer lies within the predefined threshold distance, then the illumination source remains activated for the predefined period of time before deactivating. The predefined period of time may relate to several seconds, e.g. twenty seconds.
In certain embodiments the intensity of illumination of an activated illumination source 16 associated with the control device 15 may be varied as the distance between the vehicle occupant’s hand 17 and the control device 15 changes. For example, once at least a portion of the vehicle occupant’s hand 17 lies at a distance equal to the predefined threshold distance the illumination source 16 may be activated. Further movement of the vehicle occupant’s hand towards the control device 15, thus decreasing the distance between the vehicle occupant’s hand 17 and the control device 15, may result in the intensity of the illumination emitted by the associated illumination source 16 increasing. One way in which this may be achieved is by the controller 19 continuously analysing received image and distance data, and adapting the output control signal in order to increase the emitted illumination in dependence on the distance between the vehicle occupant’s hand 17 and the control device 15 decreasing. Similarly, the controller 19 may be adapted to output a control signal for decreasing the emitted illumination as the vehicle occupant’s hand is retracted. In this case as the distance between the vehicle occupant’s hand 17 and the control device 15 increases, the output control signal results in the intensity of illumination decreasing. Once the vehicle occupant’s hand 17 lies at a distance from the control device 15 that is greater than the predefined threshold distance, the illumination source 16 may be powered off. Alternatively, the illumination source 16 may remain activated for a predefined period of time at a minimum illumination setting before powering off.
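One simple intensity law consistent with the behaviour described is a linear ramp between the threshold distance and the device surface; the specific values below are illustrative assumptions, not taken from the source.

```python
def illumination_intensity(distance_cm, threshold_cm=5.0, max_intensity=1.0):
    """Map hand-to-device distance to an illumination intensity:
    full intensity at the device surface, zero beyond the threshold.
    Beyond the threshold the source is powered off (or, in a variant,
    held at a minimum setting for a period before powering off)."""
    if distance_cm > threshold_cm:
        return 0.0
    return max_intensity * (1.0 - distance_cm / threshold_cm)
```

In use the controller would re-evaluate this mapping on each frame as it continuously analyses the received image and distance data, so the emitted illumination rises as the hand approaches and falls as it retracts.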
In a further embodiment, the controller 19 may be configured to determine from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is the occupant’s left or right hand. The controller 19 may then be configured to output the control signal in dependence on whether the hand 17 is the occupant’s left or right hand. In this way it is possible to restrict illumination of selected control devices within the vehicle cabin dependent on whether a vehicle occupant’s left or right hand is attempting to activate the associated illumination source 16.
In certain embodiments, the controller 19 may be configured to determine whether the vehicle occupant’s hand 17 is oriented palm upwards or downwards relative to the camera 7, and to determine whether the hand 17 is the vehicle occupant’s left or right hand in dependence on that orientation. The orientation may be determined on the basis of the reflectance signal from the hand 17 and by image object analysis. The skin texture of the palm differs from that of the back of the hand, and as a result the amount of incident light absorbed by the palm differs from the amount absorbed by the back of the hand. Accordingly, by configuring the controller to analyse the intensity of the reflected signal, which is indicative of the amount of incident illumination absorbed by the hand, it is possible for the controller to determine whether the hand is oriented palm upwards or downwards.
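A minimal classifier along these lines might threshold the mean reflected-signal intensity over the hand region. Which orientation reflects more light, and the threshold itself, are hypothetical calibration choices here; the source only states that the two orientations absorb different amounts of incident light.

```python
def palm_orientation(mean_reflectance, palm_threshold=0.6):
    """Classify palm-up vs palm-down from the mean reflected-signal
    intensity over the hand's pixels (normalised to [0, 1]).
    The threshold and the direction of the comparison are hypothetical
    calibration values for illustration only."""
    return "palm_up" if mean_reflectance >= palm_threshold else "palm_down"
```

A practical system would calibrate such a threshold per sensor and combine it with shape-based image object analysis rather than reflectance alone.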
In certain embodiments, the controller 19 may be configured to determine which vehicle occupant the imaged hand belongs to. For example, whether the imaged hand 17 belongs to a driver of the vehicle or to a passenger. Operation of the illumination source 16 may then be controlled in dependence on whom the hand belongs to. For example, operation of the illumination source 16 may be controlled in dependence on the hand 17 belonging to the driver 3 of the vehicle. This helps to prevent accidental activation of a control device’s illumination source 16 by a passenger of the vehicle.
One non-limiting way in which the controller 19 may determine which vehicle occupant the hand belongs to is by monitoring and determining a direction of entry of the hand into the camera’s field of view 9 relative to the control device 15. This may be achieved from an analysis by the controller 19 of image data obtained by the camera 7. The direction of entry of the hand 17 into the camera’s field of view 9 may be indicative of where the vehicle occupant is seated in relation to the control device 15, and therefore provides a reliable indication of which vehicle occupant the hand 17 belongs to.
Another non-limiting way in which the controller 19 may determine which vehicle occupant the hand belongs to, is by determining from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is an occupant’s left or right hand, as described in a foregoing embodiment. This may be particularly useful for control devices disposed between two vehicle occupants occupying a common seating row within the vehicle and facing the same direction, e.g. for control devices arranged between the front seat passengers. By way of further explanation, in this example a first occupant is most likely to operate the control devices with a left hand whereas a second occupant is most likely to operate the control devices with a right hand (or vice versa). In this manner, the controller may discriminate between two vehicle occupants seated adjacent one another within the vehicle.
The non-limiting examples described above may be used independently or in combination to discriminate between multiple vehicle occupants and to control operation of the illumination source 16 in dependence on whom the hand belongs to. Whilst the preceding embodiments of the invention have been described within the context of a ToF camera, it is to be appreciated that alternative camera configurations may be used in accordance with the herein described embodiments. Any configuration of cameras may be used that enables image data of a hand relative to a control device to be obtained, and the position of the hand relative to the control device to be determined. For example, a configuration of two or more cameras each configured to enable a different perspective image of the hand relative to the control device to be captured may also be used. In such an arrangement the different perspective images of the hand relative to the control device would enable the controller to determine the position of the hand with respect to the control device by triangulation.
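For the two-camera arrangement just described, the triangulation reduces, in the rectified stereo case, to the classic depth-from-disparity relation. The focal length, baseline and disparity values in this sketch are illustrative assumptions.

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Classic two-camera triangulation for rectified images:
    depth = f * B / d, where f is the focal length in pixels, B the
    baseline between the cameras in metres, and d the horizontal
    disparity (in pixels) of the same hand feature in the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Larger disparities correspond to closer objects, which is why a wider baseline improves depth resolution for nearby targets such as a hand approaching a control panel.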
Similarly, in an alternative embodiment, the ToF camera of the preceding embodiments may be replaced by a conventional camera, in combination with an optical ruler, such as a LIDAR for example. In such an embodiment the LIDAR provides the image object distance information, whilst the camera provides image data. The controller may be configured in such embodiments to analyse the LIDAR data in combination with the obtained image data in order to determine the position of the vehicle occupant’s hand relative to the control device.
It is to be appreciated that many modifications may be made to the above examples and embodiments without departing from the scope of the present invention as defined in the accompanying claims.
Claims (30)
1. A method of controlling an illumination source associated with a control device located in a vehicle cabin, the method comprising:
detecting a hand of a vehicle occupant within a volume of space within the vehicle cabin;
determining the relative position of the detected hand with respect to the control device; and controlling the illumination source in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
2. The method of claim 1, comprising:
determining the relative position of the hand with respect to a control proximity boundary associated with the control device, the control proximity boundary defining a boundary offset from the control device by the predefined threshold distance; and controlling the illumination source in dependence on the position of at least a portion of the hand intersecting the control proximity boundary.
3. The method of claim 1 or 2, wherein the illumination source comprises an ON-state in which the illumination source is active, and an OFF-state in which the illumination source is not active, and the method comprises:
activating the illumination source, when the illumination source is in the OFF-state, in dependence on the distance between at least a portion of the hand and the control device being less than or equal to the predefined threshold distance; or deactivating the illumination source, when the illumination source is in the ON-state, in dependence on the distance between at least a portion of the hand and the control device being greater than or equal to the predefined threshold distance.
4. The method of any preceding claim, comprising:
monitoring the relative position of the hand with respect to the control device, and varying an intensity of illumination associated with the illumination source in dependence on the relative position of the hand with respect to the control device.
5. The method of claim 4, comprising:
increasing the intensity of illumination in dependence on a decrease of a distance associated with the relative position of the hand with respect to the control device.
6. The method of claim 4 or 5, comprising:
decreasing the intensity of illumination in dependence on an increase of a distance associated with the relative position of the hand with respect to the control device.
7. The method of any preceding claim, comprising:
determining if the hand is the vehicle occupant’s left or right hand; and controlling the illumination source in dependence on whether the hand is the vehicle occupant’s left or right hand.
8. The method of any preceding claim, comprising:
determining if the hand is oriented palm upwards or downwards within the volume of space relative to the control device; and determining if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
9. The method of any preceding claim, comprising:
determining if the hand is a hand of a driver of the vehicle or of a passenger of the vehicle; and controlling the illumination source in dependence on whom the hand belongs to.
10. The method of any of claims 7 to 9, comprising:
determining a direction of entry of the hand into the volume of space relative to the control device.
11. The method of any preceding claim, comprising:
obtaining image data of the hand within the volume of space;
receiving a reflectance signal reflected from the hand;
determining a distance of the hand from a designated origin in dependence on the reflectance signal; and determining the relative position of the detected hand with respect to the control device in dependence on the distance of the hand relative to the designated origin, the obtained image data, and a known distance of the control device relative to the designated origin.
12. The method of claim 11, wherein the designated origin is coincident with the position of an image capture device.
13. A controller for controlling an illumination source associated with a control device located in a vehicle cabin, the controller comprising:
an input configured to receive image data obtained by an image capture device;
a processor arranged in use to:
recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device; and determine a position of the hand with respect to the control device; and an output arranged in use to output a control signal to the illumination source in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
14. The controller of claim 13, wherein the processor is arranged in use to determine the position of the hand with respect to a control proximity boundary associated with the control device, the control proximity boundary defining a boundary offset from the control device by the predefined threshold distance; and the output is arranged in use to output the control signal in dependence on the position of at least a portion of the hand intersecting the control proximity boundary.
15. The controller of claim 13 or 14, wherein the processor is arranged in use to:
recognise the hand in image data associated with a sequence of images captured by the image capture device, the sequence of images comprising two or more image frames; and determine if the position of the hand with respect to the control device varies in the sequence of images; and wherein the output is arranged in use to output a control signal configured to vary an intensity of illumination of the illumination source in dependence on the determined variation of the position of the hand with respect to the control device.
16. The controller of any one of claims 13 to 15, wherein the output is arranged in use to output a control signal configured to increase the intensity of illumination in dependence on a decrease in a distance associated with the position of the hand with respect to the control device.
17. The controller of claim 15 or 16, wherein the output is arranged in use to output a control signal configured to decrease the intensity of illumination in dependence on an increase in a distance associated with the position of the hand with respect to the control device.
18. The controller of any one of claims 13 to 17, wherein the processor is arranged in use to determine if the hand is the vehicle occupant’s left or right hand; and the output is arranged in use to output the control signal in dependence on whether the hand is the vehicle occupant’s left or right hand.
19. The controller of claim 18, wherein the processor is arranged in use to determine if the hand is oriented palm upwards or downwards within the volume of space relative to the control device; and determine if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
20. The controller of any one of claims 13 to 19, wherein the processor is arranged in use to determine if the hand is a hand of a driver or of a passenger of the vehicle; and the output is arranged in use to output the control signal in dependence on whom the hand belongs to.
21. The controller of any one of claims 18 to 20, wherein the processor is arranged in use to determine a direction of entry of the hand into the volume of space relative to the control device.
22. The controller of any one of claims 13 to 21, wherein:
the input is configured to receive data from a time-of-flight (ToF) image capture device comprising a sensor, the ToF image capture device being arranged in use to obtain image data of the hand, to illuminate the hand, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor, the data comprising the image data and the time of return of the reflected illumination signal; and wherein the processor is arranged in use to determine the relative position of the hand with respect to the control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the control device relative to the sensor, wherein the distance of the hand from the sensor is determined in dependence on the time of return of the reflected illumination signal.
23. A system for controlling an illumination source associated with a control device located in a vehicle cabin, the system comprising: the controller of any one of claims 13 to 22 in combination with an image capture device.
24. The system of claim 23, wherein the image capture device comprises a time of flight (ToF) image capture device.
25. The system of claim 23 or 24, comprising an illumination source.
26. The system of claim 25, wherein the illumination source is arranged to provide a backlighting function to the control device.
27. A vehicle comprising the controller of any one of claims 13 to 22, or the system of any one of claims 23 to 26.
28. A vehicle configured to carry out the method of any one of claims 1 to 12.
29. A computer program product comprising instructions for carrying out the method of any one of claims 1 to 12.
30. A computer readable data carrier having stored thereon instructions for carrying out the method of any one of claims 1 to 12.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1719071.1A GB2568512B (en) | 2017-11-17 | 2017-11-17 | Vehicle controller |
DE102018218479.8A DE102018218479A1 (en) | 2017-11-17 | 2018-10-29 | VEHICLE CONTROL |
Publications (3)
Publication Number | Publication Date |
---|---|
GB201719071D0 GB201719071D0 (en) | 2018-01-03 |
GB2568512A true GB2568512A (en) | 2019-05-22 |
GB2568512B GB2568512B (en) | 2021-01-13 |
Family
ID=60805745
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000159011A (en) * | 1998-11-19 | 2000-06-13 | Daimlerchrysler Ag | Method for confirming operating element of motor vehicle in darkness |
US6774505B1 (en) * | 1998-07-17 | 2004-08-10 | Lear Automotive Dearborn, Inc. | Vehicle switch assembly with proximity activated illumination |
KR20090095695A (en) * | 2008-03-06 | 2009-09-10 | 현대자동차주식회사 | Device for adjusting brightness of vehicle display |
US20150298603A1 (en) * | 2013-11-21 | 2015-10-22 | Ford Global Technologies, Llc | Printed led storage compartment |
US20150353006A1 (en) * | 2014-06-04 | 2015-12-10 | Volkswagen Ag | Predictive cockpit lighting and performance mode via touch |
WO2016155960A1 (en) * | 2015-04-01 | 2016-10-06 | Zf Friedrichshafen Ag | Actuating device and method for actuating at least one vehicle function |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102016201704A1 (en) * | 2016-02-04 | 2017-08-10 | Bayerische Motoren Werke Aktiengesellschaft | A gesture recognition apparatus and method for detecting a gesture of an occupant of a vehicle |
FR3047942B1 (en) * | 2016-02-24 | 2019-04-05 | Valeo Vision | INTERIOR LIGHTING DEVICE OF A MOTOR VEHICLE INTERIOR |