GB2568511A - Vehicle controller

Publication number: GB2568511A
Application number: GB1719070.3A
Authority: GB (United Kingdom)
Inventor: Hasedzic Elvir
Assignee (original and current): Jaguar Land Rover Ltd
Legal status: Granted; Active
Application filed by Jaguar Land Rover Ltd
Priority to GB1719070.3A (published as GB201719070D0; granted as GB2568511B)
Related application: DE102018218480.1A (published as DE102018218480A1)
Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/22 Display screens
    • B60K35/50 Instruments characterised by their means of attachment to or integration in the vehicle
    • B60K35/53 Movable instruments, e.g. slidable
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/141 Activation of instrument input devices by approaching fingers or pens
    • B60K2360/146 Instrument input by gesture
    • B60K2360/1464 3D-gesture
    • B60K2360/20 Optical features of instruments
    • B60K2360/21 Optical features of instruments using cameras


Abstract

A method of deploying a user operated control device (15, 605, 610, 701, 705, 801) within a vehicle cabin (1, Fig. 1). The method comprises detecting a hand (17) of a vehicle occupant (3) within a volume of space (9) within the vehicle cabin; determining a position of the hand relative to the user operated control device; and deploying the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance. Also claimed is a controller comprising a processor to carry out the method using input image data. Time-of-Flight (ToF) imaging may be used as the input, and the method may be able to distinguish between the hand of a driver and that of a passenger, whether it is a left or right hand, whether the palm of the hand is facing upwards or downwards, or the direction of entry of the hand into the volume of space. The user operated control device may be stowed substantially flush with a body panel (see Figs. 6a, 7a, 8a) and may comprise: a switch, a display unit, a ventilation control device or a ventilation vent. The method may further detect particular input gestures.

Description

At least one drawing originally filed was informal and the print reproduced here is taken from a later filed formal copy.

[Drawing sheets 1/10 to 10/10. Fig. 3 flowchart label: “Deploy/stow user operated control device”.]
VEHICLE CONTROLLER
TECHNICAL FIELD
The present disclosure relates to a vehicle controller. Aspects of the invention relate to a controller for controlling the deployment of a user operated control device located in a vehicle cabin, to a vehicle comprising the controller, to a system for controlling the deployment of a user operated control device located in a vehicle cabin, and to a method of controlling the deployment of a user operated control device located in a vehicle cabin.
BACKGROUND
Current vehicles, in particular automobiles, are often provided with one or more user operated control devices, such as switches, infotainment systems, displays and other controls located within the vehicle cabin, which are configured to enable a user to alter various settings of the vehicle. It is common for each user operated control device to be located in a position easily accessible to the occupants of the vehicle, such as on a body panel of the vehicle cabin. In some circumstances, some of the user operated control devices will be located in areas which are easily accessible to specific vehicle occupants in dependence on their function. For example, user operated control devices which control driving related functions such as traction control settings or driving mode settings may be placed in locations which are easily accessible only to the driver of the vehicle.
A disadvantage associated with the aforementioned prior art solutions is that there is limited space available for locating such user operated control devices, resulting in multiple control devices being placed in close proximity to one another. This may lead to control devices being accidentally operated. This is particularly true for the driver of a vehicle, who may accidentally activate a control device whilst maintaining line of sight on the road. In order to prevent accidental activation of certain control devices, the relevant control devices are frequently provided in a stowed configuration. In this way it is first necessary to deploy a control device before it may be operated. This solution reduces the likelihood of unintentional activation; however, it is difficult to deploy the control device without diverting one’s gaze towards it. This is particularly true for the driver of the vehicle, for whom diverting line of sight from the road ahead can impede the ability to operate the vehicle.
It is an aim of at least certain embodiments of the present invention to ameliorate disadvantages associated with the prior art and in particular to facilitate the deployment of stowed control devices within a vehicle cabin.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a method, a controller, a system, a vehicle, a computer program product and a computer readable data carrier as claimed in the appended claims.
According to an aspect of the present invention there is provided a method of deploying a user operated control device within a vehicle cabin. The method may comprise detecting a vehicle occupant’s hand within a volume of space within the vehicle cabin; determining a position of the hand relative to the user operated control device; and deploying the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
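As a concrete illustration, the deployment criterion above can be sketched as follows. This is a minimal sketch only; the function names, point sets and coordinates are assumptions for illustration, not part of the claims:

```python
import math

def distance(p, q):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def should_deploy(hand_points, device_position, threshold):
    """Deploy when any detected portion of the hand is within the
    predefined threshold distance of the user operated control device.

    hand_points: list of (x, y, z) points detected on the hand.
    device_position: (x, y, z) of the stowed control device.
    """
    if not hand_points:  # no hand detected within the volume of space
        return False
    nearest = min(distance(p, device_position) for p in hand_points)
    return nearest <= threshold

# Example: fingertip 8 cm from the device, threshold 10 cm
hand = [(0.40, 0.10, 0.55), (0.35, 0.12, 0.50)]
device = (0.40, 0.10, 0.47)
print(should_deploy(hand, device, 0.10))  # True
```

The check uses the nearest point on the hand, matching the claim language “at least a portion of the hand”.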
In this way it is possible to selectively deploy a user operated control device in dependence on a vehicle occupant’s hand being located within a predefined threshold distance of the desired control device. This significantly facilitates deployment and may be carried out without the vehicle occupant diverting their gaze to look at the control device. This is particularly convenient for drivers of a vehicle.
In one embodiment, the method may comprise monitoring the position of the vehicle occupant’s hand relative to a deployed user operated control device; and stowing the deployed user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
The user may be required to withdraw their hand to an intermediate distance greater than the predefined threshold distance (optionally for a time period greater than or equal to a predefined time period) between successive deployment or stowing (or vice versa) operations to avoid false triggering. Thus, the deployed user operated control device may be stowed in dependence on a distance between at least a portion of the hand and the user operated control device transitioning from being greater than a predefined threshold distance to being less than or equal to a predefined threshold distance.
Accordingly, in certain embodiments the user operated control device may be deployed and subsequently stowed (or vice versa) by a vehicle occupant successively placing at least a portion of their hand at a distance from the user operated control device less than or equal to the predefined threshold distance. This provides a ‘toggle’ type action / operation for deploying and stowing (or vice versa) the user operated control device.
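The toggle behaviour with an intervening withdrawal can be modelled as a small state machine. The class and attribute names below are illustrative assumptions:

```python
class ToggleDeployer:
    """Toggle a control device between stowed and deployed each time the
    hand comes within the threshold distance, requiring the hand to first
    withdraw beyond the threshold between successive operations
    (re-arming), so that a lingering hand does not false-trigger."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.deployed = False
        self.armed = True  # ready to accept the next approach

    def update(self, hand_distance):
        if hand_distance <= self.threshold and self.armed:
            self.deployed = not self.deployed  # deploy or stow
            self.armed = False                 # ignore until withdrawal
        elif hand_distance > self.threshold:
            self.armed = True                  # hand withdrawn: re-arm
        return self.deployed

t = ToggleDeployer(threshold=0.10)
t.update(0.30)   # far away: stays stowed
t.update(0.08)   # first approach: deploys
t.update(0.05)   # still close: no re-trigger
t.update(0.25)   # withdrawn beyond threshold: re-armed
t.update(0.07)   # second approach: stows again
```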
The method may comprise determining a position of the hand relative to a proximity boundary associated with the user operated control device. The proximity boundary may define a boundary offset from the user operated control device by the predefined threshold distance. The user operated control device may be deployed in dependence on at least a portion of the hand intersecting the proximity boundary.
In certain embodiments, the method may comprise monitoring the position of the hand with respect to a deployed user operated control device, and stowing the deployed user operated control device in dependence on the distance between the portion of the hand and the user operated control device being greater than the predefined threshold distance. This provides a convenient way of stowing the user operated control device once the vehicle occupant’s hand has been retracted from the deployed control device by the predefined threshold distance.
The method may comprise stowing the deployed user operated control device in dependence on the distance between the portion of the hand and the user operated control device being greater than the predefined threshold distance for a period of time greater than or equal to a predefined threshold time period. This prevents inadvertent stowing of the deployed control device when the vehicle occupant’s hand is retracted to a distance in excess of the predefined threshold distance.
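The timed stowing condition might look like the following sketch, where `now` is an externally supplied timestamp in seconds (all names are illustrative assumptions):

```python
class DwellStower:
    """Stow a deployed device only after the hand has remained beyond the
    threshold distance for at least `hold_time` seconds, preventing
    inadvertent stowing on a brief withdrawal of the hand."""

    def __init__(self, threshold, hold_time):
        self.threshold = threshold
        self.hold_time = hold_time
        self.beyond_since = None  # time the hand first moved beyond

    def should_stow(self, hand_distance, now):
        if hand_distance <= self.threshold:
            self.beyond_since = None  # hand near again: reset the timer
            return False
        if self.beyond_since is None:
            self.beyond_since = now
        return (now - self.beyond_since) >= self.hold_time

s = DwellStower(threshold=0.10, hold_time=2.0)
s.should_stow(0.30, now=0.0)   # beyond threshold, timer starts
s.should_stow(0.30, now=1.0)   # only 1 s elapsed: keep deployed
s.should_stow(0.30, now=2.5)   # 2.5 s elapsed: stow
```

In a real controller, `now` would typically come from a monotonic clock rather than wall-clock time.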
In certain embodiments, the method may comprise determining if the hand is the vehicle occupant’s left or right hand and deploying the user operated control device in dependence on whether the hand is the vehicle occupant’s left or right hand. This may comprise determining if the hand is oriented palm upwards or downwards within the volume of space relative to the user operated control device; and determining if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand. Analysis of the orientation of the hand provides a convenient way of determining whether the hand is a vehicle occupant’s left or right hand.
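A toy version of the palm-orientation heuristic is sketched below. The rule, combining palm orientation with the thumb side as seen by an overhead camera, is a hypothetical simplification: in practice a classifier trained on the image data would make this determination.

```python
def classify_hand(palm_up, thumb_points_right):
    """Infer left vs right hand from palm orientation as seen from an
    overhead camera, with fingers pointing away from the occupant.
    Hypothetical rule: palm down, a right hand has its thumb on the
    left of the image; turning the palm up mirrors the hand, so the
    thumb side flips."""
    if palm_up:
        return "right" if thumb_points_right else "left"
    return "left" if thumb_points_right else "right"
```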
The method may comprise determining which vehicle occupant the hand belongs to and deploying the user operated control device in dependence on which vehicle occupant the hand belongs to. In this way, advantageously, it is possible to selectively restrict control of the deployment of a user operated control device in dependence on which vehicle occupant the hand belongs to. This may be particularly advantageous where deployment of a specific user operated control device needs to be restricted to specific vehicle occupants, such as the driver of the vehicle.
The method may comprise determining a direction of entry of the hand into the volume of space relative to the user operated control device. The direction of entry of the hand may be indicative of which vehicle occupant the hand belongs to, which in turn may be used for selectively restricting deployment of the user operated control device to specific vehicle occupants.
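For illustration, the entry-direction heuristic and the per-occupant restriction could be combined as follows. The coordinate convention (x > 0 pointing towards the right-hand side of the cabin) and all names are assumptions, shown for a right-hand-drive cabin:

```python
def occupant_from_entry(entry_vector, driver_side="right"):
    """Guess which occupant a hand belongs to from the lateral component
    of its direction of entry into the monitored volume of space.
    A hand reaching towards the centre console from the driver's seat
    moves away from the driver's side of the cabin."""
    x, _, _ = entry_vector
    if driver_side == "right":
        return "driver" if x < 0 else "passenger"
    return "driver" if x > 0 else "passenger"

def may_deploy(device_allowed_for, occupant):
    """Restrict deployment of a device to specific occupants,
    e.g. driving-mode controls reserved for the driver."""
    return occupant in device_allowed_for

# A hand entering leftwards in a right-hand-drive cabin: the driver
print(occupant_from_entry((-0.5, 0.0, 0.1)))
```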
The method may comprise obtaining image data of the hand within the volume of space; receiving a reflectance signal reflected from the hand; determining a distance of the hand from a designated origin in dependence on the received reflectance signal; and determining the relative position of the detected hand with respect to the user operated control device in dependence on the distance of the hand relative to the designated origin, the obtained image data, and a known distance of the user operated control device relative to the designated origin. The time taken for the reflectance signal to be measured by a receiver is proportional to the distance of the hand from the sensor, and therefore provides a convenient way for distance information associated with the position of the hand to be obtained. In this way it is possible to determine the position of the hand relative to the user operated control device on the basis of a two-dimensional image of the hand relative to the user operated control device, and distance information of the hand. This significantly simplifies the hardware required to carry out the method, and in particular obviates the need for using a complex system of two or more cameras, each configured to capture different perspective images of the hand relative to the user operated control device, from which the distance of the hand relative to the user operated control device may be determined.
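The geometry described above, a single two-dimensional image plus a per-pixel distance, can be sketched with a pinhole back-projection. The focal lengths, principal point and positions below are made-up example values:

```python
import math

def backproject(pixel, depth, fx, fy, cx, cy):
    """Back-project an image pixel with a measured distance into a 3-D
    point in the camera frame (pinhole model; fx, fy are focal lengths
    in pixels, (cx, cy) is the principal point)."""
    u, v = pixel
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return (x, y, depth)

def hand_to_device_distance(pixel, depth, device_position,
                            fx=500, fy=500, cx=320, cy=240):
    """Distance between an imaged hand point and a control device whose
    position relative to the camera (the designated origin) is known."""
    hx, hy, hz = backproject(pixel, depth, fx, fy, cx, cy)
    dx, dy, dz = device_position
    return math.sqrt((hx - dx) ** 2 + (hy - dy) ** 2 + (hz - dz) ** 2)

# Hand imaged at the principal point, 0.60 m away; device 0.65 m on axis
d = hand_to_device_distance((320, 240), 0.60, (0.0, 0.0, 0.65))
print(round(d, 2))  # 0.05
```

A single camera with depth information thus suffices; no second perspective view is needed to recover the hand-to-device distance.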
In certain embodiments the designated origin may be coincident with the position of an image capture device.
In certain embodiments, the user operated control device may have a stowed position in which the user operated control device lies substantially flush with a body panel within the vehicle cabin, and a deployed position in which the user operated control device projects from the body panel and is operable by an occupant of the vehicle.
Optionally, the user operated control device may comprise any one or more of: a switch, a display unit, a ventilation control device and a ventilation vent.
In certain embodiments, the method may comprise determining if the hand is performing a predefined gesture within the volume of space relative to the user operated control device and deploying the user operated control device in dependence on whether the hand is performing the predefined gesture. Moreover, the user operated control device may be deployed and/or stowed in dependence on the specific predefined gesture performed by the hand. Making deployment and/or stowing dependent on a specific gesture being captured further improves the robustness of the system and avoids accidental deployment.
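The gesture gating might be sketched as a mapping from recognised gestures to deploy/stow actions, applied only when the hand is within the threshold distance. The gesture labels are hypothetical; in practice they would come from a gesture recogniser run on the image data:

```python
# Hypothetical mapping of recognised gestures to device actions
GESTURE_ACTIONS = {"open_palm": "deploy", "fist": "stow"}

def action_for(distance_m, threshold_m, gesture):
    """Return the deploy/stow action for a recognised gesture, but only
    when the hand is within the predefined threshold distance; any
    unrecognised gesture (or a distant hand) triggers nothing."""
    if distance_m > threshold_m:
        return None
    return GESTURE_ACTIONS.get(gesture)

print(action_for(0.05, 0.10, "open_palm"))  # deploy
```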
According to a further aspect of the invention there is provided a controller for deploying a user operated control device located in a vehicle cabin. The controller may comprise an input configured to receive image data obtained by an image capture device, a processor and an output. The processor may be arranged in use to:
recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device; and determine a position of the hand with respect to the user operated control device. The output may be arranged in use to output a control signal to the user operated control device for deploying the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance. The present controller benefits from the same advantages as set out in respect of the preceding aspects of the invention.
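The claimed input / processor / output split can be sketched as follows; `hand_detector` is a hypothetical stand-in for the real image-processing step, mapping image data to a 3-D hand point (or `None` when no hand is present):

```python
import math

class DeploymentController:
    """Sketch of the controller: an input receiving image data, a
    processor that locates the hand, and an output that emits a control
    signal when the hand comes within the threshold distance of the
    user operated control device."""

    def __init__(self, hand_detector, device_position, threshold, output):
        self.detect = hand_detector       # image data -> hand point or None
        self.device_position = device_position
        self.threshold = threshold
        self.output = output              # callable receiving the signal

    def on_image(self, image_data):
        hand = self.detect(image_data)
        if hand is None:
            return  # no hand within the volume of space
        if math.dist(hand, self.device_position) <= self.threshold:
            self.output("DEPLOY")

# Wiring example with a trivial detector that passes the point through
signals = []
ctrl = DeploymentController(lambda img: img, (0.0, 0.0, 0.0), 0.10,
                            signals.append)
ctrl.on_image((0.0, 0.0, 0.05))  # within 10 cm: control signal emitted
```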
In one embodiment, the processor may be arranged to monitor the position of the vehicle occupant’s hand relative to a deployed user operated control device; and stow the deployed user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
The user may be required to withdraw their hand to an intermediate distance greater than the predefined threshold distance (optionally for a time period greater than or equal to a predefined time period) between successive deployment or stowing (or vice versa) operations to avoid false triggering. Thus, the deployed user operated control device may be stowed in dependence on a distance between at least a portion of the hand and the user operated control device transitioning from being greater than a predefined threshold distance to being less than or equal to a predefined threshold distance.
Accordingly, in certain embodiments the user operated control device may be deployed and subsequently stowed (or vice versa) by a vehicle occupant successively placing at least a portion of their hand at a distance from the user operated control device less than or equal to the predefined threshold distance. This provides a ‘toggle’ type action / operation for deploying and stowing (or vice versa) the user operated control device.
The processor may be configured in use to determine the relative position of the hand with respect to a proximity boundary associated with the user operated control device.
The proximity boundary may define a boundary offset from the user operated control device by the predefined threshold distance. The output may be arranged in use to output the control signal in dependence on the position of at least a portion of the hand intersecting the proximity boundary.
In certain embodiments, the processor may be arranged to determine the position of the hand with respect to a deployed user operated control device, and the output may be arranged in use to output a control signal for stowing the deployed user operated control device in dependence on the distance between the portion of the hand and the user operated control device being greater than the predefined threshold distance. The output may optionally be arranged in use to output the control signal for stowing the deployed user operated control device in dependence on the distance between the portion of the hand and the user operated control device being greater than the predefined threshold distance for a period of time greater than or equal to a predefined threshold time period.
The processor may be arranged in use to determine if the hand is the vehicle occupant’s left or right hand and the output may be arranged in use to output the control signal in dependence on whether the hand is the vehicle occupant’s left or right hand.
The processor may be arranged in use to determine if the hand is oriented palm upwards or downwards within the volume of space relative to the control device; and determine if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
The processor may be arranged in use to determine the vehicle occupant the hand belongs to and the output may be arranged in use to output the control signal in dependence on which vehicle occupant the hand belongs to.
The processor may be arranged in use to determine a direction of entry of the hand into the volume of space relative to the control device.
In certain embodiments the input may be configured to receive data from a time-of-flight (ToF) image capture device comprising a sensor. The ToF image capture device may be arranged in use to obtain image data of the hand, to illuminate the hand within the volume of space, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor. The data may comprise the image data and the time of return of the reflected illumination signal. The processor may be arranged in use to determine the relative position of the hand with respect to the control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the control device relative to the sensor, wherein the distance of the hand from the sensor is determined in dependence on the time of return of the reflected illumination signal. The ToF image capture device provides a convenient means for obtaining image data associated with image object distance data, and therefore simplifies determining the distance of the hand from the user operated control device.
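The proportionality between time of return and distance is simply the round trip of the illumination at the speed of light:

```python
C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_seconds):
    """One-way distance of the reflecting hand from the ToF sensor: the
    emitted illumination travels out to the hand and back, so the
    distance is half the round-trip time multiplied by the speed of
    light."""
    return C * round_trip_seconds / 2.0

# A round trip of about 4 ns corresponds to roughly 0.6 m
print(round(tof_distance(4e-9), 2))  # 0.6
```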
In certain embodiments, the user operated control device may have a stowed position in which the user operated control device lies substantially flush with a body panel within the vehicle cabin, and a deployed position in which the user operated control device projects from the body panel and is operable by an occupant of the vehicle. The output may be arranged in use to output a control signal configured to transition the user operated control device between the stowed position and the deployed position (or vice versa).
In accordance with yet a further aspect of the invention, there is provided a system for deploying a user operated control device located in a vehicle cabin. The system may comprise the aforementioned controller and an image capture device. Optionally, the image capture device may comprise a time-of-flight (ToF) image capture device. The system may also optionally comprise a user operated control device. Optionally, the user operated control device may comprise any one or more of: a switch, a display unit, a ventilation control device and a ventilation vent.
In accordance with yet a further aspect of the invention, there is provided a vehicle comprising the aforementioned controller, or the aforementioned system.
In accordance with yet a further aspect of the invention, there is provided a vehicle configured to carry out the aforementioned method.
In accordance with yet a further aspect of the invention, there is provided a computer program product comprising instructions for carrying out the aforementioned method.
The computer program product may comprise instructions, which when executed on a processor, configure the processor to: detect a hand of a vehicle occupant within a volume of space within the vehicle cabin; determine the relative position of the detected hand with respect to the control device; and deploy the user operated control device in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
In accordance with yet a further aspect of the invention, there is provided a computer readable data carrier having stored thereon instructions for carrying out the aforementioned method. Optionally, the computer readable data carrier comprises a non-transitory computer readable data carrier.
The data carrier may comprise instructions, which when executed on a processor, configure the processor to: detect a hand of a vehicle occupant within a volume of space within the vehicle cabin; determine the relative position of the detected hand with respect to the control device; and deploy the user operated control device in dependence on a distance between at least a portion of the hand and the control device being less than or equal to a predefined threshold distance.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic cut-away illustration of a front portion of a vehicle cabin, showing a camera whose field of view is arranged to obtain image data of a vehicle occupant’s hand within a volume of space within the vehicle cabin;
Figure 2 is a schematic illustration of a controller configured to output a control signal for deploying a user operated control device located in the vehicle cabin of Figure 1;
Figure 3 is a process flow chart outlining a method for deploying a user operated control device within a vehicle cabin, in dependence on the proximity of a vehicle occupant’s hand to the control device, using the camera of Figure 1;
Figure 4 is a schematic illustration highlighting the principle of operation of a Time-of-Flight (ToF) camera, which may be used to determine the position of a vehicle occupant’s hand within the vehicle cabin of Figure 1;
Figures 5a and 5b are schematic illustrations showing a three-dimensional point cloud of a vehicle occupant’s hand generated using the ToF camera of Figure 4;
Figures 6a and 6b are schematic illustrations showing two user operated control devices, in respectively a stowed configuration and in a deployed configuration in dependence on the proximity of the vehicle occupant’s hand to the control devices;
Figures 7a and 7b are schematic illustrations showing a vehicle display unit, in respectively a stowed configuration and in a deployed configuration in dependence on the proximity of the vehicle occupant’s hand to the display unit;
Figures 8a and 8b are schematic illustrations showing a ventilation control device, in respectively a stowed configuration and in a deployed configuration in dependence on the proximity of a hand to the ventilation control device; and
Figure 9 is a schematic illustration of a vehicle comprising the camera of Figure 1 and the controller of Figure 2.
DETAILED DESCRIPTION
Figure 1 is a cut-away perspective view of a portion of the vehicle cabin 1, and in particular shows the driver 3 sitting in the driver’s seat 5. An image capture device in the form of a camera 7, having a field of view 9 delineated in Figure 1 by lines 11, is shown located in the cabin roof. Optionally, the camera 7 may comprise a Time-of-Flight (ToF) camera. The camera 7 is arranged to image objects located within the camera’s field of view 9. The field of view defines a volume of space within the vehicle cabin within which objects are imaged by the camera 7. The camera 7 is arranged such that a body panel 13 of the vehicle 1 lies within the camera’s field of view 9. The body panel 13 comprises a plurality of different user operated control devices 15, which may relate to, but are not limited to: air ventilation switches; air conditioning switches; vehicle infotainment system; vehicle display units; air circulation switches; and any other control device configured to operate a control system of the vehicle. Each user operated control device 15 may be arranged to have both a deployed configuration and stowed configuration. In the deployed configuration, the control device 15 is operable by the user to control at least one associated control system of the vehicle 1. In the stowed configuration the control device is inoperable. In certain embodiments, the stowed user operated control device 15 may lie substantially flush with the body panel 13, and the deployed control device 15 projects from the body panel enabling operation of the control device by a vehicle occupant. The vehicle cabin 1 may be comprised in the vehicle 43 of Figure 9.
In certain embodiments, the camera 7 may be operatively coupled to a controller 19 (shown in Figure 2), which is configured to receive image data obtained by the camera 7 and to output a control signal to a deployment and/or stowing mechanism (not shown) associated with a specific user operated control device 15, in dependence on an analysis of the received image data. This enables the associated user operated control device 15 to be selectively deployed and/or stowed, to render the user operated control device 15 operable or inoperable respectively. The deployment and stowing mechanisms may comprise any one or more devices which physically transition the user operated control device 15 between the two configurations. For example, the deployment and stowing mechanism may relate to a mechanical motor configured to deploy or stow the control device 15. A single device may be used to enable both the deployment and stowing functionality described above. In the ensuing description the term ‘deployment mechanism’ may be used to refer to any mechanism which is configured to provide the aforementioned deployment and/or stowing functionality.
Figure 2 provides a functional overview of the controller 19. The controller 19 may be functionally embedded into an existing electronic control unit of the vehicle 1. The controller 19 may be provided with an input 21 and an output 23. The input 21 may be configured to receive image data obtained by the camera 7, and the output 23 may be configured to output a control signal to the user operated control device 15, and specifically to a deployment mechanism associated with the given user operated control device 15. For the purposes of the present description the deployment mechanism will be taken to be an inherent component of the user operated control device 15, such that any reference to outputting a control signal to the user operated control device 15 may be understood as a control signal output to the deployment mechanism of the user operated control device. The controller 19 may additionally comprise a processor 25 arranged to analyse image data received from the camera 7, to identify image objects such as the hand 17 of a vehicle occupant within the obtained image data, and to generate and output control signals for deploying and/or stowing the given user operated control device 15, in dependence on the relative position of the hand 17 with respect to the user operated control device 15.
In use, when an image of a vehicle occupant’s hand is obtained by the camera 7 and its position relative to a given user operated control device 15 is determined by the controller 19, typically by the processor 25 of the controller 19, the controller 19 may automatically deploy the given user operated control device 15, if it is in the stowed configuration, via a control signal output to the deployment mechanism associated with that user operated control device 15.
In this way, the user operated control device 15 to which the imaged vehicle occupant’s hand 17 is determined to be closest may be deployed without requiring physical contact between the vehicle occupant’s hand 17 and the desired user operated control device 15. The result is that the specific user operated control device 15 that the vehicle occupant is interested in operating is selectively and individually operable.
In certain embodiments, the controller 19 may be configured in use to output the control signal in dependence on a distance between at least a portion of a vehicle occupant’s hand 17 and the desired user operated control device 15 being less than or equal to a predefined threshold distance. For example, as the camera 7 obtains image data of a vehicle occupant’s hand, such as the driver’s hand 17, the controller 19 may be configured to identify the image of the hand within the received image data. The relative position of the imaged hand with respect to a desired user operated control device 15 may then be determined, from the obtained image data. In order to identify an image of a hand, the controller 19, and specifically the processor 25 may be configured with image recognition software configured to identify a vehicle occupant’s hand 17 located within the camera’s field of view 9, from obtained image data.
Figure 3 is a process flow chart outlining the method used in accordance with certain embodiments of the invention to control deployment of the user operated control device 15 in dependence on the proximity of a vehicle occupant’s hand to the desired user operated control device 15, using the camera 7 in operative communication with the controller 19. The method is initiated by the camera 7 obtaining image data within the vehicle cabin 1, at step 301. In certain embodiments the camera 7 may be configured to obtain image data continuously, or periodically at a predefined frequency. The obtained image data may be forwarded to the controller 19 for analysis where, at step 303, it is analysed to identify a vehicle occupant’s hand 17 within the obtained image data. As mentioned previously, this may comprise the use of image recognition software. Once a vehicle occupant’s hand 17 has been identified within the obtained image data, the position of the hand 17 is determined relative to the user operated control device 15, at step 305. Where the vehicle body panel 13 comprises a plurality of different user operated control devices 15, step 305 may comprise determining the position of the hand 17 relative to the nearest user operated control device 15. The position of the hand 17 relative to the user operated control device 15 may be determined by the processor 25. At step 307 it is determined if at least a portion of the hand 17 lies at a distance that is less than or equal to a predefined threshold distance from the user operated control device 15. If it is determined that no portion of the hand lies within the predefined threshold distance, then the processor 25 continues to analyse the received image data, and the method returns to step 303.
If instead it is determined by the processor 25 that at least a portion of the identified hand lies within the predefined threshold distance of the user operated control device 15, then the processor generates a control signal for output to the relevant user operated control device, at step 308. Upon receipt of the control signal, at step 310, the desired user operated control device is deployed. Specifically, the user operated control device’s deployment mechanism is activated to enable deployment of the desired user operated control device 15.
The foregoing description of Figure 3 assumes that the desired user operated control device is in the stowed configuration, and the vehicle occupant wishes to transition it to the deployed configuration. However, the same method may be adopted to transition a deployed user operated control device to its stowed configuration, with the exception that at step 310 rather than deploying the user operated control device, it is stowed. Accordingly, in certain embodiments the process of Figure 3 may comprise an additional step in which the processor 25 determines the current configuration of the user operated control device 15 that is closest to the vehicle occupant’s hand. This may occur after step 307. If the current configuration is determined to be a stowed configuration, then the control signal output at step 308 may comprise instructions for deploying the user operated control device 15. If instead the current configuration is determined to be a deployed configuration, then the control signal output at step 308 may comprise instructions for stowing the user operated control device 15.
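As an illustration only, the decision logic of Figure 3, including the stow/deploy toggle described above, might be sketched as follows. The class, function names and threshold value are hypothetical and not taken from the patent:

```python
# Illustrative sketch of the Figure 3 decision logic (steps 303-310),
# including the optional toggle between stowed and deployed configurations.
# All names here are hypothetical; the patent does not prescribe an API.

THRESHOLD_CM = 5.0  # an example value within the disclosed 1 cm to 10 cm range


class ControlDevice:
    def __init__(self):
        self.deployed = False  # stowed configuration initially

    def actuate(self):
        # Control signal to the deployment mechanism: transition between
        # the stowed and deployed configurations (step 310).
        self.deployed = not self.deployed


def process_frame(hand_distance_cm, device):
    """Steps 305-308: compare the hand-to-device distance with the
    predefined threshold and output a control signal if within it."""
    if hand_distance_cm is None:          # no hand identified; step 303 repeats
        return False
    if hand_distance_cm <= THRESHOLD_CM:  # step 307
        device.actuate()                  # step 308
        return True
    return False
```

In this sketch the same control signal toggles the device: a stowed device deploys, and a deployed device stows, matching the additional configuration-check step described above.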
In certain embodiments the predefined threshold distance may relate to a few centimetres, for example any distance within the range of 1cm to 10cm, including 1cm and 10cm. In certain embodiments the predefined threshold may delineate a control proximity boundary surrounding and offset from the user operated control device 15 by the predefined threshold distance, which when intersected by at least a portion of the vehicle occupant’s hand causes the controller 19 to generate the control signal for output to the relevant control device.
The control proximity boundary may be geometrically shaped. For example, the control proximity boundary may be box-shaped, or spherically shaped. Effectively, the control proximity boundary relates to a volume of space offset from the user operated control device 15 by the predefined threshold distance. In dependence on any portion of the control proximity boundary being intersected by at least a portion of the vehicle occupant’s hand, the controller generates the control signal for deploying or stowing, as the case may be, the associated control device. It is to be appreciated that not all of the portions of the control proximity boundary need to be offset from the user operated control device 15 by the predefined threshold distance. For example, where the control proximity boundary is box-shaped (e.g. cube shaped), it is to be appreciated that some faces of the cube may not be offset from the user operated control device 15 by the predefined threshold distance.
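By way of a non-limiting sketch, a box-shaped control proximity boundary reduces to a point-in-volume test, with the boundary intersected when any measured point of the hand falls inside it. The coordinate frame and function names below are assumed for illustration:

```python
# Point-in-box test for a box-shaped control proximity boundary.
# The boundary is treated as an axis-aligned volume offset from the control
# device; coordinates are assumed to share a common cabin reference frame.

def inside_boundary(point, box_min, box_max):
    """Return True if a 3-D point lies within the boundary volume."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))


def hand_intersects_boundary(hand_points, box_min, box_max):
    """The boundary is intersected if at least a portion of the hand
    (any measured point) lies within the volume."""
    return any(inside_boundary(p, box_min, box_max) for p in hand_points)
```

A spherical boundary would substitute a distance-from-centre comparison for the per-axis test; the intersection rule is otherwise unchanged.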
In certain embodiments, in order to enable the position of the hand 17 to be determined relative to the user operated control device 15, the camera 7 may relate to a Time-of-Flight (ToF) camera, in which each captured image pixel is associated with a distance on the basis of a time of return of a reflected illumination signal. To achieve this, the ToF camera may be configured with an illumination source arranged to illuminate the camera’s field of view. The incident illumination signal is subsequently reflected by objects present in the camera’s field of view, and the time of return of the reflected illumination signal is measured. In this way it is possible to associate a distance measurement to each imaged object. The illumination signal may relate to any electro-magnetic signal, and need not be comprised in the visible spectrum. For example, in certain embodiments the illumination signal may operate in the infrared spectrum.
In those embodiments comprising a ToF camera 27, the controller 19, and specifically the input 21, may be configured to receive both camera image data and image object distance information data from the ToF camera 27. This enables the controller 19, and more specifically the processor 25, to determine the position of the vehicle occupant’s hand 17 relative to a user operated control device 15 from the received data.
Figure 4 is a schematic diagram illustrating the principle of operation of a ToF camera 27. A modulated illumination source 29 is used to illuminate a desired target 31. The incident illumination 33 is reflected by the target 31 and captured on a sensor 35 comprising an array of pixels. However, whilst simultaneously capturing the reflected modulated light 37, the pixels of the sensor 35 also capture visible light reflected from the target. Since the illumination signal 33 is modulated, it may be distinguished from the visible light reflected from the target 31, which enables the time of flight of the modulated illumination signal to be measured. The time of flight taken for the modulated illumination signal to reach the target 31 and be reflected back is measured when the reflected signal arrives at the sensor 35. In this way, each captured image pixel may be associated with a distance of the corresponding image object on the basis of the measured time of flight of the reflected modulated illumination signal 37. More specific details regarding the operation of ToF cameras are widely available in the art, and for this reason a more detailed discussion is not necessary for present purposes.
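The underlying relation is simply that the one-way distance is half the round trip travelled at the speed of light; continuous-wave ToF cameras commonly recover the round trip from the phase shift of the modulation rather than timing it directly. A minimal sketch, for illustration only:

```python
import math

# Distance from a measured round-trip time of flight: the signal travels
# to the target and back, so the one-way distance is c * t / 2.

C = 299_792_458.0  # speed of light in m/s


def distance_from_tof(round_trip_seconds):
    return C * round_trip_seconds / 2.0


def distance_from_phase(phase_shift_rad, modulation_hz):
    # Continuous-wave variant: one full 2*pi cycle of phase shift
    # corresponds to c / (2 * f) of one-way distance.
    return (phase_shift_rad / (2.0 * math.pi)) * C / (2.0 * modulation_hz)
```

The phase-based variant is ambiguous beyond one modulation wavelength, which is why practical ToF cameras choose modulation frequencies giving an unambiguous range larger than the scene, here the cabin interior.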
Where the camera 7 of Figure 1 comprises a ToF camera 27, it is possible to generate a three-dimensional point cloud of the vehicle occupant’s hand located within the camera’s field of view 9. Figures 5a and 5b illustrate an example of a three-dimensional point cloud 39 of the vehicle occupant’s hand 17, generated using the ToF camera 27. In certain embodiments the controller 19 may be configured to generate the three-dimensional point cloud using the image data and image object distance information received from the ToF camera 27. Figure 5a shows a point cloud 39 of the vehicle occupant’s hand 17 as it is approaching a rectangular-shaped control proximity boundary 41. In Figure 5b a portion of the point cloud 39 of the vehicle occupant’s hand 17 is intersecting a portion of the control proximity boundary 41. In this event, and as mentioned previously, the controller 19 is configured to generate a control signal for deploying and/or stowing, as the case may be, the desired user operated control device.
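A point cloud such as that of Figures 5a and 5b can be produced by back-projecting each valid depth pixel through a pinhole camera model. The following sketch assumes known intrinsic parameters (focal lengths fx, fy and principal point cx, cy), which the patent does not specify:

```python
# Back-project a ToF depth image into a 3-D point cloud in the camera frame.
# fx, fy are focal lengths and (cx, cy) the principal point, all in pixels;
# the values used are illustrative only.

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """depth: 2-D array of per-pixel distances along the optical axis.
    Returns a list of (x, y, z) points, skipping invalid (zero) pixels."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:  # no return / invalid pixel
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points
```

The resulting camera-frame points can then be tested directly against a control proximity boundary expressed in the same frame.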
In order to enable the position of the vehicle occupant’s hand 17 to be determined relative to a user operated control device 15, the position of the user operated control device 15 relative to the ToF camera 27 may be determined. Again, this may be done using image recognition software. Since the position of the user operated control device 15 relative to the ToF camera 27 is known, and the position of the vehicle occupant’s hand 17 relative to the ToF camera 27 is known, the position of the vehicle occupant’s hand 17 relative to the user operated control device 15 may be determined using basic trigonometry. In certain embodiments, and in order to facilitate computation during use, the controller 19 may be provided with distance information of each user operated control device 15 relative to the ToF camera 27 during an initial configuration of the ToF camera 27. This distance information may be stored and accessed for subsequent use when it is needed. This facilitates subsequent computation of the position of the hand relative to a desired user operated control device 15, since only the distance of the vehicle occupant’s hand 17 with respect to the ToF camera 27, and its position relative to the known position of the user operated control device 15, require calculation, both of which may be obtained from data captured by the ToF camera 27.
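Once both positions are expressed in the ToF camera’s coordinate frame, the hand-to-device separation reduces to a single vector operation. A sketch, with illustrative names; the patent does not prescribe an implementation:

```python
import math

# Both positions are expressed in the ToF camera's coordinate frame: the
# device position is stored at initial configuration time, and the hand
# position is measured per frame from the ToF data.

def hand_to_device_distance(hand_xyz, device_xyz):
    """Euclidean distance between a hand point and the control device."""
    return math.dist(hand_xyz, device_xyz)
```

This distance is the quantity compared against the predefined threshold at step 307 of Figure 3.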
Figures 6a and 6b show an example use scenario wherein, in response to at least a portion of a vehicle occupant’s hand 17 intersecting a control proximity boundary 601, one or more user operated control devices 15 associated with the control proximity boundary are configured to be deployed when in their stowed configurations. In the illustrated example the user operated control devices may comprise switches for controlling various different vehicle control systems. Figure 6a shows the scenario where a vehicle occupant’s hand 17 is at a distance greater than the predefined threshold distance and therefore does not intersect the control proximity boundary 601. As a result, the associated switches remain in their stowed configuration 605, in which the switches lie substantially flush with a vehicle body panel located within the vehicle cabin. Figure 6b shows the scenario where the vehicle occupant’s hand 17 is at a position where at least a portion of the hand intersects the control proximity boundary 601. As a result, the switches are transitioned from their stowed configuration to their deployed configuration 610. In the deployed configuration 610 the switches project from the body panel, and are operable by the vehicle occupant. Transitioning from the stowed configuration to the deployed configuration occurs in accordance with the method as described in detail above.
Figures 7a and 7b show a further example use scenario, in which the user operated control device 15 relates to a vehicle display unit 701, 705, such as comprised in a vehicle infotainment system. In Figure 7a, the display unit 701 lies in its stowed configuration, in which the display unit 701 lies substantially flush with a body panel of the vehicle 1. In Figure 7a the vehicle occupant’s hand 17 is positioned such that no part of the hand intersects with the control proximity boundary 601. As the vehicle occupant’s hand 17 intersects the control proximity boundary, as shown in Figure 7b, the display screen may be deployed to its deployed configuration 705, in which the display unit projects from the body panel.
Figures 8a and 8b show a further exemplary use scenario, in which the user operated control device 15 comprises one or more ventilation vents 801, 805. As described above in relation to the other embodiments, the ventilation vents may transition from a substantially stowed configuration 801 as shown in Figure 8a, to a deployed configuration 805, as shown in Figure 8b, in dependence on at least a portion of the vehicle occupant’s hand 17 intersecting the control proximity boundary 601. In the illustrated example, the deployed configuration of the ventilation vents comprises the ventilation vents rotating about an axis to permit the passage of air through the vents. Similarly, in certain embodiments as at least a portion of the vehicle occupant’s hand intersects the control proximity boundary 601, a ventilation control device may be deployed enabling regulation of the flow of air through the ventilation vents.
In certain embodiments it is envisaged that once the user operated control device 15 has been deployed or stowed it remains in this configuration for a predefined period of time irrespective of the position of the vehicle occupant’s hand relative to the user operated control device 15. For example, when a user operated control device 15 is in a deployed position, should the vehicle occupant then retract their hand such that it no longer lies within the predefined threshold distance, then the user operated control device 15 remains deployed for the predefined period of time before transitioning to the stowed configuration. The predefined period of time may relate to any arbitrary period of time and may be dependent on the specific user operated control device. For example, a radio tuning switch may remain deployed for several minutes after the hand has been retracted, whilst a deployed ventilation vent control switch may only remain deployed for several seconds after retraction of the vehicle occupant’s hand.
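The device-specific hold period might be realised with a simple timestamp comparison. In this sketch the hold times loosely mirror the examples above (minutes for a radio tuning switch, seconds for a ventilation control switch) and are otherwise arbitrary:

```python
# Keep a deployed device in its configuration for a device-specific hold
# period after the hand retracts beyond the threshold distance. Times are
# in seconds and are illustrative only.

HOLD_PERIOD_S = {"radio_tuner": 180.0, "vent_control": 5.0}


def should_stow(device_id, now_s, hand_left_at_s):
    """hand_left_at_s: time at which the hand last retracted beyond the
    predefined threshold distance; None while it remains within it."""
    if hand_left_at_s is None:
        return False
    return (now_s - hand_left_at_s) >= HOLD_PERIOD_S[device_id]
```

Re-entry of the hand within the threshold distance would simply reset `hand_left_at_s` to None, keeping the device deployed.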
In a further embodiment, the controller 19 may be configured to determine from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is the occupant’s left or right hand. The controller 19 may then be configured to output the control signal in dependence on whether the hand 17 is the occupant’s left or right hand. In this way it is possible to restrict deployment or stowing of a desired user operated control device 15 within the vehicle cabin dependent on whether a vehicle occupant’s left or right hand is attempting to deploy/stow the associated user operated control device 15.
In certain embodiments, the controller 19 may be configured to determine if the vehicle occupant’s hand 17 is oriented palm upwards or downwards relative to the camera 7, and to determine if the hand 17 is the vehicle occupant’s left or right hand in dependence on whether the hand is oriented palm upwards or downwards. This may be determined on the basis of the reflectance signal from the hand 17, and by image object analysis. The skin texture of a palm of a hand is different to the skin texture of the back of a hand, and as a result the amount of incident light absorbed by the palm differs to the amount of incident light absorbed by the back of the hand. Accordingly, by configuring the controller to analyse the intensity of the reflected signal, which is indicative of the amount of incident illumination absorbed by the hand, it is possible for the controller to determine whether the hand is oriented palm upwards or downwards.
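A crude form of this reflectance-based classification is a mean-intensity threshold over the hand’s pixels. The threshold, and the assumption that the palm reflects more strongly than the back of the hand, are illustrative and would require per-sensor calibration:

```python
# Sketch of a reflectance-based palm orientation classifier: the mean
# reflected intensity over the hand's pixels is compared against a
# calibrated threshold. The direction of the comparison (palm brighter
# than back of hand) is an assumption for illustration only.

def palm_orientation(hand_pixel_intensities, palm_threshold):
    mean = sum(hand_pixel_intensities) / len(hand_pixel_intensities)
    return "palm_up" if mean >= palm_threshold else "palm_down"
```

In practice this per-pixel intensity is available directly from a ToF sensor, which measures the amplitude of the reflected modulated signal alongside its phase.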
In certain embodiments, the controller 19 may be configured to determine which vehicle occupant the imaged hand belongs to, for example, whether the imaged hand 17 belongs to a driver of the vehicle or to a passenger. Deployment and/or stowing of the user operated control device may then be controlled in dependence on whom the hand belongs to. For example, deployment of the user operated control device may be controlled in dependence on the hand 17 belonging to the driver 3 of the vehicle. This helps to prevent accidental deployment or stowing of a user operated control device 15 by an unauthorised vehicle occupant.
One non-limiting way in which the controller 19 may determine which vehicle occupant the hand belongs to, is by monitoring and determining a direction of entry of the hand into the camera’s field of view 9 relative to the user operated control device 15. This may be achieved from an analysis by the controller 19 of image data obtained by the camera 7. The direction of entry of the hand 17 into the camera’s field of view 9 may be indicative of where the vehicle occupant is seated in relation to the user operated control device 15, and therefore provides a good assumption regarding which vehicle occupant the hand 17 belongs to.
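As a sketch of this heuristic, the side of the field of view from which the hand first appears can be mapped to an occupant. The coordinate convention below (driver side at positive x, control panel between the occupants at x = 0) is assumed purely for illustration:

```python
# Infer which occupant a hand belongs to from its direction of entry into
# the camera's field of view. Convention (assumed for illustration):
# x increases from the passenger side towards the driver side, with the
# control panel between the two occupants at x = 0.

def occupant_from_entry(track):
    """track: chronological (x, y) positions of the hand since it entered
    the field of view. Classify by the side from which it entered."""
    first_x = track[0][0]
    return "driver" if first_x > 0 else "passenger"
```

A fuller implementation might also use the entry velocity vector rather than the first position alone, but the underlying assumption, that the entry side indicates the seat, is the same.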
Another non-limiting way in which the controller 19 may determine which vehicle occupant the hand belongs to, is by determining from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is an occupant’s left or right hand, as described in a foregoing embodiment. This may be particularly useful for control devices disposed between two vehicle occupants occupying a common seating row within the vehicle and facing the same direction, e.g. for control devices arranged between the front seat passengers. By way of further explanation, in this example a first occupant is most likely to operate the control devices with a left hand whereas a second occupant is most likely to operate the control devices with a right hand (or vice versa). In this manner, the controller may discriminate between two vehicle occupants seated adjacent one another within the vehicle.
The non-limiting examples described above may be used independently or in combination to discriminate between multiple vehicle occupants, and to control deployment and/or stowing of the user operated control device in dependence on whom the hand belongs to.
In a further embodiment, the controller 19 may be configured to determine from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is performing a predefined gesture. The controller 19 may then be configured to output the control signal in dependence on whether the hand 17 is performing the predefined gesture. In this way, the controller 19 may be able to restrict the deployment or stowing of the user operated control device 15 not only in dependence on the relative position of the hand with respect to the user operated control device, but also in dependence on the predefined gesture. This further helps to prevent accidental deployment of a user operated control device, and for certain user operated control devices may be desirable, in particular where deployment of such user operated control devices may have safety consequences.
Whilst the preceding embodiments of the invention have been described within the context of a ToF camera, it is to be appreciated that alternative camera configurations may be used in accordance with the herein described embodiments. Any configuration of cameras may be used that enables image data of a hand relative to a user operated control device 15 to be captured, and the position of the hand relative to the user operated control device 15 to be determined. For example, a configuration of two or more cameras each configured to enable a different perspective image of the hand relative to the user operated control device 15 to be captured may also be used. In such an arrangement the different perspective images of the hand relative to the user operated control device 15 would enable the controller to determine the position of the hand with respect to the user operated control device 15 by triangulation.
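For a rectified two-camera arrangement, the triangulation mentioned above reduces to the standard disparity relation z = f·B/d. The parameter values in this sketch are illustrative only:

```python
# Depth from a rectified stereo pair: z = f * B / d, where f is the focal
# length in pixels, B the baseline between the two cameras in metres, and
# d the disparity in pixels between the hand's image in the two views.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

A larger disparity corresponds to a closer hand, so the threshold test on distance could equivalently be expressed as a threshold on disparity.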
Similarly, in an alternative embodiment, the ToF camera of the preceding embodiments may be replaced by a conventional camera, in combination with an optical ruler, such as a LIDAR for example. In such an embodiment the LIDAR provides the image object distance information, whilst the camera provides image data. The controller may be configured in such embodiments to analyse the LIDAR data in combination with the obtained image data in order to determine the position of the vehicle occupant’s hand relative to the user operated control device 15.
It is to be appreciated that many modifications may be made to the above examples and embodiments without departing from the scope of the present invention as defined in the accompanying claims.

Claims (30)

1. A method of deploying a user operated control device within a vehicle cabin, the method comprising:
detecting a vehicle occupant’s hand within a volume of space within the vehicle cabin;
determining a position of the hand relative to the user operated control device; and
deploying the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
2. The method of claim 1, comprising:
determining a position of the hand relative to a proximity boundary associated with the user operated control device, the proximity boundary defining a boundary offset from the user operated control device by the predefined threshold distance; and
deploying the user operated control device in dependence on at least a portion of the hand intersecting the proximity boundary.
3. The method of any preceding claim, comprising:
monitoring the position of the hand with respect to a deployed user operated control device; and
stowing the deployed user operated control device in dependence on the distance between the portion of the hand and the user operated control device being greater than the predefined threshold distance.
4. The method of claim 3, comprising:
stowing the deployed user operated control device in dependence on the distance between the portion of the hand and the user operated control device being greater than the predefined threshold distance for a period of time greater than or equal to a predefined threshold time period.
5. The method of any preceding claim, comprising:
determining if the hand is the vehicle occupant’s left or right hand; and
deploying the user operated control device in dependence on whether the hand is the vehicle occupant’s left or right hand.
6. The method of claim 5, comprising:
determining if the hand is oriented palm upwards or downwards within the volume of space relative to the user operated control device; and
determining if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
7. The method of any preceding claim, comprising:
determining which vehicle occupant the hand belongs to; and
deploying the user operated control device in dependence on which vehicle occupant the hand belongs to.
8. The method of any one of claims 5 to 7, comprising:
determining a direction of entry of the hand into the volume of space relative to the user operated control device.
9. The method of any preceding claim, comprising:
obtaining image data of the hand within the volume of space;
receiving a reflectance signal reflected from the hand;
determining a distance of the hand from a designated origin in dependence on the received reflectance signal; and
determining the position of the hand with respect to the user operated control device in dependence on the distance of the hand relative to the designated origin, the obtained image data and a known distance of the user operated control device relative to the designated origin.
10. The method of claim 9, wherein the designated origin is coincident with a position of an image capture device.
11. The method of any preceding claim, wherein the user operated control device has a stowed position in which the user operated control device lies substantially flush with a body panel within the vehicle cabin, and a deployed position in which the user operated control device projects from the body panel and is operable by an occupant of the vehicle.
12. The method of any preceding claim, wherein the user operated control device is any one or more of:
i. a switch;
ii. a display unit;
iii. a ventilation control device; and
iv. a ventilation vent.
13. The method of any preceding claim, comprising:
determining if the hand is performing a predefined gesture within the volume of space relative to the user operated control device; and
deploying the user operated control device in dependence on whether the hand is performing the predefined gesture.
14. A controller for deploying a user operated control device located within a vehicle cabin, the controller comprising:
an input configured to receive image data obtained by an image capture device;
a processor configured in use to:
recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device;
to determine a position of the hand with respect to the user operated control device; and
an output arranged in use to output a control signal to the user operated control device for deploying the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
15. The controller of claim 14, wherein the processor is arranged to determine the relative position of the hand with respect to a proximity boundary associated with the user operated control device, the proximity boundary defining a boundary offset from the user operated control device by the predefined threshold distance; and the output is arranged in use to output the control signal in dependence on the position of at least a portion of the hand intersecting the proximity boundary.
16. The controller of claim 14 or 15, wherein the processor is arranged in use to determine the position of the hand relative to a deployed user operated control device; and the output is arranged in use to output a control signal for stowing the deployed user operated control device in dependence on the distance between the portion of the hand and the user operated control device being greater than the predefined threshold distance.
17. The controller of claim 16, wherein the output is arranged in use to output the control signal for stowing the deployed user operated control device in dependence on the distance between the portion of the hand and the user operated control device being greater than the predefined threshold distance for a period of time greater than or equal to a predefined threshold time period.
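The stowing behaviour of claims 16 and 17 amounts to a dwell-time hysteresis: the control is stowed only after the hand has remained beyond the threshold distance for at least the threshold time period. A minimal sketch, with hypothetical names and units:

```python
class StowTimer:
    """Output a stow decision only after the hand has stayed beyond
    the threshold distance for a minimum dwell time (claims 16-17),
    so that brief withdrawals do not stow the control."""

    def __init__(self, threshold_m, dwell_s):
        self.threshold_m = threshold_m  # predefined threshold distance
        self.dwell_s = dwell_s          # predefined threshold time period
        self._outside_since = None      # when the hand last moved away

    def update(self, distance_m, now_s):
        """Return True when the stow control signal should be output."""
        if distance_m <= self.threshold_m:
            # Hand is within the threshold distance: reset the timer.
            self._outside_since = None
            return False
        if self._outside_since is None:
            # Hand has just moved beyond the threshold distance.
            self._outside_since = now_s
        return now_s - self._outside_since >= self.dwell_s
```

Moving the hand back inside the threshold at any point resets the dwell timer, matching the "for a period of time greater than or equal to" condition of claim 17.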
18. The controller of any one of claims 14 to 17, wherein the processor is arranged in use to determine if the hand is a vehicle occupant’s left or right hand; and the output is arranged in use to output the control signal in dependence on whether the hand is the vehicle occupant’s right or left hand.
19. The controller of claim 18, wherein the processor is arranged in use to determine if the hand is oriented palm upwards or downwards within the volume of space relative to the user operated control device; and to determine if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
20. The controller of any one of claims 14 to 19, wherein the processor is arranged in use to determine which vehicle occupant the hand belongs to; and the output is arranged in use to output the control signal in dependence on which vehicle occupant the hand belongs to.
21. The controller of any one of claims 18 to 20, wherein the processor is arranged in use to determine the direction of entry of the hand into the volume of space relative to the user operated control device.
22. The controller of any one of claims 14 to 21, wherein:
the input is configured to receive data from a time-of-flight (ToF) image capture device comprising a sensor, the ToF image capture device being arranged in use to obtain image data of the hand, to illuminate the hand within the volume of space, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor, the data comprising the image data and the time of return of the reflected illumination signal; and wherein the processor is arranged in use to determine the relative position of the hand with respect to the user operated control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the user operated control device relative to the sensor, wherein the distance of the hand from the sensor is determined in dependence on the time of return of the reflected illumination signal.
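The ToF ranging of claim 22 relies on the time of return of the reflected illumination signal being proportional to distance: the signal travels to the hand and back, so the one-way distance is half the round trip at the speed of light. A sketch of that conversion (illustrative only):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(time_of_return_s):
    """Convert the round-trip time of the reflected illumination
    signal into the one-way distance of the hand from the sensor:
    d = c * t / 2."""
    return SPEED_OF_LIGHT * time_of_return_s / 2.0
```

For example, a return time of 4 ns corresponds to a hand roughly 0.6 m from the sensor.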
23. The controller of any one of claims 14 to 22, wherein the user operated control device comprises a stowed position in which the user operated control device lies substantially flush with a body panel within the vehicle cabin, and a deployed position in which the user operated control device projects from the body panel and is operable by a vehicle occupant; and the output is arranged in use to output a control signal configured to transition the user operated control device between the stowed position and the deployed position.
24. A system for deploying a user operated control device located within a vehicle cabin, the system comprising: the controller of any one of claims 14 to 23 in combination with a time-of-flight (ToF) image capture device.
25. The system of claim 24, comprising a user operated control device.
26. The system of claim 25, wherein the user operated control device comprises one or more of:
i. a switch;
ii. a display unit;
iii. a ventilation control device; and
iv. a ventilation vent.
27. A vehicle comprising the controller of any one of claims 14 to 23 or a system of any one of claims 24 to 26.
28. A vehicle configured in use to carry out the method of any one of claims 1 to 13.
29. A computer program product comprising instructions for carrying out the method of any one of claims 1 to 13.
30. A computer readable data carrier having stored thereon instructions for carrying out the method of any one of claims 1 to 13.
GB1719070.3A 2017-11-17 2017-11-17 Vehicle controller Active GB2568511B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1719070.3A GB2568511B (en) 2017-11-17 2017-11-17 Vehicle controller
DE102018218480.1A DE102018218480A1 (en) 2017-11-17 2018-10-29 VEHICLE CONTROL

Publications (3)

Publication Number Publication Date
GB201719070D0 GB201719070D0 (en) 2018-01-03
GB2568511A true GB2568511A (en) 2019-05-22
GB2568511B GB2568511B (en) 2020-04-08

Family

ID=60805551

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1719070.3A Active GB2568511B (en) 2017-11-17 2017-11-17 Vehicle controller

Country Status (2)

Country Link
DE (1) DE102018218480A1 (en)
GB (1) GB2568511B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4083757A1 (en) * 2020-10-19 2022-11-02 ameria AG Touchless input interface for an electronic display using multiple sensors

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN114575685B (en) * 2022-03-14 2023-05-16 合众新能源汽车股份有限公司 Automobile door handle false touch prevention method and system based on vision

Citations (7)

Publication number Priority date Publication date Assignee Title
US20110286676A1 (en) * 2010-05-20 2011-11-24 Edge3 Technologies Llc Systems and related methods for three dimensional gesture recognition in vehicles
DE102013000083A1 (en) * 2013-01-08 2014-07-10 Audi Ag Method for operating person-specific control interface in passenger car, involves checking compound of body part as criterion for determining whether remaining residual body of operator is in predetermined location area of vehicle interior
US20140282251A1 (en) * 2013-03-15 2014-09-18 Audi Ag Interactive sliding touchbar for automotive display
US20140306814A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc Pedestrian monitoring application
CN204020591U (en) * 2014-08-28 2014-12-17 卢东煜 Concealed control dial for Vehicle Control Panel
US20150116200A1 (en) * 2013-10-25 2015-04-30 Honda Motor Co., Ltd. System and method for gestural control of vehicle systems
US20170010675A1 (en) * 2015-07-09 2017-01-12 Hyundai Motor Company Input device, vehicle having the input device, and method for controlling the vehicle

Also Published As

Publication number Publication date
GB201719070D0 (en) 2018-01-03
DE102018218480A1 (en) 2019-05-23
GB2568511B (en) 2020-04-08
