GB2568507A - Vehicle Controller - Google Patents

Vehicle Controller

Info

Publication number
GB2568507A
Authority
GB
United Kingdom
Prior art keywords
hand
control device
user operated
operated control
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1719064.6A
Other versions
GB201719064D0 (en)
Inventor
Hasedzic Elvir
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1719064.6A
Publication of GB201719064D0
Publication of GB2568507A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/11 Instrument graphical user interfaces or menu aspects
    • B60K2360/111 Instrument graphical user interfaces or menu aspects for controlling multiple devices
    • B60K2360/126 Rotatable input devices for instruments
    • B60K2360/143 Touch sensitive instrument input devices
    • B60K2360/1438 Touch screens
    • B60K2360/145 Instrument input by combination of touch screen and hardware input devices
    • B60K2360/146 Instrument input by gesture
    • B60K2360/1464 3D-gesture
    • B60K2360/20 Optical features of instruments
    • B60K2360/21 Optical features of instruments using cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method enables control of output of information associated with a user operated control device 15 for display within a vehicle cabin 1. The method comprises detecting a hand 17 of a vehicle occupant within a volume of space. The position of the hand 17 with respect to the user operated control device 15 is determined. A time-of-flight camera 7 having a field of view 9 can provide imaging of the hand 17. Information display is controlled in dependence on a distance between at least a portion of the hand 17 and the device 15 being less than or equal to a predefined threshold. Advantageously, additional movement of the hand towards the device 15 may result in further information content, such as secondary or tertiary functions of the control device 15, being displayed on the vehicle display unit 18.

Description

The present disclosure relates to a vehicle controller. Aspects of the invention relate to a controller for controlling output of information associated with a user operated control device for display on a vehicle display unit located within a vehicle cabin, to a method of controlling output of information associated with a user operated control device for display on a vehicle display unit located within a vehicle cabin, to a vehicle comprising the controller, to a vehicle arranged in use to carry out the method, to a system comprising the controller, to a computer program product comprising instructions for carrying out the method, and to a computer-readable data carrier having stored therein instructions for carrying out the method.
BACKGROUND
Current vehicles, in particular automobiles, often comprise numerous control devices, such as switches and latches, either to activate a dedicated function or to activate multiple functions or features. Normally each control device in the vehicle is labelled with a symbol or some limited text, such as an abbreviation, to describe the function it provides. This assists the vehicle occupants to understand the function associated with each control device and thus to operate the control devices in the vehicle. For example, a control device in a vehicle may be labelled with “A/C”, which is the abbreviation for Air Conditioning. Alternatively, the control device may be labelled with a logo to depict Air Conditioning.
A disadvantage associated with this known solution is that often the symbol is insufficient to accurately convey the functionality of the associated control device. Furthermore, often the functionality represented by the symbol needs to be learned before its meaning becomes clear. As a result, the functionality represented by the abbreviation and/or the symbol may be misinterpreted. Where a control device provides two or more different functions, it may be very difficult to represent the different functionality with a single symbol or with limited text. This is also true where a control device comprises a plurality of different settings, where it is extremely difficult to convey information regarding the different settings, or how to operate them, via a graphical symbol or limited text. As a result of these shortcomings, vehicle occupants are often required to consult a vehicle manual and learn the functionality associated with each symbol and/or abbreviation.
At least in certain embodiments, the present invention seeks to mitigate or overcome at least some of the above-mentioned problems.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a method, a controller, a system, a vehicle, a computer program product and a computer readable data carrier as claimed in the appended claims.
In accordance with an aspect of the present invention there is provided a method of controlling output of information associated with a user operated control device for display on a vehicle display unit located within a vehicle cabin. The method may comprise: detecting a hand of a vehicle occupant within a volume of space within the vehicle cabin; determining the position of the hand with respect to the user operated control device; and controlling output of the information to the vehicle display unit in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
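The core decision of this method can be sketched in a few lines of code. The following is an illustrative sketch only, not the claimed implementation; the names `Point3D` and `should_display_info`, and the metre-based cabin coordinate frame, are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    x: float
    y: float
    z: float

def distance(a: Point3D, b: Point3D) -> float:
    """Euclidean distance between two points in cabin coordinates (metres)."""
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2) ** 0.5

def should_display_info(hand_points: list,
                        control_position: Point3D,
                        threshold: float) -> bool:
    """Output information if ANY tracked point of the hand is within the
    predefined threshold distance of the control device, implementing the
    'at least a portion of the hand' condition."""
    return any(distance(p, control_position) <= threshold for p in hand_points)
```

Using `any()` over all tracked hand points means the condition is met as soon as a fingertip, rather than the whole hand, crosses the threshold.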
Advantageously, the method enables information about the user operated control device, such as the function and/or operation of the user operated control device, to be provided accurately and clearly to the vehicle occupant when desired. Since the information is output to the vehicle display unit an accurate and complete description may be provided. The dependence of the output of information on the position of the vehicle occupant’s hand enables the information to be outputted only when it is required by the vehicle occupant. For example, at a moment in which the vehicle occupant wishes for information related to a particular control device to be displayed, the vehicle occupant moves their hand towards the required control device and the function and/or operation is provided to the vehicle occupant, in dependence on the distance between the user operated control device and the vehicle occupant’s hand being less than or equal to the predefined threshold distance. Since the information is only displayed when required, energy usage is managed efficiently.
Furthermore and advantageously, the output of information also enables the vehicle occupant to identify which functions and settings of the user operated control device are available for use and how to operate each setting. A further advantage is that where there are multiple user operated control devices providing similar functionality, the differences in functionality provided by the devices are more clearly communicated to the vehicle occupant, without having to consult a vehicle manual or other reference guide.
The method may comprise determining the position of the hand with respect to a proximity boundary associated with the user operated control device, the proximity boundary defining a boundary offset from the user operated control device by the predefined threshold distance; and controlling output of the information in dependence on the position of at least a portion of the hand intersecting the proximity boundary.
The method may comprise monitoring the position of the hand with respect to the user operated control device, and varying the information for output in dependence on the monitored position of the hand with respect to the user operated control device. This provides a convenient way, for example, to vary the amount of information associated with the user operated control device, in dependence on the position of the vehicle occupant's hand relative to the user operated control device. Similarly, the level of detail associated with the displayed information may be varied in this way. For example, a cursory level of information may be displayed in dependence on the vehicle occupant's hand being at a first distance from the user operated control device. As the vehicle occupant moves their hand closer to the user operated control device, more detailed information may be displayed.
The method may comprise obtaining image data of the hand within the volume of space; receiving a reflectance signal reflected from the hand; determining a distance of the hand from a designated origin in dependence on the reflectance signal; and determining the relative position of the hand with respect to the user operated control device in dependence on the distance of the hand relative to the designated origin, the obtained image data, and a known distance of the user operated control device with respect to the designated origin. In this way it is possible to determine the position of the hand relative to the user operated control device on the basis of a two-dimensional image of the hand relative to the user operated control device, and distance information of the hand. This significantly simplifies the hardware required to carry out the method, and in particular obviates the need for using a complex system of two or more cameras, each configured to capture different perspective images of the hand relative to the user operated control device, from which the distance of the hand relative to the user operated control device may be determined.
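The combination described here, a 2-D pixel location, a measured distance, and the known position of the control device relative to the designated origin, can be illustrated with a standard pinhole-camera back-projection. This is an assumption for illustration; the patent does not specify a camera model, and the intrinsic parameters and function names below are hypothetical:

```python
import math

def back_project(u, v, depth, fx, fy, cx, cy):
    """Convert a pixel (u, v) with a measured depth (metres, along the
    optical axis) into a 3-D point in the camera frame, using pinhole
    intrinsics: focal lengths (fx, fy) and principal point (cx, cy)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

def hand_to_control_distance(hand_px, hand_depth, control_xyz, intrinsics):
    """Distance between the hand's 3-D position and a control device whose
    position relative to the camera (the designated origin) is known."""
    fx, fy, cx, cy = intrinsics
    hand_xyz = back_project(hand_px[0], hand_px[1], hand_depth, fx, fy, cx, cy)
    return math.dist(hand_xyz, control_xyz)
```

Because a single camera supplies both the image and the depth, no second perspective view is needed to recover the hand's position, which is the simplification the paragraph describes.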
Optionally, the designated origin may be coincident with a position of an image capture device.
The method may comprise determining if the hand is the vehicle occupant’s left or right hand; and controlling output of the information in dependence on whether the hand is the vehicle occupant’s left or right hand. In embodiments, the method may comprise determining if the hand is orientated palm upwards or downwards within the volume of space relative to the user operated control device; and determining if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand. Analysis of the orientation of the hand provides a convenient way of determining whether the hand is a vehicle occupant’s left or right hand.
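One way to realise the orientation-based left/right classification is to compare the thumb direction with the wrist-to-fingers axis in the image plane, flipping the result when the palm faces the camera. This is a speculative sketch under a fixed sign convention, not the method claimed in the patent:

```python
def classify_hand(hand_axis, thumb_dir, palm_up: bool) -> str:
    """Classify a hand as left or right from its orientation as seen by a
    roof-mounted camera looking down.

    hand_axis: 2-D image-plane vector from the wrist towards the middle finger.
    thumb_dir: 2-D image-plane vector from the wrist towards the thumb.
    palm_up:   True if the palm faces the camera.

    Assumed convention: with the palm facing away from the camera (palm
    down), a right hand yields a positive z-component of the cross product
    hand_axis x thumb_dir; turning the palm over inverts the relation.
    """
    cross_z = hand_axis[0] * thumb_dir[1] - hand_axis[1] * thumb_dir[0]
    is_right_when_palm_down = cross_z > 0
    if palm_up:
        return "left" if is_right_when_palm_down else "right"
    return "right" if is_right_when_palm_down else "left"
```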
The method may comprise determining which vehicle occupant the hand belongs to, and controlling output of the information in dependence on which vehicle occupant the hand belongs to. In this way, advantageously, it is possible to selectively restrict control of the output of the information to the vehicle display unit in dependence on which vehicle occupant the hand belongs to. This may be particularly advantageous where display of information to the vehicle display unit needs to be restricted to specific vehicle occupants, such as the driver of the vehicle. This may prevent unwanted interference with the display of information on the vehicle display unit by passengers within the vehicle, for example when a driver is attempting to navigate using navigation directions displayed on the vehicle display unit.
The method may comprise determining a direction of entry of the hand into the volume of space relative to the user operated control device; determining which vehicle occupant the hand belongs to in dependence on the direction of entry; and controlling output of the information in dependence on which vehicle occupant the hand belongs to. The direction of entry of the hand into the volume of space provides a good indication of which vehicle occupant the hand belongs to.
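A minimal sketch of the direction-of-entry classification follows, assuming a right-hand-drive cabin in which a hand entering the monitored volume from the right of the centre console belongs to the driver. The patent does not fix this convention, so the layout assumption and the names below are illustrative only:

```python
def occupant_from_entry(entry_velocity_x: float,
                        right_hand_drive: bool = True) -> str:
    """Classify which occupant a hand belongs to from its lateral direction
    of entry into the monitored volume. entry_velocity_x is the hand's
    lateral velocity component when first detected; negative x is assumed
    to mean the hand entered from the right-hand side of the console."""
    came_from_right = entry_velocity_x < 0
    if right_hand_drive:
        return "driver" if came_from_right else "passenger"
    return "passenger" if came_from_right else "driver"
```

A controller could then, for example, suppress the information output entirely when the hand is classified as the passenger's.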
The method may comprise outputting information associated with a mode of operation of the user operated control device. Advantageously, where the user operated control device comprises multiple modes of operation, this provides a clear and convenient way of determining a current mode of operation of a user operated control device.
In certain embodiments the user operated control device may comprise a first mode of operation and a second mode of operation, and the method may comprise: monitoring the position of the hand with respect to the user operated control device, and outputting information associated with either the first mode of operation or the second mode of operation in dependence on the distance of the hand from the user operated control device. Advantageously, this enables information regarding the different modes of operation associated with the user operated control device to be output for display in dependence on the distance of the hand relative to the user operated control device. For example, if the vehicle occupant holds their hand at a first distance from the user operated control device, information regarding the first mode of operation may be displayed, and similarly if the user holds their hand at a second distance from the user operated control device, information regarding the second mode of operation may be displayed.
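Mapping the hand's distance to one of several modes of operation can be sketched by dividing the threshold range into equal distance bands. The equal-band split is an illustrative choice; the patent does not prescribe how the first and second distances are defined:

```python
def info_for_distance(d: float, threshold: float, mode_info: list):
    """Select which mode's information to output for display.

    mode_info is ordered from the nearest band outward: with two modes,
    the closer half of the threshold range selects mode_info[0] and the
    farther half selects mode_info[1]. Outside the threshold, nothing
    is displayed."""
    if d < 0 or d > threshold:
        return None
    n = len(mode_info)
    band = min(int(d * n / threshold), n - 1)
    return mode_info[band]
```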
The method may comprise displaying the information on the vehicle display unit. The method may comprise displaying the information on any one of: a head-up display, a console display, instrument cluster display, primary or secondary touch display, rotary display. Advantageously, the present method may be used for displaying information on any one of numerous different displays located at different locations within the vehicle cabin.
In embodiments, the method may comprise determining if a distance between at least a portion of the hand and the user operated control device is less than or equal to the predefined threshold distance for a predefined period of time. The output of information to the vehicle display unit may be controlled in dependence on the distance between at least a portion of the hand and the user operated control device being less than or equal to the predefined threshold distance for the predefined period of time. Advantageously, this prevents undesired accidental display of information on the vehicle display unit, which could occur, for example, where a vehicle occupant accidentally moves their hand within the predefined threshold distance. Provided that the vehicle occupant does not maintain at least a portion of their hand at a distance equal to or less than the predefined threshold distance for the predefined period of time, the information is not output for display. This also reduces unnecessary energy usage.
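The dwell-time condition is essentially a debounce filter: information is released for display only once the hand has remained inside the threshold distance for the predefined period. A minimal sketch with hypothetical names:

```python
class DwellFilter:
    """Report that information should be displayed only after the hand has
    stayed within the threshold distance for a minimum dwell time."""

    def __init__(self, threshold_m: float, dwell_s: float):
        self.threshold_m = threshold_m
        self.dwell_s = dwell_s
        self._within_since = None  # timestamp when the hand entered range

    def update(self, distance_m: float, t_s: float) -> bool:
        """Call once per frame with the current hand distance and time."""
        if distance_m <= self.threshold_m:
            if self._within_since is None:
                self._within_since = t_s
            return (t_s - self._within_since) >= self.dwell_s
        # Hand left the threshold region: reset the dwell timer.
        self._within_since = None
        return False
```

A brief accidental sweep through the threshold region resets the timer and never triggers the display, which is the behaviour the paragraph describes.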
In accordance with a further aspect of the present invention there is provided a controller for controlling output of information associated with a user operated control device for a vehicle display unit located within a vehicle cabin. The controller may comprise: an input configured to receive image data obtained from an image capture device; a processor configured in use to: recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device; determine a position of the hand with respect to the user operated control device; and an output arranged in use to output a control signal to the vehicle display unit, the control signal comprising the information associated with the user operated control device, in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold. The controller and its associated embodiments benefit from the same advantages as set out in relation to the previous aspect and its embodiments of the invention.
Optionally, the processor may be arranged in use to determine the relative position of the hand with respect to a control proximity boundary associated with the user operated control device, the control proximity boundary defining a boundary offset from the user operated control device by the predefined threshold distance; and the output may be arranged in use to output the control signal in dependence on at least a portion of the hand intersecting the control proximity boundary.
Optionally, the input may be arranged in use to receive a sequence of images obtained by the image capture device, the sequence of images comprising two or more image frames; the processor may be arranged in use to: recognise the hand in the sequence of images; and determine if the position of the hand with respect to the user operated control device varies in the sequence of images; and wherein the output is arranged in use to output a control signal configured to vary the information for output in dependence on the determined variation of the position of the hand with respect to the user operated control device.
In embodiments, the input may be configured to receive data from a time-of-flight (ToF) image capture device comprising a sensor, the ToF image capture device being arranged in use to obtain image data of the hand, to illuminate the hand within the volume of space, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor; the data comprising the image data and the time of return of the reflected illumination signal; and wherein the processor may be arranged in use to determine the position of the hand with respect to the user operated control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the user operated control device relative to the sensor, wherein the distance of the hand from the sensor may be determined in dependence on the time of return of the reflected illumination signal.
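The stated proportionality between return time and distance follows from the round trip of the illumination signal: the measured time covers the path to the hand and back, so the one-way distance is half the speed of light multiplied by the return time. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_return_time(t_return_s: float) -> float:
    """One-way distance to the reflecting hand, given the measured
    round-trip time of the reflected illumination signal (seconds)."""
    return SPEED_OF_LIGHT * t_return_s / 2.0
```

For example, a return time of roughly 6.7 nanoseconds corresponds to a hand about one metre from the sensor, so practical ToF sensors must resolve timing differences on the order of picoseconds to achieve millimetre-level depth precision.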
Optionally, the processor may be arranged in use to determine if the hand is the vehicle occupant’s left or right hand; and the output may be arranged in use to output the control signal in dependence on whether the hand is the vehicle occupant’s left or right hand.
The processor may be arranged in use to determine if the hand is orientated palm upwards or downwards within the volume of space relative to the user operated control device, and to determine if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
In embodiments the processor may be arranged in use to determine which vehicle occupant the hand belongs to; and the output may be arranged in use to output the control signal in dependence on which vehicle occupant the hand belongs to.
In certain embodiments the processor may be arranged in use to determine a direction of entry of the hand into the volume of space relative to the user operated control device; and to determine which vehicle occupant the hand belongs to in dependence on the direction of entry.
The control signal may comprise information associated with a mode of operation of the user operated control device; and the output may be arranged in use to output the control signal to the vehicle display for displaying the information associated with the mode of operation of the user operated control device on the vehicle display unit.
In certain embodiments the user operated control device may comprise a first mode of operation and a second mode of operation, and the processor may be arranged in use to monitor the position of the hand with respect to the user operated control device; and the output may be arranged in use to output the control signal comprising information associated with either the first or the second mode of operation in dependence on the distance of the hand from the user operated control device.
Optionally, the output may be arranged in use to output the control signal to any one of: a head-up display, a console display, instrument cluster display, primary or secondary touch display, rotary display.
In accordance with yet a further aspect of the invention, there is provided a vehicle arranged in use to carry out the previously described method of controlling output of information associated with a user operated control device for display on a vehicle display unit located within a vehicle cabin. The method may comprise: detecting a hand of a vehicle occupant within a volume of space within the vehicle cabin; determining the position of the hand with respect to the user operated control device; and controlling output of the information to the vehicle display unit in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
In accordance with yet a further aspect of the invention there is provided a vehicle comprising the previously described controller for controlling output of information associated with a user operated control device for a vehicle display unit located within a vehicle cabin. The controller may comprise: an input configured to receive image data obtained from an image capture device; a processor configured in use to: recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device; determine a position of the hand with respect to the user operated control device; and an output arranged in use to output a control signal to the vehicle display unit, the control signal comprising the information associated with the user operated control device, in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold. This aspect of the invention benefits from the same advantages as set out previously in relation to the other aspects and embodiments of the invention.
In accordance with another aspect of the invention there is provided a system comprising the aforementioned controller and a time-of-flight (ToF) camera. Optionally, the system comprises a user operated control device.
In accordance with yet a further aspect of the invention there is provided a computer program product comprising instructions for carrying out the previously described method. In particular, the computer program product may comprise instructions for controlling output of information associated with a user operated control device for display on a vehicle display unit located within a vehicle cabin. When executed on a processor, the instructions may configure the processor to: detect a hand of a vehicle occupant within a volume of space within the vehicle cabin; determine a position of a hand with respect to the user operated control device; and control output of the information to the vehicle display unit in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
In accordance with yet a further aspect of the invention there is provided a computer-readable data carrier having stored thereon instructions for carrying out the previously described method. In particular, the computer-readable data carrier may comprise instructions for controlling output of information associated with a user operated control device for display on a vehicle display unit located within a vehicle cabin. When executed on a processor, the instructions may configure the processor to: detect a hand of a vehicle occupant within a volume of space within the vehicle cabin; determine a position of a hand with respect to the user operated control device; and control output of the information to the vehicle display unit in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance. Optionally, the computer-readable data carrier is a non-transitory computer readable data carrier.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic cut-away illustration of a front portion of a vehicle cabin having a camera having a field of view arranged to obtain image data of a vehicle occupant’s hand within a volume of space within a vehicle cabin;
Figures 2a and 2b are schematic illustrations of a portion of the control panel of the vehicle cabin of Figure 1, showing a vehicle display unit to which the output of information associated with a control device may be controlled in dependence on a distance between at least a portion of a vehicle occupant’s hand and the control device;
Figure 3 is a schematic illustration of the functional components of a controller configured to control output of information associated with a control device located in the vehicle cabin of Figure 1;
Figure 4 is a process flow chart outlining a method for displaying information on a vehicle display unit associated with a control device within a vehicle cabin, in dependence on the proximity of a hand to the control device, using the camera of Figure 1;
Figure 5 is a schematic illustration highlighting the principle of operation of a Time-of-Flight (ToF) camera, which may be used to determine the position of a vehicle occupant's hand within the vehicle cabin of Figure 1;
Figures 6a and 6b are schematic illustrations showing a three-dimensional point cloud of a vehicle occupant’s hand generated using the ToF camera of Figure 5; and
Figure 7 is a schematic illustration of a vehicle comprising the camera of Figure 1 and the controller of Figure 3.
DETAILED DESCRIPTION
Figure 1 is a cut-away perspective view of a portion of the vehicle cabin 1, and in particular shows the driver 3 sitting in the driver's seat 5. An image capture device in the form of a camera 7, having a field of view 9 delineated in Figure 1 by lines 11, is shown located in the cabin roof. Optionally, the camera 7 may comprise a ToF camera. The camera 7 is arranged to image objects located within the camera's field of view 9. The field of view defines a volume of space within the vehicle cabin 1 within which objects are imaged by the camera 7. The camera 7 is arranged such that a control panel 13 of the vehicle cabin 1 lies within the camera's field of view 9. The control panel 13 comprises a plurality of different user operated control devices 15, which may include, but are not limited to: air ventilation switches; air conditioning switches; vehicle infotainment system controls; air circulation switches; and any other user operated control device configured to operate a control system of the vehicle. The vehicle cabin 1 may be comprised in the vehicle 43 of Figure 7.
Figure 2a provides a perspective view of a portion of the control panel 13 of the vehicle cabin of Figure 1, showing a vehicle display unit 18 on which information associated with a user operated control device 15 may be displayed.
Figure 2b provides a perspective view of a portion of the control panel 13 of the vehicle cabin of Figure 1, showing a vehicle display unit 18 on which information associated with a user operated control device 15 may be displayed. In addition, Figure 2b shows a vehicle occupant’s hand 17 in which at least a portion of the hand is at a distance equal to or less than a predefined threshold distance from the control device 15; in response, information 16 associated with the user operated control device 15 is displayed on the vehicle display unit 18. This is explained in further detail in the ensuing description.
In certain embodiments, the camera 7 may be operatively coupled to a controller 19 (shown in Figure 3), and configured to receive image data obtained by the camera 7, and to output a control signal to the vehicle display unit 18 in dependence on an analysis of the received image data. The control signal may comprise the information associated with the control device 15. This enables selective output of information to the vehicle display unit 18.
Figure 3 provides a functional overview of the controller 19. The controller 19 may be functionally embedded into an existing electronic control unit of the vehicle 1. The controller 19 may be provided with an input 21 and an output 23. The input 21 may be configured to receive image data obtained by the camera 7, and the output 23 may be configured to output a control signal to the vehicle display unit 18, where the control signal may comprise information associated with a control device 15. The control signal may be output in dependence on a distance between at least a portion of the vehicle occupant’s hand 17 and the control device 15 being less than or equal to a predefined threshold. In addition, the controller 19 may comprise a processor 25 arranged to analyse image data received from the camera 7, to identify image objects such as the hand 17 of a vehicle occupant within the obtained image data, and to generate control signals for controlling output of information associated with the control device 15 in dependence on the relative position of the hand 17 with respect to the control device 15.
In use, as a vehicle occupant’s hand 17 is imaged by the camera 7, and its position relative to a control device 15 is determined by the controller 19, typically by the processor 25 of the controller 19, information 16 associated with the control device 15 may be displayed on the vehicle display unit 18 via a control signal output to the vehicle display unit 18. The displayed position of the information 16 on the vehicle display unit 18 may be associated with the position of the desired control device 15 relative to the vehicle display unit 18. For example, information 16 associated with the control device 15 to which the imaged vehicle occupant’s hand 17 is determined to be closest may be activated in this way and displayed on the vehicle display unit 18, as shown in Figure 2b. This means that the information 16 associated with the control device 15 that the vehicle occupant may be interested in operating is selectively displayed on the vehicle display unit 18.
In certain embodiments, the controller 19 may be configured in use to output the control signal in dependence on a distance between at least a portion of a vehicle occupant’s hand 17 and the desired control device 15 being less than or equal to a predefined threshold distance, for a predefined period of time. For example, as the camera 7 obtains image data of a vehicle occupant’s hand 17, such as the driver’s hand, the controller 19 may be configured to identify the image of the hand within the received image data. The relative position of the imaged hand with respect to a desired control device 15 may then be determined from the obtained image data. In order to identify an image of a hand, the controller 19, and specifically the processor 25, may be configured with image recognition software configured to identify a vehicle occupant’s hand 17 located within the camera’s field of view 9, from obtained image data. From the relative position of the imaged hand with respect to the control device 15, the distance between the vehicle occupant’s hand 17 and the control device 15 may be calculated and compared with the predefined threshold distance. If the calculated distance is less than the predefined threshold distance, the controller 19 may be configured to determine the period of time for which the calculated distance remains less than the predefined threshold distance. If this period of time equals or exceeds the predefined period of time, the controller 19 may output a control signal to the vehicle display unit 18 to display information 16 associated with the control device 15.
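By way of a non-limiting illustration, the dwell-time condition described above — the calculated distance remaining within the predefined threshold for a predefined period before the control signal is output — may be sketched as follows. All names, units and the injectable clock interface are illustrative assumptions, not part of the described embodiment:

```python
import time

class DwellTimer:
    """Tracks how long a hand has remained within the threshold distance
    of a control device before the display is triggered. An illustrative
    sketch only; names and structure are not taken from the description."""

    def __init__(self, threshold_cm, dwell_s, clock=time.monotonic):
        self.threshold_cm = threshold_cm   # e.g. within the 1 cm to 10 cm range
        self.dwell_s = dwell_s             # required time within the threshold
        self.clock = clock                 # injectable for testing
        self._entered_at = None

    def update(self, distance_cm):
        """Return True once the hand has stayed within the threshold
        distance for at least the dwell period; False otherwise."""
        if distance_cm <= self.threshold_cm:
            if self._entered_at is None:
                self._entered_at = self.clock()   # hand has just entered
            return (self.clock() - self._entered_at) >= self.dwell_s
        self._entered_at = None   # hand withdrew: reset the timer
        return False
```

In use, `update` would be called once per analysed image frame with the latest calculated hand-to-control distance.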
Figure 4 is a process flow chart outlining a method used in accordance with certain embodiments of the invention, to control output of information associated with a user operated control device 15 in dependence on the proximity of a vehicle occupant’s hand 17 to the control device 15, using the camera 7 of Figure 1 in operative communication with the controller 19 of Figure 3. The method is initiated by the camera 7 capturing image data within the vehicle cabin 1, at step 301. In certain embodiments the camera 7 may be configured to obtain image data continuously, or periodically at a predefined frequency. The obtained image data may be forwarded to the controller 19 for analysis where, at step 303, captured image data is analysed to identify a vehicle occupant’s hand 17 within the obtained image data. As mentioned previously, this may comprise the use of image recognition software. Once a vehicle occupant’s hand 17 has been identified within the obtained image data, the position of the hand 17 is determined relative to the user operated control device 15, at step 305. Where the vehicle control panel 13 comprises a plurality of different user operated control devices 15, step 305 may comprise determining the position of the hand 17 relative to the nearest user operated control device 15. The position of the hand 17 relative to the user operated control device 15 may be determined by the processor 25. At step 307 it is determined if at least a portion of the hand 17 lies at a distance that is less than or equal to a predefined threshold distance from the user operated control device 15. If it is determined that no portion of the hand lies within the predefined threshold distance, then the processor 25 continues to analyse subsequently obtained image data, and the method returns to step 303.
If instead it is determined by the processor 25 that at least a portion of the identified hand lies within the predefined threshold distance of the user operated control device 15, then the processor 25 generates a control signal for output to the vehicle display unit 18, at step 308. Upon receipt of the control signal, at step 310, the information associated with the user operated control device 15 is displayed on the vehicle display unit 18.
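The flow of steps 301 to 310 may be expressed, purely as a non-limiting illustration, as a single analysis pass. The camera, display and detector interfaces below are hypothetical stand-ins for the components described, not a prescribed implementation:

```python
def proximity_display_pass(camera, display, controls, threshold_cm,
                           detect_hand, distance_to):
    """One pass of the Figure 4 flow (illustrative sketch).

    detect_hand(frame) -> hand representation, or None if no hand found.
    distance_to(hand, control) -> distance in cm.
    Returns the control whose information was displayed, or None.
    """
    frame = camera.capture()                      # step 301: obtain image data
    hand = detect_hand(frame)                     # step 303: identify the hand
    if hand is None:
        return None                               # keep analysing later frames
    # Step 305: position relative to the nearest user operated control device.
    nearest = min(controls, key=lambda c: distance_to(hand, c))
    if distance_to(hand, nearest) <= threshold_cm:    # step 307: threshold test
        display.show(nearest.info)                # steps 308/310: control signal
        return nearest
    return None
```

In practice this pass would run continuously or at the camera’s predefined capture frequency.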
In certain embodiments, the predefined threshold distance may relate to a few centimetres, for example any distance within the range of 1cm to 10cm, including 1cm and 10cm. In certain embodiments the predefined threshold may delineate a control proximity boundary surrounding and offset from the user operated control device 15 by the predetermined threshold distance, which control proximity boundary when intersected by at least a portion of the vehicle occupant’s hand 17 causes the controller 19 to generate the control signal for output to the vehicle display unit 18.
The control proximity boundary may be geometrically shaped. For example, the control proximity boundary may be box-shaped, or spherically shaped. Effectively, the control proximity boundary relates to a volume of space offset from the control device by the predefined threshold distance. In dependence on any portion of the control proximity boundary being intersected by at least a portion of the vehicle occupant’s hand, the controller 19 generates the control signal comprising the information associated with the user operated control device 15. It is to be appreciated that not all of the portions of the control proximity boundary need to be offset from the control device by the predefined threshold distance. For example, where the control proximity boundary is box-shaped (e.g. cube shaped), it is to be appreciated that some faces of the cube may not be offset from the control device by the predefined threshold distance.
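A box-shaped control proximity boundary of this kind may, for example, be tested as an axis-aligned box against the points of the imaged hand. This is an illustrative sketch only; the description does not prescribe any particular representation of the boundary:

```python
def intersects_boundary(points, box_min, box_max):
    """Return True if any point of the hand's point cloud lies inside the
    axis-aligned, box-shaped control proximity boundary.

    points  : iterable of (x, y, z) tuples for the hand
    box_min : (x, y, z) minimum corner of the boundary box
    box_max : (x, y, z) maximum corner of the boundary box
    """
    return any(
        all(lo <= p[i] <= hi
            for i, (lo, hi) in enumerate(zip(box_min, box_max)))
        for p in points
    )
```

A spherically shaped boundary would instead compare each point’s distance from the control device against the boundary radius.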
In certain embodiments, in order to enable the position of the vehicle occupant’s hand 17 to be determined relative to the user operated control device 15, the camera 7 may relate to a 3D mapping controller arranged to generate a 3D model of the hand within the field of view 9. For example, in certain embodiments the camera 7 may relate to a Time-of-Flight (ToF) camera, in which each obtained image pixel is associated with a distance on the basis of a time of return of a reflected illumination signal. To achieve this, the ToF camera may be configured with an illumination source arranged to illuminate the camera’s field of view. The incident illumination signal is subsequently reflected by objects present in the camera’s field of view, and the time of return of the reflected illumination signal is measured. In this way it is possible to associate a distance measurement to each imaged object. The illumination signal may relate to any electro-magnetic signal, and need not be comprised in the visible spectrum. For example, in certain embodiments the illumination signal may operate in the infrared spectrum.
In those embodiments where the camera 7 comprises a ToF camera, the controller 19, and specifically the input 21 may be configured to receive both camera image data and image object distance information data from the ToF camera. This enables the controller 19, and more specifically the processor 25 to determine the position of the vehicle occupant’s hand 17 relative to a user operated control device 15 from the received data.
Figure 5 is a schematic diagram illustrating the principle of operation of a ToF camera 27. A modulated illumination source 29 is used to illuminate a desired target 31. The incident illumination 33 is reflected by the target 31 and captured on a sensor 35 comprising an array of pixels. Whilst capturing the reflected modulated light 37, the pixels of the sensor 35 also capture visible light reflected from the target. Since the illumination signal 33 is modulated, it may be distinguished from the visible light reflected from the target 31, which enables the time of flight of the modulated illumination signal to be measured. The time of flight is the time taken for the modulated illumination signal to travel to the target 31 and be reflected back to the sensor 35. In this way, each obtained image pixel may be associated with a distance of the corresponding image object on the basis of the measured time of flight of the reflected modulated illumination signal 37. More specific details regarding the operation of ToF cameras are widely available in the art, and for this reason a more detailed discussion is not necessary for present purposes.
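The distance measurement underlying this principle follows from halving the round-trip time of the modulated illumination signal; a minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, metres per second

def tof_distance_m(round_trip_s):
    """Distance of an imaged object from the ToF sensor, given the measured
    round-trip time of the reflected modulated illumination signal.
    The signal travels to the target and back, hence the factor of two."""
    return C * round_trip_s / 2.0
```

For example, a round-trip time of roughly 6.7 nanoseconds corresponds to an object about one metre from the sensor, which indicates the timing resolution a ToF sensor must achieve.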
Where the camera 7 of Figure 1 comprises a ToF camera 27, it is possible to generate a three-dimensional point cloud of the vehicle occupant’s hand located within the camera’s field of view 9. Figures 6a and 6b illustrate an example of a three-dimensional point cloud 39 of the vehicle occupant’s hand 17, generated using the ToF camera 27. In certain embodiments the controller 19 may be configured to generate the three-dimensional point cloud using the image data and image object distance information received from the ToF camera 27. Figure 6a shows a point cloud of the vehicle occupant’s hand 39 as it is approaching a rectangular-shaped control proximity boundary 41. In Figure 6b a portion of the point cloud 39 of the vehicle occupant’s hand 17 is intersecting a portion of the control proximity boundary 41. In this event, and as mentioned previously, the controller 19 is configured to generate a control signal for controlling output of information associated with a user operated control device 15 in dependence on the relative position of the hand 17 with respect to the control device 15.
In order to enable the position of the vehicle occupant’s hand 17 to be determined relative to a user operated control device 15, the position of the control device 15 relative to the ToF camera may be determined. Again, this may be done using image recognition software. Since the position of the user operated control device 15 relative to the ToF camera 27 is known, and the position of the vehicle occupant’s hand 17 relative to the ToF camera 27 is known, the position of the vehicle occupant’s hand 17 relative to the user operated control device 15 may be determined using trigonometry. In certain embodiments, and in order to facilitate computation during use, the controller 19 may be provided with distance information of each control device 15 relative to the ToF camera 27 during an initial configuration of the ToF camera 27. This distance information may be stored and accessed for subsequent use. This facilitates subsequent computation of the position of the hand relative to the user operated control device, since only the position of the vehicle occupant’s hand 17 with respect to the ToF camera 27 then requires calculation; the position of the hand relative to the known, stored position of the user operated control device 15 follows directly.
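With both positions expressed relative to the ToF camera 27 acting as a common origin, the hand-to-control distance reduces to a straightforward Euclidean calculation, for example:

```python
import math

def hand_to_control_distance(hand_xyz, control_xyz):
    """Distance between the hand and a user operated control device, where
    both positions are 3D coordinates expressed relative to the same
    designated origin (here, the ToF camera). An illustrative sketch;
    coordinate conventions are an assumption."""
    return math.dist(hand_xyz, control_xyz)

def nearest_control(hand_xyz, control_positions):
    """Given a dict of {control_name: (x, y, z)} stored at initial
    configuration, return the nearest control and its distance."""
    name = min(control_positions,
               key=lambda n: math.dist(hand_xyz, control_positions[n]))
    return name, math.dist(hand_xyz, control_positions[name])
```

The stored per-control positions captured at initial configuration play the role of `control_positions` here, so only the hand’s camera-relative position changes frame to frame.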
In certain embodiments the controller 19 may be configured to deactivate the output of information associated with a control device 15 in dependence on the distance between at least a portion of the vehicle occupant’s hand 17 and the control device 15 being less than or equal to the predefined threshold distance, in a similar manner to that previously described. In such embodiments it is envisaged that the vehicle occupant may cause the displayed information to be removed by once again moving their hand to a distance less than or equal to the predefined threshold distance.
In alternative embodiments the controller 19 may be configured to control the output of information associated with a control device 15 in dependence on the distance between at least a portion of the vehicle occupant’s hand 17 and the control device 15 being greater than or equal to the predefined threshold distance. For example, just as the output of information associated with the control device 15 is initiated by the vehicle occupant positioning their hand such that at least a portion of the hand lies at or within the predetermined threshold distance of the user operated control device 15, the output of information may be ceased by the vehicle occupant retracting their hand to a position that lies at a distance from the control device 15 greater than the predetermined threshold distance.
In certain embodiments it is envisaged that once the information is displayed on the vehicle display unit it remains displayed for a predefined period of time irrespective of the position of the vehicle occupant’s hand relative to the control device. For example, should the vehicle occupant retract their hand such that it no longer lies within the predefined threshold distance, then information associated with the user operated control device remains displayed for the predefined period of time before ceasing to be displayed. The predefined period of time may relate to several seconds, e.g. twenty seconds, or any user defined time period.
In certain embodiments, the information content associated with the control device 15 and displayed on the vehicle display unit 18 may be varied as the distance between the vehicle occupant’s hand 17 and the control device 15 changes. For example, once at least a portion of the vehicle occupant’s hand 17 lies at a distance equal to the predefined threshold distance, a first information content, such as the primary function of the control device 15, may be displayed. Further movement of the vehicle occupant’s hand towards the user operated control device 15, thus decreasing the distance between the vehicle occupant’s hand 17 and the control device 15, may result in a second information content, such as one or more options associated with the primary function of the control device 15, being displayed on the vehicle display unit 18. Additional further movement of the vehicle occupant’s hand towards the user operated control device 15, thus further decreasing the distance between the vehicle occupant’s hand 17 and the control device 15, may result in a third information content, such as secondary or tertiary functions of the control device 15, being displayed on the vehicle display unit 18.
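The tiered display of first, second and third information content may be modelled, purely illustratively, as a mapping from the current hand-to-control distance to a content level. The band edges below are assumed example values, not figures from the description; because the mapping depends only on the current distance, it behaves symmetrically as the hand approaches or is retracted:

```python
def info_content_level(distance_cm, threshold_cm=10.0, bands=(10.0, 6.0, 3.0)):
    """Map hand-to-control distance to a tier of information content:
    0 = nothing displayed, 1 = first content (primary function),
    2 = second content (options), 3 = third content (secondary/tertiary
    functions). Band edges are ordered outermost first and are
    illustrative assumptions."""
    if distance_cm > threshold_cm:
        return 0
    level = 0
    for edge in bands:
        if distance_cm <= edge:
            level += 1
    return level
```

The same mapping could equally drive the first and second modes of operation, or the increasing granularities of information, described later.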
One way in which this may be achieved is by the controller 19 continuously analysing received image and distance data, and adapting the information content of the output control signal in dependence on the distance between the vehicle occupant’s hand 17 and the user operated control device 15 changing.
Similarly, the controller 19 may be adapted to output a control signal for adapting the information content associated with the control device 15 and displayed on the vehicle display unit 18 as the vehicle occupant’s hand is retracted. For example, when a third information content is displayed on the vehicle display unit 18, movement of the vehicle occupant’s hand away from the user operated control device 15, thus increasing the distance between the vehicle occupant’s hand 17 and the control device 15, may result in the second information content being displayed on the vehicle display unit 18. Additional further movement of the vehicle occupant’s hand away from the user operated control device 15, thus further increasing the distance between the vehicle occupant’s hand 17 and the control device 15, may result in the first information content being displayed on the vehicle display unit 18. Once the vehicle occupant’s hand 17 lies at a distance from the control device 15 that is greater than the predefined threshold distance, no information associated with the control device 15 may be displayed on the vehicle display unit 18. Alternatively, the information associated with the control device 15 may be displayed on the vehicle display unit 18 for a predefined period of time before being removed.
In certain embodiments, the controller 19 may be configured to output information associated with a mode of operation of the user operated control device 15 from which at least a portion of the vehicle occupant’s hand 17 is determined to lie at a distance less than or equal to the predetermined threshold distance. The user operated control device 15 may comprise a first mode of operation and a second mode of operation. The controller 19 may be configured to monitor the position of the hand 17 with respect to the user operated control device 15 and output information associated with either the first mode of operation or the second mode of operation, in dependence on the distance of the hand 17 from the user operated control device 15. For example, once at least a portion of the vehicle occupant’s hand 17 lies at a distance equal to the predefined threshold distance the controller may output information associated with the first mode of operation associated with the user operated control device 15. Further movement of the vehicle occupant’s hand towards the control device 15, thus decreasing the distance between the vehicle occupant’s hand 17 and the user operated control device 15, may result in the controller outputting information associated with the second mode of operation associated with the user operated control device 15. Similarly, for example, if information associated with the second mode of operation is being displayed on the vehicle display unit 18, further movement of the vehicle occupant’s hand away from the control device 15, thus increasing the distance between the vehicle occupant’s hand 17 and the user operated control device 15, may result in the controller outputting information associated with the first mode of operation associated with the user operated control device 15.
In certain embodiments, the granularity of information associated with the control device 15 and displayed on the vehicle display unit 18 may be varied as the distance between the vehicle occupant’s hand 17, and the control device 15 changes. For example, once at least a portion of the vehicle occupant’s hand 17 lies at a distance equal to the predefined threshold distance, a first granularity of information may be displayed. Further movement of the vehicle occupant’s hand towards the user operated control device 15, thus decreasing the distance between the vehicle occupant’s hand 17 and the control device 15, may result in a second granularity of information being displayed on the vehicle display unit 18. Additional further movement of the vehicle occupant’s hand towards the user operated control device 15, thus further decreasing the distance between the vehicle occupant’s hand 17 and the control device 15, may result in a third granularity being displayed on the vehicle display unit 18. The first, second and third granularities of information comprise an increasing level of detail concerning the information to be displayed on the vehicle display unit 18.
In a further embodiment, the controller 19 may be configured to determine from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is the occupant’s left or right hand. The controller 19 may then be configured to output the control signal in dependence on whether the hand 17 is the occupant’s left or right hand. In this way it is possible to restrict the display of information on the vehicle display unit 18 in dependence on whether a vehicle occupant’s left or right hand is determined to be at a distance from the control device equal to or less than the predetermined threshold distance.
In certain embodiments, the controller 19 may be configured to determine if the vehicle occupant’s hand 17 is orientated palm upwards or downwards relative to the user operated control device 15 or relative to the camera 7, and determining if the hand 17 is the vehicle occupant’s left or right hand in dependence on the orientation of the hand. This may be determined on the basis of the reflectance signal from the hand 17, and/or by image object analysis. The skin texture of a palm of a hand is different to the skin texture of the back of a hand, and as a result the amount of incident light absorbed by the palm differs to the amount of light absorbed by the back of the hand. Accordingly, by configuring the controller to analyse the intensity of the reflected signal, which is indicative of the amount of incident illumination absorbed by the hand, it is possible for the controller to determine whether the hand is orientated palm upwards or downwards.
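A reflectance-based palm-orientation test of the kind described may be sketched as a simple intensity threshold. The threshold value and the direction of the comparison are illustrative assumptions that would require calibration against real sensor data:

```python
def palm_orientation(mean_reflectance, palm_threshold=0.6):
    """Classify the hand as palm-up or palm-down from the mean intensity of
    the reflected illumination signal over the hand's pixels, exploiting
    the differing light absorption of the palm and the back of the hand.
    Illustrative only: both the 0.6 threshold and the assumption that the
    palm reflects more strongly are placeholders for calibrated values."""
    return "palm_up" if mean_reflectance >= palm_threshold else "palm_down"

def is_left_hand(orientation, thumb_on_left):
    """Infer handedness from palm orientation plus the thumb's side in the
    image (an assumed auxiliary cue from image object analysis)."""
    if orientation == "palm_down":
        return thumb_on_left is False
    return thumb_on_left is True
```

In a deployed system these heuristics would be combined with the image object analysis mentioned above rather than used alone.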
In certain embodiments, the controller 19 may be configured to determine which vehicle occupant the imaged hand belongs to, for example, whether the imaged hand 17 belongs to a driver of the vehicle or to a passenger. The output of information 16 on the vehicle display unit 18 may then be controlled in dependence on which vehicle occupant the hand belongs to. For example, the output of information associated with the user operated control device 15 on the vehicle display unit 18 may be controlled in dependence on the hand 17 belonging to the driver 3 of the vehicle. This may help to reduce driver distraction by preventing the accidental display of information on the vehicle display unit by a passenger of the vehicle. One non-limiting way in which the controller 19 may determine which vehicle occupant the hand belongs to is by monitoring and determining a direction of entry of the hand into the camera’s field of view 9 relative to the user operated control device 15. This may be achieved from an analysis by the controller 19 of image data obtained by the camera 7. The direction of entry of the hand 17 into the camera’s field of view 9 may be indicative of where the vehicle occupant is seated in relation to the user operated control device 15, and therefore provides a reasonable indication of which vehicle occupant the hand 17 belongs to.
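The direction-of-entry inference may be sketched by comparing the hand’s horizontal position on entering the field of view with its position thereafter. The right-hand-drive seat assignment below is an assumption for illustration only:

```python
def occupant_from_entry(entry_x, later_x, driver_side="right"):
    """Infer which occupant a hand belongs to from its direction of entry
    into the camera's field of view. Coordinates are image x-positions
    increasing towards the right of the cabin; a hand moving leftwards
    entered from the right-hand side. The driver_side mapping is an
    illustrative assumption (right-hand drive by default)."""
    entered_from_right = later_x < entry_x
    if driver_side == "right":
        return "driver" if entered_from_right else "passenger"
    return "passenger" if entered_from_right else "driver"
```

The display controller could then suppress the control signal when the hand is attributed to the passenger, reducing accidental display changes.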
In a further embodiment, the output 23 of the controller 19 may be arranged in use to output a control signal configured to vary the information content for output in dependence on a variation in the relative position of the vehicle occupant’s hand 17 with respect to the user operated control device 15. The input 21 may be arranged in use to receive a sequence of images obtained by the image capture device. The sequence of images may comprise two or more image frames. The processor 25 may be arranged in use to recognise the vehicle occupant’s hand 17 in the sequence of images, and determine if the position of the hand 17 with respect to the control device 15 varies in the sequence of images. The output 23 may then be arranged in use to output a control signal configured to vary the information content for display on the vehicle display unit 18, in dependence on the determined variation of the position of the hand 17 with respect to the user operated control device.
Whilst the preceding embodiments of the invention have been described within the context of a ToF camera, it is to be appreciated that alternative camera configurations may be used in accordance with the herein described embodiments. Any configuration of cameras may be used that enables image data of a hand relative to a control device to be obtained, and the position of the hand relative to the control device to be determined. For example, a configuration of two or more cameras each configured to enable a different perspective image of the hand relative to the control device to be obtained may also be used. In such an arrangement the different perspective images of the hand relative to the control device would enable the controller to determine the position of the hand with respect to the control device by triangulation.
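For such a two-camera arrangement, depth recovery by triangulation commonly follows the standard pinhole stereo relation Z = f·B/d. This relation is standard practice in the art rather than a formula recited in the description; a minimal sketch:

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of an image point from two horizontally offset, parallel
    cameras via triangulation: Z = f * B / d, where f is the focal length
    in pixels, B the camera baseline in metres, and d the disparity in
    pixels between the two perspective images. Assumes rectified images;
    illustrative values only."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Closer objects, such as a hand approaching a control device, produce larger disparities and hence smaller computed depths.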
Similarly, in an alternative embodiment, the ToF camera of the preceding embodiments may be replaced by a conventional camera, in combination with an optical ruler, such as a LIDAR for example. In such an embodiment the LIDAR provides the image object distance information, whilst the camera provides image data. The controller may be configured in such embodiments to analyse the LIDAR data in combination with the obtained image data in order to determine the position of the vehicle occupant’s hand relative to the control device.
The preceding embodiments of the invention have been described within the context of the displaying of information on a vehicle display unit in a vehicle cabin. Some non-limiting examples of vehicle display units that the invention may be configured to display information on include a head-up display (HUD), which includes any transparent display that presents data without requiring users to look away from their usual viewpoints, a console display, an instrument cluster display, a primary or secondary touch display, and a rotary display.
It is to be appreciated that many modifications may be made to the above examples and embodiments without departing from the scope of the present invention as defined in the accompanying claims.

Claims (31)

1. A method of controlling output of information associated with a user operated control device for display on a vehicle display unit located within a vehicle cabin, the method comprising:
detecting a hand of a vehicle occupant within a volume of space within the vehicle cabin;
determining the position of the hand with respect to the user operated control device; and controlling output of the information to the vehicle display unit in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
2. The method of claim 1, comprising:
determining the position of the hand with respect to a proximity boundary associated with the user operated control device, the proximity boundary defining a boundary offset from the user operated control device by the predefined threshold distance; and controlling output of the information in dependence on the position of at least a portion of the hand intersecting the proximity boundary.
3. The method of any preceding claim, comprising:
monitoring the position of the hand with respect to the user operated control device, and varying the information for output in dependence on the monitored position of the hand with respect to the user operated control device.
4. The method of any preceding claim, comprising:
obtaining image data of the hand within the volume of space;
receiving a reflectance signal reflected from the hand;
determining a distance of the hand from a designated origin in dependence on the reflectance signal; and determining the relative position of the hand with respect to the user operated control device in dependence on the distance of the hand relative to the designated origin, the obtained image data, and a known distance of the user operated control device with respect to the designated origin.
5. The method of claim 4, wherein the designated origin is coincident with a position of an image capture device.
6. The method of any preceding claim, comprising:
determining if the hand is the vehicle occupant’s left or right hand; and controlling output of the information in dependence on whether the hand is the vehicle occupant’s left or right hand.
7. The method of any preceding claim, comprising:
determining if the hand is orientated palm upwards or downwards within the volume of space relative to the user operated control device; and determining if the hand is the vehicle occupant’s left or right hand in dependence on orientation of the hand.
8. The method of any preceding claim, comprising:
determining which vehicle occupant the hand belongs to; and controlling output of the information in dependence on which vehicle occupant the hand belongs to.
9. The method of any preceding claim, comprising:
determining a direction of entry of the hand into the volume of space relative to the user operated control device;
determining which vehicle occupant the hand belongs to in dependence on the direction of entry; and controlling output of the information in dependence on which vehicle occupant the hand belongs to.
10. The method of any preceding claim, comprising:
outputting information associated with a mode of operation of the user operated control device.
11. The method of any preceding claim, wherein the user operated control device comprises a first mode of operation and a second mode of operation, and the method comprises:
monitoring the position of the hand with respect to the user operated control device, and outputting information associated with either the first mode of operation or the second mode of operation in dependence on the distance of the hand from the user operated control device.
12. The method of any preceding claim, comprising:
displaying the information on the vehicle display unit.
13. The method of claim 12, comprising:
displaying the information on any one of: a head-up display, a console display, an instrument cluster display, a primary or secondary touch display, or a rotary display.
14. The method of any preceding claim, comprising:
determining if a distance between at least a portion of the hand and the user operated control device is less than or equal to the predefined threshold distance for a predefined period of time; and controlling the output of information to the vehicle display unit in dependence on the distance between at least a portion of the hand and the user operated control device being less than or equal to the predefined threshold distance for the predefined period of time.
15. A controller for controlling output of information associated with a user operated control device for a vehicle display unit located within a vehicle cabin, the controller comprising:
an input configured to receive image data obtained from an image capture device;
a processor configured in use to:
recognise a hand of a vehicle occupant from the image data, the hand being located within a volume of space within the vehicle cabin within which the image data is obtained by the image capture device;
determine a position of the hand with respect to the user operated control device; and
an output arranged in use to output a control signal to the vehicle display unit, the control signal comprising the information associated with the user operated control device, in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance.
16. The controller of claim 15, wherein the processor is arranged in use to determine the relative position of the hand with respect to a control proximity boundary associated with the user operated control device, the control proximity boundary defining a boundary offset from the user operated control device by the predefined threshold distance; and the output is arranged in use to output the control signal in dependence on at least a portion of the hand intersecting the control proximity boundary.
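The control proximity boundary of claim 16 is a surface offset from the control device by the threshold distance. For illustration only, modelling it as a sphere around a single control centre point (the claim does not fix the boundary's shape):

```python
import math

def intersects_proximity_boundary(hand_points, control_centre, offset_mm):
    """Return True if any tracked point on the hand lies within the
    control proximity boundary, modelled here (an assumption) as a
    sphere of radius offset_mm around the control's centre point."""
    cx, cy, cz = control_centre
    return any(
        math.dist((x, y, z), (cx, cy, cz)) <= offset_mm
        for (x, y, z) in hand_points
    )
```

In practice the boundary could equally be an offset mesh around the control's actual geometry; the intersection test is the same idea.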
17. The controller of claim 15 or 16, wherein the input is arranged in use to receive a sequence of images obtained by the image capture device, the sequence of images comprising two or more image frames;
the processor is arranged in use to:
recognise the hand in the sequence of images; and
determine if the position of the hand with respect to the user operated control device varies in the sequence of images; and
wherein the output is arranged in use to output a control signal configured to vary the information for output in dependence on the determined variation of the position of the hand with respect to the user operated control device.
18. The controller of any one of claims 15 to 17, wherein the input is configured to receive data from a time-of-flight (ToF) image capture device comprising a sensor, the ToF image capture device being arranged in use to obtain image data of the hand, to illuminate the hand within the volume of space, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor; the data comprising the image data and the time of return of the reflected illumination signal; and wherein the processor is arranged in use to determine the position of the hand with respect to the user operated control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the user operated control device relative to the sensor, wherein the distance of the hand from the sensor is determined in dependence on the time of return of the reflected illumination signal.
19. The controller of any one of claims 15 to 18, wherein the processor is arranged in use to determine if the hand is the vehicle occupant’s left or right hand; and the output is arranged in use to output the control signal in dependence on whether the hand is the vehicle occupant’s left or right hand.
20. The controller of claim 19, wherein the processor is arranged in use to determine if the hand is orientated palm upwards or downwards within the volume of space relative to the user operated control device, and to determine if the hand is the vehicle occupant’s left or right hand in dependence on orientation of the hand.
21. The controller of any one of claims 15 to 20, wherein the processor is arranged in use to determine which vehicle occupant the hand belongs to; and the output is arranged in use to output the control signal in dependence on which vehicle occupant the hand belongs to.
22. The controller of any one of claims 15 to 21, wherein the processor is arranged in use to determine a direction of entry of the hand into the volume of space relative to the user operated control device; and to determine which vehicle occupant the hand belongs to in dependence on the direction of entry.
23. The controller of any one of claims 15 to 22, wherein the control signal comprises information associated with a mode of operation of the user operated control device; and the output is arranged in use to output the control signal to the vehicle display for displaying the information associated with the mode of operation of the user operated control device on the vehicle display unit.
24. The controller of any one of claims 15 to 23, wherein the user operated control device comprises a first mode of operation and a second mode of operation, and the processor is arranged in use to monitor the position of the hand with respect to the user operated control device; and the output is arranged in use to output the control signal comprising information associated with either the first or the second mode of operation in dependence on the distance of the hand from the user operated control device.
25. The controller of any one of claims 15 to 24, wherein the output is arranged in use to output the control signal to any one of: a head-up display, a console display, an instrument cluster display, a primary or secondary touch display, or a rotary display.
26. A vehicle arranged in use to carry out the method of any one of claims 1 to 14.
27. A vehicle comprising the controller of any one of claims 15 to 25.
28. A system comprising the controller of any one of claims 15 to 25 and a time-of-flight (ToF) camera.
29. The system of claim 28 comprising a user operated control device.
30. A computer program product comprising instructions for carrying out the method of any one of claims 1 to 14.
31. A computer-readable data carrier having stored thereon instructions for carrying out the method of any one of claims 1 to 14.
GB1719064.6A 2017-11-17 2017-11-17 Vehicle Controller Withdrawn GB2568507A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1719064.6A GB2568507A (en) 2017-11-17 2017-11-17 Vehicle Controller


Publications (2)

Publication Number Publication Date
GB201719064D0 GB201719064D0 (en) 2018-01-03
GB2568507A true GB2568507A (en) 2019-05-22

Family

ID=60805606

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1719064.6A Withdrawn GB2568507A (en) 2017-11-17 2017-11-17 Vehicle Controller

Country Status (1)

Country Link
GB (1) GB2568507A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040036764A1 (en) * 2002-08-08 2004-02-26 Nissan Motor Co., Ltd. Operator identifying device
CN105718188A (en) * 2014-12-19 2016-06-29 罗伯特·博世有限公司 Method of operating input device, input device, and motor vehicle
DE102015201722A1 (en) * 2015-02-02 2016-08-04 Robert Bosch Gmbh Method for operating an input device, input device
WO2017054894A1 (en) * 2015-10-01 2017-04-06 Audi Ag Interactive operating system and method for carrying out an operational action in an interactive operating system
WO2017110233A1 (en) * 2015-12-22 2017-06-29 クラリオン株式会社 In-vehicle device



Similar Documents

Publication Publication Date Title
US10832064B2 (en) Vacant parking space detection apparatus and vacant parking space detection method
US10279703B2 (en) Vehicle seat adjustment systems
US10099576B2 (en) Vehicle seat adjustment system
US9961259B2 (en) Image generation device, image display system, image generation method and image display method
JP5999032B2 (en) In-vehicle display device and program
US10499014B2 (en) Image generation apparatus
JP5261554B2 (en) Human-machine interface for vehicles based on fingertip pointing and gestures
US20140195096A1 (en) Apparatus and method for contactlessly detecting objects and/or persons and gestures and/or operating procedures made and/or carried out thereby
EP3358840A1 (en) Image processing device for vehicles
US20120314072A1 (en) Image generation apparatus
KR20180122012A (en) An operating device including a snow tracker unit and a method for calibrating an eye tracker unit of the operating device
CN109145864A (en) Determine method, apparatus, storage medium and the terminal device of visibility region
US10592078B2 (en) Method and device for a graphical user interface in a vehicle with a display that adapts to the relative position and operating intention of the user
CN109314765B (en) Display control device for vehicle, display system, display control method, and program
EP3144850A1 (en) Determination apparatus, determination method, and non-transitory recording medium
CN111107310B (en) Camera parameter estimation device and camera parameter estimation method
EP3425488A1 (en) System and method for predicting a touch position of a pointer on a touch-enabled unit or determining a pointing direction in 3d space
US10248132B2 (en) Method and apparatus for visualization of an environment of a motor vehicle
JP2013149257A (en) Adaptive interface system
EP3173278B1 (en) Vehicle display control device
CN108422932A (en) driving assistance system, method and vehicle
GB2568511A (en) Vehicle controller
GB2568669A (en) Vehicle controller
GB2570629A (en) Vehicle controller

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)