GB2568669A - Vehicle controller - Google Patents

Vehicle controller

Info

Publication number
GB2568669A
GB2568669A
Authority
GB
United Kingdom
Prior art keywords
hand
control device
user operated
operated control
vehicle occupant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1719062.0A
Other versions
GB2568669B (en)
GB201719062D0 (en)
Inventor
Hasedzic Elvir
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1719062.0A
Publication of GB201719062D0
Priority to DE102018219106.9A
Publication of GB2568669A
Application granted
Publication of GB2568669B
Legal status: Active
Classifications

    • B60K35/00 Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/213 Virtual instruments
    • B60K35/65 Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • B60N2/002 Seats provided with an occupancy detection means mounted therein or thereon
    • B60R11/0264 Arrangements for holding or mounting radio sets, television sets, telephones, or the like; arrangement of control means
    • B60K2360/11 Instrument graphical user interfaces or menu aspects
    • B60K2360/111 Instrument graphical user interfaces for controlling multiple devices
    • B60K2360/141 Activation of instrument input devices by approaching fingers or pens
    • B60K2360/146 Instrument input by gesture
    • B60K2360/1464 3D-gesture
    • B60K2360/21 Optical features of instruments using cameras
    • B60K2360/29 Holographic features
    • B60K2360/31 Virtual images
    • B60K2360/741 Instruments adapted for user detection
    • B60R2011/0276 Arrangements for control means for rear passenger use

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

A vehicle passenger 3 is enabled to control a user operated control device 15 located at least partly in a rear passenger compartment within a vehicle cabin 1. The method comprises detecting that a vehicle occupant 3 is seated in the rear passenger compartment and detecting a hand 17 of the vehicle occupant 3 within a volume of space within the rear passenger compartment. It further comprises determining a position of the hand 17 with respect to the device 15 and enabling control of the device 15 in dependence on a distance between at least a portion of the hand 17 and the device 15 being less than or equal to a predefined threshold distance, and dependent on the vehicle occupant 3 being seated on a rear seat 5. A time-of-flight camera 7 is provided with a field of view 9 for hand detection. The invention can, for example, enable infotainment control whilst wearing a safety belt.

Description

VEHICLE CONTROLLER
TECHNICAL FIELD
The present disclosure relates to a vehicle controller. Aspects of the invention relate to a controller for enabling control of a user operated control device located at least partly in a rear passenger compartment within a vehicle cabin, to a method of enabling control of a user operated control device located at least partly in a rear passenger compartment within a vehicle cabin, to a vehicle arranged in use to carry out the aforementioned method and/or comprising the aforementioned controller, to a computer program product comprising instructions to carry out the aforementioned method, to a computer readable data carrier having stored thereon instructions for carrying out the aforementioned method, and to a system comprising the aforementioned controller and a time-of-flight camera.
BACKGROUND
Modern vehicles, in particular automobiles, are often provided with one or more user operated control devices, such as switches, infotainment systems, displays and other controls located within the vehicle cabin which are configured to enable a user to alter various settings of the vehicle. This includes providing user operated control devices in the rear compartment of the vehicle cabin, which enable control of various control systems specific to the rear compartment. However, the user operated control systems are often located in areas which may be difficult to reach by a passenger seated in the rear compartment, in particular whilst wearing a seatbelt which may restrict the passenger’s freedom of movement.
Accordingly, it is an aim of at least certain embodiments of the present invention to overcome or ameliorate at least some of the shortcomings associated with the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a method, a controller, a system, a vehicle, a computer program product and a computer readable data carrier as claimed in the appended claims.
According to an aspect of the present invention there is provided a method of enabling control of a user operated control device located at least partly within a rear passenger compartment within a vehicle cabin. The method may comprise: detecting that a vehicle occupant is seated in the rear passenger compartment; detecting a hand of a vehicle occupant within a volume of space within the rear passenger compartment; determining a position of the hand with respect to the user operated control device; and enabling control of the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance, and the vehicle occupant being seated in the rear passenger compartment.
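By way of illustration only, the gating step described above may be sketched as follows. This is a hypothetical Python sketch; the function names and the threshold value are not taken from the claims:

```python
import math

# Hypothetical predefined threshold distance, in metres.
THRESHOLD_M = 0.10

def enable_control(occupant_seated, hand_point, device_point,
                   threshold=THRESHOLD_M):
    """Enable the user operated control device only if a vehicle
    occupant is seated in the rear passenger compartment and at
    least a portion of the hand is within the threshold distance
    of the device."""
    if not occupant_seated:
        return False
    return math.dist(hand_point, device_point) <= threshold
```

Note that control is enabled without any physical contact with the device, which is the behaviour relied upon for operation whilst wearing a seatbelt.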
Advantageously, the present method provides an improved way of enabling control of user operated control devices located in a rear compartment of a vehicle. In particular, because physical contact with the user operated control device is not required, operation of control devices may be enabled without requiring a passenger seated in the rear vehicle compartment to remove their seatbelt. Furthermore, because control is enabled only when a vehicle occupant is detected as seated in the rear passenger compartment, accidental activation of a user operated control device, for example by a passenger seated in the forward vehicle compartment reaching back into the rear passenger compartment, is prevented.
In certain embodiments, the method may comprise identifying a gesture of the hand and controlling the user operated control device in dependence on the identified gesture. In this way, various functions associated with the user operated control device may be controlled through the selective use of different gestures, without requiring physical interaction with the user operated control device. This may be particularly useful where the freedom of movement of the passenger is being restricted by a seatbelt.
The method may comprise determining the position of the hand with respect to a proximity boundary associated with the user operated control device. The proximity boundary may define a boundary offset from the user operated control device by the predefined threshold distance. Control of the user operated control device may be enabled in dependence on the position of at least a portion of the hand intersecting the proximity boundary.
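The proximity boundary test may be illustrated by treating the hand as a set of measured points and checking whether any point lies on or inside the offset boundary. This is a simplified sketch; a real implementation would operate on the camera's point cloud:

```python
import math

def intersects_boundary(hand_points, device_point, threshold):
    """Return True if any portion of the hand lies on or inside the
    proximity boundary, i.e. the surface offset from the device by
    the predefined threshold distance."""
    return any(math.dist(p, device_point) <= threshold
               for p in hand_points)
```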
In certain embodiments, the method may comprise determining if the hand is the vehicle occupant’s left or right hand and enabling control of the user operated control device in dependence on whether the hand is the vehicle occupant’s left or right hand. This may comprise determining if the hand is orientated palm upwards or downwards relative to the user operated control device; determining if the hand is the vehicle occupant’s left or right hand in dependence on whether the hand is orientated palm upwards or downwards; and enabling control of the user operated control device in dependence on whether the hand is the left or right hand. Analysis of the orientation of the hand provides a convenient way of determining whether the hand is a vehicle occupant’s left or right hand.
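One possible handedness rule is sketched below. The particular mapping from palm orientation to left or right hand is an assumption made for illustration, not a rule stated in the claims:

```python
def classify_hand(palm_up, reaching_from_left):
    """Hypothetical rule: for a hand reaching towards the device
    from the left of the cabin, palm-up is taken to indicate the
    left hand and palm-down the right hand; the mapping is mirrored
    for a hand reaching from the right."""
    if reaching_from_left:
        return "left" if palm_up else "right"
    return "right" if palm_up else "left"

def hand_permitted(hand, permitted_hand):
    """Enable control only for the permitted hand."""
    return hand == permitted_hand
```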
The method may comprise determining if the hand is the hand of a first vehicle occupant or of a second vehicle occupant and enabling control of the user operated control device in dependence on which vehicle occupant the hand belongs to. In this way it is also possible to restrict control of a user operated control device to specific vehicle occupants seated in the rear passenger compartment. For example, in this way operation of the rear left window may be restricted to the passenger seated in the rear left passenger seat.
In certain embodiments, the method may comprise determining the direction of entry of the hand into the volume of space relative to the user operated control device; determining if the hand is the hand of a first vehicle occupant or of a second vehicle occupant in dependence on the direction of entry; and enabling control of the user operated control device in dependence on which vehicle occupant the hand belongs to. Direction of entry offers a convenient indication of where the hand has originated from and consequently which vehicle occupant the hand belongs to.
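Direction of entry may be reduced to the lateral component of the hand's entry vector into the monitored volume. The sketch below assumes a laterally centred origin, and the seat labels are illustrative:

```python
def occupant_from_entry(entry_vector):
    """Attribute the hand to a rear occupant from its direction of
    entry: a negative lateral component is taken to mean the hand
    entered from the left, i.e. belongs to the rear-left occupant."""
    lateral, _, _ = entry_vector
    return "rear_left" if lateral < 0 else "rear_right"
```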
In certain embodiments, the method may comprise obtaining image data of the hand within the volume of space; receiving a reflectance signal reflected from the hand; determining a distance of the hand from a designated origin in dependence on the reflectance signal; and determining the relative position of the hand with respect to the user operated control device in dependence on the distance of the hand relative to the designated origin, the obtained image data, and a known distance of the user operated control device relative to the designated origin. The designated origin may be coincident with a position of an image capture device.
The method may comprise detecting if a seatbelt associated with a seat located in the rear passenger compartment is engaged, the seatbelt being arranged when engaged to secure the vehicle occupant to the seat; and enabling control of the user operated control device in dependence on the seatbelt being engaged. This may be particularly useful in helping to determine when a vehicle occupant is seated in the rear passenger compartment of the vehicle. Similarly, detection of the vehicle occupant being seated in the rear passenger compartment may also be carried out using pressure sensors embedded in the rear compartment passenger seats, in which case the method may comprise receiving a signal from at least one pressure sensor embedded in a passenger seat, and detecting that a vehicle occupant is seated in the rear passenger compartment in dependence on the received signal.
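The two occupancy cues described above may be combined as a simple disjunction. This is a sketch; the pressure threshold is a hypothetical calibration value:

```python
def occupant_seated(seatbelt_engaged, seat_pressure_n,
                    min_pressure_n=50.0):
    """A rear occupant is taken to be seated if the seatbelt buckle
    is engaged, or if the pressure sensor embedded in the seat reads
    above a minimum load (a placeholder of 50 N here)."""
    return seatbelt_engaged or seat_pressure_n >= min_pressure_n
```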
The method may comprise determining if at least a portion of the hand has been maintained for a predetermined period of time in a position in which the distance between the portion of the hand and the user operated control device is less than or equal to the predefined threshold distance; and enabling control of the user operated control device on or after lapse of the predetermined period of time. This reduces the risk of accidental enablement of the control device.
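The dwell requirement can be captured by a small stateful gate, sketched below; the 0.5 s dwell period is a placeholder, not a value taken from the claims:

```python
class DwellGate:
    """Report True only once the hand has remained within the
    threshold distance for a predetermined dwell period."""

    def __init__(self, dwell_s=0.5):
        self.dwell_s = dwell_s
        self._entered_at = None

    def update(self, within_threshold, now_s):
        """Feed one observation; now_s is a monotonic timestamp in
        seconds. Leaving the threshold region resets the timer."""
        if not within_threshold:
            self._entered_at = None
            return False
        if self._entered_at is None:
            self._entered_at = now_s
        return (now_s - self._entered_at) >= self.dwell_s
```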
In certain embodiments, the method may comprise deploying the user operated control device from a stowed position in which the user operated control device is inoperable to a deployed position in which the user operated control device is operable, in dependence on the distance between at least the portion of the hand and the user operated control device being less than or equal to the predefined threshold distance. This is useful where the user operated control device is maintained in a stowed configuration when not in use, for example to reduce the likelihood of accidental damage or operation whilst a passenger enters and/or exits the rear passenger compartment of the vehicle. This is particularly useful where the user operated control device comprises a display.
In some embodiments, the user operated control device comprises any one or more of the following: an entertainment system; a ventilation system; a passenger table; one or more window blinds; a vehicle window controller; a sun roof controller; a cup-holder; and a display unit.
According to a further aspect of the invention there is provided a controller for enabling control of a user operated control device located at least partly in a rear passenger compartment in a vehicle cabin. The controller may comprise an input configured to receive information indicative of a vehicle occupant seated in the rear passenger compartment and image data obtained by an image capture device, a processor and an output. The processor may be arranged in use to: recognise a hand of the vehicle occupant from the image, the hand being located within a volume of space within the rear passenger compartment within which image objects are captured by the image capture device; and determine a position of the hand with respect to the user operated control device. The output may be arranged in use to output a control signal for enabling control of the user operated control device to the user operated control device, in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance and the vehicle occupant being seated in the rear passenger compartment. The present controller benefits from the same advantages as set out in respect of the preceding aspects of the invention.
In certain embodiments, the processor may be configured in use to recognise a gesture of the hand and the output may be arranged in use to output a control signal for controlling the user operated control device in dependence on the gesture.
In certain embodiments, the processor may be arranged in use to determine the position of the hand with respect to a proximity boundary associated with the user operated control device. The proximity boundary may define a boundary offset from the user operated control device by the predefined threshold distance. The output may be arranged in use to output the control signal in dependence on the position of at least a portion of the hand intersecting the proximity boundary.
The processor may be arranged in use to determine if the hand is the vehicle occupant’s left or right hand, and the output may be arranged in use to output the control signal in dependence on whether the hand is the vehicle occupant’s left or right hand. This may comprise the processor being arranged in use to determine if the hand is orientated palm upwards or downwards relative to the user operated control device; and to determine if the hand is the vehicle occupant’s left or right hand in dependence on the orientation of the hand.
In certain embodiments, the processor may be arranged in use to determine the direction of entry of the hand into the volume of space relative to the user operated control device.
In certain embodiments the input may be configured in use to receive data from a time-of-flight (ToF) image capture device comprising a sensor. The ToF image capture device may be arranged in use to obtain image data of the hand, to illuminate the hand within the volume of space, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor. The data may comprise the image of the hand and the time of return of the reflected illumination signal. The processor may be arranged in use to determine the position of the hand with respect to the control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the user operated control device relative to the sensor, wherein the distance of the hand from the sensor is determined in dependence on the time of return of the reflected illumination signal. The ToF image capture device provides a convenient means for obtaining image data associated with image object distance data, and therefore simplifies determining the distance of the hand from the user operated control device.
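The proportionality between time of return and range is the standard time-of-flight relation d = c·t/2, where t is the round-trip time of the reflected illumination signal:

```python
# Speed of light in a vacuum, m/s.
C = 299_792_458.0

def tof_distance(return_time_s):
    """Distance of the reflecting surface (e.g. the hand) from the
    sensor, given the round-trip time of the reflected signal."""
    return C * return_time_s / 2.0
```

For example, a return time of 2 ns corresponds to a range of roughly 0.3 m.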
The input may be configured in use to receive data indicative of whether a seatbelt associated with a seat located in the rear passenger compartment is engaged, the seatbelt being arranged when engaged to secure the vehicle occupant to the seat. The output may be arranged in use to output the control signal in dependence on the seatbelt being engaged.
In certain embodiments, the processor may be arranged in use to determine if at least a portion of the hand has been maintained for a predetermined period of time in a position in which the distance between the portion of the hand and the user operated control device is less than or equal to the predefined threshold distance; and the output may be arranged in use to output the control signal on or after lapse of the predetermined period of time.
In certain embodiments, the output may be arranged in use to output a control signal for deploying the user operated control device from a stowed position to a deployed position in which the user operated control device is operable, in dependence on the distance between at least the portion of the hand and the user operated control device being less than or equal to the predefined threshold distance.
In some embodiments, the output is configured to output the control signal to any one of: an entertainment system; a ventilation system; a passenger table; one or more window blinds; a vehicle window controller; a sun roof controller; a cup-holder; and a display unit.
In accordance with yet a further aspect of the invention, there is provided a system for enabling control of a user operated control device located at least partly in a rear passenger compartment within a vehicle cabin. The system may comprise the aforementioned controller and an image capture device. Optionally, the image capture device may comprise a time of flight (ToF) image capture device. The system may also optionally comprise a user operated control device.
In accordance with yet a further aspect of the invention, there is provided a computer program product comprising instructions for carrying out the aforementioned method.
The computer program product may comprise instructions, which when executed on a processor, configure the processor to: detect a vehicle occupant seated in the rear passenger compartment; detect a hand of a vehicle occupant within a volume of space within the rear passenger compartment; determine a position of the hand with respect to the user operated control device; and enable control of the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance and the vehicle occupant being seated in the rear passenger compartment.
In accordance with yet a further aspect of the invention, there is provided a computer readable data carrier having stored thereon instructions for carrying out the aforementioned method. Optionally, the computer readable data carrier comprises a non-transitory computer readable data carrier.
The data carrier may comprise instructions, which when executed on a processor, configure the processor to: detect a vehicle occupant seated in the rear passenger compartment; detect a hand of a vehicle occupant within a volume of space within the rear passenger compartment; determine a position of the hand with respect to the user operated control device; and deploy the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance and the vehicle occupant being seated in the rear passenger compartment.
In accordance with yet a further aspect of the invention, there is provided a vehicle configured to carry out the aforementioned method.
In accordance with yet a further aspect of the invention, there is provided a vehicle comprising the aforementioned controller, or the aforementioned system.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic illustration of the passenger seats comprised in both the front and rear passenger compartments of a vehicle cabin, and shows a camera having a field of view configured to obtain image data of a passenger’s hand located in a volume of space within the rear passenger compartment;
Figure 2 is a functional diagram illustrating the functional components of a controller configured to output a control signal for enabling control of a user operated control device located in the rear passenger compartment of Figure 1;
Figure 3 is a process flow chart outlining a method for enabling control of a user operated control device within a rear passenger compartment of a vehicle cabin in dependence on a vehicle occupant being seated in the rear passenger compartment and the proximity of at least a portion of a rear passenger’s hand to the control device, using the camera of Figure 1 and the controller of Figure 2;
Figure 4 is a schematic illustration highlighting the principle of operation of a Time-of-Flight (ToF) camera, which may be used to determine the position of a vehicle occupant’s hand within the rear passenger compartment of Figure 1;
Figures 5a and 5b are schematic illustrations showing a three-dimensional point cloud of a vehicle occupant’s hand generated using the ToF camera of Figure 4;
Figures 6a and 6b are schematic illustrations showing a user operated control device located in the rear passenger compartment, in respectively a stowed configuration and in a deployed configuration, in dependence on the proximity of the vehicle occupant’s hand to the control device; and
Figure 7 is a schematic illustration of a vehicle comprising the camera of Figure 1 and the controller of Figure 2.
DETAILED DESCRIPTION
Figure 1 is a perspective view of a portion of the vehicle cabin 1, and in particular shows a passenger 3 sitting in a rear passenger seat 5 located in a rear passenger compartment of the vehicle cabin 1. An image capture device in the form of a camera 7, having a field of view 9 delineated in Figure 1 by lines 11, is shown located in the cabin roof. Optionally, the camera 7 may comprise a Time-of-Flight (ToF) camera. The camera 7 is arranged to image objects located within the camera’s field of view 9. The field of view defines a volume of space within the vehicle cabin within which objects are imaged by the camera 7. The camera 7 is arranged such that a panel on the rear of a seat 13 within the vehicle cabin 1 lies within the camera’s field of view 9. The panel on the rear of the seat 13 comprises one or more different user operated control devices 15, which may control operation of one or more different devices. For example, the user operated control device 15 may comprise an entertainment system having a display unit. Similarly, the camera’s field of view may be arranged to capture one or more different user operated control devices located at least partly in different locations within the rear passenger compartment (not shown in Figure 1). For example, these may include any one or more of: an entertainment system; a ventilation system; a passenger table; one or more window blinds; a vehicle window controller; a sun roof controller; a cup-holder; a display unit; and any other user operated control device located at least partly in the rear passenger compartment of the vehicle cabin 1. The vehicle cabin 1 may be comprised in a vehicle 43, illustrated in Figure 7. For illustrative purposes only, embodiments of the invention will be disclosed with respect to a user operated control device 15 located in the panel on the rear of the seat 13.
In certain embodiments, the camera 7 may be operatively coupled to a controller 19 (shown in Figure 2), which is configured to receive image data obtained by the camera 7 and to output a control signal to a specific user operated control device 15, in dependence on an analysis of the received image data. The controller 19 may be configured such that control of a specific user operated control device 15 is enabled if at least a portion of the hand 17 of a vehicle occupant 3 seated in the rear passenger compartment is located at a distance less than or equal to a predefined threshold distance from the specific user operated control device 15. Where two or more vehicle occupants are seated in the rear passenger compartment, control of the user operated control device 15 may be dependent on which rear seated vehicle occupant the hand belongs to. For example, a first vehicle occupant may be a passenger seated in the rear left passenger seat opposite the user operated control device 15, and the intended user of the control device 15, whilst a second vehicle occupant may be a further passenger seated in the rear right passenger seat adjacent to the first vehicle occupant. The camera 7 in combination with the controller 19 may be used to identify a vehicle occupant's hand 17, and to determine whether the hand belongs to the first vehicle occupant or to the second vehicle occupant. Control of the user operated control device 15 may then be restricted to the desired vehicle occupant.
In certain embodiments, the user operated control device 15 may comprise both a deployed configuration and a stowed configuration. In the deployed configuration, the control device 15 may be arranged to be operable by a vehicle occupant. In the stowed configuration the control device may be inoperable. In certain embodiments, when in the stowed configuration, the user operated control device 15 may lie within a storage compartment. The storage compartment may be comprised within a panel at the rear of a seat 13 located within the forward vehicle compartment, as illustrated in Figure 1. In the deployed configuration, the user operated control device 15 projects at least partly from the panel located at the rear of the seat 13, enabling operation of the control device 15 by the vehicle occupant 3.
When the user operated control device 15 is in the stowed configuration, the control signal enabling control of the user operated control device 15 may be arranged to deploy the control device 15 from its stowed configuration. In certain embodiments this may be achieved by outputting the control signal to a deployment mechanism associated with the user operated control device 15. The deployment mechanism may comprise any one or more mechanical devices which physically transition the user operated control device 15 between the two configurations, such as an electric motor. The deployment mechanism may additionally be configured to transition the user operated control device from its deployed configuration to its stowed configuration upon receipt of a suitable control signal.
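The stow/deploy behaviour described above may be sketched as a simple two-state machine. The following Python sketch is illustrative only; the class and signal names (`DeploymentMechanism`, `"deploy"`, `"stow"`) are assumptions and do not appear in the application, and a real mechanism would drive an electric motor rather than update a string:

```python
class DeploymentMechanism:
    """Illustrative two-state (stowed/deployed) mechanism driver.

    In a real system the transitions would energise an electric motor or
    similar mechanical device; here the state change stands in for that.
    """

    def __init__(self):
        # The control device starts inside its storage compartment.
        self.state = "stowed"

    def on_control_signal(self, signal: str) -> str:
        # A 'deploy' signal transitions stowed -> deployed; a 'stow'
        # signal transitions deployed -> stowed; other signals are ignored.
        if signal == "deploy" and self.state == "stowed":
            self.state = "deployed"
        elif signal == "stow" and self.state == "deployed":
            self.state = "stowed"
        return self.state
```

A repeated `"deploy"` signal while already deployed is a no-op, matching the idea that the control signal transitions the device between exactly two configurations.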
Figure 2 provides a functional overview of the controller 19. The controller 19 may be functionally embedded into an existing electronic control unit of the vehicle 43. The controller 19 may be provided with an input 21 and an output 23. The input 21 may be configured to receive image data obtained by the camera 7, and data indicative of a vehicle occupant 3 being seated in the rear passenger compartment. In certain embodiments the data indicative of the vehicle occupant 3 being seated in the rear passenger compartment may be comprised in the obtained image data received from the camera 7. Alternatively, the data indicative of a vehicle occupant 3 being seated in the rear passenger compartment may comprise data received, for example, from a pressure sensor embedded in a rear passenger seat. The output 23 may be configured to output a control signal to a specific user operated control device 15. The controller 19 may comprise a processor 25 arranged to: detect that a vehicle occupant is seated in the rear passenger compartment from the received data; analyse image data received from the camera 7 to identify image objects, such as the vehicle occupant's hand 17, within the obtained image data; and generate one or more control signals enabling a specific user operated control device to be controlled, in dependence on the position of the vehicle occupant's hand 17 relative to the specific user operated control device 15, and in dependence on a vehicle occupant being detected seated in the rear passenger compartment.
In certain embodiments, analysis of obtained image data may be used to determine if a vehicle occupant is seated in the rear passenger compartment. In such embodiments, the processor 25 may be configured to determine if the vehicle occupant 3 is seated in the rear passenger compartment from the received image data.
In use, once image data of a vehicle occupant's hand has been obtained by the camera 7, the controller 19 (typically its processor 25) has determined that the vehicle occupant is seated in the rear passenger compartment, and the position of the vehicle occupant's hand relative to the specific user operated control device has been determined, control of the specific user operated control device may be enabled via a control signal output from the controller 19.
In certain embodiments, the controller 19 may be configured in use to output the control signal in dependence on a distance between at least a portion of a vehicle occupant’s hand 17 and the user operated control device 15 being less than or equal to a predefined threshold distance, and also in dependence on the vehicle occupant being seated in the rear passenger compartment. For example, as the camera 7 obtains image data of the vehicle occupant, the controller 19 may be configured to identify the image of the vehicle occupant and the image of the vehicle occupant’s hand 17 within the received obtained image data. The controller 19 may then determine if the vehicle occupant is seated in the rear passenger compartment and the relative position of the imaged hand with respect to the user operated control device from the obtained image data.
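The enabling condition described above combines two tests: the occupant must be seated, and at least a portion of the hand must be within the threshold distance. A minimal Python sketch, assuming the hand has already been segmented into a set of 3-D points and the device position is known in the same coordinate frame (the 5 cm threshold is an assumed value within the 1 cm to 10 cm range stated later):

```python
import math

THRESHOLD_M = 0.05  # assumed 5 cm threshold, within the 1-10 cm range


def point_distance(a, b):
    """Euclidean distance between two (x, y, z) points in metres."""
    return math.dist(a, b)


def control_enabled(hand_points, device_pos, occupant_seated,
                    threshold=THRESHOLD_M):
    """Enable control only if the occupant is seated AND at least a
    portion of the hand (any one of its points) lies at a distance less
    than or equal to the threshold from the device."""
    if not occupant_seated:
        return False
    return any(point_distance(p, device_pos) <= threshold
               for p in hand_points)
```

Note that a single qualifying point suffices, matching the "at least a portion of the hand" wording.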
In alternative embodiments, the controller 19 may determine if the vehicle occupant is seated in a rear passenger compartment from alternative data sources, for example from the aforementioned pressure sensor embedded into the rear passenger seats, or from a seatbelt sensor located in the rear passenger compartment. In such embodiments, the obtained image data may then be used to determine the position of the vehicle occupant’s hand 17 relative to the user operated control device 15.
Figure 3 is a process flow chart outlining a method used in accordance with certain embodiments, to enable control of a user operated control device 15 in dependence on the proximity of a vehicle occupant’s hand 17 to the user operated control device, and in dependence on the vehicle occupant being seated in the rear passenger compartment.
In certain embodiments the method may be initiated when a seated vehicle occupant is detected, at step 301, within the rear passenger compartment of the vehicle cabin 1. The method of detection may comprise the use of detection sensors installed in the seats and/or seatbelts of the rear passenger compartment, the sensors being configured to detect when a passenger is seated, or the use of image analysis, as described previously.
Alternatively, the method of detection may comprise obtaining image data, with the camera 7, of the rear passenger compartment within the vehicle cabin 1. The obtained image data may be forwarded to the controller 19 for analysis. In particular, the obtained image data may be analysed to determine if a vehicle occupant is seated in the rear passenger compartment. This may be achieved by the processor 25 being configured with image recognition software configured to identify a vehicle occupant in a seated position from the obtained image data.
The above two embodiments are examples of how the detection of a seated passenger may be achieved, although any suitable method for such detection may be used.
In those embodiments where the seated vehicle occupant is detected by the use of detection sensors, once the seated vehicle occupant has been detected in the rear passenger compartment, the method continues to step 303, where image data of the rear passenger compartment is obtained by the camera 7. In certain embodiments the camera 7 may be configured to continuously obtain the image data, or to periodically obtain image data at a predefined frequency.
Alternatively, in those embodiments where the vehicle occupant being seated in the rear passenger cabin is determined from analysis of obtained image data, the method of Figure 3 may be initiated at step 303, where image data of the rear passenger compartment is obtained. This is then followed by step 301 where it is determined from analysis of the obtained image data if a vehicle occupant is seated in the rear vehicle compartment. Step 304 then follows.
Accordingly, the order in which method steps 301 and 303 are carried out is dependent on which type of vehicle occupant detection is carried out. Where sensor data, such as data from pressure sensors or seatbelt sensors, is used to determine if a vehicle occupant is seated in the rear passenger compartment, method step 301 precedes step 303. Where instead image analysis is used to determine if a vehicle occupant is seated in the rear passenger compartment, method step 303 precedes step 301, and the seated vehicle occupant is detected from the obtained image data. The sequence in which the remaining method steps of Figure 3 are carried out is the same for both methods of detection.
The method continues with step 304 where the obtained image data may be forwarded to the controller 19 for analysis. The obtained image data may be analysed to identify the vehicle occupant’s hand 17 within the obtained image data. As mentioned previously, this may comprise the use of image recognition software.
Once a vehicle occupant’s hand 17 has been identified within the obtained image data, the position of the hand 17 is determined relative to the user operated control device 15, at step 305. The position of the hand 17 relative to the user operated control device 15 may be determined by the processor 25. Where the rear passenger compartment comprises a plurality of different user operated control devices, step 305 may comprise determining the position of the hand 17 relative to the nearest user operated control device. At step 307 it is determined if at least a portion of the hand 17 lies at a distance that is less than or equal to a predefined threshold distance from the user operated control device 15. If it is determined that no portion of the hand lies within the predefined threshold distance, then the processor 25 continues to analyse received image data, and the method returns to step 304. If instead it is determined by the processor 25 that at least a portion of the identified hand lies within the predefined threshold distance of the user operated control device 15, then the processor 25 generates a control signal for output to the relevant user operated control device 15, at step 308. Upon receipt of the control signal at the user operated control device, at step 310, control of the user operated control device is enabled. In this way, control of user operated control devices located in the rear passenger compartment may be selectively enabled in dependence upon whether a vehicle occupant is seated in the rear passenger compartment and the position of the seated vehicle occupant’s hand relative to the subject user operated control device.
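Steps 304 to 310 above amount to a per-frame loop: identify the hand, find the nearest control device, test the threshold, and either emit a control signal or return to analysing frames. The following Python sketch is illustrative only; the device names and the 5 cm threshold are assumptions, and hand segmentation is taken as already done:

```python
import math


def process_frame(hand_points, devices, occupant_seated, threshold=0.05):
    """One pass of steps 304-310 (illustrative).

    `hand_points` is the segmented hand as (x, y, z) points;
    `devices` maps a device name to its (x, y, z) position.
    Returns the name of the device to enable, or None to keep
    analysing subsequent frames (i.e. return to step 304).
    """
    if not occupant_seated or not hand_points:
        return None
    # Step 305: position of the hand relative to the NEAREST device.
    nearest = min(devices,
                  key=lambda d: min(math.dist(p, devices[d])
                                    for p in hand_points))
    gap = min(math.dist(p, devices[nearest]) for p in hand_points)
    # Steps 307-308: generate a control signal only within the threshold.
    return nearest if gap <= threshold else None
```

The `None` return models the branch back to step 304 when no portion of the hand is within the predefined threshold distance.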
The predefined threshold distances for each user operated control device may be selected such that each user operated control device may be controlled by the vehicle occupant from the comfort of a normal seated position.
In certain embodiments, where two or more vehicle occupants are seated in the rear passenger compartment, the method may additionally comprise determining which vehicle occupant the detected hand belongs to. This may be carried out by the processor 25, by analysing the obtained image data to determine which one of a plurality of vehicle occupants the identified hand belongs to. Control of the subject user operated control device 15 may then be selectively enabled in dependence upon which vehicle occupant the hand belongs to. For example, control of the user operated control device may be restricted to the vehicle occupant sitting closest to the subject user operated control device, whilst denying control of the same user operated control device to other vehicle occupants.
The controller 19 may be configured to determine from an analysis of the received camera image data whether the vehicle occupant’s hand 17 is the occupant’s left or right hand. The controller 19 may then be configured to output the control signal in dependence on whether the hand 17 is the occupant’s left or right hand. In this way it is possible to restrict control of a desired user operated control device 15 located within the rear passenger compartment in dependence on whether the vehicle occupant’s left or right hand is attempting to enable operation of the associated user operated control device 15.
The vehicle occupant may have their hand orientated palm upwards or palm downwards within the volume of space captured within the camera's field of view 9 relative to the user operated control device 15. In some embodiments, the processor 25 may be configured to determine the orientation of the palm of the vehicle occupant's hand and to subsequently determine whether the hand is the vehicle occupant's left or right hand in dependence on the orientation of the hand. Control of the user operated control device 15 may be enabled in dependence on whether the hand is the vehicle occupant's left or right hand.
In certain embodiments, it may be determined by the processor 25 whether the vehicle occupant's hand is orientated palm upwards or palm downwards relative to the user operated control device 15 by analysing the absorption of light by the vehicle occupant's hand. The skin texture of the palm of a hand is different from the skin texture of the back of a hand, and as a result, the amount of incident light absorbed by the palm differs from the amount of incident light absorbed by the back of the hand. Accordingly, by configuring the controller 19 to analyse the intensity of the reflected signal, which is indicative of the amount of incident illumination absorbed by the hand, it is possible for the controller 19 to determine whether the hand is orientated palm upwards or downwards. By determining whether the vehicle occupant has their hand orientated palm upwards or palm downwards within the volume of space relative to the user operated control device 15, it may therefore be determined whether the hand is a left hand or a right hand with greater accuracy. This information may then be used to determine whether the hand belongs to a first vehicle occupant or to a second vehicle occupant in the same way as described previously.
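The reflected-intensity test above can be sketched as a simple threshold on the mean intensity of the hand region, followed by a left/right inference. This Python sketch is an assumption-laden illustration: the threshold value, the direction of the palm/back reflectance difference, and the thumb-side rule would all need per-sensor calibration in practice:

```python
PALM_INTENSITY_THRESHOLD = 0.6  # assumed value; calibrated per sensor


def palm_orientation(pixel_intensities, threshold=PALM_INTENSITY_THRESHOLD):
    """Classify a hand region as palm-up or palm-down from the mean
    reflected-illumination intensity. The palm is assumed here to reflect
    more of the incident illumination than the back of the hand; the
    actual direction of the difference is sensor- and skin-dependent."""
    mean = sum(pixel_intensities) / len(pixel_intensities)
    return "palm_up" if mean >= threshold else "palm_down"


def hand_side(orientation, thumb_on_left):
    """Infer left vs right hand from palm orientation plus which image
    side the thumb appears on (roof camera looking down; illustrative)."""
    if orientation == "palm_down":
        return "right" if thumb_on_left else "left"
    return "left" if thumb_on_left else "right"
```

The point of the sketch is only that palm orientation disambiguates the left/right decision: the same thumb position implies opposite hands depending on whether the palm faces the camera.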
In certain embodiments, the controller may also determine the direction of entry of a vehicle occupant’s hand into the camera’s field of view 9 relative to the user operated control device 15. In dependence on the direction of entry of the hand, it may be determined whether the hand belongs to a first vehicle occupant or to a second vehicle occupant. For example, consider the scenario in which a user operated control device 15 is located on the rear of a left front passenger seat 13 as illustrated in Figure 1. A first vehicle occupant is seated opposite the user operated control device in the rear left seat within the rear passenger compartment. When attempting to operate the user operated control device, the first vehicle occupant is likely to extend their hand towards the user operated control device, and it is therefore expected that the first vehicle occupant’s hand will enter the camera’s field of view 9 from a direction that is opposite the location of the user operated control device. Similarly, if a second vehicle occupant seated in the right rear seat in the rear passenger compartment were to attempt to operate the same user operated control device, it is likely that their extended hand would be detected by the controller to be entering the camera’s field of view 9 from a direction that is sideways with respect to the position of the user operated control device. Accordingly, the direction of entry of a vehicle occupant’s hand into the camera’s field of view may be used to infer which vehicle occupant’s hand the detected hand belongs to, which in turn may be used to restrict control of the user operated control device.
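The direction-of-entry inference above reduces to comparing the hand's entry direction with the direction from the device toward the facing occupant. A minimal Python sketch, in which the 2-D simplification, the dot-product test, and the -0.5 cosine cut-off are all assumptions:

```python
def occupant_from_entry(entry_vector, device_normal):
    """Attribute a hand to an occupant from its direction of entry into
    the field of view.

    `entry_vector`  -- 2-D direction the hand moved when first detected.
    `device_normal` -- 2-D direction from the device toward the facing
                       (first) occupant's seat.
    A hand entering roughly head-on (moving opposite the device normal)
    is attributed to the facing occupant; a sideways entry to the
    adjacent occupant.
    """
    dot = sum(e * n for e, n in zip(entry_vector, device_normal))
    norm_e = sum(e * e for e in entry_vector) ** 0.5
    norm_n = sum(n * n for n in device_normal) ** 0.5
    cos_angle = dot / (norm_e * norm_n)
    # cos near -1: hand moving toward the device from the facing seat.
    return "first_occupant" if cos_angle <= -0.5 else "second_occupant"
```

In the Figure 1 scenario, the rear-left occupant's hand approaches the seat-back device head-on (cosine near -1), whereas the rear-right occupant's hand enters sideways (cosine near 0), so the two cases separate cleanly.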
In certain embodiments, the predefined threshold distance between at least a portion of a vehicle occupant’s hand 17 and the user operated control device 15 may relate to a few centimetres, for example any distance within the range of 1cm to 10cm, including 1cm and 10cm. In certain embodiments the predefined threshold may delineate a control proximity boundary surrounding and offset from the user control device 15 by the predetermined threshold distance, which control proximity boundary when intersected by at least a portion of the vehicle occupant’s hand causes the controller 19 to generate the control signal for output to the relevant user operated control device 15.
The control proximity boundary may be geometrically shaped. For example, the control proximity boundary may be box-shaped, or spherically shaped. Effectively, the control proximity boundary may relate to a volume of space offset from the user operated control device 15 by the predefined threshold distance. In dependence on any portion of the control proximity boundary being intersected by at least a portion of the vehicle occupant’s hand, the controller generates the control signal for output to the user operated control device 15. It is to be appreciated that not all of the portions of the control proximity boundary need to be offset from the user operated control device 15 by the predefined threshold distance. For example, where the control proximity boundary is box-shaped (e.g. cube shaped), it is to be appreciated that some faces of the cube may not be offset from the user operated control device by the predefined threshold distance.
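For a box-shaped control proximity boundary, the intersection test described above is an axis-aligned containment check over the hand's point cloud. A short Python sketch (the axis-aligned simplification is an assumption; the boundary could equally be spherical):

```python
def inside_box(point, box_min, box_max):
    """True if a 3-D point lies on or inside an axis-aligned box given
    by its minimum and maximum corners."""
    return all(lo <= p <= hi
               for p, lo, hi in zip(point, box_min, box_max))


def boundary_intersected(hand_points, box_min, box_max):
    """The control proximity boundary is deemed intersected when at
    least a portion of the hand (any point of its point cloud) lies on
    or inside the box."""
    return any(inside_box(p, box_min, box_max) for p in hand_points)
```

For a spherical boundary the per-point test would instead compare the point's distance from the device against the boundary radius.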
In certain embodiments, in order to enable the position of the hand 17 to be determined relative to the user operated control device 15, the camera 7 may relate to a 3D mapping controller arranged to generate a 3D model of the hand within the field of view 9. For example, in certain embodiments the camera 7 may relate to a Time-of-Flight (ToF) camera, in which each captured image pixel is associated with a distance on the basis of a time of return of a reflected illumination signal. To achieve this, the ToF camera may be configured with an illumination source arranged to illuminate the camera's field of view. The incident illumination signal is subsequently reflected by objects present in the camera's field of view, and the time of return of the reflected illumination signal is measured. In this way it is possible to associate a distance measurement to each imaged object. The illumination signal may relate to any electromagnetic signal, and need not be comprised in the visible spectrum. For example, in certain embodiments the illumination signal may operate in the infrared spectrum.
Similarly, a ToF camera may also be used to detect if a vehicle occupant is seated in the rear passenger compartment where required, using the afore-described principles of time of return of a reflected illumination signal.
In those embodiments where the camera 7 comprises a ToF camera, the controller 19, and specifically the input 21, may be configured to receive both camera image data and image object distance information data from the ToF camera. This enables the controller 19, and more specifically the processor 25, to determine the position of the vehicle occupant's hand 17 relative to the user operated control device 15 from the received data, and if necessary to determine if the vehicle occupant is seated in the rear passenger compartment.
Figure 4 is a schematic diagram illustrating the principle of operation of a ToF camera 27. A modulated illumination source 29 is used to illuminate a desired target 31. The incident illumination 33 is reflected by the target 31 and captured on a sensor 35 comprising an array of pixels. However, whilst simultaneously capturing the reflected modulated light 37, the pixels of the sensor 35 also capture visible light reflected from the target. Since the illumination signal 33 is modulated, it may be distinguished from the visible light reflected from the target 31, which enables the time of flight of the modulated illumination signal to be measured. The time of flight taken for the modulated illumination signal to travel to the target 31 and be reflected back is measured when the reflected signal is incident on the sensor 35. In this way, each captured image pixel may be associated with a distance of the corresponding image object on the basis of the measured time of flight required for the reflected modulated illumination signal 37 to be measured by the sensor 35. More specific details regarding operation of ToF cameras are widely available in the art, and for this reason a more detailed discussion is not necessary for present purposes.
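The underlying distance calculation is straightforward: the measured time of flight covers the path out to the target and back, so the one-way distance is half the round trip multiplied by the speed of light. A minimal worked sketch (direct round-trip timing; practical ToF cameras typically recover this time from the phase shift of the modulated signal):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s


def tof_distance(round_trip_seconds):
    """One-way distance to a reflecting object: the measured time covers
    the outbound and return paths, so d = c * t / 2."""
    return C * round_trip_seconds / 2.0
```

For scale, a target 1 m from the sensor corresponds to a round trip of only about 6.7 nanoseconds, which is why modulation-based phase measurement is used in practice rather than naive timing.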
Where the camera 7 of Figure 1 comprises a ToF camera 27, it is possible to generate a three-dimensional point cloud of the vehicle occupant's hand located within the camera's field of view 9. Figures 5a and 5b illustrate an example of a three-dimensional point cloud 39 of the vehicle occupant's hand 17, generated using the ToF camera 27. In the same way it is also possible, where required, to generate a three-dimensional point cloud of the vehicle occupant seated in the rear passenger compartment. In certain embodiments the controller 19 may be configured to generate the three-dimensional point cloud using the image data and image object distance information received from the ToF camera 27. Figure 5a shows a point cloud 39 of the vehicle occupant's hand 17 as it is approaching a rectangular-shaped control proximity boundary 41. In Figure 5b a portion of the point cloud 39 of the vehicle occupant's hand 17 is intersecting a portion of the control proximity boundary 41. In this event, and as mentioned previously, depending on which vehicle occupant the hand belongs to, the controller 19 may be configured to generate a control signal for output to the user operated control device 15 in order to enable control of this user operated control device 15.
In order to enable the position of the vehicle occupant's hand 17 to be determined relative to a user operated control device 15, the position of the user operated control device 15 relative to the ToF camera may be determined. Again, this may be done using image recognition software. Since the position of the user operated control device 15 relative to the ToF camera 27 is known, and the position of the vehicle occupant's hand 17 relative to the ToF camera 27 is known, the position of the vehicle occupant's hand 17 relative to the user operated control device 15 may be determined using trigonometry. In certain embodiments, and in order to facilitate computation during use, the controller 19 may be provided with distance information of each user operated control device 15 relative to the ToF camera 27 during an initial configuration of the ToF camera 27. This distance information may be stored and accessed for subsequent use when it is needed. This facilitates subsequent computation of the position of the hand relative to a desired user operated control device 15, since only the position of the vehicle occupant's hand 17 with respect to the ToF camera 27 requires calculation from the captured data; the position of the hand relative to the user operated control device 15 then follows from the stored device position.
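With both positions expressed in the camera's coordinate frame, the "trigonometry" reduces to vector subtraction. A minimal sketch, where the device name and its stored camera-relative position are hypothetical values standing in for the data captured during initial configuration:

```python
# Device positions relative to the ToF camera, assumed to be measured
# once during initial configuration and stored for reuse (values are
# hypothetical).
DEVICE_POSITIONS = {"rear_screen": (0.1, -0.4, 0.9)}


def hand_to_device_vector(hand_pos, device_name,
                          positions=DEVICE_POSITIONS):
    """Hand and device are both known in camera coordinates, so the
    hand's position relative to the device is a component-wise
    subtraction."""
    dx, dy, dz = positions[device_name]
    return (hand_pos[0] - dx, hand_pos[1] - dy, hand_pos[2] - dz)


def hand_to_device_distance(hand_pos, device_name,
                            positions=DEVICE_POSITIONS):
    """Euclidean length of the hand-to-device vector."""
    v = hand_to_device_vector(hand_pos, device_name, positions)
    return sum(c * c for c in v) ** 0.5
```

Only the hand position need be recomputed per frame; the device position is a stored constant, which is the computational saving the paragraph above describes.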
Where image data captured by the ToF camera 27 is also being used to detect if the vehicle occupant is seated in the rear passenger compartment, then the controller 19 may also be provided with distance information of the passenger seat relative to the ToF camera 27. The vehicle occupant may be determined to be seated when the distance of the vehicle occupant's legs is determined to be within a threshold distance of the seat. Similarly, the vehicle occupant may be determined to be seated in dependence on the posture of the vehicle occupant. For example, if the vehicle occupant is determined to be in a position in which their legs are bent relative to their torso in a conventional seated position, then the vehicle occupant may also be determined to be seated.
Figures 6a and 6b show an example use scenario in which a user operated control device 15 located in the rear of a front passenger seat may be controlled in accordance with the previously described method. In response to a passenger being seated in a rear passenger compartment of a vehicle cabin 1 as well as at least a portion of a vehicle occupant’s hand 17 intersecting a control proximity boundary 601, the user operated control device 15 associated with the control proximity boundary 601 may be configured to be deployed when in its stowed configuration. In the illustrated example, the user operated control device may comprise a touch screen display for controlling various different vehicle control systems, or may relate to an infotainment system. Figure 6a shows the scenario where a vehicle occupant’s hand 17 is at a distance greater than the predetermined threshold distance and therefore does not intersect the control proximity boundary 601. As a result, the associated display unit remains in its stowed configuration 605, in which the control panel lies within a storage compartment within the rear of the front passenger seat, and is inoperable in this stowed configuration. Figure 6b shows the scenario where the vehicle occupant’s hand 17 is at a position where at least a portion of the hand intersects the control proximity boundary 601. As a result, the display unit is transitioned from its stowed configuration to its deployed configuration 610. In the deployed configuration 610 at least a portion of the control panel projects from the storage compartment, and is operable by the vehicle occupant.
In certain embodiments, there may be more than one vehicle occupant’s hand within the camera’s field of view 9, for example, when a first and a second vehicle occupant are seated in the rear passenger compartment. Both the first and second vehicle occupants may wish to control either different user operated control devices located within the rear passenger compartment, or the same user operated control device at the same time. If a first hand belonging to the first vehicle occupant and a second hand belonging to the second vehicle occupant are both detected within the camera’s field of view 9 at the same time, the processor 25 may be configured to identify which hand belongs to which vehicle occupant. The processor 25 may then determine the position of each hand with respect to a user operated control device. Control of the user operated control device 15 may be enabled to the first vehicle occupant in dependence on at least a portion of the first hand being within the threshold distance of the user operated control device.
In further embodiments, the controller 19 may, in addition to enabling control of a user operated control device 15 to the first user, additionally prevent the second vehicle occupant from using the user operated control device 15 in dependence on the second hand being within the threshold distance of the user operated control device 15. In this way, control of a specific user operated control device 15 may be enabled only for the first vehicle occupant, even in the scenario in which the hand of the second vehicle occupant is present within the proximity boundary of the user operated control device 15, and one or more vehicle occupants are detected as being seated in the rear passenger compartment. This improves the robustness of the system.
In certain embodiments, control of a first user operated control device may be enabled for a first vehicle occupant and control of a second user operated control device may be enabled for a second vehicle occupant where both a first hand and a second hand belonging to respectively the first and second vehicle occupants are detected within the proximity boundaries of respectively a first user operated control device and a second user operated control device.
In certain embodiments, the user operated control device may comprise a display unit located within the vehicle cabin. The display unit may comprise a user interface, which may enable a vehicle occupant to change multiple vehicle settings. In one example, control of the entire display unit may be enabled in dependence on a hand being detected within the proximity boundary of the display unit and in dependence on a seated vehicle occupant being detected in the rear passenger compartment of the vehicle 43. In other embodiments, different functionalities associated with the display unit may be enabled to different users, in dependence on their hand being determined to lie on or within the proximity boundary, and them being seated in the rear passenger compartment of the vehicle.
In certain embodiments, seatbelt sensor data may be used to detect if a vehicle occupant is seated in the rear passenger compartment. Control of the user operated control device may then be enabled in dependence upon a seatbelt sensor associated with a seat located in the rear passenger compartment indicating that the seatbelt has been engaged - engagement of the seatbelt being indicative that the vehicle occupant is seated in the rear passenger compartment. In such embodiments the controller's processor 25 may be configured to receive an input signal from the seatbelt detection system. As described previously, control of the user operated control device may then be enabled in dependence on both the seatbelt being engaged and the vehicle occupant's hand being located at or within the predefined threshold distance. An advantage of this embodiment is that it encourages seatbelt use, since operation of the user operated control devices is dependent on the seatbelt being engaged.
In a further embodiment, the controller 19 may be configured to determine from an analysis of the received camera image data whether the vehicle occupant's hand 17 is performing a predefined gesture. The controller 19 may then be configured to output the control signal in dependence on whether the hand 17 is performing the predefined gesture. In this way, the controller 19 may be able to restrict the operation of the user operated control device 15 not only in dependence on the relative position of the hand with respect to the user operated control device and the vehicle occupant being seated in the rear passenger compartment, but also in dependence on the predefined gesture. This helps to prevent accidentally enabling control of the user operated control device. For example, it may prevent accidental deployment of a stowed user operated control device. This feature may be particularly desirable for control devices where accidental activation would be disruptive.
In yet a further embodiment, the controller 19 may be configured to output a control signal for controlling a user operated control device in dependence upon a predefined gesture being identified. For example, different functions or modes of operation of the user operated control device may be engaged using different predefined gestures. In such embodiments, the output control signal may also be dependent on the specific gesture performed by the vehicle occupant. Different control signals may be output for different identified gestures.
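A minimal sketch of this gesture-dependent signalling, under the assumption of a simple lookup from identified gestures to control signals (the gesture names and signal strings below are purely illustrative, not from the patent):

```python
# Hypothetical mapping from identified predefined gestures to control
# signals. A signal is output only when the proximity and seating
# conditions already hold; an unrecognised gesture yields no signal.

GESTURE_SIGNALS = {
    "swipe_up": "DEPLOY",
    "swipe_down": "STOW",
    "tap": "TOGGLE_POWER",
}

def control_signal(gesture: str, within_threshold: bool, seated: bool):
    """Return the control signal for an identified gesture, or None when
    the enabling conditions are not met or the gesture is not predefined."""
    if not (within_threshold and seated):
        return None
    return GESTURE_SIGNALS.get(gesture)

print(control_signal("swipe_up", True, True))   # DEPLOY
print(control_signal("swipe_up", True, False))  # None
```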
In certain embodiments it is envisaged that in order for control of the user operated control device 15 to be enabled, a portion of the vehicle occupant’s hand 17 must remain within the predefined threshold distance, as previously described, for a predetermined period of time. The predetermined period of time may be any length of time and may be dependent on the specific user operated control device. This requirement helps to prevent accidental activation of the user operated control device 15 as a result of a vehicle occupant unintentionally moving their hand to a position which lies within the predefined threshold distance.
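The dwell requirement can be sketched as a small state machine that resets whenever the hand leaves the threshold distance (an illustrative, hypothetical implementation; the patent does not prescribe one):

```python
# Hypothetical sketch of the dwell requirement: the hand must remain
# within the threshold distance for a predetermined period before
# control of the device is enabled.

class DwellGate:
    def __init__(self, dwell_seconds: float):
        self.dwell_seconds = dwell_seconds
        self._entered_at = None  # timestamp when the hand entered the boundary

    def update(self, within_threshold: bool, now_s: float) -> bool:
        """Feed one observation; returns True once the dwell period has lapsed."""
        if not within_threshold:
            self._entered_at = None  # hand left the boundary: reset the timer
            return False
        if self._entered_at is None:
            self._entered_at = now_s
        return (now_s - self._entered_at) >= self.dwell_seconds

gate = DwellGate(dwell_seconds=0.5)
print(gate.update(True, 0.0))  # False - timer just started
print(gate.update(True, 0.6))  # True  - hand held for 0.6 s >= 0.5 s
```

Resetting the timer on every exit is what filters out the transient, unintentional hand movements the paragraph describes.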
Whilst the preceding embodiments of the invention have been described within the context of a ToF camera, it is to be appreciated that alternative camera configurations may be used in accordance with the herein described embodiments. Any configuration of cameras may be used that enables image data of a hand relative to the designated surface area to be captured, and the position of the hand relative to the designated surface area to be determined. For example, a configuration of two or more cameras each configured to enable a different perspective image of the hand relative to the designated surface area to be captured may also be used. In such an arrangement the different perspective images of the hand relative to the designated surface area would enable the controller to determine the position of the hand with respect to the designated surface area by triangulation. Such a configuration of cameras may also be used to detect if a vehicle occupant is seated in the rear passenger compartment.
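The two-camera triangulation mentioned above can be illustrated by its simplest special case: a calibrated, rectified stereo pair, where depth follows from the disparity between the two perspective images. This is a textbook sketch under those stated assumptions, not the patent's prescribed method:

```python
# Minimal two-camera triangulation sketch, assuming a calibrated and
# rectified stereo pair: depth = focal_length * baseline / disparity.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point (e.g. a hand feature) seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# 700 px focal length, 10 cm baseline, 35 px disparity -> 2.0 m depth.
print(stereo_depth_m(700.0, 0.10, 35.0))  # 2.0
```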
Similarly, in an alternative embodiment, the ToF camera of the preceding embodiments may be replaced by a conventional camera, in combination with an optical ruler, such as a LIDAR for example. In such an embodiment the LIDAR provides the image object distance information, whilst the camera provides image data. The controller may be configured in such embodiments to analyse the LIDAR data in combination with the obtained image data in order to determine the position of the vehicle occupant’s hand relative to the designated surface area.
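The distance calculation common to both the ToF camera and the LIDAR alternative reduces to halving the round-trip path of the light signal. A short sketch (illustrative only):

```python
# Sketch of the range calculation shared by ToF cameras and LIDAR:
# the measured time of return is a round trip, so the one-way
# distance is half the light path.

C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def round_trip_distance_m(time_of_return_s: float) -> float:
    """One-way distance to the reflecting object (e.g. the hand)."""
    return C_M_PER_S * time_of_return_s / 2.0

# A 10 ns return time corresponds to roughly 1.5 m.
print(round(round_trip_distance_m(10e-9), 3))  # 1.499
```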
It is to be appreciated that many modifications may be made to the above examples and embodiments without departing from the scope of the present invention as defined in the accompanying claims.

Claims (30)

1. A method of enabling control of a user operated control device located at least partly in a rear passenger compartment within a vehicle cabin, the method comprising:
detecting that a vehicle occupant is seated in the rear passenger compartment;
detecting a hand of the vehicle occupant within a volume of space within the rear passenger compartment;
determining a position of the hand with respect to the user operated control device; and
enabling control of the user operated control device in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance, and the vehicle occupant being seated in the rear passenger compartment.
2. The method of claim 1, comprising:
identifying a gesture of the hand; and
controlling the user operated control device in dependence on the identified gesture.
3. The method of claim 1 or 2, comprising:
determining the position of the hand with respect to a proximity boundary associated with the user operated control device, the proximity boundary defining a boundary offset from the user operated control device by the predefined threshold distance; and
enabling control of the user operated control device in dependence on the position of at least a portion of the hand intersecting the proximity boundary.
4. The method of any preceding claim, comprising:
determining if the hand is the vehicle occupant’s left or right hand; and
enabling control of the user operated control device in dependence on whether the hand is the vehicle occupant’s left or right hand.
5. The method of any preceding claim, comprising:
determining if the hand is orientated palm upwards or downwards relative to the user operated control device;
determining if the hand is the vehicle occupant’s left or right hand in dependence on whether the hand is orientated palm upwards or downwards; and
enabling control of the user operated control device in dependence on whether the hand is the left or right hand.
6. The method of any preceding claim, comprising:
determining if the hand is the hand of a first vehicle occupant or of a second vehicle occupant; and
enabling control of the user operated control device in dependence on which vehicle occupant the hand belongs to.
7. The method of any preceding claim, comprising:
determining the direction of entry of the hand into the volume of space relative to the user operated control device;
determining if the hand is the hand of a first vehicle occupant or of a second vehicle occupant in dependence on the direction of entry; and
enabling control of the user operated control device in dependence on which vehicle occupant the hand belongs to.
8. The method of any preceding claim, comprising:
obtaining image data of the hand within the volume of space;
receiving a reflectance signal reflected from the hand;
determining a distance of the hand from a designated origin in dependence on the reflectance signal; and
determining the relative position of the hand with respect to the user operated control device in dependence on the distance of the hand relative to the designated origin, the obtained image data, and a known distance of the user operated control device relative to the designated origin.
9. The method of claim 8, wherein the designated origin is coincident with a position of an image capture device.
10. The method of any preceding claim, comprising:
detecting if a seatbelt associated with a seat located in the rear passenger compartment is engaged, the seatbelt being arranged when engaged to secure the vehicle occupant to the seat; and
enabling control of the user operated control device in dependence on the seatbelt being engaged.
11. The method of any preceding claim, comprising:
determining if at least a portion of the hand has been maintained for a predetermined period of time in a position in which the distance between the portion of the hand and the user operated control device is less than or equal to the predefined threshold distance; and
enabling control of the user operated control device on or after lapse of the predetermined period of time.
12. The method of any preceding claim, comprising:
deploying the user operated control device from a stowed position in which the user operated control device is inoperable to a deployed position in which the user operated control device is operable, in dependence on the distance between at least the portion of the hand and the user operated control device being less than or equal to the predefined threshold distance.
13. The method of any preceding claim, wherein the user operated control device comprises any one of:
a. an entertainment system;
b. a ventilation system;
c. a passenger table;
d. one or more window blinds;
e. a vehicle window controller;
f. a sun roof controller;
g. a cup-holder; and
h. a display unit.
14. A controller for enabling control of a user operated control device located at least partly in a rear passenger compartment within a vehicle cabin, the controller comprising:
an input configured to receive:
information indicative of a vehicle occupant seated in the rear passenger compartment; and
image data obtained by an image capture device;
a processor configured in use to:
recognise a hand of the vehicle occupant from the image data, the hand being located within a volume of space within the rear passenger compartment within which image objects are captured by the image capture device;
determine a position of the hand with respect to the user operated control device; and
an output arranged in use to output a control signal for enabling control of the user operated control device to the user operated control device, in dependence on a distance between at least a portion of the hand and the user operated control device being less than or equal to a predefined threshold distance, and the vehicle occupant being seated in the rear passenger compartment.
15. The controller of claim 14, wherein:
the processor is configured in use to recognise a gesture of the hand; and
the output is arranged in use to output a control signal for controlling the user operated control device in dependence on the gesture.
16. The controller of claim 14 or 15, wherein the processor is arranged in use to determine the position of the hand with respect to a proximity boundary associated with the user operated control device, the proximity boundary defining a boundary offset from the user operated control device by the predefined threshold distance; and the output is arranged in use to output the control signal in dependence on the position of at least a portion of the hand intersecting the proximity boundary.
17. The controller of any one of claims 14 to 16, wherein the processor is arranged in use to determine if the hand is the vehicle occupant’s left or right hand; and the output is arranged in use to output the control signal in dependence on whether the hand is the vehicle occupant’s left or right hand.
18. The controller of any one of claims 14 to 17, wherein the processor is arranged in use to determine if the hand is orientated palm upwards or downwards within the volume of space relative to the user operated control device, and to determine if the hand is the vehicle occupant’s left or right hand in dependence on orientation of the hand.
19. The controller of any one of claims 14 to 18, wherein the processor is arranged in use to determine a direction of entry of the hand into the volume of space relative to the user operated control device.
20. The controller of any one of claims 14 to 19, wherein:
the input is configured in use to receive data from a time-of-flight (ToF) image capture device comprising a sensor, the ToF image capture device being arranged in use to obtain image data of the hand, to illuminate the hand within the volume of space, and to measure a time of return of a reflected illumination signal, the time of return of the reflected illumination signal being proportional to a distance of the hand from the sensor, the data comprising the image data and the time of return of the reflected illumination signal; and wherein the processor is arranged in use to determine the position of the hand with respect to the user operated control device in dependence on the determined distance of the hand from the sensor, the obtained image data, and a known distance of the user operated control device relative to the sensor, wherein the distance of the hand from the sensor is determined in dependence on the time of return of the reflected illumination signal.
21. The controller of any one of claims 14 to 20, wherein:
the input is configured in use to receive data indicative of whether a seatbelt associated with a seat located in the rear passenger compartment is engaged, the seatbelt being arranged when engaged to secure the vehicle occupant to the seat; and the output is arranged in use to output the control signal in dependence on the seatbelt being engaged.
22. The controller of any one of claims 14 to 21, wherein the processor is arranged in use to determine if at least a portion of the hand has been maintained for a predetermined period of time in a position in which a distance between the portion of the hand and the user operated control device is less than or equal to the predefined threshold distance; and the output is arranged in use to output the control signal on or after lapse of the predetermined period of time.
23. The controller of any one of claims 14 to 22, wherein the output is arranged in use to output a control signal for deploying the user operated control device from a stowed position to a deployed position in which the user operated control device is operable, in dependence on the distance between at least the portion of the hand and the user operated control device being less than or equal to the predefined threshold distance.
24. The controller of any one of claims 14 to 23, wherein the output is configured to output the control signal to any one of:
a. an entertainment system;
b. a ventilation system;
c. a passenger table;
d. one or more window blinds;
e. a vehicle window controller;
f. a sun roof controller;
g. a cup-holder; and
h. a display unit.
25. A system for enabling control of a user operated control device located at least partly in a rear passenger compartment within a vehicle cabin, the system comprising the controller of any one of claims 14 to 24 and a time-of-flight (ToF) image capture device.
26. The system of claim 25, comprising a user operated control device.
27. A computer program product comprising instructions for carrying out the method of any one of claims 1 to 13.
28. A computer-readable data carrier having stored thereon instructions for carrying out the method of any one of claims 1 to 13.
29. A vehicle configured in use to carry out the method of any one of claims 1 to 13.
30. A vehicle comprising the controller of any one of claims 14 to 24 or the system of claim 26.
GB1719062.0A 2017-11-17 2017-11-17 Proximity based vehicle controller Active GB2568669B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1719062.0A GB2568669B (en) 2017-11-17 2017-11-17 Proximity based vehicle controller
DE102018219106.9A DE102018219106A1 (en) 2017-11-17 2018-11-08 VEHICLE CONTROL


Publications (3)

Publication Number Publication Date
GB201719062D0 GB201719062D0 (en) 2018-01-03
GB2568669A true GB2568669A (en) 2019-05-29
GB2568669B GB2568669B (en) 2020-03-25

Family

ID=60805816



Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130030645A1 (en) * 2011-07-28 2013-01-31 Panasonic Corporation Auto-control of vehicle infotainment system based on extracted characteristics of car occupants
US20150081133A1 (en) * 2013-09-17 2015-03-19 Toyota Motor Sales, U.S.A., Inc. Gesture-based system enabling children to control some vehicle functions in a vehicle
US20150131857A1 (en) * 2013-11-08 2015-05-14 Hyundai Motor Company Vehicle recognizing user gesture and method for controlling the same
KR20150072074A (en) * 2013-12-19 2015-06-29 현대자동차주식회사 System and control method for gesture recognition of vehicle
US20170250525A1 (en) * 2011-11-16 2017-08-31 Autoconnect Holdings Llc Universal console chassis for the car


