GB2582899A - Control system and method for controlling a vehicle system - Google Patents

Control system and method for controlling a vehicle system

Info

Publication number
GB2582899A
GB2582899A (application GB1903589.8A)
Authority
GB
United Kingdom
Prior art keywords
user
vehicle
control system
data
height
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1903589.8A
Other versions
GB201903589D0 (en)
GB2582899B (en)
Inventor
Valentin Gheorghe Ionut
Hasedzic Elvir
Munoz Mauricio
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1903589.8A
Publication of GB201903589D0
Publication of GB2582899A
Application granted
Publication of GB2582899B
Legal status: Active

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/22Conjoint control of vehicle sub-units of different type or different function including control of suspension systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G17/00Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, e.g. due to speed or load
    • B60G17/015Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, e.g. due to speed or load the regulating means comprising electric or electronic elements
    • B60G17/017Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, e.g. due to speed or load the regulating means comprising electric or electronic elements characterised by their use when the vehicle is stationary, e.g. during loading, engine start-up or switch-off
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G17/00Resilient suspensions having means for adjusting the spring or vibration-damper characteristics, for regulating the distance between a supporting surface and a sprung part of vehicle or for locking suspension during use to meet varying vehicular or surface conditions, e.g. due to speed or load
    • B60G17/06Characteristics of dampers, e.g. mechanical dampers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/10Interpretation of driver requests or demands
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G2401/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60G2401/14Photo or light sensitive means, e.g. Infrared
    • B60G2401/142Visual Display Camera, e.g. LCD
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G2401/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60G2401/16GPS track data
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G2401/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60G2401/17Magnetic/Electromagnetic
    • B60G2401/174Radar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G2500/00Indexing codes relating to the regulated action or device
    • B60G2500/10Damping action or damper
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G2500/00Indexing codes relating to the regulated action or device
    • B60G2500/30Height or ground clearance
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G2500/00Indexing codes relating to the regulated action or device
    • B60G2500/30Height or ground clearance
    • B60G2500/32Height or ground clearance of only one vehicle part or side
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G2800/00Indexing codes relating to the type of movement or to the condition of the vehicle and to the end result to be achieved by the control action
    • B60G2800/20Stationary vehicle
    • B60G2800/202Stationary vehicle kneeling, e.g. for letting passengers on/off
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60GVEHICLE SUSPENSION ARRANGEMENTS
    • B60G2800/00Indexing codes relating to the type of movement or to the condition of the vehicle and to the end result to be achieved by the control action
    • B60G2800/20Stationary vehicle
    • B60G2800/203Stationary vehicle lowering the floor for loading/unloading

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Traffic Control Systems (AREA)

Abstract

A control system configured to control a height management system for a vehicle, a vehicle having the control system, and a method for controlling a height management system for a vehicle are disclosed. A control system has a sensor system configured to determine 202 the presence of a user of the vehicle in the vicinity of the vehicle, and acquire data relating to the user of the vehicle. A processing module of the control system is configured to process 204 the acquired data to determine a feature of the user, and a control module of the control system is configured to generate 206, in dependence on the determined feature of the user, a control parameter for adjusting a height of at least a portion of the vehicle. The control parameter may be a suspension parameter for adjusting the suspension system of the vehicle. The feature of the user may be one or more of: a tracked movement of the user; a body landmark feature of the user; a gait feature of the user; and a gesture of the user.

Description

CONTROL SYSTEM AND METHOD FOR CONTROLLING A VEHICLE SYSTEM
TECHNICAL FIELD
The present disclosure relates to a control system and method for controlling a vehicle system. Aspects of the invention relate to a control system configured to control a height management system for a vehicle, a vehicle having the control system, and a method for controlling a height management system for a vehicle.
BACKGROUND
It is an ongoing challenge within the automotive industry to improve vehicle functionality and design, and in turn enhance the perceived quality and 'feel' of vehicles, without adding significant additional cost.
One area that has received interest in this respect is vehicle access systems. There is a growing desire to provide a seamless entry process that creates an effect of 'welcoming' a user to a vehicle. In this context, systems for adjusting the height of a vehicle, or of a portion of the vehicle, are known. For example, known systems can move the chassis of a land vehicle closer to or further away from a ground area, or raise or lower a specific section of the vehicle.
However, certain known systems are not able to automatically or independently determine that the height of (a portion of) the vehicle is to be altered. Such systems may also not be capable of determining the appropriate extent to which the height is to be adjusted.
In addition, known systems may not be capable of associating an appropriate extent for adjustment of the height with a detected object, such as a user of the vehicle, and therefore not capable of adjusting the height appropriately for a given user, or a given object, or indeed a given type of object.
It is against this background that the present invention has been devised.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system configured to control a height management system for a vehicle, a vehicle having the control system, and a method for controlling a height management system for a vehicle as claimed in the appended claims.
According to an aspect of the present invention there is provided a control system configured to control a height management system for a vehicle, the control system comprising: a sensor system configured to: determine the presence of a user of the vehicle in the vicinity of the vehicle; and acquire data relating to the user of the vehicle; a processing module configured to process the acquired data to determine a feature of the user; and a control module configured to generate, in dependence on the determined feature of the user, a control parameter for adjusting a height of at least a portion of the vehicle.
According to another aspect of the present invention there is provided a control system configured to control a height management system for a vehicle, the control system comprising one or more controllers, the control system configured to: determine the presence of a user of the vehicle in the vicinity of the vehicle; acquire data relating to the user of the vehicle; process the acquired data to determine a feature of the user; and generate, in dependence on the determined feature of the user, a control parameter for adjusting a height of at least a portion of the vehicle.
In some embodiments, the one or more controllers collectively comprise: at least one electronic processor having an electrical input for receiving data indicative of the presence of a user of the vehicle in the vicinity of the vehicle, and data relating to the user; and at least one memory device electrically coupled to the at least one electronic processor and having instructions stored therein; wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions thereon so as to: acquire data relating to the user of the vehicle; process the acquired data to determine a feature of the user; and generate, in dependence on the determined feature of the user, a control parameter for adjusting a height of at least a portion of the vehicle.
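The determine → acquire → process → generate flow described above can be sketched as a minimal pipeline. All class names, feature labels, thresholds and height values below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class ControlParameter:
    """Target height adjustment, in millimetres, for one vehicle portion."""
    portion: str
    height_offset_mm: int

def determine_presence(distance_m: float, vicinity_threshold_m: float = 5.0) -> bool:
    """Presence test: is the user within the (assumed) vicinity threshold?"""
    return distance_m <= vicinity_threshold_m

def determine_feature(user_data: dict) -> str:
    """Reduce the acquired user data to a single feature label (simplified)."""
    return "carrying_luggage" if user_data.get("luggage") else "unladen"

def generate_control_parameter(feature: str) -> ControlParameter:
    """Map the determined feature to a height adjustment."""
    if feature == "carrying_luggage":
        return ControlParameter(portion="load_space", height_offset_mm=-60)
    return ControlParameter(portion="suspension", height_offset_mm=0)

# End-to-end: a user detected 3 m away, carrying a bag.
param = None
if determine_presence(3.0):
    param = generate_control_parameter(determine_feature({"luggage": True}))
```

The point of the sketch is the separation of concerns the claims describe: presence detection gates data acquisition, and the control parameter depends only on the determined feature, not on the raw sensor data.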
Thus a determined feature of the user can be used to automatically adjust a height of the portion of the vehicle, allowing the height of that portion to be adjusted to an extent appropriate for the determined feature of that specific user. For example, this can allow improved ease of access for vehicle users into the cabin of the vehicle. This can also allow improved ease of access to luggage or loading spaces, seats or seating areas, and maintenance areas of the vehicle such as tyre or wheel-arch areas of a land vehicle.
The sensor system may be configured to acquire or capture data relating to the user (or determine their presence) when the user is in the vicinity of, in proximity to, nearby, adjacent to, or in the area of the vehicle. A distance threshold may be applied for the sensor system to determine the relevant vicinity.
The sensor system may acquire the data from a further system, once a user is identified; for example, if a user is identified, the data may be acquired from a store on-board the vehicle, rather than from a scan of the user.
Determining or detecting the feature of the user may comprise determining a characteristic, factor or attribute (for example, a feature which is not an innate feature of the user) for the user. The data relating to the user and the feature of the user may relate to a feature other than a bodily feature; for example, the user may be carrying luggage which may be detected or identified, and this information used to adjust the height of the vehicle portion.
Adjustment of the height of the at least a portion of the vehicle may comprise managing or regulating the height or conforming the height to a condition or threshold. The height of the portion of the vehicle may be a vertical distance between points on the vehicle (portion), or between the vehicle portion and an external reference, such as a ground area for a land vehicle. For example, the height may be a ground clearance for the vehicle, a height of the roof above the ground, or a load space height (above the ground, or above another portion of the vehicle) or a seat height above a cabin floor or chassis reference point, or ground point.
Optionally, the control parameter is a suspension parameter for adjusting the suspension system for the vehicle. The control parameter may also be a tyre parameter for adjusting a tyre system, a seat parameter for adjusting a seat system, or a load space parameter for adjusting a load space system. The control parameter may comprise two or more such parameters, for example using a combination of the suspension, tyre and seat parameters to optimize vehicle height for user entry/exit. The adjustment of the relevant system may be activating or de-activating the system.
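A control parameter combining two or more subsystem parameters, as described above, might be modelled as follows. The field names and the idea of a single combined request object are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HeightRequest:
    """Combined control parameter spanning several subsystems; any subset
    of the fields may be set. Field names are illustrative."""
    suspension_mm: Optional[int] = None
    seat_mm: Optional[int] = None
    load_space_mm: Optional[int] = None

    def active_systems(self):
        """Names of the subsystems this request actually adjusts."""
        return [name for name in ("suspension_mm", "seat_mm", "load_space_mm")
                if getattr(self, name) is not None]

# Lower the suspension and raise the seat together for easier entry.
request = HeightRequest(suspension_mm=-30, seat_mm=20)
```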
Suitably, the control module is configured to set the control parameter at one of a range of values for the vehicle height, in dependence on the determined feature of the user. Optionally, the control module is configured to generate the control parameter for a height for the vehicle.
In embodiments, the feature of the user comprises one or more of: a tracked movement for the user; a body landmark feature for the user; a gait feature for the user; and a gesture of the user. The feature of the user may not be determined from the user in the vicinity itself; the feature may be determined from data acquired relating to the user, for example from an onboard store. The feature of the user may be that the user is using (or determined to be intending to use) a given area of the vehicle, such as a given vehicle body aperture, or a load space.
Suitably, the sensor system is configured to acquire data for a region in the vicinity of the vehicle. Optionally, the processing module is configured to analyse the data to determine one or more body landmarks of the user within the data. In embodiments, the acquired data is of the region in the vicinity of the vehicle over a time period, and wherein the processing module is configured to track the position of the or each body landmark relative to the vehicle over the time period. Suitably, the processing module is configured to determine an intent of a user by comparing the tracked movement of each body landmark to a respective predetermined movement. The determination may comprise determining the user's intent to board the vehicle.
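The landmark-tracking step can be illustrated with a minimal sketch: the distance of a single body landmark from the vehicle, sampled over a time period, is compared against a simple predetermined "approach" pattern. The threshold value and function names are assumptions for illustration, not from the patent:

```python
def approach_intent(distances_m, approach_threshold_m=0.5):
    """Decide whether a tracked landmark is approaching the vehicle:
    its distance must decrease overall by more than the threshold
    across the sampled time period."""
    if len(distances_m) < 2:
        return False
    return (distances_m[0] - distances_m[-1]) > approach_threshold_m

# A hip landmark tracked over five frames, closing in on the vehicle.
track = [6.0, 5.2, 4.5, 3.9, 3.1]
```

A real implementation would compare full per-landmark trajectories against gait patterns, as the following paragraphs describe; the net-displacement test here stands in for that comparison.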
In embodiments, the processing module is configured to determine the user's intent by comparing the tracked movement of the or each body landmark with a gait pattern, e.g. with respect to vehicle position.
Suitably, the processing module is configured to analyse the tracked movement of the or each body landmark to determine a trajectory of the user relative to the vehicle or vehicle position. Optionally, the processing module is configured to determine the user's intent by determining whether the trajectory of the user indicates that the user is approaching the vehicle. In embodiments, the processing module is configured to determine the user's intent by determining whether the trajectory of the user indicates that the user is approaching a body aperture closure member of the vehicle, such as a door of the vehicle.
In embodiments, the sensor system comprises one or more image data capture devices. For instance, the image data capture devices may have an external view of the environment around the vehicle. Optionally, the processing module is configured to process data from the one or more image data capture devices to determine the feature of the user.
Suitably, the processing module is configured to process data from the image data capture devices to determine the presence of a user and optionally an identity of the user.
In embodiments, the sensor system is configured to: determine the presence of the user in relation to a first threshold distance from the vehicle; and acquire the data relating to the user in relation to a second threshold distance from the vehicle.
Suitably, the sensor system is configured to, on determining the presence of the user in the vicinity of the vehicle, acquire data relating to the user of the vehicle from a storage device associated with the vehicle.
In embodiments, the acquired data comprises any one or more of the following: point cloud data; data acquired by a radar system; data acquired by a lidar system; data acquired by a Bluetooth system; data acquired by a WiFi system; data acquired by a RFID system; data acquired by a Near Field Sensing (NFS) system; and data acquired by an ultrasonic system.
According to an aspect of the present invention there is provided a vehicle comprising a control system as described above.
According to an aspect of the present invention there is provided a method of controlling a height management system for a vehicle, comprising: determining the presence of a user of the vehicle in the vicinity of the vehicle; acquiring data relating to the user of the vehicle; processing the acquired data to determine a feature of the user; and generating, in dependence on the determined feature of the user, a control parameter for adjusting a height of at least a portion of the vehicle.
According to an aspect of the present invention there is provided a control system configured to manage a suspension system for a vehicle, the control system comprising: a sensor system configured to: determine the presence of a user of the vehicle in the vicinity of the vehicle; and acquire data relating to the user of the vehicle; a processing module configured to process the acquired user data to determine a feature of the user data; and a control module configured to generate, in dependence on the determined feature of the user data, a suspension parameter for adjusting the suspension system for the vehicle. Optionally, the suspension system may be adjusted for parameters other than height of the suspension system, in order to control the disposition of the vehicle. For example a damping or travel parameter for the suspension system may be adjusted; this parameter may allow control of the height of the vehicle.
Further aspects of the invention comprise computer programs or computer program applications or computer program products executable on a processor so as to implement the methods of the above described aspects and embodiments, non-transitory computer readable media loaded with such computer programs or computer program products, and processors arranged to implement these methods or products.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram illustrating a vehicle having a control system according to an embodiment of the invention;
Figure 2 is a diagram illustrating steps of a method according to an embodiment of the invention;
Figure 3 is a schematic diagram illustrating components of a control system, according to an embodiment of the invention;
Figures 4a and 4b are diagrams illustrating steps of a method according to embodiments of the invention; and
Figure 5 is a simplified representation of an image stored by the control system of Figure 3, according to an embodiment of the invention.
DETAILED DESCRIPTION
Embodiments of the invention allow the automatic adjustment of the height of a portion of a vehicle to conform to users' needs, whether for instance determined by a direct feature of the user, such as a proportion or size of the user, or by a detected intended use of the vehicle, such as a user approaching a load space of the vehicle, or by a stored feature of a user once the user has been identified.
In an embodiment, the system detects when an authorised user is within a range of the vehicle, which can then activate at least one camera to determine the trajectory of the user, the height of the user and any luggage the user is carrying. If the user is carrying an object, which in this embodiment is determined from the orientation of the arms, the load space can be raised or lowered to facilitate a smooth entry. The luggage load space may be lowered to the minimum depth, or by a set amount, for example in dependence on the detection of a gesture of the user. If a detected object is being dragged, for example a suitcase, then the suspension can be lowered to the minimum permissible limit.
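The decision logic of this embodiment can be sketched as a mapping from the detected carrying state to a height adjustment request. The state labels, target heights and minimum datum are all illustrative assumptions:

```python
MIN_SUSPENSION_MM = 0      # assumed minimum permissible suspension height
ENTRY_LOAD_SPACE_MM = 450  # assumed comfortable load-space entry height

def height_action(object_state):
    """Map the detected carrying state to a (system, target height) request.
    'carried' would be inferred from arm orientation; 'dragged' covers,
    e.g., a wheeled suitcase pulled behind the user."""
    if object_state == "dragged":
        return ("suspension", MIN_SUSPENSION_MM)
    if object_state == "carried":
        return ("load_space", ENTRY_LOAD_SPACE_MM)
    return (None, None)
```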
In embodiments, a two-zone system can be used: the first zone determines whether an authorised user is within the vicinity of the vehicle; the second zone determines the trajectory of the user and the type of objects the user is carrying. For example, a controller of or for the vehicle may be configured to determine the proximity of an authorised user of a vehicle when approaching within a first threshold distance, and, when the authorised user is within the second threshold distance, determine the trajectory of the user. In dependence on the trajectory, the control system may determine how the user is carrying any objects and thus raise or lower the vehicle luggage space to allow the user to place the objects in the luggage space.
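The two-zone scheme reduces to nested distance thresholds. The specific threshold values below are assumptions for illustration, not values from the patent:

```python
ZONE1_M = 10.0  # first threshold: authorised-user detection
ZONE2_M = 3.0   # second threshold: trajectory and object analysis

def zone(distance_m):
    """0 = outside both zones, 1 = detection zone only, 2 = analysis zone."""
    if distance_m <= ZONE2_M:
        return 2
    if distance_m <= ZONE1_M:
        return 1
    return 0
```

Crossing into zone 1 might wake the camera system from its standby state; crossing into zone 2 triggers the trajectory and luggage analysis.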
Embodiments of the invention can adapt the vehicle height such that it allows for ergonomic access in situations where the elevation profile of the vehicle is unsuited to easy entry. Examples of such situations are:
1. a user trying to access the vehicle with a suspension height that is not adjusted to allow easy access to that user, for instance where the access area or dimensions are too short or too long (or too wide or narrow) for the anthropometric size of the user's legs;
2. a user with a disability that requires the vehicle height to be lowered when access is required;
3. a user carrying heavy luggage and intending to unload the luggage into a load space.
Embodiments of the invention can detect poses of the body by detecting several key body landmarks, such as legs, arms, joints and the like and track these landmarks through time using a sensor system. Additionally, computer vision techniques can be used to detect different types of luggage. From the detected landmarks, including luggage, the anthropometric properties of the user and the volumetric properties of the luggage can be determined or inferred, and used to request the raising or lowering of the vehicle height system.
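One way the anthropometric inference could look, assuming a calibrated pixels-per-metre scale from the stereoscopic cameras. The scale factor, landmark coordinates and linear sill-height rule are all illustrative assumptions:

```python
def estimate_stature_m(head_y_px, ankle_y_px, pixels_per_metre):
    """Estimate user stature from the vertical pixel span between the
    head and ankle landmarks, given a known image scale."""
    return abs(head_y_px - ankle_y_px) / pixels_per_metre

def target_sill_height_mm(stature_m):
    """Map the estimated stature to a target entry (sill) height using a
    simple, assumed linear rule."""
    return int(round(250 * stature_m))

# A user spanning 810 px in an image calibrated at 450 px/m.
stature = estimate_stature_m(90, 900, 450)
```

In practice the scale would come from stereo depth rather than a fixed calibration, and the stature-to-height rule would be tuned per vehicle; the sketch shows only the shape of the inference.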
Benefits of the automatic adjustment of the vehicle height in this way may be: more ergonomic, easy entry access; ease of loading / unloading luggage, especially for users with compromised health or mobility; and new ways of experiencing vehicle interaction.
For purposes of this disclosure, it is to be understood that the control system described herein may comprise one or more controllers, each controller comprising a control unit or computational device having one or more electronic processors. Thus, the vehicle and/or a system thereof may comprise a single control module or electronic controller or, alternatively, different functions of the controller(s) may be embodied in, or hosted in, different control modules or controllers.
As used herein, the terms 'controller' or 'control module' will be understood to include both a single control module or controller and a plurality of control modules or controllers collectively operating to provide the required control functionality. A set of instructions could be provided which, when executed, cause said control module(s) to implement the control techniques described herein (including the method(s) described below). The set of instructions may be embedded in one or more electronic processors or CPUs.
Alternatively, the set of instructions could be provided as software to be executed by one or more electronic processor(s). For example, a first control module may be implemented in software run on one or more electronic processors, and one or more other control modules may also be implemented in software run on one or more electronic processors, optionally the same one or more processors as the first control module. It will be appreciated, however, that other arrangements are also useful, and therefore, the present invention is not intended to be limited to any particular arrangement.
In any event, the set of instructions described above may be embedded in a computer-readable storage medium (e.g. a non-transitory storage medium) that may comprise any mechanism for storing information in a form readable by a machine or electronic processors/computational device, including, without limitation: a magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or electrical or other types of medium for storing such information/instructions.
Figure 1 is a diagram illustrating a vehicle 10 comprising a vehicle height control system 100 according to an embodiment of the invention, which in this embodiment is used to control a front and rear suspension system (120) of the vehicle. The vehicle in this case is an automobile such as a car or SUV, but may be any other kind of user-driven or user-managed vehicle, such as a water-borne vehicle, a large land vehicle, or an autonomous or partially autonomous vehicle.
The vehicle includes a sensor system (6, 7) to detect an approaching user, to acquire data relating to the user, and in embodiments to identify the user. In this embodiment, the sensor system includes front (7) and rear (6) image capture devices, located along an upper edge of a lower door (12) panel, immediately below the window.
Referring to Figure 2, the vehicle sensor system (whether as in Figure 1, or as in the alternative embodiments described herein) determines the presence of the user in the vicinity of the vehicle (202), and then acquires data relating to the user. In the embodiment shown in Figures 1 and 3, this data may relate to body landmarks of the user. A processing module (204) of the control system 100 is then used to determine from the acquired data a feature of that user; for example, the detected body landmarks may be used to determine a size or shape of a portion of the user's body.
A control module (206) of the control system 100 then uses the determined feature(s) of the user to generate a control parameter for adjusting a height of at least a portion of the vehicle. For example, a determined size or shape of a portion of the user's body (or their luggage) may be used to determine an appropriate height of suspension to be used to allow easier access for that determined user, given the body landmark data, or to provide an appropriate load space height for the luggage.
Referring back to Figure 1, in this embodiment, the image capture devices 6, 7 are stereoscopic cameras, although in other embodiments the image capture devices 6, 7 may include any suitable type of sensor that is configured to generate data that can represent a two-dimensional image of an area adjacent to the vehicle 10 towards which the device that produced the data is directed. Such data is hereafter referred to as 'scan data'. Suitable sensors include alternative types of camera (e.g. Time of Flight (ToF) sensors), proximity sensors (e.g. ultrasonic proximity detectors), light-detecting sensors such as Light Detection and Ranging (LIDAR) sensors, or Radio Detection and Ranging (RADAR) sensors, for example. It follows that the scan data may comprise any one or more of the following: photographic images; point cloud data; data acquired by a radar system; data acquired by a LIDAR system; data acquired by a Bluetooth system; and data acquired by an ultrasonic system.
The rear image capture device 6 is located immediately behind the front image capture device 7, along a longitudinal axis of the vehicle 10, and is oriented to capture two-dimensional scan data in the form of still or moving images (e.g. as a video feed) relating to a user approaching the vehicle 10 from the rear. The front image capture device 7 is oriented in an opposite manner to capture scan data in the form of still or moving images relating to a user approaching the vehicle 10 from the front.
The scan data detected by the front and rear image capture devices 6, 7 is transmitted to the control system 100 either wirelessly or by hardwiring through the vehicle door. It is particularly convenient to mount the image capture devices 6, 7 in this position as they can be connected to an electrical wiring harness already present within the door for the electrical windows.
The front and rear image capture devices 6, 7 have a collective panoramic field of view that spans 180 degrees around the vertical. In this way, the front and rear image capture devices 6, 7 are able to produce scan data covering all possible angles of trajectory of approach that a user may take when approaching the vehicle. Additional corresponding front and rear image capture devices may be situated on the left-side of the vehicle 10 (not shown in figures) to generate scan data covering all possible angles of trajectory of approach that a user may take when approaching the left-side vehicle doors.
The front image capture device 7 is angled to provide a field of view that captures a user travelling towards the front of the vehicle 10 as they approach a door of the vehicle 10. Also, the image capture device 7 is angled so that an adequate still image of the face of a user standing next to the vehicle 10 may be captured. The rear image capture device 6 is positioned and oriented in a similar fashion.
In other embodiments only one image capture device with a wider field of view may be used.
However, the use of dual image capture devices 6, 7 provides the advantage that the distance between the image capture device and the approaching user, and the height and physical dimensions of the approaching user, may be determined more accurately than with a mono camera.
The image capture devices 6, 7 are configurable in a 'standby' state in which they are idle, and a 'record' state in which they actively capture scan data. The state of the image capture devices 6, 7 is controlled by the control system 100, which is described in more detail with reference to Figure 3.
It should be noted that in other embodiments, the data acquired relating to the user may be used to identify the approaching user, and the data relating to features of that user may be obtained (in part, or wholly) from an indirect source, such as an on-board store, or a remote data server.
In such embodiments, the data acquired may for instance include RFID data, or data acquired by a Near Field Sensing (NFS) system. In embodiments, a key fob of the user is used to identify the user, prior to using the camera system (6, 7) to obtain data relating to the user (to determine their size, shape, and the like for prompting adjustment of the vehicle height). In embodiments, the key fob may be used to identify the user, and the stored data (local or remote) relating to that user can then prompt adjustment of the vehicle height appropriately.
In embodiments, in a first (or in an initial set of) use case(s), data may be captured for an identified user, which is then used to adjust vehicle height. This data may then be stored, locally or remotely for use with that user in future; if the same user is identified, the system can retrieve the stored data and perform the same vehicle height adjustment as previously applied. In an embodiment, the stored or historical data may for example include GPS data, so that height levels previously adjusted at given different locations can be tracked. For instance, at a home location for a given user, there may be a high step that is used to climb into the vehicle; this can be recorded in association with the GPS location, and this stored data used once the user has been identified.
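A minimal sketch of such a per-user, per-location store follows; the class, key scheme and rounding precision are illustrative assumptions rather than anything prescribed in this application:

```python
# Hypothetical per-user, per-location store of previously applied height settings.
# GPS coordinates are rounded so that nearby readings of the same spot map to
# the same key (4 decimal places is roughly a 10 m cell).

def location_key(lat: float, lon: float, precision: int = 4) -> tuple:
    """Round coordinates to form a stable key for 'the same place'."""
    return (round(lat, precision), round(lon, precision))

class HeightHistory:
    def __init__(self):
        self._store = {}  # (user_id, location_key) -> height adjustment in mm

    def record(self, user_id: str, lat: float, lon: float, height_mm: int):
        self._store[(user_id, location_key(lat, lon))] = height_mm

    def lookup(self, user_id: str, lat: float, lon: float, default: int = 0) -> int:
        return self._store.get((user_id, location_key(lat, lon)), default)

history = HeightHistory()
# e.g. a high step at the user's home location, recorded once:
history.record("user_a", 51.50740, -0.12780, 120)
# a slightly different GPS reading of the same spot resolves to the stored value:
print(history.lookup("user_a", 51.50741, -0.12779))  # 120
```

Once the user is identified on a later approach, the stored value can be applied before (or instead of) a fresh sensor scan.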
In embodiments the sensor system may be used in addition to the user identification and retrieval of stored data. This may be appropriate if, for example, the driver identification system identifies the user only by the key fob; a given key fob may be assigned to the driver ID, but in some circumstances the key fob may have been given to a driver other than the user having that ID.
In embodiments, the sensor system may nevertheless be used; a returning user may have the same access requirements for a passenger door (and thus require the same adjustment), but in a new instance have luggage which requires a different height adjustment for a load space, for example. In a combined embodiment, the height adjustment may be (at least in part) done before the sensor system is used to scan the user. For example, the system may pre-adjust the height at a specific GPS position (see above) when a driver has left and locked the vehicle, for a quicker entry the next time the vehicle is used.
A communications device 112 of the vehicle may be used for communicating with remote servers, or for infrastructure data sources, or the like.
In other embodiments, the approaching user need not be identified as an authorised user before data relating to the user is acquired. In embodiments, the sensor system can be used to detect the presence of any potential user, identified or not, near the vehicle, and begin the process of acquiring data in order to assess whether a height adjustment is required. At a later point, the system will require that the user is either identified, or unlocks the vehicle, in order for access to be permitted. These embodiments may require an always-on sensor system, such as a low-power radar system, which may then trigger the image capture devices (or other sensor system for acquiring data).
Referring to Figure 3, the vehicle height adjustment control system 100 comprises a key fob identification module 11 configured to receive a wireless signal from the key fob 9 when it enters the vicinity of the vehicle 10. The wireless signal may be an infrared or radio wave signal, for example, and is unique to the particular key fob. The key fob identification module 11 is continually in a 'standby' state, in which it is ready to receive such a wireless signal.
The key fob identification module 11 compares the signals it receives with a key fob identifier (e.g. an identification or serial number) stored in a memory 12 in a control and processing module 320 of the control system 100. If there is a match, the key fob identification module 11 sends a correspondence signal to a processing module 321.
In response to the correspondence signal, the control system 100 is configured to initialise or 'start up' the image capture devices 6, 7 from their 'standby' state, via control signals sent to the devices from the control module. In this way, the quantity of scan data that is recorded and processed is minimised, as scan data is only recorded, collated, and processed if there is a positive indication that a user is approaching with the authorised key fob 9.
Once activated, the image capture devices 6, 7 may record scan data over a predefined time period or an indefinite time period. The scan data may comprise a single image or a series of images captured at regular or otherwise defined time intervals. A series of images may be recorded at relatively high frequency, for example 50 Hz, so that the series forms a 'moving image'. In other embodiments, the control system 100 activates the image capture devices 6, 7 to be in a continuous 'record' state, in which they actively capture a continuous stream of scan data. In turn, the processing module 321 continuously analyses this scan data.
The captured scan data (e.g. camera images in this embodiment, or RADAR or LIDAR signals, etc. in other embodiments) may be communicated directly to the processing module 321 for processing and analysis, or may be collated before forwarding to the processing module.
The processing and analysis performed by the processing module includes filtering out any spurious signals and background objects in the captured scene that do not relate to features of the user that are required for authorisation or body landmark (or other data acquisition) purposes. Once any unwanted elements of the collated scan data have been removed or corrected, the processing module 321 further processes the collated scan data in order to perform body landmark recognition to identify body (including facial) landmarks on the images of the user.
Each body landmark is a virtual point that represents a specific, predefined body part, such as the major joints belonging to virtual skeletal approximations of the anthropometric form of a potential user, and so by identifying a set of body landmarks the size, position and movement of the user can be determined. The multiple virtual points may define a three-dimensional point cloud in a coordinate system (e.g. a Cartesian coordinate system) intended to represent the external surfaces of a user and/or object. For example, Figure 5 provides a simplified illustration of an image 41 of a user in which user identifiers 42 representing body landmarks are defined over an area of the image. Additional landmarks may be determined on luggage of the user; for example, a simple edge detection algorithm may be used to determine the shape of the luggage, and landmarks applied to vertices of the detected shape, in order to determine a size. Alternatively, a convolutional neural network object detector may be used in order to detect luggage or other such objects. Vertices of such shapes may be used by the detector to infer size.
The control system 100 may alternatively or additionally use a 'flat world' system in its model approximation of the scanned area, typically only focussing on three-dimensional objects in the field of view of the image capture devices 6, 7 that may be mapped by one or more three-dimensional point clouds with respect to a 'flat' reference plane, typically the foreground within the field of view. This advantageously reduces the processing burden on the processing module 321 by filtering out differences in possible terrain heights in the scanned area, for example.
Embodiments of the invention may determine body landmarks in image data by known methods, for example by segmenting the image data to determine the section of the image containing a body-like feature, and then comparing the segmented section to stored body landmark references in order to assign landmarks to that feature in the image data. In another example, the landmark detection uses a convolutional neural network that has been trained to detect all the possible body parts in an image (e.g. left elbow, right elbow, left shoulder, right shoulder, etc.). Subsequently, this detected collection of body landmarks is joined together into likely association(s) of these landmarks, or 'skeletons', in order to track the user and/or the dynamic movement of the user.
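The second stage described above, joining detected body parts into 'skeletons', can be sketched as follows. The CNN detector itself is assumed and stood in by its output; the greedy nearest-anchor association, the part names and the distance threshold are all illustrative assumptions rather than the method prescribed here:

```python
import math

# Sketch of skeleton assembly: each detection is (part_name, x, y) in pixel
# coordinates, as might be emitted by a trained body-part detector. Detections
# of an 'anchor' part seed one skeleton each; every other part is attached to
# the skeleton whose anchor is nearest, if within a plausible pixel distance.

def assemble_skeletons(detections, anchor="left_shoulder", max_dist=80.0):
    anchors = [d for d in detections if d[0] == anchor]
    others = [d for d in detections if d[0] != anchor]
    skeletons = [{anchor: (x, y)} for (_, x, y) in anchors]
    for name, x, y in others:
        if not skeletons:
            break
        best = min(skeletons, key=lambda s: math.dist(s[anchor], (x, y)))
        if math.dist(best[anchor], (x, y)) <= max_dist:
            best[name] = (x, y)
    return skeletons

# Two people in frame: each elbow attaches to its nearer shoulder.
dets = [("left_shoulder", 0, 0), ("left_shoulder", 200, 0),
        ("left_elbow", 10, 30), ("left_elbow", 210, 25)]
print(len(assemble_skeletons(dets)))  # 2
```

A production system would associate parts using learned pairwise affinities rather than raw distance, but the grouping step has the same shape.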
In embodiments, the body landmark data may be used to identify the user, as well as collect data for use in the vehicle height adjustment (for instance, in embodiments in which there is no initial identification step by a key fob). For example, relative distances between each body landmark of a given user may be characterised as unique to that user, such that the set of body landmarks assigned to a user may be used to identify the user subsequently.
Once body landmarks have been identified within the image or series of images forming the scan data, the processing module 321 communicates with the memory 12 to identify the authenticated user by comparing the body landmarks defined in the scan data with stored user identifiers that represent body landmarks of a pre-registered user. In an example embodiment, the user identifiers are stored on the memory 12 in the form of still and/or moving images of body landmarks associated with a respective user.
In these embodiments of the invention, the processing module 321 analyses the captured scan data to identify any user(s) in the field of view of image capture devices 6, 7, using an algorithm that is able to recognise an anthropometric form and extrapolate it from the captured scan data. This allows the processing module to identify a potential human user from the captured scan data, and distinguish that potential human user from any other non-human or inanimate objects (e.g. lampposts, bollards, other animals, etc.) captured in the scan data by the image capture devices 6, 7.
The processing module may fit a virtual skeletal approximation to the potential user's anthropometric anatomy. The processing module assigns authorised user identifiers to the user corresponding to the body landmarks defined for that user. This may be sufficient in itself to enable subsequent identification of the user.
However, in an enhancement, the control system 100 may be used in a user registration process to determine the user's default gait pattern or walking style (e.g. bipedal gait cycle) as being the gait pattern of that user when they are walking towards the vehicle 10 in their typical manner. The determined gait pattern may therefore be used subsequently to identify the user.
One way to achieve this is for the user to record an authorised moving image of their typical gait pattern or walking style using the appropriate controls on image capture devices 6, 7. The scan data is processed and transmitted to the memory 12 where it is stored as an authorised user identifier for that particular user. Alternatively, the control system 100 may initially identify a user using another technique, such as facial recognition, and then associate a gait pattern with that user, thereby enhancing the data stored for that user and in turn improving subsequent identification.
Once a user has been identified by one or more of the embodiments described above, the processing module 321 determines one or more body landmarks of that user. These are then used to inform the vehicle height control system, which adjusts the height of the relevant portion of the vehicle in response. A signal, including a control parameter (here, a suspension control parameter) is sent from the control system 100 to the vehicle height control module 340, which in this embodiment instructs a vehicle suspension system 342, adjusting the vehicle's height from the ground.
In this and other embodiments, height adjustable air suspension systems are included in the vehicle 10 to allow the motorist to vary the ride height or ground clearance. This can be done for various reasons including giving better ground clearance over rough terrain, a lower ground clearance to improve performance and fuel economy at high speed, or for stylistic reasons.
Height adjustment in such systems can be achieved by any known means, such as air or oil compression used for the "springs" of the vehicle: when the pressure is varied, the vehicle body rises or lowers.
The suspension system 342 can for example be used to raise or lower the entire chassis of the vehicle, in order to provide a given access height, or to lower the vehicle at the rear, to allow easier access to a load space. In alternative embodiments, the system controlled by the height control module 340 may not be a suspension system; for example, the data acquired to determine the user feature may assess a given height or ability of the user, and may instruct a lowering of the seat to allow access, or additionally a raising of the seat, once the user has entered the vehicle, to a given height for that user on the basis of the determined user height/ability.
Following detection of the body landmarks, such as those shown in Figure 5, the relative distances between the landmarks can be determined in order to determine size or shape of the user, and/or their luggage. For example, in the embodiment shown in Figures 1 and 3, the cameras 6 and 7 can provide stereoscopic image data of the user, from which the depth of the user within the image data can be established. Using the ratio of this depth distance to the 'size' of the landmarks (for example, by the number of vertical pixels retrieved from image data between two given landmarks), the user's height (or an estimate for it) can be determined.
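Under a simple pinhole-camera assumption, the estimate described above reduces to similar triangles: the metric span between two landmarks is the stereo depth multiplied by their pixel separation and divided by the focal length in pixels. A minimal sketch, in which the focal length is an assumed calibration constant:

```python
# Pinhole-camera sketch of the stereo height estimate: given the depth of the
# user (recovered from stereo disparity) and the vertical pixel span between
# two body landmarks (e.g. head and ankle), similar triangles give the metric
# distance between them. focal_px is the camera focal length in pixels.

def landmark_distance_m(depth_m: float, pixel_span: float, focal_px: float) -> float:
    """Metric distance between two landmarks from depth and pixel separation."""
    return depth_m * pixel_span / focal_px

# A user 3 m away whose head-to-ankle span covers 600 px, with a 1000 px focal length:
print(landmark_distance_m(3.0, 600, 1000))  # 1.8 (metres)
```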
In an alternative, a single image capture device may be used. Using people of different heights for calibration, the following measurements are recorded for random walks around the vehicle: (a) height of the person; and (b) number of pixels extending between the bottom of the field of view of the camera (or the image) and the feet of the person.
By examining the ratio between (b) in image data captured of an approaching user, and the numbers of pixels that make up parts of the skeleton, which can be determined from the body landmark detection described above, an approximation of height can be established for a previously unseen user.
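One way this calibration might be realised is to treat the foot position (b) as a proxy for distance, and fit a least-squares line predicting the metres-per-pixel scale at each foot position; a sketch, with all sample values illustrative:

```python
# Sketch of the single-camera calibration described above. Each calibration
# sample records: pixels from the bottom of the frame to the person's feet
# (a distance proxy), the person's known height, and their skeleton's pixel
# span. The observed metres-per-pixel scale (height / span) is regressed
# against foot position, so a new user's height can be approximated from
# their foot position and skeleton span alone.

def fit_scale_model(samples):
    """samples: list of (foot_px, known_height_m, skeleton_px).
    Returns (a, b) for scale = a * foot_px + b, by ordinary least squares."""
    xs = [f for f, _, _ in samples]
    ys = [h / s for _, h, s in samples]  # observed metres-per-pixel
    n = len(samples)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def estimate_height(model, foot_px, skeleton_px):
    a, b = model
    return (a * foot_px + b) * skeleton_px

# Calibration walks by three people of known height (illustrative numbers):
model = fit_scale_model([(50, 1.80, 450), (150, 1.70, 340), (250, 1.60, 256)])
```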
In an embodiment, a sequence of body landmarks that belong to a single 'skeleton' (i.e. a user) may be tracked through time in pixel coordinates to determine intent of vehicle ingress.
When a user moves towards the vehicle with a certain body posture that is consistent with them wanting to board the vehicle (for example, it may be determined that a crouched posture is not consistent, for instance if the user is performing a maintenance task), the cluster of body landmarks expands over time. Conversely when the user is moving away from the vehicle this cluster shrinks.
In some circumstances, the user might remain stationary for brief periods of time in which case the cluster remains fairly constant. A particularly well-suited algorithm to learn and predict intent from such temporal transitions of cluster points is a long short term memory (LSTM) network, a type of recurrent neural network (RNN). The output of this algorithm can be used to assess if the recognised person or people around the vehicle (whether recognised by face, gait or other techniques) truly intend to enter the vehicle using a specific door.
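The application proposes an LSTM for this prediction; as a dependency-free stand-in, the sketch below classifies intent from the overall trend of the landmark-cluster area over time (sustained expansion, contraction, or roughly constant), with the tolerance an assumed tuning value:

```python
# Heuristic stand-in for the LSTM-based intent prediction: the bounding-box
# area of a user's landmark cluster, sampled per frame, expands as the user
# approaches the vehicle, shrinks as they move away, and stays roughly
# constant while they are stationary.

def classify_intent(areas, tol=0.02):
    """areas: landmark-cluster bounding-box areas per frame (in px^2)."""
    if len(areas) < 2:
        return "stationary"
    rel_change = (areas[-1] - areas[0]) / areas[0]
    if rel_change > tol:
        return "approaching"
    if rel_change < -tol:
        return "departing"
    return "stationary"

print(classify_intent([1000, 1100, 1250, 1400]))  # approaching
```

An LSTM would additionally learn the temporal patterns (rate of change, pauses, posture transitions) that this single-ratio heuristic ignores.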
In embodiments, additional factors may be used to determine intent. For example the control system may measure the timing of ratio or cluster expansion/contraction changes, for example the rate of change, or time between noted changes. These can be used to determine how fast the approach is, where delay could mean a change of intent of the person/user. The angle of the approach of the person may be monitored, in part determining the trajectory of the approach; for example a person could be passing by the vehicle without intending to board it.
Trajectories or angles consistent with such behaviour can be stored for the control system, or learned by a neural network for the control system.
Once one or more dimensions, heights or distances between body landmarks have been established for the user by the processing and control module 320, an instruction can be given to the vehicle height control module 340. For example, if a height below a given threshold is determined, then an instruction to activate the height adjustment system 342 can be given. If an approximate value for, for instance, a user's leg height has been determined, this can be used to determine which point in the range of adjustment of the height control system should be chosen for that circumstance. For example, the memory 12 or the database 310 may store a look-up table of vehicle height system settings depending on one or more input user body landmark values.
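Such a look-up might be sketched as follows; the leg-height bands and the setting names are illustrative assumptions, not values from this application:

```python
import bisect

# Illustrative look-up of a suspension setting from an estimated leg height:
# band boundaries (in metres) partition the input range, with one setting per
# band. A shorter user gets a lower vehicle for easier ingress.

LEG_HEIGHT_BANDS_M = [0.75, 0.90]                # band boundaries
SUSPENSION_SETTINGS = ["low", "normal", "high"]  # one setting per band

def suspension_setting(leg_height_m: float) -> str:
    return SUSPENSION_SETTINGS[bisect.bisect(LEG_HEIGHT_BANDS_M, leg_height_m)]

print(suspension_setting(0.70))  # low
```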
If the control system 100 determines that the user is carrying luggage, the height system 342 may be configured to raise or lower the floor of the vehicle 10 accordingly to facilitate a smooth vehicle ingress by the user. If the luggage (e.g. a suitcase) is being transported by the user along the ground, then the system can be configured to lower a load space height.
In an example embodiment, the range of values may be a continuous range of values, rather than a discrete value, of control parameters for adjusting the height of the vehicle 10. In an example embodiment, a suspension system 120 may be controllable for individual suspension of each wheel of the vehicle 10. Such a suspension system may be of any of a plurality of different types of possible vehicle suspension, such as gas-operated or spring-operated suspensions, shock absorbers, and/or linkages connecting the vehicle and the wheels which allow relative motion between the two.
In a modification to the process described above, the processing module 320 may be programmed with pre-determined acceptable threshold distances away from the centre of the vehicle 10. In this way, a two-zone system is used by the control system 100, where the first zone is defined by a first threshold distance away from the centre of the vehicle 10. The first threshold distance is used by the processing module to determine if an authorised user is within the vicinity of the vehicle 10 (rather than not sufficiently in range to determine features, or not sufficiently close to warrant determining that a user is close enough). A second zone is defined by a second threshold distance. The second threshold distance is used by the processing module to determine the body landmarks of the user, trajectory of the user (see below), and type of objects the user is carrying (e.g. luggage, bags, suitcases, and the like).
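The two-zone logic can be sketched as follows, with both threshold distances illustrative:

```python
# Sketch of the two-zone system: zone 1 gates whether a user is close enough
# to be worth tracking at all; zone 2 gates the finer analysis of body
# landmarks, trajectory and carried objects.

ZONE1_M = 10.0  # first threshold: presence in the vicinity of the vehicle
ZONE2_M = 4.0   # second threshold: landmark, trajectory and luggage analysis

def zone(distance_m: float) -> int:
    """0: out of range, 1: presence detected, 2: full analysis."""
    if distance_m <= ZONE2_M:
        return 2
    if distance_m <= ZONE1_M:
        return 1
    return 0

print(zone(6.0))  # 1
```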
In embodiments of the invention the vehicle systems are also configured to determine the intent of a user based on body landmarking analysis of the scan data obtained from the image capture devices 6, 7, and to trigger a vehicle height system action based on that determined intent. For example, the system may operate to determine whether the user intends to access a door of the vehicle 10, or a load space of the vehicle, for example by tracking the trajectory of approach of the user through monitoring the body landmarks identified for that user.
Alternatively, or in parallel, the control system 100 may use the identified body landmarks to detect that the user has made a pre-determined gesture indicating their intention to access the vehicle 10, and determine the user's intent on that basis.
In this embodiment, the vehicle action may only be triggered in the event that the user is found to be an authorised user. However, in other embodiments the vehicle action may be triggered for any user approaching the vehicle 10 with the key fob 9.
In order for the control system 100 to pre-empt an authorised user's intent, the control and processing module 320 includes a 'neural network' that identifies and analyses the body landmarks relating to the user. As the skilled reader will appreciate, artificial neural network architecture refers to a computer processing system that is designed to progressively self-improve its own performance over time, without requiring task-specific programming.
In a modification to the process described previously, the control module is configured to initiate an action on the vehicle 10 in dependence on the identity of the user.
In one example, the control system 100 determines whether the authorised user is carrying objects (e.g. luggage, shopping bags, etc.) by comparing the tracked movement of each of the user's body landmarks with a stored gait pattern associated with that user. By analysing the scan data the processing module 320 may be able to calculate any deviations in the user's movement relative to their usual default gait pattern. The neural network 'learning' of the control system 100 may enable it to determine that the user intends to access the vehicle's boot to deposit the object(s) that they are carrying. In this case, the control system 100 may automatically lower the rear vehicle height for loading, rather than a vehicle height for ingress.
As noted above, the processing module may track the movement of each body landmark to determine the trajectory of a user relative to the vehicle 10. In this respect, the processing module actively tracks one or more of the assigned body landmarks detected in the scan data.
The processing module is able to track the movement of each body landmark in real-time or near-real time, in order to determine the trajectory of the user relative to the vehicle 10.
For the complementary approach, in which the control system 100 detects from the scan data that a user has made a pre-determined gesture to signal their intent, it is noted that such gestures are identified through cross-referencing with a set of stored gestures, each having a respective associated vehicle action. Typical examples include hand waving or other hand signals. For example, raising one finger towards the vehicle 10 may correspond to lowering a load space vehicle height first, whereas raising two fingers may correspond to lowering a vehicle ingress height first.
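An illustrative mapping from recognised gestures to vehicle actions (the first two entries follow the examples above; the remaining entry, the names and the fallback are assumptions):

```python
# Hypothetical gesture-to-action table for the cross-referencing step:
# a recognised gesture keys into its associated vehicle action, with an
# unrecognised gesture defaulting to no action.

GESTURE_ACTIONS = {
    "one_finger_raised": "lower_load_space_height",
    "two_fingers_raised": "lower_ingress_height",
    "hand_wave": "no_action",
}

def action_for(gesture: str) -> str:
    return GESTURE_ACTIONS.get(gesture, "no_action")

print(action_for("two_fingers_raised"))  # lower_ingress_height
```

User-defined gestures registered through the HMI would simply add entries to such a table.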
It is noted that a gesture may be detected in a single image within the scan data, or in a series of images.
Aside from pre-determined gestures stored in the memory 12, a user can define their own gestures during a registration process. Specifically, through a vehicle HMI the user can control the image capture devices 6, 7 to capture images of the user while they make a particular gesture, and then that gesture can be recorded and associated with a specific vehicle action. In this way, the control system 100 is customisable. In an embodiment, registration of the gesture is done outside the vehicle; the external image capture devices or cameras are used to capture the gesture, as they are the same devices that will scan for the gestures using the given field of view. The system may be trained to determine a particular gesture for a given user. Over time the user may then perform the gesture differently, and this change can be tracked in order for the gesture not to be rejected.
It is not necessary to require a simultaneous authorisation of every image contained within the scan data; confirmation of an authorised user in any one of a series of images (and/or in combination with a positive identification of the correct key fob 9) may be adequate to trigger an action on the vehicle 10.
In this embodiment, to avoid lowering a vehicle height so much that the vehicle is obstructed by or contacts another object, such as a ground obstacle (e.g. a kerb), the processing module 320 is configured to analyse the scan data to detect one or more hazards in the area adjacent to the vehicle 10, and to inhibit lowering (or raising) if a hazard is detected. The control system 100 may use other on-board sensors in addition to the image capture devices to provide indications of possible hazards, such as ultrasonic proximity detectors associated with a park-assist system.
The processing module may be configured to compare features and characteristics of scan data other than those mentioned, including any biometric characteristic of a user. In each case, users can have the opportunity to store, at a registration phase, a user identifier that corresponds to the gesture or physical characteristic that they wish to use as an authorising characteristic each time they wish to trigger a vehicle action.
Although the above description assumes that the vehicle control system includes all functionality required to initiate a vehicle action based on a detected user intent, in other embodiments some functions may be provided externally to the vehicle, for example on a remote server or on a portable device. By way of example, the resource intensive task of processing scan data may be outsourced to a remote server, which may then return a simple indication of user intent to the vehicle control system.
As noted above, the control system 100 comprises a control and processing module 320 with processor 321 and memory 12. It also has associated communications functionality 323, for communicating with vehicle buses, or with networks via the communications device 112. The memory 12 may store readable instructions to instruct the processor to perform the functions of the monitoring system. The processor 321 is a representation of processing capability and may in practice be provided by several processors. A database 310 is provided, storing data as applicable. Elements shown within the module 320 use the processor 321 and the memory 12 to deliver functionality; for example, these elements can provide steps of embodiments of the invention such as determining the presence of a user of the vehicle in the vicinity of the vehicle; acquiring data relating to the user of the vehicle; processing the acquired data to determine a feature of the user; and generating, in dependence on the determined feature of the user, a control parameter for adjusting a height of at least a portion of the vehicle.
Figures 4a and 4b are diagrams illustrating, in flow chart form, steps of a method according to an embodiment of the invention, which may be performed by the control system 100. In this embodiment, the key fob 9 is used to authorise or identify the user (rather than body landmarks alone): key fob proximity is detected (402) and, if no key fob is present, the camera system is queried (404) and, if awake, is powered off (406). If the key fob is detected, the height control system is activated and the camera system switched on (408). The camera system is then used (410) to identify where the user is, and their identity by facial recognition (412). If the facial recognition system does not recognise the user, a further step (414) of checking that the key fob is detected is initiated. If no key fob is present, the camera system is powered off (406). If the key fob is present, the user detection and facial recognition steps (410, 412) are re-attempted.
In an embodiment, the first recognition step or system start ('wake-up') is triggered by detection of an authorised device, such as a smart phone, using a local wireless network such as Bluetooth. This first step can be used from distances of around 20 metres from the vehicle. The key fob is then used, from around 10 metres away, as an authorisation and/or identification step.
Once the user is recognised, the user's skeleton information is extracted (416) from the camera image data by connecting the body landmarks recognised by the neural network processes described above, here deep convolutional neural networks. The body landmarks and key fob position are then tracked (418) to infer or detect (420) a user intention. If the user intention is not detected or inferred sufficiently, the key fob detection step 414 is repeated.
Turning to Figure 4b, if the intention is detected the processing module undertakes height profiling and luggage detection for the user (422). Once these are established, a number of possible inputs (424) can be used to determine appropriate settings (426) for the vehicle height system, here a suspension adjustment system. For example: person anthropometrics may be determined as either Tall, Average or Short; luggage size as either Big, Medium or Small; and suspension position as either High, or Low. Thus a Short anthropometric and Big luggage input may result in a Low suspension output from a truth table.
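The truth table described above might be sketched as follows; only the Short anthropometric with Big luggage entry comes from the text, and every other entry is an illustrative assumption:

```python
# Hypothetical truth table combining the discretised inputs of step (424) into
# a suspension position at step (426). Only ("short", "big") -> "low" is given
# in the description; the remaining entries are illustrative.

SUSPENSION_TABLE = {
    ("short", "big"): "low",
    ("short", "medium"): "low",
    ("short", "small"): "low",
    ("average", "big"): "low",
    ("average", "medium"): "high",
    ("average", "small"): "high",
    ("tall", "big"): "low",
    ("tall", "medium"): "high",
    ("tall", "small"): "high",
}

def suspension_position(anthropometrics: str, luggage: str) -> str:
    return SUSPENSION_TABLE[(anthropometrics.lower(), luggage.lower())]

print(suspension_position("Short", "Big"))  # low
```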
This is then used to adjust the suspension (428). In this embodiment a further step of transmitting an aperture opening command is conducted (430), for example to trigger opening of a load space aperture (e.g. a boot) or a door of the vehicle. In addition, an obstacle detection step (432) is used to check that the opening of the door or boot will not result in the door or boot contacting an obstacle (in similar vein to the obstacle detection system for the vehicle height adjustment described above). If an obstacle is detected (434), this step is repeated. If no obstacle is detected, a final check is made as to whether the user intent is still valid (436), and if so the relevant aperture is opened (438).
In alternative embodiments, as noted above a variety of means may be used to recognise and/or identify a user, for example to determine if they are a registered user. If there is a circumstance in which factors do not correlate, for example if the key fob recognised does not match the gait pattern or facial recognition for the user, in embodiments there are also options for allowing the system to nevertheless perform the tracking and height adjustment as necessary. In an example, a registered user may have given their key fob to a family member, who has not registered their facial features with the control system. In addition, the facial recognition system may in some circumstances not complete its routine satisfactorily, for example if the user's face is obscured by clothing or accessories. In such conditions, one of the other factors described in above embodiments used to recognise, identify, or authorise users can be used as confirmation. For example, if the key fob is recognised, but the facial recognition is not, a gesture may be used to confirm the authorisation, or the landmark tracking may be used to identify the user (if those landmarks have been associated with the user, or the key fob). In another example, if the initial step is to recognise the user from tracking landmarks or facial recognition, the key fob or a gesture may be used as a secondary or confirmatory factor.
Many modifications may be made to the above examples without departing from the scope of the present invention as defined in the accompanying claims.

Claims (24)

  1. A control system configured to control a height management system for a vehicle, the control system comprising: a sensor system configured to: determine the presence of a user of the vehicle in the vicinity of the vehicle; and acquire data relating to the user of the vehicle; a processing module configured to process the acquired data to determine a feature of the user; and a control module configured to generate, in dependence on the determined feature of the user, a control parameter for adjusting a height of at least a portion of the vehicle.
  2. A control system according to Claim 1, wherein the control parameter is a suspension parameter for adjusting the suspension system for the vehicle.
  3. A control system according to Claim 1 or Claim 2, wherein the control module is configured to set the control parameter at one of a range of values for the vehicle height, in dependence on the determined feature of the user.
  4. A control system according to Claim 3, wherein the control module is configured to generate the control parameter for a height for the vehicle.
  5. A control system according to any preceding claim, wherein the feature of the user comprises one or more of: a tracked movement for the user; a body landmark feature for the user; a gait feature for the user; and a gesture of the user.
  6. A control system according to any preceding claim, wherein the sensor system is configured to acquire data for a region in the vicinity of the vehicle.
  7. A control system according to Claim 6, wherein the processing module is configured to analyse the data to determine one or more body landmarks of the user within the data.
  8. A control system according to Claim 7, wherein the acquired data is of the region in the vicinity of the vehicle over a time period, and wherein the processing module is configured to track the position of the or each body landmark relative to the vehicle over the time period.
  9. A control system according to Claim 8, wherein the processing module is configured to determine an intent of a user by comparing the tracked movement of each body landmark to a respective predetermined movement.
  10. A control system according to Claim 8 or Claim 9, wherein the processing module is configured to determine the user's intent by comparing the tracked movement of the or each body landmark with a gait pattern.
  11. A control system according to any of Claims 8 to 10, wherein the processing module is configured to analyse the tracked movement of the or each body landmark to determine a trajectory of the user relative to the vehicle.
  12. A control system according to Claim 11, wherein the processing module is configured to determine the user's intent by determining whether the trajectory of the user indicates that the user is approaching the vehicle.
  13. A control system according to Claim 12, wherein the processing module is configured to determine the user's intent by determining whether the trajectory of the user indicates that the user is approaching a body aperture closure member of the vehicle.
  14. A control system according to any preceding claim, wherein the sensor system comprises one or more image data capture devices.
  15. A control system according to Claim 14, wherein the processing module is configured to process data from the one or more image data capture devices to determine the feature of the user.
  16. A control system according to Claim 15, wherein the processing module is configured to process data from the image data capture devices to determine the presence of a user and optionally an identity of the user.
  17. A control system according to any preceding claim, wherein the sensor system is configured to: determine the presence of the user in relation to a first threshold distance from the vehicle; and acquire the data relating to the user in relation to a second threshold distance from the vehicle.
  18. A control system according to any preceding claim, wherein the sensor system is configured to, on determining the presence of the user in the vicinity of the vehicle, acquire data relating to the user of the vehicle from a storage device associated with the vehicle.
  19. A control system according to any preceding claim, wherein the acquired data comprises any one or more of the following: point cloud data; data acquired by a radar system; data acquired by a lidar system; data acquired by a Bluetooth system; data acquired by a WiFi system; data acquired by an RFID system; data acquired by a Near Field Sensing system; and data acquired by an ultrasonic system.
  20. A vehicle comprising a control system according to any preceding claim.
  21. A method of controlling a height management system for a vehicle, comprising: determining the presence of a user of the vehicle in the vicinity of the vehicle; acquiring data relating to the user of the vehicle; processing the acquired data to determine a feature of the user; and generating, in dependence on the determined feature of the user, a control parameter for adjusting a height of at least a portion of the vehicle.
  22. A computer program product executable on a processor so as to implement the method of Claim 21.
  23. A non-transitory computer readable medium loaded with the computer program product of Claim 22.
  24. A processor arranged to implement the method of Claim 21, or the computer program product of Claim 22.
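By way of illustration, the method of claim 21 maps onto a short pipeline. The sensor, processor, and controller objects and their method names below are hypothetical stand-ins for the claimed sensor system, processing module, and control module:

```python
def control_height(sensor, processor, controller):
    """Sketch of claim 21: detect the user, acquire data about them,
    determine a feature, and generate a height-adjustment parameter."""
    if not sensor.user_present():
        return None                                  # no user in the vicinity
    data = sensor.acquire_user_data()                # e.g. image or point-cloud frames
    feature = processor.determine_feature(data)      # e.g. gait, landmarks, gesture
    return controller.height_parameter(feature)      # e.g. a target ride height
```

Separating sensing, feature determination, and parameter generation in this way reflects the division of the control system into the three modules recited in claim 1.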
GB1903589.8A 2019-03-15 2019-03-15 Control system and method for controlling a vehicle system Active GB2582899B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1903589.8A GB2582899B (en) 2019-03-15 2019-03-15 Control system and method for controlling a vehicle system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1903589.8A GB2582899B (en) 2019-03-15 2019-03-15 Control system and method for controlling a vehicle system

Publications (3)

Publication Number Publication Date
GB201903589D0 GB201903589D0 (en) 2019-05-01
GB2582899A true GB2582899A (en) 2020-10-14
GB2582899B GB2582899B (en) 2021-09-29

Family

ID=66381033

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1903589.8A Active GB2582899B (en) 2019-03-15 2019-03-15 Control system and method for controlling a vehicle system

Country Status (1)

Country Link
GB (1) GB2582899B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220281481A1 (en) * 2021-03-02 2022-09-08 Toyota Jidosha Kabushiki Kaisha Autonomous vehicle, passenger vehicle, and vehicle transfer system
US11912308B2 (en) * 2021-03-02 2024-02-27 Toyota Jidosha Kabushiki Kaisha Autonomous vehicle, passenger vehicle, and vehicle transfer system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114083950B (en) * 2021-10-14 2024-06-07 的卢技术有限公司 Human-vehicle interaction mode, system and storage medium for getting on and off vehicles

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01233111A (en) * 1988-03-12 1989-09-18 Mazda Motor Corp Car height adjusting device for vehicle
US20060055129A1 (en) * 2004-09-14 2006-03-16 Toyota Jidosha Kabushiki Kaisha Height adjusting system for automotive vehicle
EP2921325A1 (en) * 2014-03-19 2015-09-23 WABCO GmbH Method and device for the remote control of a motor vehicle, electronic motor vehicle control system and a motor vehicle equipped with same
DE102017002634A1 (en) * 2017-03-17 2017-11-30 Daimler Ag Method for level control
DE102018004572A1 (en) * 2018-06-08 2018-11-15 Daimler Ag Method for leveling a vehicle
WO2019061502A1 (en) * 2017-09-30 2019-04-04 深圳市大疆创新科技有限公司 Vehicle suspension adjustment method and system, and adjustable suspended vehicle

Similar Documents

Publication Publication Date Title
GB2579539A (en) Vehicle controller
US11068069B2 (en) Vehicle control with facial and gesture recognition using a convolutional neural network
US10395457B2 (en) User recognition system and methods for autonomous vehicles
US10229654B2 (en) Vehicle and method for controlling the vehicle
KR102179864B1 (en) Methods for controlling parking procedures
KR101851155B1 (en) Autonomous driving control apparatus, vehicle having the same and method for controlling the same
KR100999084B1 (en) Video aided system for elevator control
EP3172589B1 (en) Method for mitigating radar sensor limitations with video camera input for active braking for pedestrians
US10776636B2 (en) Stereo camera-based detection of objects proximate to a vehicle
US10071706B2 (en) Method and apparatus for external operation of an actuator of a vehicle
US10345806B2 (en) Autonomous driving system and method for same
US10384641B2 (en) Vehicle driver locator
US10429503B2 (en) Vehicle cognitive radar methods and systems
US11100347B2 (en) Photometric stereo object detection for articles left in an autonomous vehicle
CN105966357A (en) Vehicle control method and device and vehicle
US20190299847A1 (en) Vehicle control device
US20170185763A1 (en) Camera-based detection of objects proximate to a vehicle
GB2582899A (en) Control system amd method for controlling a vehicle system
US11981288B2 (en) Activating vehicle components based on intent of individual near vehicle
US20200307514A1 (en) Vehicle control system, vehicle control method, and storage medium
CN110325384A (en) The vehicle of vehicle hanging adjusting method, system and adjustable hanging
US20210286973A1 (en) Facial recognition system for a vehicle technical field
CN111196316B (en) Control method and device for vehicle trunk and vehicle
US20230219483A1 (en) Method and apparatus for providing user interface based on projection
CN110239529A (en) Control method for vehicle, device and computer readable storage medium