GB2534163A - Vehicle interface device

Info

Publication number
GB2534163A
GB2534163A
Authority
GB
United Kingdom
Prior art keywords
vehicle
identified object
dependence
interface device
angular position
Prior art date
Legal status
Granted
Application number
GB1500590.3A
Other versions
GB201500590D0
GB2534163B
Inventor
Loeillet Jean-Jacques
Wells Andrew
Trevana Alan
Current Assignee
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1500590.3A
Publication of GB201500590D0
Priority to EP16700486.0A
Priority to PCT/EP2016/050653
Priority to US15/540,153
Publication of GB2534163A
Application granted
Publication of GB2534163B
Legal status: Active

Classifications

    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B60Q5/006 Arrangement or adaptation of acoustic signal devices automatically actuated indicating risk of collision between vehicles or with pedestrians
    • G08G1/015 Detecting movement of traffic to be counted or controlled with provision for distinguishing between two or more types of vehicles, e.g. between motor-cars and cycles
    • G08G1/163 Decentralised systems, e.g. inter-vehicle communication involving continuous checking
    • G08G1/164 Centralised systems, e.g. external to vehicles
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G10K11/34 Sound-focusing or directing, e.g. scanning using electrical steering of transducer arrays, e.g. beam steering
    • H04R2499/13 Acoustic transducers and sound field adaptation in vehicles
    • H04S7/30 Control circuits for electronic adaptation of the sound field


Abstract

A vehicle interface device 1 for generating an audible indication of a potential hazard 6, comprising: a plurality of electroacoustic transducers 30 for generating an audio object 29; and a processor 7. The processor 7 is configured to determine the angular position of the identified hazard 6 relative to the vehicle, generate a control signal to cause the electroacoustic transducers 30 to generate an audio object 29, and modify the control signal to progressively change a perceived spatial location of the audio object 29 to represent changes in the determined relative angular position of the identified hazard 6. The vehicle interface device 1 may also comprise a display [13, fig 2] on which a visual indication of the potential hazard 6 may be generated. The device 1 may further comprise vibration generators [38, fig 10] within the driver's seat [32, fig 10] for creating a haptic indication of the potential hazard 6.

Description

Intellectual Property Office Application No. GB1500590.3 RTM Date June 2015
The following terms are registered trade marks and should be read as such wherever they occur in this document: FlexRay
Intellectual Property Office is an operating name of the Patent Office www.gov.uk/ipo
VEHICLE INTERFACE DEVICE
TECHNICAL FIELD
The present disclosure relates to a vehicle interface device for generating an audible indication of a potential hazard; to a vehicle comprising a vehicle interface device; and to a method of generating an audible indication of a potential hazard.
BACKGROUND
The activity of driving a vehicle can prove stressful. According to the driving situation, the driver can be distracted or have a high level of mental workload. These situations can be classified in a number of different ways, for example:
* Driving environment (urban, motorway, parking, etc.);
* Conditions (weather, traffic, noise, light, etc.); and
* Status of the vehicle (speed, level of fuel, presence of warning, etc.).
To reduce driver workload, there are numerous developments to the Advanced Driver Assistance System (ADAS) provided in modern vehicles. These developments result in the generation of new information to be conveyed to the driver. This presents a risk of visual clutter and may render the information difficult to understand or potentially incomprehensible.
It is envisaged that future ADAS will be connected with their environment in order to identify the driving context and to help driver awareness. This will generate additional information to be conveyed to the driver of the vehicle.
It is against this background that the present invention has been conceived. At least in certain embodiments the present invention seeks to provide a vehicle having an interface device which overcomes or ameliorates at least some of the shortcomings.
SUMMARY OF THE INVENTION
Aspects of the present invention relate to a vehicle interface device for generating an audible indication of a potential hazard; to a vehicle comprising a vehicle interface device; and to a method of generating an audible indication of a potential hazard.
According to a further aspect of the present invention there is provided a vehicle interface device for generating an audible indication of a potential hazard, the vehicle interface device comprising: a plurality of electroacoustic transducers for generating an audio object; and a processor for controlling said electroacoustic transducers; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; generate a control signal to cause the electroacoustic transducers to generate an audio object; and modify the control signal to progressively change a perceived spatial location of the audio object to represent changes in the determined relative angular position of the identified object.
The vehicle interface device outputs the audio object to alert the vehicle driver of an identified object which represents a potential hazard. By progressively changing the audio object, the vehicle interface can provide a substantially continuous (i.e. substantially uninterrupted) indication of the angular position of the identified object in relation to the vehicle. The audio object can be controlled to mirror changes in the determined position of the identified object in relation to the vehicle. The perceived spatial location of the audio object can be changed progressively to track the determined angular position of the identified object. The vehicle interface can, at least in certain embodiments, facilitate identification and assessment of the hazard posed by the identified object. The location of the audio object, as perceived by a vehicle occupant (for example the driver of the vehicle), is referred to herein as the perceived spatial location of the audio object.
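Purely by way of illustration (and not forming part of the disclosure), the progressive change of the perceived spatial location could be realised by slewing a perceived azimuth towards the determined bearing of the identified object by a limited amount on each control cycle; the function and parameter names in the following Python sketch are assumptions:
    def update_perceived_azimuth(current_azimuth_deg, target_azimuth_deg, max_step_deg=5.0):
        # Shortest signed angular difference, normalised to [-180, 180)
        diff = (target_azimuth_deg - current_azimuth_deg + 180.0) % 360.0 - 180.0
        # Limit the change per control cycle so the perceived location moves progressively
        step = max(-max_step_deg, min(max_step_deg, diff))
        return (current_azimuth_deg + step) % 360.0

    # Example: the determined bearing of the hazard moves from 30 to 90 degrees over
    # successive cycles and the perceived spatial location of the audio object tracks it.
    azimuth = 30.0
    for bearing in (40.0, 60.0, 90.0):
        azimuth = update_perceived_azimuth(azimuth, bearing)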
The vehicle interface device can provide an awareness of a potential hazard in terms of one or more of the following: position, trajectory, nature, criticality, etc. The audio object generated by the vehicle interface device can convey information, for example using one or more of the following strategies: frequency and/or volume to represent criticality of the potential hazard; position (i.e. the location of the potential hazard); motion (trajectory of the potential hazard); and controlling the acoustic signature to indicate the nature of the identified object. At least in certain embodiments, these strategies can provide an intuitive interface capable of providing improved awareness of a potential hazard in order to facilitate prevention of an accident.
The audio object can represent the position of the identified object in relation to the vehicle. The position of the identified object can be defined using a coordinate system, for example defined in relation to a virtual reference point. The speed and/or acceleration of the identified object can be determined by monitoring changes in the position of the identified object with respect to time, or the rate of change of the position of the identified object. The processor can be configured to determine the absolute speed and/or acceleration of the identified object (using the vehicle speed and/or direction of travel); or can be configured to determine the relative speed and/or acceleration of the identified object.
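For illustration only, estimating the relative speed and acceleration by finite differences of successive positions could resemble the Python sketch below; the function name, coordinate convention and sample data are assumptions rather than the processing performed by the device:
    import math

    def relative_speed_and_acceleration(positions, dt):
        # positions: sequence of (x, y) coordinates of the identified object relative to the
        # vehicle, sampled at a fixed interval dt (seconds); finite differences give an
        # estimate of the relative speed and acceleration.
        (x0, y0), (x1, y1), (x2, y2) = positions[-3:]
        v1 = math.hypot(x1 - x0, y1 - y0) / dt
        v2 = math.hypot(x2 - x1, y2 - y1) / dt
        return v2, (v2 - v1) / dt

    # Example: an object closing at roughly constant speed.
    speed, acceleration = relative_speed_and_acceleration([(20.0, 5.0), (18.0, 5.0), (16.0, 5.0)], 0.1)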
The object data can comprise position data indicating the position of the object relative to the vehicle; and/or identification data indicating the nature (i.e. type or classification) of the object. The processor can be configured to receive said object data from sensor means. The vehicle interface device can be disposed in a vehicle and the sensor means can be provided on said vehicle. The sensor means can be configured to monitor a region external to the vehicle to identify the identified object. The sensor means can comprise one or more of the following set: one or more ultrasonic sensors; one or more capacitive sensors; one or more optical sensors; and one or more radar sensors. The sensor means can, for example, form part of an advanced driver aid system (ADAS). By way of example, ADAS can comprise one or more of the following: a camera (surround view, lane departure warning, park assistance); stereo camera (pedestrian detection); long-range radar (Adaptive Cruise Control); short/medium-range radar (blind spot detection).
Alternatively, or in addition, the vehicle can comprise a receiver for receiving the object data. The receiver can be a wireless receiver for receiving a wireless transmission, for example a radio frequency (RF) transmission. The object data can be transmitted from another vehicle as part of a vehicle-to-vehicle (V2V) communication; or from infrastructure as part of an infrastructure-to-vehicle (I2V) communication. The object data could be transmitted from another vehicle to indicate the position of that vehicle or the position of another vehicle. The object data could be transmitted by the infrastructure to indicate the position and/or movements of one or more other vehicles in the vicinity of the vehicle.
The audio object is formed from a plurality of mechanical waves having an audible frequency. The electroacoustic transducers can be a set of loudspeakers disposed in the occupant compartment, for example forming part of an audio entertainment system in the vehicle.
The audio object is defined spatially within an audio scene (also referred to as a spatial audio object). The audio scene can, for example, correspond to an occupant compartment of a vehicle. The spatial position of the audio object can, for example, be defined in two dimensions (X and Y coordinates, for example corresponding to a longitudinal axis and a transverse axis of a vehicle), or in three dimensions (X, Y and Z coordinates, for example corresponding to a longitudinal axis, a transverse axis and a vertical axis of a vehicle). The audio object can provide an audible indication to an occupant of the vehicle of the relative angular position of the identified object. The processor can be in the form of an audio renderer. The spatial position of the audio object can be controlled to indicate the determined relative angular position of the identified object. The processor can be configured to progressively change the spatial location of the audio object to represent changes in the determined relative angular position of the identified object. By varying the location of the audio object within the occupant compartment, the perceived source of the alert changes. In use, the electroacoustic transducers can generate a multi-dimensional audio object within an occupant compartment of a vehicle.
The processor can be configured to determine a trajectory of the identified object in dependence on the object data. The processor can be configured to modify the audio object in dependence on the determined trajectory. The spatial location of the audio object could be modified to travel along a virtual trajectory which at least substantially matches the determined trajectory of the identified object.
The processor can be configured to determine a time to collision in dependence on the object data. The processor can be configured to modify the audio object in dependence on the determined time to collision. For example, the frequency and/or volume of an audio object can be altered in dependence on the determined time to collision.
The processor can be configured to determine a nature of the identified object in dependence on the object data. The processor can be configured to modify the audio object in dependence on the determined nature of the identified object. For example, the acoustic signature (or pattern) could be modified in dependence on the determined nature of the identified object. A first acoustic signature can be output if the identified object is identified as another vehicle. The first acoustic signature can, for example, be the sound of a vehicle horn. A second acoustic signature can be output if the identified object is identified as a cyclist. The second acoustic signature can, for example, be the sound of a bicycle bell. A third acoustic signature can be output if the identified object is identified as a pedestrian. The third acoustic signature can, for example, be the sound of voices. A fourth acoustic signature can be output if the identified object is identified as an animal. The fourth acoustic signature can, for example, be the sound of a dog barking.
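By way of example only, the selection of an acoustic signature from the determined nature of the identified object could be implemented as a simple look-up, as in the following Python sketch; the classification labels and file names are assumptions made for the illustration:
    # Illustrative mapping from the determined nature of the identified object to an acoustic signature.
    ACOUSTIC_SIGNATURES = {
        "vehicle": "vehicle_horn.wav",
        "cyclist": "bicycle_bell.wav",
        "pedestrian": "voices.wav",
        "animal": "dog_bark.wav",
    }

    def select_acoustic_signature(object_nature, default="generic_alert.wav"):
        # Fall back to a generic alert when the nature of the object cannot be determined.
        return ACOUSTIC_SIGNATURES.get(object_nature, default)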
The processor can be configured to determine a speed of the identified object in dependence on the object data. The processor can modify the audio object in dependence on the determined speed of the identified object.
The processor can be configured to determine a hazard level (criticality) in dependence on the object data. The processor can be configured to modify the audio object in dependence on the determined hazard level.
The processor can be configured to modify the audio object by changing one or more of the following parameters: amplitude, frequency, volume, acoustic pattern, signature, and pattern form. The processor can be configured to modify the audio object to alter the perceived loudness of the audio object.
The processor can be configured to receive driver status data. The driver status data can, for example, be generated by driver monitoring means, for example to determine gaze direction and/or head pose. The driver monitoring means can, for example, be in the form of an optical camera coupled to an image processing unit. The processor can be configured to change the audio object in dependence on the gaze direction and/or head pose of the driver.
For example, the spatial position of the audio object could be varied in dependence on the gaze direction and/or head pose of the driver.
The vehicle interface device can also be suitable for generating a visual indication of a potential hazard, the vehicle interface device comprising: a display configured to extend around at least a portion of a perimeter of an occupant compartment in a vehicle; and a processor for controlling said display; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; generate a control signal to cause the display to display a visual indicator at a display position in said display corresponding to the determined relative angular position of the identified object; and modify the control signal to progressively change the display position of the visual indicator within the display at least substantially to match changes in the relative angular position of the identified object.
The vehicle interface device can also be suitable for generating a haptic indication of a potential hazard, the vehicle interface device comprising: at least one haptic generator configured to generate a haptic signal; and a processor for controlling said haptic generator; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the object relative to the vehicle; generate a control signal to cause the haptic generator to output a haptic signal for providing an indication of the determined relative position of the object; and modify the control signal to progressively change the generated haptic signal to represent changes in the relative angular position of the identified object.
According to a further aspect of the present invention there is provided a vehicle comprising a vehicle interface device as described herein.
According to a further aspect of the present invention there is provided a method of generating an audible indication of a potential hazard, the method comprising: determining an angular position of an identified object relative to a vehicle; generating an audio object for providing an indication of the determined relative angular position of the identified object; and progressively changing a perceived spatial location of the audio object to represent changes in the determined relative angular position of the identified object.
The angular position of the identified object can be determined in dependence on object data received from sensor means.
The perceived spatial location of the audio object can be changed progressively to track the determined angular position of the identified object.
The method can comprise monitoring a driver of the vehicle to generate driver data. The perceived spatial position of the audio object can be changed in dependence on the driver data.
The method can comprise determining a trajectory of the identified object; and modifying the audio object in dependence on the determined trajectory.
The method can comprise determining a time to collision. The audio object can be modified in dependence on the determined time to collision.
The method can comprise determining a nature of the identified object. The audio object can be modified in dependence on the determined nature of the identified object.
The audio object can be modified by changing one or more of the following parameters: amplitude, frequency, volume, acoustic pattern, signature, and pattern form.
The method can comprise generating a visual indication of a potential hazard, the method comprising: displaying a visual indicator at a display position corresponding to the determined relative angular position of the identified object; and progressively changing the display position of the visual indicator at least substantially to match changes in the relative angular position of the identified object.
The method can comprise generating a haptic indication of a potential hazard, the method comprising: generating a haptic signal for providing an indication of the determined relative position of the object; and progressively changing the generated haptic signal to represent changes in the relative angular position of the identified object.
The term processor is used herein to refer to one or more electronic processors. Similarly, the term system memory is used herein to refer to one or more storage devices. The processor can be a general purpose computational device configured to execute a set of software instructions to perform the method(s) described herein.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
An embodiment of the present invention will now be described, by way of example only, with reference to the accompanying figures, in which:
Figure 1 shows a schematic representation of a vehicle incorporating an interface device in accordance with an embodiment of the present invention;
Figure 2 shows a block diagram providing an overview of the vehicle interface device shown in Figure 1;
Figure 3 shows a more detailed representation of the vehicle interface device shown in Figure 1;
Figure 4 shows a schematic representation of the display device of the vehicle interface device within the occupant compartment of the vehicle;
Figure 5 illustrates operation of the display device shown in Figure 4;
Figure 6 shows a lateral view of the interior of the occupant compartment including the display device shown in Figure 4;
Figures 7A and 7B illustrate operation of the display device of the vehicle interface device;
Figure 8 illustrates changes in the field of view of the driver in dependence on vehicle speed;
Figure 9 shows a schematic representation of an audio device in accordance with an embodiment of the present invention;
Figure 10 shows a vehicle seat incorporating a haptic device in accordance with an embodiment of the present invention;
Figures 11A-C show the contact patch on the vehicle seat shown in Figure 10 based on percentile weight measurements;
Figures 12A-C illustrate the operation of the vehicle interface device in a first operating scenario; and
Figures 13A-C illustrate the operation of the vehicle interface device in a second operating scenario.
DETAILED DESCRIPTION
A vehicle interface device 1 in accordance with an embodiment of the present invention will now be described. The vehicle interface device 1 functions as a human machine interface (HMI) for a vehicle 2 shown schematically in Figure 1. In the present embodiment, the vehicle 2 is an automobile comprising an occupant compartment in the form of a cabin 3 having a front windshield 4, left and right side windows 5L, 5R and a rear windshield (not shown). The vehicle interface device 1 can be implemented in other vehicle types.
The vehicle interface device 1 is operable to generate an alert to notify a driver of the vehicle 2 that a potential hazard has been identified. The alert comprises a directional component to notify the driver of the angular position of the potential hazard in relation to the vehicle 2. The alert in the present embodiment comprises three modalities:
1. Vision - patterns using colours, pulse and motion are displayed in the visual structure of the occupant compartment;
2. Sound - directional object-based sound associated with the nature of the identified object; and
3. Haptic - directional and applied via the seat (or the steering wheel), for example in the form of vibration or contact.
The potential hazard typically takes the form of an object 6 identified by the vehicle interface device 1. The identified object 6 can be either stationary or moving. The vehicle interface device 1 is configured to provide the driver with an indication of the position of the identified object 6 in relation to the vehicle 2. The vehicle interface device 1 can be configured to differentiate between different types of objects to determine the nature of the potential hazard, for example to determine if the potential hazard is a pedestrian (potentially differentiating between an adult and a child), a cyclist, a vehicle, a truck, an animal or an inanimate object. An image processing algorithm can, for example, be applied to image data to determine the nature of the identified object. The form of the alert can be modified in dependence on the determined nature of the potential hazard. The vehicle interface device 1 could identify more than one potential hazard at any time and the techniques described herein could be performed simultaneously for the plurality of identified hazards. Alternatively, the vehicle interface device 1 could be configured to prioritise one of the identified hazards over the others, for example in dependence on the nature of the potential hazards identified by the vehicle interface device 1. The vehicle interface device 1 could be configured to output an alert only relating to the potential hazard identified as having the highest priority.
As shown in Figure 1, the vehicle interface device 1 comprises a processor 7 coupled to system memory 8. The processor 7 is an electronic processor and the system memory 8 comprises an electronic memory device. The processor 7 is configured to execute a set of software instructions held in the system memory 8 to implement a control algorithm in accordance with an aspect of the present invention. The processor 7 is configured to receive signals from a plurality of on-board vehicle systems to identify the identified object 6 and to determine if it represents a potential hazard. The vehicle interface device 1 is shown schematically in Figures 2 and 3. The processor 7 receives a first input signal SIN1 from an advanced driver aid system (ADAS) 9, a second input signal SIN2 from a driver monitoring system 10, a third input signal SIN3 from a user identification system 11, and a fourth input signal SIN4 from a vehicle information system 12 (for example by accessing data signals published to a controller area network (CAN) bus or FlexRay). The processor 7 outputs a first output signal SOUT1 to a display device 13; a second output signal SOUT2 to an audio device 14; and a third output signal SOUT3 to a haptic device 15. The display device 13, the audio device 14 and the haptic device 15 are each operable in dependence on the output signals SOUT1-3 to generate the respective visual, audio and haptic alert(s) for the driver of the vehicle 2.
The ADAS 9 is coupled to sensor means for monitoring a region surrounding the vehicle 2. As shown in Figure 1, the sensor means is configured to monitor a first operating region R1 disposed in front of the vehicle 2; second and third operating regions R2, R3 disposed on the left and right sides of the vehicle 2 respectively; and a fourth operating region R4 disposed behind the vehicle 2. The sensor means is in the form of a forward-facing radar sensor 16, left-facing and right-facing cameras 17, 18, and a rear-facing camera 19. The radar sensor 16 comprises a radio frequency transceiver for transmitting a radio signal and receiving a signal reflected by the identified object 6. The radar sensor 16 outputs reflected signal data for analysis by a first signal processor (typically associated with the radar sensor 16) to identify any objects 6 disposed in the first operating region R1. The radar sensor 16 can be a long-range radar sensor, for example provided as part of an Adaptive Cruise Control (ACC) system; or a short/medium-range radar sensor, for example provided as part of a blind spot detection system. The cameras 17, 18, 19 in the present embodiment are optical cameras which output image data for analysis by a second signal processor (not shown) to identify any objects 6 disposed in the second, third and fourth operating regions R2, R3, R4. The cameras 17, 18, 19 can be provided on the vehicle 2 to implement one or more of the following functions: surround view, lane departure warning and park assistance. One or more of the cameras 17, 18, 19 can be in the form of a stereo camera, for example to detect a pedestrian.
The first and second signal processors identify object(s) 6 proximal to the vehicle 2 within the operating zones R1-4 and output the positional data D1 in the form of x, y coordinates defining the position of the identified object 6 relative to a virtual reference point on the vehicle 2. It will be understood that the sensor means can comprise different types of sensors, such as ultrasonic sensors and/or capacitive sensors. Moreover, the sensor means could be remote from the vehicle 2, for example in another vehicle which is in communication with the vehicle 2 (vehicle-to-vehicle (V2V) communication).
With reference to Figure 3, the positional data D1 is used to determine an angular position of the identified object 6 in relation to the vehicle 2 (referred to herein as the "relative angular position"). The relative angular position in the present embodiment corresponds to an angle (bearing) measured relative to a longitudinal axis of the vehicle 2. The first and second signal processors also estimate a time to collision (ttc), for example estimated based on the measured position and trajectory of the identified object 6. The first and second signal processors output time to collision data D2 associated with each identified object 6. By comparing the reflected signal data and the image data with reference files stored in a database in the system memory 8, the first and second signal processors also classify the nature of each object 6 and output nature of object data D3. In the present embodiment, the first and second signal processors are incorporated into the ADAS 9, but they could be implemented as separate processing modules. In respect of each object 6, the ADAS 9 outputs the first input signal SIN1 comprising the positional data D1, the time to collision data D2, and the nature of object data D3 to the processor 7.
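For illustration only, the following Python sketch shows one way the positional data D1 could yield a relative angular position (bearing) and a simple time to collision estimate; the coordinate convention and function names are assumptions, not the processing performed by the ADAS 9:
    import math

    def relative_angular_position(x, y):
        # Bearing of the identified object in degrees, measured from the longitudinal
        # axis of the vehicle (x forward, y to the right of the virtual reference point).
        return math.degrees(math.atan2(y, x))

    def time_to_collision(range_m, closing_speed_mps):
        # Simple range / closing-speed estimate; None when the object is not closing.
        if closing_speed_mps <= 0.0:
            return None
        return range_m / closing_speed_mps

    # Example: an object 20 m ahead and 5 m to the right, closing at 4 m/s.
    bearing = relative_angular_position(20.0, 5.0)        # about 14 degrees
    ttc = time_to_collision(math.hypot(20.0, 5.0), 4.0)   # about 5.2 seconds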
The positional data D1, the time to collision data D2, and the nature of object data D3 each relate to the identified object 6 identified as a potential hazard. These different data sets are referred to herein as object data. The object data can be output from the sensor means disposed on the vehicle 2. Alternatively, or in addition, the processor 7 could be configured to receive the object data from an external source, such as infrastructure (infrastructure-to-vehicle (I2V)) or another vehicle.
The driver monitoring system 10 comprises a driver monitoring camera (not shown). An image processing unit receives image data from the driver monitoring camera and assesses a driver distraction level and a driver tiredness (fatigue) level. The image processing unit can, for example, implement an image-processing algorithm to determine a driver alertness level, for example based on head pose and/or gaze direction. The driver monitoring system 10 can also monitor the driver workload, for example with reference to the vehicle speed and/or steering angle. A driver capability can also be determined by the driver monitoring system 10 to provide an estimate of an expected reaction time by the driver at any given time. The driver monitoring system 10 monitors the current driver workload, driver distraction, driver tiredness (fatigue) and driver capability. The driver monitoring system 10 can comprise a driver-facing camera to monitor driver behaviour, for example based on face recognition algorithms. The driver monitoring system 10 can also monitor driver inputs, including steering angle and/or pedal angles. The driver monitoring system 10 outputs the second input signal SIN2 which includes driver monitoring data D4 comprising an estimated driver reaction time. The driver monitoring system 10 can also output data generated by the image processing unit defining the head pose and/or the gaze direction of the driver. The second input signal SIN2 could optionally also comprise information relating to the driver's vision capabilities, for example short or long distance vision and/or colour perception. The driver capability can be determined as T0 (the reaction time for a specific situation, for example obtained from a look-up table) + ΔT (an additional time calculated based on driver monitoring of workload and/or distraction and/or tiredness), or by multiplying by a predefined percentage of reaction rate (for example, a fatigued individual may take x% longer to react, where x is a predefined number greater than zero). The driver monitoring system 10 can optionally also utilise auditory and/or vision information relating to a particular driver. The auditory information can define a driver's auditory capabilities; and the vision information can define a driver's vision capabilities, for example indicating the driver's long/short sighted ability and/or colour perception ability. The auditory and/or vision information could be measured or could be input by the driver.
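A minimal Python sketch of the reaction-time estimate described above is given below; combining the additive and percentage strategies in a single function, and the example values, are assumptions made purely for illustration:
    def estimated_reaction_time(t0_s, delta_t_s=0.0, fatigue_percent=0.0):
        # T0 is the baseline reaction time for the situation (e.g. from a look-up table);
        # delta_t_s adds time for workload, distraction and tiredness; fatigue_percent
        # applies the percentage-of-reaction-rate scaling described above.
        return (t0_s + delta_t_s) * (1.0 + fatigue_percent / 100.0)

    # Example: a 1.2 s baseline, 0.3 s added for distraction, and a 20% fatigue penalty.
    reaction_time = estimated_reaction_time(1.2, 0.3, 20.0)   # 1.8 s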
The third input signal SIN3 can be output from the user identification system 11 to identify the driver of the vehicle. The processor 7 additionally receives vehicle dynamics data from the vehicle information system 12. The fourth input signal SIN4 comprises vehicle speed data D5, but can include other vehicle dynamics parameters, such as the steering angle. The processor 7 can also receive driver data D6 which, as described herein, can be used to estimate one or more physical characteristics of the driver. The processor 7 can also receive driver head position data D7 indicating the position of the driver's head. The processor 7 can also receive driver clothing data D8 characterising the clothing worn by the driver, for example the thickness of a garment and/or the number of layers. The driver head position data D7 and the driver clothing data D8 can be generated in dependence on image processing of image data generated by a driver-facing camera (not shown). The outputs from the display device 13, the audio device 14 and the haptic device 15 can be modified in dependence on the estimated physical characteristics. For example, the haptic output generated by the haptic device 15 can be controlled in dependence on a pressure zone on the driver seat estimated in dependence on the measured weight of the driver.
The processor 7 applies a control algorithm to the input signals SIN1-4 to generate the output signals SOUT1-3 for controlling operation of the display device 13, the audio device 14 and the haptic device 15 respectively. The processor 7 thereby functions as a HMI controller for the vehicle 2. The configuration of the display device 13, the audio device 14 and the haptic device 15 will now be described in more detail.
The display device 13 is configured to output a visual indicator to notify a vehicle occupant of a potential hazard. The display device 13 is configured to extend substantially around the interior perimeter of the occupant compartment 3. As shown in Figure 4, the display device 13 comprises a front panel 20, left and right lateral panels 21L, 21R, and a rear panel 22.
Thus, the display device 13 provides a 360° visual display extending around the occupant(s) of the vehicle 2. The front panel 20 is disposed adjacent to a base of the front windshield 4, for example across the top of a dashboard; the left and right lateral panels 21L, 21R are disposed adjacent to a base of the respective left and right side windows 5L, 5R; and the rear panel 22 is disposed adjacent to a base of the rear windshield (not shown). In the present embodiment, the display device 13 also extends vertically upwardly along at least a portion of each A-pillar 23. This provides a blind-spot indicator function, as illustrated in Figure 4. Alternatively, or in addition, the display device 13 can extend vertically upwardly along at least a portion of additional pillars within the vehicle 2, for example the B-pillar and/or the C-pillar and/or the D-pillar.
In the present embodiment, the display device 13 comprises a matrix of light emitting elements 24 arranged to form a substantially continuous optical track or band around the interior perimeter of the occupant compartment 3. The light emitting elements 24 each comprise one or more light emitting diodes (LEDs). As shown in Figures 5 and 6, the light emitting elements 24 each have a hexagonal shape and are arranged to form a plurality of substantially continuous chains extending around the occupant compartment 3. It will be appreciated that the light emitting elements 24 could have different shapes, for example rectangular, circular or elliptical. A sequence of light emitting elements 24 each having an elliptical shape is illustrated in Figure 7A by way of example. The light emitting elements 24 can each be controlled independently, for example to change the display colour and/or illumination level (intensity). A plurality of said light emitting elements 24 can be selectively illuminated to form a visual indicator in the form of a visual pattern P to represent the identified object 6. The light emitting elements 24 could be configured to reflect light onto the windows of the vehicle cabin. The colour and/or form and/or illumination level of the pattern P can be modified in dependence on an estimated hazard level (criticality), as illustrated in Figure 7B. An illumination area of each light emitting element 24 within the pattern P could be controlled to create a halftone image. It will be understood that the display device 13 is not limited to light emitting diodes (LEDs) and could use other display technologies, for example an organic light emitting diode (OLED), electroluminescence or back lighting technology. The display device 13 could comprise one or more projectors for projecting the visual indicator onto an interior of the occupant compartment 3 or onto a window of the vehicle.
With reference to Figure 3, the first output signal SOUT1 comprises a display position signal S1, a display colour signal S2 and a display form signal S3 which control the display position, display colour and display form of the visual pattern P. The display position defines the position within the display device 13 that the visual pattern P is displayed. The display position signal S1 is generated as a function of the positional data defining the position of the identified object 6 and the vehicle speed. The display position signal S1 can also use information from the driver monitoring system 10, such as a gaze direction or a head pose.
In certain embodiments, the driver monitoring system 10 could also provide information relating to one or more physical characteristics of the driver. The one or more physical characteristic(s) can, for example, relate to a determined height or weight of the driver. The one or more physical characteristics can, for example, be estimated based on a seat position, a setting of an infotainment system (personalisation) or a measurement. The one or more physical characteristics could be estimated, for example based on a percentile for a given weight.
The display colour relates to the colour of the visual pattern P and can be changed to indicate a determined risk level associated with the potential hazard. By way of example, the visual pattern P can be displayed in yellow when the determined risk level posed by the identified object 6 is relatively low; or red when the determined risk level posed by the potential hazard is relatively high. The light emitting elements 24 can display a green colour to indicate that the vehicle interface device 1 is in operation but no potential hazards have been identified. The display colour signal S2 is generated as a function of the determined time to collision (ttc) and the estimated driver reaction time. The display form signal S3 is generated as a function of the determined nature of the identified object 6. The display form signal S3 controls the display form (shape) of the visual pattern P to represent different types of objects 6. For example, a first visual pattern P can be displayed to represent a cyclist, and a second visual pattern P can be displayed to represent another vehicle. The size of the visual pattern P can be controlled in dependence on the display form signal S3. The different visual patterns P can be predefined or generated dynamically, for example derived from the object data.
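By way of example only, the selection of the display colour from the determined time to collision and the estimated driver reaction time could resemble the following Python sketch; the thresholds are assumptions rather than values taken from the disclosure:
    def display_colour(time_to_collision_s, estimated_reaction_time_s):
        # No identified hazard: green indicates the interface device is in operation.
        if time_to_collision_s is None:
            return "green"
        # The smaller the margin between the time to collision and the driver's
        # estimated reaction time, the higher the determined risk level.
        if time_to_collision_s <= 2.0 * estimated_reaction_time_s:
            return "red"      # risk level relatively high
        return "yellow"       # risk level relatively low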
The processor 7 is configured to control the display position signal S1 such that changes in the display position of the visual pattern P are substantially continuous to provide a spatially uninterrupted indication of changes in the relative angular position of the identified object 6.
To indicate changes in the relative angular position of the identified object 6, the visual pattern P travels progressively within the display device 13 to provide a scrolling effect providing an uninterrupted (seamless) representation of changes in the relative angular position of the identified object 6. The display position of the visual pattern P can change in a horizontal direction to indicate changes in the relative angular position of the identified object 6. The size and/or illumination level of the visual pattern P could also be controlled, for example to indicate a determined range to the identified object 6 and/or a determined size of the identified object 6. Alternatively, or in addition, the display position of the visual pattern P can change in a vertical direction to indicate that the identified object 6 is travelling towards and/or away from the vehicle 2. For example, the visual pattern P can travel upwardly within those vertical portions of the display device 13 disposed on the A-pillar 23 (and optionally also the B-pillar and/or the C-pillar and/or the D-pillar) to indicate that the identified object 6 is travelling towards the vehicle 2.
The processor 7 can also be configured to control the size and/or position and/or illumination level of the visual pattern P depending on the field of vision of the driver. The field of vision of the driver is illustrated in Figure 5 by a central line of vision, with near-peripheral and mid-peripheral regions represented by concentric circles. Significantly, the field of vision of the driver tends to change in dependence on the speed of the vehicle 2. As illustrated in Figure 8, the field of vision α of the driver narrows as the speed of the vehicle 2 increases. The processor 7 can be configured to adjust the display position of the visual pattern P and/or the size (lateral extent) of the visual pattern P in conjunction with increasing vehicle speed. The prominence of the visual pattern P can thereby be increased with vehicle speed.
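For illustration, one simple way to increase the prominence of the visual pattern P with vehicle speed is a linear interpolation of its lateral extent, as in the Python sketch below; the end points and speed range are assumptions:
    def pattern_width_deg(vehicle_speed_kph, width_low_deg=20.0, width_high_deg=40.0, speed_max_kph=130.0):
        # Linearly widen the visual pattern P with vehicle speed so its prominence grows
        # as the driver's field of vision narrows.
        fraction = min(max(vehicle_speed_kph / speed_max_kph, 0.0), 1.0)
        return width_low_deg + fraction * (width_high_deg - width_low_deg)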
The processor 7 can take into account additional control factors. For example, the processor 7 can use the driver size as a further input to determine the display position of the visual pattern P. For example, if the driver is small (necessitating a forward seating position), the processor 7 can translate the display position of the visual pattern P towards the front of the occupant compartment to improve visibility of the visual pattern P. The driver size can be determined by processing the image data received from the driver monitoring camera. Alternatively, the driver size can be estimated based on the position of the driver seat and/or a measured weight of the driver.
The audio device 14 is an object-based audio system configured to generate a multi-dimensional audio alert in dependence on the second output signal SOUT2 generated by the processor 7. The audio alert conveys positional information and/or movement information relating to the identified object 6. The audio device 14 is configured to output an acoustic pattern which is audible within the occupant compartment 3. In the present embodiment, the audio device 14 comprises a rendering station 28 configured to generate an object-based audio output which can combine different sound elements with metadata to form an audio object 29 (or a plurality of audio objects 29). The audio object 29 is an acoustic event perceived in space that may or may not occupy the same location as a loudspeaker. The audio object 29 has physical parameters that are manipulated to provide a change in the perceived location of the audio object 29 representing changes to the state of the identified (physical) object 6. This is different from the "phantom centre" experienced when a listener sits between two stereo loudspeakers because the centre image cannot be manipulated as a result of external factors.
The metadata utilised by the rendering station 28 is generated in dependence on the determined position of the identified object 6, for example the determined relative angular position (heading) and/or range of the identified object 6. The rendering station 28 can control the perceived spatial location of the audio object 29 in three-dimensions. The perceived spatial location of the audio object 29 conveys information relating to the position of the identified object 6 in relation to the vehicle 2. By way of example, the perceived spatial location of the audio object 29 can provide an indication of the relative angular position of the identified object 6. Moreover, the perceived spatial location of the audio object 29 can be changed to represent changes in the relative angular position of the identified object 6. One or more characteristics of the audio object 29 can also be controlled to convey information relating to the identified object 6. For example, a sound effect transmitted in said audio object 29 can be selected to indicate the nature of the identified object 6. The amplitude of the audio object 29 can be controlled to indicate a range to the identified object 6.
As illustrated in Figure 9, the audio object 29 is centred on the determined position of the driver's head 31. The metadata defines how the sound element should be reproduced in the sound stage (i.e. within the occupant compartment 3), by defining its position in a three-dimensional (3-D) field using vector information, audio level information, etc. The rendering station 28 is coupled to a plurality of acoustic transducers disposed within the occupant compartment 3. In the present embodiment the acoustic transducers are in the form of loudspeakers 30. In the illustrated arrangement, the rendering station 28 is coupled to four (4) loudspeakers 30A-D. As described herein, the rendering station 28 maps the second output signal SOUT2 from the processor 7 to an audio program and generates separate audio control signals SA1-4 for each loudspeaker 30A-D. The audio control signals SA1-4 control the audio output from each loudspeaker 30A-D which combine to form the audio object 29. The rendering station 28 generates the information in real-time for each audio object 29 depending on the configuration of the loudspeakers 30 within the occupant compartment 3.
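Purely as an illustration of object-based spatial placement, the following Python sketch derives per-loudspeaker gains from a target azimuth using a simple constant-power gain law; this is a stand-in for the rendering performed by the rendering station 28, and the loudspeaker azimuths and spread value are assumptions:
    import math

    # Assumed loudspeaker layout: azimuths in degrees, positive to the right of the driver.
    LOUDSPEAKER_AZIMUTHS = {"30A": -45.0, "30B": 45.0, "30C": 135.0, "30D": -135.0}

    def panning_gains(target_azimuth_deg, spread_deg=90.0):
        # Each loudspeaker gain falls off with angular distance from the target azimuth,
        # then the set is normalised for constant total power so the combined output is
        # perceived from roughly the target direction.
        raw = {}
        for name, azimuth in LOUDSPEAKER_AZIMUTHS.items():
            diff = abs((target_azimuth_deg - azimuth + 180.0) % 360.0 - 180.0)
            raw[name] = math.cos(math.radians(min(diff, spread_deg) / spread_deg * 90.0))
        norm = math.sqrt(sum(g * g for g in raw.values())) or 1.0
        return {name: g / norm for name, g in raw.items()}

    # Example: an identified object at 60 degrees to the right weights loudspeaker 30B most heavily.
    gains = panning_gains(60.0)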
The resulting audio object 29 conveys directional information to an occupant of the vehicle 2, typically the driver. The audio device 14 can thereby provide an audio indication of the determined relative angular position of the identified object 6. Moreover, by modifying the perceived spatial location of the audio object 29, the rendering station 28 can convey information relating to the movement of the identified object 6 relative to the vehicle 2. It will be appreciated that the loudspeakers 30A-D may be different from each other, for example not having equal or equivalent frequency bandwidth. To implement the object-based system, the characteristics and/or bandwidth limitations of each loudspeaker 30A-D are available to the processor 7 to make appropriate adjustments to the distribution of acoustic energy.
The rendering station 28 could, for example, be configured to change the spatial location of the audio object 29 in dependence on the determined angular position of the identified object 6 relative to the vehicle 2. The spatial relationship between the vehicle 2 and the identified object 6 can be used to define the perceived spatial location of the audio object 29. In particular, the perceived angular position of the audio object 29 can correspond to the angular position of the identified object 6 in relation to the vehicle 2. The audio device 14 could optionally implement a sound shower such that the audio object 29 can be heard only in the driver area, thereby reducing disturbance to other occupants.
The perceived vertical location of the audio object 29 can be varied to convey additional information, for example relating to the size or nature of the identified object 6. The perceived vertical location of the audio object 29 could be relatively low to indicate that the identified object 6 is relatively small (for example to indicate that a child has been identified); and relatively high to indicate that the identified object 6 is relatively large (for example to indicate that an adult or a cyclist has been identified). Equally, the perceived vertical location of the audio object 29 could be adjusted to indicate range (distance), a relatively high perceived vertical location representing a relatively large range to the identified object 6 and a relatively low perceived vertical location representing a relatively small range to the identified object 6.
With reference to Figure 3, the second output signal SOUT2 comprises an audio direction signal S4, an audio amplitude signal S5, an audio frequency signal S6, and an audio signature signal S7. The audio direction signal S4 is generated as a function of the positional data defining the relative angular position of the identified object 6 and also a determined position of the driver's head 31. The position of the driver's head 31 can be determined by the driver monitoring system 10, or could be estimated (for example based on the position of the driver seat). The audio amplitude signal S5 is generated as a function of the determined time to collision (ttc), the driver reaction time and the driver auditory capabilities. The audio frequency signal S6 can be generated as a function of the determined range to the identified object 6. The audio signature can be defined to facilitate determination of the nature of the identified object 6 by the occupant of the vehicle 2. The audio signature signal S7 can be generated as a function of the determined nature of the identified object 6. For example, a first audio signature can be output if the identified object 6 is identified as another vehicle (such as the sound of a vehicle horn); a second audio signature (such as the ringing of a bicycle bell) can be output if the identified object 6 is identified as a cyclist; a third audio signature (such as the sound of voices) can be output if the identified object 6 is identified as a pedestrian; and a fourth audio signature (such as the sound of a dog barking) can be output if the identified object 6 is identified as an animal. One or more of the aforementioned audio signatures can be used. The second output signal SOUT2 can optionally also comprise an audio signature signal generated as a function of the determined nature of the identified object 6.
The haptic device 15 is configured to generate a haptic alert in dependence on the third output signal SOUT3 generated by the processor 7. The haptic alert is configured to convey positional information and/or movement information relating to the identified object 6. The haptic device 15 is associated with a driver seat 32 disposed in the occupant compartment 3. As shown in Figure 10, the driver seat 32 comprises a seat cushion 33, a seat squab 34 and a head rest 35. A weight sensor 36 is incorporated into the seat cushion 33 to weigh the driver. A haptic effect generating device 37 is incorporated into the seat squab 34 (and optionally also the seat cushion 33) to output a haptic pattern which is sensed by the driver.
In the present embodiment the haptic effect generating device 37 comprises an array of vibration generators 38 that can be controlled independently of each other. The vibration generators 38 can, for example, each comprise an electric actuator (such as a piezoelectric actuator), an eccentric rotating element, or a vibratory transducer. In the illustrated arrangement, the haptic effect generating device 37 comprises nine (9) vibration generators 38. The haptic device 15 comprises a haptic control unit 39 configured to control operation of said vibration generators 38 in dependence on the third output signal SOUT3. Specifically, the haptic control unit 39 is configured to output haptic control signals to control each vibration generator 38 independently. It will be understood that fewer than, or more than, nine (9) vibration generators 38 can be incorporated into the haptic effect generating device 37. Alternatively, or in addition, the haptic effect generating device 37 could comprise one or more of the following: an ultrasonic transducer (for example haptic touchless technology), an electric actuator (such as a piezoelectric actuator) and a vibratory transducer.
The haptic effect generating device 37 is controlled in dependence on the third output signal SOUT3 selectively to energize one or more of the vibration generators 38 to generate a haptic pattern. The haptic pattern is controlled to convey information to the driver of the vehicle 2 relating to the identified object 6, for example to indicate a relative angular position and/or relative angular movement of the detected object 6. As shown in Figure 10, the vibration generators 38 are arranged in the seat squab 34 in a 3 x 3 matrix consisting of three columns Y1-3 and three rows X1-3. By selectively activating one or more vibration generators 38 in each column Y1-3, the haptic effect generating device 37 can convey positional information to the driver. For example, by activating the vibration generators 38 in the middle column Y2 a haptic alert can be generated to indicate that the identified object 6 is directly behind the vehicle 2. By activating the vibration generators 38 in the left column Y1, a haptic alert can be generated to indicate that the identified object 6 is to the left of the vehicle 2. Conversely, by activating the vibration generators 38 in the right column Y3, a haptic alert can be generated to indicate that the identified object 6 is to the right of the vehicle 2. By sequencing activation of the vibration generators 38 and/or controlling the magnitude of the vibrations, the haptic alert can convey the relative angular position of the identified object 6. For example, the vibration generators 38 in the central column Y2 could initially be activated to indicate that the identified object 6 is behind the vehicle 2; and then the vibration generators 38 in the right column Y3 can be activated to indicate that the identified object 6 is passing to the right of the vehicle 2. By sequentially activating the vibration generators 38, the haptic feedback can be moved in order to indicate the relative angular position of the identified object 6. It will be understood that providing more vibration generators 38 allows increased resolution of the haptic alert, for example more precisely to indicate the relative angular position of the identified object 6. The vibration generators 38 could be provided in lumbar supports provided on each side of the seat squab 34 to provide additional directional information. Additionally, or alternatively, the vibration generators 38 could be incorporated into the seat cushion 33, for example arranged in a longitudinal direction and/or a transverse direction.
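By way of example, the mapping from the relative angular position of the identified object 6 to a column of vibration generators 38 could be sketched in Python as follows; the sector boundaries are assumptions made for the illustration:
    def haptic_column(relative_bearing_deg):
        # Relative bearing in degrees from the longitudinal axis of the vehicle,
        # positive to the right; normalise to [-180, 180) before selecting a column.
        bearing = (relative_bearing_deg + 180.0) % 360.0 - 180.0
        if -150.0 <= bearing <= -30.0:
            return "Y1"   # identified object to the left of the vehicle
        if 30.0 <= bearing <= 150.0:
            return "Y3"   # identified object to the right of the vehicle
        return "Y2"       # identified object roughly ahead of or directly behind the vehicle

    # Example: an object passing from directly behind to the right of the vehicle.
    columns = [haptic_column(b) for b in (180.0, 150.0, 120.0)]   # ['Y2', 'Y3', 'Y3']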
The amplitude and/or frequency of the haptic pattern can be controlled to convey additional information, such as a hazard level (criticality) posed by the identified object 6. For example, the amplitude of the haptic pattern could be increased if the processor 7 determines that the identified object 6 is a particular hazard. The hazard level can, for example, be calculated based on the determined time to collision (ttc) and the reaction time of the driver. The amplitude and/or frequency of the haptic pattern could be modified to indicate the form of the identified object 6.
With reference to Figure 3, the third output signal SOUT3 comprises a haptic direction signal S8, a haptic amplitude signal S9 and a haptic magnitude signal S10. The haptic direction signal S8 is generated as a function of the positional data defining the relative angular position of the identified object 6. The haptic amplitude signal S9 and the haptic magnitude signal S10 are generated as functions of the determined time to collision (ttc) and optionally also the determined reaction time of the driver.
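By way of illustration, the relationship between the positional data, the time to collision and the components of the third output signal could be sketched as follows. The 2 s nominal reaction time, the linear hazard-level mapping and the signal structure shown are assumptions made for this sketch and are not values disclosed in the embodiment.

```python
from dataclasses import dataclass

@dataclass
class HapticOutputSignal:
    """Illustrative stand-in for the third output signal SOUT3: a direction
    component (S8), an amplitude component (S9) and a magnitude component (S10)."""
    direction_deg: float   # S8: relative angular position of the identified object
    amplitude: float       # S9: 0.0-1.0, derived from the hazard level
    magnitude: float       # S10: 0.0-1.0, derived from the hazard level

def hazard_level(time_to_collision_s: float, reaction_time_s: float = 2.0) -> float:
    """Hazard level rises as the time to collision approaches the driver's
    reaction time (assumed relationship, for illustration only)."""
    margin = time_to_collision_s - reaction_time_s
    return min(1.0, max(0.0, 1.0 - margin / reaction_time_s))

def build_sout3(bearing_deg: float, time_to_collision_s: float) -> HapticOutputSignal:
    level = hazard_level(time_to_collision_s)
    return HapticOutputSignal(direction_deg=bearing_deg, amplitude=level, magnitude=level)

if __name__ == "__main__":
    # Object 4 s from collision, 30 degrees to the right of straight behind.
    print(build_sout3(30.0, 4.0))   # amplitude/magnitude = 0.0 (comfortable margin)
    print(build_sout3(30.0, 2.5))   # amplitude/magnitude = 0.75 (approaching reaction time)
```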
The processor 7 can also be configured to control operation of the haptic effect generating device 37 in dependence on a determined contact between the driver and the seat squab 34. Using weight percentiles, a contact pattern between the driver and the seat cushion 33 and the seat squab 34 can be estimated. By way of example, a first contact pattern 40A for a 5th percentile is shown in Figure 11A; a second contact pattern 40B for a 50th percentile is shown in Figure 11B; and a third contact pattern 40C for a 95th percentile is shown in Figure 11C. The haptic control unit 39 can be configured to control the haptic pattern in dependence on the weight of the driver measured by the weight sensor 36. For example, the location of the haptic pattern output by the haptic effect generating device 37 can be modified in dependence on the measured weight of the driver. Alternatively, or in addition, the amplitude of the haptic pattern can be modified in dependence on the measured weight of the driver. At least in certain embodiments, this control strategy can help to ensure that the haptic pattern provides feedback which is commensurate with a determined hazard level. For example, if the measured weight indicates that the driver is in the 95th percentile, the amplitude of the haptic pattern can be lower than for a driver in the 5th percentile due to the increased contact between the driver and the seat cushion 33 and the seat squab 34. The processor 7 can be configured to characterise additional parameters relating to the driver, for example relating to the clothing being worn. The processor 7 can perform image processing on image data received from a driver-facing camera to identify the clothing, for example to determine if the driver is wearing a jacket. This processing could be an extension of a facial recognition algorithm. The processor 7 can adjust the magnitude of the haptic pattern in dependence on the determined clothing characterisation.
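A minimal sketch of the weight-dependent amplitude adjustment is given below; the weight thresholds and scale factors are illustrative assumptions and are not taken from the embodiment.

```python
def amplitude_scale_for_weight(weight_kg: float) -> float:
    """Scale the haptic amplitude down for heavier occupants, who have a larger
    contact area with the seat cushion and squab. The thresholds and scale
    factors below are illustrative assumptions only."""
    if weight_kg < 55.0:      # roughly a 5th-percentile occupant
        return 1.0            # least seat contact: full amplitude
    if weight_kg < 90.0:      # roughly a 50th-percentile occupant
        return 0.8
    return 0.6                # roughly a 95th-percentile occupant: most contact

def adjusted_amplitude(base_amplitude: float, weight_kg: float) -> float:
    return base_amplitude * amplitude_scale_for_weight(weight_kg)

if __name__ == "__main__":
    print(adjusted_amplitude(1.0, 50.0))   # 1.0 -> lighter driver, stronger pattern
    print(adjusted_amplitude(1.0, 100.0))  # 0.6 -> heavier driver, attenuated pattern
```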
The haptic effect generating device 37 could utilise an array of ultrasonic transducers in place of (or in addition to) the vibration generators 38. The ultrasonic transducers could be incorporated into the seat cushion 33 and/or the seat squab 34 and/or the head rest 35. In this arrangement, one or more of said ultrasonic transducers can be activated to generate the haptic pattern which is sensed by the driver. The haptic pattern can be controlled by selective activation of one or more of said ultrasonic transducers. In use, the ultrasonic transducers could be configured to generate an ultrasonic signal that is transmitted through the air and is felt by the driver. Thus, the ultrasonic transducers are operable to transmit the haptic pattern when the driver is not in direct contact with the driver seat 32.
In an alternative arrangement, the haptic pattern could be generated by controlling an airflow incident on the driver of the vehicle 2. The haptic effect generating device 37 could utilise one or more air vents to control the airflow to generate the haptic pattern. The one or more air vents could be incorporated into the driver seat 32, for example into a head rest; and/or into a door of the vehicle 2; and/or into a B-pillar of the vehicle 2. The one or more air vents could be selectively opened/closed to control airflow incident on the driver, for example on the back of the driver's head, neck or shoulders. The resulting haptic pattern can be used to notify the driver of the relative angular position and/or relative movement of the identified object 6. The extent to which each air vent is opened could be controlled to control the strength of the incident airflow. Alternatively, or in addition, the haptic effect generating device 37 could comprise an adjustable nozzle (not shown) which can be controlled to change the direction of the incident airflow. An operating speed of a fan unit for generating the airflow could be controlled. The incident airflow could be pulsed. The pulsed airflow could be controlled to convey additional information, such as the nature of the identified object 6 and/or a hazard level. For example, the frequency of the pulses could be increased to signal a reduction in the range to the identified object 6.
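By way of illustration, the pulsed-airflow behaviour could be sketched as a simple mapping from the measured range to a pulse frequency; the frequency band and the 20 m working range used below are assumptions made for the example.

```python
def pulse_frequency_hz(range_m: float,
                       min_hz: float = 0.5,
                       max_hz: float = 5.0,
                       max_range_m: float = 20.0) -> float:
    """Map the measured range to the identified object to an airflow pulse
    frequency: the closer the object, the faster the pulses. The frequency
    band and 20 m working range are illustrative assumptions."""
    clamped = min(max(range_m, 0.0), max_range_m)
    # Linear interpolation: max_range_m -> min_hz, 0 m -> max_hz.
    return max_hz - (max_hz - min_hz) * (clamped / max_range_m)

if __name__ == "__main__":
    for r in (20.0, 10.0, 2.0):
        print(f"range {r:5.1f} m -> pulse at {pulse_frequency_hz(r):.2f} Hz")
```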
The operation of the vehicle interface device 1 will now be described with reference to a first example illustrated in Figures 12A-C. The first example relates to a scenario in which the identified object 6 is a cyclist approaching from behind the vehicle 2. The cyclist is initially only visible in a vehicle side mirror, but is detected by the rear-facing camera 19. The ADAS 9 determines the relative angular position of the cyclist and generates the first input signal SIN1 for the processor 7. In dependence on the first input signal SIN1, the processor 7 generates the second output signal SOUT2 which includes the audio direction signal S4, the audio amplitude signal S5, the audio frequency S6 and the audio signature S7. The rendering station 28 generates the audio object 29 which provides an initial notification to the driver of the vehicle 2 that the cyclist has been detected. The perceived spatial location of the audio object 29 serves also to notify the driver of the relative angular position of the cyclist. In the present example, the audio object 29 is initially generated behind and to the right of the driver to notify the driver that the cyclist is approaching from this direction, as illustrated in Figure 12A. The processor 7 also generates the first output signal SOUT1, which includes the display position signal S1, the display colour signal S2 and the display form signal S3. The visual pattern P is displayed by the display device 13 at a display position corresponding to the determined relative angular position of the cyclist. As illustrated in Figures 12B and 12C, the display position changes progressively as the relative angular position of the cyclist changes. The visual pattern P thereby sweeps along the right lateral panel 21R and into the front panel 20 to provide a continuous indication of the relative angular position of the cyclist. As shown in Figure 12B, when the cyclist is partially obscured, the visual pattern P is displayed on the inside of the A-pillar 23 to ensure that the driver is aware of their continued presence proximal to the vehicle 2. The visual pattern P is centred on the closest portion of the identified object 6 to the vehicle 2. The perceived spatial location of the audio object 29 is adjusted continuously to match the movement of the cyclist in relation to the vehicle 2. In the present example, the perceived spatial location of the audio object 29 travels forward on the right hand side of the driver as the cyclist passes the vehicle 2. The volume of the audio object 29 is controlled based on the measured range to the cyclist.
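A minimal sketch of the audio-object placement used in this example is given below, assuming a unit-circle position around the driver and a simple inverse-range volume law; both assumptions are introduced for the sketch and are not taken from the embodiment.

```python
import math

def audio_object_parameters(bearing_deg: float, range_m: float,
                            reference_range_m: float = 2.0) -> dict:
    """Place the audio object at the object's relative bearing and scale its
    volume with the measured range. Assumed conventions: bearing 0 = directly
    behind the vehicle, positive = to the right; position is (x right, y
    forward) on a unit circle around the driver; volume follows a 1/range law."""
    bearing_rad = math.radians(bearing_deg)
    position = (math.sin(bearing_rad), -math.cos(bearing_rad))  # behind -> negative y
    volume = min(1.0, reference_range_m / max(range_m, reference_range_m))
    return {"position_xy": position, "volume": volume}

if __name__ == "__main__":
    # Cyclist first detected behind and to the right, 15 m away...
    print(audio_object_parameters(30.0, 15.0))
    # ...then alongside on the right at 1.5 m: the object moves forward and gets louder.
    print(audio_object_parameters(90.0, 1.5))
```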
The processor 7 outputs the third output signal SOUT3 to control operation of the haptic device 15. The haptic device 15 operates throughout the sequence to provide an additional communication means. In particular, the vibration generators 38 in the central column Y2 are activated initially when the cyclist is detected behind the vehicle 2. The intensity of the vibrations is adjusted based on the measured range to the cyclist. As the cyclist approaches on the right hand side of the vehicle 2, the vibration generators 38 in the right column Y3 generate vibrations which progressively increase in magnitude while those generated by the vibration generators 38 in the central column Y2 progressively decrease. When the cyclist is alongside the vehicle 2, only those vibration generators 38 in the right hand column Y3 are active. The magnitude of the vibrations decreases as the cyclist moves further away from the vehicle 2. The vibration generators 38 thereby generate a haptic pattern which also conveys relative angular position and movement information to the driver.
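The passing-cyclist sequence can be illustrated with the following sketch, which combines the angular cross-fade with a range-dependent intensity; the sample bearings, ranges, column directions and working range are invented purely for the illustration.

```python
# Passing-cyclist sequence: (bearing in degrees from straight behind, range in metres).
# The samples and constants below are invented to illustrate the sweep described above.
SAMPLES = [(0.0, 12.0), (20.0, 8.0), (45.0, 4.0), (70.0, 2.0), (90.0, 1.5), (100.0, 3.0)]

def column_weight(bearing_deg: float, centre_deg: float, spread_deg: float = 90.0) -> float:
    """Cross-fade weight for a column whose nominal direction is centre_deg."""
    return max(0.0, 1.0 - abs(bearing_deg - centre_deg) / spread_deg)

def intensity(range_m: float, max_range_m: float = 15.0) -> float:
    """Vibration intensity grows as the measured range shrinks."""
    return max(0.0, 1.0 - range_m / max_range_m)

if __name__ == "__main__":
    # Y2 (directly behind, 0 deg) fades out while Y3 (to the right, 90 deg) fades in,
    # then Y3 decays as the cyclist pulls away.
    for bearing, rng in SAMPLES:
        y2 = column_weight(bearing, 0.0) * intensity(rng)
        y3 = column_weight(bearing, 90.0) * intensity(rng)
        print(f"bearing {bearing:5.1f} deg, range {rng:4.1f} m -> Y2 {y2:.2f}, Y3 {y3:.2f}")
```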
The processor 7 can be configured to control output of the first, second and third output signals SOUT1-3 to control activation of the display device 13, the audio device 14 and the haptic device 15 to convey different information. The display device 13, the audio device 14 and the haptic device 15 can be activated independently of each other to convey information relating to different identified hazards, for example in dependence on a determined priority of a plurality of potential hazards or in dependence on an identified region in which the hazard is identified. The processor 7 can control an activation sequence of the display device 13, the audio device 14 and the haptic device 15, for example depending on a personal preference setting or depending on a determined urgency. By way of example, the haptic device 15 can be activated when there is an imminent hazard to provide direct feedback to the driver.
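By way of illustration, the selection of feedback channels according to a determined urgency could be sketched as follows; the time-to-collision thresholds, the channel ordering and the preference flag are assumptions made for the example.

```python
def select_modalities(time_to_collision_s: float,
                      prefers_haptic: bool = True) -> list:
    """Choose which of the three feedback channels to activate for a hazard.
    The thresholds and ordering are illustrative assumptions, not values taken
    from the embodiment."""
    if time_to_collision_s < 2.0:
        # Imminent hazard: drive all three channels, haptic first for direct feedback.
        return ["haptic", "audio", "display"]
    if time_to_collision_s < 5.0:
        return ["audio", "display"] + (["haptic"] if prefers_haptic else [])
    # Distant hazard: an ambient visual cue is sufficient.
    return ["display"]

if __name__ == "__main__":
    print(select_modalities(1.2))          # ['haptic', 'audio', 'display']
    print(select_modalities(3.5, False))   # ['audio', 'display']
    print(select_modalities(8.0))          # ['display']
```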
The operation of the vehicle interface device 1 will now be described with reference to a second example illustrated in Figures 13A-C. The second example relates to another scenario in which the identified object 6 is a cyclist which the vehicle 2 is overtaking. The cyclist is visible through the front windshield 4 and is detected by the radar sensor 16. The ADAS 9 determines the relative angular position of the cyclist and generates the first input signal SIN1 for the processor 7. In dependence on the first input signal SIN1, the processor 7 generates the second output signal SOUT2 which includes the audio direction signal S4, the audio amplitude signal S5, the audio frequency S6 and the audio signature S7. The rendering station 28 generates the audio object 29 to provide an initial alert to notify the driver of the vehicle 2 that the cyclist is approaching on the right hand side of the vehicle 2, as illustrated in Figure 13A. The processor 7 also generates the first output signal SOUT1 which includes the display position signal S1, the display colour signal S2 and the display form signal S3. The visual pattern P is displayed by the display device 13 at a display position corresponding to the determined relative angular position of the cyclist. The rendering station 28 generates an audio pattern in dependence on the second output signal SOUT2 and the haptic device 15 generates a haptic pattern in dependence on the third output signal SOUT3. The perceived spatial location of the audio object 29 serves also to notify the driver of the relative angular position of the cyclist. As illustrated in Figures 13B and 13C, the display position changes progressively as the relative angular position of the cyclist changes. The visual pattern P thereby sweeps along the right lateral panel 21R to provide a continuous indication of the relative angular position of the cyclist. As shown in Figures 13B and 13C, when the cyclist is partially obscured, the visual pattern P is displayed on the inside of the A-pillar 23 to ensure that the driver is aware of their continued presence proximal to the vehicle 2. The visual pattern P is centred on the closest portion of the identified object 6 to the vehicle 2.
It will be appreciated that various changes and modifications can be made to the vehicle interface device without departing from the scope of the present invention. The vehicle interface device has been described herein with reference to implementing three modalities, namely visual, audio and haptic feedback. It will be appreciated that the vehicle interface device could be implemented with only one of said modalities or two of said modalities.
The illumination level (or intensity) of the illuminating elements can be controlled individually within the visual pattern P to indicate a measured distance between the vehicle and the identified object 6. The measured distance could be the shortest distance between the vehicle 2 and the identified object 6, for example measured normal to an exterior of the vehicle 2; or could be the distance measured relative to a reference point in the vehicle 2. By varying the illumination level within the visual pattern P, a sense of depth or perspective can be conveyed.
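A minimal sketch of the distance-dependent illumination control is given below; the 0.5 m to 10 m working band, the residual brightness level and the linear fade are illustrative assumptions rather than values from the embodiment.

```python
def element_intensity(distance_m: float,
                      near_m: float = 0.5,
                      far_m: float = 10.0) -> float:
    """Map the shortest measured distance between the vehicle and the identified
    object to an illumination level for the elements of the visual pattern P.
    The 0.5-10 m working band and linear mapping are illustrative assumptions."""
    if distance_m <= near_m:
        return 1.0
    if distance_m >= far_m:
        return 0.2   # keep a faint indication rather than switching off entirely
    # Linear fade between full brightness (near) and the faint level (far).
    fraction = (distance_m - near_m) / (far_m - near_m)
    return 1.0 - 0.8 * fraction

if __name__ == "__main__":
    for d in (0.3, 2.0, 6.0, 12.0):
        print(f"{d:4.1f} m -> intensity {element_intensity(d):.2f}")
```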
The user can set preferences for operation of the display device 13, the audio device 14 and the haptic device 15.
Further aspects of the present invention are set out in the following numbered paragraphs: 1. A vehicle interface device for generating an audible indication of a potential hazard, the vehicle interface device comprising: a plurality of electroacoustic transducers for generating an audio object; and a processor for controlling said electroacoustic transducers; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; generate a control signal to cause the electroacoustic transducers to generate an audio object; and modify the control signal to progressively change a perceived spatial location of the audio object to represent changes in the determined relative angular position of the identified object.
2. A vehicle interface device as described in paragraph 1, wherein the processor is configured to receive said object data from one or more sensor.
3. A vehicle interface device as described in paragraph 1, wherein the perceived spatial location of the audio object is changed progressively to track the determined angular position of the identified object.
4. A vehicle interface device as described in paragraph 1 comprising means for monitoring a driver of the vehicle and generating driver data; wherein the processor is configured to change the perceived spatial position of the audio object in dependence on the driver data.
5. A vehicle interface device as described in paragraph 1, wherein the processor is configured to determine a trajectory of the identified object in dependence on the object data; and to modify the audio object in dependence on the determined trajectory.
6. A vehicle interface device as described in paragraph 1, wherein the processor is configured to determine a time to collision in dependence on the object data; and to modify the audio object in dependence on the determined time to collision.
7. A vehicle interface device as described in paragraph 1, wherein the processor is configured to determine a nature of the identified object in dependence on the object data; and to modify the audio object in dependence on the determined nature of the identified object.
8. A vehicle interface device as described in paragraph 1, wherein the processor is configured to modify the audio object by changing one or more of the following parameters: amplitude, frequency, volume, acoustic pattern, signature, and pattern form.
9. A vehicle interface device as described in paragraph 1, wherein the vehicle interface device is also suitable for generating a visual indication of a potential hazard, the vehicle interface device comprising: a display configured to extend around at least a portion of a perimeter of an occupant compartment in a vehicle; and a processor for controlling said display; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; generate a control signal to cause the display to display a visual indicator at a display position in said display corresponding to the determined relative angular position of the identified object; and modify the control signal to progressively change the display position of the visual indicator within the display at least substantially to match changes in the relative angular position of the identified object.
10. A vehicle interface device as described in paragraph 1, wherein the vehicle interface device is also suitable for generating a haptic indication of a potential hazard, the vehicle interface device comprising: at least one haptic generator configured to generate a haptic signal; and a processor for controlling said haptic generator; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the object relative to the vehicle; generate a control signal to cause the haptic generator to output a haptic signal for providing an indication of the determined relative position of the object; and modify the control signal to progressively change the generated haptic signal to represent changes in the relative angular position of the identified object.
11. A vehicle comprising a vehicle interface device as described in paragraph 1.
12. A method of generating an audible indication of a potential hazard, the method comprising: determining an angular position of an identified object relative to a vehicle; generating an audio object for providing an indication of the determined relative angular position of the identified object; and progressively changing a perceived spatial location of the audio object to represent changes in the determined relative angular position of the identified object.
13. A method as described in paragraph 12, wherein the angular position of the identified object is determined in dependence on object data received from one or more sensor.
14. A method as described in paragraph 12, wherein the perceived spatial location of the audio object is changed progressively to track the determined angular position of the identified object.
15. A method as described in paragraph 12 comprising monitoring a driver of the vehicle to generate driver data; and changing the perceived spatial position of the audio object in dependence on the driver data.
16. A method as described in paragraph 12 comprising determining a trajectory of the identified object; and modifying the audio object in dependence on the determined trajectory.
17. A method as described in paragraph 12 comprising determining a time to collision; and modifying the audio object in dependence on the determined time to collision.
18. A method as described in paragraph 12 comprising determining a nature of the identified object; and modifying the audio object in dependence on the determined nature of the identified object.
19. A method as described in paragraph 12 comprising modifying the audio object by changing one or more of the following parameters: amplitude, frequency, volume, acoustic pattern, signature, and pattern form.
20. A method as described in paragraph 12 comprising generating a visual indication of a potential hazard, the method comprising: displaying a visual indicator at a display position corresponding to the determined relative angular position of the identified object; and progressively changing the display position of the visual indicator at least substantially to match changes in the relative angular position of the identified object.
21. A method as described in paragraph 12 comprising generating a haptic indication of a potential hazard, the method comprising: generating a haptic signal for providing an indication of the determined relative position of the object; and progressively changing the generated haptic signal to represent changes in the relative angular position of the identified object.

Claims (24)

CLAIMS: 1. A vehicle interface device for generating an audible indication of a potential hazard, the device comprising: a plurality of electroacoustic transducers for generating an audio object; and a processor for controlling said electroacoustic transducers; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; generate a control signal to cause the electroacoustic transducers to generate an audio object; and modify the control signal to progressively change a perceived spatial location of the audio object to represent changes in the determined relative angular position of the identified object.
2. A vehicle interface device as claimed in claim 1, wherein the processor is configured to receive said object data from sensor means.
3. A vehicle interface device as claimed in claim 1 or claim 2, wherein the perceived spatial location of the audio object is changed progressively to track the determined angular position of the identified object.
4. A vehicle interface device as claimed in any one of claims 1, 2 or 3 comprising means for monitoring a driver of the vehicle and generating driver data; wherein the processor is configured to change the perceived spatial position of the audio object in dependence on the driver data.
5. A vehicle interface device as claimed in any one of claims 1 to 4, wherein the processor is configured to determine a trajectory of the identified object in dependence on the object data; and to modify the audio object in dependence on the determined trajectory.
6. A vehicle interface device as claimed in any one of the preceding claims, wherein the processor is configured to determine a time to collision in dependence on the object data; and to modify the audio object in dependence on the determined time to collision.
7. A vehicle interface device as claimed in any one of the preceding claims, wherein the processor is configured to determine a nature of the identified object in dependence on the object data; and to modify the audio object in dependence on the determined nature of the identified object.
8. A vehicle interface device as claimed in any one of the preceding claims, wherein the processor is configured to modify the audio object by changing one or more of the following parameters: amplitude, frequency, volume, acoustic pattern, signature, and pattern form.
9. A vehicle interface device as claimed in any one of the preceding claims, wherein the vehicle interface device is also suitable for generating a visual indication of a potential hazard, the vehicle interface device comprising: a display configured to extend around at least a portion of a perimeter of an occupant compartment in a vehicle; and a processor for controlling said display; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the identified object relative to the vehicle; generate a control signal to cause the display to display a visual indicator at a display position in said display corresponding to the determined relative angular position of the identified object; and modify the control signal to progressively change the display position of the visual indicator within the display at least substantially to match changes in the relative angular position of the identified object.
10. A vehicle interface device as claimed in any one of the preceding claims, wherein the vehicle interface device is also suitable for generating a haptic indication of a potential hazard, the vehicle interface device comprising: at least one haptic generator configured to generate a haptic signal; and a processor for controlling said haptic generator; wherein the processor is configured to: in dependence on object data relating to an identified object representing a potential hazard, determine an angular position of the object relative to the vehicle; generate a control signal to cause the haptic generator to output a haptic signal for providing an indication of the determined relative position of the object; and modify the control signal to progressively change the generated haptic signal to represent changes in the relative angular position of the identified object.
11. A vehicle comprising a vehicle interface device as claimed in any one of the preceding claims.
12. A method of generating an audible indication of a potential hazard, the method comprising: determining an angular position of an identified object relative to a vehicle; generating an audio object for providing an indication of the determined relative angular position of the identified object; and progressively changing a perceived spatial location of the audio object to represent changes in the determined relative angular position of the identified object.
13. A method as claimed in claim 12, wherein the angular position of the identified object is determined in dependence on object data received from sensor means.
14. A method as claimed in claim 12 or claim 13, wherein the perceived spatial location of the audio object is changed progressively to track the determined angular position of the identified object.
15. A method as claimed in any one of claims 12, 13 or 14 comprising monitoring a driver of the vehicle to generate driver data; and changing the perceived spatial position of the audio object in dependence on the driver data.
16. A method as claimed in any one of claims 12 to 15 comprising determining a trajectory of the identified object; and modifying the audio object in dependence on the determined trajectory.
17. A method as claimed in any one of claims 12 to 16 comprising determining a time to collision; and modifying the audio object in dependence on the determined time to collision.
18. A method as claimed in any one of claims 12 to 17 comprising determining a nature of the identified object; and modifying the audio object in dependence on the determined nature of the identified object.
19. A method as claimed in any one of claims 12 to 18 comprising modifying the audio object by changing one or more of the following parameters: amplitude, frequency, volume, acoustic pattern, signature, and pattern form.
20. A method as claimed in any one of claims 12 to 19 comprising generating a visual indication of a potential hazard, the method comprising: displaying a visual indicator at a display position corresponding to the determined relative angular position of the identified object; and progressively changing the display position of the visual indicator at least substantially to match changes in the relative angular position of the identified object.
21. A method as claimed in any one of claims 12 to 20 comprising generating a haptic indication of a potential hazard, the method comprising: generating a haptic signal for providing an indication of the determined relative position of the object; and progressively changing the generated haptic signal to represent changes in the relative angular position of the identified object.
22. A vehicle interface device substantially as herein described with reference to the accompanying figures.
23. A vehicle substantially as herein described with reference to the accompanying figures.
24. A method substantially as herein described with reference to the accompanying figures.
GB1500590.3A 2015-01-14 2015-01-14 Vehicle interface device Active GB2534163B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
GB1500590.3A GB2534163B (en) 2015-01-14 2015-01-14 Vehicle interface device
EP16700486.0A EP3245642A1 (en) 2015-01-14 2016-01-14 Vehicle interface device
PCT/EP2016/050653 WO2016113345A1 (en) 2015-01-14 2016-01-14 Vehicle interface device
US15/540,153 US10229595B2 (en) 2015-01-14 2016-01-14 Vehicle interface device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1500590.3A GB2534163B (en) 2015-01-14 2015-01-14 Vehicle interface device

Publications (3)

Publication Number Publication Date
GB201500590D0 GB201500590D0 (en) 2015-02-25
GB2534163A true GB2534163A (en) 2016-07-20
GB2534163B GB2534163B (en) 2017-11-22

Family

ID=52597594

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1500590.3A Active GB2534163B (en) 2015-01-14 2015-01-14 Vehicle interface device

Country Status (1)

Country Link
GB (1) GB2534163B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019201407A1 (en) 2019-02-04 2020-08-06 Continental Automotive Gmbh Method and device for controlling a driver's attention


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007049995A1 (en) * 2005-10-24 2007-05-03 Volvo Lastvagnar Ab Object detection system and method
JP2008129734A (en) * 2006-11-17 2008-06-05 Calsonic Kansei Corp Solid sound image warning device
DE102009005260A1 (en) * 2009-01-20 2010-07-22 Bayerische Motoren Werke Aktiengesellschaft Method for acoustic warning against ahead-driving vehicle in surrounding of motor vehicle, involves moving virtual sound centrum of acoustic warning signals from starting point to position of danger source in specific time interval
DE102009005688A1 (en) * 2009-01-22 2010-07-29 Bayerische Motoren Werke Aktiengesellschaft Method for acoustic indication of source of danger in environment of motor vehicle, involves adapting sound characteristic of acoustic signal such that acoustic signal comprises virtual echo that represents place of source of danger
US20120081219A1 (en) * 2010-09-29 2012-04-05 GM Global Technology Operations LLC Motor vehicle with warning system
US20140368650A1 (en) * 2011-05-17 2014-12-18 Raytheon Company Integrated 3d audiovisual threat cueing system

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3413287A1 (en) * 2010-04-19 2018-12-12 SMR Patents S.à.r.l. Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle
DE102016114413A1 (en) 2016-08-04 2018-03-22 Visteon Global Technologies, Inc. Device for generating object-dependent audio data and method for generating object-dependent audio data in a vehicle interior
CN110431613A (en) * 2017-03-29 2019-11-08 索尼公司 Information processing unit, information processing method, program and mobile object
EP3605499A4 (en) * 2017-03-29 2020-04-22 Sony Corporation Information processing device, information processing method, program, and moving body
US11317205B2 (en) 2017-03-29 2022-04-26 Sony Corporation Information processing apparatus, information processing method, program, and mobile object
CN110431613B (en) * 2017-03-29 2023-02-28 索尼公司 Information processing device, information processing method, program, and moving object
EP4003805A4 (en) * 2019-07-31 2023-08-09 Karma Automotive LLC System and method for a combined visual and audible spatial warning system
DE102020114924A1 (en) 2020-06-04 2021-12-09 Bayerische Motoren Werke Aktiengesellschaft MOTOR VEHICLE
GB2622197A (en) * 2022-09-01 2024-03-13 Continental Automotive Tech Gmbh Instrument cluster apparatus and vehicle warning arrangement including the instrument cluster apparatus

Also Published As

Publication number Publication date
GB201500590D0 (en) 2015-02-25
GB2534163B (en) 2017-11-22

Similar Documents

Publication Publication Date Title
US10229595B2 (en) Vehicle interface device
GB2534163A (en) Vehicle interface device
CN109478354B (en) Haptic guidance system
US10223602B2 (en) Dynamic control apparatus and related method
CN103569134B (en) Alert systems and methods for a vehicle
US9734699B2 (en) System for providing alerts to vehicle occupants
US7741962B2 (en) Auditory display of vehicular environment
KR101795902B1 (en) Vehicle system
RU2689930C2 (en) Vehicle (embodiments) and vehicle collision warning method based on time until collision
US10421465B1 (en) Advanced driver attention escalation using chassis feedback
US9539944B2 (en) Systems and methods of improving driver experience
JP6339235B2 (en) Control device and related method
US20180173230A1 (en) Contextual-assessment vehicle systems
US7245231B2 (en) Collision avoidance system
US20160347329A1 (en) Situational awareness for a vehicle
CN110431613B (en) Information processing device, information processing method, program, and moving object
US20150319608A1 (en) Method For Detecting Smart Device Use While Operating A Motor Vehicle
GB2500690A (en) Driver monitoring and vehicle control system
JP2017526034A (en) Automobile detection system that uses a sound stage to indicate a driver's lack of arousal in the event of an impending danger
JP2002133596A (en) Onboard outside recognition device
GB2534165A (en) Vehicle interface device
JP2018120352A (en) Dependency estimation device
JP2020075586A (en) Drive support method and drive support device
GB2534164A (en) Vehicle interface device
JP2019105941A (en) Travel support method and travel support device