GB2539329A - Method for operating a vehicle, in particular a passenger vehicle

Info

Publication number: GB2539329A
Authority: GB (United Kingdom)
Prior art keywords: vehicle, occupant, interior, user, basis
Legal status: Withdrawn
Application number: GB1609739.6A
Other versions: GB201609739D0 (en)
Inventors: Jagodic Ratko, Plewe Sarah
Current assignee: Mercedes Benz Group AG
Original assignee: Daimler AG
Application filed by Daimler AG; priority and filing date: 2016-06-03
Publication of GB201609739D0: 2016-07-20
Publication of GB2539329A: 2016-12-14
Classifications

    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/29 Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • B60K35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • B60R16/037 Electric circuits specially adapted for vehicles, for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • B60K2360/11 Instrument graphical user interfaces or menu aspects
    • B60K2360/141 Activation of instrument input devices by approaching fingers or pens
    • B60K2360/143 Touch sensitive instrument input devices
    • B60K2360/1438 Touch screens
    • B60K2360/18 Information management
    • B60K2360/199 Information management for avoiding maloperation
    • B60W2540/215 Selection or confirmation of options


Abstract

The invention relates to a method for operating a vehicle (10), the method comprising: detecting at least a portion (30) of an occupant in the interior (14) of the vehicle (10) by means of at least one sensor device (12) which detects that the occupant is proximal to the sensor; determining at least one local relation between the portion (30) and an element (20) in the interior by means of a control unit (16) of the vehicle (10); on the basis of the determined relation, predicting at least one future action or identifying at least one current action of the occupant by means of the control unit (16); and, on the basis of the predicted or identified action, effecting at least one function of the vehicle (10) by means of the control unit (16). Also included is a method comprising detecting an occupant, determining a local relation between the occupant and the vehicle, and emitting a help signal. Also included is a method comprising detecting an occupant, determining a local relation, and presenting at least one option list of selectable items to the occupant.

Description

Method for Operating a Vehicle, in particular a Passenger Vehicle

The invention relates to a method for operating a vehicle, in particular a passenger vehicle.
DE 10 2010 012 239 A1 shows an operating and display apparatus of a motor vehicle, said apparatus comprising at least one operating device arranged in the motor vehicle. In said apparatus, a virtual operating device in a display device is assigned to at least one portion of the operating device. Said virtual operating device indicates at least one function which can be activated by the operating device. In said apparatus, the virtual operating device is only displayed when an operating means approaches and/or touches the operating device.
It is an object of the present invention to provide a method by means of which a vehicle can be operated in a particularly easy way.
This object is achieved by a method having the features of patent claim 1, by a method having the features of patent claim 3, and by a method having the features of patent claim 5. Advantageous embodiments with expedient developments of the invention are indicated in the other patent claims.
A first aspect of the invention relates to a method for operating a vehicle, in particular a motor vehicle. The method according to the first aspect of the invention comprises a first step of detecting at least a portion of an occupant in the interior of the vehicle by means of at least one sensor device. The method according to the first aspect further comprises a second step of detecting at least one element in the interior of the vehicle by means of the sensor device. Said element can be an object arranged in the interior, wherein the object can be part of the interior itself. Moreover, the element can be a body or part of a body of the occupant or another occupant in the interior. Furthermore, the element can be a physical or given element of the interior.
In a third step of the method according to the first aspect, at least one local relation between the portion and the element is determined by means of a control unit of the vehicle on the basis of the detected portion and element. The method according to the first aspect of the invention further comprises a fourth step of, on the basis of the determined relation, predicting at least one future action or identifying at least one current action of the occupant by means of the control unit. Furthermore, in a fifth step of the method according to the first aspect, at least one function of the vehicle is effected by means of the control unit on the basis of the predicted or identified action.
This means the method according to the first aspect of the present invention is configured to predict or identify at least one intent of the occupant inside the vehicle cabin. It was found that, presently, the interaction of a user with various features in a vehicle has to be boiled down to the lowest common denominator of an input device, such as clicks, swipes and/or dial turns, which slows down the interaction, requires taking eyes off the road to navigate menus and makes discovering features difficult. In order to overcome these disadvantages, the method according to the first aspect of the invention helps create an interaction model that extends the traditional interface with more natural interactions either explicitly or implicitly performed through the motion or state of the occupant being a user of the vehicle. For example, hand motions can be interpreted differently depending on the current context and the vehicle can act on these interpretations in a way that simplifies the interaction or even accomplishes the task altogether. Being able to detect natural occupant behavior in the vehicle cabin has the potential to significantly improve the user experience by simplifying the interface, decreasing the learning curve and anticipating the user's needs. Another added benefit is increased safety, since various safety measures could be triggered automatically depending on the user's attention and behavior. For example, different sensor technologies, such as proximity sensors, can be used for enabling interaction. Moreover, various ways of processing and learning from sensor data, such as machine learning and computer vision, can be used.
A second aspect of the present invention relates to a method for operating a vehicle, in particular a passenger vehicle. The method according to the second aspect of the present invention comprises a first step of detecting at least a portion of an occupant in the interior of the vehicle by means of at least one sensor device. In a second step of the method according to the second aspect of the invention at least one local relation between the portion and a component of the vehicle is determined by means of a control unit of the vehicle on the basis of the detected portion. Moreover, the method according to the second aspect of the present invention comprises a third step in which, on the basis of the determined relation, at least one help signal is emitted in the interior of the vehicle, the help signal being indicative of at least one function of said component. Advantages and advantageous embodiments of the first aspect of the invention are to be regarded as advantages and advantageous embodiments of the second aspect of the invention and vice versa.
It was found that, currently, there is no easy or elegant way to get more information on control elements within a vehicle while driving. The occupant, i.e. the user of the vehicle, always has to go through the inconvenient procedure of looking it up in some kind of manual while stopped, rather than just being able to point at or touch it and ask the vehicle itself about this particular functionality. The advantage of the method according to the second aspect of the invention is that the user gets a better understanding of the capabilities of the vehicle and is better informed, because the user can get this information in the moment they need it. The user is less distracted if the user knows how the vehicle functions, which makes driving safer. For example, the pointing or touching interaction itself may vary only in small details, as may the wording used to ask for more information.
For example, the method according to the second aspect of the present invention helps create a 3-dimensional look-up concept since, for example, the portion of the user, in particular movements of said portion, are monitored 3-dimensionally by the sensor device. On the basis of the detected portion, the occupant (user) can be provided with various pieces of information on various components of the vehicle in a very comfortable way.
A third aspect of the present invention relates to a method for operating a vehicle, in particular a passenger vehicle. The method according to the third aspect of the invention comprises a first step of detecting at least a portion of an occupant in the interior of the vehicle by means of at least one sensor device. In a second step of the method according to the third aspect of the present invention, at least one local relation between the portion and a component of the vehicle is determined by means of a control unit of the vehicle on the basis of the detected portion. The method according to the third aspect of the present invention further comprises a third step of presenting at least one option list to the occupant, said option list comprising a plurality of items being selectable by the occupant. Preferably, the option list is optically and/or acoustically presented to the occupant so that, for example, said option list can be a menu such as a graphical user interface (GUI). Advantages and advantageous embodiments of the first and second aspects of the present invention are to be regarded as advantages and advantageous embodiments of the third aspect of the invention and vice versa.
In addition to or instead of the option list, the user can directly interact with the vehicle component and change related vehicle functions without any GUI. For example, tapping on an interior light by the roof can turn the light on/off without any GUI feedback.
Similarly, covering a speaker in the dashboard with their hand can mute the audio without any GUI on the screen. The option list provides access to certain functions which can be effected by the user.
With regard to the third aspect of the present invention, it was found that in an effort to de-clutter the interior of a vehicle a lot of hard controls get removed and functionality gets moved into a digital menu, often to a deep menu level. This increases the number of interaction steps necessary to operate the feature. The placement of digital controls relative to the feature they control often is not instantly clear to a user being an occupant of the vehicle. Also, digital menus that are operated via touchscreens or touchpad often lack haptic feedback. All of this increases cognitive, visual and motor load and therefore leads to more distraction. The method according to the third aspect of the present invention solves these problems by allowing the user to directly interact with different elements of the vehicle interior in order to operate certain features, i.e. functions or functionalities. Therefore, the user has direct access to certain functionalities, the control is moved closer to the actual feature and there is haptic feedback of touching the right thing.
One advantage of the method according to the third aspect of the present invention is that all benefits of having physical controls can be kept without the need for extra buttons or hard controls for interaction as elements that are already in the vehicle, in particular in the interior of the vehicle, are utilized. Instead of having to move all hard controls to a digital menu in order to reduce cost and clutter, elements that are already in the interior are used. Also, the benefits of physical hard controls can be kept, which causes less driver distraction. For example, pointing and/or touching interaction can be used to control the respective functions.
Further advantages, features, and details of the invention derive from the following description of preferred embodiments as well as from the drawings. The features and feature combinations previously mentioned in the description, as well as the features and feature combinations mentioned in the following description of the figures and/or shown in the figures alone, can be employed not only in the respectively indicated combination but also in other combinations or taken alone without leaving the scope of the invention.
The drawings show in:
Fig. 1 a schematic view of system components used to realize a method for operating a vehicle, in particular a passenger vehicle;
Fig. 2 part of a schematic and perspective view of the interior of the vehicle;
Fig. 3 part of a further schematic and perspective view of the interior of the vehicle;
Fig. 4 part of a further schematic and perspective view of the interior of the vehicle;
Fig. 5 part of a further schematic and perspective view of the interior of the vehicle;
Fig. 6 part of a further schematic and perspective view of the interior of the vehicle;
Fig. 7 part of a further schematic and perspective view of the interior of the vehicle;
Fig. 8 part of a further schematic and perspective view of the interior of the vehicle; and
Fig. 9 part of a further schematic and perspective view of the interior of the vehicle.
In the figures the same elements or elements having the same functions are indicated by the same reference signs.
Fig. 1 shows system components used to realize a method for operating a vehicle 10 which is, for example, configured as a motor vehicle in the form of a passenger vehicle. One of said system components is a sensor device 12 which comprises at least one detecting element. For example, the detecting element can be configured as a camera. By means of the sensor device 12, at least a portion of the interior 14 (Figs. 2 to 9) of the vehicle 10 can be detected and monitored. A second one of said system components is a control unit 16 of the vehicle 10. For example, the control unit 16 is an electronic control unit which is also referred to as a computer or tracking computer. For example, the sensor device 12 is configured to provide at least one signal 18 indicative of the detected and monitored portion of the interior 14.
The signal 18 is transmitted to and received by the control unit 16. Both the control unit 16 and the sensor device 12 are components of the vehicle 10. Moreover, the vehicle 10 comprises elements or components 20 which are, for example, arranged in the interior 14 as can be seen from, for example, Fig. 2. For example, the components 20 can be configured as operating elements such as buttons. Alternatively, the components 20 can be status elements. Furthermore, at least one of the components 20 can be an air vent configured to provide the interior 14 with air.
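To make this dataflow concrete, the following is a minimal Python sketch of the signal path from the sensor device 12 to the control unit 16. All class and field names are illustrative stand-ins invented for this sketch, not terms from the patent.

```python
from dataclasses import dataclass

@dataclass
class InteriorSignal:
    """Hypothetical payload for the signal (18) from the sensor device (12)."""
    depth_map: list     # per-cell distances of the monitored interior region
    timestamp_s: float

class SensorDevice:
    """Stand-in for the sensor device (12); a real device would deliver camera/depth data."""
    def capture(self) -> InteriorSignal:
        return InteriorSignal(depth_map=[[1.2, 1.1], [0.9, 0.8]], timestamp_s=0.0)

class ControlUnit:
    """Stand-in for the control unit (16) that receives and processes signal (18)."""
    def process(self, signal: InteriorSignal) -> None:
        # Downstream steps (detection, tracking, prediction) would start here.
        rows, cols = len(signal.depth_map), len(signal.depth_map[0])
        print(f"received frame at t={signal.timestamp_s}s with {rows}x{cols} depth samples")

sensor = SensorDevice()
control_unit = ControlUnit()
control_unit.process(sensor.capture())  # signal 18: sensor device -> control unit
```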
In the following, a first embodiment of said method is described with regard to Figs. 1 to 3. With respect to the first embodiment it was found that, currently, the expressiveness or richness of interaction of occupants of vehicles with the vehicles is quite low, wherein said occupants are users of the vehicles. The interaction is primarily enabled by either numerous physical controls such as buttons or knobs, or single multi-function controls such as dials or touchpads. The output on the other hand is almost exclusively bound to one or two fixed rectangular displays 22 and 24 arranged in the interior 14. Although this provides a consistent focal point for interaction, it hinders discoverability, spatial memory and muscle memory. This creates a significant bottleneck in interaction flow, especially as the number of features keeps increasing and features become more and more complex, while at the same time the amount of non-driving related stimuli keeps increasing, for example because of wearables. The driver of such a vehicle becomes more distracted and therefore needs an easier and faster way to interact with commonly used features of the vehicle.
Presently, the interaction with various features has to be boiled down to the lowest common denominator of an input device of the respective vehicle such as clicks, swipes or dial turns, which slows down the interaction, requires taking eyes off the road to navigate menus and makes discovering features difficult. Instead, an interaction model is envisioned, said interaction model extending this traditional interface with more natural interactions either explicitly or implicitly performed through the motion or state of the user being an occupant of the vehicle 10. For example, hand motions can be interpreted differently depending on the current context, and the vehicle 10 can act on these interpretations in a way that simplifies the interaction or even accomplishes the task altogether.
For example, in the first embodiment the method comprises the following steps (a minimal code sketch of this pipeline follows the list):
- Detecting at least a portion of an occupant in the interior 14 of the vehicle 10 by means of the sensor device 12.
- Detecting at least one element being an object in the interior 14 of the vehicle 10 by means of the sensor device 12.
- On the basis of the detected portion and object: determining at least one local relation between the portion and the object by means of the control unit 16 of the vehicle 10.
- On the basis of the determined relation: predicting at least one future action of the occupant by means of the control unit 16.
- On the basis of the predicted action: effecting at least one function of the vehicle 10 by means of the control unit 16.
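The following is a minimal sketch of this five-step pipeline, assuming hypothetical cabin coordinates and a simple distance threshold as the "local relation"; all positions, names and thresholds are invented for illustration.

```python
import math

# Hypothetical detections; a real system would derive these from sensor data.
occupant_hand = (0.42, 0.10, 0.55)  # detected portion (30), metres in cabin frame
glove_box     = (0.50, 0.05, 0.60)  # detected element (20)

def local_relation(portion, element):
    """Step 3: one possible 'local relation' is the Euclidean distance."""
    return math.dist(portion, element)

def predict_action(distance_m, approach_threshold_m=0.15):
    """Step 4: a toy prediction rule; a hand close to the element implies intent."""
    return "open_glove_box" if distance_m < approach_threshold_m else None

def effect_function(action):
    """Step 5: the control unit would map the predicted action to a vehicle function."""
    if action == "open_glove_box":
        print("turning on glove-box light")  # placeholder for the real actuation

effect_function(predict_action(local_relation(occupant_hand, glove_box)))
```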
For example, said object can be at least one of the components 20. The first embodiment of the method is a solution for simplifying interaction in the vehicle 10 by detecting and predicting the intent and acting on the user's motions or pose inside the cabin, i.e. the interior 14. This has the potential to reduce cognitive load, reduce required learning and increase driver attention. For example, to realize the method, in particular the first embodiment, a system is used, said system comprising at least said system components. In particular, the system, in particular the sensor device 12, can comprise a single 3D sensor or multiple 3D sensors covering at least a portion of or the whole interior 14.
Each sensor can be a depth sensor (for example time-of-flight, stereo or structured light) or be based on some other spatial sensing technology such as, for example, ultrasonic, radar, infrared or 2D optical sensing. For example, the respective sensor, i.e. the sensor device 12, provides at least one signal indicative of the detected object and portion. For example, the data from various sensors is aggregated in order to create a unified representation of the interior 14. This representation, for example a depth map, is then processed, in particular by means of the control unit 16, in order to detect occupants and objects in the interior 14, which are then tracked over time. With the knowledge of where various components in the interior 14 are, where the user is in relation to them (for example a hand approaching a glove box) and the environmental context such as time of day, road conditions, etc., the system, i.e. the method, can make reasonable predictions on what the user is trying to accomplish and either make the task easier or accomplish it automatically, for example via at least one CAN message (CAN: controller area network).
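As a rough illustration of the aggregation and actuation just described, the sketch below fuses two toy depth frames into one unified representation and emits a CAN message via the python-can library (assumed to be installed). The fusion policy, arbitration ID and payload are invented for the sketch, not taken from any real vehicle bus.

```python
import can  # python-can; assumes the package and its built-in "virtual" backend

def unified_depth_map(frames):
    """Aggregate per-sensor depth frames, taking the nearest reading per cell;
    this is one simple fusion policy among many possible ones."""
    return [[min(cells) for cells in zip(*rows)] for rows in zip(*frames)]

# Two hypothetical 2x2 depth frames (metres) from different interior sensors.
frame_a = [[1.2, 1.1], [0.9, 2.0]]
frame_b = [[1.4, 1.0], [0.7, 1.8]]
print(unified_depth_map([frame_a, frame_b]))  # [[1.2, 1.0], [0.7, 1.8]]

# Acting on a prediction via CAN; ID and data byte are illustrative only.
bus = can.Bus(interface="virtual", channel="demo")
bus.send(can.Message(arbitration_id=0x1A0, data=[0x01], is_extended_id=False))
bus.shutdown()
```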
In the following, potential use cases are described with respect to the first embodiment of the method. A first use case relates to increasing safety. Although simplifying interaction in the vehicle 10 usually translates to less distracted driving, being able to predict a user's intent has direct safety benefits. For example, if the system detects that the driver is reaching for his phone (object) and looking away from the road, the system or method may prompt the driver to pay closer attention or, even better, intensify advanced safety systems in the vehicle 10 such as lane keeping assistance and adaptive cruise control. Similarly, if the system or method detects that the user is looking in the side mirrors as if about to change a lane, a blind spot warning can be intensified, either through haptic, visual or auditory methods. Another example of increasing safety would be disabling specific features for the driver but enabling them for the passenger. As an alternative to simply disabling the feature for the driver, the system or method could also offer another less-distracting solution for the task. For example, instead of text input via touchscreen, the system or method could offer voice input, which should be less distracting.
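A toy decision table for these safety escalations could look as follows; the predicate and action names are invented for the sketch and do not reflect a production safety design.

```python
def safety_response(phone_reach, gaze_on_road, mirror_check, blind_spot_occupied):
    """Map detected occupant states to the escalations described above."""
    actions = []
    if phone_reach and not gaze_on_road:
        actions.append("prompt_driver_attention")
        actions.append("intensify_lane_keeping_and_acc")
    if mirror_check and blind_spot_occupied:
        actions.append("intensify_blind_spot_warning")  # haptic, visual or auditory
    return actions

print(safety_response(phone_reach=True,  gaze_on_road=False,
                      mirror_check=False, blind_spot_occupied=False))
print(safety_response(phone_reach=False, gaze_on_road=True,
                      mirror_check=True,  blind_spot_occupied=True))
```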
A further use case relates to convenience features. Predicting a user's intent in the vehicle cabin, i.e. in the interior 14 can be used for convenience features. For example, if the driver is driving alone at night and the driver reaches over to the passenger side as if looking for something, a reading light on the passenger side could be turned on automatically, for example by the control unit 16. Similarly, if the passenger in the front reaches for the back seat without anyone there, the light could be turned on and off automatically there as well. As another convenience, the head unit could automatically adapt the user interface to the most appropriate one depending on which input method is being used, for example larger controls for a touchscreen and smaller for a touchpad.
Moreover, the prediction of user actions can be improved through learning. In other words, to improve the accuracy of predictions, the system or method can observe the specific user's behavior over time and, based on the context and their implicit or explicit feedback, learn to better interpret their behavior. For example, if the system automatically turns on the reading light when the driver reaches over to the passenger side but the driver turns it off immediately, the system may adjust the intensity of the light next time or even disable the automatic light altogether. Similarly, the system may observe how long the user searches for something in the passenger seat before turning on the light.
Aside from monitoring the user, the system can also learn a specific user's behavior. For example, if two people, i.e. users, share a vehicle, it can learn that one person does not want the light to come on automatically but the other one does. Or the system may adjust the threshold of activity detection depending on how much a person normally moves in the cabin; for example, one person gestures frequently with their hands in the cabin while talking, while another does not.
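One simple way to realize this kind of per-user adaptation is an override counter that dims and eventually disables the automatic light, as in the following sketch; the thresholds and step sizes are arbitrary assumptions.

```python
class AutoLightPolicy:
    """Per-user adaptation sketch: back off when the user overrides the light."""
    def __init__(self):
        self.enabled = True
        self.intensity = 1.0
        self.override_count = 0

    def on_user_turned_light_off_immediately(self):
        self.override_count += 1
        self.intensity = max(0.2, self.intensity - 0.4)  # dim the next activation
        if self.override_count >= 3:
            self.enabled = False  # stop triggering automatically for this user

# Two users sharing a vehicle learn independent policies.
policies = {"user_a": AutoLightPolicy(), "user_b": AutoLightPolicy()}
for _ in range(3):
    policies["user_a"].on_user_turned_light_off_immediately()
print(policies["user_a"].enabled, policies["user_b"].enabled)  # False True
```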
Figs. 2 and 3 show an example of how the vehicle 10 can use intent to support the user: if the intent is to interact with either a touchpad 26 or a touchscreen 28 of the display 24, the user interface adapts to that intent and supports the user by offering the user interface optimized for the respective input device (the touchpad 26 or the touchscreen 28). Thus, the touchscreen 28 and the touchpad 26 can be components arranged in the interior 14 of the vehicle 10.
In the following, a second embodiment of the method is illustrated with regard to Figs. 1, 4 and 5. With respect to the second embodiment, it was found that as an alternative to boiling down the interaction with various features to the lowest common denominator, every feature that is deemed important enough can get its own physical button. There is a lot of benefit to physical controls such as eyes-free operation, but with a large number of such controls, the iconography is becoming cryptic and their placement relative to the feature they control often is not instantly clear to the user. The iconographic descriptions of vehicle malfunctions or system states are also often not comprehensive and are increasing in complexity with the adoption of more autonomous driver assistance systems. This increases the learning curve and the cognitive load on the user(s).
The second embodiment provides a solution to this problem of a multitude of physical controls and other elements that are confusing and distracting the driver by implementing a system to track objects or movements in 3D space and using this to teach the driver about their vehicle 10.
For example, in the second embodiment the method comprises the following steps (a minimal code sketch follows the list):
- Detecting at least a portion of an occupant in the interior 14 of the vehicle 10 by means of the sensor device 12.
- On the basis of the detected portion: determining at least one local relation between the portion and at least one component 20 of the vehicle 10 by means of the control unit 16 of the vehicle 10, wherein, preferably, said at least one component 20 is arranged in the interior 14.
- On the basis of the determined relation: emitting at least one help signal in the interior 14 of the vehicle 10, the help signal being indicative of at least one function of said component 20.
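A minimal sketch of these steps might map the tracked hand position to the nearest known component and emit a help text when the distance falls below a threshold. The component positions, coordinates and help texts below are invented placeholders.

```python
import math

# Assumed positions of interior components (20) in a cabin coordinate frame (metres).
COMPONENTS = {
    "seat_heater_button": (0.30, -0.20, 0.40),
    "touchpad":           (0.35,  0.00, 0.30),
}
HELP_TEXT = {
    "seat_heater_button": "Heats the driver seat; press repeatedly to cycle levels.",
    "touchpad":           "Controls the head unit; supports handwriting input.",
}

def nearest_component(hand_pos, max_distance_m=0.10):
    """Determine the local relation: which component, if any, the hand is near."""
    best = min(COMPONENTS, key=lambda c: math.dist(hand_pos, COMPONENTS[c]))
    return best if math.dist(hand_pos, COMPONENTS[best]) <= max_distance_m else None

def emit_help_signal(component):
    """Emit the help signal, e.g. on display 22/24 and/or as speech output."""
    print(f"HELP: {HELP_TEXT[component]}")

target = nearest_component((0.33, -0.18, 0.41))
if target:
    emit_help_signal(target)
```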
In other words, in the second embodiment the user is provided with contextual help by means of the help signal. The second embodiment uses 3D tracking in the interior 14 to explain functions of operating elements such as buttons, other physical elements, and/or status or malfunction indicators in the interior 14. For example, the tracking is enabled through a package that contains at least one tracking sensor such as a depth sensor and/or camera or other proximity-sensing or user tracking technology. For example, said package is formed by the sensor device 12. For example, said package or the sensor device 12 is mounted in or on the ceiling or other positions in the cabin (interior 14) that allow the package to track, for example, hand or arm movements of the driver and/or a passenger, i.e. at least one occupant in the interior 14. Depending on the location, the package or at least one tracking sensor can either cover parts of the cabin, for example the central console area, or the full cabin. The tracking sensor can track hands near buttons or controls and present the user with help on these either visually, verbally or both. For example, the help signal comprises at least one acoustical and/or optical signal. For example, the help signal is shown on at least one of the displays 22 and 24.
For example, to bring up graphical content and a short written description of the functionality of a component such as an element or a button on a display unit, the user could touch and hold, but not press, the button (component 20) for two to three seconds with one finger. Alternatively, the user could enter a specific help mode on the head unit where a simple touch or tap of the element presents the user with an explanation. Possibly, neighboring elements or buttons can also be displayed and explained; therefore, tracking of not only one finger but the whole hand should also be possible, since neighboring elements are displayed and not only the precise element that was touched. For example, a touching and/or tapping gesture can be combined with a spoken question of the driver about the functionality of at least one component 20 such as a button or element, for example "what is this?". The voice would be utilized as an additional trigger or intent differentiator; the main context is still provided by the tracking sensor. The visual explanation can then also be supplemented by an acoustic explanation. Asking about something while at the same time tapping or touching it in order to clarify what the person is talking about is a very natural behavior for humans. The invention makes use of this known, natural behavior and enriches it with more functionality. Instead of someone having to look up the functionality of the button or element (component 20) in a lengthy manual, the vehicle 10, enabled by the interior tracking system, i.e. the method, is able to understand the driver, without the driver having to be extremely precise in his spoken request or description of the element (component 20).
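The touch-and-hold and voice triggers described above could be fused roughly as follows; the dwell window of two to three seconds comes from the text, while the utterance matching is a deliberately naive placeholder for a real speech front end.

```python
import time

def dwell_detected(touch_started_s, now_s, min_dwell_s=2.0, max_dwell_s=3.0):
    """Touch-and-hold without pressing, held for roughly two to three seconds."""
    return min_dwell_s <= (now_s - touch_started_s) <= max_dwell_s

def help_trigger(touch_started_s, now_s, spoken_utterance=None):
    """Voice acts as an additional trigger; the tracked touch supplies the context."""
    if spoken_utterance and "what is this" in spoken_utterance.lower():
        return True
    return dwell_detected(touch_started_s, now_s)

t0 = time.monotonic()
print(help_trigger(t0, t0 + 2.5))                   # dwell alone -> True
print(help_trigger(t0, t0 + 0.3, "What is this?"))  # voice plus touch -> True
```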
The method may be combined with a help mode that enables or disables the 3D look up explanation feature. This functionality would aid the user in exploring and learning about features of their vehicle 10 in the actual context, as opposed to having to look it up after driving or outside of the vehicle 10.
Figs. 4 and 5 illustrate a use case of the second embodiment of the method. Upon moving a hand 30 towards the touchpad 26, in particular while in help mode, a visual representation 32 of the touchpad 26 and hard keys 34 surrounding the touchpad 26 shows up on the display 24, along with short explanations and their toggle states. This means the aforementioned help signal is, for example, the visual representation 32 graphically illustrating the physical touchpad 26 and the surrounding hard keys 34 on the display 24. This could also be combined with voice by pointing at something or touching something and asking what this button or element does.
In the following, a third embodiment of the method is illustrated with respect to Figs. 1 and 6 to 9. With respect to the third embodiment it was found that commonly used features that used to have physical controls get buried in deep levels of digital operating systems in an effort to simplify or de-clutter the vehicle interior as well as to save costs. Some features are deemed important enough to get their own physical buttons, i.e. operating elements, but the iconography is often cryptic and their placement relative to the feature they control often is not instantly clear to the user. Also, digital menus that are operated via touchscreen or touchpad often lack haptic feedback. All of this increases cognitive, visual and motor load and therefore leads to more distraction.
The third embodiment of the method provides a solution to these problems by allowing the user to directly interact with different elements, i.e. components such as the components 20 of the vehicle 10 in order to operate certain features, for example by tapping them. Preferably, said elements or components 20 are arranged in the interior 14. The function associated with the interaction should be tied to the element (component) that the user performed this action on, so that there is a mental association between the placement of the function and the feature it controls. The user has direct access to certain functionalities and there is haptic feedback of touching the right thing. With the method, the number of interaction steps necessary to control certain features can be reduced.
Moreover, the learning effort can be reduced or minimized, and the haptic advantages of former hard controls can be kept. The whole interaction can also be sped up. Furthermore, it is possible to create a less distracting user interface and make driving safer while at the same time providing a beneficial convenience feature for the user.
In the third embodiment the method comprises the following steps (a minimal code sketch follows the list):
- Detecting at least a portion of an occupant in the interior 14 of the vehicle 10 by means of the sensor device 12.
- On the basis of the detected portion: determining at least one local relation between the portion and a component 20 of the vehicle 10 by means of the control unit 16 of the vehicle 10, wherein, for example, the component 20 is arranged in the interior 14.
- Presenting at least one option list to the occupant, the option list comprising a plurality of items being selectable by the occupant.
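A minimal sketch of these steps, assuming a hypothetical mapping from tapped components to option lists; the component keys and items are invented and merely mirror the climate example discussed further below.

```python
# Hypothetical mapping from tapped components (20) to option lists.
OPTION_LISTS = {
    "air_vent":         ["air distribution", "air flow", "temperature"],
    "roof_light_panel": ["reading light", "ambient brightness"],
}

def present_option_list(component, display=print):
    """Show the selectable items for the tapped component, e.g. on display 24."""
    for i, item in enumerate(OPTION_LISTS.get(component, []), start=1):
        display(f"{i}. {item}")

present_option_list("air_vent")
```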
For example, the third embodiment makes use of 3D tracking to provide shortcuts to commonly used features. Moreover, the method uses 3D tracking in the interior 14 to detect user interaction with elements or components of the vehicle 10 in order to offer shortcuts to mentally associated features, wherein said elements or components are arranged in the interior 14. For example, said 3D tracking is realized by the sensor device 12. Said tracking is enabled through a package which is, for example, formed by the sensor device 12. Said package contains at least one tracking sensor such as a depth sensor or camera or other proximity-sensing or user tracking technology. The package can be mounted in the ceiling or other positions in the cabin (interior 14) that allow it to track, for example, hand and/or arm movements of the driver and/or the passenger, i.e. at least one occupant in the interior 14. Depending on the location, the tracking sensor can either cover parts of the cabin, for example the center console area, or the full cabin. The tracking sensor can track hands interacting with certain elements arranged in the interior 14 and offer the user direct access to certain features.
In the following, possible use cases of the third embodiment are described. To bring up or control a certain feature, the user could touch or tap a specific element of the vehicle interior with their hand. Ideally, this will be a one-shot interaction, meaning the user can complete the interaction with only one tap or touch, or at least does not have to change the input modality in order to adjust more detail within the feature. The method makes use of the natural behavior of people touching objects in order to interact with them and enriches it with more functionality.
Also, the 3D tracking makes it possible to distinguish whether the passenger or the driver is interacting with a certain element of the vehicle interior. This information can be used to default the shortcut feature functionality to the person actually using it, meaning the driver will only adjust the setting for themselves when interacting with an element of the vehicle interior, whereas the passenger will only adjust passenger settings. One of the possible elements or components of the vehicle interior that the method can be applied to is an air vent. In other words, possible elements arranged in the interior 14 that the method can be applied to are air vents. The user could, for example, just tap the air vents to bring up a climate setting for, for example, air distribution, air flow or similar. Even more, the user could just tap several times to toggle through different levels of that specific setting. When the driver is tapping, the system can default to the driver's climate settings. When the passenger is tapping the physical element, it can default to the passenger's settings. Thus, the user can toggle through a menu particularly easily. For example, said menu, through which the user can toggle by, for example, tapping a component such as an air vent, can be the aforementioned option list, wherein said selectable settings can be the selectable items of the option list.
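The tap-to-toggle behavior with per-occupant defaults could be sketched as follows; the setting levels and occupant labels are illustrative assumptions.

```python
LEVELS = ["low", "medium", "high"]

class TapToggle:
    """Each tap advances the setting one level; which occupant's settings change
    follows from who is tapping, as resolved by the 3D tracking."""
    def __init__(self):
        self.level_index = {"driver": 0, "passenger": 0}

    def tap(self, occupant):
        self.level_index[occupant] = (self.level_index[occupant] + 1) % len(LEVELS)
        return LEVELS[self.level_index[occupant]]

vent = TapToggle()
print(vent.tap("driver"))     # medium: the driver's climate setting advances
print(vent.tap("driver"))     # high
print(vent.tap("passenger"))  # medium: passenger settings stay independent
```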
Figs. 6 and 7 show an example of a shortcut feature. In the example, the components 20 are configured as air vents. When the user taps one of said air vents with their hand 30, a certain climate control setting will open on, for example, the display 24. The climate control setting comprises, for example, an option list comprising a plurality of selectable items such as selectable settings. By tapping multiple times the user could toggle through different levels of the setting, i.e. through the various items.
There are many other elements such shortcuts could be used on. Some possible elements of the vehicle interior that the method could be applied to are the light panel on the ceiling, a clock, the ambient light and basically any other element that is mentally associated with a certain feature or functionality. Tapping or gesturing near these items could then serve as a shortcut to control and/or bring up a display of mentally associated features such as a reading light, calendar, time settings, ambient light colors or brightness and many others.
In the following, a discovery feature or discovery mechanism of the third embodiment will be described. For example, the third embodiment also covers a discovery mechanism for the various interactions possible. Although not much teaching is necessary, the user should be informed about what is happening and which elements of the vehicle interior are interactive. Therefore, when the user is interacting with the interior 14 in a completely normal and natural way, for example adjusting the air vents, the user should be given hints about what else they can do with that respective element. The hints could, for example, show up on a screen, but it is also possible to use voice output. Also, the hints could contain a graphical and possibly animated representation of how the enriched interaction has to be performed. The hints could also serve as a shortcut to an overview of all possible enriched interactions with the vehicle interior, so the user does not have to discover every single one of them.
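One possible shape for such a discovery mechanism is a one-time hint per interactive element, as in this sketch; the hint texts and element keys are invented placeholders.

```python
SHOWN_HINTS = set()

ENRICHED_HINTS = {
    "air_vent": "Tip: tap the vent to open climate shortcuts; tap again to cycle levels.",
}

def maybe_show_hint(component, display=print):
    """On a natural interaction, surface the enriched functionality once."""
    if component in ENRICHED_HINTS and component not in SHOWN_HINTS:
        SHOWN_HINTS.add(component)
        display(ENRICHED_HINTS[component])

maybe_show_hint("air_vent")  # hint 36 appears, e.g. on display 24
maybe_show_hint("air_vent")  # already seen; no repeat
```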
Figs. 8 and 9 show an example for said discovery mechanism. When naturally interacting with at least one of the air vents, for example to adjust the air vents, the user gets a hint 36 about the enriched functionality of the air vents and information on how to perform the specific interaction. As can be seen from Fig. 9, said hint 36 is shown on the display 24.
List of reference signs
10 vehicle
12 sensor device
14 interior
16 control unit
18 signal
20 components
22 display
24 display
26 touchpad
28 touchscreen
30 hand
32 representation
34 hard keys
36 hint

Claims (7)

Patent Claims

1. A method for operating a vehicle (10), the method comprising:
- detecting at least a portion (30) of an occupant in the interior (14) of the vehicle (10) by means of at least one sensor device (12);
- detecting at least one element (20) in the interior (14) of the vehicle (10) by means of the sensor device (12);
- on the basis of the detected portion (30) and element (20): determining at least one local relation between the portion (30) and the element (20) by means of a control unit (16) of the vehicle (10);
- on the basis of the determined relation: predicting at least one future action or identifying at least one current action of the occupant by means of the control unit (16); and
- on the basis of the predicted or identified action: effecting at least one function of the vehicle (10) by means of the control unit (16).

2. The method according to claim 1, wherein the action is effected on the basis of the time and/or on the basis of at least a portion of the surroundings of the vehicle (10).

3. A method for operating a vehicle (10), the method comprising:
- detecting at least a portion (30) of an occupant in the interior (14) of the vehicle (10) by means of at least one sensor device (12);
- on the basis of the detected portion (30): determining at least one local relation between the portion (30) and at least one component (20) of the vehicle (10) by means of a control unit (16) of the vehicle (10); and
- on the basis of the determined relation: emitting at least one help signal in the interior (14) of the vehicle (10), the help signal being indicative of at least one function of the component (20).

4. The method according to claim 3, wherein the help signal comprises at least one acoustical signal.

5. A method for operating a vehicle (10), the method comprising:
- detecting at least a portion (30) of an occupant in the interior (14) of the vehicle (10) by means of at least one sensor device (12);
- on the basis of the detected portion (30): determining at least one local relation between the portion (30) and a component (20) of the vehicle (10) by means of a control unit (16) of the vehicle (10); and
- presenting at least one option list to the occupant, the option list comprising a plurality of items being selectable by the occupant.

6. The method according to claim 5, wherein the option list is optically and/or acoustically presented to the occupant.

7. The method according to claim 5 or 6, wherein the method comprises:
- by means of the sensor device (12): detecting at least one movement of the occupant in relation to the component (20); and
- selecting at least one of the items on the basis of the detected movement.
Application GB1609739.6A, priority and filing date 2016-06-03: Method for operating a vehicle, in particular a passenger vehicle. Published as GB2539329A (en); status: Withdrawn.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
GB1609739.6A | 2016-06-03 | 2016-06-03 | Method for operating a vehicle, in particular a passenger vehicle


Publications (2)

Publication Number | Publication Date
GB201609739D0 | 2016-07-20
GB2539329A | 2016-12-14

Family

Family ID: 56508036

Family Applications (1)

Application Number | Title | Priority Date | Filing Date | Status
GB1609739.6A | Method for operating a vehicle, in particular a passenger vehicle | 2016-06-03 | 2016-06-03 | Withdrawn (GB2539329A, en)

Country Status (1)

Country | Link
GB | GB2539329A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
US20060047386A1 * | 2004-08-31 | 2006-03-02 | International Business Machines Corporation | Touch gesture based interface for motor vehicle
US20080300756A1 * | 2007-05-29 | 2008-12-04 | Bayerische Motoren Werke Aktiengesellschaft | System and method for displaying control information to the vehicle operator
US20130090807A1 * | 2011-04-22 | 2013-04-11 | Yoshihiro Kojima | Vehicular input device and vehicular input method
KR20130068945A * | 2011-12-16 | 2013-06-26 | 현대자동차주식회사 | Interaction system for vehicles
DE102013001330A1 * | 2013-01-26 | 2014-07-31 | Audi Ag | Method for operating air conveying fan of fan device of motor vehicle, involves determining predetermined gesture in such way that occupant abducts fingers of his hand before clenching his fist
CN203084540U * | 2013-03-01 | 2013-07-24 | 公安部第三研究所 | Vehicle-mounted intelligent multimedia terminal device
US20160004322A1 * | 2013-07-05 | 2016-01-07 | Clarion Co., Ltd. | Information Processing Device
US20150370329A1 * | 2014-06-19 | 2015-12-24 | Honda Motor Co., Ltd. | Vehicle operation input device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
EP3392670B1 * | 2017-04-20 | 2024-06-19 | Polestar Performance AB | A method and system for spatial modelling of an interior of a vehicle

Also Published As

Publication Number | Publication Date
GB201609739D0 (en) | 2016-07-20


Legal Events

Code | Description
WAP | Application withdrawn, taken to be withdrawn or refused after publication under section 16(1)