WO2005055046A1 - Method and system for interaction between a vehicle driver and a plurality of applications - Google Patents

Method and system for interaction between a vehicle driver and a plurality of applications

Info

Publication number
WO2005055046A1
Authority
WO
WIPO (PCT)
Prior art keywords
driver
interaction
applications
application
manager
Prior art date
Application number
PCT/EP2004/013229
Other languages
French (fr)
Inventor
Johan ENGSTRÖM
Petter Larsson
Original Assignee
Volvo Technology Corporation
Priority date
Filing date
Publication date
Priority claimed from SE0303122A external-priority patent/SE0303122D0/en
Application filed by Volvo Technology Corporation filed Critical Volvo Technology Corporation
Priority to EP04803214.8A priority Critical patent/EP1706815B1/en
Priority to JP2006540356A priority patent/JP4659754B2/en
Priority to BRPI0416839-9A priority patent/BRPI0416839A/en
Publication of WO2005055046A1 publication Critical patent/WO2005055046A1/en
Priority to US11/419,511 priority patent/US8009025B2/en

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/037 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 - Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/29 - Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/85 - Arrangements for transferring vehicle- or driver-related data
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 - Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/18 - Information management
    • B60K2360/195 - Blocking or enabling display functions
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 - Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/589 - Wireless data transfers
    • B60K2360/5899 - Internet

Definitions

  • the invention relates to a method and system for communication and/or interaction between a human being, especially a vehicle driver, and a plurality of integrated and/or non-integrated applications like e.g. native vehicle applications and/or aftermarket applications and/or nomad applications. Especially, the invention relates to such a method and system for managing the communication and/or interaction by means of an interaction manager.
  • GIDS Generic Intelligent Driver Support
  • Prioritization, which prevents conflicting information from different support and service functions from being presented simultaneously, by presenting information sequentially and according to priority;
  • ID Application identification
  • message priority on a 6-point scale
  • preferred time of presentation, content of message and specifications for the integrated HMI (Human Machine Interface)
  • HMI Human Machine Interface
  • order within task cluster and imposed workload per resource (i.e. how much workload the message will impose on the driver in each of the sensory modalities).
  • the dialogue controller decides when and how the message is presented to the driver.
  • the dialogue controller is also responsible for actually presenting the information to the driver through an integrated HMI.
  • the basic GIDS information management function is to filter information presented through the common HMI of the dialogue controller.
  • no method is disclosed for enabling the integrated management of stand-alone systems which have their own HMIs (e.g. aftermarket and nomad systems).
  • U.S. patent application US 2002/0120374A1 discloses a system and method for driver performance improvement by which operator activity data relating to activities of the operator within an interior portion of the vehicle are monitored and vehicle operating data, vehicle environment data and operator condition data are received. An operator cognitive load is estimated and on the basis of these data vehicle information is prioritized for selectively informing the operator of vehicle information.
  • the system may also operate with wireless communication devices like mobile phones, PDAs and pagers and prioritize incoming calls, e-mails and text and data messages of such devices, respectively.
  • an application shall cover any systems, components, functions, devices, units and modules which are able, after their activation, to communicate and/or interact uni- and/or bidirectionally with the vehicle driver, for example by initiating, sending and/or receiving actions to/from the driver, sending and/or receiving messages to/from the driver etc.
  • Such an application can also be very sophisticated like e.g. a collision avoidance system.
  • a driver/vehicle environment (DNE) state is a state which is evaluated on the basis of output signals of one or a plurality of sensors for detecting parameters of the driver and/or the vehicle and/or the environment.
  • inventive methods and systems are able to handle integrated or "native" applications which are implemented in the vehicle before shipping, as well as non-integrated applications like aftermarket applications that are added later and those applications which are brought into the vehicle temporarily or permanently by the driver or a passenger (nomad applications).
  • the centralised management of the communication and/or interaction between a driver and the applications by means of an interaction manager opens a great potential for a high degree of modularity, a comparatively simple system architecture and the possibility to extend the system in a simple way by additional applications which are able to send requests and receive responses according to claim 1.
  • the inventive methods and systems offer an open system architecture and the use of standardized data protocols so that they can be extended in a very flexible way with applications which are built-in later (or to nomad applications), without the need to change the interaction manager itself.
  • the interaction manager controls the capability of the applications to communicate and/or interact with a vehicle driver in dependence of a DNE state and in the absence of any request from the related application.
  • inventive methods, systems and applications are not limited to the communication and/or interaction with a vehicle driver but with any human being who is confronted more or less simultaneously with a plurality of signal or information sources which in case of activation have to be considered or handled in dependency of at least one certain environment state and/or other such activated signals and/or information sources and/or other conditions.
  • Fig. 1 a first functional architecture of such an exemplary and preferred embodiment of a system according to the invention
  • Fig. 2 a look-up table for a DNE state-dependent control of the service state or functions of an in-vehicle application according to the invention
  • Fig. 3 an embodiment of another application according to the invention
  • Fig. 4 a second functional architecture of an exemplary and preferred embodiment of a system according to the invention
  • Fig. 5 a third functional architecture of an exemplary and preferred embodiment of a system according to the invention.
  • the inventive system according to Figure 1 can be implemented e.g. on a multiplex vehicle bus like the well-known CAN-bus or MOST-bus or a wireless LAN (Local Area Network), working with the Bluetooth or any other standard, and comprises the following four main components:
  • the first main component is a sensor array 1 which generally comprises one, but preferably a plurality of sensors of all possible types that are provided to monitor and/or to detect the state of the driver and/or the state of the vehicle and/or the state of the environment.
  • the sensor array 1 comprises exemplarily a first group 10 of driver state sensors which for example are head movement sensors and/or gaze sensors and/or eyelid closure sensors for tracking the movement of the head and/or the eye and/or the eyelid of a driver.
  • the gaze direction is the direction in which a person momentarily directs the attention (fovea) of his eyes, with the eye-balls as reference.
  • a second group 11 of vehicle state sensors comprises for example speed sensors, accelerometers, steering wheel angle sensors, pedal position sensors, gyros, tire pressure sensors or other sensors for detecting various vehicle related information.
  • a third group 12 of environment state sensors comprises e.g. radar and/or laser sensors and/or video cameras and is provided for detecting and/or monitoring e.g. the surrounding traffic.
  • a fourth group 13 of sensors (e.g. GPS sensors with map matching) and a fifth group 14 of sensors (e.g. lane tracking sensors) monitor the position of the vehicle on the road and/or other environmental states.
  • the second main component is formed by a plurality of units, modules or applications 2 which are integrated into the vehicle.
  • these applications share the sensor array 1 (arrow Al) and/or those input/output (I/O) devices 431 (like e.g. displays, audio systems, buttons, knobs etc.) which belong to a common integrated human machine interface 43 (HMI) which is explained below.
  • I/O input/output
  • These integrated units, modules or applications 2 comprise as well the core computation units of an Advanced Driver Assistance System (ADAS) and/or an In-Vehicle Information System (INIS) that are integrated into the vehicle architecture.
  • these integrated applications 2 are for example an attention support system 21, a route guidance system 22, a lane departure warning system 23, a tyre pressure monitor system 24 and information and entertainment systems 25 like e.g. radio, CD, DVD and others.
  • based on the output signals of the sensor array 1, the integrated applications 2 perform the computations needed for determining what action to take (e.g. issue a warning). They then use the common integrated HMI 43 (arrow A5) for interacting with the driver D (arrow A2).
  • the third main component is a plurality of units, modules or applications 3 which might be integrated into the vehicle architecture but not into the integrated HMI 43. These are considered as stand-alone applications 3. Generally, these applications 3 have their own sensors and/or input/output (I/O) devices, like displays, keyboards and/or other non-integrated HMIs 34 for communicating and/or interacting with the driver D.
  • the units, modules or applications 3 comprise according to Fig. 1 e.g. integrated applications 31 which, however, utilize their own HMI (which is not integrated into the integrated HMI 43), aftermarket applications 32 including those that are added to the vehicle after it has been shipped, and nomad applications 33 like mobile phones, portable media players (e.g. CD players) or hand held navigation systems like a GPS receiver.
  • the fourth main component is the driver information unit 4. This central unit is of primary importance and contains an interaction manager 41, a driver/vehicle environment (DNE) state estimator/predictor 42 and the integrated human machine interface (HMI) 43 (which has been mentioned above) for communicating and/or interacting with the driver D (arrow A2).
  • the interaction manager 41 contains the hardware/software (e.g. for realizing a prioritization algorithm and a waiting queue, as explained below) for managing the driver's communication and/or interaction with the integrated ADAS/INIS modules and applications 2 via the integrated HMI 43 (arrow A5) and with the stand-alone applications 3 via the non-integrated HMIs 34, in dependence of the real-time input from the driver/vehicle environment (DNE) state estimator/predictor 42 (arrow A3).
  • the DNE state estimator/predictor 42 computes a (potentially) multidimensional estimate of a current and/or a predicted DNE state on the basis of output signals of the sensor array 1 (arrow A4) and/or the output signals of at least one of the integrated applications 2 (arrows A5 and A9) and continuously outputs a corresponding DNE state vector to the interaction manager 41 (arrow A3).
  • This state vector for example comprises and/or is established on the basis of at least one of the following criteria or parameters: primary task demand (e.g. the complexity of the driving situation and how critical the situation is for the driver), secondary task demand (e.g. driver activity focused on other tasks than the primary driving task), visual distraction, physiological driver impairment (like drowsiness, drug influence), driver characteristics (age, experience etc.), specific driving situations (e.g. overtaking another vehicle), the overall driving environment type or context (e.g. highway, country road, rural, suburbia or city), as well as the driver identity.
  • These criteria or parameters of the state vector can be computed by a sub-module within the DNE state estimator/predictor 42 on the basis of one or more sensor signals of one or more sensors in the sensor array 1 (arrow A4) and/or of one or more output signals of the integrated applications 2 (arrow A5 and A9), e.g. the driver state signals (e.g. sensor signals indicating gaze direction and alertness), the vehicle state signals (e.g. sensor signals indicating speed, acceleration, steering wheel angle, pedal positions, gyro signals, tyre pressure) and/or the environment state signals (e.g. sensor signals indicating road type, road surface condition, surrounding obstacles, geographical position etc.).
  • the DNE state estimation may include a prediction of future states as well.
  • the exact definition of the DNE state vector is based on the needs of the interaction manager 41 (i.e. what information is required for the interaction management functions).
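  • purely as an illustration of such a state vector, the sketch below shows one possible in-memory representation; the field names and types are assumptions of this sketch, since the patent only requires that the vector carries whatever information the interaction manager 41 needs.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DneStateVector:
    """Illustrative DNE state vector; fields mirror the criteria listed above."""
    primary_task_demand: float        # complexity/criticality of the current driving situation
    secondary_task_demand: float      # driver activity focused on non-driving tasks
    visual_distraction: float         # e.g. derived from gaze/head-tracking sensors
    physiological_impairment: float   # drowsiness, drug influence, etc.
    driving_context: str              # e.g. "highway", "country road", "city"
    driver_identity: Optional[str] = None  # identity of the current driver, if known
```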
  • the driver information unit 4 comprises the integrated human machine interface (HMI) 43 which is the main interface between the driver D (arrow A2) and the integrated ADAS/INIS modules and applications 2 (arrow A5), containing one or more I/O devices 431 for enabling the driver D to communicate and/or interact with the system in different sensory modalities or via different sensory channels (visual, auditory, haptic).
  • the stand-alone applications 3 may, but do not have to, use the integrated HMI 43 because these applications 3 usually have their own built-in HMIs. So the driver D potentially can communicate and/or interact with the stand-alone applications 3 via their own built-in HMIs (arrow 34).
  • a first main feature of the inventive method conducted by the system according to Figure 1 is the scheduling of application-initiated information, i.e. any communication and/or interaction with the driver D, in dependence of a current and/or predicted driver/vehicle environment (DNE) state.
  • a second main feature is that, when communications and/or interactions with the driver D are initiated simultaneously by several different applications 2, 3 (including the case in which a request is queued and awaiting approval and a request for a second communication and/or interaction is received) the system selects the most important communication and/or interaction on the basis of a prioritization (according to certain criteria, see below). Other communications and/or interactions are put in a waiting queue and are performed later in order of priority.
  • the interaction manager 41 checks if the application is compatible with the interaction manager 41, i.e. whether the application can be controlled by the interaction manager 41. If the application does not have such compatibility by default, the interaction manager 41 (or the application 3) checks for the possibility to download and/or install appropriate software required to make the application 3 compatible with the interaction manager 41.
  • the interaction manager 41 dynamically assigns an identification number to the application which is unique to this application within the system. The application will keep the assigned identification number as long as it is connected to the interaction manager 41, i.e. as long as the link between the application and the system exists. After termination of the link and connecting the application again, a new (possibly the same) identification number is assigned to the application.
  • before an application 2, 3 communicates and/or interacts with the driver D, it has to execute a first routine with the interaction manager 41 (arrows A6 and A7) which comprises e.g. the following steps:
  • Awaiting a response (preferably in a standardised format as well, see below) from the interaction manager 41.
  • This response comprises an indication about either (a) "permission granted - go ahead” or (b) "permission denied - wait and hold the communication and/or interaction";
  • preferably, a request is sent a few times (e.g. 3 to 10 times) to allow for a possible loss of the request on the bus or the wireless LAN.
  • after a few tries, the application sending the request supposes that the interaction manager 41 is no longer available (e.g. that there are some problems with the interaction manager 41) and the application outputs a diagnostic message, e.g. on a diagnostic bus, but preferably does not communicate and/or interact with the driver with respect to this message.
  • This procedure is applicable as well for an application to detect whether there is an interaction manager 41 present in a specific system or not. This might be of relevance if certain applications shall be used in a system without any interaction manager as well.
  • the application has to check if an interaction manager is present or not. If there is not, the application will switch over and work in a stand-alone mode, i.e. it will communicate/interact with the driver regardless of the state of the driver and/or the HMI and/or the vehicle and/or the environment. Otherwise it follows the above-described routine.
  • the same application can be used both in interaction manager controlled systems as well as in systems without an interaction manager.
  • the modularity of the applications is increased considerably.
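  • as a hedged sketch of this application-side routine (the transport object `bus` and its send()/receive() methods are hypothetical, as are the retry and timeout values), the behaviour described above might look as follows:

```python
MAX_TRIES = 5            # the request is re-sent a few times (e.g. 3 to 10)
RESPONSE_TIMEOUT_S = 0.5 # per-try wait for a response; value is an assumption

def request_permission(bus, app_id, interaction_id, priority):
    """Ask the interaction manager for permission to interact with the driver.

    Returns True (go ahead), False (wait and hold), or None if no interaction
    manager appears to be present on the bus or wireless LAN."""
    request = {"app_id": app_id, "interaction_id": interaction_id, "priority": priority}
    for _ in range(MAX_TRIES):
        bus.send(request)                                  # a request may be lost, hence the retries
        response = bus.receive(timeout=RESPONSE_TIMEOUT_S)
        if response is not None and response.get("app_id") == app_id:
            return bool(response["permission"])            # True = granted, False = hold
    return None                                            # interaction manager not reachable

def interact(bus, app_id, interaction_id, priority, present_to_driver):
    permission = request_permission(bus, app_id, interaction_id, priority)
    if permission is None:
        present_to_driver()   # stand-alone mode: no interaction manager in this system
    elif permission:
        present_to_driver()   # permission granted - go ahead
    # else: permission denied - hold the interaction and await a later "go ahead"
```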
  • the corresponding part of the first routine to be executed by the interaction manager 41 when a request is received from an application 2, 3 comprises e.g. the following steps:
  • Such a response can include an indication regarding the number of the request in the waiting queue.
  • the response is repeated circularly or whenever another request with a higher priority has left the waiting queue.
  • if an application withdraws its request, it preferably sends a delete message to the interaction manager instructing the same to remove the request from the waiting queue.
  • This is handled according to the description below with respect to "dynamic priorities". For certain applications 2, 3 which e.g. generate highly safety-critical message types, this routine may be skipped entirely, and the message is pushed through regardless of what requests are stored in the waiting queue.
  • highly safety-critical messages could also be included in the prioritization (but not in the scheduling) and handled by the interaction manager. Thus, such highly safety-critical messages preferably but not necessarily can be handled regardless of the DNE state, but they follow the same procedure as for other requests.
  • the current non-safety-critical request is preferably put into the waiting queue (again) with the highest possible priority allowed for non-safety-critical requests and is permitted according to the above first routine and/or as soon as the safety-critical message(s) have been presented.
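  • a minimal sketch of the manager-side behaviour described above, assuming a numeric priority scale on which values above a chosen threshold count as safety-critical (the threshold and the data structures are assumptions of this sketch):

```python
import heapq
import itertools

SAFETY_CRITICAL_PRIORITY = 9.0   # assumed threshold above which a message is pushed through

class InteractionManagerSketch:
    """Queues non-critical requests by priority and releases them when the DNE state allows."""

    def __init__(self):
        self._queue = []                   # entries: (-priority, arrival_number, request)
        self._arrival = itertools.count()  # first-in-first-out tie-break for equal priorities

    def submit(self, request, dne_allows_interaction):
        """Returns a response string; queued requests wait for release_next()."""
        if request["priority"] >= SAFETY_CRITICAL_PRIORITY:
            return "permission granted - go ahead"         # irrespective of the DNE state
        if dne_allows_interaction and not self._queue:
            return "permission granted - go ahead"
        heapq.heappush(self._queue, (-request["priority"], next(self._arrival), request))
        return "permission denied - wait and hold"

    def release_next(self):
        """Called when the DNE state permits the next interaction; returns the request to approve."""
        if self._queue:
            return heapq.heappop(self._queue)[2]
        return None
```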
  • the requests for communicating and/or interacting with the driver D, sent to the interaction manager 41 by the different integrated and/or non-integrated applications 2, 3 as well as the responses generated and transmitted by the interaction manager 41 preferably follow a standardised format.
  • Such a format and data structure for the requests sent by an application comprises e.g. the following fields:
  • this identifier may be dynamically assigned to the application 33 by the interaction manager 41 as described above, when and as long as it is connected to the vehicle network.
  • the data type is preferably "integer";
  • This field contains an identifier of the communication and/or interaction (e.g. a message "low fuel", an incoming phone call, a vehicle diagnostic message, a route guidance message etc.) with the driver D, requested by the application 2, 3.
  • the data type is preferably "integer”.
  • the identifier, however, has no connection to the type of communication/interaction itself, but is just an (integer) number.
  • This field contains preferably a floating number representing a priority index of the communication and/or interaction, established by means of a standardised method, e.g. SAE J2395. Representing the priority index by a float rather than by an integer has the advantage that it creates unique priorities. Otherwise, if two communications and/or interactions with the same priority index are initiated, the well-known first-in-first-out principle (i.e. the message that was initiated first is presented first) is preferably applied. Therefore, the data type is preferably "float".
  • the responses of the interaction manager 41 are preferably represented by a standardised format and data structure which contains e.g. the following fields:
  • Application identification comprises the identifier of the application 2, 3 sending the request.
  • the data type is preferably "integer";
  • This field indicates the identifier of the communication and/or interaction (e.g. as mentioned above) requested by the request.
  • the data type is preferably "integer";
  • This field contains the answer to the request, comprising an indication about either (a) “permission granted - go ahead” or (b) "permission denied - wait and hold the communication and/or interaction".
  • the data type is preferably "Booelan”, e.g. "1" for (a) and "0" for (b).
  • these routines constitute a simple way to control any application 2, 3 that follows the standard formats and data structures specified above, without the need for the interaction manager 41 to keep an updated list of all applications 2, 3 and an updated list of all the communications and/or interactions (which the related application can perform) and their priorities (as in the prior art according to the GIDS project).
  • This is particularly useful in the case of nomad applications 33, which in the near future could be expected to seamlessly connect to the vehicle data bus, e.g. via a wireless link (such as Bluetooth).
  • a nomad application manufacturer is developing a certain PDA. He lists all possible communications and/or interactions initiated by the PDA and assigns a priority index to each of them, using a standardised method (e.g. in accordance with SAE J2395) and storing a related table in the PDA. In practice, this could also be done by an authorised institution. Instead of listing all possible interactions and/or communications the manufacturer can list groups of possible interactions and/or communications and assign the same priority to the entire group. This may be advantageous if it is impossible to anticipate all possible interactions/communications, especially if there are dynamic interactions/communications.
  • the routine described above with respect to sending requests to and receiving responses from the interaction manager 41 is implemented in the PDA as well so that it, when connected to the vehicle bus (and after the interaction manager 41 has assigned an application identification to the PDA as described above), always sends a request with respect to the desired communication and/or interaction (and its priority) to the interaction manager 41 and awaits a response indicating said permission, before communicating and/or interacting (e.g. presenting information and/or initiating an action) with the driver D.
  • Another possibility is to provide an adapter especially for those applications which are not suitable for storing said table and/or for implementing the above routine.
  • the adapter has the function of an interface to the system and conducts the above routine.
  • the application or the interaction manager could also download software from the internet or via a service provider for achieving the capability of interaction between the application and the interaction manager.
  • this might be subject to a subscription or the like, requiring a permission/safeguard that the right person/application is requesting such an adapter program (e.g. in case this costs a fee).
  • for this purpose, encryption can be used, like e.g. a public key encryption method.
  • the interaction manager 41 does not need to know which application 2, 3 had sent the request. It is sufficient that the interaction manager 41 knows that an application X is asking for permission to perform a communication and/or interaction Y (e.g. presenting a message), preferably with a priority index P, without knowing what X and Y actually are.
  • the only extra standardisation needed is a specified format and data structure or protocol for requests and replies, e.g. as disclosed above.
  • the format and data structure can be extended to contain additional information of the communication and/or interaction to be performed. This information could be optional (if this information is missing it is replaced by default values).
  • This field indicates the estimated time that the driver needs for the communication and/or interaction, e.g. in case of a message, to comprehend the message. This is also the time before a subsequent communication and/or interaction can be performed (in seconds).
  • the data type is preferably "float"; However, it is preferred that safety-related requests (or any other request of a certain, predefined priority) are always be processed immediately irrespective whether another message/action of lower priority is currently active. The high priority message/action will be forwarded interrupting the other message/action.
  • the preferred procedure for handling safety-critical messages, e.g. warnings (with priority above a certain threshold), is the following: a.) the message is pushed through irrespective of the DNE state; b.) the message overrides any message with low or medium priority currently active, wherein such current message is preferably put on hold into the waiting queue with the highest possible priority allowed for non-safety-critical messages (or requests); c.) several safety-critical messages initiated roughly simultaneously are presented in order of priority.
  • This field indicates the load imposed on the visual channel to the driver.
  • the data type is preferably "integer";
  • Auditory load This field indicates the load imposed on the auditory channel to the driver.
  • the data type is preferably "integer";
  • This field indicates the load imposed on the haptic channel to the driver.
  • the data type is preferably "integer”.
  • a first preferred extension of the inventive method is the handling of dynamic priorities:
  • the priority of a request may change over time.
  • a route guidance message such as "turn right at the next intersection", which becomes more urgent and important the closer the vehicle comes to the intersection.
  • a simple way to consider such dynamic priorities is to use multiple requests with the same identification but different priorities, e.g. a first request for a message when there are 100-200 meters left to the intersection and a second request for the same message when there are less than 100 meters left to the intersection.
  • this requires that the first request is taken out of the waiting queue at the interaction manager 41 when it becomes irrelevant and is replaced by the updated request. In other cases, it may be necessary to delete a request from the waiting queue without replacing it.
  • the phone application may send an updated request (same message ID) with a higher priority after some time, e.g. 5 seconds, because answering the call has become more urgent since the risk of the caller hanging up increases with time.
  • An important feature of this system is that the risk of losing a call or information is reduced.
  • the caller gets periodically updated information about the number or place of his call in the waiting queue.
  • the standard format (protocol) of the request is extended by a field specifying the requested operation on the waiting queue as either add_request, replace_request or delete_request as follows:
  • the data type is preferably "integer”.
  • a third main feature of the inventive method is the control of the service state (i.e. the control of the mode of operation or function) of the applications 2, 3 by means of the interaction manager 41, based on and in dependence of a current and/or a predicted DNE state.
  • this is accomplished for the integrated HMI 43 by configuration of the same under direct control of the interaction manager 41 as illustrated by arrows A8 in Figure 1.
  • the configuration is controlled by the interaction manager 41 by sending instructions to these applications 31.
  • These instructions are considered as an additional message type and have a standardised format (protocol) so that they can be used by the interaction manager 41 to control e.g. the service state or the mode of operation or function(s) of these applications 2, 3 or to configure the related HMI 34, 43, including enabling and/or disabling one or more of the entire applications 2, 3.
  • the current driving situation index S n is then distributed to all non-integrated (stand-alone) applications 3 (possibly to the integrated applications 2 as well) by the interaction manager 41.
  • the developer of these applications 3 would then have to define a look-up table specifying how the service state or mode of operation or function(s) is determined by the current driving situation index Sn.
  • Such a look-up table for situation-dependent control of the service state or mode of operation or function(s) of non-integrated applications 3 is exemplified in Figure 2.
  • the service state is defined by the enabling/disabling of certain functions F1 to F8 of the related application depending on the driving situation index S1 to S4.
  • the look-up table could be generalised to represent more complex service states than mere enabling/disabling of functions.
  • this table contains dual states of the type "on"/"off" or "loud"/"quiet" or "bright"/"dark" etc.
  • this table may have n states (very loud, loud, neutral, more quiet, absolute quiet) which then are a function of the DNE parameters as well.
  • This control thus requires a message type that the interaction manager 41 could use to distribute the driving situation index S n .
  • the proposed message format for enabling/disabling functions preferably comprises only one index which is specified as follows:
  • This field is preferably an integer value representing the current general driving situation index Sn.
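  • in the spirit of Figure 2, the application-side look-up table and the handling of the distributed driving situation index might be sketched as follows (the concrete table entries are invented for illustration):

```python
ALL_FUNCTIONS = [f"F{i}" for i in range(1, 9)]   # functions F1..F8 of the application

# Which functions stay enabled for each driving situation index S1..S4 (illustrative values only).
SERVICE_STATE_TABLE = {
    1: {"F1", "F2", "F3", "F4", "F5", "F6", "F7", "F8"},  # S1: low demand, everything enabled
    2: {"F1", "F2", "F3", "F5"},                          # S2: some functions disabled
    3: {"F1", "F2"},                                      # S3: only essential functions
    4: set(),                                             # S4: high demand, application effectively muted
}

def on_situation_index(situation_index, enable_function, disable_function):
    """Called whenever the interaction manager distributes a new situation index."""
    enabled = SERVICE_STATE_TABLE.get(situation_index, set(ALL_FUNCTIONS))
    for function in ALL_FUNCTIONS:
        if function in enabled:
            enable_function(function)
        else:
            disable_function(function)
```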
  • a second preferred extension of the inventive method is the handling of a text-to-speech or text-to-display output by means of a text-converter:
  • Text-to-speech or text-to-display is likely to be a common type of HMI in future vehicles.
  • the basic idea is to give the driver access to longer texts (e.g. emails) without imposing excessive visual distraction when reading it as a whole from a display.
  • comprehending the spoken text still imposes a cognitive load on the driver, which may contribute to overload in demanding and/or critical driving situations.
  • Passengers in the car talking to the driver usually avoid this problem by pausing when the workload imposed on the driver by the driving task increases.
  • the idea of the proposed function is to schedule the text output based on the output from the DNE state estimator/predictor. The general principle is illustrated in Figure 3.
  • Figure 3 shows a preferred embodiment of a text-converter 26 which can be considered as a specific form of an application that can be an integrated or a non-integrated application 2, 3. Two different cases have to be considered with respect to the text to be output:
  • pre-chunked messages, e.g. a route guidance message generated by a route guidance system A
  • non-pre-chunked messages, e.g. an e-mail generated by an e-mail or SMS system B, for which real-time parsing is necessary.
  • the application 26 is generally handled according to the methods outlined above. In the preferred embodiment as illustrated in Figure 3, the following procedure is executed:
  • the raw text representation "X1-X2-X3" (standing for any text comprising the text segments X1, X2, X3 to be converted to speech and output) is initiated by an information system like a route guidance system A and/or an e-mail or SMS system B.
  • X1, X2, X3 could be a paragraph, a sentence, a phrase, a word, a letter, a figure, etc.
  • the text may be chunked up in advance by the information system itself. This is generally possible for pre-defined messages (route guidance messages or similar messages). In this case, the message is passed directly to a waiting queue 262 provided within the application 26.
  • the messages may not be pre-chunked, as is the case for e-mails and SMS (B).
  • the raw text is fed into a parser 261 which segments the text into grammatical categories (a parser 261 is a piece of software that employs a stored lexicon and a grammar in order to segment a text into hierarchical grammatical categories like e.g. paragraphs, sentences, phrases, words etc.).
  • the parser 261 thus slices up the text "X1-X2-X3" into chunks X1, X2 and X3 (e.g. phrases), which are then put into the waiting queue 262 in order of presentation.
  • a simpler alternative to grammatical parsing would be to just look for specific dividing characters, e.g. comma, punctuation and/or colon in order to divide the text into meaningful chunks.
  • the text chunks X1, X2 and X3 are treated like the communications and/or applications according to the method described above.
  • the speech generator 263 sends requests to the interaction manager 41 and waits for responses confirming that presentation is permitted.
  • if a chunk X1, X2 and/or X3 is held for longer than a certain first period of time, e.g. 10 seconds, the previous chunk is preferably repeated in order to facilitate understanding. If a chunk X1, X2 and/or X3 is held for longer than a second period of time (e.g. 20 seconds), the two previous chunks are repeated and so on.
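  • a minimal sketch of the simpler alternative mentioned above (splitting on dividing characters) and of this repeat-on-hold behaviour; the character set and timing constants are assumptions of this sketch:

```python
import re

REPEAT_STEP_S = 10.0   # each further 10 seconds of hold repeats one more previous chunk

def chunk_text(raw_text):
    """Split a non-pre-chunked text (e.g. an e-mail) on commas, full stops, colons and semicolons."""
    return [chunk.strip() for chunk in re.split(r"[,.:;]", raw_text) if chunk.strip()]

def utterance_after_hold(spoken_chunks, next_chunk, hold_time_s):
    """If the next chunk was held for n * REPEAT_STEP_S seconds, repeat the n previous chunks first."""
    repeats = int(hold_time_s // REPEAT_STEP_S)
    context = spoken_chunks[-repeats:] if repeats > 0 else []
    return context + [next_chunk]
```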
  • the parser 261 might be an intelligent one which recognises not only grammatical structures, but also content (a semantic parser). Methods used in natural language processing systems might be applicable here to identify such logical units.
  • the driver is preferably informed of this, e.g. by a characteristic tone or a short voice message such as "Message interrupted" or "Stop".
  • an appropriate display could be provided for displaying the text chunks X1, X2, X3, if the display is provided (by additional means) to send requests to the interaction manager 41 and to wait for responses confirming that displaying a chunk on the display is permitted.
  • each text chunk X1, X2, X3 can be output as speech and displayed on a display simultaneously or with a time delay as well, so that the driver can both listen to and read the text on a display.
  • FIG. 4 shows a second functional architecture of an exemplary and preferred embodiment of a system according to the invention.
  • This system substantially comprises a central unit 51 and a plurality of local units, in this example two local units 52, 53.
  • the central unit 51 comprises an interaction manager or driver vehicle environment manager 511 which controls a main information and resource manager 512.
  • This main information and resource manager 512 is provided for information management and for resource management and controls one or more HMI-devices 513, like e.g. a display and/or a speaker.
  • the main information and resource manager 512 receives requests from a plurality of applications 514 (e.g. a radio, an electronic control unit ECU, VECU, etc.) and sends responses to these applications 514 as explained above for permitting or not permitting communication with the HMI-device 513, wherein the applications 514 are connected with the at least one HMI-device 513.
  • Figure 4 shows a first local unit 52 which additionally functions as a gateway unit, e.g. via a CAN or LIN bus B2, for the second local unit 53 mentioned below.
  • the unit 52 comprises a local resource manager 522 which is provided for resource management and not for information management and which again controls one or more HMI-devices 523 like e.g. a display and/or a speaker e.g. in the form of an integrated or non-integrated HMI.
  • the first local unit 52 furthermore comprises a plurality of applications 524 like e.g. a mobile phone, a hands free phone and/or a fleet management system etc., which send requests to the local resource manager 522 and receive responses from it as explained above with respect to Figures 1 to 3.
  • the applications 524 are again connected with the HMI-devices 523 for communication with the same.
  • the local resource manager 522 is connected e.g. via a CAN bus B1 with the main information and resource manager 512 within the central unit 51, for sending requests to and receiving responses from it according to the methods disclosed with reference to Figures 1 to 3.
  • the second local unit 53 for example in the form of other systems of the vehicle again comprises a local resource manager 532 which is provided for resource management and not for information management.
  • the second local unit 53 comprises at least one application 534 (like e.g. nomad applications, mobile phones, navigation systems, etc.) having its own and/or an integrated or non-integrated HMI-device 533 for controlling the same after having sent a request to the local resource manager 532 and having received a related response from it as explained above.
  • the local resource manager 532 is connected e.g. via a CAN bus or LIN bus B2 with the local resource manager 522 of the first local unit 52 which is functioning as a gateway so that the local resource manager 532 can send requests to and receive responses from the main information and resource manager 512 within the central unit 51.
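  • a sketch of how the local resource managers might forward permission requests upstream in the gateway architecture of Figure 4; class and method names are invented for illustration, and the actual decision is taken by the interaction manager in the central unit:

```python
class MainInformationAndResourceManager:
    """Central unit 51: the only place where information management decisions are taken."""
    def request_resource(self, request):
        # Placeholder decision; the full system consults the DNE-aware interaction manager here.
        return request.get("priority", 0.0) > 0.0

class LocalResourceManager:
    """Local unit: performs resource management only and forwards requests upstream,
    either directly to the central unit or via another local unit acting as a gateway."""
    def __init__(self, upstream):
        self.upstream = upstream

    def request_resource(self, request):
        return self.upstream.request_resource(request)

# Gateway topology as in Figure 4: unit 53 reaches the central unit 51 through unit 52.
central_unit = MainInformationAndResourceManager()
first_local_unit = LocalResourceManager(upstream=central_unit)        # unit 52 (gateway)
second_local_unit = LocalResourceManager(upstream=first_local_unit)   # unit 53 behind the gateway
```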
  • This architecture offers greater flexibility because it can be more easily adapted to different vehicles with different needs and different local units, and it is relatively easy to connect further local units to the system, especially if a common bus system (especially a CAN bus) B1, B2 is used.
  • a central unit 51 is provided with a basic functionality which is standard equipment for all manufactured vehicles and which fulfils the needs of the typical user of such a vehicle.
  • the functionality of such a basic system can be expanded by one or more specific or local units ("add-on modules") 52, 53 which in case of special needs fulfil special functions and can be built in separately after the manufacturing of the vehicle as well.
  • Figure 5 shows a third functional architecture of an exemplary and preferred embodiment of a system according to the invention which, in contrast to the gateway architecture shown in Figure 4, is a star network architecture. However, both architectures can be combined as well.
  • a central unit 51 comprising substantially the same components as the central unit 51 in Figure 4.
  • the main information and resource manager 512 is provided according to the needs of the star network architecture.
  • a first local unit 52 and a second local unit 53 are provided as well, which again correspond to the first local unit 52 and the second local unit 53, respectively, as disclosed in Figure 4.
  • for details of these units 51, 52 and 53, reference is made to the description in connection with Figure 4 above.
  • the system architecture according to Figure 5 furthermore comprises a third local unit 64 and a fourth local unit 65.
  • the third local unit 64 is provided e.g. with a time critical functionality, e.g. in the form of a safety and/or a warning (alert) unit, which comprises an active safety HMI manager 642 which receives requests from and sends responses to a plurality of related safety applications 644 like e.g. an FCW (forward collision warning), an LDW (lane departure warning), an ACC (adaptive cruise control) etc.
  • a display 643 is provided which is controlled by these applications 644.
  • a basic difference between this third local unit 64 and the other local units 52, 53 and 65 is that, because of its time critical functionality, the active safety HMI manager 642 only submits an information signal via a bus B3 to the main information and resource manager 512 within the central unit 51, informing it about a warning condition, in case it receives a related request from at least one of the safety-critical applications 644 (while also sending a response to these applications for generating warning or alert signals on the display 643).
  • FIG. 5 shows a local unit 65 which comprises a local resource HMI manager in the form of an AMI-C (automobile multimedia interface-collaboration) content based HMI manager 652 which controls a related display 653.
  • This unit 65 furthermore comprises a plurality of applications 654 which are provided for transmitting content-based information preferably in the XML format via the unit 652 to the display 653.
  • the principles of the AMI-C content based HMI are disclosed in SAE Technical Paper Series 2004-01-0272: L. Jalics and F. Szczublewski: "AMI-C Content-Based Human Machine Interface (HMI)", March 2004, which by reference shall be made a part of this disclosure.
  • the HMI manager 652 can send requests to the main information and resource manager 512 in the central unit 51 (via a bus B2) and can receive responses from it before it allows e.g. displaying the content transmitted by one of the applications 654 on the display 653.
  • the HMI manager 652 receives a related response signal via the bus B2 and switches off the application 654 which is going to transmit its content to the display 653.
  • the content based applications 654 can as well be nomad applications.
  • the system architecture as shown in Figure 5 has substantially the same advantages as those mentioned in connection with Figure 4.
  • the communication between the central unit and the local units can be conducted via the same or different local bus systems B1, B2, B3, B4 provided within the vehicle, wherein wireless communication systems like Bluetooth can be used as well.
  • the protocols for transmitting signals between the units and within the units can be the same or different.
  • inventive methods, systems and applications are not limited to the communication and/or interaction with a vehicle driver.
  • nomad applications 33 are e.g. mobile phones, portable media (e.g. CD) players, portable radios, PDAs and/or hand held navigation systems like a GPS receiver etc. which the captain or skipper brings on board and uses during the operation of the yacht, boat or ship and which are handled according to the disclosure with respect to the above nomad applications 33.
  • the inventive methods, systems and applications are accordingly applicable in control stations for controlling and guiding ships in the vicinity of harbours and/or other areas or waterways with a high traffic density.
  • the human being is a traffic controller or officer who uses e.g. his own mobile phone or another nomad application 33 while supervising the ship traffic, wherein the mobile phone etc. is handled according to the disclosure with respect to the above nomad applications 33.
  • inventive methods, systems and applications are accordingly applicable in power plants and in the industry and especially in control rooms where the human being is an operator who has to supervise and make sure that e.g. processes are running correctly.
  • different alarms/messages for various events could be controlled by an interaction manager according to the above disclosure. If e.g. a malfunction occurs, only the most important and critical information (the information that requires immediate reaction by the operator) will be passed through. Different processes or components of processes can be considered as different applications with unique identifications.
  • the operator can use e.g. his own mobile phone or another nomad application 33 while supervising the power plant, wherein such mobile phone etc. is again handled according to the disclosure with respect to the above nomad applications 33.
  • the HMI 43 could be the interface between the air traffic control computer and the human being who is e.g. an air traffic controller.
  • Aeroplanes could be considered to be nomad devices that fly in and out of the control zone being controlled by the air traffic control tower. Aeroplanes that enter the control zone use their transponder signal as identification. All communication between the air traffic controller and the pilot of the aeroplane is then conducted according to the inventive methods, systems and applications disclosed above.
  • the aeroplanes can use dynamic priorities so that e.g. a message will become more urgent the closer the aeroplane comes to the control tower.
  • the DNE-state could be based on the amount of information currently being processed by the air traffic controller and/or the number of aeroplanes currently flying within his control zone.
  • Safety critical messages, e.g. when two airplanes fly very close to each other, are treated in the same way as disclosed above with respect to safety critical messages. If a controller uses his own mobile phone or another nomad application as mentioned above, this is again handled according to the disclosure with respect to the above nomad applications 33.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Small-Scale Networks (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A method and system for communication and/or interaction between a vehicle driver (D) and a plurality of integrated and/or non-integrated applications (2; 3) like e.g. native vehicle applications and/or aftermarket applications and/or nomad applications is disclosed. Especially, such a method and system for managing the communication and/or interaction by means of an interaction manager (41) is provided, by which this communication and/or interaction is conducted or managed in such a way that risks and impairments for the safety and comfort of the driver are reduced considerably and workload and distraction of the driver are reduced considerably as well.

Description

METHOD AND SYSTEM FOR INTERACTION BETWEEN A VEHICLE DRIVER AND A PLURALITY OF APPLICATIONS
The invention relates to a method and system for communication and/or interaction between a human being, especially a vehicle driver, and a plurality of integrated and/or non-integrated applications like e.g. native vehicle applications and/or aftermarket applications and/or nomad applications. Especially, the invention relates to such a method and system for managing the communication and/or interaction by means of an interaction manager.
Today, there is a strong trend towards the integration of vehicle built-in applications and functions in such a way that they can share at least partly the same hardware and/or software components like e.g. sensors and displays. Such applications especially include Advanced Driver Assistance Systems (ADAS) and In-Vehicle Information Systems (INIS). However, aftermarket add-on applications which are added after shipping of the vehicle, are still very common. Moreover, many drivers use portable computing devices like PDAs (Personal Digital Assistants), mobile phones and other stand-alone applications (nomad systems) in the vehicle as well.
All these applications and other technologies have great potential for enhancing road safety, as well as enhancing the quality of life and work.
However, the proliferation of such in-vehicle applications increases the risk of conflicts, e.g. when applications are activated and messages (e.g. incoming diagnostic messages or SMS [Short Message Service] messages) are presented simultaneously. This may reduce comfort and impose safety-critical levels of workload and distraction on the driver. Moreover, interference between different ADAS and INIS may lead to sub-optimal performance, reduced user acceptance and, hence, reduced safety benefits of these applications.
In "Generic Intelligent Driver Support", edited by John A. Michon, 1993 (Taylor & Francis, London, Washington), the GIDS (Generic Intelligent Driver Support) project is disclosed which had two main focuses: (1) to develop generic driver sup- port functions and (2) to develop techniques for integrating these functions and making them adaptive to the driver and the driving environment. A key feature of the GIDS project was information management. The core of such functions developed in GIDS was:
1. Prioritization, which prevents conflicting information from different support and service functions from being presented simultaneously, by presenting information sequentially and according to priority; and
2. Scheduling which prevents conflicts between system-initiated information and demands imposed by the driving task by re-scheduling (e.g. postponing) information of low or medium priority initiated during demanding driving situations.
These functions have been realised by a data flow protocol according to which each application presenting information to the driver sent a "request" to a unit called the "dialogue controller", wherein the request contained among other features the following attributes:
Application identification (ID), message priority (on a 6-point scale), preferred time of presentation, content of message and specifications for the integrated HMI (Human Machine Interface), order within task cluster and imposed workload per resource (i.e. how much workload the message will impose on the driver in each of the sensory modalities).
Based on an estimation of the current workload, provided by a workload estimator, the dialogue controller decides when and how the message is presented to the driver. The dialogue controller is also responsible for actually presenting the information to the driver through an integrated HMI. Thus, the basic GIDS information management function is to filter information presented through the common HMI of the dialogue controller. However, no method is disclosed for enabling the integrated management of stand-alone systems which have their own HMIs (e.g. aftermarket and nomad systems).
U.S. patent application US 2002/0120374A1 discloses a system and method for driver performance improvement by which operator activity data relating to activities of the operator within an interior portion of the vehicle are monitored and vehicle operating data, vehicle environment data and operator condition data are received. An operator cognitive load is estimated and on the basis of these data vehicle information is prioritized for selectively informing the operator of vehicle information. Furthermore, the system may also operate with wireless communicaton devices like mobile phones, PDAs and pagers and prioritize incoming calls, e-mails and text and data messages of such devices, respectively.
However, integrating these so called stand-alone devices into this system requires additional hardware, especially a sensor fusion module and appropriate processing capability at or within the related device. This is considered to be disadvantageous and expensive.
Thus, it is an object of the invention to provide a method and system for communication and/or interaction between a vehicle driver and a plurality of integrated and/or non-integrated applications as mentioned above by which this communication and/or interaction is conducted or managed in such a way that risks and impairments for the safety and comfort of the driver are reduced considerably and workload and distraction of the driver by such communication and/or interaction is reduced considerably as well.
It is another object of the invention to provide such a method and system by which aftermarket and non-integrated applications like stand-alone or mobile applications and nomad applications can be integrated and used, respectively, as a part of the whole information management architecture of the system.
Especially, it is an object of the invention to provide such a method and system by which non-integrated applications and nomad applications which have their own human machine interface can be integrated and used, respectively, as a part of the whole information management architecture of the system as well.
Finally, it is an object of the invention to provide such a method and system which offer an open system architecture especially for integrating aftermarket and nomad applications according to the need of the driver and in such a way that workload and distraction of the driver by these applications are reduced considerably.
These objects are solved by methods according to claims 1 and 15 and by a system according to claim 20.
In the claims and in the specification the term "application" shall cover any systems, components, functions, devices, units and modules which are able, after their activation, to communicate and/or interact uni- and/or bidirectionally with the vehicle driver, for example by initiating, sending and/or receiving actions to/from the driver, sending and/or receiving messages to/from the driver etc. Such an application can also be very sophisticated, like e.g. a collision avoidance system.
Further, a driver/vehicle environment (DVE) state is a state which is evaluated on the basis of output signals of one or a plurality of sensors for detecting parameters of the driver and/or the vehicle and/or the environment.
The inventive methods and systems are able to handle integrated or "native" applications which are implemented in the vehicle before shipping, as well as non-integrated applications like aftermarket applications that are added later and those applications which are brought into the vehicle temporarily or permanently by the driver or a passenger (nomad applications).
The centralised management of the communication and/or interaction between a driver and the applications by means of an interaction manager opens a great potential for a high degree of modularity, a comparatively simple system architecture and the possibility to extend the system in a simple way by additional applications which are able to send requests and receive responses according to claim 1.
This is also due to the fact that the communication and/or interaction itself is not changed or altered by the interaction manager (only allowed or not allowed) but is conducted solely by the related application. So the interaction manager does not need to know for which kind of communication and/or interaction each application is provided for, and further does not need to know the kind of application that sent the request.
Especially if requests and responses have a standardized format, the inventive methods and systems offer an open system architecture and the use of standardized data protocols, so that they can be extended in a very flexible way with applications which are built in later (or with nomad applications), without the need to change the interaction manager itself.
As an alternative or as a complement to the solution according to claim 1, the above objects can be solved if, according to claim 15, the interaction manager controls the capability of the applications to communicate and/or interact with a vehicle driver in dependence of a DVE state and in the absence of any request from the related application.
Finally, the inventive methods, systems and applications are not limited to the communication and/or interaction with a vehicle driver but are applicable to any human being who is confronted more or less simultaneously with a plurality of signal or information sources which, in case of activation, have to be considered or handled in dependency of at least one certain environment state and/or other such activated signals and/or information sources and/or other conditions.
The subclaims disclose advantageous embodiments of the methods and the system according to claims 1, 15, and 20, respectively.
Further details, features and advantages of the invention are evident from the following description of exemplary embodiments of the invention in connection with the drawings, which schematically show:
Fig. 1 a first functional architecture of such an exemplary and preferred embodiment of a system according to the invention;
Fig. 2 a look-up table for a DVE state-dependent control of the service state or functions of an in-vehicle application according to the invention;
Fig. 3 an embodiment of another application according to the invention;
Fig. 4 a second functional architecture of an exemplary and preferred embodiment of a system according to the invention; and
Fig. 5 a third functional architecture of an exemplary and preferred embodiment of a system according to the invention.
The inventive system according to Figure 1 can be implemented e.g. on a multiplex vehicle bus like the well-known CAN-bus or MOST-bus or a wireless LAN (Local Area Network) working with Bluetooth or any other standard, and comprises the following four main components:
The first main component is a sensor array 1 which generally comprises one, but preferably a plurality of sensors of all possible types that are provided to monitor and/or to detect the state of the driver and/or the state of the vehicle and/or the state of the environment.
The sensor array 1 according to Figure 1 comprises exemplarily a first group 10 of driver state sensors which for example are head movement sensors and/or gaze sensors and/or eyelid closure sensors for tracking the movement of the head and/or the eye and/or the eyelid of a driver. (The gaze direction is the direction in which a person momentarily directs the attention [fovea] of his eyes with the eye-balls as reference).
A second group 11 of vehicle state sensors comprises for example speed sensors, accelerometers, steering wheel angle sensors, pedal position sensors, gyros, tire pressure sensors or other sensors for detecting various vehicle related information.
A third group 12 of environment state sensors comprises e.g. radar and/or laser sensors and/or video cameras and is provided for detecting and/or monitoring e.g. the surrounding traffic.
A fourth group 13 of sensors (e.g. GPS sensors with map matching) is provided for detecting the geographical position of the vehicle, and a fifth group 14 of sensors (e.g. lane tracking sensors) monitors the position of the vehicle on the road and/or other environmental states.
The second main component is formed by a plurality of units, modules or applications 2 which are integrated into the vehicle. Preferably, these applications share the sensor array 1 (arrow A1) and/or those input/output (I/O) devices 431 (like e.g. displays, audio systems, buttons, knobs etc.) which belong to a common integrated human machine interface 43 (HMI) which is explained below.
These integrated units, modules or applications 2 also comprise the core computation units of an Advanced Driver Assistance System (ADAS) and/or an In-Vehicle Information System (IVIS) that are integrated into the vehicle architecture. According to Figure 1, these integrated applications 2 are for example an attention support system 21, a route guidance system 22, a lane departure warning system 23, a tyre pressure monitor system 24 and information and entertainment systems 25 like e.g. radio, CD, DVD and others. Thus, based on the output signals of the sensor array 1, the integrated applications 2 perform the computations needed for determining what action to take (e.g. issue a warning). They then use the common integrated HMI 43 (arrow A5) for interacting with the driver D (arrow A2).
The third main component is a plurality of units, modules or applications 3 which might be integrated into the vehicle architecture but not into the integrated HMI 43. These are considered as stand-alone applications 3. Generally, these applications 3 have their own sensors and/or input/output (I/O) devices, like displays, keyboards and/or other non-integrated HMIs 34 for communicating and/or interacting with the driver D.
The units, modules or applications 3 comprise according to Fig. 1 e.g. integrated applications 31 which, however, utilize their own HMI (which is not integrated into the integrated HMI 43), aftermarket applications 32 including those that are added to the vehicle after it has been shipped, and nomad applications 33 like mobile phones, portable media players (e.g. CD players) or hand-held navigation systems like a GPS receiver.
Finally, the fourth main component is the driver information unit 4. This central unit is of primary importance and contains an interaction manager 41, a driver/vehicle environment (DVE) state estimator/predictor 42 and the integrated human machine interface (HMI) 43 (which has been mentioned above) for communicating and/or interacting with the driver D (arrow A2).
The interaction manager 41 contains the hardware/software (e.g. for realizing a prioritization algorithm and a waiting queue, as explained below) for managing the driver's communication and/or interaction with the integrated ADAS/IVIS modules and applications 2 via the integrated HMI 43 (arrow A5) and with the stand-alone applications 3 via the non-integrated HMIs 34, in dependence of the real-time input from the driver/vehicle environment (DVE) state estimator/predictor 42 (arrow A3).
The DVE state estimator/predictor 42 computes a (potentially) multidimensional estimate of a current and/or a predicted DVE state on the basis of output signals of the sensor array 1 (arrow A4) and/or the output signals of at least one of the integrated applications 2 (arrows A5 and A9) and continuously outputs a corresponding DVE state vector to the interaction manager 41 (arrow A3).
This state vector for example comprises and/or is established on the basis of at least one of the following criteria or parameters: primary task demand (e.g. the complexity of the driving situation and how critical the situation is for the driver), secondary task demand (e.g. driver activity focused on other tasks than the primary driving task), visual distraction, physiological driver impairment (like drowsiness or drug influence), driver characteristics (age, experience etc.), specific driving situations (e.g. overtaking another vehicle), the overall driving environment type or context (e.g. highway, country road, rural, suburbia or city), as well as the driver identity.
These criteria or parameters of the state vector can be computed by a sub-module within the DVE state estimator/predictor 42 on the basis of one or more sensor signals of one or more sensors in the sensor array 1 (arrow A4) and/or of one or more output signals of the integrated applications 2 (arrows A5 and A9), e.g. the driver state signals (e.g. sensor signals indicating gaze direction and alertness), the vehicle state signals (e.g. sensor signals indicating speed, acceleration, steering wheel angle, pedal positions, gyro signals, tyre pressure) and/or the environment state signals (e.g. sensor signals indicating road type, road surface condition, surrounding obstacles, geographical position etc.).
The DVE state estimation may include a prediction of future states as well. The exact definition of the DVE state vector is based on the needs of the interaction manager 41 (i.e. what information is required for the interaction management functions).
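Purely by way of illustration, the following minimal sketch shows one possible in-memory representation of such a DVE state vector; the field names, value scales and the use of a Python data class are assumptions made for this example only and are not prescribed by the invention.

```python
from dataclasses import dataclass

@dataclass
class DVEState:
    # Hypothetical DVE state vector as output by the estimator/predictor 42.
    # Field names and scales are illustrative assumptions only.
    primary_task_demand: float    # 0.0 (undemanding) .. 1.0 (highly demanding driving situation)
    secondary_task_demand: float  # driver activity on tasks other than the primary driving task
    visual_distraction: float     # e.g. derived from gaze/eyelid closure sensors
    impairment: float             # drowsiness, drug influence etc.
    driving_situation: int        # standardised situation index Sn (see Figure 2)
    predicted: bool = False       # True if this vector describes a predicted future state

# Example: a state vector describing a demanding overtaking manoeuvre
state = DVEState(primary_task_demand=0.9, secondary_task_demand=0.1,
                 visual_distraction=0.2, impairment=0.0, driving_situation=5)
```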
Finally, the driver information unit 4 comprises the integrated human machine interface (HMI) 43 which is the main interface between the driver D (arrow A2) and the integrated ADAS/IVIS modules and applications 2 (arrow A5), containing one or more I/O devices 431 for enabling the driver D to communicate and/or interact with the system in different sensory modalities or via different sensory channels (visual, auditory, haptic). The stand-alone applications 3 may, but need not, use the integrated HMI 43 because these applications 3 usually have their own built-in HMIs. So the driver D potentially can communicate and/or interact with the stand-alone applications 3 via their own built-in HMIs (arrow 34).
A first main feature of the inventive method conducted by the system according to Figure 1 is the scheduling of application-initiated information, i.e. any communication and/or interaction with the driver D, in dependence of a current and/or predicted driver/vehicle environment (DVE) state.
A second main feature is that, when communications and/or interactions with the driver D are initiated simultaneously by several different applications 2, 3 (including the case in which a request is queued and awaiting approval and a request for a second communication and/or interaction is received) the system selects the most important communication and/or interaction on the basis of a prioritization (according to certain criteria, see below). Other communications and/or interactions are put in a waiting queue and are performed later in order of priority.
Both these features shall now be explained in more detail.
When a non-integrated application 3 is connected to the system (vehicle network) via a wireless or physical link, an initiation by the interaction manager 41 (or the application 3) is included e.g. in the well-known handshake between the application 3 and the system. The interaction manager 41 checks if the application is compatible with the interaction manager 41, i.e. whether the application can be controlled by the interaction manager 41. If the application does not have such compatibility by default, the interaction manager 41 (or the application 3) checks for the possibility to download and/or install appropriate software required to make the application 3 compatible with the interaction manager 41.
If such software is downloaded by and successfully installed in the application 3 and/or the interaction manager 41, or if the application is compatible with the interaction manager 41 from the start, the interaction manager 41 dynamically assigns to the application an identification number which is unique in this system. This application will keep the assigned identification number as long as it is connected to the interaction manager 41, i.e. as long as the link between the application and the system exists. After termination of the link and connecting the application again, a new (possibly also the same) identification number is assigned to the application.
Before an application 2, 3 communicates and/or interacts with the driver D, it has to execute a first routine with the interaction manager 41 (arrows A6 and A7) which comprises e.g. the following steps (a minimal code sketch of this routine is given after step 4 below):
1. Generating and sending a request, preferably in a standardised format (see below) to the interaction manager 41 for permission to communicate and/or interact with the driver D, e.g. in the form of presenting a specified piece of information or a message;
2. Awaiting a response (preferably in a standardised format as well, see below) from the interaction manager 41. This response comprises an indication about either (a) "permission granted - go ahead" or (b) "permission denied - wait and hold the communication and/or interaction";
3. If no response has been received within a certain time period (normally < 1 s), the request is sent again. This step can be repeated n times (n = predefined = 0, 1, 2, ...). Finally, if still no response is received, a diagnostic message is output.
Preferably, a request is sent a few times (e.g. 3 to 10) to allow for a possible loss of the request on the bus or the wireless LAN. However, after a few tries the application sending the request assumes that the interaction manager 41 is no longer available (e.g. that there are some problems with the interaction manager 41), and the application outputs a diagnostic message, e.g. on a diagnostic bus, but preferably does not communicate and/or interact with the driver with respect to this message.
This procedure is applicable as well for an application to detect whether there is an interaction manager 41 present in a specific system or not. This might be of relevance if certain applications shall be used in a system without any interaction manager as well. In order for this application to use the same software in both systems the application has to check if an interaction manager is present or not. If there is none, the application will switch over and work in a stand-alone mode, i.e. it will communicate/interact with the driver regardless of the state of the driver and/or the HMI and/or the vehicle and/or the environment. Otherwise it follows the above described routine. Thus, the same application can be used both in interaction manager controlled systems as well as in systems without an interaction manager. By this, the modularity of the applications (especially for use in certain truck models) is increased considerably.
4. If the request is finally denied, permission is awaited from the interaction manager 41 to communicate and/or interact with the driver D in the requested way.
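The application-side part of this first routine can be summarised, under illustrative assumptions, by the following sketch; the transport helpers send_request(), receive_response() and send_diagnostic(), the retry count and the message layout are hypothetical and only stand in for whatever bus or wireless interface the application actually uses.

```python
# Hypothetical application-side sketch of the first routine: send a request,
# await the response, resend a few times on silence, then emit a diagnostic
# message and (optionally) fall back to stand-alone mode.
def request_interaction(bus, app_id, interaction_id, priority, retries=3, timeout_s=1.0):
    for _ in range(retries):
        bus.send_request(app_id=app_id, interaction_id=interaction_id, priority=priority)
        response = bus.receive_response(timeout=timeout_s)
        if response is not None:
            # True  -> "permission granted - go ahead"
            # False -> "permission denied - wait and hold"
            return response.answer
        # no answer within ~1 s: assume the request was lost on the bus and resend
    # still silent: interaction manager presumed absent or faulty
    bus.send_diagnostic("interaction manager not responding")
    return None   # caller may switch to stand-alone mode (see above)
```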
The corresponding part of the first routine to be executed by the interaction manager 41 when a request is received from an application 2, 3 comprises e. g. the following steps:
1. Determining whether the requested communication and/or interaction can be permitted, in dependence of a current and/or a predicted DVE state and/or of whether requests have been received from other applications 2, 3;
2. If permission can be given, sending a response comprising an indication (a) "permission granted - go ahead" to the application 2, 3 from which the request was received;
3. If permission cannot be given, storing the request in a waiting queue and sending a response to the related application 2, 3 comprising an indication (b) "permission denied - wait and hold the communication and/or interaction" and possibly an instruction to await further responses;
Such a response can include an indication regarding the number of the request in the waiting queue. Preferably, in this case the response is repeated circularly or whenever another request with a higher priority has left the waiting queue.
4. If permission can be given, and one or more requests are stored in the waiting queue, picking up the request of the application 2, 3 which has the highest priority and sending a response comprising an indication "go-ahead" to this application 2, 3. This is repeated until all requests stored in the waiting queue have been permitted.
If an application withdraws its request, it preferably sends a delete message to the interaction manager instructing the same to remove the request from the waiting queue. This is handled according to the description below with respect to "dynamic priorities".
For certain applications 2, 3 which e.g. generate highly safety-critical message types, this routine may be skipped entirely, and the message is pushed through regardless of what requests are stored in the waiting queue. As an alternative, highly safety-critical messages could also be included in the prioritization (but not in the scheduling) and handled by the interaction manager. Thus, such highly safety-critical messages preferably but not necessarily can be handled regardless of the DVE state, but they follow the same procedure as for other requests. If this procedure requires too much time, another alternative is to let the safety-critical message push through only if no other such safety-critical message with a higher priority is currently being directed to the driver (so that if several applications and/or systems go off in a certain scenario only the safety-critical message with the highest priority will get pushed through).
In these cases the current non-safety-critical request is preferably put into the waiting queue (again) with the highest possible priority allowed for non-safety-critical requests and is permitted according to the above first routine and/or as soon as the safety-critical message(s) have been presented.
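A correspondingly simplified sketch of the interaction-manager side is given below. The permission policy, the safety threshold and all helper names are assumptions made for illustration; the sketch only shows the principle of granting, queueing by priority and pushing safety-critical requests through.

```python
import heapq

SAFETY_THRESHOLD = 0.9      # assumed: priorities above this are treated as safety-critical
waiting_queue = []          # entries: (-priority, sequence_no, request)
_sequence = 0               # preserves first-in-first-out order for equal priorities

def permission_allowed(request, dve_state):
    # placeholder policy: allow interactions only when the primary task demand is low,
    # or when the request itself carries a fairly high priority
    return dve_state["primary_task_demand"] < 0.5 or request["priority"] > 0.7

def handle_request(request, dve_state):
    global _sequence
    if request["priority"] > SAFETY_THRESHOLD:
        return "permission granted - go ahead"          # safety-critical push-through
    if permission_allowed(request, dve_state) and not waiting_queue:
        return "permission granted - go ahead"
    _sequence += 1
    heapq.heappush(waiting_queue, (-request["priority"], _sequence, request))
    return "permission denied - wait and hold"

def release_next(dve_state):
    # called whenever the DVE state allows further interactions again:
    # grant held requests in order of priority
    granted = []
    while waiting_queue and permission_allowed(waiting_queue[0][2], dve_state):
        _, _, request = heapq.heappop(waiting_queue)
        granted.append(request)                          # a "go ahead" response is sent here
    return granted
```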
The details of the data flow protocol for these requests and responses shall now be described in more detail.
The requests for communicating and/or interacting with the driver D, sent to the interaction manager 41 by the different integrated and/or non-integrated applications 2, 3 as well as the responses generated and transmitted by the interaction manager 41 preferably follow a standardised format.
Such a format and data structure for the requests sent by an application comprises e.g. the following fields:
1. Application identification:
This field contains an identifier of the application 2, 3 sending the request. For nomad applications 33, this identifier may be dynamically assigned to the application 33 by the interaction manager 41 as described above, when and as long as it is connected to the vehicle network. The data type is preferably "integer";
2. Communication and/or interaction identification:
This field contains an identifier of the communication and/or interaction (e.g. a message "low fuel", an incoming phone call, a vehicle diagnostic message, a route guidance message etc.) with the driver D, requested by the application 2, 3. The data type is preferably "integer". The identifier, however, has no connection to the type of communication/interaction itself, but is just an (integer) number.
3. Priority index:
This field contains preferably a floating number representing a priority index of the communication and/or interaction, established by means of a standardised method, e.g. SAE J2395. Representing the priority index by a float rather than by an integer has the advantage that it creates unique priorities. Otherwise, if two communications and/or interactions with the same priority index are initiated, the well-known first-in-first-out principle (i.e. the message that was initiated first is presented first) is preferably applied. Therefore, the data type is preferably "float".
The responses of the interaction manager 41 are preferably represented by a standardised format and data structure which contains e. g. the following fields:
1. Application identification: This field comprises the identifier of the application 2, 3 sending the request. The data type is preferably "integer";
2. Communication and/or interaction identification:
This field indicates the identifier of the communication and/or interaction (e.g. as mentioned above) requested by the request. The data type is preferably "integer";
3. Answer:
This field contains the answer to the request, comprising an indication about either (a) "permission granted - go ahead" or (b) "permission denied - wait and hold the communication and/or interaction". The data type is preferably "Boolean", e.g. "1" for (a) and "0" for (b).
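Under the assumption that such records are held as simple data structures at both ends, the request and response formats described above could look as follows; the field names follow the text, while the Python dataclass representation and the example values are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class InteractionRequest:
    application_id: int     # identifier assigned to the application (dynamically for nomad devices)
    interaction_id: int     # identifier of the requested communication/interaction
    priority: float         # priority index, e.g. established according to SAE J2395

@dataclass
class InteractionResponse:
    application_id: int     # echoes the requesting application
    interaction_id: int     # echoes the requested communication/interaction
    answer: bool            # True = "go ahead", False = "wait and hold"

# Example exchange: a route guidance prompt with priority 4.2 is granted
req = InteractionRequest(application_id=7, interaction_id=12, priority=4.2)
resp = InteractionResponse(req.application_id, req.interaction_id, answer=True)
```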
These routines constitute a simple way to control any application 2, 3 that follows the standard formats and data structures specified above, without the need for the interaction manager 41 to keep an updated list of all applications 2, 3 and an updated list of all the communications and/or interactions (which the related application can perform) and their priorities (as in the prior art according to the GIDS-project). This is particularly useful in the case of nomad applications 33, which in the near future could be expected to seamlessly connect to the vehicle data bus, e.g. via a wireless link (such as Bluetooth).
A possible realization of these routines shall be illustrated in the following by an example:
A nomad application manufacturer is developing a certain PDA. He lists all possible communications and/or interactions initiated by the PDA and assigns a priority index to each of them, using a standardised method (e.g. in accordance with SAE J2395) and storing a related table in the PDA. In practice, this could also be done by an authorised institution. Instead of listing all possible interactions and/or communications the manufacturer can list groups of possible interactions and/or communications and assign the same priority to the entire group. This may be advantageous if it is impossible to anticipate all possible interactions/communications, especially if there are dynamic interactions/communications.
The routine described above with respect to sending requests to and receiving responses from the interaction manager 41 is implemented in the PDA as well so that it, when connected to the vehicle bus (and after the interaction manager 41 has assigned an application identification to the PDA as described above), always sends a request with respect to the desired communication and/or interaction (and its priority) to the interaction manager 41 and awaits a response indicating said permission, before communicating and/or interacting (e.g. presenting information and/or initiating an action) with the driver D.
Another possibility is to provide an adapter especially for those applications which are not suitable for storing said table and/or for implementing the above routine. In this case the adapter has the function of an interface to the system and conducts the above routine.
In case of an application, e.g. a mobile telephone, which is not compatible with the interaction manager, the application or the interaction manager could also download, from the internet or via a service provider, software for achieving the capability of interaction between the application and the interaction manager. This might be subject to a subscription or the like, requiring a permission/safeguard that the right person/application is requesting such an adapter program (e.g. in case this costs a fee). To safeguard this, an encryption can be used, like e.g. a public key encryption method.
Summarizing, the interaction manager 41 does not need to know which application 2, 3 had sent the request. It is sufficient that the interaction manager 41 knows that an application X is asking for permission to perform a communication and/or interaction Y (e.g. presenting a message), preferably with a priority index P, without knowing what X and Y actually are. The only extra standardisation needed is a specified format and data structure or protocol for requests and replies, e.g. as disclosed above.
The format and data structure can be extended to contain additional information about the communication and/or interaction to be performed. This information could be optional (if this information is missing it is replaced by default values).
As an alternative to a predefined data sequence structure having default values if certain data are missing, it is possible to assign to each possible type of data (application identification, communication and/or interaction identification, priority index, etc., as mentioned above) a special code or address which precedes the data connected to it and which enables the interaction manager to identify the type of data following this code/address. This is in other words a sort of dynamic protocol which for example looks like: "Application ID": XX, "Message ID": YY, "Priority index": ZZ, and then the next field could be any other data, but - preferably preceding the data - there is an identifier e.g. "Duration": AA or "Auditory Load": BB.
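A minimal sketch of such a dynamic, identifier-prefixed protocol is given below, assuming the receiver keeps a table of default values for optional fields; the concrete identifiers, defaults and the list-of-pairs representation are hypothetical.

```python
# Assumed default values for optional fields of the request
DEFAULTS = {"Duration": 2.0, "Visual load": 0, "Auditory load": 0, "Haptic load": 0}

def parse_dynamic_request(fields):
    """fields: list of (identifier, value) pairs in the order received on the bus."""
    request = dict(DEFAULTS)
    request.update(fields)       # transmitted fields override the defaults
    return request

# Example: only one of the optional fields is actually transmitted
message = [("Application ID", 7), ("Message ID", 12),
           ("Priority index", 4.2), ("Auditory load", 3)]
print(parse_dynamic_request(message))
```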
Some examples of additional information that could be added to the requests are given as follows:
4. Duration of the communication and/or interaction:
This field indicates the estimated time that the driver needs for the communication and/or interaction, e.g. in case of a message, to comprehend the message. This is also the time before a subsequent communication and/or interaction can be performed (in seconds). The data type is preferably "float".
However, it is preferred that safety-related requests (or any other request of a certain, predefined priority) are always processed immediately, irrespective of whether another message/action of lower priority is currently active. The high-priority message/action will be forwarded, interrupting the other message/action. In sum, the preferred procedure for handling safety-critical messages, e.g. warnings (with priority above a certain threshold), is the following: a.) the message is pushed through irrespective of the DVE state; b.) the message overrides any message with low or medium priority currently active, wherein such current message is preferably put on hold into the waiting queue with the highest possible priority allowed for non-safety-critical messages (or requests); c.) several safety-critical messages initiated roughly simultaneously are presented in order of priority.
5. Visual load:
This field indicates the load imposed on the visual channel to the driver. The data type is preferably "integer";
6. Auditory load: This field indicates the load imposed on the auditory channel to the driver. The data type is preferably "integer";
7. Haptic load:
This field indicates the load imposed on the haptic channel to the driver. The data type is preferably "integer".
A first preferred extension of the inventive method is the handling of dynamic priorities:
For some applications 2, 3, the priority of a request may change over time. One example of this is a route guidance message such as "turn right at the next intersection", which becomes more urgent and important the closer the vehicle comes to the intersection. A simple way to consider such dynamic priorities is to use multiple requests with the same identification but different priorities, e.g. a first request for a message when there are 100-200 meters left to the intersection and a second request for the same message when there are less than 100 meters left to the intersection. However, this requires that the first request is taken out of the waiting queue at the interaction manager 41 when it becomes irrelevant and is replaced by the updated request. In other cases, it may be necessary to delete a request from the waiting queue without replacing it.
Another example is the case in which the request for an incoming call is denied by the interaction manager. The phone application may send an updated request (same message ID) with a higher priority after some time, e.g. 5 seconds, because answering the call has become more urgent since the risk of the caller hanging up increases with time. An important feature of this system is that the risk of losing a call or information is reduced. Preferably, the caller gets periodically updated information about the number or place of his call in the waiting queue.
In order to accomplish such dynamic priorities, the standard format (protocol) of the request is extended by a field specifying the requested operation on the waiting queue as either add_request, replace_request or delete_request as follows:
8. Type of request:
This field indicates the type of the request (or action), which can take three values: 0=add_request, 1=replace_request and 2=delete_request. The data type is preferably "integer".
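How the three request types could drive maintenance of the waiting queue is sketched below; the list-based queue, the dictionary layout of a request and the helper name are illustrative assumptions only.

```python
ADD_REQUEST, REPLACE_REQUEST, DELETE_REQUEST = 0, 1, 2

def update_waiting_queue(queue, request):
    key = (request["application_id"], request["interaction_id"])
    if request["type"] in (REPLACE_REQUEST, DELETE_REQUEST):
        # drop any queued request with the same application and message identification
        queue[:] = [r for r in queue
                    if (r["application_id"], r["interaction_id"]) != key]
    if request["type"] in (ADD_REQUEST, REPLACE_REQUEST):
        queue.append(request)
        queue.sort(key=lambda r: r["priority"], reverse=True)   # highest priority first

# Example: a route guidance message is re-issued with a higher priority
queue = [{"application_id": 7, "interaction_id": 12, "priority": 3.0, "type": ADD_REQUEST}]
update_waiting_queue(queue, {"application_id": 7, "interaction_id": 12,
                             "priority": 5.5, "type": REPLACE_REQUEST})
```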
A third main feature of the inventive method is the control of the service state (i.e. the control of the mode of operation or function) of the applications 2, 3 by means of the interaction manager 41, based on and in dependence of a current and/or a predicted DVE state. This could include, for example, the enabling and/or disabling of functions of the applications 2, 3 (or one or more of the entire applications) and/or a real-time re-configuration of the related HMI 34, 43.
According to an exemplary and preferred embodiment of the invention, this is accomplished for the integrated HMI 43 by configuration of the same under direct control of the interaction manager 41 as illustrated by arrows A8 in Figure 1.
For non-integrated (stand-alone) applications 31 having their own built-in HMI 34 the configuration is controlled by the interaction manager 41 by sending instructions to these applications 31. These instructions are considered as an additional message type and have a standardised format (protocol) so that they can be used by the interaction manager 41 to control e.g. the service state or the mode of operation or function(s) of these applications 2, 3 or to configure the related HMI 34, 43, including enabling and/or disabling one or more of the entire applications 2, 3.
One possibility for realizing this is to establish and provide a set of standardised driving situations S1...Sn that can be recognised by the DVE state estimator/predictor 42. A simple case would be two types of situations: S1=vehicle standing still and S2=vehicle moving. If more advanced DVE monitoring techniques are used, the situations may include, for example:
S1=standing still (engine shut off=parking), S2=standing still (engine idling=temporary stop at a traffic light or intersection), S3=driving on the highway, S4=driving in the city, S5=overtaking, S6=driver is drowsy and so on.
The current driving situation index Sn is then distributed to all non-integrated (stand-alone) applications 3 (possibly to the integrated applications 2 as well) by the interaction manager 41. The developer of these applications 3 would then have to define a look-up table specifying how the service state or mode of operation or function(s) is determined by the current driving situation index Sn. Such a look-up table for situation-dependent control of the service state or mode of operation or function(s) of non-integrated applications 3 is exemplified in Figure 2. In this example, the service state is defined by the enabling/disabling of certain functions F1 to F8 of the related application depending on the driving situation index S1 to S4.
As a simple example, assume that F5="DVD player" and S1="vehicle standing still (engine shut off=parking)". The table in Figure 2 thus specifies that the DVD player should only be enabled when the vehicle is parked (a black box indicates that a function is enabled, a white box indicates that a function is disabled).
The look-up table could be generalised to represent more complex service states than mere enabling/disabling of functions. In a simple example, this table contains dual states of the type "on"/"off" or "loud"/"quiet" or "bright"/"dark" etc. In a more advanced system this table may have n states (very loud, loud, neutral, more quiet, absolutely quiet) which then are a function of the DVE parameters as well. This control thus requires a message type that the interaction manager 41 could use to distribute the driving situation index Sn. The proposed message format for enabling/disabling functions (distributed by the interaction manager) preferably comprises only one index which is specified as follows:
Driving situation index:
This field is preferably an integer value representing the current general driving situation index Sn.
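A look-up table of the kind shown in Figure 2 could, for example, be held by the application as follows; the chosen situations, functions and enable/disable pattern are illustrative assumptions and do not reproduce the actual table.

```python
# Assumed driving situation indices (cf. the examples given above)
S1_PARKED, S2_IDLING, S3_HIGHWAY, S4_CITY = 1, 2, 3, 4

# For each function: the set of situation indices in which it is enabled
LOOKUP = {
    "DVD player":      {S1_PARKED},                                # only while parked
    "incoming e-mail": {S1_PARKED, S2_IDLING},
    "route guidance":  {S1_PARKED, S2_IDLING, S3_HIGHWAY, S4_CITY},
}

def apply_situation(situation_index):
    """Called when the interaction manager distributes a new driving situation index."""
    return {function: situation_index in enabled_in
            for function, enabled_in in LOOKUP.items()}

print(apply_situation(S3_HIGHWAY))   # e.g. {'DVD player': False, 'incoming e-mail': False, 'route guidance': True}
```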
A second preferred extension of the inventive method is the handling of a text-to-speech or text-to-display output by means of a text converter:
Text-to-speech or text-to-display is likely to be a common type of HMI in future vehicles. The basic idea is to give the driver access to longer texts (e.g. e-mails) without imposing excessive visual distraction when reading them as a whole from a display. However, comprehending the spoken text still imposes a cognitive load on the driver, which may contribute to overload in demanding and/or critical driving situations. Passengers in the car talking to the driver usually avoid this problem by pausing when the workload of the driver is increased by the driving task. The idea of the proposed function is to schedule the text output based on the output from the DVE state estimator/predictor. The general principle is illustrated in Figure 3.
Figure 3 shows a preferred embodiment of a text converter 26 which can be considered as a specific form of an application that can be an integrated or a non-integrated application 2, 3. Two different cases have to be considered with respect to the text to be output:
a) pre-chunked messages, e.g. a route guidance message generated by a route guidance system A, and b) non-pre-chunked messages, e.g. an e-mail generated by an e-mail or SMS system B for which real-time parsing is necessary.
The application 26 is generally handled according to the methods outlined above. In the preferred embodiment as illustrated in Figure 3, the following procedure is executed:
1.) The raw text representation "X1-X2-X3" (standing for any text comprising the text segments X1, X2, X3 to be converted to speech and output) is initiated by an information system like a route guidance system A and/or an e-mail or SMS system B. (X1, X2, X3 could be a paragraph, a sentence, a phrase, a word, a letter, a figure, etc.).
2a.) In the first case (route guidance system), the text may be chunked up in advance by the information system itself. This is generally possible for pre-defined messages (route guidance messages or similar messages). In this case, the message is passed directly to a waiting queue 262 provided within the application 26.
2b.) Alternatively, the messages may not be pre-chunked, as is the case for e-mails and SMS (B). In this case, the raw text is fed into a parser 261 which segments the text into grammatical categories (a parser 261 is a piece of software that employs a stored lexicon and a grammar in order to segment a text into hierarchical grammatical categories like e.g. paragraphs, sentences, phrases, words etc.).
The parser 261 thus slices up the text "X1-X2-X3" into chunks X1, X2 and X3 (e.g. phrases), which are then put into the waiting queue 262 in order of presentation.
A simpler alternative to grammatical parsing would be to just look for specific dividing characters, e.g. comma, punctuation and/or colon in order to divide the text into meaningful chunks.
3.) The text chunks X1, X2 and X3 are treated like the communications and/or interactions according to the method described above. Thus, the speech generator 263 sends requests to the interaction manager 41 and waits for responses confirming that presentation is permitted.
4.) If a chunk X1, X2 and/or X3 is held for longer than a certain first period of time, e.g. 10 seconds, the previous chunk is preferably repeated in order to facilitate understanding. If a chunk X1, X2 and/or X3 is held for longer than a second period of time (e.g. 20 seconds), the two previous chunks are repeated and so on.
Instead of repeating individual text chunks or groups of such chunks as a function of the waiting time, irrespective of whether the repeated chunks form a logical unit, in a more advanced system the system will repeat those chunks which form logical units (a whole sentence instead of a word or a phrase, or a whole paragraph instead of a sentence, especially in case the waiting time has been too long). The parser 261 might be an intelligent one which recognises not only grammatical structures, but also content (a semantic parser). Methods used in natural language processing systems might be applicable here to identify such logical units.
5.) When a message is interrupted (i.e. a chunk is held), the driver is preferably informed of this, e.g. by a characteristic tone or a short voice message such as "Message interrupted" or "Stop".
Instead of the speech generator 263, an appropriate display (not illustrated) could be provided for displaying the text chunks X1, X2, X3, if the display is provided (by additional means) to send requests to the interaction manager 41 and to wait for responses confirming that displaying a chunk on the display is permitted.
Finally, each text chunk X1, X2, X3 can be output as speech and displayed on a display simultaneously or with a time delay as well, so that the driver can both listen to and read the text on a display.
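The chunking and scheduling principle of Figure 3 can be summarised by the following sketch, which splits the text on dividing characters instead of using a full grammatical parser; the permission callback, the repetition thresholds and all names are simplifying assumptions, not the claimed implementation.

```python
import re
from collections import deque

def chunk_text(raw_text):
    # simple alternative to grammatical parsing: split on dividing characters
    return deque(p.strip() for p in re.split(r"[,.;:]", raw_text) if p.strip())

def present_text(raw_text, ask_permission, speak):
    chunks = chunk_text(raw_text)       # waiting queue 262 of chunks X1, X2, X3, ...
    spoken = []
    held_for = 0                        # seconds the current chunk has been held
    while chunks:
        if ask_permission(chunks[0]):   # request/response with the interaction manager
            if held_for >= 20 and len(spoken) >= 2:
                speak(spoken[-2]); speak(spoken[-1])   # long hold: repeat two previous chunks
            elif held_for >= 10 and spoken:
                speak(spoken[-1])                      # shorter hold: repeat the previous chunk
            speak(chunks[0])
            spoken.append(chunks.popleft())
            held_for = 0
        else:
            held_for += 1               # chunk held; a real system would wait and re-request

# Example usage with a permissive stub instead of the interaction manager
present_text("Turn right at the next intersection, then continue straight ahead.",
             ask_permission=lambda chunk: True, speak=print)
```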
Although this embodiment of the invention is for managing and outputting voice messages, the general principles illustrated in Figure 3 could be applied to other message types as well.
Figure 4 shows a second functional architecture of an exemplary and preferred embodiment of a system according to the invention. This system substantially comprises a central unit 51 and a plurality of local units, in this example two local units 52, 53.
The central unit 51 comprises an interaction manager or driver vehicle environment manager 511 which controls a main information and resource manager 512. This main information and resource manager 512 is provided for information management and for resource management and controls one or more HMI-devices 513, like e.g. a display and/or a speaker. Furthermore the main information and resource manager 512 receives requests from a plurality of applications 514 (e.g. a radio, an electronic control unit (ECU, VECU), etc.) and sends responses to these applications 514, as explained above, for permitting or not permitting communication with the HMI-device 513, wherein the applications 514 are connected with the at least one HMI-device 513.
Figure 4 shows a first local unit 52 which additionally functions as a gateway unit, via e.g. a CAN or LIN bus B2, for the second local unit 53 mentioned below. The unit 52 comprises a local resource manager 522 which is provided for resource management and not for information management and which again controls one or more HMI-devices 523 like e.g. a display and/or a speaker, e.g. in the form of an integrated or non-integrated HMI.
The first local unit 52 furthermore comprises a plurality of applications 524 like e.g. a mobile phone, a hands free phone and/or a fleet management system etc., which send requests to the local resource manager 522 and receive responses from it as explained above with respect to Figures 1 to 3. The applications 524 are again connected with the HMI-devices 523 for communication with the same.
Furthermore, the local resource manager 522 is connected, e.g. via a CAN bus B1, with the main information and resource manager 512 within the central unit 51, for sending requests and receiving responses from it according to the methods disclosed with reference to Figures 1 to 3.
The second local unit 53 for example in the form of other systems of the vehicle again comprises a local resource manager 532 which is provided for resource management and not for information management.
Again, the second local unit 53 according to Figure 4 comprises at least one application 534 (like e.g. nomad applications, mobile phones, navigation systems, etc.) having its own and/or an integrated or non-integrated HMI-device 533 for controlling the same after having sent a request to the local resource manager 532 and having received a related response from it as explained above.
Finally, the local resource manager 532 is connected, e.g. via a CAN bus or LIN bus B2, with the local resource manager 522 of the first local unit 52 which is functioning as a gateway, so that the local resource manager 532 can send requests to and receive responses from the main information and resource manager 512 within the central unit 51.
By this second architecture, a central information management but a distributed resource management is provided. This means that within the local units 52, 53 (and possibly between the local units 52, 53) only a resource management is conducted by the local resource managers 522, 532, whereas between a local unit 52; 53 and the central unit 51 (possibly via another local unit as a gateway) an information management and, if necessary, a resource management is provided.
This architecture offers a greater flexibility because it can be more easily adapted to different vehicles with different needs and different local units, and it is relatively easy to connect further local units to the system, especially if a common bus system (especially a CAN bus) B1, B2 is used.
It is especially advantageous if a central unit 51 is provided with a basic functionality which is standard equipment for all manufactured vehicles and which fulfils the needs of the typical user of such a vehicle. Optionally, the functionality of such a basic system can be expanded by one or more specific or local units ("add-on modules") 52, 53 which in case of special needs fulfil special functions and can be built in separately after the manufacturing of the vehicle as well.
Figure 5 shows a third functional architecture of an exemplary and preferred embodiment of a system according to the invention which, in contrast to the gateway architecture shown in Figure 4, is a star network architecture. However, both architectures can be combined as well.
Again, a central unit 51 is provided comprising substantially the same components as the central unit 51 in Figure 4. However, the main information and resource manager 512 is provided according to the needs of the star network architecture. Preferably, a first local unit 52 and a second local unit 53 are provided as well, which again correspond to the first local unit 52 and the second local unit 53, respectively, as disclosed in Figure 4. With respect to these units 51, 52 and 53, reference is made to the description in connection with Figure 4 above.
The system architecture according to Figure 5 furthermore comprises a third local unit 64 and a fourth local unit 65. The third local unit 64 is provided e.g. with a time-critical functionality, e.g. in the form of a safety and/or warning (alert) unit, which comprises an active safety HMI manager 642 which receives requests from and sends responses to a plurality of related safety applications 644, like e.g. an FCW (forward collision warning), an LDW (lane departure warning), an ACC (adaptive cruise control) etc.
Furthermore, a display 643 is provided which is controlled by these applications 644. A basic difference between this third local unit 64 and the other local units 52, 53 and 65 is that, because of its time-critical functionality, the active safety HMI manager 642 only submits an information signal via a bus B3 informing about a warning condition to the main information and resource manager 512 within the central unit 51 in case it receives a related request from at least one of the safety-critical applications 644 (while also sending a response to these applications for generating warning or alert signals on the display 643).
As an example of another local unit which can be connected to the system architecture, Figure 5 shows a local unit 65 which comprises a local resource HMI manager in the form of an AMI-C (Automotive Multimedia Interface Collaboration) content-based HMI manager 652 which controls a related display 653.
This unit 65 furthermore comprises a plurality of applications 654 which are provided for transmitting content-based information, preferably in the XML format, via the unit 652 to the display 653. The principles of the AMI-C content-based HMI are disclosed in SAE Technical Paper Series 2004-01-0272: L. Jalics and F. Szczublewski: "AMI-C Content-Based Human Machine Interface (HMI)", March 2004, which is hereby made a part of this disclosure by reference.
Again, according to the above methods as described with reference to Figures 1 to 3, the HMI manager 652 can send requests to the main information and resource manager 512 in the central unit 51 (via a bus B2) and can receive responses from it before it allows e.g. displaying the content transmitted by one of the applications 654 on the display 653.
If e.g. a warning signal has been transmitted from the active safety HMI manager 642 to the main information and resource manager 512 via the bus B3, the HMI manager 652 receives a related response signal via the bus B2 and switches off the application 654 which is going to transmit its content to the display 653. The content-based applications 654 can be nomad applications as well.
The system architecture as shown in Figure 5 has substantially the same advantages as those mentioned in connection with Figure 4. Generally the communication between the central unit and the local units can be conducted via the same or different local bus systems B1, B2, B3, B4 provided within the vehicle, wherein wireless communication systems like Bluetooth can be used as well. Furthermore the protocols for transmitting signals between the units and within the units can be the same or different.
As mentioned in the introductory part, the inventive methods, systems and applications are not limited to the communication and/or interaction with a vehicle driver.
For example, instead of a vehicle driver, the human being can be a captain or skipper of a motor boat, a sailing boat or another yacht or ship. In these cases, nomad applications 33 are e.g. mobile phones, portable media (e.g. CD) players, portable radios, PDAs and/or hand-held navigation systems like a GPS receiver etc. which the captain or skipper brings on board and uses during the operation of the yacht, boat or ship and which are handled according to the disclosure with respect to the above nomad applications 33.
The inventive methods, systems and applications are accordingly applicable in control stations for controlling and guiding ships in the vicinity of harbours and/or other areas or waterways with a high traffic density. In these cases, the human being is a traffic controller or officer who uses e.g. his own mobile phone or another nomad application 33 while supervising the ship traffic, wherein the mobile phone etc. is handled according to the disclosure with respect to the above nomad applications 33.
Furthermore, the inventive methods, systems and applications are accordingly applicable in power plants and in the industry and especially in control rooms where the human being is an operator who has to supervise and make sure that e.g. processes are running correctly. In this case, different alarms/messages for various events could be controlled by an interaction manager according to the above disclosure. If e.g. a malfunction occurs, only the most important and critical information (the information that requires immediate reaction by the operator) will be passed through. Different processes or components of processes can be considered as different applications with unique identifications. Additionally or alternatively, the operator can use e.g. his own mobile phone or another nomad application 33 during supervising the power plant, wherein such mobile phone etc. is again handled according to the disclosure with respect to the above nomad applications 33.
The inventive methods, systems and applications can for example be applied for air traffic control as well. In this case, the HMI 43 could be the interface between the air traffic control computer and the human being who is e.g. an air traffic controller. Aeroplanes could be considered to be nomad devices that fly in and out of the control zone being controlled by the air traffic control tower. Aeroplanes that enter the control zone use their transponder signal as identification. All communication between the air traffic controller and the pilot of the aeroplane is then conducted according to the inventive methods, systems and applications disclosed above. The aeroplanes can use dynamic priorities so that e.g. a message will become more urgent the closer the aeroplane comes to the control tower. The DVE state could be based on the amount of information currently being processed by the air traffic controller and/or the number of aeroplanes currently flying within his control zone. Safety-critical messages, e.g. when two aeroplanes fly very close to each other, are treated in the same way as disclosed above with respect to safety-critical messages. If a controller uses his own mobile phone or another nomad application as mentioned above, this is again handled according to the disclosure with respect to the above nomad applications 33.

Claims
1. Method for communication and/or interaction between a vehicle driver and a plurality of integrated and/or non-integrated applications like native vehicle applications and/or aftermarket applications and/or nomad applications, by means of a first routine executed by an interaction manager and by each application before communicating and/or interacting with the driver, wherein the first routine comprises the following steps:
- sending a request by the application to the interaction manager for permission to communicate and/or to interact with the driver;
- determining by the interaction manager whether a permission for the requested communication and/or interaction can be given, in dependence of a current and/or predicted driver/vehicle environment (DVE) state and/or of other requests which have been received before from this and/or other applications; and - generating and transmitting a response by the interaction manager to the application, comprising an indication about permitting or not permitting the requested communication and/or interaction; and wherein the requested communication and/or interaction is conducted by the application if the indication permits the same.
2. Method according to claim 1, wherein a priority index is assigned to at least one of the communications and/or interactions of at least one of the applications, wherein the priority index is submitted together with the request and wherein the determination of permission/non-permission is conducted by the interaction manager in dependence of the priority index.
3. Method according to claim 2, wherein the priority index is established by means of a standardized method.
4. Method according to claim 1, wherein the related request is stored in a first waiting queue in case the interaction manager determines that permission can currently not be given for the requested communication and/or interaction.
5. Method according to claim 1, wherein at least one of the requests comprises at least one of the following information about: an identification of the application sending the request, a duration and/or identification of communication and/or interaction requested by the request, a visual load, an auditory load and/or a haptic load, each imposed on the driver by the requested communication and/or interaction, and a type of the request.
6. Method according to claim 2, wherein the same request can repeatedly be sent, preferably with different priority indices, in case the interaction manager determined that permission cannot be given for a requested communication and/or interaction.
10 7. Method according to claim 6, wherein in case of repeatedly sending, the request comprises an information regarding its type which indicates whether to add this request to a first waiting queue or to replace or to delete another request already stored in the first waiting queue or the 75 request is sent n-times (n = 1, 2, 3,...) in case no response is received from the interaction manager.
8. Method according to claim 1, wherein at least one of the responses comprises at least one of the following infor- 20 mation about: an identification of the application which sent the related request, an identification of the communication and/or interaction requested and/or an answer related to the determination whether a permission for the requested communication and/or interaction can be given or not.
9. Method according to claim 1, wherein the application is a text converter, preferably a text-to-speech and/or a text-to-display converter, and wherein the communication and/or interaction is provided for outputting:
- a non-text message to the driver, preferably a speech message, and/or
- a message on a display.
10. Method according to claim 9, wherein the text is output as speech and/or displayed on the display, respectively, in the form of chunked segments and wherein each segment is considered as one communication and/or interaction.
11. Method according to claim 10, wherein the chunked segments are combined to form at least one logical unit when output.
12. Method according to claim 10, wherein the chunked segments are stored in a second waiting queue until a request for outputting a segment is answered by the interaction manager with a response permitting such output.
13. Method according to claim 12, wherein the chunked segments are stored in the second waiting queue until all segments have been output and wherein at least one of the stored segments can be output again if the segments are stored in the second waiting queue longer than a predetermined time.
14. Method according to claim 10, wherein chunked segments are generated from non-pre-chunked messages by means of a parser.
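As a hedged sketch of the text converter of claims 9 to 14 (not the claimed implementation), the code below shows a parser slicing a non-pre-chunked message into segments, a second waiting queue holding them, and output of one segment per permitted interaction. The class names, the one-sentence-per-chunk parser and the ask_permission callback are assumptions.

```python
from collections import deque


def parse_into_chunks(text: str) -> list[str]:
    """Naive stand-in for the parser of claim 14: one chunk per sentence."""
    return [s.strip() + "." for s in text.split(".") if s.strip()]


class TextConverter:
    def __init__(self, ask_permission):
        # ask_permission stands in for the request/response exchange of claim 1.
        self.ask_permission = ask_permission
        self.queue = deque()  # second waiting queue (claim 12)

    def submit(self, message: str) -> None:
        self.queue.extend(parse_into_chunks(message))

    def try_output_next(self) -> None:
        # Each chunked segment counts as one communication/interaction (claim 10).
        if self.queue and self.ask_permission():
            print("speak/display:", self.queue.popleft())


converter = TextConverter(ask_permission=lambda: True)
converter.submit("Low fuel. Nearest station in 3 km.")
converter.try_output_next()
converter.try_output_next()
```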
15. Method for communication and/or interaction between a vehicle driver and a plurality of integrated and/or non-integrated applications like e.g. native vehicle applications and/or aftermarket applications and/or nomad applications, by means of a second routine executed by an interaction manager, comprising the following steps:
- determining a current and/or a predicted driver/vehicle environment (DVE) state;
- controlling the configuration of a human machine interface (HMI) and/or a service state and/or a mode of operation or function(s) of at least one of the applications by the interaction manager in dependence of a driver/vehicle environment (DVE) state.
16. Method according to claim 15, wherein controlling the configuration is conducted by sending a message to the application by the interaction manager, wherein the message contains a driving situation index which is evaluated on the basis of a current and/or predicted driver/vehicle environment (DVE) state, and wherein the application receiving the message is provided for controlling its human machine interface (HMI) and/or its service state and/or its mode of operation or function(s) according to the driving situation index contained within the received message.
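For illustration of the second routine of claims 15 and 16 only, the following assumed sketch derives a coarse driving situation index from an estimated DVE state and shows a receiving application adapting its HMI accordingly. The index scale (0 to 3), the thresholds and the RadioApplication example are assumptions, not part of the claims.

```python
def driving_situation_index(dve_state: dict) -> int:
    """Map an estimated DVE state to a coarse index (0 = relaxed ... 3 = critical)."""
    if dve_state.get("critical_situation", False):
        return 3
    if dve_state.get("primary_task_demand", 0) > 7:
        return 2
    if dve_state.get("visual_distraction", 0) > 5:
        return 1
    return 0


class RadioApplication:
    """Receiving application: adapts its HMI/service state to the index (claim 16)."""

    def on_situation_message(self, index: int) -> None:
        if index >= 2:
            print("radio: muting audio and hiding secondary display content")
        else:
            print("radio: normal HMI")


app = RadioApplication()
app.on_situation_message(driving_situation_index({"primary_task_demand": 8}))
```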
17. Method according to claim 1 or 15, wherein a current and/or a predicted driver/vehicle environment (DVE) state is evaluated on the basis of at least one of the following criteria: primary task demand, secondary task demand, visual distraction, physiological driver impairment, driver characteristics, specific driving situations, overall driving environment type or context, and driver identity.
18. Method according to claim 17, wherein at least one of the criteria is evaluated from the output signals of at least one sensor for detecting a driver state and/or a vehicle state and/or an environmental state.
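As an assumed, non-authoritative sketch of the estimation described in claims 17 and 18, the code below combines a few sensor-derived criteria into a DVE estimate. The field names, the sensor signal names and the simple mappings are placeholders; the claims do not prescribe any particular fusion method.

```python
from dataclasses import dataclass


@dataclass
class DVEEstimate:
    primary_task_demand: float    # e.g. derived from speed, steering activity, road curvature
    secondary_task_demand: float  # e.g. derived from ongoing HMI interactions
    visual_distraction: float     # e.g. derived from an eye/head tracking sensor
    driver_impairment: float      # e.g. drowsiness derived from physiological sensors


def estimate_dve(sensor_signals: dict) -> DVEEstimate:
    # Each mapping below is a placeholder; a real estimator/predictor would fuse
    # several driver, vehicle and environment sensors (claim 18).
    return DVEEstimate(
        primary_task_demand=min(10.0, sensor_signals.get("speed_kmh", 0.0) / 12.0),
        secondary_task_demand=float(sensor_signals.get("active_interactions", 0)),
        visual_distraction=10.0 * sensor_signals.get("eyes_off_road_ratio", 0.0),
        driver_impairment=10.0 * sensor_signals.get("drowsiness", 0.0),
    )


print(estimate_dve({"speed_kmh": 90, "eyes_off_road_ratio": 0.2}))
```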
19. Method according to claim 1, wherein instead of a vehicle driver the communication and/or interaction is conducted between a human being and a plurality of signal or information sources which, in case of activation, have to be considered or handled in dependence on at least one certain state and/or on other such activated signal or information sources.
20. System especially for conducting a method according to at least one of the preceding claims, comprising an interaction manager (41), an estimator/predictor (42) for estimating and/or predicting a driver/vehicle environment (DVE) state and a plurality of integrated and/or non-integrated applications (2; 3).
21. System according to claim 20, comprising a sensor array (1) with a plurality of sensors (10, 11, 12, 13, 14) which are connected with the estimator/predictor (42) for estimating and/or predicting the driver/vehicle environment (DVE) state.
22. System according to claim 20, wherein at least one of the non-integrated applications (31) is equipped with its own human machine interface (34).
23. System according to claim 20, comprising an integrated human machine interface (43).
24. System according to claim 20, comprising a driver information unit (4) including at least one of the following components: the interaction manager (41), the estimator/predictor (42) for estimating and/or predicting the driver/vehicle environment (DVE) state and the integrated human machine interface (43).
25. System according to claim 20, wherein one of the applications is a text converter (26) which is provided for storing chunked text segments (X1, X2, X3) in a second waiting queue (262) and for outputting the text segments (X1, X2, X3) as speech and/or for displaying them on a display, respectively.
26. System according to claim 25, wherein the text converter (26) comprises a parser (261) for slicing text into chunked text segments (X1, X2, X3) and for storing the same in the second waiting queue (262).
27. System according to claim 25, wherein the text converter (26) comprises a speech generator (263) and/or a display which are provided for executing the first routine according to claim 1 before generating and outputting a chunked text segment (X1, X2, X3) as speech and/or displaying it on the display, respectively.
28. System according to claim 20, comprising a microcomputer for conducting a method according to at least one of claims 1 to 19.
29. System according to claim 20, comprising one central or main unit (51) and at least one local or subordinate unit (52, 53, 64, 65) which are connected to each other in a star network and/or a gateway architecture.
30. System according to claim 29, wherein the central unit (51) comprises an interaction manager or a driver vehicle environment manager (511) which controls a main information and resource manager (512) for providing information and resource management for the system and wherein the at least one local unit (52, 53, 64, 65) comprises a local resource manager (522, 532, 642, 652) which is provided for local resource management.
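The distributed variant of claims 29 and 30 can be pictured with the following assumed sketch: a central unit hosting the system-wide information and resource management, and local units each running only a local resource manager. Class names, the reserve/allocate operations and the example unit names are illustrative assumptions.

```python
class LocalResourceManager:
    """Local unit (claims 29, 30, 33): manages only its own local resources."""

    def __init__(self, name: str):
        self.name = name

    def reserve(self, resource: str) -> bool:
        print(f"{self.name}: reserving local resource '{resource}'")
        return True


class CentralUnit:
    """Central unit (claims 29, 30, 32): hosts the interaction/DVE manager and the
    main information and resource manager; all local units attach to it (star network)."""

    def __init__(self, local_units: list[LocalResourceManager]):
        self.local_units = local_units

    def allocate(self, resource: str, unit_name: str) -> bool:
        # The system-wide decision is taken centrally, execution is local (cf. claim 31).
        for unit in self.local_units:
            if unit.name == unit_name:
                return unit.reserve(resource)
        return False


central = CentralUnit([LocalResourceManager("cab-display"), LocalResourceManager("trailer-unit")])
central.allocate("popup", "cab-display")
```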
31. Method according to claim 1 or 15 for a system according to claim 29, wherein resource management is conducted within and/or between the local units and information and resource management is conducted between the central unit and at least one local unit.
32. Central unit (51) for, or being a part of, a system according to claim 29, comprising an interaction manager or driver vehicle environment manager (511) and a main information and resource manager (512) for providing information management and resource management for the system.
33. Local or subordinate unit (52, 53, 64, 65) for, or being a part of, a system according to claim 29, comprising a local resource manager (522, 532, 642, 652) which is provided for local resource management.
34. Driver information unit (4) for, or being a part of, a system according to claim 20 comprising at least one of the following components: an interaction manager (41), an estimator/predictor (42) for estimating and/or predicting the driver/vehicle environment (DVE) state and an integrated human machine interface (43).
35. Method for implementing in, or performed by, a driver information unit (4) according to claim 34 for receiving requests and generating responses according to the first routine and/or for controlling an application according to the second routine.
36. Interaction manager for, or being a part of, a system according to claim 20 or a driver information unit (4) according to claim 34 which is provided for executing a first routine according to claim 1 and/or a second routine according to claim 15.
37. Method for implementing in, or performed by, an interaction manager (41) according to claim 36 for receiving requests and generating responses according to the first routine and/or for controlling an application according to the second routine.
38. Method for communication and/or interaction between a vehicle driver and a plurality of integrated and/or non-integrated applications like e.g. native vehicle applications and/or aftermarket applications and/or nomad applications, by means of a third routine executed by an interaction manager, comprising the following steps:
- receiving a request from an application for permission to communicate and/or to interact with the driver;
- determining whether a permission for the requested communication and/or interaction can be given, in dependence of a current and/or predicted driver/vehicle environment (DVE) state and/or of other requests which have been received before from this and/or other applications; and
- generating and transmitting a response to the application, comprising an indication about permitting or not permitting the requested communication and/or interaction.
39. DVE state estimator/predictor (42) for, or being a part of, a system according to claim 20 or a driver information unit (4) according to claim 34 which is provided for estimating and/or predicting a driver/vehicle environment state.
40. Method for implementing in, or performed by, a DVE state estimator/predictor (42) according to claim 39 for estimating and/or predicting a driver/vehicle environment state.
41. Application for, or being a part of, a system according to claim 20, in the form of an integrated and/or non-integrated application (2; 3) like e.g. a native vehicle application, an aftermarket application or a nomad application, which is provided for executing the first routine according to claim 1 and/or for being controlled according to the second routine according to claim 15.
42. Method for implementing in, or performed by, an application according to claim 41, for generating requests and receiving responses according to the first routine and/or for being controlled according to the second routine.
43. Method for communication and/or interaction between a vehicle driver and a plurality of integrated and/or non-integrated applications like e.g. native vehicle applications and/or aftermarket applications and/or nomad applications, by means of an interaction manager and a fourth routine executed by at least one of the applications, comprising the following steps:
- sending a request to the interaction manager for permission to communicate and/or to interact with the driver;
- receiving a response from the interaction manager; and
- conducting the requested communication and/or interaction if the response comprises an indication which permits the same.
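For illustration only, an assumed application-side sketch of the fourth routine of claim 43: the application requests permission, waits for the response and interacts with the driver only if the response permits it. send_request and receive_response are stand-in transport stubs, not a defined interface.

```python
def send_request(interaction_id: str) -> None:
    print("request sent for", interaction_id)


def receive_response() -> dict:
    # Stand-in for waiting on the interaction manager's answer.
    return {"interaction_id": "low-fuel-warning", "permitted": True}


def run_fourth_routine(interaction_id: str) -> None:
    send_request(interaction_id)
    response = receive_response()
    if response and response["permitted"]:
        print("interacting with the driver:", interaction_id)
    # Otherwise the application may re-send the request later, possibly with a
    # different priority index (cf. claim 6).


run_fourth_routine("low-fuel-warning")
```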
44. Adapter for, or being a part of, an application for, or being a part of, a system according to claim 20 for connecting the application with the system and providing the application with capability to execute the first routine according to claim 1 and/or for being controlled according to the second routine according to claim 15.
45. Method for implementing in, or performed by, an adapter according to claim 44, for providing the application with capability to generate requests and receive responses according to the first routine and/or for controlling the application according to the second routine.
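The adapter of claims 44 and 45 can be sketched, under assumed names only, as a wrapper that executes the first routine on behalf of an application that has no notion of the interaction manager and lets driver-directed output through only when permitted.

```python
class LegacyNavigationApp:
    """Example application with no notion of the interaction manager."""

    def next_announcement(self) -> str:
        return "Turn left in 200 m"


class InteractionAdapter:
    """Adapter (claims 44, 45): executes the first routine on the application's behalf."""

    def __init__(self, app: LegacyNavigationApp, ask_permission):
        self.app = app
        self.ask_permission = ask_permission  # stands in for the request/response exchange

    def output_if_permitted(self) -> None:
        message = self.app.next_announcement()
        if self.ask_permission():
            print("to driver:", message)
        else:
            print("held back:", message)


adapter = InteractionAdapter(LegacyNavigationApp(), ask_permission=lambda: True)
adapter.output_if_permitted()
```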
46. Computer program comprising computer program code means adapted to perform a method according to at least one of claims 1 to 19 when said program is run on a programmable microcomputer.
47. Computer program according to claim 46 adapted to be downloaded to the system according to claim 20 or one of its components when run on a computer which is connected to the internet.
48. Computer program product stored on a computer usable medium, comprising computer program code means according to claim 46.
PCT/EP2004/013229 2003-11-20 2004-11-22 Method and system for interact between a vehicle driver and a plurality of applications WO2005055046A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP04803214.8A EP1706815B1 (en) 2003-11-20 2004-11-22 Methods for interaction between a vehicle driver and a plurality of applications
JP2006540356A JP4659754B2 (en) 2003-11-20 2004-11-22 Method and system for interaction between vehicle driver and multiple applications
BRPI0416839-9A BRPI0416839A (en) 2003-11-20 2004-11-22 method and system for interacting between a vehicle driver and a plurality of applications
US11/419,511 US8009025B2 (en) 2003-11-20 2006-05-22 Method and system for interaction between a vehicle driver and a plurality of applications

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
SESE0303122-6 2003-11-20
SE0303122A SE0303122D0 (en) 2003-11-20 2003-11-20 Method and system for communication and / or interaction between a vehicle driver and a plurality of applications
SEPCT/SE03/001833 2003-11-25
PCT/SE2003/001833 WO2005050522A1 (en) 2003-11-20 2003-11-25 Method and system for communication and/or interaction between a vehicle driver and a plurality of applications

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/419,511 Continuation US8009025B2 (en) 2003-11-20 2006-05-22 Method and system for interaction between a vehicle driver and a plurality of applications

Publications (1)

Publication Number Publication Date
WO2005055046A1 (en) 2005-06-16

Family

ID=34656366

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2004/013229 WO2005055046A1 (en) 2003-11-20 2004-11-22 Method and system for interact between a vehicle driver and a plurality of applications

Country Status (2)

Country Link
JP (1) JP4659754B2 (en)
WO (1) WO2005055046A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008082350A1 (en) * 2006-12-29 2008-07-10 Scania Cv Ab (Publ) Device and method for prioritizing audio in a vehicle
WO2008082351A1 (en) * 2006-12-29 2008-07-10 Scania Cv Ab (Publ) Device and method for managing audio messages in a vehicle
EP2050610A1 (en) * 2007-10-17 2009-04-22 Audi AG Motor vehicle
EP2163450A1 (en) 2008-06-25 2010-03-17 Ford Global Technologies, LLC Method for allowing of suppressing a request for presenting information to a user
DE102009059141A1 (en) * 2009-10-08 2011-04-14 Bayerische Motoren Werke Aktiengesellschaft Method for integrating a component in an information system of a vehicle
DE102009059142A1 (en) * 2009-10-08 2011-04-14 Bayerische Motoren Werke Aktiengesellschaft Method for integrating component in information system of vehicle, involves providing applications to user of vehicle by human-machine-interface of information system, where application is accessed through program interface at parameter
DE102009059140A1 (en) * 2009-10-08 2011-04-14 Bayerische Motoren Werke Aktiengesellschaft Method for integrating component in information system of vehicle, involves providing priority value to applications relative to human-machine-interface, where priority value provides position for treating one application
GB2500581A (en) * 2012-03-23 2013-10-02 Jaguar Cars Controlling the output of information to a driver based on an estimated driver workload
US9013312B2 (en) 2005-06-20 2015-04-21 Biovigil Hygiene Technologies, Llc Hand cleanliness
US10713925B2 (en) 2005-06-20 2020-07-14 Biovigil Hygiene Technologies, Llc Hand cleanliness
US10752252B2 (en) 2013-03-15 2020-08-25 Honda Motor Co., Ltd. System and method for responding to driver state
DE102019118189A1 (en) * 2019-07-05 2021-01-07 Bayerische Motoren Werke Aktiengesellschaft Coupling of user interfaces
US11069220B2 (en) 2017-07-10 2021-07-20 Biovigil Hygiene Technologies, Llc Hand cleanliness monitoring

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4736780B2 (en) * 2005-12-16 2011-07-27 セイコーエプソン株式会社 Positioning device, positioning method, positioning program.
US9641625B2 (en) * 2009-06-09 2017-05-02 Ford Global Technologies, Llc Method and system for executing an internet radio application within a vehicle
US8972106B2 (en) 2010-07-29 2015-03-03 Ford Global Technologies, Llc Systems and methods for scheduling driver interface tasks based on driver workload
US9213522B2 (en) 2010-07-29 2015-12-15 Ford Global Technologies, Llc Systems and methods for scheduling driver interface tasks based on driver workload
BR122013012796A2 (en) * 2010-07-29 2019-08-06 Ford Global Technologies, Llc. VEHICLE AND METHOD FOR MANAGING INTERFACE TASKS WITH A DRIVER
JP2014000955A (en) * 2013-07-30 2014-01-09 Ford Global Technologies Llc Method for managing driver interface task, and vehicle
JP6819529B2 (en) * 2017-09-27 2021-01-27 株式会社デンソー Information processing equipment, information processing system, and information processing method
WO2024058027A1 (en) * 2022-09-13 2024-03-21 株式会社デンソー Onboard device, center device, vehicle control program, and vehicle control method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10153987A1 (en) * 2001-11-06 2003-05-28 Daimler Chrysler Ag Information system in a vehicle
DE10162653A1 (en) * 2001-12-20 2003-07-03 Bosch Gmbh Robert Method and system for displaying information and vehicle infotainment system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001143191A (en) * 1999-11-12 2001-05-25 Yazaki Corp Vehicle information processing method and device and vehicle
US6925425B2 (en) * 2000-10-14 2005-08-02 Motorola, Inc. Method and apparatus for vehicle operator performance assessment and improvement

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10153987A1 (en) * 2001-11-06 2003-05-28 Daimler Chrysler Ag Information system in a vehicle
DE10162653A1 (en) * 2001-12-20 2003-07-03 Bosch Gmbh Robert Method and system for displaying information and vehicle infotainment system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Standard SAE J2395 : ITS In-Vehicle Message Priority", February 2002, SAE (SOCIETY OF AUTOMOTIVE ENGINEERS), WARRENDALE PA USA, XP008044585 *
KNOLL P M ET AL: "ADVANCED INTEGRATED DRIVER INFORMATION SYSTEMS", MEASUREMENT AND CONTROL, INSTITUTE OF MEASUREMENT AND CONTROL. LONDON, GB, vol. 25, no. 9, 1 November 1992 (1992-11-01), pages 264 - 268, XP000320446, ISSN: 0020-2940 *
MALEC J ET AL: "Driver support in intelligent autonomous cruise control", PROCEEDINGS OF THE INTELLIGENT VEHICLES '94 SYMPOSIUM (CAT. NO.94TH8011) IEEE NEW YORK, NY, USA, 24 October 1994 (1994-10-24), pages 160 - 164, XP010258314, ISBN: 0-7803-2135-9 *

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11538329B2 (en) 2005-06-20 2022-12-27 Biovigil Hygiene Technologies, Llc Hand cleanliness
US10713925B2 (en) 2005-06-20 2020-07-14 Biovigil Hygiene Technologies, Llc Hand cleanliness
US9013312B2 (en) 2005-06-20 2015-04-21 Biovigil Hygiene Technologies, Llc Hand cleanliness
WO2008082350A1 (en) * 2006-12-29 2008-07-10 Scania Cv Ab (Publ) Device and method for prioritizing audio in a vehicle
WO2008082351A1 (en) * 2006-12-29 2008-07-10 Scania Cv Ab (Publ) Device and method for managing audio messages in a vehicle
DE112007003199T5 (en) 2006-12-29 2010-01-28 Scania Cv Ab (Publ) Device and method for managing audio messages in a vehicle
EP2050610A1 (en) * 2007-10-17 2009-04-22 Audi AG Motor vehicle
EP2163450A1 (en) 2008-06-25 2010-03-17 Ford Global Technologies, LLC Method for allowing of suppressing a request for presenting information to a user
DE102009059141A1 (en) * 2009-10-08 2011-04-14 Bayerische Motoren Werke Aktiengesellschaft Method for integrating a component in an information system of a vehicle
DE102009059142A1 (en) * 2009-10-08 2011-04-14 Bayerische Motoren Werke Aktiengesellschaft Method for integrating component in information system of vehicle, involves providing applications to user of vehicle by human-machine-interface of information system, where application is accessed through program interface at parameter
DE102009059140A1 (en) * 2009-10-08 2011-04-14 Bayerische Motoren Werke Aktiengesellschaft Method for integrating component in information system of vehicle, involves providing priority value to applications relative to human-machine-interface, where priority value provides position for treating one application
US9575771B2 (en) 2009-10-08 2017-02-21 Bayerische Motoren Werke Aktiengesellschaft Method for integrating a component into an information system of a vehicle
US9669840B2 (en) 2012-03-23 2017-06-06 Jaguar Land Rover Limited Control system and method
GB2500581B (en) * 2012-03-23 2014-08-20 Jaguar Land Rover Ltd Method and system for controlling the output of information to a driver based on an estimated driver workload
GB2500581A (en) * 2012-03-23 2013-10-02 Jaguar Cars Controlling the output of information to a driver based on an estimated driver workload
US10752252B2 (en) 2013-03-15 2020-08-25 Honda Motor Co., Ltd. System and method for responding to driver state
US10759436B2 (en) 2013-03-15 2020-09-01 Honda Motor Co., Ltd. System and method for responding to driver state
US10759438B2 (en) 2013-03-15 2020-09-01 Honda Motor Co., Ltd. System and method for responding to driver state
US10759437B2 (en) 2013-03-15 2020-09-01 Honda Motor Co., Ltd. System and method for responding to driver state
US10780891B2 (en) 2013-03-15 2020-09-22 Honda Motor Co., Ltd. System and method for responding to driver state
US11383721B2 (en) 2013-03-15 2022-07-12 Honda Motor Co., Ltd. System and method for responding to driver state
US11069220B2 (en) 2017-07-10 2021-07-20 Biovigil Hygiene Technologies, Llc Hand cleanliness monitoring
US11704992B2 (en) 2017-07-10 2023-07-18 Biovigil Hygiene Technologies, Llc Hand cleanliness monitoring
DE102019118189A1 (en) * 2019-07-05 2021-01-07 Bayerische Motoren Werke Aktiengesellschaft Coupling of user interfaces

Also Published As

Publication number Publication date
JP2007511414A (en) 2007-05-10
JP4659754B2 (en) 2011-03-30

Similar Documents

Publication Publication Date Title
EP1706815B1 (en) Methods for interaction between a vehicle driver and a plurality of applications
WO2005055046A1 (en) Method and system for interact between a vehicle driver and a plurality of applications
US9612999B2 (en) Method and system for supervising information communication based on occupant and vehicle environment
JP2007511414A6 (en) Method and system for interaction between vehicle driver and multiple applications
US8400332B2 (en) Emotive advisory system including time agent
EP3410239B1 (en) Vehicle control method and system
US9188449B2 (en) Controlling in-vehicle computing system based on contextual data
US20110172873A1 (en) Emotive advisory system vehicle maintenance advisor
US20170168689A1 (en) Systems and methods for providing vehicle-related information in accord with a pre-selected information-sharing mode
US20110093158A1 (en) Smart vehicle manuals and maintenance tracking system
US20090299569A1 (en) Apparatrus for assisting driving of a vehicle and method for operating the apparatus
US20050004724A1 (en) Switch apparatus for a driver information interface
CN111885572A (en) Communication control method based on intelligent cabin and intelligent cabin
CN112463271A (en) Method and apparatus for presenting information on a vehicle display
EP1549526B1 (en) Information interface and method of managing driver information
US20200231173A1 (en) System and method for providing a notification to an occupant of a vehicle
CN115915991A (en) Interactive protection system
US20130338919A1 (en) User-centric platform for dynamic mixed-initiative interaction through cooperative multi-agent community
CN118140496A (en) Method, device and storage medium for scheduling notifications based on driving assistance function
CN216002550U (en) Automatic driving graded takeover interaction system
CN115700203A (en) User interface for non-monitoring time period allocation during automatic control of a device
Campbell et al. 15 HMI Design for Automated, Connected, and Intelligent Vehicles
Stoter et al. Context-aware prioritization of information: an architecture for real-time in-vehicle information management
CN117203110A (en) Method and device for prompting vehicle state information
CN118295517A (en) Method for providing user-specific information

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200480034396.6

Country of ref document: CN

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2006540356

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 11419511

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

WWE Wipo information: entry into national phase

Ref document number: 2004803214

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2004803214

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 11419511

Country of ref document: US

ENP Entry into the national phase

Ref document number: PI0416839

Country of ref document: BR