WO2018134196A1 - Communication control apparatus and method - Google Patents


Info

Publication number
WO2018134196A1
Authority
WO
WIPO (PCT)
Prior art keywords
dialogue
output
component
processor
dialogue component
Application number
PCT/EP2018/050998
Other languages
French (fr)
Inventor
Harpreet Singh
Ben ANYASODO
Francesco BIONDI
Original Assignee
Jaguar Land Rover Limited
Application filed by Jaguar Land Rover Limited
Publication of WO2018134196A1


Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
                • G06F 3/0481: based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
            • G06F 3/16: Sound input; Sound output
              • G06F 3/165: Management of the audio stream, e.g. setting of volume, audio stream path
      • G07: CHECKING-DEVICES
        • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
          • G07C 5/00: Registering or indicating the working of vehicles
            • G07C 5/08: Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
              • G07C 5/0816: Indicating performance data, e.g. occurrence of a malfunction
                • G07C 5/0833: using audio means
    • B: PERFORMING OPERATIONS; TRANSPORTING
      • B60: VEHICLES IN GENERAL
        • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
          • B60R 16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
            • B60R 16/02: electric constitutive elements
              • B60R 16/037: for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
                • B60R 16/0373: Voice control

Definitions

  • the present disclosure relates to a communication control apparatus and method.
  • the present disclosure relates to a communication controller, to a vehicle and to a method.
  • it is known to incorporate a virtual agent into a traditional consumer device, such as a laptop, a tablet computer, a cellular telephone and a personal computer. It is also known to implement a virtual agent in a standalone device to function as a digital assistant.
  • the device may provide a visual indication of the status of the agent, for example to indicate that the agent is active, listening or off. For example, an icon may be displayed on a display screen. The shape and/or colour of the icon may change to indicate different system states.
  • with increasingly complex interfaces on vehicles, it is expected that agents will be incorporated into vehicle interfaces. However, it is envisaged that this will present unique challenges not encountered in existing consumer device applications. For example, the agent may be responsible for communicating critical information to the driver of the vehicle. It is against this backdrop that the present invention(s) have been conceived.
  • a communication controller for controlling a vehicle-initiated dialogue (VID) comprising at least one dialogue component, the communication controller comprising:
  • At least one processor configured to control output of said at least one dialogue component
  • a memory connected to the at least one processor
  • the at least one processor is configured to: identify a dialogue component to be output; output an advance notification signal to generate an advance notification that the identified dialogue component is to be output; and output the identified dialogue component after said advance notification signal.
  • the advance notification may provide an alert for the driver before a new dialogue is initiated. At least in certain embodiments the advance notification may help to reduce driver distraction and aid driver comprehension of the information output. It will be understood, therefore, that information may be communicated more effectively to the driver. At least in certain embodiments, the driver may terminate, pause or postpone the dialogue before the dialogue component is output.
  • the advance notification may comprise one or more of the following: an audio output; a visual output; and a haptic output.
  • the advance notification may be independent of the content and/or criticality of the dialogue component to be output.
  • the format of the advance notification may be standardised.
  • a generic or non-specific advance notification may be output irrespective of the content and/or criticality of the dialogue component.
  • the advance notification may comprise or consist of a haptic output.
  • the advance notification may be varied in dependence on the content and/or criticality of the dialogue component to be output.
  • the magnitude and/or intensity of the advance notification may be modified in dependence on a predefined criticality of the dialogue component.
  • the advance notification signal may optionally comprise a timing or scheduling component to indicate when the identified dialogue component is scheduled to be output.
  • the advance notification may be generated a predetermined time period prior to the scheduled output time of said dialogue component.
  • the predetermined time period may be between five (5) and eleven (11) seconds inclusive, for example the predetermined time period may be five (5), seven (7), nine (9) or eleven (11) seconds. Other time periods are useful.
  • the at least one processor may be configured to check for a user input signal after output of the advance notification signal.
  • the at least one processor may be configured to control output of said dialogue component in dependence on said user input signal.
  • the at least one processor may be configured to check for a user input signal comprising a cancellation request.
  • the at least one processor may be configured to cancel output of said dialogue component upon detection of said cancellation request.
  • the at least one processor may be configured to check for a user input signal comprising a postpone request.
  • the at least one processor may be configured to postpone output of said dialogue component upon detection of said postpone request.
  • the time period between outputting the advance notification signal and outputting the identified dialogue component may be increased in dependence on receipt of the postpone request.
  • the at least one processor may be configured to check for a user input signal comprising an expedite request.
  • the at least one processor may be configured to expedite output of said dialogue component upon detection of said expedite request.
  • the time period between outputting the advance notification signal and outputting the identified dialogue component may be reduced in dependence on receipt of the expedite request.
  • the at least one processor may be configured to output the dialogue component immediately upon detection of said expedite request.
  • the at least one processor may be configured to output said dialogue component after a predetermined time period in the absence of a user input signal to the advance notification.
  • the dialogue component may be an initial dialogue component.
  • the initial dialogue component is the first (i.e. opening) dialogue component of a new dialogue initiated by the communication controller.
  • Output of the advance notification informs the driver that a new dialogue is about to start.
  • the at least one processor may be configured to output a conclusion notification signal to generate a conclusion notification. Output of the conclusion notification informs the driver that the current dialogue has concluded.
  • the conclusion notification signal may be output after a concluding dialogue component is output.
  • the concluding dialogue component is the final dialogue component of the current dialogue.
  • the at least one processor may be configured to output a plurality of dialogue components. For example, a series or sequence of dialogue components may be output. The dialogue components may form a dialogue chain.
  • the conclusion notification may be output only after output of a final dialogue component in a dialogue sequence. At least in certain embodiments, a conclusion notification is not output for each dialogue component which precedes the final dialogue component in the dialogue sequence.
  • the conclusion notification may be output after a predetermined time period has elapsed following output of the final dialogue component in a dialogue sequence.
  • a single conclusion notification may be output in respect of each dialogue sequence. The conclusion notification may thereby mark or indicate the conclusion of a given dialogue sequence comprising one or more dialogue component.
  • an advance notification may be output only before output of the first dialogue component in a dialogue sequence. At least in certain embodiments, a single advance notification may be output in respect of each dialogue sequence.
  • the dialogue sequence may comprise a linear path.
  • the dialogue sequence may comprise a branching path.
  • the dialogue sequence may, for example, be generated by traversing a dialogue tree comprising one or more decision node. The traversal path through the tree structure may be selected in dependence on a user input or a detected response at each decision node.
  • the detected response may comprise a user gesture or action, for example detected by processing image data generated by a user-facing camera. Alternatively, or in addition, the detected response may comprise a user voice command.
  • if the dialogue consists of a single dialogue component, the output dialogue component may function as both the initial and concluding dialogue components. As such, an advance notification and/or a conclusion notification may be output in dependence on said dialogue component.
  • a communication controller for controlling a vehicle-initiated dialogue comprising at least one dialogue component, the communication controller comprising:
  • At least one processor configured to control output of said at least one dialogue component
  • a memory connected to the at least one processor
  • the at least one processor is configured to: output a conclusion notification signal to generate a conclusion notification when the vehicle-initiated dialogue is concluded.
  • the at least one processor may be configured to output a plurality of dialogue components. For example, a series or sequence of dialogue components may be output. The dialogue components may form a dialogue chain.
  • the conclusion notification may be output only after output of a final dialogue component in a dialogue sequence. At least in certain embodiments, a conclusion notification is not output for each dialogue component which precedes the final dialogue component in the dialogue sequence.
  • the conclusion notification may be output after a predetermined time period has elapsed following output of the final dialogue component in a dialogue sequence.
  • a single conclusion notification may be output in respect of each dialogue sequence. The conclusion notification may thereby mark or indicate the conclusion of a given dialogue sequence comprising one or more dialogue component.
  • an advance notification may be output only before output of the first dialogue component in a dialogue sequence. At least in certain embodiments, a single advance notification may be output in respect of each dialogue sequence.
  • the dialogue sequence may comprise a linear path.
  • the dialogue sequence may comprise a branching path.
  • the dialogue sequence may, for example, be generated by traversing a dialogue tree comprising one or more decision node. The traversal path through the tree structure may be selected in dependence on a user input or a detected response at each decision node.
  • a vehicle comprising a communication controller as described herein.
  • the advance notification signal (ANS) may be output to one or more of the following: a display screen; an audio device; and a haptic device.
  • a method of controlling a vehicle-initiated dialogue (VID) comprising at least one dialogue component, the method comprising: identifying a dialogue component to be output; generating an advance notification prior to output of said dialogue component; and outputting the identified dialogue component after said advance notification.
  • the advance notification may be output only before output of the first dialogue component in a dialogue sequence. At least in certain embodiments, a single advance notification may be output in respect of each dialogue sequence.
  • the method may comprise checking for a user input after generating the advance notification and controlling output of said dialogue component in dependence on said user input.
  • the user input may comprise a cancellation request.
  • the method may comprise cancelling output of said dialogue component upon detection of said cancellation request.
  • the user input may comprise a postpone request.
  • the method may comprise postponing output of said dialogue component upon detection of said postpone request.
  • the time period between outputting the advance notification signal and outputting the identified dialogue component may be increased in dependence on receipt of the postpone request.
  • the user input may comprise an expedite request.
  • the method may comprise expediting output of said dialogue component upon detection of said expedite request.
  • the time period between outputting the advance notification signal and outputting the identified dialogue component may be reduced in dependence on receipt of the expedite request.
  • the method may comprise outputting said dialogue component after a predetermined time period in the absence of a user input signal.
  • the dialogue component may be an initial dialogue component.
  • the method may comprise outputting a conclusion notification when the vehicle-initiated dialogue is concluded.
  • the method may comprise outputting a plurality of dialogue components.
  • a series or sequence of dialogue components may be output.
  • the dialogue components may form a dialogue chain.
  • the conclusion notification may be output only after output of a final dialogue component in a dialogue sequence.
  • a conclusion notification is not output for each dialogue component which precedes the final dialogue component in the dialogue sequence.
  • the conclusion notification may be output after a predetermined time period has elapsed following output of the final dialogue component in a dialogue sequence.
  • a single conclusion notification may be output in respect of each dialogue sequence.
  • the conclusion notification may thereby mark or indicate the conclusion of a given dialogue sequence comprising one or more dialogue component.
  • an advance notification may be output only before output of the first dialogue component in a dialogue sequence.
  • a single advance notification may be output in respect of each dialogue sequence.
  • the dialogue sequence may comprise a linear path.
  • the dialogue sequence may comprise a branching path.
  • the dialogue sequence may, for example, be generated by traversing a dialogue tree comprising one or more decision node. The traversal path through the tree structure may be selected in dependence on a user input or a detected response at each decision node.
  • the method may comprise outputting a plurality of dialogue components.
  • a series or sequence of dialogue components may be output.
  • the dialogue components may form a dialogue chain.
  • the conclusion notification may be output only after output of a final dialogue component in a dialogue sequence.
  • a conclusion notification is not output for each dialogue component which precedes the final dialogue component in the dialogue sequence.
  • the conclusion notification may be output after a predetermined time period has elapsed following output of the final dialogue component in a dialogue sequence.
  • a single conclusion notification may be output in respect of each dialogue sequence.
  • the conclusion notification may thereby mark or indicate the conclusion of a given dialogue sequence comprising one or more dialogue component.
  • an advance notification may be output only before output of the first dialogue component in a dialogue sequence.
  • a single advance notification may be output in respect of each dialogue sequence.
  • the dialogue sequence may comprise a linear path.
  • the dialogue sequence may comprise a branching path.
  • the dialogue sequence may, for example, be generated by traversing a dialogue tree comprising one or more decision node. The traversal path through the tree structure may be selected in dependence on a user input or a detected response at each decision node.
  • a non-transitory computer readable media comprising a set of computational instructions which, when executed, cause a computer to implement the method described herein.
  • control unit or controller described herein may suitably comprise a computational device having one or more electronic processors.
  • the system may comprise a single control unit or electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers.
  • controller or “control unit” will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality.
  • a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein.
  • the set of instructions may suitably be embedded in said one or more electronic processors.
  • the set of instructions may be provided as software saved on one or more memory associated with said controller to be executed on said computational device.
  • the control unit or controller may be implemented in software run on one or more processors.
  • One or more other control unit or controller may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
  • Figure 1 shows a schematic representation of a vehicle incorporating a communication controller in accordance with an embodiment of the present invention
  • Figure 2 shows a flow diagram representing operation of the communication controller according to an embodiment of the present invention
  • Figures 3A, 3B and 3C illustrate a display screen comprising an icon indicating the status of the communication controller
  • Figures 4A, 4B and 4C illustrate respective dialogue components output by the communication controller to the display screen.
  • the communication system 1 comprises a communication controller 2.
  • the communication controller 2 in the present embodiment is adapted to control communication with a driver of the vehicle V, but it will be understood that communication with other occupants of the vehicle V is also possible.
  • the communication controller 2 is configured to provide an agent which implements a vehicle-initiated dialogue (VID) for communicating information to the vehicle driver.
  • the VID is operative to initiate communication with the vehicle driver, for example to start a dialogue with the vehicle driver to convey system information.
  • the VID may, for example, be configured to convey one or more of the following categories of information to the vehicle driver:
  • Warning (i.e. Major/Minor vehicle warning, external warnings, safety advance notifications);
  • Recommendations (i.e. Help and guidance, recommendation to use some features);
  • Feedback (i.e. confirmation of actions, External messages, status message, information); and
  • Reminders (i.e. Configuration message, Events calendar reminder, scheduled maintenance, MOT and other external reminders).
  • the communication controller 2 implements a VID to interact with the driver to help improve vehicle performance, to improve the driving experience and to help reduce distraction.
  • the communication controller 2 comprises an electronic processor 3 configured to execute a set of computational instructions stored on a non-transitory computer readable media.
  • the computational instructions are stored in a system memory 4 connected to the electronic processor 3.
  • the system memory 4 may, for example, comprise a memory device.
  • the electronic processor 3 is configured to implement the VID function of the communication controller 2.
  • the electronic processor 3 is configured to identify information to be output to the vehicle driver and to schedule the output of that information.
  • the electronic processor 3 communicates with a plurality of vehicle systems 5-n over a communication network 6.
  • the vehicle systems 5-n each publish one or more data signal S-n to the communication network 6.
  • the data signals S-n each comprise information relating to an on-board vehicle system, for example vehicle speed data and/or fault data; or an external system, such as local traffic.
  • the system memory 4 comprises a plurality of dialogue components 7 for communicating information to the vehicle driver.
  • the dialogue components 7 may be adapted to convey information derived from one or more of said data signals S-n.
  • the electronic processor 3 is configured to select one or more of said dialogue components 7 in dependence on the data signals S-n received from the vehicle systems 5-n.
  • the selected dialogue component(s) 7 is then output to communicate information to the vehicle driver.
  • the electronic processor 3 may select and output a plurality of said dialogue components 7, for example to be output in a sequence.
  • the dialogue components 7 may be arranged in a dialogue tree which is traversed in dependence on one or more user inputs.
  • the dialogue components 7 may each comprise or consist of an audio output and/or a visual output.
  • a flow chart 100 is shown in Figure 2 to illustrate the operating states of the communication controller 2.
  • the communication controller 2 switches between an idle state ST1; an initiate communication state ST2; a communicating information state ST3; and a finish dialogue state ST4.
  • the communication controller 2 defaults to the idle state ST1 when activated.
  • the operating state of the communication controller 2 may be indicated to the driver.
  • an icon 8 is output to provide a visual indication of the operating state of the communication controller 2.
  • the electronic processor 3 is configured to read the data signals S-n published to the communication network 6 and to identify information to be communicated to the vehicle driver.
  • the electronic processor 3 may identify information to be output to the vehicle driver based on predefined criteria.
  • the electronic processor 3 may monitor the data signals S-n published to the communication network 6 to determine whether the operating conditions of a vehicle system 5-n are outside predefined operating parameters, for example above an upper threshold and/or below a lower threshold. Upon identification of information to be communicated to the vehicle driver, the operating status of the communication controller 2 changes to the initiate communication state ST2. The electronic processor 3 identifies one or more dialogue component 7 for output in dependence on the data signals S-n. The electronic processor 3 then schedules output of said one or more selected dialogue component 7. The scheduling may, for example, depend on the criticality of the information to be conveyed to the vehicle driver. Information having a higher criticality rating will be identified for output sooner than information having a lower criticality rating.
  • the communication controller 2 then changes to the initiate communication state ST2 and the electronic processor 3 is configured to generate an advance notification signal ANS.
  • the advance notification signal ANS is output to signal to the driver that the communication controller 2 is preparing to initiate a new dialogue.
  • the advance notification signal ANS is output prior to output of an initial dialogue component 7 which is to open the new dialogue with the driver.
  • the advance notification signal ANS is output to a notification generating means 9.
  • the notification generating means 9 generates an advance notification which alerts the vehicle driver that the communication controller 2 has identified a dialogue component 7 to be output. This advance notification helps to reduce driver distraction when the new dialogue commences and also ensures that the driver is ready to receive the information contained in the dialogue, thereby aiding comprehension.
  • the advance notification is independent of the content of the initial dialogue component 7 and/or the criticality of the initial dialogue component 7.
  • the advance notification may comprise one or more of the following: an audio output, a visual output and a haptic output.
  • the notification generating means 9 may be implemented using an existing vehicle system.
  • a visual output of the advance notification may be output to a display screen 10 in the vehicle 1, for example disposed in the instrument cluster, a dashboard or a centre console.
  • an audio output of the advance notification may be output by an on-board audio entertainment system 11.
  • a haptic output of the advance notification may be output by a vibration generator 12 disposed in a steering wheel or a driver seat.
  • the operating status of the communication controller 2 is held in the initiate communication state ST2 for a predetermined time period following output of the advance notification signal ANS.
  • the predetermined time period can be calibrated, but a time period of between five (5) and eleven (11) seconds inclusive is envisaged. Other time periods are useful.
  • the delay provides the driver an opportunity to delay output of the initial dialogue component 7, for example if they do not wish to receive new information due to current workload.
  • the electronic processor 3 checks for a user input signal SIN and output of the initial dialogue component 7 may be controlled in dependence on any such user input signal.
  • the user input signal SIN may be generated when the vehicle driver actuates an input device 13, such as a button, a switch, or a capacitive sensor.
  • the input device 13 may be disposed on a steering wheel, in a centre console, or a control panel.
  • the user input signal SIN may, for example, comprise a cancellation request to cancel output of the initial dialogue component 7; and/or a postpone request to delay or postpone output of the initial dialogue component 7.
  • the user input signal may comprise an expedite request to reduce or remove any delay in outputting the initial dialogue component 7.
  • the communication controller 2 changes to the communicating information state ST3.
  • the electronic processor 3 is configured to output the initial dialogue component 7 and any subsequent dialogue components 7.
  • the initial dialogue component 7 may comprise a visual output which is output to the display screen 10; and/or an audio output which is output over the audio entertainment system 11.
  • the electronic processor 3 may output additional dialogue components 7, for example to provide additional information or to provide updated information.
  • the electronic processor 3 may select and output additional dialogue components 7 in dependence on a user input signal SIN.
  • the electronic processor 3 may traverse a dialogue tree comprising multiple dialogue components in dependence on one or more user input signals SIN.
  • the communication controller 2 remains in the communicating information state ST3 as long as it is communicating with the vehicle driver, including outputting dialogue components 7 and, if appropriate, waiting for a driver response. Once the driver has made a final decision, the communication controller 2 confirms this to the driver.
  • the electronic processor 3 may be configured to determine that the current dialogue is finished when one or more of the following conditions is identified: a time-out condition, for example after a predetermined time period has elapsed; a user request provided in a user input signal SIN; no further dialogue components 7 for output, for example upon completion of a traversal of a dialogue tree; and determination that the information relating to said one or more vehicle systems 5-n is no longer relevant, for example a speed limit warning may be removed when the vehicle speed decreases below the speed limit. A minimal sketch of this end-of-dialogue check is given after this list.
  • the electronic processor 3 determines that the current dialogue is complete and the communication controller 2 switches to the finish dialogue state ST4.
  • the electronic processor 3 may optionally output a conclusion notification signal CNS to the notification generating means 9.
  • the notification generating means 9 may output a conclusion notification to indicate that the vehicle-initiated dialogue has been concluded.
  • the conclusion notification may be the same as or different from the advance notification.
  • the communication controller 2 then reverts to the idle state ST1.
  • the icon 8 is output to the display screen 10 to provide a visual indication of the operating status and/or condition of the communication controller 2.
  • the icon 8 may, for example, represent one or more of the following: the status of the communication controller 2 (On, Off, Idle, Listening, About to start, Finish and Error); the criticality of information (High Priority, Normal Priority and Low Priority); the perceived effect of information (indicating how information may affect a driver's behaviour).
  • the shape of the icon 8 may be varied to indicate the status of the communication controller 2; the colour of the icon 8 may be varied to indicate the criticality of information to be communicated; and a change in depth of the icon 8 may be used to indicate the perceived effect of the information. An illustrative mapping along these lines is sketched after this list.
  • a schematic representation of a first image 14 output to the display screen 10 is shown in Figures 3A and 3B.
  • the display screen 10 is disposed in a dashboard of the vehicle and the first image 14 comprises a digital representation of an instrument cluster 15.
  • the icon 8 is removed (or blacked out) in Figure 3A to indicate that the communication controller 2 has been deactivated.
  • the icon 8 is shown in grey in Figure 3B to indicate that the communication controller 2 is in said idle state ST1.
  • the icon 8 is displayed in colour in Figure 3C (illuminated green in the illustrated example) to indicate that the communication controller 2 is active.
  • a dialogue component 7 comprising a warning that the washer fluid level is low is displayed in the first image 14; the criticality of this information is categorised as Low Priority and the colour of the icon 8 remains unchanged (illuminated green in the illustrated example).
  • a dialogue component 7 comprising a warning that the current speed of the vehicle 1 exceeds the legal speed limit is displayed in the first image 14; the criticality of this information is categorised as High Priority and the colour of the icon 8 represents this state (flashing red in the illustrated example).
  • a dialogue component 7 comprising a warning that an engine fault has been detected ("Cooling system failed") is displayed in the first image 14; the criticality of this information is categorised as High Priority and the colour of the icon 8 represents this state (flashing red in the illustrated example).
  • the advance notification is independent of the content and/or the criticality of the dialogue component 7 to be output subsequently.
  • the advance notification may provide an indication of the content and/or criticality of the dialogue component 7 to be output.
  • the magnitude and/or frequency and/or duration of vibrations generated by the vibration generator 12 may be modified in dependence on the content and/or criticality of the information. Similar approaches may be employed to modify the visual and/or audio form of the advance notification in dependence on the content and/or criticality of the information.
  • the user input signal SIN is described herein as being generated by actuation of an input device 13, such as a button, a switch, or a capacitive sensor. It will be understood that other techniques may be used to detect a user input.
  • the user input signal SIN may be generated in dependence on detection of a user gesture or action, for example detected by processing image data generated by a user-facing camera.
  • a user gesture or action may also be detected using other known sensors, such as, but not limited to, an e-field sensor, an ultrasonic sensor, an infrared sensor or a radar sensor.
  • the user input signal SIN may be generated in dependence on a user voice command or instruction.
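The end-of-dialogue conditions listed above (a time-out, an explicit user request, no further dialogue components, or information that is no longer relevant) can be collapsed into a single check, as in the minimal sketch below. The DialogueState attributes and the 30-second default are assumptions made for this illustration, not values from the disclosure.

```python
import time
from dataclasses import dataclass, field
from typing import List

@dataclass
class DialogueState:
    """Hypothetical snapshot of the current vehicle-initiated dialogue."""
    last_activity_time: float = field(default_factory=time.monotonic)
    user_requested_end: bool = False          # set when a user input signal SIN asks to stop
    pending_components: List[str] = field(default_factory=list)
    information_still_relevant: bool = True   # e.g. False once vehicle speed drops below the limit

def dialogue_finished(state: DialogueState, timeout_s: float = 30.0) -> bool:
    """Return True when any of the finish conditions described above is met."""
    if time.monotonic() - state.last_activity_time > timeout_s:
        return True                           # time-out condition
    if state.user_requested_end:
        return True                           # user request via a user input signal SIN
    if not state.pending_components:
        return True                           # no further dialogue components 7 to output
    if not state.information_still_relevant:
        return True                           # the underlying condition has cleared
    return False
```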
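Likewise, the icon behaviour described above, with shape tracking controller status, colour tracking criticality and depth tracking the perceived effect of the information, could be expressed as simple lookup tables. Every concrete value below is an illustrative assumption; the disclosure states only that these three attributes vary.

```python
# Hypothetical lookup tables for the icon 8; the concrete shapes, colours and depth
# levels below are illustrative assumptions, not values taken from the disclosure.
ICON_SHAPE = {             # shape encodes the status of the communication controller 2
    "off": "none", "on": "circle", "idle": "circle_outline", "listening": "circle_pulsing",
    "about_to_start": "ring", "finish": "tick", "error": "triangle",
}
ICON_COLOUR = {            # colour encodes the criticality of the information
    "LOW": "green",        # low-priority warnings leave the icon green (cf. the washer fluid example)
    "NORMAL": "green",
    "HIGH": "flashing_red",  # high-priority warnings flash red (cf. the speed and cooling examples)
}
ICON_DEPTH = {             # depth encodes the perceived effect of the information on the driver
    "minor": 0, "moderate": 1, "major": 2,
}

def icon_appearance(status: str, criticality: str, effect: str) -> dict:
    """Return the attributes used to draw the icon 8 for a given controller status,
    information criticality and perceived effect."""
    return {
        "shape": ICON_SHAPE.get(status, "circle_outline"),
        "colour": ICON_COLOUR.get(criticality, "green"),
        "depth": ICON_DEPTH.get(effect, 0),
    }
```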

Abstract

The present disclosure relates to a communication controller (2) for controlling a vehicle-initiated dialogue (VID) composed of at least one dialogue component (7). The communication controller (2) includes at least one processor (3) configured to control output of said at least one dialogue component (7). A memory (4) is connected to the at least one processor. The at least one processor (3) is configured to identify a dialogue component (7) to be output. The at least one processor (3) outputs an advance notification signal (ANS) to generate an advance notification that the identified dialogue component (7) is to be output. The identified dialogue component (7) is output after said advance notification signal (ANS). The present disclosure also relates to a vehicle (1) incorporating the communication controller (2) and a related method.

Description

COMMUNICATION CONTROL APPARATUS AND METHOD
TECHNICAL FIELD
The present disclosure relates to a communication control apparatus and method. In particular, but not exclusively, the present disclosure relates to a communication controller, to a vehicle and to a method.
BACKGROUND
It is known to incorporate a virtual agent into a traditional consumer device, such as a laptop, a tablet computer, a cellular telephone and a personal computer. It is also known to implement a virtual agent in a standalone device to function as a digital assistant. The device may provide a visual indication of the status of the agent, for example to indicate that the agent is active, listening or off. For example, an icon may be displayed on a display screen. The shape and/or colour of the icon may change to indicate different system states. With increasingly complex interfaces on vehicles, it is expected that agents will be incorporated into vehicle interfaces. However, it is envisaged that this will present unique challenges not encountered in existing consumer device applications. For example, the agent may be responsible for communicating critical information to the driver of the vehicle. It is against this backdrop that the present invention(s) have been conceived.
SUMMARY OF THE INVENTION
Aspects of the present invention relate to a communication controller, to a vehicle and to a method as claimed in the appended claims.
According to a further aspect of the present invention there is provided a communication controller for controlling a vehicle-initiated dialogue (VID) comprising at least one dialogue component, the communication controller comprising:
at least one processor configured to control output of said at least one dialogue component; and
a memory connected to the at least one processor;
wherein the at least one processor is configured to:
identify a dialogue component to be output;
output an advance notification signal to generate an advance notification that the identified dialogue component is to be output; and
output the identified dialogue component after said advance notification signal.

The advance notification may provide an alert for the driver before a new dialogue is initiated. At least in certain embodiments the advance notification may help to reduce driver distraction and aid driver comprehension of the information output. It will be understood, therefore, that information may be communicated more effectively to the driver. At least in certain embodiments, the driver may terminate, pause or postpone the dialogue before the dialogue component is output.
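As a hedged sketch of the sequence set out above (identify a dialogue component, output an advance notification signal, then output the component), the snippet below strings the three steps together. The callable names and the seven-second lead time are assumptions for illustration, not taken from the claims.

```python
import time

def run_vehicle_initiated_dialogue(identify_component, notify, output, lead_time_s=7.0):
    """Minimal sketch of the claimed sequence: identify a dialogue component, output an
    advance notification signal, then output the component itself after a lead time.
    The callables and the 7-second default are assumptions for illustration."""
    component = identify_component()   # step 1: identify a dialogue component to be output
    if component is None:
        return
    notify(component)                  # step 2: advance notification signal (ANS)
    time.sleep(lead_time_s)            # wait before the dialogue itself begins
    output(component)                  # step 3: output the identified dialogue component
```

A caller would supply, for example, a function that changes the status icon for notify and one that renders the dialogue component on screen or as speech for output.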
The advance notification may comprise one or more of the following: an audio output; a visual output; and a haptic output.
The advance notification may be independent of the content and/or criticality of the dialogue component to be output. The format of the advance notification may be standardised. A generic or non-specific advance notification may be output irrespective of the content and/or criticality of the dialogue component. In certain embodiments, the advance notification may comprise or consist of a haptic output.
Alternatively, the advance notification may be varied in dependence on the content and/or criticality of the dialogue component to be output. The magnitude and/or intensity of the advance notification may be modified in dependence on a predefined criticality of the dialogue component.
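As a rough illustration of how the magnitude or intensity of the advance notification might be scaled with a predefined criticality rating, the sketch below maps a rating to haptic parameters. The profile values, the rating labels and the function name are assumptions made for this example, not figures from the disclosure.

```python
def haptic_profile(criticality: str) -> dict:
    """Illustrative mapping from a predefined criticality rating to vibration parameters;
    the numbers are placeholders, not values from the disclosure."""
    profiles = {
        "LOW":    {"magnitude": 0.3, "frequency_hz": 40, "duration_ms": 200},
        "NORMAL": {"magnitude": 0.6, "frequency_hz": 60, "duration_ms": 400},
        "HIGH":   {"magnitude": 1.0, "frequency_hz": 80, "duration_ms": 800},
    }
    return profiles.get(criticality, profiles["NORMAL"])
```

For instance, haptic_profile("HIGH") would return the strongest profile, so a more critical dialogue component is preceded by a more intense advance notification.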
The advance notification signal may optionally comprise a timing or scheduling component to indicate when the identified dialogue component is scheduled to be output. The advance notification may be generated a predetermined time period prior to the scheduled output time of said dialogue component. The predetermined time period may be between five (5) and eleven (11) seconds inclusive, for example the predetermined time period may be five (5), seven (7), nine (9) or eleven (11) seconds. Other time periods are useful.
The at least one processor may be configured to check for a user input signal after output of the advance notification signal. The at least one processor may be configured to control output of said dialogue component in dependence on said user input signal.
The at least one processor may be configured to check for a user input signal comprising a cancellation request. The at least one processor may be configured to cancel output of said dialogue component upon detection of said cancellation request. The at least one processor may be configured to check for a user input signal comprising a postpone request. The at least one processor may be configured to postpone output of said dialogue component upon detection of said postpone request. The time period between outputting the advance notification signal and outputting the identified dialogue component may be increased in dependence on receipt of the postpone request.
The at least one processor may be configured to check for a user input signal comprising an expedite request. The at least one processor may be configured to expedite output of said dialogue component upon detection of said expedite request. The time period between outputting the advance notification signal and outputting the identified dialogue component may be reduced in dependence on receipt of the expedite request. The at least one processor may be configured to output the dialogue component immediately upon detection of said expedite request. The at least one processor may be configured to output said dialogue component after a predetermined time period in the absence of a user input signal to the advance notification.
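One possible realisation of the behaviour described in the preceding paragraphs is a polling loop over the wait window that honours cancel, postpone and expedite requests and otherwise falls through after the predetermined period. This is a sketch under assumed names and timings; poll_input stands in for whatever mechanism surfaces the user input signal.

```python
import time

def await_driver_response(poll_input, lead_time_s=7.0, postpone_extra_s=10.0, poll_s=0.1):
    """Sketch of the wait window between the advance notification and the dialogue
    component. `poll_input` is a hypothetical callable returning "cancel", "postpone",
    "expedite" or None; the timing constants are illustrative assumptions."""
    deadline = time.monotonic() + lead_time_s
    while time.monotonic() < deadline:
        request = poll_input()
        if request == "cancel":
            return "cancelled"                # do not output the dialogue component
        if request == "postpone":
            deadline += postpone_extra_s      # each postpone request extends the window
        elif request == "expedite":
            return "output_now"               # output the dialogue component immediately
        time.sleep(poll_s)
    return "output_now"                       # no input: output after the predetermined period
```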
The dialogue component may be an initial dialogue component. The initial dialogue component is the first (i.e. opening) dialogue component of a new dialogue initiated by the communication controller. Output of the advance notification informs the driver that a new dialogue is about to start.
The at least one processor may be configured to output a conclusion notification signal to generate a conclusion notification. Output of the conclusion notification informs the driver that the current dialogue has concluded. The conclusion notification signal may be output after a concluding dialogue component is output. The concluding dialogue component is the final dialogue component of the current dialogue.
The at least one processor may be configured to output a plurality of dialogue components. For example, a series or sequence of dialogue components may be output. The dialogue components may form a dialogue chain. The conclusion notification may be output only after output of a final dialogue component in a dialogue sequence. At least in certain embodiments, a conclusion notification is not output for each dialogue component which precedes the final dialogue component in the dialogue sequence. The conclusion notification may be output after a predetermined time period has elapsed following output of the final dialogue component in a dialogue sequence. At least in certain embodiments, a single conclusion notification may be output in respect of each dialogue sequence. The conclusion notification may thereby mark or indicate the conclusion of a given dialogue sequence comprising one or more dialogue component. Alternatively, or in addition, an advance notification may be output only before output of the first dialogue component in a dialogue sequence. At least in certain embodiments, a single advance notification may be output in respect of each dialogue sequence.
The dialogue sequence may comprise a linear path. Alternatively, the dialogue sequence may comprise a branching path. The dialogue sequence may, for example, be generated by traversing a dialogue tree comprising one or more decision node. The traversal path through the tree structure may be selected in dependence on a user input or a detected response at each decision node. The detected response may comprise a user gesture or action, for example detected by processing image data generated by a user-facing camera. Alternatively, or in addition, the detected response may comprise a user voice command. It will be understood that if the dialogue consists of a single dialogue component, the output dialogue component may function as both the initial and concluding dialogue components. As such, an advance notification and/or a conclusion notification may be output in dependence on said dialogue component.
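For the branching case, one way to model the dialogue tree is a node holding a dialogue component plus child branches keyed by the detected response, as in the sketch below. DialogueNode, detect_response and the string keys are invented for illustration and are not terms from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional

@dataclass
class DialogueNode:
    """Hypothetical node of a dialogue tree: a dialogue component plus optional branches
    keyed by the detected response (voice command, gesture, button press)."""
    component: str
    branches: Dict[str, "DialogueNode"] = field(default_factory=dict)

def traverse_dialogue_tree(root: DialogueNode,
                           output: Callable[[str], None],
                           detect_response: Callable[[], Optional[str]]) -> None:
    """Output dialogue components along a path selected by the detected response at
    each decision node. Names and structure are assumptions, not taken from the patent."""
    node: Optional[DialogueNode] = root
    while node is not None:
        output(node.component)
        if not node.branches:
            break                              # concluding dialogue component reached
        response = detect_response()           # e.g. "yes"/"no", a gesture label, etc.
        node = node.branches.get(response)     # no or unknown response ends the traversal
```

For example, a low washer fluid node might branch on a "remind me later" response versus a "dismiss" response, ending the traversal at a leaf node.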
According to a still further aspect of the present invention there is provided a communication controller for controlling a vehicle-initiated dialogue comprising at least one dialogue component, the communication controller comprising:
at least one processor configured to control output of said at least one dialogue component; and
a memory connected to the at least one processor;
wherein the at least one processor is configured to:
output a conclusion notification signal to generate a conclusion notification when the vehicle-initiated dialogue is concluded.
The at least one processor may be configured to output a plurality of dialogue components. For example, a series or sequence of dialogue components may be output. The dialogue components may form a dialogue chain. The conclusion notification may be output only after output of a final dialogue component in a dialogue sequence. At least in certain embodiments, a conclusion notification is not output for each dialogue component which precedes the final dialogue component in the dialogue sequence. The conclusion notification may be output after a predetermined time period has elapsed following output of the final dialogue component in a dialogue sequence. At least in certain embodiments, a single conclusion notification may be output in respect of each dialogue sequence. The conclusion notification may thereby mark or indicate the conclusion of a given dialogue sequence comprising one or more dialogue component. Alternatively, or in addition, an advance notification may be output only before output of the first dialogue component in a dialogue sequence. At least in certain embodiments, a single advance notification may be output in respect of each dialogue sequence.
The dialogue sequence may comprise a linear path. Alternatively, the dialogue sequence may comprise a branching path. The dialogue sequence may, for example, be generated by traversing a dialogue tree comprising one or more decision node. The traversal path through the tree structure may be selected in dependence on a user input or a detected response at each decision node.
According to a further aspect of the present invention there is provided a vehicle comprising a communication controller as described herein. The advance notification signal (ANS) may be output to one or more of the following: a display screen; an audio device; and a haptic device.
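By way of illustration only, the fan-out of the advance notification signal (ANS) to a display screen, an audio device and a haptic device might look like the sketch below; the class, method names and payload fields are invented for this example, and the device back-ends are assumed to be supplied by the caller (any of them may be absent).

```python
from dataclasses import dataclass

@dataclass
class AdvanceNotificationSignal:
    """Hypothetical payload: a criticality label and an optional lead time in seconds."""
    criticality: str = "NORMAL"   # e.g. "LOW", "NORMAL", "HIGH"
    lead_time_s: float = 7.0      # seconds until the dialogue component is scheduled

class NotificationGenerator:
    """Sketch of a notification generating means: routes one ANS to several outputs."""

    def __init__(self, display=None, audio=None, haptic=None):
        self.display = display    # assumed back-end objects; None means the output is absent
        self.audio = audio
        self.haptic = haptic

    def announce(self, ans: AdvanceNotificationSignal) -> None:
        # Visual output, e.g. an icon change on a display screen.
        if self.display:
            self.display.show_icon(state="about_to_start", criticality=ans.criticality)
        # Audio output, e.g. a short chime over an audio device.
        if self.audio:
            self.audio.play_chime()
        # Haptic output, e.g. a pulse from a vibration generator.
        if self.haptic:
            self.haptic.pulse(duration_ms=300)
```

Calling NotificationGenerator().announce(AdvanceNotificationSignal()) with no devices attached is a no-op, which keeps the sketch runnable on its own.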
According to an aspect of the present invention there is provided a method of controlling a vehicle-initiated dialogue (VID) comprising at least one dialogue component, the method comprising:
identifying a dialogue component to be output;
generating an advance notification prior to output of said dialogue component; and outputting the identified dialogue component after said advance notification. The advance notification may be output only before output of the first dialogue component in a dialogue sequence. At least in certain embodiments, a single advance notification may be output in respect of each dialogue sequence.
The method may comprise checking for a user input after generating the advance notification and controlling output of said dialogue component in dependence on said user input.
The user input may comprise a cancellation request. The method may comprise cancelling output of said dialogue component upon detection of said cancellation request.
The user input may comprise a postpone request. The method may comprise postponing output of said dialogue component upon detection of said postpone request. The time period between outputting the advance notification signal and outputting the identified dialogue component may be increased in dependence on receipt of the postpone request.
The user input may comprise an expedite request. The method may comprise expediting output of said dialogue component upon detection of said expedite request. The time period between outputting the advance notification signal and outputting the identified dialogue component may be reduced in dependence on receipt of the expedite request.
The method may comprise outputting said dialogue component after a predetermined time period in the absence of a user input signal.
The dialogue component may be an initial dialogue component.
The method may comprise outputting a conclusion notification when the vehicle-initiated dialogue is concluded.
The method may comprise outputting a plurality of dialogue components. For example, a series or sequence of dialogue components may be output. The dialogue components may form a dialogue chain. The conclusion notification may be output only after output of a final dialogue component in a dialogue sequence. At least in certain embodiments, a conclusion notification is not output for each dialogue component which precedes the final dialogue component in the dialogue sequence. The conclusion notification may be output after a predetermined time period has elapsed following output of the final dialogue component in a dialogue sequence. At least in certain embodiments, a single conclusion notification may be output in respect of each dialogue sequence. The conclusion notification may thereby mark or indicate the conclusion of a given dialogue sequence comprising one or more dialogue component. Alternatively, or in addition, an advance notification may be output only before output of the first dialogue component in a dialogue sequence. At least in certain embodiments, a single advance notification may be output in respect of each dialogue sequence.
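The once-per-sequence rule discussed above (a single advance notification before the first dialogue component, none in between, and a single conclusion notification after the final component once a short period has elapsed) can be sketched as follows. The function and parameter names and the two-second settling value are illustrative assumptions.

```python
import time

def output_dialogue_sequence(components, notify_advance, output, notify_conclusion,
                             settle_s=2.0):
    """Sketch of the per-sequence notification rule: one advance notification before the
    first dialogue component and one conclusion notification after the final component."""
    if not components:
        return
    notify_advance()                  # only before the first component in the sequence
    for component in components:
        output(component)             # intermediate components get no extra notifications
    time.sleep(settle_s)              # predetermined period after the final component
    notify_conclusion()               # single conclusion notification for the whole sequence
```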
The dialogue sequence may comprise a linear path. Alternatively, the dialogue sequence may comprise a branching path. The dialogue sequence may, for example, be generated by traversing a dialogue tree comprising one or more decision node. The traversal path through the tree structure may be selected in dependence on a user input or a detected response at each decision node.

According to a further aspect of the present invention there is provided a method of controlling a vehicle-initiated dialogue comprising at least one dialogue component, the method comprising:
outputting at least one dialogue component; and
outputting a conclusion notification when the vehicle-initiated dialogue is concluded.
The method may comprise outputting a plurality of dialogue components. For example, a series or sequence of dialogue components may be output. The dialogue components may form a dialogue chain. The conclusion notification may be output only after output of a final dialogue component in a dialogue sequence. At least in certain embodiments, a conclusion notification is not output for each dialogue component which precedes the final dialogue component in the dialogue sequence. The conclusion notification may be output after a predetermined time period has elapsed following output of the final dialogue component in a dialogue sequence. At least in certain embodiments, a single conclusion notification may be output in respect of each dialogue sequence. The conclusion notification may thereby mark or indicate the conclusion of a given dialogue sequence comprising one or more dialogue component. Alternatively, or in addition, an advance notification may be output only before output of the first dialogue component in a dialogue sequence. At least in certain embodiments, a single advance notification may be output in respect of each dialogue sequence.
The dialogue sequence may comprise a linear path. Alternatively, the dialogue sequence may comprise a branching path. The dialogue sequence may, for example, be generated by traversing a dialogue tree comprising one or more decision node. The traversal path through the tree structure may be selected in dependence on a user input or a detected response at each decision node.
According to a further aspect of the present invention there is provided a set of instructions which, when executed, cause a computational device to implement the method described herein.
According to a still further aspect of the present invention there is provided a non-transitory computer readable media comprising a set of computational instructions which, when executed, cause a computer to implement the method described herein.
Any control unit or controller described herein may suitably comprise a computational device having one or more electronic processors. The system may comprise a single control unit or electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term "controller" or "control unit" will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller or control unit, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memory associated with said controller to be executed on said computational device. The control unit or controller may be implemented in software run on one or more processors. One or more other control unit or controller may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the present invention will now be described, by way of example only, with reference to the accompanying figures, in which:
Figure 1 shows a schematic representation of a vehicle incorporating a communication controller in accordance with an embodiment of the present invention;
Figure 2 shows a flow diagram representing operation of the communication controller according to an embodiment of the present invention;
Figures 3A, 3B and 3C illustrate a display screen comprising an icon indicating the status of the communication controller; and
Figures 4A, 4B and 4C illustrate respective dialogue components output by the communication controller to the display screen.

DETAILED DESCRIPTION
A vehicle V incorporating a communication system 1 in accordance with an embodiment of the present invention will now be described with reference to the accompanying figures. The communication system 1 comprises a communication controller 2. The communication controller 2 in the present embodiment is adapted to control communication with a driver of the vehicle V, but it will be understood that communication with other occupants of the vehicle V is also possible.
The communication controller 2 is configured to provide an agent which implements a vehicle-initiated dialogue (VID) for communicating information to the vehicle driver. The VID is operative to initiate communication with the vehicle driver, for example to start a dialogue with the vehicle driver to convey system information. The VID may, for example, be configured to convey one or more of the following categories of information to the vehicle driver:
• Warning (e.g. major/minor vehicle warnings, external warnings, safety advance notifications);
• Recommendations (e.g. help and guidance, recommendations to use certain features);
• Feedback (e.g. confirmation of actions, external messages, status messages, information); and
• Reminders (e.g. configuration messages, calendar event reminders, scheduled maintenance, MOT and other external reminders).
At least in certain embodiments, the communication controller 2 implements a VID to interact with the driver to help improve vehicle performance, to improve the driving experience and to help reduce distraction.
With reference to Figure 1, the communication controller 2 comprises an electronic processor 3 configured to execute a set of computational instructions stored on a non-transitory computer readable media. In the present embodiment, the computational instructions are stored in a system memory 4 connected to the electronic processor 3. The system memory 4 may, for example, comprise a memory device. The electronic processor 3 is configured to implement the VID function of the communication controller 2. The electronic processor 3 is configured to identify information to be output to the vehicle driver and to schedule the output of that information. The electronic processor 3 communicates with a plurality of vehicle systems 5-n over a communication network 6. The vehicle systems 5-n each publish one or more data signals S-n to the communication network 6. The data signals S-n each comprise information relating to an on-board vehicle system, for example vehicle speed data and/or fault data, or to an external system, such as local traffic.
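By way of illustration only, the following Python sketch shows one way in which published data signals could be compared against predefined operating parameters to identify information for output; the signal names, threshold values and DataSignal structure are assumptions made for the example and do not form part of the described embodiment.

from dataclasses import dataclass
from typing import Optional


@dataclass
class DataSignal:
    # Hypothetical representation of a data signal S-n published by a vehicle system 5-n.
    name: str
    value: float


# Illustrative operating parameters as (lower, upper) thresholds; the values are assumptions.
OPERATING_PARAMETERS = {
    "vehicle_speed_kph": (None, 120.0),
    "washer_fluid_level": (0.15, None),
    "coolant_temp_c": (None, 110.0),
}


def out_of_range(signal: DataSignal) -> Optional[str]:
    """Return a description if the signal is outside its predefined operating parameters."""
    bounds = OPERATING_PARAMETERS.get(signal.name)
    if bounds is None:
        return None
    lower, upper = bounds
    if lower is not None and signal.value < lower:
        return f"{signal.name} below lower threshold ({signal.value} < {lower})"
    if upper is not None and signal.value > upper:
        return f"{signal.name} above upper threshold ({signal.value} > {upper})"
    return None


if __name__ == "__main__":
    for s in (DataSignal("vehicle_speed_kph", 128.0), DataSignal("washer_fluid_level", 0.4)):
        issue = out_of_range(s)
        if issue:
            print("Identify dialogue component for:", issue)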
The system memory 4 comprises a plurality of dialogue components 7 for communicating information to the vehicle driver. The dialogue components 7 may be adapted to convey information derived from one or more of said data signals S-n. The electronic processor 3 is configured to select one or more of said dialogue components 7 in dependence on the data signals S-n received from the vehicle systems 5-n. The selected dialogue component(s) 7 is then output to communicate information to the vehicle driver. The electronic processor 3 may select and output a plurality of said dialogue components 7, for example to be output in a sequence. In certain embodiments, the dialogue components 7 may be arranged in a dialogue tree which is traversed in dependence on one or more user inputs. The dialogue components 7 may each comprise or consist of an audio output and/or a visual output.

A flow chart 100 is shown in Figure 2 to illustrate the operating states of the communication controller 2. The communication controller 2 switches between an idle state ST1; an initiate communication state ST2; a communicating information state ST3; and a finish dialogue state ST4. The communication controller 2 defaults to the idle state ST1 when activated. The operating state of the communication controller 2 may be indicated to the driver. In the present embodiment, an icon 8 is output to provide a visual indication of the operating state of the communication controller 2.

When in said idle state ST1, the electronic processor 3 is configured to read the data signals S-n published to the communication network 6 and to identify information to be communicated to the vehicle driver. The electronic processor 3 may identify information to be output to the vehicle driver based on predefined criteria. The electronic processor 3 may monitor the data signals S-n published to the communication network 6 to determine whether operating conditions of a vehicle system 5-n are outside predefined operating parameters, for example above an upper threshold and/or below a lower threshold. Upon identification of information to be communicated to the vehicle driver, the operating status of the communication controller 2 changes to the initiate communication state ST2. The electronic processor 3 identifies one or more dialogue components 7 for output in dependence on the data signals S-n. The electronic processor 3 then schedules output of said one or more selected dialogue components 7. The scheduling may, for example, depend on the criticality of the information to be conveyed to the vehicle driver. Information having a higher criticality rating will be identified for output sooner than information having a lower criticality rating.

With the communication controller 2 in the initiate communication state ST2, the electronic processor 3 is configured to generate an advance notification signal ANS. The advance notification signal ANS is output to signal to the driver that the communication controller 2 is preparing to initiate a new dialogue. The advance notification signal ANS is output prior to output of the initial dialogue component 7 which is to open the new dialogue with the driver. The advance notification signal ANS is output to a notification generating means 9. The notification generating means 9 generates an advance notification which alerts the vehicle driver that the communication controller 2 has identified a dialogue component 7 to be output.
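The four operating states and the transition into the initiate communication state ST2 may be pictured as a simple state machine. The following minimal Python sketch is illustrative only; the class name, the scheduling rule and the data structures are assumptions rather than the embodiment's implementation.

from enum import Enum, auto


class State(Enum):
    IDLE = auto()                       # ST1
    INITIATE_COMMUNICATION = auto()     # ST2
    COMMUNICATING_INFORMATION = auto()  # ST3
    FINISH_DIALOGUE = auto()            # ST4


class CommunicationController:
    """Illustrative state machine; names and structure are assumptions for the sketch."""

    def __init__(self):
        self.state = State.IDLE  # defaults to the idle state when activated
        self.pending = []        # dialogue components scheduled for output

    def on_information_identified(self, dialogue_components, criticality):
        # Higher-criticality information is scheduled for earlier output.
        self.pending.extend(dialogue_components)
        self.pending.sort(key=lambda c: -criticality.get(c, 0))
        self.state = State.INITIATE_COMMUNICATION
        self.emit_advance_notification()

    def emit_advance_notification(self):
        # The ANS alerts the driver before the initial dialogue component is output.
        print("ANS: new dialogue about to start")


controller = CommunicationController()
controller.on_information_identified(["speed_warning"], {"speed_warning": 2})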
This advance notification helps to reduce driver distraction when the new dialogue commences and also ensures that the driver is ready to receive the information contained in the dialogue, thereby aiding comprehension. In the present embodiment the advance notification is independent of the content of the initial dialogue component 7 and/or the criticality of the initial dialogue component 7. The advance notification may comprise one or more of the following: an audio output, a visual output and a haptic output. The notification generating means 9 may be implemented using an existing vehicle system. By way of example, a visual output of the advance notification may be output to a display screen 10 in the vehicle V, for example disposed in the instrument cluster, a dashboard or a centre console. An audio output of the advance notification may be output by an on-board audio entertainment system 11. A haptic output of the advance notification may be output by a vibration generator 12 disposed in a steering wheel or a driver seat.
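Purely as an illustrative sketch, the advance notification signal ANS could be fanned out to whichever output modalities are available; the device interfaces and the notification content shown here are assumptions for the example.

def dispatch_advance_notification(display=None, audio=None, haptic=None):
    """Send the advance notification over every available modality (assumed interfaces)."""
    if display is not None:
        display("Incoming message")             # e.g. banner on the display screen
    if audio is not None:
        audio("chime.wav")                      # short tone over the entertainment system
    if haptic is not None:
        haptic(duration_ms=300, magnitude=0.4)  # brief steering-wheel or seat vibration


# Usage with stand-in callables:
dispatch_advance_notification(
    display=lambda text: print("display:", text),
    audio=lambda clip: print("audio:", clip),
    haptic=lambda **kw: print("haptic:", kw),
)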
The operating status of the communication controller 2 is held in the initiate communication state ST2 for a predetermined time period following output of the advance notification signal ANS. The predetermined time period can be calibrated, but a time period of between five (5) and twelve (12) seconds inclusive is envisaged. Other time periods may be used. The delay provides the driver with an opportunity to delay output of the initial dialogue component 7, for example if they do not wish to receive new information due to their current workload. During the predetermined time period, the electronic processor 3 checks for a user input signal SIN and output of the initial dialogue component 7 may be controlled in dependence on any such user input signal. The user input signal SIN may be generated when the vehicle driver actuates an input device 13, such as a button, a switch, or a capacitive sensor. The input device 13 may be disposed on a steering wheel, in a centre console, or on a control panel. The user input signal SIN may, for example, comprise a cancellation request to cancel output of the initial dialogue component 7; and/or a postpone request to delay or postpone output of the initial dialogue component 7. Alternatively, or in addition, the user input signal SIN may comprise an expedite request to reduce or remove any remaining delay before output of the initial dialogue component 7.

When the predetermined time period expires, the communication controller 2 changes to the communicating information state ST3. In the communicating information state ST3, the electronic processor 3 is configured to output the initial dialogue component 7 and any subsequent dialogue components 7. The initial dialogue component 7 may comprise a visual output which is output to the display screen 10; and/or an audio output which is output over the audio entertainment system 11. While the communication controller 2 is in said communicating information state ST3, the electronic processor 3 may output additional dialogue components 7, for example to provide additional information or to provide updated information. The electronic processor 3 may select and output additional dialogue components 7 in dependence on a user input signal SIN. The electronic processor 3 may traverse a dialogue tree comprising multiple dialogue components 7 in dependence on one or more user input signals SIN.
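The hold period and the handling of a cancellation, postpone or expedite request during that period might be implemented along the following lines. This is a sketch only: the polling approach, the function names and the default eight-second hold are assumptions, the latter simply being a value within the envisaged range.

import time


def hold_for_user_input(poll_input, hold_seconds=8.0, poll_interval=0.1):
    """Wait up to hold_seconds after the ANS; return 'output', 'cancel' or 'postpone'.

    poll_input() is assumed to return None or one of 'cancel', 'postpone', 'expedite'.
    """
    deadline = time.monotonic() + hold_seconds
    while time.monotonic() < deadline:
        request = poll_input()
        if request == "cancel":
            return "cancel"        # do not output the initial dialogue component
        if request == "postpone":
            return "postpone"      # reschedule output for later
        if request == "expedite":
            return "output"        # remove the remaining delay and output now
        time.sleep(poll_interval)
    return "output"                # time period expired with no user input


# Example: no input device attached, so the component is output once the hold period expires.
print(hold_for_user_input(lambda: None, hold_seconds=0.3))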
The communication controller 2 remains in the communicating information state ST3 as long as it is communicating with the vehicle driver, including outputting dialogue components 7 and, if appropriate, waiting for a driver response. Once the driver has made a final decision, the communication controller 2 confirms this to the driver. The electronic processor 3 may be configured to determine that the current dialogue is finished when one or more of the following conditions is identified: a time-out condition, for example after a predetermined time period has elapsed; a user request provided in a user input signal SIN; no further dialogue components 7 for output, for example upon completion of a traversal of a dialogue tree; and a determination that the information relating to said one or more vehicle systems 5-n is no longer relevant, for example a speed limit warning may be removed when the vehicle speed decreases below the speed limit. When the electronic processor 3 determines that the current dialogue is complete, the communication controller 2 switches to the finish dialogue state ST4. The electronic processor 3 may optionally output a conclusion notification signal CNS to the notification generating means 9. The notification generating means 9 may output a conclusion notification to indicate that the vehicle-initiated dialogue has been concluded. The conclusion notification may be the same as or different from the advance notification. The communication controller 2 then reverts to the idle state ST1.
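The dialogue-completion conditions listed above could be evaluated with a simple predicate such as the following sketch; the argument names and the way the processor would derive them are assumptions for illustration.

import time


def dialogue_finished(started_at, timeout_s, user_requested_end,
                      components_remaining, information_still_relevant):
    """Return True when any of the completion conditions described above is met."""
    if time.monotonic() - started_at > timeout_s:
        return True                       # time-out condition
    if user_requested_end:
        return True                       # user request provided in a user input signal SIN
    if components_remaining == 0:
        return True                       # dialogue tree fully traversed
    if not information_still_relevant:
        return True                       # e.g. vehicle speed has dropped below the limit
    return False


print(dialogue_finished(time.monotonic(), 30.0, False, 0, True))  # True: nothing left to output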
As outlined above, the icon 8 is output to the display screen 10 to provide a visual indication of the operating status and/or condition of the communication controller 2. The icon 8 may, for example, represent one or more of the following: the status of the communication controller 2 (On, Off, Idle, Listening, About to start, Finish and Error); the criticality of information (High Priority, Normal Priority and Low Priority); and the perceived effect of information (indicating how information may affect a driver's behaviour). By way of example, the shape of the icon 8 may be varied to indicate the status of the communication controller 2; the colour of the icon 8 may be varied to indicate the criticality of information to be communicated; and a change in depth of the icon 8 may be used to indicate the perceived effect of the information.
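One possible, purely illustrative encoding of the three independent icon attributes (shape for status, colour for criticality, depth for perceived effect) is sketched below; the particular shapes, colours and depth values are assumptions and are not taken from the embodiment.

# Illustrative mappings only; keys and values below are assumptions for the sketch.
ICON_SHAPE_BY_STATUS = {
    "Off": "hidden", "Idle": "outline", "Listening": "pulsing ring",
    "About to start": "expanding ring", "Finish": "tick", "Error": "cross",
}
ICON_COLOUR_BY_CRITICALITY = {
    "Low Priority": "green", "Normal Priority": "amber", "High Priority": "flashing red",
}
ICON_DEPTH_BY_EFFECT = {
    "informational": 0, "advisory": 1, "behaviour-changing": 2,
}


def render_icon(status, criticality, effect):
    """Compose the three independent visual attributes of the icon."""
    return {
        "shape": ICON_SHAPE_BY_STATUS.get(status, "outline"),
        "colour": ICON_COLOUR_BY_CRITICALITY.get(criticality, "grey"),
        "depth": ICON_DEPTH_BY_EFFECT.get(effect, 0),
    }


print(render_icon("Listening", "High Priority", "behaviour-changing"))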
A schematic representation of a first image 14 output to the display screen 10 is shown in Figures 3A, 3B and 3C. In the illustrated arrangement the display screen 10 is disposed in a dashboard of the vehicle V and the first image 14 comprises a digital representation of an instrument cluster 15. The icon 8 is removed (or blacked out) in Figure 3A to indicate that the communication controller 2 has been deactivated. The icon 8 is shown in grey in Figure 3B to indicate that the communication controller 2 is in said idle state ST1. The icon 8 is displayed in colour in Figure 3C (illuminated green in the illustrated example) to indicate that the communication controller 2 is active.

With reference to Figure 4A, a dialogue component 7 comprising a warning that the washer fluid level is low is displayed in the first image 14; the criticality of this information is categorised as Low Priority and the colour of the icon 8 remains unchanged (illuminated green in the illustrated example). With reference to Figure 4B, a dialogue component 7 comprising a warning that the current speed of the vehicle V exceeds the legal speed limit is displayed in the first image 14; the criticality of this information is categorised as High Priority and the colour of the icon 8 represents this state (flashing red in the illustrated example). With reference to Figure 4C, a dialogue component 7 comprising a warning that an engine fault has been detected ("Cooling system failed") is displayed in the first image 14; the criticality of this information is categorised as High Priority and the colour of the icon 8 represents this state (flashing red in the illustrated example).

It will be understood that various modifications may be made to the embodiment(s) described herein without departing from the scope of the appended claims. In the embodiment described herein the advance notification is independent of the content and/or the criticality of the dialogue component 7 to be output subsequently. In alternative arrangements, the advance notification may provide an indication of the content and/or criticality of the dialogue component 7 to be output. For example, the magnitude and/or frequency and/or duration of vibrations generated by the vibration generator 12 may be modified in dependence on the content and/or criticality of the information. Similar approaches may be employed to modify the visual and/or audio form of the advance notification in dependence on the content and/or criticality of the information.
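As an illustration of the alternative arrangement described above, the haptic parameters of the advance notification could be scaled with criticality roughly as follows; the numeric values and criticality labels used here are assumptions for the sketch.

def haptic_advance_notification(criticality):
    """Return vibration parameters scaled with criticality (values are assumptions)."""
    profiles = {
        "Low Priority":    {"magnitude": 0.2, "frequency_hz": 40, "duration_ms": 200},
        "Normal Priority": {"magnitude": 0.4, "frequency_hz": 60, "duration_ms": 300},
        "High Priority":   {"magnitude": 0.8, "frequency_hz": 90, "duration_ms": 500},
    }
    return profiles.get(criticality, profiles["Normal Priority"])


print(haptic_advance_notification("High Priority"))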
The user input signal SIN is described herein as being generated by actuation of an input device 13, such as a button, a switch, or a capacitive sensor. It will be understood that other techniques may be used to detect a user input. For example, the user input signal SIN may be generated in dependence on detection of a user gesture or action, for example detected by processing image data generated by a user-facing camera. Alternatively, or in addition, a user gesture or action may be detected using other known sensors including, but not limited to, an e-field sensor, an ultrasonic sensor, an infrared sensor and radar. Alternatively, or in addition, the user input signal SIN may be generated in dependence on a user voice command or instruction.
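For illustration, several possible input sources could be resolved into a single user input signal SIN along the following lines; the precedence order and the request vocabulary are assumptions made for the sketch.

def resolve_user_input(button=None, gesture=None, voice=None):
    """Combine several possible input sources into a single user input signal.

    Each argument is assumed to be None or one of 'cancel', 'postpone', 'expedite'.
    The precedence (button over gesture over voice) is an arbitrary choice for the sketch.
    """
    for source in (button, gesture, voice):
        if source is not None:
            return source
    return None


print(resolve_user_input(gesture="postpone"))  # -> 'postpone'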

CLAIMS:
1. A communication controller for controlling a vehicle-initiated dialogue (VID) comprising at least one dialogue component, the communication controller comprising: at least one processor configured to control output of said at least one dialogue component; and
a memory connected to the at least one processor;
wherein the at least one processor is configured to:
identify a dialogue component to be output;
output an advance notification signal (ANS) to generate an advance notification that the identified dialogue component is to be output; and
output the identified dialogue component after said advance notification signal (ANS).
2. A communication controller as claimed in claim 1, wherein the at least one processor is configured to check for a user input signal after output of the advance notification signal (ANS) and to control output of said dialogue component in dependence on said user input signal.
3. A communication controller as claimed in claim 2, wherein the at least one processor is configured to check for a user input signal comprising a postpone request, the at least one processor being configured to postpone output of said dialogue component upon detection of said postpone request.
4. A communication controller as claimed in claim 2 or claim 3, wherein the at least one processor is configured to check for a user input signal comprising an expedite request, the at least one processor being configured to expedite output of said dialogue component upon detection of said expedite request.
5. A communication controller as claimed in any one of claims 2, 3 or 4, wherein the at least one processor is configured to check for a user input signal comprising a cancellation request, the at least one processor being configured to cancel output of said dialogue component upon detection of said cancellation request.
6. A communication controller as claimed in any one of claims 2 to 5, wherein the at least one processor is configured to output said dialogue component after a predetermined time period in the absence of a user input signal.
7. A communication controller as claimed in any one of the preceding claims, wherein the dialogue component is an initial dialogue component.
8. A communication controller as claimed in any one of the preceding claims, wherein the at least one processor is configured to output a conclusion notification signal (CNS) to generate a conclusion notification.
9. A communication controller for controlling a vehicle-initiated dialogue (VID) comprising at least one dialogue component, the communication controller comprising: at least one processor configured to control output of said at least one dialogue component; and
a memory connected to the at least one processor;
wherein the at least one processor is configured to:
output a conclusion notification signal (CNS) to generate a conclusion notification when the vehicle-initiated dialogue (VID) is concluded.
10. A communication system comprising a communication controller as claimed in any one of the preceding claims.
11. A vehicle comprising a communication controller as claimed in any one of the preceding claims.
12. A vehicle as claimed in claim 11, wherein said advance notification signal (ANS) is output to one or more of the following: a display screen; an audio device; and a haptic device.
13. A method of controlling a vehicle-initiated dialogue (VID) comprising at least one dialogue component, the method comprising:
identifying a dialogue component to be output;
generating an advance notification prior to output of said dialogue component; and outputting the identified dialogue component after said advance notification.
14. A method as claimed in claim 13, comprising checking for a user input after generating the advance notification and controlling output of said dialogue component in dependence on said user input.
15. A method as claimed in claim 14, wherein the user input comprises a postpone request, the method comprising postponing output of said dialogue component upon detection of said postpone request.
16. A method as claimed in claim 14 or claim 15, wherein the user input comprises an expedite request, the method comprising expediting output of said dialogue component upon detection of said expedite request.
17. A method as claimed in any one of claims 14, 15 or 16, wherein the user input comprises a cancellation request, the method comprising cancelling output of said dialogue component upon detection of said cancellation request.
18. A method as claimed in any one of claims 14 to 17 comprising outputting said dialogue component after a predetermined time period in the absence of a user input signal.
19. A method as claimed in any one of claims 13 to 18, wherein the dialogue component is an initial dialogue component.
20. A method as claimed in any one of claims 13 to 19 comprising outputting a conclusion notification when the vehicle-initiated dialogue (VID) is concluded.
21. A method of controlling a vehicle-initiated dialogue (VID) comprising at least one dialogue component, the method comprising:
outputting the at least one dialogue component; and
outputting a conclusion notification when the vehicle-initiated dialogue (VID) is concluded.
22. A set of instructions which, when executed, cause a computational device to implement the method claimed in any one of claims 13 to 21.
23. A non-transitory computer readable media comprising a set of computational instructions which, when executed, cause a computer to implement the method claimed in any one of claims 13 to 21.
PCT/EP2018/050998 2017-01-17 2018-01-16 Communication control apparatus and method WO2018134196A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1700817.8 2017-01-17
GB1700817.8A GB2558669B (en) 2017-01-17 2017-01-17 Communication control apparatus and method

Publications (1)

Publication Number Publication Date
WO2018134196A1 (en) 2018-07-26

Family

ID=58463477

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/050998 WO2018134196A1 (en) 2017-01-17 2018-01-16 Communication control apparatus and method

Country Status (2)

Country Link
GB (1) GB2558669B (en)
WO (1) WO2018134196A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10338512A1 (en) * 2003-08-22 2005-03-17 Daimlerchrysler Ag Support procedure for speech dialogues for the operation of motor vehicle functions
EP1560200A1 (en) * 2004-01-29 2005-08-03 Harman Becker Automotive Systems GmbH Method and system for spoken dialogue interface
US20070143115A1 (en) * 2002-02-04 2007-06-21 Microsoft Corporation Systems And Methods For Managing Interactions From Multiple Speech-Enabled Applications
US20140206391A1 (en) * 2013-01-18 2014-07-24 Plantronics, Inc. Context Sensitive and Shared Location Based Reminder

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS56103633A (en) * 1980-01-22 1981-08-18 Nissan Motor Co Ltd Voice warning device for vehicle
JPS5747231A (en) * 1980-09-05 1982-03-18 Nippon Denso Co Ltd Warning method and apparatus used for vehicle
US6198388B1 (en) * 2000-05-09 2001-03-06 Victor Chen Voice warning system for automobiles
JP4936059B2 (en) * 2007-07-09 2012-05-23 株式会社デンソー Mobile object approach recognition device
US10186138B2 (en) * 2014-09-02 2019-01-22 Apple Inc. Providing priming cues to a user of an electronic device

Also Published As

Publication number Publication date
GB2558669A (en) 2018-07-18
GB201700817D0 (en) 2017-03-01
GB2558669B (en) 2020-04-22

Similar Documents

Publication Publication Date Title
CN104859652B (en) For the method and system of automatic Pilot
JP4286876B2 (en) Image display control device
US9493116B2 (en) Alert systems and methods for a vehicle
JP2018165148A (en) Automatic operation interface
US9701245B2 (en) Alert systems and methods for a vehicle
US8301317B2 (en) Driver information device
RU2709210C2 (en) Improved notification of vehicle system
DE102013211025A1 (en) Warning systems and warning procedures for a vehicle
JP4659754B2 (en) Method and system for interaction between vehicle driver and multiple applications
US20130342364A1 (en) Alert systems and methods for a vehicle
CN108146438A (en) For enhancing driver attention's module of driving assistance system
JP2007511414A6 (en) Method and system for interaction between vehicle driver and multiple applications
US10055993B2 (en) Systems and methods for control of mobile platform safety systems
JP6288114B2 (en) Vehicle information providing device
EP2903240A1 (en) Configurable communication systems and methods for communication
WO2018134198A1 (en) Communication control apparatus and method
CN107683236A (en) The method and system that driving model for managing motor vehicles changes
WO2018134196A1 (en) Communication control apparatus and method
WO2017220375A1 (en) Activity monitor
US10418020B2 (en) Vehicle adaptive cruise control noise cancelation
US20210018908A1 (en) Remote driving system
GB2555088A (en) Interface apparatus and method
CN112918251A (en) Display control device, vehicle, display control method, and recording medium having program recorded thereon
US20230221964A1 (en) Display control device for vehicle, vehicle display device, vehicle, method and non-transitory storage medium
US20240124012A1 (en) Method, Device and Storage Medium for Scheduling Notification Based on Driving assistance features

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18701421
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18701421
    Country of ref document: EP
    Kind code of ref document: A1