WO2023276347A1 - 音声案内装置、音声案内方法及びプログラム (Voice Guidance Device, Voice Guidance Method, and Program)

Info

Publication number: WO2023276347A1
Authority: WO (WIPO (PCT))
Application number: PCT/JP2022/014350
Other languages: English (en), French (fr), Japanese (ja)
Prior art keywords: message, image, mobile device, user, function
Inventors: 修 西岡, 拓也 小山内, 虎喜 岩丸, 崚 武智
Original Assignee: 本田技研工業株式会社
Application filed by 本田技研工業株式会社
Priority to JP2023531436A (publication JPWO2023276347A1)
Priority to CN202280043796.1A (publication CN117546137A)
Publication of WO2023276347A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 - Sound input; Sound output
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 13/00 - Speech synthesis; Text to speech systems

Definitions

  • The present invention relates to a voice guidance device, a voice guidance method, and a program.
  • Patent Literature 1 proposes an information providing apparatus that superimposes and outputs primary sound that expresses information to be provided to a user in spoken language and secondary sound that is associated with the information to be provided to the user.
  • One aspect of the present invention provides techniques for enabling users to grasp messages in a short period of time.
  • One aspect provides a voice guidance device comprising message determination means for determining a message to be provided to a user, sound selection means for selecting a sound related to the message from a plurality of sounds, and providing means for providing the sound related to the message to the user prior to providing the message, wherein the reproduction time of the sound related to the message is shorter than the reproduction time of the message.
  • The drawings include a diagram illustrating a display screen of the call function of some embodiments, a diagram illustrating the display screen of the route guidance function of some embodiments, a diagram illustrating a display screen of the music playback function of some embodiments, a diagram illustrating a display screen of the messaging function of some embodiments, a diagram illustrating the display in the vehicle of some embodiments, diagrams illustrating the operation of a mobile device of some embodiments, a diagram illustrating the manner in which the preceding sound is provided in accordance with some embodiments, a diagram illustrating the preceding sound providing operation of some embodiments, and diagrams illustrating types of messages of some embodiments.
  • FIG. 1 is a view of the front portion of a vehicle 100 as seen from the rear.
  • the vehicle 100 has a display device 101 in the center in the vehicle width direction.
  • the display device 101 displays information for the driver of the vehicle 100 (hereinafter simply referred to as the driver).
  • The display device 101 may be a dot-matrix display device such as a liquid crystal display or an organic EL (electro-luminescence) display, or it may be a set of indicators that provide notification by lighting (or blinking) and extinguishing predetermined marks.
  • the vehicle 100 has a left handle switch 102 inside the left handle grip in the vehicle width direction.
  • The appearance of the left handle switch 102 will be described with reference to FIG. 2.
  • The left handle switch 102 includes a plurality of switches. At least some of these switches can be operated with the left thumb while the driver is gripping the left handlebar grip with the left hand.
  • Here, the switches used for a linking function (described later) between the vehicle 100 and the mobile device will be described. The other switches, that is, the switches used for functions of the vehicle 100 other than the linking function, may have the same configuration as conventional switches, and therefore description thereof will be omitted.
  • the left handle switch 102 includes an upper switch 200U, a lower switch 200D, and a left/right switch 200H.
  • the left/right switch 200H is arranged substantially in the center of the left handle switch 102 .
  • the left/right switch 200H is a tilting switch that can be tilted in each of the left and right directions.
  • the driver can input a left direction command by tilting the left/right switch 200H to the left, and can input a right direction command by tilting the left/right switch 200H to the right.
  • the upper switch 200U is arranged above the left/right switch 200H.
  • the upper switch 200U is a push-type switch.
  • the driver can input an upward direction command by pressing the up switch 200U.
  • the lower switch 200D is arranged below the left/right switch 200H.
  • the lower switch 200D is a push-type switch.
  • the driver can input a downward direction command by pressing the downward switch 200D.
  • the upper switch 200U, the lower switch 200D, and the left/right switch 200H are collectively referred to as direction switches 200.
  • the directional switch 200 is not limited to the configuration of FIG. 2, and may have any configuration that allows input of instructions in four directions.
  • A cooperation system according to this embodiment has the above-described vehicle 100, a mobile device 310, and a headset 320.
  • the vehicle 100 has a control section 301 , a display section 304 , a direction input section 305 and a communication section 306 .
  • Control unit 301 controls the entire vehicle 100 .
  • the control unit 301 is configured by a processor 302 and a memory 303, for example.
  • the operation by the control unit 301 is realized by the processor 302 executing the program stored in the memory 303 .
  • A part or all of the operation of the control unit 301 may be realized by a dedicated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the display unit 304 displays information for the driver.
  • the display unit 304 is realized by the display device 101, for example.
  • a direction input unit 305 acquires four direction instructions from the driver.
  • the direction input unit 305 is implemented by, for example, an up switch 200U, a down switch 200D and a left/right switch 200H.
  • Communication unit 306 provides a function for vehicle 100 to communicate with the outside.
  • Communication unit 306 may support short-range wireless communication such as Bluetooth®.
  • the communication unit 306 may support cellular communication, road-to-vehicle communication, and the like.
  • the direction input unit 305 may use the direction switch 200 to accept two levels of input for each direction.
  • the direction input unit 305 may accept a short-press input (for example, an input with a duration of less than 1 second) and a long-press input (for example, an input with a duration of 1 second or more).
  • Alternatively, the direction input unit 305 may accept single-tap input (for example, an input with an interval of 1 second or longer before the next input) and double-tap input (for example, two consecutive inputs with an interval of less than 1 second).
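  • As an illustrative sketch only (not part of the disclosed embodiment), the two-level direction input described above could be distinguished as follows; the 1-second thresholds are the example values given above, and all names are assumptions.

```python
from typing import Optional

# Sketch of distinguishing the two input levels described above.
# The 1.0-second thresholds are the example values from the text; the
# function names and structure are assumptions, not the actual implementation.

LONG_PRESS_SEC = 1.0      # a press of 1 second or more counts as a long press
DOUBLE_TAP_GAP_SEC = 1.0  # a next input within 1 second counts as a double tap

def classify_press(duration_sec: float) -> str:
    """Classify a single press by how long the switch was held."""
    return "long_press" if duration_sec >= LONG_PRESS_SEC else "short_press"

def classify_tap(interval_to_next_sec: Optional[float]) -> str:
    """Classify a tap by the interval to the following input (None = no follow-up)."""
    if interval_to_next_sec is not None and interval_to_next_sec < DOUBLE_TAP_GAP_SEC:
        return "double_tap"
    return "single_tap"

# Example: holding the up switch 200U for 1.2 seconds is reported as a long press.
print(classify_press(1.2))  # -> "long_press"
print(classify_tap(None))   # -> "single_tap"
```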
  • the mobile device 310 has a control unit 311 , a display unit 314 , an input unit 315 and a communication unit 316 .
  • Mobile device 310 may be, for example, a mobile telephony device such as a smart phone.
  • the user of mobile device 310 may be the same as the driver. A case where the driver uses the mobile device 310 will be described below.
  • Mobile device 310 may be held by a driver or housed in vehicle 100 .
  • a control unit 311 controls the entire mobile device 310 .
  • the control unit 311 is configured by a processor 312 and a memory 313, for example. In this case, the operation by the control unit 311 is realized by the processor 312 executing the program stored in the memory 313 .
  • a program may include an operating system and an application. A part or all of the operation of the control unit 311 may be realized by a dedicated circuit such as ASIC or FPGA.
  • the display unit 314 displays information for the driver.
  • the display unit 314 is implemented by a display device such as a liquid crystal display or an organic EL display.
  • the input unit 315 acquires input from the driver.
  • the input unit 315 is implemented by an input device such as a touch panel or buttons, for example.
  • The communication unit 316 provides functions for the mobile device 310 to communicate with the outside. The communication unit 316 may support short-range wireless communication such as Bluetooth. Furthermore, the communication unit 316 may support cellular communication, WiFi (registered trademark) communication, and the like.
  • the headset 320 has a microphone 321 , a speaker 322 and a communication section 323 .
  • Headset 320 is worn on the driver's head.
  • Microphone 321 acquires voice input from the driver.
  • a speaker 322 outputs audio to the driver.
  • Communication unit 323 may support short-range wireless communication such as Bluetooth.
  • vehicle 100 and mobile device 310 work together. Specifically, a communication link is established between the vehicle 100 and the mobile device 310, for example, by short-range wireless communication. Vehicle 100 and mobile device 310 exchange data over this communication link. A communication link is also established between the mobile device 310 and the headset 320, for example by short-range wireless communication. Mobile device 310 and headset 320 exchange data over this communication link.
  • At least some of the functions provided by the mobile device 310 can cooperate with the vehicle 100.
  • a function that can cooperate with the vehicle 100 will be referred to as a cooperation function.
  • the driver inputs a direction to the mobile device 310 through the direction input unit 305 of the vehicle 100 .
  • This directional input is transmitted from vehicle 100 to mobile device 310 .
  • the mobile device 310 performs the operation of the cooperative function according to the directional input from the driver, and generates an image showing the operation status.
  • Mobile device 310 transmits the generated image to vehicle 100 .
  • Vehicle 100 displays the received image on display unit 304 .
  • Mobile device 310 also transmits audio output to the driver to headset 320 .
  • Headset 320 outputs the received audio output from speaker 322 .
  • Audio output to the driver may include audio guidance. Since the mobile device 310 thus has a voice guidance function, the mobile device 310 can also be called a voice guidance device.
  • Headset 320 transmits voice input from the driver captured by microphone 321 to mobile device 310 . The mobile device 310 operates the cooperation function according to the voice input from the driver.
  • FIG. 4 illustrates a display example of the display unit 304 in cooperative operation between the vehicle 100 and the mobile device 310.
  • A screen 400 is an example of a screen displayed on the display device 101.
  • the running speed is displayed in the center of the screen 400, and the remaining amount of fuel is displayed below it.
  • A tachometer is displayed on the left side of the screen 400.
  • An area 401 on the right side of the screen 400 is an area for displaying images supplied from the mobile device 310 while the vehicle 100 is in cooperation with the mobile device 310 .
  • Such an image is hereinafter referred to as a linked image.
  • While no linked image is being provided by the mobile device 310, the vehicle 100 may display other images in the area 401, such as images not related to the linked functions.
  • For example, the vehicle 100 may display an optional image in the area 401, such as riding mode parameters (for example, information set in the vehicle 100 such as the operation status of the ABS (anti-lock braking system)) and level adjustment.
  • Among the images not related to the cooperation function (for example, images related to the running of the vehicle 100), an image with a low degree of importance may be displayed in the area 401.
  • The space of the screen 400 can be used effectively by sharing the area 401 between linked images and low-importance images that are not related to the linking function.
  • An area 402 on the lower side of the screen 400 is an area for displaying images supplied from the mobile device 310 while the vehicle 100 is not in cooperation with the mobile device 310 .
  • Such an image is hereinafter referred to as an interrupt image. While no interrupt images are being provided by mobile device 310, vehicle 100 may display other images in area 402, such as images not associated with collaboration features.
  • The mobile device 310 provides a plurality of cooperation functions.
  • the plurality of cooperative functions may include two or more of a weather information providing function, a calling function, a messaging function, a music playing function, and a route guidance function.
  • the weather information provision function is a function that provides weather information for a specific location. Weather may include at least one of weather and temperature.
  • a call function is a function of making a call with another person.
  • the messaging function is the ability to exchange text messages with others.
  • a music reproduction function is a function for reproducing music.
  • the route guidance function is a function of guiding a route to a specified destination.
  • the mobile device 310 receives two types of input for each of the four directions (up, down, left, and right) from the vehicle 100 .
  • The following example deals with the case where the two types of input are a short press and a long press. In FIGS. 5 to 9, a black triangle indicates a short press, and a black triangle with an additional mark indicates a long press. In FIGS. 5 to 9, all images other than the image 500 are linked images, that is, images that the mobile device 310 transmits to the vehicle 100 and that are displayed on the display device 101 of the vehicle 100 while the linked function is being executed. Therefore, in the following description of FIGS. 5 to 9, the process up to the transmission of the linked image will be described, and the description of the subsequent display of the linked image will be omitted.
  • While cooperation is not being performed, the image 500, which is not related to the cooperation function, is displayed in the area 401 of the screen 400.
  • The mobile device 310 starts cooperation with the vehicle 100 when receiving a left long press input from the driver via the direction input unit 305 of the vehicle 100 while not in cooperation with the vehicle 100. Since direction inputs are similarly obtained from the driver via the direction input unit 305 of the vehicle 100 in the following description, only the input direction (up, down, left, or right) and the input mode (short press or long press) are indicated below.
  • Each of the multiple linked functions has a top image.
  • the top image is an image that the mobile device 310 transmits to the vehicle 100 immediately after one linked function is selected.
  • An image 501 is the top image of the weather information providing function. The image 501 displays an image showing the weather at the current location of the vehicle 100 and the temperature at the current location. When a destination is set for the vehicle 100, the image 501 may display an image showing the weather at the destination and the temperature at the destination.
  • Image 502 is the top image of the call function. Image 502 includes an image depicting call functionality.
  • Image 503 is the top image of the route guidance function.
  • Image 503 includes an image showing the route guidance function.
  • Image 504 is the top image of the music playback function.
  • Image 504 includes an image showing the music playback function.
  • Image 505 is the top image of the messaging function. Image 505 includes an image that illustrates messaging functionality.
  • the mobile device 310 selects any one of the plurality of cooperation functions and transmits the top image of the selected cooperation function.
  • the linking function selected here may be a linking function set in advance as an initial function. Such a linking function is called an initial linking function.
  • the call function is set as the initial cooperation function. Therefore, the mobile device 310 transmits the image 502 immediately after starting cooperation.
  • the driver can also set another linked function as the initial linked function. Mobile device 310 may audibly notify that cooperation has started.
  • The mobile device 310 terminates cooperation with the vehicle 100 when receiving a right long press input while the top image (the image 502 in this example) of the initial cooperation function is displayed. In this case, the mobile device 310 transmits a notification to end cooperation to the vehicle 100, and the vehicle 100 ends the cooperation operation based on the notification and displays the image 500, which is not related to the cooperation function, in the area 401. The mobile device 310 may audibly notify that the cooperation has ended.
  • The linked image is displayed in the area 401 on the right side of the screen 400. Therefore, when the linked image is displayed in the area 401 at the start of cooperation, the driver feels as if the linked image has entered the area 401 from outside the screen 400 beyond its right edge. Since the input direction (left direction) for starting the cooperative operation matches the virtual moving direction of the linked image, the driver can intuitively start the linked function. In order to make this effect more conspicuous, the vehicle 100 may display an animation in which the linked image enters the area 401 from the right side. Further, when the linked image in the area 401 is erased at the end of cooperation, the driver feels as if the linked image has crossed the right edge of the screen 400 to the outside of the screen 400.
  • The vehicle 100 may display an animation in which the linked image exits the area 401 to the right side. Furthermore, the vehicle 100 may associate the input direction with the movement direction of the image also when displaying other linked images. That is, the vehicle 100 may display an image received in response to transmitting an upward, downward, leftward, or rightward direction to the mobile device 310 with an animation of movement in that direction.
  • When the mobile device 310 acquires an up short press input or a down short press input while the top image (any of the images 501 to 505) of one of the plurality of linked functions is displayed, the mobile device 310 switches the currently selected linked function to another linked function and transmits the top image of the linked function after switching. For example, when the mobile device 310 acquires a down short press input while the top image of the selected linked function is displayed, it switches the currently selected linked function in the order of the weather information provision function, the call function, the route guidance function, the music playback function, and the messaging function, and transmits the top image (one of the images 501 to 505) corresponding to the linked function after switching.
  • The mobile device 310 switches the currently selected linked function in the reverse order in response to acquiring an up short press input while the top image of the selected linked function is displayed, and transmits the top image (one of the images 501 to 505) of the linked function after switching.
  • the mobile device 310 may notify the linked function after switching by voice when switching the linked function.
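  • As an illustrative sketch only, the switching of the selected linked function by up/down short presses described above can be modeled as cycling through a fixed list; the order follows the text, and the data structure and names are assumptions.

```python
# Sketch of cycling the selected linked function with up/down short presses.
# The order follows the text (weather, call, route guidance, music, messaging);
# the list and function names are assumptions for illustration only.

FUNCTIONS = ["weather", "call", "route_guidance", "music", "messaging"]

def next_function(current: str, direction: str) -> str:
    """Return the linked function selected after an up or down short press."""
    i = FUNCTIONS.index(current)
    step = -1 if direction == "up_short" else 1  # up short press cycles in reverse order
    return FUNCTIONS[(i + step) % len(FUNCTIONS)]

# Example: a down short press while the call function is selected switches to the
# route guidance function, whose top image (image 503) would then be transmitted.
print(next_function("call", "down_short"))  # -> "route_guidance"
```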
  • The mobile device 310 transitions to transmission of the top image (the image 502 in this example) of the initial linked function when a left long press input is acquired while any linked image shown in FIGS. 5 to 9 is being displayed.
  • the mobile device 310 sets the volume of the sound output to the driver to zero (that is, mutes the sound) when the long-pressing input is acquired while any of the linked images shown in FIGS. 5 to 9 is being displayed.
  • the mobile device 310 cancels the mute state and restores the volume to the volume before the mute state.
  • The mobile device 310 may end the cooperative operation when receiving a right long press input even while any of the linked images shown in FIGS. 5 to 9 is being displayed.
  • Alternatively, the mobile device 310 may accept an instruction to end the cooperative operation only when the top image of the initial cooperation function (the image 502 in this example) is displayed, or may accept an instruction to end the cooperative operation only when the top image of any cooperation function is displayed.
  • A specific operation example of the call function will be described with reference to FIG. 6.
  • When the mobile device 310 acquires a right short press input while the image 502 is displayed, it transitions to transmission of the image 602.
  • An image 602 is an image for requesting the driver to designate a calling party. Along with sending image 602, mobile device 310 may audibly request the driver to designate a calling party.
  • When the mobile device 310 acquires a down short press input, a right short press input, or a left short press input while the image 602 is displayed, it transitions to transmission of the image 603, the image 604, or the image 502, respectively.
  • the image 603 is an image containing the preset name of the other party.
  • Mobile device 310 reads the preset name of the calling party from memory 313 and includes it in image 603 when transitioning to transmission of image 603 .
  • Mobile device 310 may send image 603 and notify the driver of the read name of the other party by voice.
  • When the mobile device 310 acquires an up short press input, a right short press input, or a left short press input while the image 603 is displayed, it transitions to transmission of the image 602, the image 606, or the image 502, respectively.
  • An image 604 is an image indicating that the voice search is in a standby state.
  • Image 604 may include an image that indicates the input direction to end the voice search.
  • mobile device 310 may audibly notify the driver that voice search is awaiting.
  • mobile device 310 waits for voice input via headset 320 .
  • the mobile device 310 searches the contact list in the memory 313 for the called party based on the voice input.
  • The mobile device 310 transitions to transmission of the image 605 when the calling party can be identified, and does not transmit a new image when the calling party cannot be identified (that is, the vehicle 100 continues displaying the image 604). If the mobile device 310 cannot identify the calling party, it may notify the driver by voice.
  • When the mobile device 310 acquires a left short press input while the image 604 is displayed, it transitions to transmission of the image 602.
  • An image 605 is an image containing the name of the caller specified by voice search.
  • mobile device 310 may notify the driver of the identified caller's name by voice.
  • When the mobile device 310 acquires a right short press input or a left short press input while the image 605 is displayed, it transitions to transmission of the image 606 or the image 602, respectively.
  • An image 606 is an image showing the execution status of the call function.
  • The image 606 may include the other party and the call status (calling, or in a call, in which case the call duration).
  • Image 606 may include an image indicating input directions for ending a call.
  • the mobile device 310 starts calling the other party, and updates the image 606 according to the call status. Audio of the call is input and output between mobile device 310 and the driver via headset 320 .
  • When the mobile device 310 acquires a left short press input while the image 606 is displayed, it ends the call and transitions to transmission of the image 502.
  • the mobile device 310 increases or decreases the volume output from the headset 320 when receiving a short up-press input or a short down-press input while the image 606 is displayed.
  • When the mobile device 310 acquires a left long press input while the image 606 is displayed, it transitions to transmission of the image 601 while continuing the call.
  • An image 601 is the top image of the call function during execution of the call function.
  • the image 502 is the top image of the call function when the call function is not being executed. Executing the call function may mean that the other party is being called or is in the middle of a call with the other party.
  • The image 601 may include the other party and the call status (calling, or in a call, in which case the call duration). Since the image 601 is a top image, the mobile device 310 switches from the call function to another cooperation function when receiving an up short press or a down short press while the image 601 is displayed. The mobile device 310 may maintain the call state if the call function is switched to another cooperation function while the call function is running.
  • When the mobile device 310 acquires a right short press input while the image 601 is displayed, it transitions to transmission of the image 606 while continuing the call. The mobile device 310 transitions to transmission of the image 502 when the call ends (the other party ends the call) while the image 601 is displayed. The mobile device 310 does not have to accept a call termination instruction from the driver while the image 601 is displayed.
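  • The image-to-image transitions of the call function described above can be summarized as a simple lookup table; the sketch below encodes only a subset of them as one illustrative reading of FIG. 6, and the identifiers are assumptions.

```python
# Sketch of some call-function transitions described above as a lookup table.
# Keys are (currently displayed image, input); values are the image whose
# transmission the mobile device transitions to next.

CALL_TRANSITIONS = {
    ("image_502", "right_short"): "image_602",  # request designation of the calling party
    ("image_602", "down_short"):  "image_603",  # preset calling party
    ("image_602", "right_short"): "image_604",  # voice search standby
    ("image_602", "left_short"):  "image_502",  # back to the top image
    ("image_604", "left_short"):  "image_602",
    ("image_605", "right_short"): "image_606",  # start the call (execution status)
    ("image_606", "left_short"):  "image_502",  # end the call
    ("image_601", "right_short"): "image_606",  # reopen execution status during a call
}

def transition(current_image: str, user_input: str) -> str:
    """Return the next image to transmit; keep the current one if no rule matches."""
    return CALL_TRANSITIONS.get((current_image, user_input), current_image)

print(transition("image_502", "right_short"))  # -> "image_602"
```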
  • A specific operation example of the route guidance function will be described with reference to FIG. 7.
  • When the mobile device 310 acquires a right short press input while the image 503 is displayed, it transitions to transmission of the image 702.
  • An image 702 is an image for requesting the driver to designate a destination. Along with sending the image 702, the mobile device 310 may verbally request the driver to specify the destination.
  • When the mobile device 310 acquires a down short press input, a right short press input, or a left short press input while the image 702 is displayed, it transitions to transmission of the image 703, the image 704, or the image 503, respectively.
  • An image 703 is an image containing a preset destination.
  • Mobile device 310 upon transitioning to sending image 703 , reads the pre-set destination location from memory 313 and includes it in image 703 .
  • Mobile device 310 may send image 703 and notify the driver of the retrieved destination by voice.
  • When the mobile device 310 acquires an up short press input, a right short press input, or a left short press input while the image 703 is displayed, it transitions to transmission of the image 702, the image 706, or the image 503, respectively.
  • An image 704 is an image indicating that the voice search is in a standby state.
  • Image 704 may include an image indicating the input direction to end the voice search.
  • mobile device 310 may audibly notify the driver that voice search is awaiting.
  • mobile device 310 waits for voice input via headset 320 .
  • the mobile device 310 identifies the destination based on the voice input, for example, from map information.
  • The mobile device 310 transitions to transmission of the image 705 when the destination is identified, and does not transmit a new image when the destination is not identified (that is, the vehicle 100 continues displaying the image 704). If the mobile device 310 fails to identify the destination, it may notify the driver by voice.
  • When the mobile device 310 acquires a left short press input while the image 704 is displayed, it transitions to transmission of the image 702.
  • An image 705 is an image containing the destination specified by voice search. Along with sending image 705, mobile device 310 may audibly notify the driver of the identified destination. When the mobile device 310 acquires a right short press input and a left short press input while the image 705 is displayed, the mobile device 310 transitions to transmission of the image 706 and image 702 respectively.
  • An image 706 is an image showing the execution status of the route guidance function.
  • Image 706 may include the distance and turn direction to where to turn.
  • Image 706 may include an image showing input directions for ending route guidance.
  • the mobile device 310 starts route guidance along with the transmission of the image 706, and updates the image 706 according to the guidance status. Audio for route guidance is output from mobile device 310 to the driver via headset 320 .
  • When the mobile device 310 acquires a left short press input while the image 706 is displayed, it transitions to transmission of the image 707.
  • the mobile device 310 increases or decreases the volume output from the headset 320 when receiving a short up press input or a short down press input while the image 706 is displayed.
  • An image 707 is an image for confirming whether to end the route guidance.
  • Image 707 may include an image requesting confirmation and an image indicating input directions for responding to the confirmation.
  • When the mobile device 310 acquires a right short press input while the image 707 is displayed, it ends the route guidance and transitions to transmission of the image 503.
  • When the mobile device 310 acquires a right long press input while the image 707 is displayed, it transitions to transmission of the image 706 while continuing the route guidance.
  • An image 701 is the top image of the route guidance function during execution of the route guidance function.
  • an image 503 is the top image of the route guidance function when the route guidance function is not being executed.
  • the fact that the route guidance function is being executed may mean that the route is being guided to the driver.
  • Image 701 may include the distance and turn direction to where to turn.
  • When the mobile device 310 receives a left long press input while the route guidance function is selected (that is, while any of the images 503 and 701 to 707 is displayed), it transitions to transmission of the top image of the initial cooperation function.
  • When the route guidance function is selected, the mobile device 310 transmits the image 701 as the top image of the route guidance function if route guidance was being executed at the time of the previous selection.
  • the mobile device 310 switches the route guidance function to another cooperative function when receiving a short up or short down press input while the image 701 is displayed.
  • When the mobile device 310 acquires a right short press input while the image 701 is displayed, it transitions to transmission of the image 706 while continuing the route guidance.
  • When the route guidance is completed (the destination is reached) while the image 701 is displayed, the mobile device 310 transitions to transmission of the image 503.
  • the mobile device 310 does not have to accept an instruction to end route guidance while the image 701 is displayed.
  • A specific operation example of the music playback function will be described with reference to FIG. 8.
  • When the mobile device 310 acquires a right short press input while the image 504 is displayed, it transitions to transmission of the image 802.
  • An image 802 is an image for searching for music to be played.
  • Mobile device 310 selects one song from the playlist in memory 313 to include in image 802 along with sending image 802 .
  • Mobile device 310 may audibly notify the driver of the name of the selected song along with sending image 802 .
  • When the mobile device 310 acquires a right short press input or a left short press input while the image 802 is displayed, it transitions to transmission of the image 803 or the image 504, respectively.
  • When the mobile device 310 acquires an up short press input while the image 802 is displayed, it selects the previous song in the playlist and updates the image 802 accordingly. When the mobile device 310 receives a down short press input while the image 802 is displayed, it selects the next song in the playlist and updates the image 802 accordingly.
  • An image 803 is an image showing the execution status of the music playback function.
  • the image 803 may include the name of the song being played and the playback status (playback time).
  • the image 803 may include an image indicating the input direction to end playback and the input direction to skip the song that is playing.
  • the mobile device 310 transmits the image 803 and updates the image 803 according to the playback status of the song.
  • the song being played is output from mobile device 310 to the driver via headset 320 .
  • When the mobile device 310 acquires a left short press input while the image 803 is displayed, it ends playback and transitions to transmission of the image 802.
  • When the mobile device 310 acquires a right short press input while the image 803 is displayed, it selects the next song in the playlist and updates the image 803 accordingly.
  • the mobile device 310 increases or decreases the volume output from the headset 320 when receiving a short up-press input or a short down-press input while the image 803 is displayed.
  • When the mobile device 310 acquires a left long press input while the image 806 is displayed, it transitions to transmission of the top image (the image 502 in this example) of the initial cooperation function while continuing the playback.
  • An image 801 is the top image of the music playback function during execution of the music playback function.
  • an image 504 is a top image of the music playing function when the music playing function is not being executed. Execution of the music reproduction function may mean that music is being reproduced for the driver.
  • the image 801 may include the name of the song being played and the playback status (playback time).
  • When the mobile device 310 receives a left long press input while the music playback function is selected (that is, while any of the images 504 and 801 to 803 is displayed), it transitions to transmission of the top image of the initial cooperation function.
  • the mobile device 310 transmits the image 801 as the top image of the music playback function if the music playback was in progress at the time of the previous selection.
  • the mobile device 310 switches the music playback function to another cooperation function when receiving a short up or short down press while the image 801 is displayed.
  • When the mobile device 310 acquires a right short press input while the image 801 is displayed, it transitions to transmission of the image 803 while continuing music playback.
  • the mobile device 310 transitions to transmission of the image 504 when music playback ends (playlist is completed) while the image 801 is displayed.
  • the mobile device 310 does not have to accept the music playback end instruction while the image 801 is displayed.
  • A specific operation example of the messaging function will be described with reference to FIG. 9.
  • When the mobile device 310 acquires a right short press input while the image 505 is displayed, it transitions to transmission of the image 901.
  • An image 901 is an image for requesting the driver to designate a destination.
  • the mobile device 310 may send the image 901 and request the driver to specify the destination by voice.
  • When the mobile device 310 acquires a down short press input, a right short press input, or a left short press input while the image 901 is displayed, it transitions to transmission of the image 902, the image 903, or the image 505, respectively.
  • The image 902 is an image containing the preset destination name.
  • When transitioning to transmission of the image 902, the mobile device 310 reads the name preset as the destination from the memory 313 and includes it in the image 902. The mobile device 310 may transmit the image 902 and notify the driver of the read destination name by voice. When the mobile device 310 acquires an up short press input, a right short press input, or a left short press input while the image 902 is displayed, it transitions to transmission of the image 901, the image 905, or the image 505, respectively.
  • An image 903 is an image indicating that the voice search is in a standby state.
  • Image 903 may include an image indicating the input direction to end the voice search.
  • mobile device 310 may audibly notify the driver that voice search is awaiting.
  • mobile device 310 waits for voice input via headset 320 .
  • the mobile device 310 searches the contact list in memory 313 for a destination based on the voice input.
  • The mobile device 310 transitions to transmission of the image 904 when the destination is identified, and does not transmit a new image when the destination is not identified (that is, the vehicle 100 continues displaying the image 903). If the mobile device 310 cannot identify the destination, it may notify the driver by voice.
  • When the mobile device 310 acquires a left short press input while the image 903 is displayed, it transitions to transmission of the image 901.
  • An image 904 is an image containing the name of the destination specified by voice search. Along with sending image 904, mobile device 310 may audibly notify the driver of the name of the identified destination. When the mobile device 310 acquires a right short press input and a left short press input while the image 904 is displayed, the mobile device 310 transitions to transmission of the image 905 and image 901 respectively.
  • An image 905 is an image indicating a standby state for voice input.
  • the image 905 may include an image indicating the input direction for ending voice input and the input direction for starting transmission.
  • mobile device 310 may audibly notify the driver that it is waiting for voice input.
  • mobile device 310 waits for voice input via headset 320 .
  • the mobile device 310 acquires the input voice as a message to be sent.
  • While the image 905 is displayed, the mobile device 310 transitions to transmission of the image 904, the image 907, or the image 906, respectively, according to the acquired direction input.
  • An image 906 is an image containing a pre-configured standard message.
  • Mobile device 310 upon transitioning to sending image 906 , reads a preset message template from memory 313 and includes it in image 906 .
  • Image 906 may further include an image showing the input direction to cancel transmission and the input direction to initiate transmission.
  • Mobile device 310 may send image 906 and notify the driver of the read fixed phrase by voice.
  • When the mobile device 310 acquires an up short press input, a right short press input, or a left short press input while the image 906 is displayed, it transitions to transmission of the image 905, the image 907, or the image 904, respectively.
  • An image 907 is an image showing the execution status of the messaging function.
  • Image 907 may include an image indicating that a message is being sent.
  • the mobile device 310 starts transmitting the message to the destination along with the transmission of the image 907, and transitions to transmission of the image 505 after completing the transmission.
  • the images (that is, linked images) transmitted by each linked function are classified into three layers: menu layer, target selection layer, and active layer.
  • the menu hierarchy includes the top image of the selected link function.
  • the object selection layer includes an image for selecting an execution object of the currently selected cooperation function.
  • the active layer includes an image showing the execution status of the selected linkage function.
  • In each cooperation function that has a target selection layer, the transition from the menu layer to the target selection layer is performed based on a right short press input.
  • the transition from the target selection layer to the active layer is performed based on the right short press input.
  • the driver can intuitively transition between hierarchical levels. Also, switches other than direction switches such as decision switches are unnecessary.
  • transitions from the active hierarchy to the object selection hierarchy and transitions from the object selection hierarchy to the menu hierarchy are performed based on a left short press input in the opposite direction to a right short press input. Therefore, the driver can intuitively return to the hierarchy.
  • the mobile device 310 does not provide visual confirmation to the driver when transitioning from the menu hierarchy to the object selection hierarchy and transitioning from the object selection hierarchy to the active hierarchy. As a result, the amount of operations required to execute the process can be reduced.
  • Upper short-press input and lower short-press input in each layer are used for transition of processing within the same layer.
  • the left/right switch 200H is a tilt switch, and the upper switch 200U and the lower switch 200D are separate press switches. Therefore, the amount of movement of the driver's finger in the horizontal direction is smaller than that in the vertical direction. In the above-described embodiment, the amount of movement of the driver's finger can be reduced by transitioning between layers by inputting in the horizontal direction.
  • the transition to voice search is performed based on the short right press input. In this way, by enabling the transition to the voice search based on the input in the same direction as the transition to the deeper layer (right short press input), a unified operational feeling is provided.
  • the interrupt process is a process in which mobile device 310 requests vehicle 100 to start the cooperative function without depending on an instruction from the driver. For example, the mobile device 310 requests initiation of interrupt processing when receiving a call from another party or receiving a message from another party.
  • When the mobile device 310 receives an incoming call, it transmits an image related to interrupt processing of the call function (that is, an interrupt image) to the vehicle 100, and the vehicle 100 displays the interrupt image in the area 402 of the screen 400.
  • the interrupt image may include the name of the caller, the input direction for instructing to answer the incoming call, and the input direction for instructing not to answer the incoming call.
  • If a linked function is being executed at this time, an image of the selected linked function is displayed in the area 401.
  • If no linked function is being executed at this time, an image not related to the linked functions is displayed in the area 401.
  • When the mobile device 310 acquires a right short press input while the interrupt image is displayed, it answers the incoming call and transitions to transmission of the image 606. At this point, a linked image (specifically, the image 606) is displayed in the area 401. At this point, the vehicle 100 may remove the interrupt image from the area 402 and display an image not related to the linked functions there. A similar process occurs when the mobile device 310 receives a message.
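  • As an illustrative sketch only, the interrupt flow described above for an incoming call could look as follows; all names and data structures are assumptions, and only the flow (an interrupt image in the area 402, answering with a right short press, then the image 606 in the area 401) follows the text.

```python
from typing import Optional

# Sketch of the interrupt flow for an incoming call described above.

def build_interrupt_image(caller_name: str) -> dict:
    """Content of the interrupt image shown in area 402 of the screen 400."""
    return {
        "area": 402,
        "caller": caller_name,
        "answer_hint": "right short press to answer the incoming call",
    }

def handle_input_during_interrupt(user_input: str) -> Optional[dict]:
    """On a right short press, answer the call and transition to image 606 in area 401."""
    if user_input == "right_short":
        return {"area": 401, "image": "image_606"}
    return None  # other inputs (e.g. declining the call) are not modeled in this sketch

print(build_interrupt_image("Taro Honda"))           # interrupt image for area 402
print(handle_input_during_interrupt("right_short"))  # -> linked image 606 for area 401
```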
  • The following operation may begin when the mobile device 310 is powered on. Each step of the following operation is executed by the control unit 311 of the mobile device 310.
  • control unit 311 establishes a communication link with the vehicle 100 and the headset 320.
  • This communication link may be a short-range wireless communication such as Bluetooth, as described above.
  • Control unit 311 may establish a communication link with either vehicle 100 or headset 320 first, or may establish a communication link with both. Also, the communication link with headset 320 may be established at the time of audio output to headset 320 .
  • In step S1102, the control unit 311 determines whether an instruction to start cooperation has been acquired. If this instruction is acquired (YES in S1102), the control unit 311 shifts the process to step S1103; otherwise (NO in S1102), the control unit 311 repeats step S1102. In the above example, the control unit 311 determines that an instruction to start cooperation has been obtained when a left long press input is received from the vehicle 100.
  • In step S1103, the control unit 311 transmits the top image of the initial cooperation function (the image 502 in the above example) to the vehicle 100.
  • At this point, the initial cooperation function (the call function in the above example) is selected.
  • In step S1104, the control unit 311 determines whether an instruction to change the currently selected cooperation function has been acquired. If this instruction is acquired (YES in S1104), the control unit 311 shifts the process to step S1105; otherwise (NO in S1104), the control unit 311 shifts the process to step S1106. In the above example, the control unit 311 determines that an instruction to change the cooperation function has been obtained when an up short press input or a down short press input is received from the vehicle 100.
  • In step S1105, the control unit 311 changes the currently selected cooperation function and transmits the top image of the changed cooperation function to the vehicle 100.
  • the control unit 311 may provide the user with a message that describes the linked function after the change by voice.
  • In step S1106, the control unit 311 determines whether or not an instruction related to the currently selected cooperation function has been acquired. If this instruction is acquired (YES in S1106), the control unit 311 shifts the process to step S1107; otherwise (NO in S1106), the control unit 311 shifts the process to step S1108.
  • In step S1107, the control unit 311 processes the currently selected cooperation function according to the acquired instruction. Since the processing of the cooperation functions has been described in detail with reference to FIGS. 6 to 9, it will not be repeated here.
  • the control unit 311 may provide a voice message to the user in the processing of the cooperation function.
  • In step S1108, the control unit 311 determines whether or not an instruction to end cooperation has been acquired. If this instruction is acquired (YES in S1108), the control unit 311 shifts the process to step S1102; otherwise (NO in S1108), the control unit 311 shifts the process to step S1104. In the above example, the control unit 311 determines that an instruction to end cooperation has been obtained when a right long press input is received from the vehicle 100.
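  • As an illustrative sketch only, the control flow of steps S1102 to S1108 described above can be outlined as follows; the callables passed in are assumptions, and only the branching follows the flowchart.

```python
# Sketch of one cooperation session following steps S1102-S1108 described above.

def cooperation_session(get_input, send_top_image, change_function, process_function):
    # S1102: wait for a cooperation start instruction (left long press).
    while get_input() != "left_long":
        pass
    # S1103: transmit the top image of the initial cooperation function (image 502 above).
    send_top_image("initial")
    while True:
        cmd = get_input()
        if cmd in ("up_short", "down_short"):
            change_function(cmd)    # S1104 -> S1105: switch the selected function
        elif cmd == "right_long":
            return                  # S1108: end cooperation (back to S1102 in practice)
        else:
            process_function(cmd)   # S1106 -> S1107: process with the selected function

# Tiny demo with a scripted input sequence and stub callables.
inputs = iter(["left_long", "down_short", "right_short", "right_long"])
cooperation_session(lambda: next(inputs),
                    send_top_image=lambda f: print("send top image of", f),
                    change_function=lambda d: print("change function:", d),
                    process_function=lambda c: print("process input:", c))
```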
  • the mobile device 310 determines a message to be provided to the user (the driver of the vehicle 100 in the above embodiment) for at least one of the plurality of linked functions, and provides the message to the user.
  • The mobile device 310 provides a message for each of the five cooperation functions. Providing the message is performed, for example, by playing audio from the speaker 322 of the headset 320.
  • mobile device 310 provides the user with sounds associated with the message prior to providing the message.
  • the sound provided prior to the provision of the message in this way and associated with the message is hereinafter referred to as the preceding sound.
  • the playing time of the preceding sound is shorter than the playing time of the message. Therefore, the user can grasp the content of the message provided after that in a short time only by listening to the preceding sound.
  • preceding sounds with different pitches are assigned to each of the five linked functions.
  • The mobile device 310 selects and provides the same preceding sound for messages relating to the same cooperation function, and selects and provides different preceding sounds for messages relating to different cooperation functions.
  • For example, the call function is assigned a "re" sound. Therefore, prior to providing a message related to the call function (for example, a message indicating that the call function has been selected, or a message indicating the result of a voice search for selecting a call partner), the mobile device 310 provides the "re" sound as the preceding sound.
  • the playback time of the preceding sound may be a preset value (eg, 0.5 seconds).
  • the message playback time varies depending on the content of the message, but any message is longer than the preceding sound playback time.
  • multiple preceding sounds assigned to multiple cooperative functions may have multiple pitches of the same tone color (eg, beep sound).
  • the multiple preceding notes may have multiple timbres.
  • a sound with the same pitch and a different timbre may be assigned to each of the five cooperation functions, or a sound with a different pitch and a different timbre may be assigned to each of the five cooperation functions.
  • For example, the call function may be assigned a "do" sound with a piano timbre, the route guidance function may be assigned a "do" sound with a violin timbre, and the other functions may similarly be assigned "do" sounds with different timbres.
  • Alternatively, the call function may be assigned a "re" sound with a piano timbre, the route guidance function may be assigned a "mi" sound with a violin timbre, and the other functions may similarly be assigned sounds with different pitches and timbres.
  • The mobile device 310 determines the message to be provided to the user for at least one function having multiple layers. In such cases, the mobile device 310 may repeatedly provide the preceding sound a number of times that depends on the level of the layer prior to providing the message. For example, the mobile device 310 may provide the preceding sound once for messages regarding processing in the menu layer, repeat the preceding sound twice for messages regarding processing in the target selection layer, and repeat the preceding sound three times for messages regarding processing in the active layer.
  • the time interval between consecutive preceding sounds may be a preset value (eg, 0.3 seconds).
  • the user can grasp which hierarchy the message is to be provided by simply listening to the preceding sound before the message is provided.
  • the method of classifying the layers of cooperation functions is not limited to the above example.
  • the level of hierarchy may be the depth from the top image (the number of operations required to transition from the top image).
  • the mobile device 310 may play the preceding sound only once regardless of the level of hierarchy. This makes it possible to start providing messages earlier.
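  • As an illustrative sketch only, the selection of the preceding sound and its repeat count described above can be expressed as two lookup tables; the pitches for the call, route guidance, and music playback functions follow the text, while the remaining entries and all names are assumptions.

```python
# Sketch of selecting the preceding sound and its repeat count.
# "re", "mi" and "fa" follow the text; the other functions' pitches are not
# specified in this excerpt and are therefore omitted here.

PRECEDING_PITCH = {
    "call": "re",
    "route_guidance": "mi",
    "music": "fa",
}

# Repeat count per layer (menu: once, target selection: twice, active: three times).
REPEAT_BY_LAYER = {"menu": 1, "target_selection": 2, "active": 3}

def select_preceding_sound(function: str, layer: str):
    """Return (pitch, repeat count) for the preceding sound of a message."""
    return PRECEDING_PITCH[function], REPEAT_BY_LAYER[layer]

# Example: a route guidance message in the active layer is preceded by the
# "mi" sound repeated three times (cf. the example of FIG. 14B).
print(select_preceding_sound("route_guidance", "active"))  # -> ("mi", 3)
```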
  • the method is performed when mobile device 310 provides a message to a user.
  • For example, the mobile device 310 may execute the voice guidance function when transmitting the top image (step S1103), when changing the selected function (step S1105), and when performing processing with the selected function (step S1107) in FIG. 11, and also when performing the interrupt process.
  • Each step of the following operation is executed by the controller 311 of the mobile device 310 .
  • In step S1301, the control unit 311 determines a message to be provided to the user. As noted above, providing the message may mean playing the message. The same applies to the provision described below.
  • The message to be provided is determined according to the function being executed. For example, as shown in FIG. 5, when a short press input is acquired while the image 503 is displayed and the call function is thereby selected, the mobile device 310 transmits the image 502, which is the top image of the call function, and may decide to provide a message indicating the selection (for example, "Call function selected.").
  • In step S1302, the control unit 311 selects a preceding sound related to the message determined in step S1301 from a plurality of sounds.
  • The selection may be performed according to criteria (for example, the criteria defined in the table of FIG. 12) that are preset and stored in a storage device (for example, the memory 313) of the mobile device 310.
  • The control unit 311 may select the pitch of the preceding sound from a plurality of pitches, may select the timbre of the preceding sound from a plurality of timbres, or may select both.
  • One of the pitch and timbre may remain unchanged regardless of the message.
  • the control unit 311 may determine the number of times to provide the preceding sound based on the hierarchy of functions to be executed.
  • In step S1303, the control unit 311 provides the preceding sound selected in step S1302 to the user. If the number of times to provide the preceding sound has also been determined in step S1302, the control unit 311 provides the selected preceding sound that number of times. If the number of times has not been determined, the control unit 311 provides the selected preceding sound only once.
  • In step S1304, the control unit 311 starts providing the message determined in step S1301.
  • the interval from the end of providing the preceding tone to the start of providing the message may be a preset value (eg, 0.3 seconds).
  • In step S1305, the control unit 311 determines whether the user has operated the mobile device 310. If the user has operated the mobile device 310 (YES in step S1305), the control unit 311 shifts the process to step S1307; otherwise (NO in step S1305), the control unit 311 shifts the process to step S1306.
  • In step S1306, the control unit 311 determines whether the provision of the message has been completed (that is, whether the message has been provided to the end). The control unit 311 ends the process if the message has been provided (YES in step S1306); otherwise (NO in step S1306), it returns the process to step S1305. In this way, the control unit 311 continues to provide the message as long as there is no operation by the user during the provision of the message.
  • In step S1307, in response to the user's operation of the mobile device 310 while the message is being provided, the control unit 311 determines that providing the rest of the message is unnecessary and terminates the provision of the message. After that, the control unit 311 ends the process and waits for a new function execution for which a message is to be provided.
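  • As an illustrative sketch only, the provision flow of steps S1303 to S1307 described above could look as follows; the callables, the word-by-word playback, and the structure are assumptions, while the 0.5-second tone length and 0.3-second intervals are the example values given in the text.

```python
import time

# Sketch of providing the preceding sound and then the message, with early
# termination when the user operates the device during the provision.

def provide_guidance(message: str, pitch: str, repeats: int,
                     play_tone, play_message_chunk, user_operated) -> None:
    # S1303: provide the selected preceding sound the determined number of times.
    for i in range(repeats):
        play_tone(pitch, duration_sec=0.5)
        if i < repeats - 1:
            time.sleep(0.3)          # interval between consecutive preceding sounds
    time.sleep(0.3)                  # interval before the message starts
    # S1304-S1307: provide the message, stopping early if the user operates the device.
    for chunk in message.split():
        if user_operated():
            return                   # the rest of the message is deemed unnecessary
        play_message_chunk(chunk)

# Tiny demo with stub callables (no real audio output).
provide_guidance("Please turn left at the next intersection.", "mi", 3,
                 play_tone=lambda p, duration_sec: print("tone:", p),
                 play_message_chunk=lambda w: print("say:", w),
                 user_operated=lambda: False)
```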
  • the types of messages provided to the user will be described with reference to FIGS. 14A-14C.
  • the message provided to the user may be a message containing only variable parts, a message containing only fixed parts, or a message containing both variable parts and fixed parts.
  • a variable part is a part that changes depending on the circumstances under which the message is provided.
  • a fixed part is a part that is the same regardless of the context in which the message is provided.
  • FIG. 14A shows an example of a message 1402 containing only fixed parts.
  • For example, when providing a message indicating that the music playback function has been selected, the mobile device 310 provides the same message (e.g., "Music playback function selected.") regardless of the context in which the message is provided.
  • Since the sound "Fa" is assigned to the music playback function, the preceding sound 1401 with the pitch "Fa" is provided prior to the message 1402.
  • Since the selection of the music playback function is a process in the menu hierarchy, the preceding sound 1401 is provided only once prior to the message 1402.
  • FIG. 14B shows an example of a message 1412 containing only variable parts.
  • For example, when indicating a route in the route guidance function, the mobile device 310 provides a message (e.g., "Please turn left at the next intersection.") according to the context in which the message is provided.
  • Since the sound "mi" is assigned to the route guidance function, the preceding sound 1411 whose pitch is "mi" is provided prior to the message 1412.
  • The preceding sound 1411 is provided repeatedly three times prior to the message 1412.
  • FIG. 14C shows an example of a message 1422 containing both a variable portion 1423 and a fixed portion 1424.
  • For example, when showing the result of a voice input in the route guidance function, the mobile device 310 provides a message 1422 (e.g., "You have entered Tokyo Station.") composed of a variable portion 1423 (e.g., "Tokyo Station") that depends on the situation in which the message is provided and a fixed portion 1424 (e.g., "You have entered.") that does not depend on that situation.
  • Since the sound "mi" is assigned to the route guidance function, the preceding sound 1421 with the pitch "mi" is provided prior to the message 1422.
  • The preceding sound 1421 is provided repeatedly twice before the message 1422.
  • The mobile device 310 may provide the variable portion prior to the fixed portion, and the preceding sound prior to the variable portion. This allows the user to quickly grasp the part of the message that changes according to the situation.
  • Various variable parts and fixed parts of messages are possible.
  • For example, the variable portion of the message may be the name of the person calling the mobile device 310 (e.g., "Taro Honda").
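  • A message with a variable portion and a fixed portion could be modeled as in the following sketch; the class and field names, and the English renderings of the example phrases, are assumptions for illustration. The playback order (preceding sound, then variable portion, then fixed portion) follows the description above.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GuidanceMessage:
        variable: Optional[str] = None   # e.g. "Tokyo Station" or "Taro Honda"
        fixed: Optional[str] = None      # e.g. "You have entered."

        def playback_order(self):
            """Variable portion first, then the fixed portion (FIG. 14C)."""
            return [part for part in (self.variable, self.fixed) if part]

    # Example: result of a voice input in the route guidance function.
    msg = GuidanceMessage(variable="Tokyo Station", fixed="You have entered.")
    # The preceding sound (pitch "mi") would be provided before
    # msg.playback_order() is spoken in this order.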
  • The mobile device 310 may have a function of managing the user's skill level regarding the operation of the mobile device 310 (that is, a skill level management function). For example, if the mobile device 310 determines that the user is not accustomed to operating the mobile device 310, it sets the user's skill level to a low level; conversely, if it determines that the user is accustomed to the operation, it sets the user's skill level to a high level. The mobile device 310 may provide the message and/or the preceding sound in a mode according to the skill level.
  • In the example of FIG. 15, the mobile device 310 manages the user's skill level in four levels from "0" to "3". The higher the skill level value, the higher the proficiency (that is, the more accustomed the user is to the operation). The mobile device 310 varies the mode of providing the message or the preceding sound for four items: "message content", "message playback volume", "message playback speed" and "preceding sound playback volume".
  • Message content will be explained first. If the skill level is low (e.g., "0" or "1"), the mobile device 310 provides a detailed message (e.g., "Mr. Taro Honda is calling. Would you like to answer?"). If the skill level is higher (e.g., "2"), the mobile device 310 provides a message with simplified content (e.g., "Incoming call from Mr. Taro Honda."). If the skill level is still higher (e.g., "3"), the mobile device 310 provides a message that omits the fixed form (e.g., "Mr. Taro Honda"). Since a user with a high skill level can understand the specific content from a simple message alone, varying the content of the message in this way improves the user's operational feeling.
  • Message playback volume will be explained. If the skill level is low (e.g., "0"), the mobile device 310 provides the message at a high volume. If the skill level is higher (e.g., "1" to "3"), the mobile device 310 provides the message at a somewhat lower volume. Since a user with a high skill level can grasp the content of a message even at a low volume, varying the playback volume of the message in this way improves the user's operational feeling.
  • Message playback speed will be explained. If the skill level is low (e.g., "0"), the mobile device 310 provides messages at a slow speed. The mobile device 310 increases the message playback speed as the skill level increases. Since a user with a high skill level can grasp the content of a message even at a high speed, varying the playback speed of messages in this way improves the user's operational feeling.
  • Preceding sound playback volume will be explained. If the skill level is low (e.g., "0"), the mobile device 310 provides the preceding sound at a high volume. The mobile device 310 lowers the playback volume of the preceding sound as the skill level increases. Since a user with a high skill level can grasp the preceding sound even at a low volume, varying the volume of the preceding sound in this way improves the user's operational feeling.
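  • The proficiency-dependent provision modes of FIG. 15 can be pictured as a lookup table like the one below. The concrete volume and speed values are assumptions; only the tendency (simpler content, lower volume, higher speed, quieter preceding sound as the skill level rises) follows the description above.

    # Skill levels 0..3 (FIG. 15).  Numeric values are illustrative assumptions.
    PROVISION_MODE = {
        0: {"content": "detailed",   "msg_volume": 1.0, "msg_speed": 0.9, "pre_volume": 1.0},
        1: {"content": "detailed",   "msg_volume": 0.8, "msg_speed": 1.0, "pre_volume": 0.9},
        2: {"content": "simplified", "msg_volume": 0.8, "msg_speed": 1.1, "pre_volume": 0.8},
        3: {"content": "no_fixed",   "msg_volume": 0.8, "msg_speed": 1.2, "pre_volume": 0.7},
    }

    def provision_mode(skill_level: int) -> dict:
        """Clamp the level to the managed range and return the provision mode."""
        return PROVISION_MODE[max(0, min(3, skill_level))]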
  • As shown in FIG. 16A, when a message 1402 to be provided to the user includes only a fixed part, the mobile device 310 may increase the skill level based on the user having operated the mobile device 310 after the end of providing the preceding sound 1401 and before the end of providing the message 1402.
  • Since the message 1402 includes only a fixed part, the user can grasp the content of the following message from the preceding sound 1401 alone. Therefore, when the user operates the mobile device 310 without waiting for the provision of the message 1402 to be completed, the user is considered to have grasped the content of the subsequent message from the preceding sound 1401 alone.
  • Therefore, the mobile device 310 considers that a user who has performed such an operation is accustomed to operating the mobile device 310, and raises the skill level. On the other hand, the mobile device 310 does not need to raise the skill level if the user operates the mobile device 310 before the provision of the preceding sound 1401 ends. In such a case, the user may have operated the mobile device 310 (e.g., to stop the provision of the message) without grasping the content of the subsequent message, so the mobile device 310 does not increase the skill level.
  • As shown in FIG. 16B, when the message 1412 provided to the user contains only variable parts, the mobile device 310 may refrain from increasing the skill level regardless of whether the user has operated the mobile device 310. Since the message 1412 contains only variable parts, the user cannot grasp the content of the following message from the preceding sound 1411 alone. In such a case, the user may have operated the mobile device 310 (e.g., to stop the provision of the message) without grasping the content of the subsequent message, so the mobile device 310 does not increase the skill level.
  • As shown in FIG. 16C, when the message 1422 provided to the user includes both the variable portion 1423 and the fixed portion 1424, the mobile device 310 may increase the skill level based on the user operating the mobile device 310 while the fixed portion 1424 is being provided, rather than while the variable portion 1423 is being provided. When the user operates the mobile device 310 while the fixed portion 1424 is being provided, the user is considered to have grasped the content of the fixed portion 1424 from the preceding sound 1421 alone.
  • On the other hand, the user cannot grasp the content of the variable portion 1423 that follows from the preceding sound 1421 alone. Therefore, even if the user operates the mobile device 310 while the variable portion 1423 is being provided, the mobile device 310 does not have to increase the skill level.
  • the initial value of the skill level of the user may be the lowest level (“0” in the example of FIG. 15).
  • The user's skill level is stored in a storage device (e.g., the memory 313) of the mobile device 310 and is updated by the following operation.
  • the user's proficiency level may be updateable to a user-specified value.
  • In step S1701, the control unit 311 determines whether the message provided to the user contains a variable part. If the message contains a variable part ("YES" in step S1701), the control unit 311 shifts the process to step S1702; otherwise ("NO" in step S1701), it shifts the process to step S1703.
  • In step S1702, the control unit 311 determines whether the user has operated the mobile device 310 during the provision of the fixed portion. If the user has operated the mobile device 310 during the provision of the fixed portion ("YES" in step S1702), the control unit 311 shifts the process to step S1704; otherwise ("NO" in step S1702), it ends the process. If the user performs no operation while the message is being provided, the user may be unfamiliar with the operation and may have listened to the message to the end; in such a case, the mobile device 310 does not increase the skill level. In addition, even if the user performs an operation while the message is being provided, the mobile device 310 does not increase the skill level if the operation is not during the provision of the fixed portion, as described above with reference to FIGS. 16B and 16C.
  • In step S1704, the control unit 311 increments the count value. The count value is a value indicating the number of times the condition for increasing the skill level has been satisfied, and is stored in the storage device (e.g., the memory 313) of the mobile device 310. The initial value of the count value is zero, and it is incremented by one each time step S1704 is executed. When the skill level is increased, the control unit 311 resets the count value to zero.
  • In step S1703, the control unit 311 determines whether the user has operated the mobile device 310 after the end of providing the preceding sound and before the end of providing the message. If so ("YES" in step S1703), the control unit 311 shifts the process to step S1704; otherwise ("NO" in step S1703), it ends the process. If the user performs no operation while the message is being provided, the user may be unfamiliar with the operation and may have listened to the message to the end; in such a case, the mobile device 310 does not increase the skill level. On the other hand, if the user operates before the provision of the message ends, the user is likely to be accustomed to operating the mobile device 310. Therefore, in step S1704, the control unit 311 increments the count value.
  • In step S1705, the control unit 311 determines whether the count value has reached a threshold number of times. If the count value has reached the threshold ("YES" in step S1705), the control unit 311 shifts the process to step S1706; otherwise ("NO" in step S1705), it ends the process.
  • In step S1706, the control unit 311 increases the skill level of the user. For example, the control unit 311 raises the user's skill level by one level. If the user's skill level is already at the highest level, the control unit 311 does not change it.
  • The threshold number of times used in step S1705 may be two or more. The threshold may also be a different value for each skill level; for example, the condition may need to be satisfied 10 times in order to raise the skill level from 0 to 1, and 5 times in order to raise it from 1 to 2.
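  • The update flow of FIG. 17 (steps S1701 to S1706) can be sketched as follows. The state dictionary and argument names are assumptions; the per-level thresholds use the example values mentioned above (10 to go from 0 to 1, 5 to go from 1 to 2), with the remaining value assumed.

    # Number of times the condition must be satisfied before the next level
    # is granted; the value for level 2 -> 3 is an assumption.
    THRESHOLD = {0: 10, 1: 5, 2: 5}
    MAX_LEVEL = 3

    def update_skill_level(state, message_has_variable_part,
                           operated_during_fixed_part,
                           operated_after_sound_before_message_end):
        if message_has_variable_part:                             # S1701 "YES"
            satisfied = operated_during_fixed_part                # S1702
        else:                                                     # S1701 "NO"
            satisfied = operated_after_sound_before_message_end   # S1703
        if not satisfied:
            return state
        state["count"] += 1                                       # S1704
        if (state["level"] < MAX_LEVEL
                and state["count"] >= THRESHOLD[state["level"]]):  # S1705
            state["level"] += 1                                   # S1706
            state["count"] = 0
        return state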
  • The control unit 311 may be able to determine whether the user is driving the vehicle 100. For example, the control unit 311 may query the vehicle 100 as to whether the user is driving the vehicle 100 and make this determination based on the response. The control unit 311 may determine that the user is not driving the vehicle 100 when the speed of the vehicle 100 is equal to or less than a threshold value (e.g., 1 km/h or less, or 0 km/h or less). The control unit 311 may acquire the speed of the vehicle 100 from the vehicle 100, or may regard the moving speed of the mobile device 310 as the speed of the vehicle 100.
  • The control unit 311 may manage the skill level separately for when the user is driving the vehicle 100 and when the user is not driving the vehicle 100 (for example, when the vehicle 100 is stopped). Even if the user becomes accustomed to operating the mobile device 310 while not driving the vehicle 100, the user may not be able to operate the mobile device 310 with the same level of proficiency while driving. Therefore, managing the skill level separately for driving and not driving improves the user's operational feeling.
  • In this case, the mobile device 310 stores, in its own storage device (for example, the memory 313), a skill level for operations while driving and a skill level for operations while not driving.
  • When the method of FIG. 17 is executed for a message provided while driving, the control unit 311 increases the skill level for operations while driving in step S1706. Further, when providing a message to the user while driving, the control unit 311 provides the message in a mode according to the skill level for operations while driving. On the other hand, when the method of FIG. 17 is executed for a message provided while not driving, the control unit 311 increases the skill level for operations while not driving in step S1706.
  • Similarly, when providing a message to the user while the user is not driving, the control unit 311 provides the message in a mode according to the skill level for operations while not driving.
  • The count value used in step S1704 is also managed separately depending on whether the user is driving or not.
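  • Keeping separate skill levels (and count values) for driving and not driving could be as simple as the following sketch; the dictionary layout and key names are assumptions for illustration.

    # One record per context; both start at the lowest level with a zero count.
    skill_store = {
        "driving":     {"level": 0, "count": 0},
        "not_driving": {"level": 0, "count": 0},
    }

    def skill_record(is_driving: bool) -> dict:
        return skill_store["driving" if is_driving else "not_driving"]

    # A message provided while driving uses skill_record(True) both for the
    # update of FIG. 17 and for choosing the provision mode; otherwise
    # skill_record(False) is used.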
  • The skill level management function has been described above in the context of the cooperation function between the vehicle 100 and the mobile device 310. Instead, the skill level management function of the present invention may be applied to a function of cooperation between the vehicle 100 and another device, to a function of the vehicle 100 alone, or to a combination thereof.
  • <Item 1> A voice guidance device (310) comprising: message determination means (311) for determining a message (1402, 1412, 1422) to be provided to a user; sound selection means (311) for selecting a sound related to the message from a plurality of sounds; and providing means (311) for providing the user with the sound (1401, 1411, 1421) related to the message prior to providing the message, wherein the reproduction time of the sound related to the message is shorter than the reproduction time of the message. According to this item, since the message can be grasped from the sound that precedes it, the user can grasp the message in a short period of time.
  • <Item 2> The voice guidance device according to item 1, wherein the sound selection means selects the pitch of the sound related to the message from a plurality of pitches. According to this item, the user can grasp the message by the difference in pitch.
  • <Item 3> The voice guidance device according to item 1 or 2, wherein the sound selection means selects the timbre of the sound related to the message from a plurality of timbres. According to this item, the user can grasp the message from the difference in timbre.
  • <Item 4> The voice guidance device according to any one of items 1 to 3, wherein the message determination means determines a message to be provided to the user for at least one hierarchy level among functions having a plurality of hierarchy levels, and the providing means repeatedly provides the sound related to the message a number of times according to the hierarchy level prior to providing the message. According to this item, the user can grasp the hierarchy level of the function for which the message is provided.
  • <Item 5> The voice guidance device according to any one of items 1 to 4, wherein the message determination means determines a message to be provided to the user for at least one of a plurality of functions, and the sound selection means selects the same sound for messages regarding the same function and selects different sounds for messages regarding different functions. According to this item, the user can grasp, from the difference in sound, which function the message relates to.
  • <Item 6> The voice guidance device according to any one of items 1 to 5, wherein, when the message includes a variable part (1423) that changes according to the situation in which the message is provided and a fixed part (1424) that is the same regardless of the situation in which the message is provided, the providing means provides the variable part prior to the fixed part and provides the sound related to the message prior to the variable part. According to this item, the user can quickly grasp the variable part, which cannot be grasped from the preceding sound.
  • <Item 7> The voice guidance device according to any one of items 1 to 6, further comprising skill level management means (311) for managing the skill level of the user's operation of the voice guidance device, wherein the providing means provides at least one of the message and the sound in a mode corresponding to the skill level. According to this item, the voice guidance is provided in a mode according to the skill level of the user.
  • <Item 8> The voice guidance device according to item 7, wherein the skill level management means increases the skill level based on the user's operation of the voice guidance device after the end of providing the sound and before the end of providing the message. According to this item, the skill level can be increased when it is considered that the user has grasped the message from the preceding sound.
  • <Item 9> The voice guidance device according to item 7 or 8, wherein, when the message includes a variable part that changes according to the situation in which the message is provided and a fixed part that is the same regardless of the situation, the skill level management means increases the skill level based on the user's operation of the voice guidance device during the provision of the fixed part. According to this item, the skill level can be increased when it is considered that the user has grasped the message from the preceding sound.
  • <Item 10> The voice guidance device according to any one of items 7 to 9, wherein the skill level management means increases the skill level based on the number of times a condition for increasing the skill level has been satisfied reaching a threshold number of times. According to this item, it is possible to accurately determine whether the user has become accustomed to the operation.
  • <Item 11> The voice guidance device according to any one of items 7 to 10, further comprising driving determination means (311) for determining whether the user is driving a vehicle, wherein the skill level management means separately manages the skill level for when the user is driving and for when the user is not driving.
  • <Item 13> A voice guidance method comprising: a message determination step (S1301) of determining a message to be provided to a user; a sound selection step (S1302) of selecting a sound related to the message from a plurality of sounds; and a providing step (S1303) of providing the user with a sound related to the message prior to providing the message, wherein the playback time of the sound related to the message is shorter than the playback time of the message. According to this item, since the message can be grasped from the sound that precedes it, the user can grasp the message in a short period of time.
  • <Item 14> A program for causing a computer to function as each means of the voice guidance device according to any one of items 1 to 12. According to this item, a program for realizing the above-described voice guidance device is provided.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)
PCT/JP2022/014350 2021-06-29 2022-03-25 音声案内装置、音声案内方法及びプログラム WO2023276347A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023531436A JPWO2023276347A1 (zh) 2021-06-29 2022-03-25
CN202280043796.1A CN117546137A (zh) 2021-06-29 2022-03-25 语音引导装置、语音引导方法以及程序

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021107786 2021-06-29
JP2021-107786 2021-06-29

Publications (1)

Publication Number Publication Date
WO2023276347A1 true WO2023276347A1 (ja) 2023-01-05

Family

ID=84692631

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/014350 WO2023276347A1 (ja) 2021-06-29 2022-03-25 音声案内装置、音声案内方法及びプログラム

Country Status (3)

Country Link
JP (1) JPWO2023276347A1 (zh)
CN (1) CN117546137A (zh)
WO (1) WO2023276347A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05173587A (ja) * 1991-12-25 1993-07-13 Matsushita Electric Ind Co Ltd 音声合成装置
JP2003084965A (ja) * 2001-09-10 2003-03-20 Ricoh Co Ltd 機器操作装置、プログラムおよび記録媒体
JP2008233678A (ja) * 2007-03-22 2008-10-02 Honda Motor Co Ltd 音声対話装置、音声対話方法、及び音声対話用プログラム
JP2010230699A (ja) * 2009-03-25 2010-10-14 Toshiba Corp 音声合成装置、プログラム、及び方法
JP2020024140A (ja) * 2018-08-07 2020-02-13 株式会社東京精密 三次元測定機の作動方法及び三次元測定機

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05173587A (ja) * 1991-12-25 1993-07-13 Matsushita Electric Ind Co Ltd 音声合成装置
JP2003084965A (ja) * 2001-09-10 2003-03-20 Ricoh Co Ltd 機器操作装置、プログラムおよび記録媒体
JP2008233678A (ja) * 2007-03-22 2008-10-02 Honda Motor Co Ltd 音声対話装置、音声対話方法、及び音声対話用プログラム
JP2010230699A (ja) * 2009-03-25 2010-10-14 Toshiba Corp 音声合成装置、プログラム、及び方法
JP2020024140A (ja) * 2018-08-07 2020-02-13 株式会社東京精密 三次元測定機の作動方法及び三次元測定機

Also Published As

Publication number Publication date
CN117546137A (zh) 2024-02-09
JPWO2023276347A1 (zh) 2023-01-05

Similar Documents

Publication Publication Date Title
JP6136627B2 (ja) 車両用情報表示装置
JP5906347B2 (ja) 車載情報システム、車載装置
JP5859969B2 (ja) 車載情報システム、車載装置、情報端末
US9103691B2 (en) Multimode user interface of a driver assistance system for inputting and presentation of information
KR101304321B1 (ko) 싱글 터치 압력에 기반한 ui 제공방법 및 이를 적용한 전자기기
KR101545875B1 (ko) 멀티미디어 아이템 조작 장치 및 방법
US8457611B2 (en) Audio file edit method and apparatus for mobile terminal
US8225236B2 (en) Displaying active cursor in mobile terminal
US8005500B2 (en) Mobile communications terminal providing memo function and method thereof
JPWO2003078930A1 (ja) 車両用ナビゲーション装置
US20090117945A1 (en) Hands-Free Device Producing a Spoken Prompt with Spatial Effect
KR20050077806A (ko) 음성 대화 실행 방법 및 음성 대화 시스템
JP6119456B2 (ja) 車両用情報表示装置
WO2023276347A1 (ja) 音声案内装置、音声案内方法及びプログラム
JP2003223447A (ja) 移動体用情報提示装置
WO2023276293A1 (ja) 音声案内装置、音声案内方法及びプログラム
JP2009276833A (ja) 表示装置および表示方法
JP7282208B2 (ja) 車両、モバイル装置及びその制御方法
JP2005208798A (ja) 情報提供端末、および情報提供方法
JP4212592B2 (ja) 移動通信端末装置
WO2023276348A1 (ja) 車両
WO2023276381A1 (ja) 車両
JP5187381B2 (ja) 操作情報入力装置
JP6325231B2 (ja) 情報表示システム、情報表示システムの制御方法およびプログラム
JP7323050B2 (ja) 表示制御装置及び表示制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22832509

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202280043796.1

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2023531436

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22832509

Country of ref document: EP

Kind code of ref document: A1