WO2020031241A1 - Voice interaction device, voice interaction system, and method for controlling a voice interaction device - Google Patents
Voice interaction device, voice interaction system, and method for controlling a voice interaction device
- Publication number
- WO2020031241A1 (PCT/JP2018/029470)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- occupant
- controller
- voice
- load
- Prior art date
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/10—Interpretation of driver requests or demands
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/037—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
- B60R16/0373—Voice control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/54—Audio sensitive means, e.g. ultrasound
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/21—Voice
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/22—Psychological state; Stress level or workload
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/05—Type of road, e.g. motorways, local streets, paved or unpaved roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/30—Road curve radius
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/406—Traffic density
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/226—Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics
- G10L2015/227—Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics of the speaker; Human-factor methodology
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/226—Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics
- G10L2015/228—Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics of application context
Definitions
- the present invention relates to a voice interaction device, a voice interaction system, and a method for controlling a voice interaction device.
- A known interactive device (Patent Document 1) is mounted on a vehicle together with an audio output device that outputs audio into the vehicle interior and is capable of interacting with the driver of the vehicle. It comprises a dialogue execution unit that generates a conversation sentence directed to the driver and utters it through the audio output device; a load determination unit that determines whether the driver's driving load is high on the road on which the vehicle is traveling; and an utterance control unit that sets a prohibited state, in which the dialogue execution unit is prohibited from starting an utterance, when the load determination unit determines that the driving load is high, and sets a permitted state, in which utterance by the dialogue execution unit is allowed, when the driving load is determined to be low.
- The problem to be solved by the present invention is to provide a voice interaction device, a voice interaction system, and a method of controlling a voice interaction device that can continue a dialogue with an occupant regardless of the occupant's load state.
- The present invention solves the above problem by determining the load state of an occupant based on at least one of the running state of the vehicle, the state of the external environment of the vehicle, and the state of the occupant of the vehicle, and interacting with the occupant using a dialogue program selected according to that load state.
- FIG. 1 is a block diagram of the voice interaction system according to the present embodiment.
- FIG. 2 shows examples of items used to determine the magnitude of the driving load and the corresponding determination criteria.
- FIG. 3 is a diagram for describing a first interactive mode and a second interactive mode in the first embodiment.
- FIG. 4 is a flowchart illustrating a process executed by the controller according to the present embodiment.
- FIG. 5 is an example of a dialogue conducted when the first dialogue program is used.
- FIG. 6 is an example of a dialogue conducted when the second dialogue program is used.
- FIG. 7 is a diagram for describing a first interactive mode and a second interactive mode in the second embodiment.
- FIG. 8 is a diagram for explaining a first interactive mode and a second interactive mode in the third embodiment.
- FIG. 1 is a diagram illustrating the configuration of the voice interaction system according to the present embodiment. As illustrated in FIG. 1, in the present embodiment, a configuration in which a voice interaction system is mounted on a vehicle 100 will be described as an example.
- The vehicle 100 includes a sensor group 110, a surrounding detection device 120, a navigation device 130, an external information acquisition device 140, a driving device 150, an indoor camera 160, an indoor microphone 170, a speaker 180, and a controller 190. These devices are connected to one another by a CAN (Controller Area Network) or other in-vehicle LAN and exchange information with one another.
- the voice interaction device includes a controller 190.
- Examples of the vehicle 100 of the present embodiment include an electric vehicle having an electric motor as a drive source, an engine vehicle having an internal combustion engine as a drive source, and a hybrid vehicle having both an electric motor and an internal combustion engine as drive sources.
- electric vehicles and hybrid vehicles that use an electric motor as a drive source include those that use a secondary battery as a power source for the electric motor and those that use a fuel cell as a power source for the electric motor.
- the sensor group 110 is configured by a device that detects a traveling state of the vehicle.
- the sensor group 110 of the present embodiment includes a vehicle speed sensor 111, an engine speed sensor 112, an accelerator opening sensor 113, a brake opening sensor 114, a steering angle sensor 115, and a shift lever sensor 116.
- the vehicle speed sensor 111 measures the rotational speed of a drive system such as a drive shaft, and detects the traveling speed of the vehicle (hereinafter, also referred to as the vehicle speed) based on the measured rotational speed. Vehicle speed sensor 111 outputs vehicle speed information to controller 190.
- the vehicle 100 may include an acceleration sensor instead of or in addition to the vehicle speed sensor 111.
- the engine speed sensor 112 detects the engine speed and outputs information on the engine speed to the controller 190.
- the accelerator opening sensor 113 detects the operation amount of the accelerator pedal, and outputs information on the operation amount of the accelerator pedal to the controller 190.
- the brake opening sensor 114 detects the operation amount of the brake pedal and outputs information on the operation amount of the brake pedal to the controller 190.
- the steering angle sensor 115 detects the steering angle of the steering, and outputs information on the steering angle of the steering to the controller 190.
- the shift lever sensor 116 detects the position of the shift lever (shift lever position) and outputs position information of the shift lever to the controller 190.
- the surrounding detection device 120 detects an object existing around the vehicle 100.
- Examples of the surrounding detection device 120 include an in-vehicle camera 121 and a radar 122.
- The in-vehicle camera 121 captures images of the surroundings of the vehicle 100.
- the in-vehicle camera 121 includes, for example, a front camera that images the front of the vehicle 100, a rear camera that images the rear of the vehicle 100, and a side camera that images the side of the vehicle 100.
- the radar 122 detects an obstacle existing around the vehicle 100.
- The radar 122 includes, for example, a front radar that detects obstacles existing ahead of the vehicle 100, a rear radar that detects obstacles existing behind it, and side radars that detect obstacles existing beside it.
- the radar 122 detects the distance from the vehicle 100 to the obstacle and the direction in which the obstacle exists.
- the objects detected by the surrounding detection device 120 include pedestrians, bicycles, motorcycles, automobiles, road obstacles, traffic lights, road markings, pedestrian crossings, and the like.
- Either the in-vehicle camera 121 or the radar 122 described above may be used alone, or two or more types may be used in combination.
- the surrounding detection device 120 outputs the captured information and the detection result to the controller 190 as the surrounding information.
- the navigation device 130 guides the driver by indicating a route from the current position of the vehicle 100 to the destination based on the position information of the vehicle 100 detected by the GPS 131.
- the navigation device 130 has map information, and calculates the travel route of the vehicle 100 from the position information of the vehicle 100 and the position information of the destination.
- the navigation device 130 outputs the position information of the vehicle 100 and the information of the traveling route of the vehicle 100 to the controller 190.
- the travel route of the vehicle 100 includes a route on which the vehicle 100 has actually traveled and a route on which the vehicle 100 is to travel.
- the external information acquisition device 140 is connected to a network existing outside the vehicle 100 and acquires information on the external environment of the vehicle 100.
- An example of the external information acquisition device 140 is a device that acquires various types of information from a network outside the vehicle at a predetermined cycle via a communication line.
- the external information acquisition device 140 acquires road congestion information, road construction information, and accident information from the VICS (registered trademark) system.
- the external information acquisition device 140 acquires weather information from an external server.
- the external information acquisition device 140 outputs information acquired from outside the vehicle to the controller 190.
- the external information acquisition device 140 is not limited to acquiring information from an external server, but can search for necessary information on a network and access a server that manages information according to the search result.
- The external information acquisition device 140 is not limited to a device that acquires information on the external environment via a communication line; it may also be, for example, an outside air temperature sensor that detects the outside air temperature, a humidity sensor that detects humidity, or a raindrop sensor that detects raindrops.
- the outside air temperature sensor outputs information on the outside air temperature to the controller 190 as a detection result.
- the humidity sensor outputs humidity information to the controller 190 as a detection result.
- the raindrop sensor outputs raindrop information to the controller 190 as a detection result.
- Drive device 150 includes a drive mechanism for vehicle 100.
- The drive mechanism includes the electric motor and/or internal combustion engine serving as the drive source of the vehicle 100 described above, a power transmission device including a drive shaft and an automatic transmission that transmit the output of the drive source to the drive wheels, a braking device (not shown) that brakes the wheels, and the like.
- The drive device 150 generates control signals for these drive mechanisms based on input signals resulting from the driver's accelerator and brake operations and on control signals acquired from a vehicle controller (not shown) or a travel control device (not shown), and executes travel control including acceleration and deceleration of the vehicle. By sending command information to the drive device 150, travel control including acceleration and deceleration can be performed automatically.
- the distribution of torque to be output to each of the electric motor and the internal combustion engine according to the running state of the vehicle is also transmitted to the drive device 150.
- the indoor camera 160 is provided at a position where the occupant of the vehicle 100 can be imaged, and images the occupant. In the present embodiment, the indoor camera 160 captures an image of a driver among occupants of the vehicle 100.
- the indoor camera 160 is preferably provided at a position where the driver's facial expression including the driver's line of sight and the driving operation by the driver can be imaged.
- the indoor camera 160 outputs information on the captured image of the driver to the controller 190.
- the indoor microphone 170 acquires the voice information of the occupant of the vehicle 100 and stores the voice information at least temporarily. In the present embodiment, the indoor microphone 170 acquires voice information of a driver among occupants of the vehicle 100.
- The installation position of the indoor microphone 170 is not particularly limited, but it is preferably provided near the occupant's seat.
- the speaker 180 outputs voice information to the occupant of the vehicle 100.
- the speaker 180 outputs voice information to a driver among occupants of the vehicle 100.
- Although the installation position of the speaker 180 is not particularly limited, it is preferably provided near the occupant's seat.
- The controller 190 includes a ROM (Read Only Memory) that stores programs for executing dialogue processing with the occupant using a dialogue program corresponding to the load on the occupant of the vehicle 100, a CPU (Central Processing Unit) that executes the programs stored in the ROM, and a RAM (Random Access Memory) that functions as an accessible storage device. An MPU (Micro Processing Unit), DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit), FPGA (Field Programmable Gate Array), or the like may be used as the operating circuit instead of or together with the CPU.
- The controller 190 executes the programs stored in the ROM with the CPU, thereby realizing a traveling state information acquisition function for acquiring information on the traveling state of the vehicle 100, an external information acquisition function for acquiring information on the external environment of the vehicle 100, a driving operation information acquisition function for acquiring information on the driving operation, a driving load determination function for determining the magnitude of the driving load, and a dialogue function for interacting with the occupant.
- each function of the controller 190 will be described.
- the controller 190 acquires information on the traveling state of the vehicle 100 by the traveling state information acquisition function.
- the information on the traveling state includes the vehicle speed of the vehicle 100, the engine speed, the accelerator operation amount, the brake operation amount, and the steering angle of the steering.
- When the vehicle 100 includes a device operated by an application, such as the navigation device 130, information on the type of the running application is input to the controller 190. In that case, the information on the traveling state includes information on the type of the application in addition to the information described above.
- the controller 190 acquires information on the external environment of the vehicle 100 by the external information acquisition function.
- The information on the external environment includes the distance to obstacles around the vehicle 100, the directions in which those obstacles exist, information on the road on which the vehicle 100 is traveling (road type, traffic congestion, construction, and accident information), and the current position of the vehicle 100.
- the controller 190 acquires information related to the driving operation by the driving operation information acquisition function.
- the information on the driving operation includes information on the shift lever position, information on the driving subject of the vehicle 100, and information on the driving posture of the driver.
- the driving posture information is information based on a driver's image captured by the indoor camera 160.
- the controller 190 grasps the driving posture of the driver by performing image processing on the captured image of the driver.
- the controller 190 determines the magnitude of the driver's driving load by the driving load determination function.
- The controller 190 of the present embodiment determines whether the magnitude of the driving load is larger or smaller than a predetermined reference based on at least one of the information on the traveling state acquired by the traveling state information acquisition function, the information on the external environment acquired by the external information acquisition function, and the information on the driving operation acquired by the driving operation information acquisition function. Note that, although the present embodiment describes, as an example, a configuration in which the magnitude of the driving load is divided into two categories, larger or smaller than a predetermined reference, the method of determining the magnitude of the driving load is not limited to this.
- the magnitude of the driving load may be divided into three or more categories.
- The determination of the magnitude of the driving load referred to here means that the controller 190 determines, based on at least one of the information on the traveling state acquired by the traveling state information acquisition function, the information on the external environment acquired by the external information acquisition function, and the information on the driving operation acquired by the driving operation information acquisition function, which of two or more classifications the state of the driving load falls into. In the present embodiment, it determines which of the two classifications applies: a state where the driving load is large or a state where the driving load is small.
- the driving load is a load that the driver bears when the driver operates the vehicle 100.
- When the driving load is large, the driver is required to concentrate closely on driving. For this reason, the driver consciously or unconsciously lowers the priority of operations other than driving.
- Examples of operations other than driving include adjusting the temperature or air volume of the air conditioner, adjusting the volume of music or the radio, and the like.
- the controller 190 determines the magnitude of the driving load according to the traveling state of the vehicle 100, as shown in FIG.
- When the running state of the vehicle 100 corresponds to “running”, the controller 190 determines that the driving load is large; when it corresponds to “stopped”, the controller 190 determines that the driving load is small.
- the traveling state of the vehicle may be determined from the vehicle speed, the engine speed, the operation amount of the accelerator pedal, and the operation amount of the brake pedal.
- the traveling state item is not limited to this.
- For example, the controller 190 may determine that the driving load is small when the vehicle speed is equal to or lower than a predetermined speed, even while the vehicle 100 is running. Conversely, even if the vehicle 100 is stopped, when the shift lever is at “D (drive)” and the brake pedal operation amount is equal to or greater than a predetermined value, the controller 190 may determine that the vehicle 100 is in a stopped state immediately before starting and that the driving load is large.
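The two-category load determination described above can be sketched in code. The following Python fragment is an illustrative sketch only: the class and function names are assumptions, and the threshold constants stand in for the "predetermined speed" and "predetermined value" that the text leaves unspecified.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    vehicle_speed_kmh: float  # from vehicle speed sensor 111
    shift_position: str       # from shift lever sensor 116: "P", "D", "R", ...
    brake_amount: float       # brake pedal operation amount, 0.0 to 1.0

# Assumed values; the text only says "predetermined speed" / "predetermined value".
LOW_SPEED_THRESHOLD_KMH = 10.0
BRAKE_THRESHOLD = 0.5

def judge_load(state: VehicleState) -> str:
    """Classify the driving load as 'large' or 'small' (two-category scheme)."""
    if state.vehicle_speed_kmh > 0.0:
        # Running, but at or below the predetermined speed: small load.
        if state.vehicle_speed_kmh <= LOW_SPEED_THRESHOLD_KMH:
            return "small"
        return "large"
    # Stopped, but in "D" with the brake held: about to start, so large load.
    if state.shift_position == "D" and state.brake_amount >= BRAKE_THRESHOLD:
        return "large"
    return "small"
```

Under this sketch, a vehicle cruising at 60 km/h is judged to impose a large load, while one creeping along at 5 km/h is judged to impose a small one.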
- When the controller 190 detects from the image captured by the indoor camera 160 that the driver is not operating the steering wheel even though the vehicle is running, the controller 190 may determine that the driving subject corresponds to the “vehicle controller” and that the driving load is small.
- the controller 190 determines the magnitude of the driving load according to the driving operation.
- The controller 190 determines that the driving load is large when the driving operation corresponds to any of “garage parking”, “parallel parking”, or “back parking”. For example, when it detects that the shift lever has been set to “R (reverse)”, the controller 190 tentatively determines that the driving operation may correspond to “parking”.
- When a garage is detected from the surrounding information of the vehicle 100, the controller 190 determines that the driving operation corresponds to “garage parking”. When another vehicle or a parallel parking space is detected from the surrounding information, it determines that the driving operation corresponds to “parallel parking”. When it detects from the steering angle and the brake operation amount that the vehicle 100 is being parked in reverse, it determines that the driving operation corresponds to “back parking”. Note that this determination method based on the driving operation is merely an example and is not particularly limited. For example, the traveling trajectory of the vehicle 100 may be estimated from the steering angle, the vehicle speed, the brake operation amount, and the accelerator operation amount, and it may be determined whether the driving operation corresponds to “garage parking”, “parallel parking”, or “back parking”.
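The parking-operation classification above can be illustrated as follows. This is a hypothetical sketch: the function name and boolean detection inputs are assumptions; in the embodiment these conditions would come from the shift lever sensor 116 and the surrounding detection device 120.

```python
from typing import Optional

def classify_parking_operation(shift_position: str,
                               garage_detected: bool,
                               parallel_slot_detected: bool) -> Optional[str]:
    """Return the tentative parking category, or None when not parking."""
    if shift_position != "R":
        return None  # shifting to "R (reverse)" is the tentative "parking" trigger
    if garage_detected:
        return "garage parking"    # a garage appears in the surrounding information
    if parallel_slot_detected:
        return "parallel parking"  # another vehicle / parallel space detected
    # Otherwise inferred from the steering angle and brake amount in the text.
    return "back parking"
```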
- The magnitude of the driving load may be determined based on at least one of the items described above.
- Each dialogue program consists of a program that recognizes the voice of the occupant and a program that outputs voice to the occupant.
- the controller 190 outputs a voice through the speaker 180 by executing a program for outputting a voice, and speaks to the occupant.
- the controller 190 recognizes the voice of the occupant input via the indoor microphone 170 by executing a voice recognition program.
- the controller 190 interacts with the occupant by repeatedly outputting the voice and recognizing the occupant's voice.
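The repeated output-and-recognition cycle can be expressed as a minimal loop. Here `speak` and `recognize` are placeholders for the voice-output and voice-recognition programs; they are not an API defined by the text.

```python
def dialogue_loop(speak, recognize, questions):
    """Alternate voice output (speaker 180) and recognition (indoor microphone 170)."""
    replies = []
    for question in questions:
        speak(question)              # the controller speaks to the occupant
        replies.append(recognize())  # the controller recognizes the occupant's reply
    return replies
```

A caller would supply real text-to-speech and recognition callbacks; the loop itself only captures the alternation described above.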
- The first dialogue program is designed so that the number of words included in the occupant's responses is smaller than with the second dialogue program.
- Here, a response means an answer to a question or a call (hereinafter also referred to simply as a question or the like) from the controller 190.
- the response of the occupant includes not only a response to a single question or the like, but also all responses made from the start of the dialog to the end of the dialog.
- Methods of reducing the number of words uttered by the occupant include, for example, reducing the number of times the occupant replies, reducing the number of words per response, and shortening the words the occupant utters.
- For example, the controller 190 assigns a number to each answer candidate and asks the occupant to answer with a number, as a multiple-choice question. Taking the setting of a destination described above as an example, the controller 190 asks, “Which destination: 1. XX, 2. YY, or 3. ZZ?”, a question that prompts the occupant to answer with a number from 1 to 3. When the controller 190 recognizes, for example, the voice “1” as the occupant's response, it accesses the navigation device 130 and sets the corresponding place as the destination of the vehicle 100. Although the case of three answer candidates is described here as an example, the number of answer candidates is not particularly limited.
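The numbered-candidate questioning of the first dialogue program can be sketched as below; the helper names and candidate strings are illustrative assumptions, not taken from the text.

```python
from typing import List, Optional

def make_numbered_question(prompt: str, candidates: List[str]) -> str:
    """Build a question that lets the occupant answer with a single number."""
    numbered = ", ".join(f"{i}. {c}" for i, c in enumerate(candidates, start=1))
    return f"{prompt} {numbered}?"

def resolve_answer(candidates: List[str], recognized: str) -> Optional[str]:
    """Map a recognized digit ("1", "2", ...) back to its candidate."""
    try:
        index = int(recognized) - 1
    except ValueError:
        return None  # the recognition result was not a number
    return candidates[index] if 0 <= index < len(candidates) else None
```

The single-digit reply keeps the occupant's utterance as short as possible, which is exactly the word-count reduction the first dialogue program aims at.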
- In step S101, the controller 190 acquires information on the running state of the vehicle 100, the state of the external environment of the vehicle 100, and the state of the driving operation of the vehicle 100.
- the controller 190 acquires shift lever position information from the shift lever sensor 116 as traveling state information.
- the controller 190 acquires the surrounding information of the vehicle 100 from the surrounding detection device 120 as information of the external environment.
- the controller 190 performs image processing on a captured image of the driver of the vehicle 100 captured by the indoor camera 160.
- the controller 190 grasps the driving posture of the driver by image processing, and acquires driving operation information.
- In step S102, the controller 190 determines the magnitude of the driving load of the occupant of the vehicle 100 based on the information acquired in step S101. For example, the controller 190 determines that the driving load is large when the vehicle 100 is traveling and small when it is stopped. It also determines, for example, that the driving load is large when the vehicle 100 is traveling on a highway and small when the vehicle 100 is traveling on a congested road. Further, for example, when the driver is performing a garage parking operation, the controller 190 determines that the driving load is large. When the driving load is determined to be large, the process proceeds to step S103; when it is determined to be small, the process proceeds to step S105. Note that the determination methods described above are examples and are not particularly limited.
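Steps S101 to S105 can be condensed into a single illustrative control pass. The dictionary keys and the "first"/"second" labels are assumptions standing in for the acquired information and the two dialogue programs; this is a sketch of the flow, not the claimed implementation.

```python
def control_cycle(info: dict) -> str:
    """One pass of FIG. 4: judge the load (S102), then pick a program (S103/S105)."""
    # S102: simplified two-category judgement using the examples from the text.
    if info.get("on_highway") or info.get("parking_operation"):
        load = "large"
    elif info.get("congested") or info.get("stopped"):
        load = "small"
    else:
        load = "large" if info.get("running") else "small"
    # S103: large load -> first dialogue program; S105: small load -> second.
    return "first" if load == "large" else "second"
```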
- FIG. 5 is an example of a dialogue performed when the first dialogue program is used.
- FIG. 5A shows an example of a dialogue in a situation where the occupant was operating the navigation device 130 to set a destination while the vehicle 100 was stopped at a traffic light, but the light has changed and the vehicle 100 has started moving.
- FIG. 5B is an example of a dialogue in a situation where the room temperature of the vehicle 100 rises while the occupant is driving on the highway.
- executing the first dialogue program allows the occupant to respond to a question from the controller 190 using simple, short words such as "Yes" or "No".
- step S102 If it is determined in step S102 that the driving load is small, the process proceeds to step S105.
- step S105 the controller 190 executes a dialog using the second dialog program.
- the voice interaction device includes the sensor group 110, the surrounding detection device 120, the navigation device 130, the external information acquisition device 140, the driving device 150, and the indoor camera 160 mounted on the vehicle 100, and a controller 190 for interacting with the occupant of the vehicle 100.
- the sensor group 110 detects a traveling state of the vehicle 100.
- the surrounding detection device 120, the navigation device 130, and the external information acquisition device 140 detect the state of the external environment of the vehicle 100.
- the drive device 150 and the indoor camera 160 detect the state of the driving operation of the occupant of the vehicle 100.
- the controller 190 uses the first dialogue program to ask a question that causes the occupant to make an alternative answer.
- the controller 190 determines that the magnitude of the driving load is larger than a predetermined reference.
- driving on a highway imposes a heavy load on a driver, so that the magnitude of the driving load can be appropriately determined.
- the magnitude of the driving load can be appropriately determined.
- the controller 190 determines that the magnitude of the driving load is smaller than a predetermined reference.
- the load on the driver is relatively small, so that the magnitude of the driving load can be appropriately determined.
- the controller 190 determines that the magnitude of the driving load is smaller than a predetermined reference. Generally, during traffic congestion, in which the vehicle travels at a slow speed or repeatedly stops and starts, the load on the driver is relatively small, so the magnitude of the driving load can be appropriately determined.
- the first interactive program according to the present embodiment is different from the first interactive program according to the above-described first embodiment with respect to voice output and voice recognition.
- a technique called so-called discrete word recognition is used for speech recognition in the first dialogue program.
- Discrete word recognition does not recognize the occupant's voice information as a sentence, but instead recognizes predetermined words (hereinafter also referred to as keywords) included in the occupant's voice information.
- the keyword is a word registered in advance in a storage device such as a ROM or a database.
- the type and the number of characters of the keyword are not particularly limited.
- Although the number of keywords is not particularly limited, it is preferably in the range of 10 to 20 in order to increase the processing speed.
- Keywords include "Yes", "No", "Front", "Back", "Left", "Right", "Top", "Bottom", "Strong", "Weak", "High", "Low", and the like.
- possible answers to the question "Do you want to lower the temperature of the air conditioner?" include "Yes", "No", "Lower", and "No, raise". Words corresponding to such expected answers are preferably registered in advance as keywords.
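The keyword spotting behind discrete word recognition can be sketched as follows. The keyword list mirrors the examples in the text; the matching logic (simple token lookup against a pre-registered list) is an assumption for illustration only:

```python
# Minimal sketch of "discrete word recognition": rather than parsing the
# utterance as a sentence, only pre-registered keywords are matched.

KEYWORDS = {"yes", "no", "front", "back", "left", "right",
            "top", "bottom", "strong", "weak", "high", "low",
            "lower", "raise"}

def recognize_keyword(utterance):
    """Return the first registered keyword found in the utterance, else None."""
    for word in utterance.lower().split():
        token = word.strip(",.!?")      # tolerate trailing punctuation
        if token in KEYWORDS:
            return token
    return None
```

Because only a small closed vocabulary is searched, this style of matching stays robust to background noise and keeps per-utterance processing time short, as described above.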
- the second interactive program according to the present embodiment is the same as the second interactive program according to the first embodiment, as shown in FIG. For this reason, the repeated description is omitted, and the description given in the first embodiment is referred to.
- the technology of discrete word recognition is used for speech recognition.
- For example, even when large road noise occurs while traveling on an unpaved road or a road with a wet surface, the words spoken by the occupant can be recognized accurately, and the processing time from recognizing the occupant's voice to outputting a voice response can be reduced.
- the voice interaction device of the present embodiment has the same configuration as that of the above-described embodiment, except that the first interaction program and the second interaction program used by the controller 190 are different. The description in the above-described embodiment is therefore incorporated by reference.
- FIG. 8 is a diagram for describing a first interactive mode and a second interactive mode in the present embodiment.
- since the second interactive program is the same as the second interactive program according to the above-described second embodiment, the repeated description is omitted and the description in the above-described embodiment is incorporated by reference.
- the first dialogue program according to the present embodiment combines the voice output of the first dialogue program according to the first embodiment with the voice recognition of the first dialogue program according to the second embodiment.
- the controller 190 asks a question or the like that causes the occupant to answer with a specific keyword.
- the specific keyword is a keyword used in discrete word recognition.
- the controller 190 asks a question or the like that causes the occupant to reply “yes” or “no”. Also, for example, it is assumed that “hot” and “cold” have been registered in the storage device in advance as keywords used in discrete word recognition. In this case, the controller 190 asks a question regarding the room temperature, for example, to make the occupant reply “hot” or “cold”. Further, for example, it is assumed that “good”, “normal”, and “bad” have been registered in the storage device in advance as keywords used in discrete word recognition. In this case, the controller 190 asks a question or the like to prompt the occupant to select from “good”, “normal”, and “bad” options.
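One way to realize the constraint above is to pair each question with the keyword set it expects, so the first dialogue program only asks questions whose expected answers are all registered keywords. The dictionary contents follow the examples in the text; the structure and function names are illustrative assumptions:

```python
# Hypothetical pairing of questions with their expected answer keywords.

QUESTION_KEYWORDS = {
    "Shall I lower the temperature of the air conditioner?": {"yes", "no"},
    "Is the cabin hot or cold?": {"hot", "cold"},
    "How is the ride: good, normal, or bad?": {"good", "normal", "bad"},
}

def askable(question, registered):
    """A question is askable only if every expected answer is a registered keyword."""
    return QUESTION_KEYWORDS[question] <= registered
```

Under this scheme, the controller would skip any question whose answers could not be recognized by discrete word recognition.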
- the topic handled by the first dialogue program is different from the topic handled by the second dialogue program.
- the first dialogue program is programmed so as to have a dialogue with the occupant on matters related to traveling.
- the items related to traveling include, for example, steering control, traveling control, and setting of a destination.
- in the second dialogue program, no topic restriction is set for the dialogue with the occupant.
- Topics handled by the second dialogue program include, for example, topics unrelated to traveling, such as weather and entertainment.
- the controller 190 uses the first interactive program to ask a question that causes the occupant to answer with a keyword registered in advance.
- the occupant can have a conversation while maintaining concentration on driving.
- the discrete word recognition can reduce the processing time required for recognizing the occupant's voice, thereby shortening the time until the dialogue ends.
- the first dialogue program is programmed to talk with the occupant on topics related to the traveling of the vehicle.
- a question regarding steering or a notification regarding the vehicle speed can be made in a simple sentence in an expression that is easy for the occupant to understand.
- the voice interaction system does not need to be configured with the vehicle 100 alone; it may be configured with the vehicle 100 and a server capable of communicating with the vehicle 100.
- the voice interaction system may include a server that has a function similar to that of the controller 190 and that interacts with an occupant of a vehicle by communicating with the vehicle.
- the server exchanges information on the traveling state, information on the state of the external environment, information on the state of the driving operation, and voice information via a communication device (not shown).
- For example, when there is a vehicle having the configuration illustrated in FIG. 1 except for the controller 190, the server sequentially obtains the information necessary for determining the magnitude of the driving load from that vehicle via the communication device.
- voice information generated by using the interactive program selected according to the magnitude of the driving load is then transmitted to the vehicle.
- the in-vehicle communication device (not shown) transmits the occupant's voice information acquired by the indoor microphone 170 to the server, receives the voice information transmitted from the server, and causes the speaker 180 to output it.
- By configuring the voice interaction system to include the server in this way, the history of dialogues performed with a plurality of vehicles can be managed collectively, for example in a data center including the server. Then, by analyzing the dialogue history for each occupant, a criterion for determining the magnitude of the driving load can be generated in accordance with each occupant's driving characteristics.
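A server-side history store of the kind described could be sketched as below. Everything here is an assumption for illustration: the class and method names are invented, and the "average response delay" is only one conceivable per-occupant criterion, not the patent's:

```python
# Hypothetical server-side store: collect per-occupant dialogue history and
# derive an occupant-specific criterion from it (here, mean response delay).
from collections import defaultdict

class DialogueHistoryStore:
    def __init__(self):
        # occupant_id -> list of observed response delays, in seconds
        self.history = defaultdict(list)

    def record(self, occupant_id, response_delay):
        self.history[occupant_id].append(response_delay)

    def load_threshold(self, occupant_id, default=2.0):
        """Per-occupant criterion; falls back to a default with no history."""
        delays = self.history[occupant_id]
        if not delays:
            return default
        return sum(delays) / len(delays)
```

Centralizing the history in a data center is what makes such per-occupant tuning possible across a fleet of vehicles.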
- In the embodiments described above, the driving load is determined to be large when the driving subject is the "driver", but the present invention is not limited to this in the case of a vehicle having driving support functions. For example, even when the driving subject is the "driver", if driving support such as steering control or traveling control is provided by the vehicle controller, the driving load may be determined to be small.
- the present invention can also be applied to a dialog system for an occupant of a vehicle that is not a driver.
- the load state of an occupant of the vehicle who is not the driver is determined, and when that occupant's load state is large, the above-described first interactive program is used.
- the second interactive program described above can be used.
- the load state of an occupant who is not the driver is determined based on at least one of: the information on the traveling state acquired by the traveling state information acquiring function, the information on the external environment acquired by the external environment information acquiring function, information on the use state of an infotainment system (not shown) mounted on the vehicle 100, and information on the use state of a mobile terminal (not shown) used by the occupant. For example, when the information on the use state of the infotainment system indicates that a moving image is being reproduced on the infotainment system, the load state of the occupant is determined to be large. Likewise, when the information on the use state of the mobile terminal indicates that the mobile terminal is being operated by the occupant, the load state of the occupant is determined to be large, and the first interactive program is executed to interact with the occupant.
- When the information on the use state of the infotainment system or other on-vehicle equipment such as an information display device (not shown) mounted on the vehicle 100 indicates that the infotainment system is not used, or when the information on the use state of the mobile terminal used by the occupant indicates that the mobile terminal is not being operated by the occupant, the load state of the occupant is determined to be small, and the dialogue with the occupant is performed using the second dialogue program.
- As described above, in the present embodiment, the first dialogue program or the second dialogue program is selected depending on whether the in-vehicle device or the mobile terminal is in use, and the dialogue with the occupant is performed accordingly. That is, when it is determined that the driver, or an occupant who is not the driver, is performing some task that requires concentration, the first interactive program is used; when it is determined that no task requiring concentration is being performed, the dialogue is performed using the second dialogue program.
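The selection rule above reduces to a single predicate. This sketch is illustrative only — the function name and the two boolean inputs (infotainment playback and mobile-terminal operation, the two examples given in the text) are assumptions:

```python
# Hypothetical sketch of device-usage-based program selection: occupants
# engaged with on-vehicle equipment or a mobile terminal are treated as
# having a large load, so the first (short-answer) dialogue program is used.

def select_dialogue_program(infotainment_playing, mobile_in_use):
    occupant_load_large = infotainment_playing or mobile_in_use
    return "first" if occupant_load_large else "second"
```

The same shape applies to the driver's case, with the driving-load determination substituted for the device-usage check.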
- the controller according to the present invention will be described using the controller 190 as an example, but the present invention is not limited to this.
- the sensor according to the present invention has been described using the sensor group 110, the surrounding detection device 120, the navigation device 130, the external information acquisition device 140, and the driving device 150 as examples, but the present invention is not limited to these.
- the storage device according to the present invention will be described using a ROM or a database as an example, but the present invention is not limited to this.
- the audio device according to the present invention will be described using the indoor microphone 170 and the speaker 180 as an example, but the present invention is not limited thereto.
Abstract
Description
FIG. 1 is a diagram showing the configuration of the voice dialogue system according to the present embodiment. As shown in FIG. 1, the present embodiment will be described taking as an example a configuration in which the voice dialogue system is mounted on the vehicle 100.
Next, the voice dialogue device according to the second embodiment will be described. The voice dialogue device of the present embodiment has the same configuration as the above-described embodiment except for the first dialogue program and the second dialogue program used by the controller 190; the repeated description is therefore omitted and the description given in the above-described embodiment is incorporated by reference.
Next, the voice dialogue device according to the third embodiment will be described. The voice dialogue device of the present embodiment has the same configuration as the above-described embodiments except for the first dialogue program and the second dialogue program used by the controller 190; the repeated description is therefore omitted and the description given in the above-described embodiments is incorporated by reference.
That is, in the present embodiment, the first dialogue program is used when it is determined that the driver, or an occupant who is not the driver, is performing some task that requires concentration, and the second dialogue program is used when it is determined that no such task is being performed.
110… sensor group
111… vehicle speed sensor
112… engine speed sensor
113… accelerator position sensor
114… brake position sensor
115… steering angle sensor
116… shift lever sensor
120… surrounding detection device
121… on-board camera
122… radar
130… navigation device
131… GPS
140… external information acquisition device
150… drive device
160… indoor camera
170… indoor microphone
Claims (15)
- A voice dialogue device comprising a controller that interacts with an occupant of a vehicle by using a sensor mounted on the vehicle, wherein
the sensor detects at least one of a traveling state of the vehicle, a state of an external environment of the vehicle, and a state of the occupant, and
the controller
determines a load state of the occupant based on a detection result of the sensor, and
interacts with the occupant using a dialogue program corresponding to the load state. - The voice dialogue device according to claim 1, wherein
the controller uses a first dialogue program when the magnitude of the load is larger than a predetermined reference, and uses a second dialogue program different from the first dialogue program when the magnitude of the load is smaller than the reference. - The voice dialogue device according to claim 2, wherein
the controller
acquires a first response to a question from the occupant by executing the first dialogue program, and
acquires a second response to a question from the occupant by executing the second dialogue program, and
the number of words included in the first response is smaller than the number of words included in the second response. - The voice dialogue device according to claim 2 or 3, wherein
the controller uses the first dialogue program to ask a question that prompts the occupant to answer with a predetermined word. - The voice dialogue device according to any one of claims 2 to 4, wherein
the controller uses the first dialogue program to ask a question that prompts the occupant to give an alternative answer. - The voice dialogue device according to any one of claims 2 to 5, wherein
the controller determines that the magnitude of the load is larger than the reference when detecting, based on the detection result, that the vehicle is traveling under the occupant's driving. - The voice dialogue device according to any one of claims 2 to 6, wherein
the controller determines that the magnitude of the load is larger than the reference when detecting, based on the detection result, that the occupant is performing a driving operation for parking, or that the vehicle is traveling on a highway or through an intersection. - The voice dialogue device according to any one of claims 2 to 7, wherein
the controller determines that the load is large when detecting, based on the detection result, that the occupant is using on-vehicle equipment of the vehicle or a mobile terminal. - The voice dialogue device according to any one of claims 2 to 8, wherein
the controller determines that the magnitude of the load is smaller than the reference when detecting, based on the detection result, that the vehicle is stopped. - The voice dialogue device according to any one of claims 2 to 9, wherein
the controller determines that the magnitude of the load is smaller than the reference when detecting, based on the detection result, that the vehicle is traveling by automated driving. - The voice dialogue device according to any one of claims 2 to 10, wherein
the controller determines that the magnitude of the load is smaller than the reference when detecting, based on the detection result, that the vehicle is traveling on a congested road or on a straight road. - The voice dialogue device according to any one of claims 2 to 11, wherein
the first dialogue program is a program that realizes a function of interacting with the occupant on topics related to the traveling of the vehicle. - The voice dialogue device according to any one of claims 1 to 12, wherein
the controller
selects, from among dialogue programs stored in a storage device, a dialogue program corresponding to the magnitude of the load, and
interacts with the occupant via one or more audio devices by executing the dialogue program, and
the one or more audio devices recognize the occupant's voice and output voice to the occupant. - A voice dialogue system comprising a server that interacts with an occupant of a vehicle by communicating with the vehicle, wherein
the vehicle is equipped with a sensor and one or more audio devices,
the sensor detects at least one of a traveling state of the vehicle, a state of an external environment of the vehicle, and a state of the occupant,
the one or more audio devices recognize the occupant's voice and output voice to the occupant, and
the server
acquires a detection result of the sensor,
determines a load state of the occupant based on the detection result of the sensor, and
interacts with the occupant using a dialogue program corresponding to the load state. - A control method for a voice dialogue device comprising a controller that interacts with an occupant of a vehicle by using a sensor mounted on the vehicle, the method comprising:
detecting, by the sensor, at least one of a traveling state of the vehicle, a state of an external environment of the vehicle, and a state of the occupant;
determining, by the controller, a load state of the occupant based on a detection result of the sensor; and
interacting with the occupant using a dialogue program corresponding to the load state.
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112021002265-1A BR112021002265A2 (pt) | 2018-08-06 | 2018-08-06 | dispositivo de diálogo por voz, sistema de diálogo por voz e método de controle do sistema de diálogo por voz |
JP2020535355A JP6996632B2 (ja) | 2018-08-06 | 2018-08-06 | 音声対話装置、音声対話システム、及び音声対話装置の制御方法 |
EP18929357.4A EP3836138B1 (en) | 2018-08-06 | 2018-08-06 | Voice dialogue device, voice dialogue system, and control method for voice dialogue system |
PCT/JP2018/029470 WO2020031241A1 (ja) | 2018-08-06 | 2018-08-06 | 音声対話装置、音声対話システム、及び音声対話装置の制御方法 |
US17/266,401 US11938958B2 (en) | 2018-08-06 | 2018-08-06 | Voice dialogue device, voice dialogue system, and control method for voice dialogue system |
MX2021001243A MX2021001243A (es) | 2018-08-06 | 2018-08-06 | Dispositivo de dialogo por voz, sistema de dialogo por voz, y metodo de control para sistema de dialogo por voz. |
CN201880096314.2A CN112534499B (zh) | 2018-08-06 | 2018-08-06 | 声音对话装置、声音对话系统以及声音对话装置的控制方法 |
JP2021203314A JP2022046551A (ja) | 2018-08-06 | 2021-12-15 | 音声対話装置、音声対話システム、及び音声対話装置の制御方法 |
JP2023064988A JP7509266B2 (ja) | 2018-08-06 | 2023-04-12 | 音声対話装置、音声対話システム、及び音声対話装置の制御方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/029470 WO2020031241A1 (ja) | 2018-08-06 | 2018-08-06 | 音声対話装置、音声対話システム、及び音声対話装置の制御方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020031241A1 true WO2020031241A1 (ja) | 2020-02-13 |
Family
ID=69414288
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/029470 WO2020031241A1 (ja) | 2018-08-06 | 2018-08-06 | 音声対話装置、音声対話システム、及び音声対話装置の制御方法 |
Country Status (7)
Country | Link |
---|---|
US (1) | US11938958B2 (ja) |
EP (1) | EP3836138B1 (ja) |
JP (1) | JP6996632B2 (ja) |
CN (1) | CN112534499B (ja) |
BR (1) | BR112021002265A2 (ja) |
MX (1) | MX2021001243A (ja) |
WO (1) | WO2020031241A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021144321A (ja) * | 2020-03-10 | 2021-09-24 | 株式会社東海理化電機製作所 | エージェントシステム、制御装置、およびコンピュータプログラム |
JP2021152566A (ja) * | 2020-03-24 | 2021-09-30 | 本田技研工業株式会社 | 待機時間調整方法および装置 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003108191A (ja) * | 2001-10-01 | 2003-04-11 | Toyota Central Res & Dev Lab Inc | 音声対話装置 |
JP2006138994A (ja) * | 2004-11-11 | 2006-06-01 | Nissan Motor Co Ltd | 音声認識装置 |
JP2009168773A (ja) * | 2008-01-21 | 2009-07-30 | Nissan Motor Co Ltd | ナビゲーション装置および情報提供方法 |
JP2012024481A (ja) * | 2010-07-27 | 2012-02-09 | Denso Corp | 健康管理支援システム |
JP2015068866A (ja) * | 2013-09-27 | 2015-04-13 | クラリオン株式会社 | 車両用装置、サーバ、及び、情報処理方法 |
WO2015128960A1 (ja) * | 2014-02-26 | 2015-09-03 | 三菱電機株式会社 | 車載制御装置および車載制御方法 |
WO2016147367A1 (ja) * | 2015-03-19 | 2016-09-22 | 三菱電機株式会社 | 未確認情報出力装置、未確認情報出力方法 |
JP2017067849A (ja) | 2015-09-28 | 2017-04-06 | 株式会社デンソー | 対話装置及び対話方法 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3703050B2 (ja) | 1996-09-30 | 2005-10-05 | マツダ株式会社 | ナビゲーション装置 |
JP4260788B2 (ja) * | 2005-10-20 | 2009-04-30 | 本田技研工業株式会社 | 音声認識機器制御装置 |
JP2008233678A (ja) * | 2007-03-22 | 2008-10-02 | Honda Motor Co Ltd | 音声対話装置、音声対話方法、及び音声対話用プログラム |
US8400332B2 (en) * | 2010-02-09 | 2013-03-19 | Ford Global Technologies, Llc | Emotive advisory system including time agent |
DE102014002543A1 (de) * | 2014-02-22 | 2015-08-27 | Audi Ag | Verfahren zur Erfassung wenigstens zweier zu erfassender Informationen mit zu verknüpfendem Informationsgehalt durch eine Sprachdialogeinrichtung, Sprachdialogeinrichtung und Kraftfahrzeug |
JP2015184563A (ja) * | 2014-03-25 | 2015-10-22 | シャープ株式会社 | 対話型家電システム、サーバ装置、対話型家電機器、家電システムが対話を行なうための方法、当該方法をコンピュータに実現させるためのプログラム |
JP6081966B2 (ja) * | 2014-07-18 | 2017-02-15 | キャンバスマップル株式会社 | 情報検索装置、情報検索プログラム、および情報検索システム |
US10137902B2 (en) * | 2015-02-12 | 2018-11-27 | Harman International Industries, Incorporated | Adaptive interactive voice system |
JP6358212B2 (ja) * | 2015-09-17 | 2018-07-18 | トヨタ自動車株式会社 | 車両用覚醒制御システム |
EP3384475B1 (en) | 2015-12-06 | 2021-12-22 | Cerence Operating Company | System and method of conversational adjustment based on user's cognitive state |
US10032453B2 (en) * | 2016-05-06 | 2018-07-24 | GM Global Technology Operations LLC | System for providing occupant-specific acoustic functions in a vehicle of transportation |
KR102643501B1 (ko) * | 2016-12-26 | 2024-03-06 | 현대자동차주식회사 | 대화 처리 장치, 이를 포함하는 차량 및 대화 처리 방법 |
US10170111B2 (en) * | 2017-01-19 | 2019-01-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Adaptive infotainment system based on vehicle surrounding and driver mood and/or behavior |
US20190332915A1 (en) * | 2018-04-26 | 2019-10-31 | Wipro Limited | Method and system for interactively engaging a user of a vehicle |
-
2018
- 2018-08-06 BR BR112021002265-1A patent/BR112021002265A2/pt unknown
- 2018-08-06 EP EP18929357.4A patent/EP3836138B1/en active Active
- 2018-08-06 JP JP2020535355A patent/JP6996632B2/ja active Active
- 2018-08-06 MX MX2021001243A patent/MX2021001243A/es unknown
- 2018-08-06 CN CN201880096314.2A patent/CN112534499B/zh active Active
- 2018-08-06 WO PCT/JP2018/029470 patent/WO2020031241A1/ja active Application Filing
- 2018-08-06 US US17/266,401 patent/US11938958B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3836138A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021144321A (ja) * | 2020-03-10 | 2021-09-24 | 株式会社東海理化電機製作所 | エージェントシステム、制御装置、およびコンピュータプログラム |
JP2021152566A (ja) * | 2020-03-24 | 2021-09-30 | 本田技研工業株式会社 | 待機時間調整方法および装置 |
JP7388962B2 (ja) | 2020-03-24 | 2023-11-29 | 本田技研工業株式会社 | 待機時間調整方法および装置 |
Also Published As
Publication number | Publication date |
---|---|
EP3836138A4 (en) | 2021-07-28 |
MX2021001243A (es) | 2021-03-31 |
JPWO2020031241A1 (ja) | 2021-08-02 |
CN112534499B (zh) | 2024-02-23 |
US11938958B2 (en) | 2024-03-26 |
CN112534499A (zh) | 2021-03-19 |
US20210309241A1 (en) | 2021-10-07 |
BR112021002265A2 (pt) | 2021-05-04 |
EP3836138A1 (en) | 2021-06-16 |
EP3836138B1 (en) | 2023-04-26 |
JP6996632B2 (ja) | 2022-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107945555B (zh) | 用于半自动驾驶的路线资格的动态更新 | |
JP6623468B2 (ja) | 車両制御装置、車両制御方法、および車両制御プログラム | |
CN111830961B (zh) | 路径设定装置、路径设定方法以及存储介质 | |
US11054818B2 (en) | Vehicle control arbitration | |
CN111348032A (zh) | 用于自主车辆的自动停止和/或驻车的系统和方法 | |
JP6800340B2 (ja) | 車両制御システム、車両制御方法、およびプログラム | |
JP6906175B2 (ja) | 運転支援方法およびそれを利用した運転支援装置、自動運転制御装置、車両、プログラム、運転支援システム | |
US20180321678A1 (en) | Notification System For Automotive Vehicle | |
WO2020031241A1 (ja) | 音声対話装置、音声対話システム、及び音声対話装置の制御方法 | |
WO2016121112A1 (ja) | 評価情報収集システム | |
JP2019156355A (ja) | 車両制御装置 | |
JP7509266B2 (ja) | 音声対話装置、音声対話システム、及び音声対話装置の制御方法 | |
JP2021142839A (ja) | 車両及びその制御装置 | |
RU2772382C1 (ru) | Голосовое диалоговое устройство, голосовая диалоговая система и способ управления для голосовой диалоговой системы | |
CN113386758B (zh) | 车辆及其控制装置 | |
JP2019045940A (ja) | 自動運転システム | |
WO2020202373A1 (ja) | 制御装置、制御方法及びプログラム | |
WO2023090057A1 (ja) | 情報処理装置、情報処理方法および情報処理プログラム | |
US11507346B1 (en) | Intelligent text and voice feedback for voice assistant | |
CN114572219B (zh) | 自动超车方法、装置、车辆、存储介质及芯片 | |
US10953875B2 (en) | System and method for railway en-route prediction | |
US20240241936A1 (en) | Systems and methods for permitting access to a human machine interface in a vehicle | |
US20240227749A1 (en) | Power supply device, control method, and storage medium | |
CN117095564A (zh) | 车辆控制方法、设备及车辆 | |
JP2024045840A (ja) | 運転支援装置、運転支援方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18929357 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020535355 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112021002265 Country of ref document: BR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021102612 Country of ref document: RU |
|
ENP | Entry into the national phase |
Ref document number: 2018929357 Country of ref document: EP Effective date: 20210309 |
|
ENP | Entry into the national phase |
Ref document number: 112021002265 Country of ref document: BR Kind code of ref document: A2 Effective date: 20210205 |