CN111439271A - Auxiliary driving method and auxiliary driving equipment based on voice control - Google Patents

Auxiliary driving method and auxiliary driving equipment based on voice control

Info

Publication number
CN111439271A
CN111439271A
Authority
CN
China
Prior art keywords
driver
voice
driving
vehicle
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010316284.0A
Other languages
Chinese (zh)
Inventor
宋庆谱
黄洪荣
朱国章
陈桢
张思远
王仙涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SAIC Volkswagen Automotive Co Ltd
Original Assignee
SAIC Volkswagen Automotive Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SAIC Volkswagen Automotive Co Ltd filed Critical SAIC Volkswagen Automotive Co Ltd
Priority to CN202010316284.0A
Publication of CN111439271A
Legal status: Pending

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • B60W10/00 Conjoint control of vehicle sub-units of different type or different function
    • B60W10/04 Conjoint control including control of propulsion units
    • B60W10/06 Conjoint control including control of combustion engines
    • B60W10/10 Conjoint control including control of change-speed gearings
    • B60W10/18 Conjoint control including control of braking systems
    • B60W10/20 Conjoint control including control of steering systems
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/10 Path keeping
    • B60W30/12 Lane keeping
    • B60W30/14 Adaptive cruise control
    • B60W30/143 Speed control
    • B60W30/18 Propelling the vehicle
    • B60W30/18009 Propelling the vehicle related to particular drive situations
    • B60W30/18163 Lane change; Overtaking manoeuvres
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2040/089 Driver voice
    • B60W2050/0001 Details of the control system
    • B60W2050/0043 Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/143 Alarm means
    • B60W2050/146 Display means
    • B60W2710/00 Output or target parameters relating to a particular sub-unit
    • B60W2710/06 Combustion engines, Gas turbines
    • B60W2710/0605 Throttle position
    • B60W2710/10 Change speed gearings
    • B60W2710/1005 Transmission ratio engaged
    • B60W2710/18 Braking system
    • B60W2710/20 Steering systems

Abstract

The invention discloses a voice control-based driving assistance device comprising an environment perception module, a driver monitoring module, a voice processing module, a driving decision module, an execution module and a human-computer interaction module. The invention also discloses a voice control-based driving assistance method performed by this device, the method comprising: a wake-up step, in which the driving assistance device is woken according to a voice wake-up instruction from the driver; an environment sensing step, sensing the driving state of the vehicle and the surrounding driving environment to generate an environment sensing signal; a driver monitoring step, monitoring the state of the driver to generate a driver monitoring signal; a voice instruction acquisition step, collecting and processing sound signals in the vehicle and recognizing the driver's voice instruction from them; a driving decision step, generating a driving instruction from the environment sensing signal, the driver monitoring signal and the voice instruction; and an execution step, adjusting the driving state of the vehicle according to the driving instruction.

Description

Auxiliary driving method and auxiliary driving equipment based on voice control
Technical Field
The invention relates to the technical field of intelligent driving, in particular to a voice-based auxiliary driving technology.
Background
In the field of intelligent driving, as the technology has matured, many advanced driver assistance systems have been deployed, such as adaptive cruise control. An adaptive cruise system is equipped with sensors such as millimeter-wave radar or lidar; it automatically senses the road ahead, the surrounding environment and the driving state of the vehicle, and adjusts the driving state accordingly to achieve adaptive driving.
Existing adaptive cruise control requires a rather complex sequence of operations: the vehicle must first be kept in the center of the lane, and the adaptive cruise system is then activated. The system does not start immediately, however; it waits until a trigger condition is met, typically a speed condition, for example that the vehicle speed is maintained at 80 km/h or above. Once the condition is met, a cruise icon appears. After seeing the icon, the driver presses a function key. Only when the indicator light on the steering wheel turns on and road conditions are safe can the driver release both hands, at which point the adaptive cruise function formally starts.
At the present state of the art, in natural driving the driver's eyes remain the most reliable environmental sensors, the driver's brain the most reliable processing system, and the driver the most critical actuator. A driving assistance system can only temporarily assist driving to a semi-automatic extent. Because operating a current adaptive cruise system is complex, the driver must continuously watch the indicator light and the instrument panel and repeatedly operate buttons to activate the function. These observations and operations themselves distract the driver and, to some extent, create a safety hazard.
In addition, while adaptive cruise is active, the system takes over driving and the driver does not operate the vehicle for a time. Current adaptive cruise systems, however, can handle only simple road conditions; the driver must still intervene promptly when complex conditions or emergencies arise. On long highway drives the driver inevitably becomes fatigued, and fatigue can lead to inattention. Driver inattention carries a serious accident risk: in one scenario, the driver is driving manually but, being fatigued, fails to respond and an accident occurs; in another, adaptive cruise is in use when complex road conditions arise that require timely manual intervention, but the fatigued driver does not take over in time. Existing adaptive cruise systems pay no attention to the driver's state, which creates a potential safety hazard.
Disclosure of Invention
The invention provides an auxiliary driving technology based on voice control.
According to an embodiment of the present invention, there is provided a voice control-based driving assistance apparatus, comprising: an environment perception module, a driver monitoring module, a voice processing module, a driving decision module, an execution module and a human-computer interaction module. The environment perception module senses the driving state of the vehicle and the surrounding driving environment and generates an environment sensing signal. The driver monitoring module monitors the state of the driver and generates a driver monitoring signal. The voice processing module collects and processes sound signals in the vehicle and recognizes the driver's voice instructions from them. The driving decision module is connected to the environment perception module, the driver monitoring module and the voice processing module; it receives the environment sensing signal, the driver monitoring signal and the voice instruction, and generates a driving instruction from them. The execution module is connected to the driving decision module and adjusts the driving state of the vehicle according to the driving instruction. The human-computer interaction module is connected to the driving decision module and provides feedback to the driver according to the current state of the driving assistance device and the driving instruction.
In one embodiment, the environment perception module comprises: a lane-keeping camera, a lane-keeping controller, a millimeter-wave radar and a millimeter-wave radar controller. The lane-keeping camera points ahead of the vehicle and collects road information in front of it. The lane-keeping controller is connected to the lane-keeping camera and performs lane recognition based on the road information the camera collects. The millimeter-wave radar detects signals from other traffic participants around the vehicle. The millimeter-wave radar controller is connected to the radar and identifies, from the detected signals, the other traffic participants around the vehicle and their states. The lane-keeping controller and the millimeter-wave radar controller generate the environment sensing signal.
In one embodiment, the lane-keeping camera is mounted at the front of the vehicle and points forward; it collects road information within 60 m ahead of the vehicle at a frame rate of 30 frames per second. The millimeter-wave radar is a long-range radar with a detection range of at least 200 m, a near-field horizontal detection angle of at least 120° (±60°), and a near-field vertical detection angle of at least 20°.
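The sensor parameters of this embodiment can be gathered into one configuration and used for a simple coverage check. This is a minimal Python sketch; the names (`LANE_CAMERA`, `RADAR`, `radar_sees`) are illustrative, not from the patent:

```python
# Sensor parameters quoted in this embodiment, collected as one config.
LANE_CAMERA = {"range_m": 60, "frame_rate_hz": 30}
RADAR = {"range_m": 200, "horizontal_fov_deg": 120, "vertical_fov_deg": 20}

def radar_sees(azimuth_deg, distance_m):
    """Rough check whether a target falls inside the radar's near-field cone."""
    return (abs(azimuth_deg) <= RADAR["horizontal_fov_deg"] / 2  # within ±60°
            and distance_m <= RADAR["range_m"])                  # within 200 m
```

For example, a vehicle 100 m straight ahead is inside the detection cone, while one at an azimuth of 70° is not.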
In one embodiment, the driver monitoring module includes a driver monitoring camera and a driver monitoring processor. The driver monitoring camera is arranged inside the vehicle and points at the driver's position; its field angle is at least 45°, its acquisition frame rate at least 30 frames per second, and it is equipped with an infrared fill light. The driver monitoring processor is connected to the driver monitoring camera and processes the images the camera collects to generate a driver monitoring signal.
In one embodiment, the voice processing module comprises a microphone and a voice processor. The microphone collects sound signals within the vehicle. The voice processor processes these sound signals and recognizes the driver's voice instruction from them. The voice processor comprises a voiceprint recognition module and a semantic analysis module: the voiceprint recognition module determines from the voiceprint whether a sound signal comes from the target driver and filters out signals that do not, and the semantic analysis module performs semantic analysis on the sound signals from the target driver to recognize and generate a voice instruction.
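The two-stage pipeline above (voiceprint filter, then semantic analysis) can be sketched in a few lines of Python. This is a hypothetical sketch: the keyword map and function names are illustrative, and real voiceprint and speech-to-text front ends are assumed to have already produced speaker labels and transcripts:

```python
# Illustrative command vocabulary; not from the patent.
COMMANDS = {"speed up": "accelerate", "slow down": "decelerate",
            "change lane": "lane_change"}

def recognize_command(utterances):
    """utterances: (speaker_is_target_driver, transcribed_text) pairs."""
    for from_driver, text in utterances:
        if not from_driver:
            continue                      # voiceprint filter: drop other speakers
        for phrase, command in COMMANDS.items():
            if phrase in text.lower():
                return command            # semantic step: emit a voice instruction
    return None                           # no recognized instruction
```

A passenger saying "speed up" would be filtered out by the first stage; the same phrase from the target driver would yield the `accelerate` instruction.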
In one embodiment, the driving decision module generates the driving instruction from the environment sensing signal, the driver monitoring signal and the voice instruction as follows. When the vehicle's driving state does not match the surrounding driving environment, a driving instruction is generated to bring the driving state into conformity with the environment. When the driver's state does not meet the driving requirement, a driving instruction is generated to take over driving of the vehicle and to remind the driver. When the driving state matches the environment and the driver's state meets the requirement, the driving instruction is generated from the voice instruction.
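The three decision rules just described can be expressed as a small priority-ordered function. This is a minimal Python sketch under illustrative names (`driving_decision` and the returned action strings are assumptions, not the patent's implementation):

```python
def driving_decision(env_ok, driver_ok, voice_command):
    """Priority-ordered decision rules of this embodiment.

    env_ok: vehicle driving state matches the surrounding environment.
    driver_ok: driver state meets the driving requirement (not fatigued).
    voice_command: recognized voice instruction, or None.
    """
    if not env_ok:
        return "adjust_to_environment"   # rule 1: correct the vehicle state
    if not driver_ok:
        return "take_over_and_remind"    # rule 2: take over and warn the driver
    if voice_command:
        return voice_command             # rule 3: execute the spoken command
    return "maintain"                    # nothing to do: keep current state
```

Note that safety rules take priority: a spoken "accelerate" is ignored while the vehicle state conflicts with the environment or the driver is judged fatigued.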
In one embodiment, the execution module includes: an engine control unit (ECU), a transmission control unit, a throttle controller, a brake controller, and the steering motor of the steering wheel.
In one embodiment, the human-computer interaction module comprises a display prompting device and a voice prompting device. The display prompting device comprises a set of indicator lights that indicate the current state of the driving assistance device. The voice prompting device comprises a power amplifier and a loudspeaker; it generates a voice prompt signal from the driving instruction and plays it back to the driver through the power amplifier and loudspeaker.
According to an embodiment of the present invention, there is provided a driving assistance method based on voice control, which is performed by the aforementioned driving assistance apparatus based on voice control, the driving assistance method including:
a wake-up step, in which the driving assistance device is woken according to a voice wake-up instruction from the driver;
an environment sensing step of sensing a driving state of a vehicle and a surrounding driving environment to generate an environment sensing signal;
a driver monitoring step, monitoring the state of a driver and generating a driver monitoring signal;
a voice instruction acquisition step, collecting and processing sound signals in the vehicle and recognizing the driver's voice instruction from them;
a driving decision step, namely generating a driving instruction according to the environment sensing signal, the driver monitoring signal and the voice instruction;
and an execution step, adjusting the driving state of the vehicle according to the driving instruction.
In one embodiment, the step of waking up comprises:
a sound collection step of collecting a sound signal in the vehicle by a microphone;
a voiceprint recognition step, in which the voiceprint recognition module of the voice processor determines from the voiceprint whether the sound signal comes from the target driver, and filters out sound signals that do not;
and a wake-up instruction recognition step, in which the semantic analysis module of the voice processor performs semantic analysis on the sound signal from the target driver, determines whether a wake-up instruction is present, and wakes the driving assistance device if it is.
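The wake-up steps above can be sketched as a short pipeline. This is a hypothetical Python sketch: the patent only states that the wake instruction is a fixed phrase, so the phrase `"start assist"` and the function name `try_wake` are placeholders, and voiceprint labeling plus speech-to-text are assumed to happen upstream:

```python
def try_wake(utterances, wake_phrase="start assist"):
    """Voiceprint filter followed by fixed-phrase matching.

    utterances: (speaker_is_target_driver, transcribed_text) pairs.
    Returns True if the driving assistance device should wake.
    """
    for from_driver, text in utterances:
        if not from_driver:
            continue                          # drop sound not from target driver
        if text.strip().lower() == wake_phrase:
            return True                       # fixed wake phrase recognized
    return False
```

The same phrase spoken by a passenger is rejected by the voiceprint stage, so only the target driver can wake the device.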
In one embodiment, the wake instruction is a fixed wake phrase.
In one embodiment, the context awareness step perceives road information in front of the vehicle and signals from other traffic participants in the vicinity of the vehicle.
In one embodiment, the voice instruction collecting step comprises:
a sound collection step of collecting a sound signal in the vehicle by a microphone;
a voiceprint recognition step, in which the voiceprint recognition module of the voice processor determines from the voiceprint whether the sound signal comes from the target driver, and filters out sound signals that do not;
and a voice instruction recognition step, in which the semantic analysis module of the voice processor performs semantic analysis on the sound signal from the target driver to recognize and generate a voice instruction.
In one embodiment, in the driving decision step,
when the vehicle's driving state does not match the surrounding driving environment, a driving instruction is generated to bring the driving state into conformity with the environment;
when the driver's state does not meet the driving requirement, a driving instruction is generated to take over driving of the vehicle and to remind the driver;
and when the driving state matches the environment and the driver's state meets the requirement, the driving instruction is generated from the voice instruction.
In one embodiment, the voice control-based driving assistance method further comprises an exit step: when an exit instruction is received, the driving assistance device exits and shuts down, and the driver takes over the vehicle. The exit instruction is a fixed operation action.
The voice control-based driving assistance device and method use the driver's voice commands as vehicle control commands to accomplish acceleration, deceleration and lane changes. They are simple and convenient to operate, improve the driver's experience and comfort, and relieve the fatigue caused by long manual driving on the expressway. When a voice command does not meet vehicle driving safety requirements, the invention can correct the vehicle's driving state in real time or refuse to execute the driver's voice command, so as to ensure driving safety. Meanwhile, when the driver shows signs of fatigued driving, the invention automatically reminds the driver to concentrate and take over in time, avoiding traffic accidents caused by fatigued driving.
Drawings
Fig. 1 discloses a block diagram of a driving assistance device based on voice control according to an embodiment of the present invention.
Fig. 2 discloses an exemplary structure of a human-computer interaction module in the voice control-based auxiliary driving device.
Fig. 3 discloses a flow chart of a driving assistance method based on voice control according to an embodiment of the invention.
Fig. 4 discloses an example process of performing the wake-up step in the voice control-based assisted driving method of the present invention.
Fig. 5 discloses an example procedure of steps of the voice control-based driving assistance method of the present invention.
Detailed Description
The invention is currently suited to driving assistance on expressways. On an expressway the mix of traffic participants is simple, and their traffic behavior is limited essentially to speed changes, distance changes and lane changes. Within the current limits of processor capability and software algorithms, a suitable scenario for driving assistance is semi-automatic driving on an expressway.
Fig. 1 discloses a block diagram of a driving assistance device based on voice control according to an embodiment of the present invention. Referring to fig. 1, the voice control-based driving assistance apparatus includes: the system comprises an environment perception module 101, a driver monitoring module 102, a voice processing module 103, a driving decision module 104, an execution module 105 and a human-computer interaction module 106.
The environment perception module 101 senses the driving state of the vehicle and the surrounding driving environment and generates an environment sensing signal. In one embodiment, the environment perception module 101 includes: a lane-keeping camera 111, a lane-keeping controller 112, a millimeter-wave radar 113, and a millimeter-wave radar controller 114. The lane-keeping camera 111 is directed ahead of the vehicle and collects road information in front of it. In one embodiment, the lane-keeping camera is mounted at the front of the vehicle and points forward; it collects road information within 60 m ahead at a frame rate of 30 frames per second. The lane-keeping controller 112 is connected to the lane-keeping camera 111 and performs lane recognition based on the road information the camera collects. On an expressway, the lanes are essentially parallel and run in the same direction; there are essentially no oncoming or crossing lanes. The millimeter-wave radar 113 detects signals from other traffic participants around the vehicle. On an expressway these are essentially vehicles traveling in the same direction, and the main concerns are their speeds and the changing distances to them. In one embodiment, the millimeter-wave radar is a long-range radar with a detection range of at least 200 m, a near-field horizontal detection angle of at least 120° (±60°), and a near-field vertical detection angle of at least 20°. The millimeter-wave radar controller 114 is connected to the millimeter-wave radar 113.
The millimeter-wave radar controller 114 identifies, from the signals detected by the millimeter-wave radar 113, the other traffic participants around the vehicle and their states, i.e., the speeds of the vehicles traveling in the same direction and the changing distances to them. The lane-keeping controller 112 and the millimeter-wave radar controller 114 generate the environment sensing signal.
A Driver Monitoring (DMS) module 102 monitors the status of the driver and generates a driver monitoring signal. In one embodiment, the Driver Monitoring (DMS) module 102 includes: a Driver Monitoring (DMS) camera 121 and a Driver Monitoring (DMS) processor 122. The driver monitoring camera 121 is arranged in the vehicle and points toward the driver's position; its field angle is not lower than 45 degrees, its collection frame rate is not lower than 30 frames/second, and it is provided with an infrared fill-light device. In one embodiment, the driver monitoring camera is arranged beside the rear view mirror, aimed at the driver's seat. The infrared fill-light device supplements illumination so that the image collection requirement is met in a dark environment. The driver monitoring processor 122 is connected to the driver monitoring camera 121 and processes the images collected by the camera to generate a driver monitoring signal. Driver monitoring is primarily a determination of whether the driver is in a state of fatigue. In one embodiment, the Driver Monitoring (DMS) processor 122 includes an image processor and a central processor: the image processor renders and processes the images captured by the driver monitoring camera 121, and the central processor then judges the driving state of the driver. If more than 90 frames of 240 consecutively captured frames indicate that the driver is fatigued, the driver is determined to be in a fatigue state. In one embodiment, the criteria for determining the fatigue state are: images showing the driver frequently yawning, closing the eyes for a long time, and the like are acquired.
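The frame-counting rule above (more than 90 fatigued frames within 240 consecutively captured frames) can be sketched as a sliding-window check. This is an illustrative reconstruction, not the patent's implementation; the per-frame fatigue classifier (yawning, eye closure) is assumed to exist and is fed in as a boolean.

```python
from collections import deque

WINDOW = 240     # frames examined (8 s at 30 frames/second)
THRESHOLD = 90   # fatigue frames required within the window

class FatigueDetector:
    """Sliding-window fatigue judgment over per-frame classifier results."""

    def __init__(self, window: int = WINDOW, threshold: int = THRESHOLD):
        self.frames = deque(maxlen=window)  # oldest frame drops out automatically
        self.threshold = threshold

    def update(self, frame_is_fatigued: bool) -> bool:
        """Feed one per-frame result; return True if the driver is
        currently judged to be in a fatigue state (> threshold fatigued
        frames among the last `window` frames)."""
        self.frames.append(frame_is_fatigued)
        return sum(self.frames) > self.threshold
```

A single fatigued frame does not trigger the judgment; the window has to accumulate more than 90 positives, which filters out momentary blinks or a single yawn.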
The voice processing module 103 collects and processes sound signals in the vehicle and recognizes the driver's voice command from those signals. In one embodiment, the speech processing module 103 includes: a microphone 131 and a speech processor 132. The microphone 131 collects the sound signal in the vehicle. The voice processor 132 processes the sound signal and recognizes the driver's voice command from it. In one embodiment, the speech processor 132 includes a voiceprint recognition module and a semantic parsing module. The voiceprint recognition module recognizes, according to the voiceprint, whether the sound signal comes from the target driver, and filters out sound signals that do not. The semantic parsing module then performs semantic analysis on the sound signals from the target driver and recognizes and generates a voice command. In one embodiment, the voice command is a fixed command phrase. Within the capabilities of the prior art, the voice instructions may include: simple vehicle travel control instructions, such as "accelerate", "decelerate", "accelerate to km/h", "decelerate to km/h"; simple in-vehicle equipment control instructions, such as "open sunroof", "open air conditioner", "air conditioner warm up", "air conditioner cool down", "vent"; or simple navigation instructions, such as "open navigation", "destination to km", "re-plan route". To avoid misoperation of the voice processing module caused by the speech or conversation of other passengers in the vehicle, the driver's voiceprint needs to be learned and memorized in advance.
In one embodiment, the voice processing module requires the driver to learn the voiceprint and vocabulary in advance: during the learning process it memorizes the driver's voiceprint, and it also learns and memorizes the frequently used voice command vocabulary, so as to improve the speed and accuracy of recognition. After learning, the voice processing module can recognize the driver's voice commands from the sound signal quickly and accurately. The learning and recognition of voiceprints and voice commands are implemented with commercially available technology; they are not themselves the subject matter of the invention, which directly makes use of them.
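The two-stage pipeline described above (voiceprint filter, then fixed-phrase matching) can be sketched as follows. The actual voiceprint matching and speech recognition are, as the text says, commercial components; here they are stubbed as a speaker label and an already-transcribed phrase, and the command set and `target_driver` label are hypothetical examples.

```python
# Hypothetical fixed command phrases learned in advance (subset of those
# listed in the description).
FIXED_COMMANDS = {
    "accelerate", "decelerate",
    "open sunroof", "open air conditioner",
    "open navigation", "re-plan route",
}

def recognize_command(speaker_id: str, phrase: str,
                      target_driver: str = "driver_1"):
    """Stage 1: voiceprint filter - discard audio not from the target
    driver. Stage 2: semantic parsing - accept only pre-learned fixed
    command phrases. Returns the normalized command, or None."""
    if speaker_id != target_driver:   # voiceprint mismatch: filtered out
        return None
    phrase = phrase.strip().lower()
    return phrase if phrase in FIXED_COMMANDS else None
```

Restricting recognition to a closed set of fixed phrases is what keeps recognition fast and robust against passenger conversation, at the cost of free-form commands.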
The driving decision module 104 is connected with the environment sensing module 101, the driver monitoring module 102 and the voice processing module 103. The driving decision module 104 receives the environment sensing signal, the driver monitoring signal and the voice instruction, and generates a driving instruction according to the environment sensing signal, the driver monitoring signal and the voice instruction. In one embodiment, the driving decision module 104 generates the driving instruction according to the environmental perception signal, the driver monitoring signal and the voice instruction, which mainly includes the following aspects:
when the vehicle running state does not conform to the surrounding running environment, a running instruction is generated so that it does. For example: when the lane keeping camera detects lane deviation and the vehicle-mounted controller finds that the turn signal is not on, the driving decision module generates and sends a corresponding instruction to keep the vehicle in its lane without changing lanes; or when the millimeter wave radar finds that, at its current speed, the vehicle is at risk of colliding with other vehicles, the driving decision module generates and sends a corresponding instruction requesting the vehicle to decelerate or accelerate so as to reduce the collision risk.
When the state of the driver does not meet the driving requirement, a driving instruction is generated to take over driving of the vehicle, and the driver is reminded. When the driver monitoring module finds that more than a certain number of the continuously acquired images show evidence of fatigue, for example frequent yawning or prolonged eye closure, the driver's state is considered not to meet the driving requirement. The driving decision module then generates a driving instruction to take over driving of the vehicle and sends out a signal, through an indicator lamp, loudspeaker and the like, reminding the driver to concentrate.
When the vehicle running state conforms to the surrounding running environment and the driver's state meets the driving requirement, a running instruction is generated according to the voice command. That is, when the lane is kept normally, a safe distance to surrounding vehicles is maintained, and the driver is in a normal driving state with no sign of fatigue, then on receiving the driver's voice command the driving decision module generates a corresponding instruction: it adjusts the running state of the vehicle, accelerating or decelerating, or turns in-vehicle equipment on or off, as the voice command requires.
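The three rules above imply a strict priority order, which can be sketched as a small decision function. The return strings are illustrative labels, not the patent's actual instruction format.

```python
def decide(environment_ok: bool, driver_ok: bool, voice_command):
    """Priority implied by the description: environment correction first,
    then driver-fatigue takeover, and only then voice-command execution."""
    if not environment_ok:
        return "correct_vehicle_state"       # e.g. lane correction, speed change
    if not driver_ok:
        return "take_over_and_alert_driver"  # system takes over, warns driver
    if voice_command is not None:
        return f"execute:{voice_command}"    # voice command honored last
    return "keep_current_state"
```

Note that a voice command is only honored when both safety conditions hold, which is exactly why the system later refuses an acceleration command from a fatigued driver.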
The execution module 105 is connected to the driving decision module 104 and adjusts the running state of the vehicle according to the driving instruction. The execution module 105 consists of the vehicle's original controllers and actuators; in the present invention these components are connected to the driving decision module 104, which sends instructions to them. In one embodiment, the execution module includes: the ECU control unit, the transmission control unit, the throttle controller, the brake controller, and the steering motor of the steering wheel. The execution module 105 may also include other vehicle execution components; the invention does not limit the specific components. According to the actual functional requirements, the execution module 105 of the present invention may be formed by connecting the vehicle's original controllers or actuators to the driving decision module 104 and operating them according to its instructions.
The human-machine interaction module 106 is connected with the driving decision module 104. The human-computer interaction module 106 gives feedback to the driver according to the current state of the auxiliary driving equipment and the driving instruction. In one embodiment, the human-computer interaction module 106 includes: a display prompting device 161 and a voice prompting device 162. Fig. 2 discloses an exemplary structure of the human-computer interaction module in the voice-control-based auxiliary driving device. As shown in fig. 2, the display prompting device 161 is disposed on the steering wheel and includes a set of steering wheel indicator lights that indicate the current status of the driver assistance apparatus. In the illustrated embodiment, the steering wheel indicator light has three indication states: steady green, flashing green, and red. Steady green indicates that the system is working properly; flashing green indicates that the driver has woken the system, which awaits the driver's voice control command; red prompts the driver to take over manually. The voice prompting device 162 includes a power amplifier and a speaker. The voice prompting device 162 generates a voice prompt signal according to the driving instruction, and the signal is fed back to the driver through the power amplifier and the loudspeaker. In the illustrated embodiment, the voice prompting device provides voice prompts for the following states: take-over prompt, system start prompt, and system shutdown prompt. After the system is turned off and exits, the steering wheel indicator light turns off.
The invention also provides a driving assisting method based on voice control, which is executed by the driving assisting equipment based on voice control. Fig. 3 discloses a flow chart of a driving assistance method based on voice control according to an embodiment of the invention. Referring to fig. 3, the voice control-based driving assistance method includes:
S101, a waking step: the auxiliary driving equipment is woken up according to a voice wake-up instruction of the driver. In one embodiment, the waking step S101 includes the following processes:
and a sound collection step of collecting sound signals in the vehicle by a microphone.
And a voiceprint recognition step, namely recognizing whether the voice signal is from the target driver or not by a voiceprint recognition module of the voice processor according to the voiceprint, and filtering the voice signal which is not from the target driver.
And a step of awakening instruction identification, in which a semantic analysis module of a voice processor carries out semantic analysis on the voice signal from the target driver, identifies whether an awakening instruction exists or not, and awakens the auxiliary driving equipment if the awakening instruction exists. In one embodiment, the wake up instruction is a fixed wake up phrase such as "hello", "boot", "start", etc.
To avoid misoperation caused by the speech or conversation of other passengers in the vehicle, it is necessary to learn and memorize the driver's voiceprint in advance. In one embodiment, the method of the invention requires the driver to learn the voiceprint and vocabulary in advance: it memorizes the driver's voiceprint during the learning process and also learns and memorizes the frequently used voice command vocabulary and wake-up command vocabulary, so as to improve recognition speed and accuracy. After learning, the voice processing module can recognize the driver's voice command or wake-up command from the sound signal quickly and accurately. The learning and recognition of voiceprints, voice commands and wake-up commands are implemented with commercially available technology; they are not themselves the subject matter of the invention, which directly makes use of them.
Fig. 4 discloses an example process of performing the wake-up step in the voice-control-based assisted driving method of the present invention. Referring to fig. 4, the wake-up process includes: sound containing the fixed wake-up word is collected and the fixed wake-up word is recognized. Voiceprint confirmation is then performed: if the driver's voiceprint is confirmed, the process proceeds to the next step; if not, the system fails to start. The driver then sends a start command, which undergoes voice semantic analysis, and the driver is asked to confirm the start. If the driver confirms, the system starts and gives a prompt; if the driver does not confirm, the system fails to start.
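The gated sequence of Fig. 4 can be sketched as one function whose checks mirror the three failure points (no wake word, voiceprint not confirmed, start not confirmed). Wake-word spotting and voiceprint matching are again stubbed; the wake-word set follows the fixed phrases given in the description, and the `target_driver` label is hypothetical.

```python
WAKE_WORDS = {"hello", "boot", "start"}  # fixed wake phrases from the text

def wake_up(phrase: str, speaker_id: str, driver_confirms: bool,
            target_driver: str = "driver_1") -> bool:
    """Return True only if a fixed wake word is heard, the voiceprint
    matches the target driver, and the driver confirms start-up."""
    if phrase.strip().lower() not in WAKE_WORDS:
        return False              # no wake word recognized
    if speaker_id != target_driver:
        return False              # voiceprint not confirmed: start fails
    return driver_confirms        # driver must confirm the start
```

Each gate failing independently matches the "system fails to start" branches in the figure, so a passenger saying the wake word, or an unconfirmed start, never activates the system.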
S102, an environment sensing step, namely sensing the running state of the vehicle and the surrounding running environment and generating an environment sensing signal. In one embodiment, the context awareness step perceives road information in front of the vehicle and signals from other traffic participants in the vicinity of the vehicle. For the road condition of the expressway, the road information is basically parallel lanes in the same direction, and basically has no opposite lanes or crossed lanes. Other traffic participants are basically vehicles running in the same direction, and the main concern is that the speed of the vehicles running in the same direction and the distance between the vehicles change. The lane information and the speed of the vehicles traveling in the same direction and the distance to these vehicles constitute the main content of the situational awareness signal.
S103, a driver monitoring step, namely monitoring the state of the driver and generating a driver monitoring signal. In the driver monitoring step, continuous images of a driver are acquired through a driver monitoring camera, the acquired images are processed and rendered through an image processor, then the driving state of the driver is judged through a central processing unit, and if more than 90 frames of images in the continuously acquired 240 frames of images show that the driver is in a fatigue state, the driver is judged to be in the fatigue state. In one embodiment, the criteria for determining fatigue state are: images of frequent yawning, long-time eye closing and the like of a driver are acquired.
And S104, a voice instruction acquisition step, namely acquiring and processing the sound signals in the vehicle and recognizing the voice instruction of the driver from the sound signals. In one embodiment, the voice instruction collecting step S104 includes the following processes:
and a sound collection step of collecting sound signals in the vehicle by a microphone.
And a voiceprint recognition step, namely recognizing whether the voice signal is from the target driver or not by a voiceprint recognition module of the voice processor according to the voiceprint, and filtering the voice signal which is not from the target driver.
And a voice instruction identification step, namely performing semantic analysis on the voice signal from the target driver by a semantic analysis module of the voice processor, and identifying and generating a voice instruction.
In one embodiment, the voice instructions may include: simple vehicle travel control instructions, such as "accelerate", "decelerate", "accelerate to km/h", "decelerate to km/h"; simple in-vehicle equipment control instructions, such as "open sunroof", "open air conditioner", "air conditioner warm up", "air conditioner cool down", "vent"; or simple navigation instructions, such as "open navigation", "destination to km", "re-plan route". As in the wake-up step, to avoid misoperation caused by the speech or conversation of other passengers in the vehicle, the driver's voiceprint needs to be learned and memorized in advance. In one embodiment, the method of the invention requires the driver to learn the voiceprint and vocabulary in advance: it memorizes the driver's voiceprint during the learning process and also learns and memorizes the frequently used voice command vocabulary and wake-up command vocabulary, so as to improve recognition speed and accuracy. After learning, the voice processing module can recognize the driver's voice command or wake-up command from the sound signal quickly and accurately. The learning and recognition of voiceprints, voice commands and wake-up commands are implemented with commercially available technology; they are not themselves the subject matter of the invention, which directly makes use of them.
And S105, a driving decision step, namely generating a driving instruction according to the environment sensing signal, the driver monitoring signal and the voice instruction. In one embodiment, the step of generating the driving instruction according to the environmental sensing signal, the driver monitoring signal and the voice instruction in the driving decision step S105 mainly includes the following aspects:
when the vehicle running state does not conform to the surrounding running environment, a running instruction is generated so that it does. For example: when the lane keeping camera detects lane deviation and the vehicle-mounted controller finds that the turn signal is not on, the driving decision module generates and sends a corresponding instruction to keep the vehicle in its lane without changing lanes; or when the millimeter wave radar finds that, at its current speed, the vehicle is at risk of colliding with other vehicles, the driving decision module generates and sends a corresponding instruction requesting the vehicle to decelerate or accelerate so as to reduce the collision risk.
When the state of the driver does not meet the driving requirement, a driving instruction is generated to take over driving of the vehicle, and the driver is reminded. When the driver monitoring module finds that more than a certain number of the continuously acquired images show evidence of fatigue, for example frequent yawning or prolonged eye closure, the driver's state is considered not to meet the driving requirement. The driving decision module then generates a driving instruction to take over driving of the vehicle and sends out a signal, through an indicator lamp, loudspeaker and the like, reminding the driver to concentrate.
When the vehicle running state conforms to the surrounding running environment and the driver's state meets the driving requirement, a running instruction is generated according to the voice command. That is, when the lane is kept normally, a safe distance to surrounding vehicles is maintained, and the driver is in a normal driving state with no sign of fatigue, then on receiving the driver's voice command the driving decision module generates a corresponding instruction: it adjusts the running state of the vehicle, accelerating or decelerating, or turns in-vehicle equipment on or off, as the voice command requires.
S106, an execution step: the running state of the vehicle is adjusted according to the driving instruction. In one embodiment, in the execution step S106 the driving instruction is sent to the vehicle's original controllers or actuators to adjust the running state of the vehicle. In one embodiment, the execution module includes: the ECU control unit, the transmission control unit, the throttle controller, the brake controller, and the steering motor of the steering wheel.
In one embodiment, the voice-control-based driving assistance method further comprises an exit step: when an exit instruction is received, the assisted driving equipment exits and is turned off, and the driver takes over the vehicle. The exit instruction is a fixed operation, such as the driver gripping and continuously operating the steering wheel, or the driver stepping on the brake pedal.
During execution of the voice-control-based driving assistance method, the state of the driving assistance equipment is fed back to the driver through the human-computer interaction module by means of indicator lamps and voice prompts, and the driver is reminded when necessary.
Fig. 5 discloses an example procedure of the voice control-based driving assistance method steps of the present invention.
Referring to FIG. 5:
the driver first wakes up the system with the fixed wake-up word, upon which the current driver identity is judged. After the driver is confirmed as the target speaker by voiceprint, the driver can control the driving state of the vehicle through vehicle control commands such as "accelerate", "decelerate", "accelerate to km/h", "decelerate to km/h", "change lane left" and "change lane right". The voice semantics are parsed into a driver control instruction, which is then sent to the driving decision module.
Taking an acceleration instruction as an example: after the system receives the driver's acceleration instruction, the driving decision module judges whether the vehicle is at risk of collision according to the motion state of the vehicle ahead as detected by the front millimeter wave radar. If there is no collision risk with the vehicle ahead, the vehicle speed is increased by 10 km/h; if the resulting speed would exceed the maximum speed limit, the vehicle runs at the maximum speed limit. If a collision risk exists, the system judges the highest safe speed of the vehicle from the distance and speed of the vehicle ahead detected by the millimeter wave radar, and the vehicle runs at that highest safe speed. When the driver sends a deceleration instruction, the vehicle speed is reduced by 10 km/h. Further, after semi-automatic highway driving is started, the maximum driving speed of the vehicle is 120 km/h. Meanwhile, the image processing unit in the DMS processor processes the images acquired by the DMS camera, and the central processing unit in the DMS processor judges whether the driver is in a fatigue driving state. If more than 90 frames of 240 continuously collected frames show that the driver is in a fatigue state, fatigue driving behavior is determined. In that case the system refuses to execute the driver's acceleration instruction and, through the power amplifier, reminds the driver to concentrate and take over the vehicle.
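The speed arithmetic above (±10 km/h per command, the 120 km/h highway cap, and the radar-derived safe-speed clamp) can be sketched as follows. The function names and the idea of passing the safe speed in as a parameter are assumptions; the numeric limits are taken directly from the text.

```python
MAX_SPEED = 120.0   # km/h, semi-automatic highway driving limit
STEP = 10.0         # km/h change per voice command

def target_speed(current: float, command: str,
                 collision_risk: bool = False,
                 max_safe_speed: float = MAX_SPEED) -> float:
    """Compute the commanded vehicle speed from a voice command."""
    if command == "accelerate":
        desired = current + STEP
    elif command == "decelerate":
        desired = max(current - STEP, 0.0)
    else:
        desired = current
    if collision_risk:                 # radar: clamp to the highest safe speed
        desired = min(desired, max_safe_speed)
    return min(desired, MAX_SPEED)     # never exceed the highway limit
```

So an "accelerate" at 115 km/h yields 120 km/h, and an "accelerate" behind a slower lead vehicle is clamped to the radar-derived safe speed rather than refused outright (refusal is reserved for the fatigue case).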
Further, after the system is started, the lane keeping camera collects the lane lines on both sides within 60 m in front of the vehicle and sends the collected image information to the lane keeping controller. The controller analyzes the collected scene information and extracts feature points of the image to establish a two-dimensional model, thereby obtaining the lane information on both sides of the vehicle, and compares it with the driving direction of the vehicle. If the driving direction of the vehicle crosses a lane line and the vehicle's turn signal is not on, the lane keeping algorithm calculates a torque according to the amount by which the driving direction crosses the lane lines on the two sides, and the driving decision module then sends a corresponding correction instruction to the steering motor.
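The correction rule above can be sketched as a torque proportional to the crossing amount, suppressed when the turn signal indicates an intentional lane change. The proportional form and the gain value are illustrative assumptions; the patent only states that a torque is calculated from the crossing amount.

```python
K_TORQUE = 0.8  # Nm per metre of lane-line crossing (assumed gain)

def lane_keep_torque(crossing_m: float, turn_signal_on: bool) -> float:
    """Return the corrective torque (Nm) requested from the steering
    motor; the sign pushes back toward the lane centre. No correction
    is made when the driver has signalled an intentional lane change."""
    if turn_signal_on or crossing_m == 0.0:
        return 0.0
    return -K_TORQUE * crossing_m
```

Gating on the turn signal is what distinguishes an unintended drift (corrected) from a signalled lane change (left alone, or handled by the separate lane-change routine below).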
When the driver sends a lane change instruction, the system calculates the steering wheel angle and the lane change time by a lane change algorithm according to the current vehicle speed and sends the steering wheel angle and the lane change time to the execution module. Similarly, if the DMS module detects that the driver has fatigue driving behavior, the system prompts the driver to concentrate attention and prompts the driver to take over the semi-automatic driving system of the expressway.
The voice-control-based auxiliary driving equipment and method of the invention use the driver's voice commands as vehicle control commands to realize acceleration, deceleration and lane changing of the vehicle. Operation is simple and convenient, the driving experience and comfort of the driver can be improved, and the driving fatigue caused by long periods of manual driving on the expressway can be relieved. When a voice command does not meet the vehicle driving safety requirements, the invention can correct the vehicle driving state in real time, or refuse to execute the driver's voice control command, so as to ensure driving safety. Meanwhile, when the driver shows fatigue driving, the invention automatically reminds the driver to concentrate and take over in time, thereby avoiding traffic accidents caused by fatigue driving.
It should also be noted that the above-mentioned embodiments are only specific embodiments of the present invention. The present invention is evidently not limited to these embodiments: similar changes or modifications can readily be made by those skilled in the art from the disclosure of the present invention, and such changes fall within the scope of the present invention. The embodiments described above are provided to enable persons skilled in the art to make or use the invention, and modifications or variations may be made to them without departing from the inventive concept. The scope of protection of the present invention is therefore not limited by the embodiments described above, but should be accorded the widest scope consistent with the innovative features set forth in the claims.

Claims (15)

1. A driving assistance apparatus based on voice control, characterized by comprising:
the environment sensing module senses the running state and the surrounding running environment of the vehicle and generates an environment sensing signal;
the driver monitoring module is used for monitoring the state of a driver and generating a driver monitoring signal;
the voice processing module is used for acquiring and processing sound signals in the vehicle and identifying a voice instruction of a driver from the sound signals;
the driving decision module is connected with the environment sensing module, the driver monitoring module and the voice processing module, receives the environment sensing signal, the driver monitoring signal and the voice instruction, and generates a driving instruction according to the environment sensing signal, the driver monitoring signal and the voice instruction;
the execution module is connected with the driving decision module and adjusts the driving state of the vehicle according to the driving instruction;
and the human-computer interaction module is connected with the driving decision module and feeds back the current state of the auxiliary driving equipment and the driving instruction to the driver.
2. The voice-control-based driver assistance apparatus according to claim 1, wherein the environment awareness module includes:
the lane keeping camera points to the front of the vehicle and collects road information in front of the vehicle;
the lane keeping controller is connected with the lane keeping camera and is used for carrying out lane identification according to the road information collected by the lane keeping camera;
the millimeter wave radar detects signals of other traffic participants around the vehicle;
the millimeter wave radar controller is connected with the millimeter wave radar and identifies other traffic participants around the vehicle and states of the other traffic participants according to signals detected by the millimeter wave radar;
the lane keeping controller and the millimeter wave radar controller generate environment perception signals.
3. The voice-control-based driving assistance apparatus according to claim 2,
the lane keeping camera is arranged at the front of the vehicle and points to the front of the vehicle, the lane keeping camera collects road information in a range of 60m in front of the vehicle, and the collection frame rate is 30 frames/second;
the millimeter wave radar is a long-range millimeter wave radar; the detection distance is not less than 200 m, the near-end horizontal detection angle is not less than 120° (±60°), and the near-end vertical detection angle is not less than 20°.
4. The voice-control-based driver assistance apparatus according to claim 1, wherein the driver monitoring module includes:
the driver monitoring camera is arranged in the vehicle and points to the position of a driver, the field angle of the driver monitoring camera is not lower than 45 degrees, the acquisition frame rate is not lower than 30 frames/second, and the driver monitoring camera is provided with an infrared light supplementing device;
and the driver monitoring processor is connected with the driver monitoring camera and is used for processing the image acquired by the driver monitoring camera to generate a driver monitoring signal.
5. The voice-control-based driver assistance apparatus according to claim 1, wherein the voice processing module includes:
a microphone that collects sound signals inside the vehicle;
the voice processor processes the sound signals in the vehicle and recognizes the voice commands of the driver from the sound signals, the voice processor comprises a voiceprint recognition module and a semantic analysis module, the voiceprint recognition module recognizes whether the sound signals come from the target driver or not according to the voiceprint, the voiceprint recognition module filters the sound signals which do not come from the target driver, and the semantic analysis module carries out semantic analysis on the sound signals coming from the target driver to recognize and generate the voice commands.
6. The voice-control-based driver assistance apparatus according to claim 1, wherein the driving decision module generating the driving instruction according to the environment sensing signal, the driver monitoring signal and the voice instruction comprises:
when the vehicle running state does not accord with the surrounding running environment, a running instruction is generated to enable the vehicle running state to accord with the surrounding running environment;
when the state of the driver does not meet the driving requirement, generating a driving instruction to take over the driving of the vehicle and reminding the driver;
and when the vehicle running state accords with the surrounding running environment and the state of the driver accords with the driving requirement, generating a running instruction according to the voice instruction.
7. The voice-control-based driver assistance apparatus according to claim 1, wherein the execution module includes: an engine control unit (ECU), a transmission control unit, a throttle controller, a brake controller, and a steering motor for the steering wheel.
8. The voice-control-based driver assistance apparatus according to claim 1, wherein the human-computer interaction module includes:
a display prompt device comprising a group of indicator lamps that indicate the current state of the driver assistance apparatus;
and a voice prompt device comprising a power amplifier and a loudspeaker, which generates a voice prompt signal according to the driving instruction and feeds it back to the driver through the power amplifier and the loudspeaker.
9. A voice-control-based driving assistance method that is executed by the voice-control-based driving assistance apparatus according to any one of claims 1 to 8, the driving assistance method comprising:
a wake-up step, in which the driver assistance apparatus is woken up according to a voice wake-up instruction from the driver;
an environment sensing step of sensing a driving state of a vehicle and a surrounding driving environment to generate an environment sensing signal;
a driver monitoring step, monitoring the state of a driver and generating a driver monitoring signal;
a voice instruction acquisition step of collecting and processing sound signals in the vehicle and recognizing the driver's voice instruction from them;
a driving decision step, namely generating a driving instruction according to the environment sensing signal, the driver monitoring signal and the voice instruction;
and an execution step of adjusting the running state of the vehicle according to the driving instruction.
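End to end, the method of claim 9 is a sense → monitor → listen → decide → execute loop. The sketch below wires stub components through one iteration of that pipeline; all component classes and instruction strings are hypothetical, and the decision function repeats the priority order of claim 6:

```python
from typing import Optional

class Sensor:
    """Stub sensing/monitoring component that returns a fixed reading."""
    def __init__(self, value):
        self.value = value
    def read(self):
        return self.value

class Actuators:
    """Stub execution module that records the last instruction applied."""
    def __init__(self):
        self.last = None
    def execute(self, instruction):
        self.last = instruction

def decide(env_ok: bool, driver_ok: bool, voice_cmd: Optional[str]) -> str:
    # Same priority order as the driving decision step (claim 6 / claim 14).
    if not env_ok:
        return "adjust_to_environment"
    if not driver_ok:
        return "take_over_and_remind"
    return voice_cmd or "maintain"

def assist_step(env_sensor, driver_monitor, voice_input, actuators) -> str:
    """One iteration of the claim-9 pipeline."""
    env_ok = env_sensor.read()          # environment sensing step
    driver_ok = driver_monitor.read()   # driver monitoring step
    voice_cmd = voice_input.read()      # voice instruction acquisition step
    instruction = decide(env_ok, driver_ok, voice_cmd)  # driving decision step
    actuators.execute(instruction)      # execution step
    return instruction
```

In a deployed system this loop would run continuously at the camera/sensor frame rate; here a single call stands in for one cycle.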
10. The voice-control-based driving assistance method according to claim 9, wherein the waking-up step includes:
a sound collection step of collecting a sound signal in the vehicle by a microphone;
a voiceprint recognition step, in which a voiceprint recognition module of the voice processor determines from the voiceprint whether a sound signal originates from the target driver and filters out sound signals that do not;
and a wake-up instruction recognition step, in which a semantic analysis module of the voice processor performs semantic analysis on sound signals from the target driver, determines whether a wake-up instruction is present, and if so wakes up the driver assistance apparatus.
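Claims 10 and 11 describe wake-up as the same voiceprint-then-semantics pipeline, with the semantic stage reduced to matching a fixed wake-up phrase. A sketch, in which the wake phrase and speaker identifiers are made-up examples:

```python
WAKE_PHRASE = "hello assistant"  # hypothetical fixed wake-up phrase (claim 11)

def is_wake_up(voiceprint_id: str, transcript: str, target_driver_id: str) -> bool:
    """Return True when the target driver utters the fixed wake-up phrase."""
    # Voiceprint recognition step: discard audio not from the target driver.
    if voiceprint_id != target_driver_id:
        return False
    # Wake-up instruction recognition step: exact match on the fixed phrase.
    return transcript.strip().lower() == WAKE_PHRASE
```

Restricting wake-up to a fixed phrase keeps the always-on listening path cheap; full semantic analysis only runs after the apparatus is awake.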
11. The voice-control-based driving assistance method according to claim 10, wherein the wake-up command is a fixed wake-up phrase.
12. The voice-control-based driving assistance method according to claim 9, wherein the environment sensing step senses road information ahead of the vehicle and detects other traffic participants in its vicinity.
13. The voice-control-based driving assistance method according to claim 9, wherein the voice instruction acquisition step includes:
a sound collection step of collecting a sound signal in the vehicle by a microphone;
a voiceprint recognition step, in which a voiceprint recognition module of the voice processor determines from the voiceprint whether a sound signal originates from the target driver and filters out sound signals that do not;
and a voice instruction recognition step, in which a semantic analysis module of the voice processor performs semantic analysis on sound signals from the target driver and recognizes and generates a voice instruction.
14. The voice-control-based driving assistance method according to claim 9, wherein in the driving decision step,
when the vehicle's driving state does not match the surrounding driving environment, a driving instruction is generated that brings the driving state into line with the surrounding driving environment;
when the driver's state does not meet the driving requirements, a driving instruction is generated to take over driving of the vehicle and remind the driver;
and when the vehicle's driving state matches the surrounding driving environment and the driver's state meets the driving requirements, a driving instruction is generated according to the voice instruction.
15. The voice-control-based driving assistance method according to claim 14, further comprising an exit step, in which, when an exit instruction is received, the driver assistance apparatus exits and shuts down and the driver takes over driving of the vehicle, the exit instruction being a fixed operation action.
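Claim 15 adds an exit transition triggered by a fixed operation action (rather than speech), complementing the voice wake-up of claim 10. Together they make the apparatus lifecycle a small two-state machine; a sketch with invented state names:

```python
class AssistLifecycle:
    """OFF -> ACTIVE on voice wake-up; ACTIVE -> OFF on the fixed exit action."""

    def __init__(self):
        self.state = "OFF"

    def wake_up(self):
        # Voice wake-up instruction recognized (claim 10): activate assistance.
        if self.state == "OFF":
            self.state = "ACTIVE"

    def exit_action(self):
        # Fixed operation action (claim 15), e.g. a physical button press;
        # the driver takes over and the apparatus shuts down.
        if self.state == "ACTIVE":
            self.state = "OFF"
```

Using a deliberate physical action for exit, while allowing voice for entry, avoids an accidental utterance disabling assistance mid-drive.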
CN202010316284.0A 2020-04-21 2020-04-21 Auxiliary driving method and auxiliary driving equipment based on voice control Pending CN111439271A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010316284.0A CN111439271A (en) 2020-04-21 2020-04-21 Auxiliary driving method and auxiliary driving equipment based on voice control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010316284.0A CN111439271A (en) 2020-04-21 2020-04-21 Auxiliary driving method and auxiliary driving equipment based on voice control

Publications (1)

Publication Number Publication Date
CN111439271A true CN111439271A (en) 2020-07-24

Family

ID=71653538

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010316284.0A Pending CN111439271A (en) 2020-04-21 2020-04-21 Auxiliary driving method and auxiliary driving equipment based on voice control

Country Status (1)

Country Link
CN (1) CN111439271A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112017667A (en) * 2020-09-04 2020-12-01 华人运通(上海)云计算科技有限公司 Voice interaction method, vehicle and computer storage medium
CN112037790A (en) * 2020-08-10 2020-12-04 上汽大众汽车有限公司 Method and system for controlling third-party application based on vehicle-mounted voice recognition system and vehicle
CN112572474A (en) * 2020-12-29 2021-03-30 联合汽车电子有限公司 Autonomous driving redundancy system
CN113147780A (en) * 2021-03-29 2021-07-23 江铃汽车股份有限公司 Control method and system for switch of adaptive cruise system
CN113460092A (en) * 2021-09-01 2021-10-01 国汽智控(北京)科技有限公司 Method, device, equipment, storage medium and product for controlling vehicle
CN113479214A (en) * 2021-08-16 2021-10-08 国汽智控(北京)科技有限公司 Automatic driving system and application method thereof
CN113511156A (en) * 2021-05-08 2021-10-19 山西三友和智慧信息技术股份有限公司 Vehicle driving auxiliary system based on artificial intelligence
CN113990033A (en) * 2021-09-10 2022-01-28 南京融才交通科技研究院有限公司 Vehicle traffic accident remote take-over rescue method and system based on 5G internet of vehicles
CN114103973A (en) * 2021-11-12 2022-03-01 上汽通用五菱汽车股份有限公司 Vehicle control method, device, vehicle and computer readable storage medium
CN114274954A (en) * 2021-12-16 2022-04-05 上汽大众汽车有限公司 Vehicle active pipe connecting system and method combining vehicle internal and external sensing
CN114290986A (en) * 2021-11-25 2022-04-08 合众新能源汽车有限公司 Night driving assisting method and device
CN114506265A (en) * 2022-02-18 2022-05-17 东风汽车集团股份有限公司 Human-computer interaction control method and device for vehicle and pedestrian

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200805A (en) * 2014-08-30 2014-12-10 长城汽车股份有限公司 Car driver voice assistant
CN106887232A (en) * 2017-01-22 2017-06-23 斑马信息科技有限公司 For the sound control method and speech control system of vehicle
CN107826117A (en) * 2017-11-22 2018-03-23 天津智能网联汽车产业研究院 A kind of automated driving system and control method
CN107901915A (en) * 2017-11-24 2018-04-13 重庆长安汽车股份有限公司 Vehicle drive automated system and method based on voice control
CN108407813A (en) * 2018-01-25 2018-08-17 惠州市德赛西威汽车电子股份有限公司 A kind of antifatigue safe driving method of vehicle based on big data
US20180281819A1 (en) * 2017-03-31 2018-10-04 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and vehicle control program
CN109795494A (en) * 2019-01-25 2019-05-24 温州大学 A method of control automatic driving vehicle

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112037790A (en) * 2020-08-10 2020-12-04 上汽大众汽车有限公司 Method and system for controlling third-party application based on vehicle-mounted voice recognition system and vehicle
CN112037790B (en) * 2020-08-10 2024-02-23 上汽大众汽车有限公司 Method and system for controlling third party application based on vehicle-mounted voice recognition system and vehicle
CN112017667A (en) * 2020-09-04 2020-12-01 华人运通(上海)云计算科技有限公司 Voice interaction method, vehicle and computer storage medium
CN112017667B (en) * 2020-09-04 2024-03-15 华人运通(上海)云计算科技有限公司 Voice interaction method, vehicle and computer storage medium
CN112572474A (en) * 2020-12-29 2021-03-30 联合汽车电子有限公司 Autonomous driving redundancy system
CN113147780A (en) * 2021-03-29 2021-07-23 江铃汽车股份有限公司 Control method and system for switch of adaptive cruise system
CN113511156A (en) * 2021-05-08 2021-10-19 山西三友和智慧信息技术股份有限公司 Vehicle driving auxiliary system based on artificial intelligence
CN113479214B (en) * 2021-08-16 2022-08-12 国汽智控(北京)科技有限公司 Automatic driving system and application method thereof
CN113479214A (en) * 2021-08-16 2021-10-08 国汽智控(北京)科技有限公司 Automatic driving system and application method thereof
CN113460092A (en) * 2021-09-01 2021-10-01 国汽智控(北京)科技有限公司 Method, device, equipment, storage medium and product for controlling vehicle
CN113990033A (en) * 2021-09-10 2022-01-28 南京融才交通科技研究院有限公司 Vehicle traffic accident remote take-over rescue method and system based on 5G internet of vehicles
CN114103973A (en) * 2021-11-12 2022-03-01 上汽通用五菱汽车股份有限公司 Vehicle control method, device, vehicle and computer readable storage medium
CN114290986A (en) * 2021-11-25 2022-04-08 合众新能源汽车有限公司 Night driving assisting method and device
CN114274954A (en) * 2021-12-16 2022-04-05 上汽大众汽车有限公司 Vehicle active pipe connecting system and method combining vehicle internal and external sensing
CN114506265A (en) * 2022-02-18 2022-05-17 东风汽车集团股份有限公司 Human-computer interaction control method and device for vehicle and pedestrian
CN114506265B (en) * 2022-02-18 2024-03-26 东风汽车集团股份有限公司 Man-machine interaction control method and device for vehicles and pedestrians

Similar Documents

Publication Publication Date Title
CN111439271A (en) Auxiliary driving method and auxiliary driving equipment based on voice control
CN109747649B (en) Vehicle control system and method based on driver state
US8983750B2 (en) Driving support apparatus for vehicle
CN111137284B (en) Early warning method and early warning device based on driving distraction state
CN110293969B (en) Self-adaptive cruise driving system and method and automobile
CN108189709B (en) Control method of electric automobile brake system and electric automobile
US20230143515A1 (en) Driving assistance apparatus
JP2006199233A (en) Safety control device for vehicle
CN103213550B (en) A kind of braking automobile safety instruction control system and safety instruction control method
CN117302198A (en) Driving support device and vehicle control method
JP5486254B2 (en) Vehicle driving support device
JP2008049917A (en) Automatic stop control device
JP7139889B2 (en) vehicle control system
KR102331882B1 (en) Method and apparatus for controlling an vehicle based on voice recognition
CN202413789U (en) Driving auxiliary device for automobile
CN110077417A (en) A kind of method and system of automobile guideboard acquisition of information prompt
CN113479214B (en) Automatic driving system and application method thereof
CN111845749A (en) Control method and system for automatically driving vehicle
US20230382377A1 (en) Vehicle Guidance System and Method for Operating a Driving Function Depending on the Expected Stopping Duration
US20230406313A1 (en) Vehicle Guidance System and Method for Automated Starting of a Vehicle
CN114655226A (en) Intelligent cabin multi-mode human-computer interaction driver state monitoring and adjusting system
CN111661040B (en) Slow-moving safety system and method at medium and low speed
CN115123207A (en) Driving assistance device and vehicle
CN110667573A (en) Automobile driving state risk perception early warning system and method thereof
CN111516491B (en) Vehicle accelerator control device and control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200724