CN112455461A - Human-vehicle interaction method for automatically driving vehicle and automatically driving system - Google Patents


Info

Publication number
CN112455461A
Authority
CN
China
Prior art keywords
state
brain wave
driver
driving state
product
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910844653.0A
Other languages
Chinese (zh)
Other versions
CN112455461B (en)
Inventor
唐卫东
杨莉莉
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201910844653.0A
Publication of CN112455461A
Application granted
Publication of CN112455461B
Legal status: Active

Classifications

    • B60W50/08 Interaction between the driver and the control system
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • B60W2040/0872 Driver physiology
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 … using optical position detecting means
    • G05D1/0257 … using a radar
    • G05D1/027 … using internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D1/0278 … using satellite positioning signals, e.g. GPS
    • G05D1/0285 … using signals transmitted via a public communication network, e.g. GSM network

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Physics (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application relates to artificial intelligence and provides a human-vehicle interaction method for an autonomous vehicle, comprising the following steps: acquiring a brain wave signal of a driver; determining the driver's brain wave activity state from the brain wave signal; and controlling a driving state of the autonomous vehicle according to the brain wave activity state, the driving state including at least one of a manual driving state and an assisted autonomous driving state. The application provides a human-vehicle interaction method for an autonomous vehicle and an automatic driving system, and aims to improve the user experience of the automatic driving system.

Description

Human-vehicle interaction method for automatically driving vehicle and automatically driving system
Technical Field
The present application relates to the field of automatic driving, and more particularly, to a human-vehicle interaction method for an autonomous vehicle and an automatic driving system.
Background
Artificial intelligence (AI) is the theory, methods, techniques, and application systems that use digital computers, or machines controlled by digital computers, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain the best results. In other words, artificial intelligence is the branch of computer science that attempts to understand the essence of intelligence and to produce new intelligent machines that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines so that the machines have the functions of perception, reasoning, and decision-making. Research in the field of artificial intelligence includes robotics, natural language processing, computer vision, decision and reasoning, human-computer interaction, recommendation and search, AI basic theory, and the like.
Automatic driving is a mainstream application in the field of artificial intelligence. Automatic driving technology relies on the coordination of computer vision, radar, monitoring devices, global positioning systems, and the like, so that a motor vehicle can drive itself without active human operation. Autonomous vehicles use various computing systems to assist in transporting passengers from one location to another. Some autonomous vehicles may require initial or continuous input from an operator, such as a pilot, driver, or passenger. An autonomous vehicle permits the operator to switch from a manual operation mode to an autonomous mode or an intervening mode. Because automatic driving technology does not require a human to drive the motor vehicle, it can in theory effectively avoid human driving errors, reduce the occurrence of traffic accidents, and improve the transportation efficiency of roads. Automatic driving technology therefore receives increasing attention.
To realize automatic driving, the behavior and physical state of the driver must be accurately sensed, and humanized driving-assistance services must be provided to the driver while safety is ensured. Since drivers differ in sex, experience, character, and physical state, correctly identifying the driving state of the driver is important. Only when the driving state of the driver is correctly identified can the driving system appropriately provide the assisted driving functions the driver needs.
Disclosure of Invention
The application provides a human-vehicle interaction method for an autonomous vehicle and an automatic driving system, and aims to provide a suitable assisted driving function for the driver and to improve the user experience of the automatic driving system.
In a first aspect, a method of human-vehicle interaction of an autonomous vehicle is provided, comprising: acquiring brain wave signals of a driver; determining the brain wave activity state of the driver according to the brain wave signal of the driver; controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver, the driving state including at least one of a manual driving state and an assisted autonomous driving state.
Brain wave signals may also be referred to as electroencephalogram (EEG) signals. Optionally, the brain wave signal is a signal obtained after artifact removal processing.
Brain wave activity states may include concentration, distraction, normal driving, emergency response, attention level, and the like. When the brain wave activity state is the attention level, it may be a numerical value: the larger the value, the more concentrated the driver's attention; the smaller the value, the more distracted the driver. Distraction may manifest, for example, as slowed reactions, attending to several objects at the same time, or failing to concentrate.
In the embodiment of the present application, the brain wave signal can reflect the activity state of the brain and the psychological activity of the driver. Brain waves more readily reflect the driver's deep needs and are difficult for the driver to control subjectively, so the brain wave signal can reflect the driver's real driving intention or real driving ability, and a more suitable automatic driving service can be provided to the driver to ensure driving safety. The brain wave signal can also reflect the driver's preferences and habits, which helps provide personalized service to the driver.
With reference to the first aspect, in certain implementations of the first aspect, the determining the brain wave activity state of the driver from the brain wave signal of the driver includes: determining the brain wave activity state of the driver according to the similarity between a first brain wave signal and a second brain wave signal, wherein the first brain wave signal is a brain wave signal acquired in a first time period, the second brain wave signal is a brain wave signal acquired in a second time period, and the time interval between the first time period and the second time period is smaller than a first preset threshold.
In the embodiment of the present application, when the similarity between the first brain wave signal and the second brain wave signal in two different time periods is high, it usually means that the driver's attention is focused; when the similarity is low, it means that the driver's attention has changed, for example through distraction, attention shift, or an emergency road condition. The similarity of the brain wave signals can intuitively reflect changes in the driver's attention, making it easier to judge changes in the driver's driving state so that the driving state of the vehicle can be corrected in time.
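For illustration, the window-similarity check described above can be sketched as follows. The cosine-similarity metric and the 0.8 threshold are illustrative assumptions; the application does not fix a particular similarity measure:

```python
import numpy as np

def attention_changed(first_window: np.ndarray,
                      second_window: np.ndarray,
                      similarity_threshold: float = 0.8) -> bool:
    """Compare two EEG signal windows acquired in nearby time periods.

    Low similarity suggests the driver's attention has changed
    (distraction, attention shift, or an emergency road condition).
    """
    a = first_window.ravel()
    b = second_window.ravel()
    # Cosine similarity between the two windows (assumed metric).
    sim = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sim < similarity_threshold
```

For example, comparing a window with itself yields similarity 1.0 (no change), while comparing it with uncorrelated noise yields a similarity near zero (change detected).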
With reference to the first aspect, in certain implementations of the first aspect, the controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver includes: controlling the driving state of the autonomous vehicle to be the assisted autonomous driving state when the brain wave activity state is lower than a second preset threshold.
For example, the driving state of the autonomous vehicle is the manual driving state when the brain wave activity state is higher than a second preset threshold, and the driving state of the autonomous vehicle is the assisted autonomous driving state when the brain wave activity state is lower than the second preset threshold.
Optionally, the second preset threshold may be obtained by training a neural network model.
In the embodiment of the application, when the driver's attention is focused or the driver is in a normal driving state, the driving state of the autonomous vehicle is the manual driving state, so that the driver can freely control and drive the vehicle; when the driver is distracted or encounters an emergency road condition, the driving state of the autonomous vehicle is the assisted autonomous driving state, so that an automatic driving service can be provided to the driver in time and potential safety hazards are avoided. In addition, different second preset thresholds can be set for different drivers, and different second preset thresholds can also be set for the same driver in different driving time periods, so that personalized services can be provided to the driver.
With reference to the first aspect, in certain implementations of the first aspect, the auxiliary automatic driving state includes a first type auxiliary automatic driving state and a second type auxiliary automatic driving state, the driving state of the autonomous vehicle is the first type auxiliary automatic driving state when the brain wave activity state is higher than a third preset threshold, the driving state of the autonomous vehicle is the second type auxiliary automatic driving state when the brain wave activity state is lower than the third preset threshold, the automation level of the second type auxiliary automatic driving state is higher than the automation level of the first type auxiliary automatic driving state, and the third preset threshold is lower than the second preset threshold.
In the embodiment of the application, under the condition that the driver is distracted or encounters an emergency road condition, the type of the auxiliary automatic driving state can be judged according to the brain wave activity state of the driver, so that the proper automatic driving service is provided for the driver, the potential safety hazard is avoided, and the actual requirements of the driver are better met. In addition, different third preset thresholds can be set for different drivers, and different third preset thresholds can also be set for the same driver in different driving time periods, so that personalized service can be provided for the driver.
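For illustration, the threshold logic of the paragraphs above can be sketched as follows; the concrete threshold values are placeholder assumptions, since the application notes they may instead be obtained, for example, by training a neural network model:

```python
MANUAL = "manual"
ASSIST_TYPE_1 = "assisted_type_1"  # lower automation level
ASSIST_TYPE_2 = "assisted_type_2"  # higher automation level

def select_driving_state(activity_level: float,
                         second_threshold: float = 0.7,
                         third_threshold: float = 0.4) -> str:
    """Map a brain wave activity level (attention level) to a driving state.

    The third preset threshold must be lower than the second preset
    threshold, as stated in the description.
    """
    assert third_threshold < second_threshold
    if activity_level > second_threshold:
        return MANUAL          # attention focused: manual driving state
    if activity_level > third_threshold:
        return ASSIST_TYPE_1   # mildly distracted: first-type assistance
    return ASSIST_TYPE_2       # strongly distracted: second-type assistance
```

Per-driver or per-time-period thresholds would simply be passed in place of the defaults.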
With reference to the first aspect, in certain implementations of the first aspect, prior to the controlling the driving state of the autonomous vehicle according to the brain wave activity state of the driver, the method further comprises: acquiring eye data of the driver; determining the eye activity state of the driver according to the eye data; the controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver includes: and controlling the driving state of the automatic driving vehicle according to the brain wave activity state and the eye activity state.
In the embodiment of the present application, since the eye activity state is further considered, it is easier to accurately recognize the driving state of the driver than when only the brain wave activity state is considered, thereby providing a suitable driving assistance service to the driver.
With reference to the first aspect, in certain implementations of the first aspect, the controlling a driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state includes: controlling the driving state of the autonomous vehicle to be the assisted autonomous driving state when the sum of the first product and the second product is smaller than the fourth preset threshold, wherein the first product is the product of a first weight value and the brain wave activity state, the second product is the product of a second weight value and the eye activity state, and the first weight value is larger than the second weight value.
For example, in a case where a sum of the first product and the second product is greater than a fourth preset threshold, the driving state of the autonomous vehicle is the manual driving state; the driving state of the autonomous vehicle is the assisted autonomous driving state, when a sum of the first product and the second product is less than the fourth preset threshold.
In the embodiment of the application, the brain wave activity state can directly reflect the driver's thinking and psychological state, whereas the eye activity state reflects them only indirectly, or not at all. By setting the weight value corresponding to the brain wave activity state to be greater than the weight value corresponding to the eye activity state, the determination of the driver's driving state depends relatively more on the brain wave activity state, which ensures the stability of that determination. In addition, different fourth preset thresholds, first weight values, and second weight values can be set for different drivers, and also for the same driver in different driving time periods, so that personalized services can be provided to the driver.
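For illustration, the weighted combination described above can be sketched as follows; the weight values and the threshold are placeholder assumptions, and per-driver values would be configured in practice:

```python
def fuse_two_states(brain_state: float, eye_state: float,
                    w_brain: float = 0.7, w_eye: float = 0.3,
                    fourth_threshold: float = 0.6) -> str:
    """Combine brain wave and eye activity states into a driving decision.

    The first weight value (brain wave) must exceed the second weight
    value (eye), per the description.
    """
    assert w_brain > w_eye
    # Sum of the first product and the second product.
    score = w_brain * brain_state + w_eye * eye_state
    return "manual" if score > fourth_threshold else "assisted"
```

A focused driver with steady eye activity stays in the manual driving state; a low brain wave activity state pulls the score below the threshold even when eye activity looks normal, triggering assisted driving.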
With reference to the first aspect, in certain implementations of the first aspect, prior to the controlling the driving state of the autonomous vehicle according to the brain wave activity state of the driver, the method further comprises: acquiring biological characteristic data of the driver, wherein the biological characteristic data comprises one or more of pulse data, heartbeat data and blood pressure data; determining a biometric activity state of the driver according to the biometric data; the controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver includes: controlling a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
In the embodiment of the present application, since the biometric activity state is further considered, it is easier to accurately recognize the driver's driving state than when only the brain wave activity state is considered, thereby providing a suitable driving-assistance service to the driver.
With reference to the first aspect, in certain implementations of the first aspect, the determining a biometric activity state of the driver from the biometric data includes: determining the biological feature activity state of the driver according to the similarity between first biological feature data and second biological feature data, wherein the first biological feature data are obtained in a third time period, the second biological feature data are obtained in a fourth time period, and the time interval between the third time period and the fourth time period is smaller than a fifth preset threshold value.
In the embodiment of the application, when the similarity between the first biometric data and the second biometric data in two different time periods is high, the driver's driving state is stable, or the driver is in a normal driving state; when the similarity is low, it means that the driver's driving state has changed, for example because of a health problem or an emergency road condition. The similarity of the biometric data can intuitively reflect changes in the driver's physical state, making it easier to judge changes in the driver's driving state so that the driving state of the vehicle can be corrected in time.
Optionally, a time interval between the first time period and the third time period or the fourth time period is smaller than an eighth preset threshold, and/or a time interval between the second time period and the third time period or the fourth time period is smaller than a ninth preset threshold.
In the embodiment of the application, the time interval between acquiring the brain wave signal and acquiring the biometric data should be as small as possible, so that the biometric activity state and the brain wave activity state remain comparable and a more appropriate driving service can be provided to the driver.
With reference to the first aspect, in certain implementations of the first aspect, the controlling a driving state of the autonomous vehicle based on the brain wave activity state and the biometric activity state includes: controlling the driving state of the autonomous vehicle to be the assisted autonomous driving state, in case the sum of the first product and the third product is smaller than the sixth preset threshold; wherein the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biological characteristic activity state, and the first weight value is greater than the third weight value.
For example, in a case where a sum of the first product and the third product is greater than a sixth preset threshold, the driving state of the autonomous vehicle is the manual driving state; and in the case that the sum of the first product and the third product is less than the sixth preset threshold, the driving state of the autonomous vehicle is the assisted autonomous driving state.
In the embodiment of the application, the brain wave activity state can directly reflect the driver's thinking and psychological state, whereas the biometric activity state reflects them only indirectly, or not at all. By setting the weight value corresponding to the brain wave activity state to be greater than the weight value corresponding to the biometric activity state, the determination of the driver's driving state depends relatively more on the brain wave activity state, which ensures the stability of that determination. In addition, different sixth preset thresholds, first weight values, and third weight values can be set for different drivers, and also for the same driver in different driving time periods, so that personalized services can be provided to the driver.
With reference to the first aspect, in certain implementations of the first aspect, prior to the controlling a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state, the method further comprises: acquiring eye data of the driver; determining the eye activity state of the driver according to the eye data; controlling a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state, comprising: controlling the driving state of the autonomous vehicle to be the auxiliary autonomous driving state when the sum of a first product, a second product and a third product is smaller than the seventh preset threshold, wherein the first product is the product of a first weight value and the brain wave activity state, the second product is the product of a second weight value and the eye activity state, the third product is the product of a third weight value and the biological feature activity state, the first weight value is larger than the second weight value, and the first weight value is larger than the third weight value.
Optionally, the controlling the driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state includes: and controlling the driving state of the automatic driving vehicle to be the manual driving state under the condition that the sum of the first product, the second product and the third product is greater than a seventh preset threshold value.
In the embodiment of the application, since the eye activity state and the biometric activity state are further considered, it is easier to accurately recognize the driver's driving state than when only the brain wave activity state is considered, thereby providing a suitable driving-assistance service to the driver. Because the brain wave activity state can directly reflect the driver's thinking and psychological state, whereas the eye activity state and the biometric activity state reflect them only indirectly, or not at all, setting the weight value corresponding to the brain wave activity state to be greater than the weight values corresponding to the eye activity state and the biometric activity state makes the determination of the driver's driving state depend relatively more on the brain wave activity state, which ensures the stability of that determination. In addition, different seventh preset thresholds, first weight values, second weight values, and third weight values can be set for different drivers, and also for the same driver in different driving time periods, so that personalized services can be provided to the driver.
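The same weighting pattern, extended to all three activity states as described above, can be sketched as follows; the weight values and the threshold are again placeholder assumptions:

```python
def fuse_three_states(brain_state: float, eye_state: float,
                      biometric_state: float,
                      w_brain: float = 0.6, w_eye: float = 0.2,
                      w_bio: float = 0.2,
                      seventh_threshold: float = 0.55) -> str:
    """Combine brain wave, eye, and biometric activity states.

    The first weight value (brain wave) exceeds both the second (eye)
    and the third (biometric), per the description.
    """
    assert w_brain > w_eye and w_brain > w_bio
    # Sum of the first, second, and third products.
    score = (w_brain * brain_state
             + w_eye * eye_state
             + w_bio * biometric_state)
    return "manual" if score > seventh_threshold else "assisted"
```

Because the brain wave weight dominates, a drop in the brain wave activity state alone is enough to switch the vehicle into the assisted autonomous driving state.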
With reference to the first aspect, in certain implementations of the first aspect, the acquiring brain wave signals of the driver includes: receiving a brain wave signal collected and sent by an in-ear earplug.
In the embodiment of the application, the in-ear brain wave collection device has a compact structure, is easy to wear, and is highly comfortable, which helps improve the driver's experience.
In a second aspect, there is provided an autonomous driving system comprising: the brain wave acquisition device is used for acquiring brain wave signals of a driver; the driver state processing device is used for determining the brain wave activity state of the driver according to the brain wave signal of the driver; and the vehicle state processing device is used for controlling the driving state of the automatic driving vehicle according to the brain wave activity state of the driver, and the driving state comprises at least one of a manual driving state and an auxiliary automatic driving state.
With reference to the second aspect, in certain implementation manners of the second aspect, the driver state processing device is specifically configured to determine the brain wave activity state of the driver according to a similarity between a first brain wave signal and a second brain wave signal, where the first brain wave signal is a brain wave signal acquired in a first time period, the second brain wave signal is a brain wave signal acquired in a second time period, and a time interval between the first time period and the second time period is smaller than a first preset threshold.
With reference to the second aspect, in certain implementations of the second aspect, the vehicle state processing device is specifically configured to, in a case where the brain wave activity state is lower than the second preset threshold, control the driving state of the autonomous vehicle to be the assisted autonomous driving state.
With reference to the second aspect, in certain implementations of the second aspect, the auxiliary automatic driving states include a first type auxiliary automatic driving state and a second type auxiliary automatic driving state, the driving state of the autonomous vehicle is the first type auxiliary automatic driving state when the brain wave activity state is higher than a third preset threshold, the driving state of the autonomous vehicle is the second type auxiliary automatic driving state when the brain wave activity state is lower than the third preset threshold, the automation level of the second type auxiliary automatic driving state is higher than the automation level of the first type auxiliary automatic driving state, and the third preset threshold is lower than the second preset threshold.
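The tiered state selection described in the two implementations above can be sketched as below. The numeric threshold values are hypothetical; the embodiment requires only that the third preset threshold be lower than the second.

```python
# Hypothetical values; the embodiment only requires
# THIRD_PRESET_THRESHOLD < SECOND_PRESET_THRESHOLD.
SECOND_PRESET_THRESHOLD = 0.6
THIRD_PRESET_THRESHOLD = 0.3

def select_driving_state(brain_wave_activity_state):
    """Map a normalized brain wave activity state to a driving state."""
    if brain_wave_activity_state >= SECOND_PRESET_THRESHOLD:
        return "manual"            # attentive driver keeps full control
    if brain_wave_activity_state >= THIRD_PRESET_THRESHOLD:
        return "assisted_type_1"   # mildly inattentive: lower automation
    return "assisted_type_2"       # strongly inattentive: higher automation
```

The state names are placeholders for the manual, first-type assisted, and second-type assisted driving states named in the text.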
With reference to the second aspect, in certain implementations of the second aspect, the automatic driving system further includes: the eye data acquisition device is used for acquiring the eye data of the driver; the driver state processing device is further used for determining the eye activity state of the driver according to the eye data; the vehicle state processing device is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state.
With reference to the second aspect, in certain implementations of the second aspect, the vehicle state processing device is specifically configured to, in a case where a sum of a first product and a second product is less than a fourth preset threshold, control the driving state of the autonomous vehicle to be the assisted autonomous driving state, where the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, and the first weight value is greater than the second weight value.
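This weighted fusion of brain wave and eye activity can be sketched as follows. The concrete weight and threshold values are assumptions, constrained only by the stated requirement that the first weight exceed the second.

```python
# Hypothetical values; the embodiment only requires FIRST_WEIGHT > SECOND_WEIGHT.
FIRST_WEIGHT = 0.7             # brain wave activity weight
SECOND_WEIGHT = 0.3            # eye activity weight
FOURTH_PRESET_THRESHOLD = 0.5

def fused_driving_state(brain_wave_activity_state, eye_activity_state):
    """Select the driving state from a weighted sum of the two states."""
    score = (FIRST_WEIGHT * brain_wave_activity_state
             + SECOND_WEIGHT * eye_activity_state)
    return "assisted" if score < FOURTH_PRESET_THRESHOLD else "manual"
```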
With reference to the second aspect, in certain implementations of the second aspect, the automatic driving system further includes: the biological characteristic data acquisition device is used for acquiring biological characteristic data of the driver, and the biological characteristic data comprises one or more of pulse data, heartbeat data and blood pressure data; the driver state processing device is further used for determining the biological characteristic activity state of the driver according to the biological characteristic data; the vehicle state processing device is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
With reference to the second aspect, in certain implementations of the second aspect, the driver state processing device is specifically configured to determine the biometric activity state of the driver according to a similarity between first biometric data and second biometric data, where the first biometric data is biometric data acquired in a third time period, the second biometric data is biometric data acquired in a fourth time period, and a time interval between the third time period and the fourth time period is smaller than a fifth preset threshold.
With reference to the second aspect, in certain implementations of the second aspect, the vehicle state processing device is specifically configured to, in a case where a sum of a first product and a third product is less than a sixth preset threshold, control the driving state of the autonomous vehicle to be the assisted autonomous driving state, where the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biological characteristic activity state, and the first weight value is greater than the third weight value.
With reference to the second aspect, in certain implementations of the second aspect, the automatic driving system further includes: the eye data acquisition device is used for acquiring the eye data of the driver; the driver state processing device is further used for determining the eye activity state of the driver according to the eye data; the vehicle state processing device is specifically configured to, when a sum of a first product, a second product, and a third product is smaller than a seventh preset threshold, control a driving state of the autonomous vehicle to be the assisted autonomous driving state, where the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, the third product is a product of a third weight value and the biological characteristic activity state, the first weight value is greater than the second weight value, and the first weight value is greater than the third weight value.
Optionally, the vehicle state processing device is specifically configured to control the driving state of the autonomous vehicle to be the manual driving state when a sum of the first product, the second product, and the third product is greater than a seventh preset threshold.
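Both branches of the three-factor decision (the assisted case of the preceding implementation and the optional manual case above) can be sketched together. The concrete weights and threshold are hypothetical, subject only to the brain wave weight being the largest.

```python
# Hypothetical values; the embodiment only requires that FIRST_WEIGHT
# exceed both SECOND_WEIGHT and THIRD_WEIGHT.
FIRST_WEIGHT = 0.5              # brain wave activity (largest weight)
SECOND_WEIGHT = 0.3             # eye activity
THIRD_WEIGHT = 0.2              # biometric activity
SEVENTH_PRESET_THRESHOLD = 0.5

def three_factor_driving_state(brain, eye, biometric):
    """Select the driving state from a weighted sum of three activity states."""
    score = (FIRST_WEIGHT * brain + SECOND_WEIGHT * eye
             + THIRD_WEIGHT * biometric)
    if score < SEVENTH_PRESET_THRESHOLD:
        return "assisted"   # low combined activity: engage assisted driving
    return "manual"         # high combined activity: driver stays in control
```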
With reference to the second aspect, in certain implementations of the second aspect, the brain wave acquisition device is an in-ear earbud.
In a third aspect, there is provided a driving state processing apparatus including: the acquisition module is used for acquiring brain wave signals of a driver; and the processing module is used for determining the brain wave activity state of the driver according to the brain wave signal of the driver and controlling the driving state of the automatic driving vehicle according to the brain wave activity state of the driver, wherein the driving state comprises at least one of a manual driving state and an auxiliary automatic driving state.
With reference to the third aspect, in certain implementations of the third aspect, the processing module is specifically configured to determine the brain wave activity state of the driver according to a similarity between a first brain wave signal and a second brain wave signal, where the first brain wave signal is a brain wave signal acquired in a first time period, the second brain wave signal is a brain wave signal acquired in a second time period, and a time interval between the first time period and the second time period is smaller than a first preset threshold.
With reference to the third aspect, in certain implementations of the third aspect, the processing module is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state if the brain wave activity state is lower than a second preset threshold.
With reference to the third aspect, in certain implementations of the third aspect, the auxiliary autonomous driving state includes a first type auxiliary autonomous driving state and a second type auxiliary autonomous driving state, the driving state of the autonomous vehicle is the first type auxiliary autonomous driving state in a case where the brain wave activity state is higher than a third preset threshold, the driving state of the autonomous vehicle is the second type auxiliary autonomous driving state in a case where the brain wave activity state is lower than the third preset threshold, the automation level of the second type auxiliary autonomous driving state is higher than the automation level of the first type auxiliary autonomous driving state, and the third preset threshold is lower than the second preset threshold.
With reference to the third aspect, in certain implementations of the third aspect, the obtaining module is further configured to obtain eye data of the driver; the processing module is further used for determining the eye activity state of the driver according to the eye data; the processing module is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state.
With reference to the third aspect, in certain implementations of the third aspect, the processing module is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state if a sum of a first product and a second product is less than a fourth preset threshold, where the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, and the first weight value is greater than the second weight value.
With reference to the third aspect, in certain implementations of the third aspect, the obtaining module is further configured to obtain biometric data of the driver, the biometric data including one or more of pulse data, heartbeat data, and blood pressure data; the processing module is further used for determining the biological characteristic activity state of the driver according to the biological characteristic data; the processing module is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
With reference to the third aspect, in certain implementations of the third aspect, the processing module is specifically configured to determine the biometric activity state of the driver according to a similarity between first biometric data and second biometric data, where the first biometric data is biometric data acquired in a third time period, the second biometric data is biometric data acquired in a fourth time period, and a time interval between the third time period and the fourth time period is smaller than a fifth preset threshold.
With reference to the third aspect, in certain implementations of the third aspect, the processing module is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state if a sum of a first product and a third product is less than a sixth preset threshold, where the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biological characteristic activity state, and the first weight value is greater than the third weight value.
With reference to the third aspect, in certain implementations of the third aspect, the obtaining module is further configured to obtain eye data of the driver; the processing module is further used for determining the eye activity state of the driver according to the eye data; the processing module is specifically configured to, when a sum of a first product, a second product and a third product is smaller than a seventh preset threshold, control a driving state of the autonomous vehicle to be the assisted autonomous driving state, where the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, the third product is a product of a third weight value and the biological characteristic activity state, the first weight value is greater than the second weight value, and the first weight value is greater than the third weight value.
Optionally, the processing module is specifically configured to control the driving state of the autonomous vehicle to be the manual driving state when a sum of the first product, the second product, and the third product is greater than a seventh preset threshold.
With reference to the third aspect, in certain implementations of the third aspect, the obtaining module is specifically configured to receive the brain wave signals collected and transmitted by an in-ear earbud.
In a fourth aspect, a driving state processing apparatus is provided. The apparatus includes a storage medium, which may be a non-volatile storage medium and stores a computer-executable program, and a central processing unit connected to the storage medium and configured to execute the computer-executable program to implement the method of the first aspect or any possible implementation of the first aspect.
In a fifth aspect, an autonomous vehicle is provided, which includes the autonomous driving system according to the second aspect and any one of the possible implementations of the second aspect.
In a sixth aspect, an autonomous vehicle is provided, which includes the driving state processing device according to the third aspect or any one of the possible implementations of the third aspect.
In a seventh aspect, a chip is provided, where the chip includes a processor and a data interface, and the processor reads instructions stored in a memory through the data interface to perform the first aspect or the method in any possible implementation manner of the first aspect.
Optionally, as an implementation manner, the chip may further include a memory, where instructions are stored in the memory, and the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to execute the first aspect or the method in any possible implementation manner of the first aspect.
In an eighth aspect, a computer-readable storage medium is provided that stores program code for execution by a device, the program code comprising instructions for performing the method of the first aspect or any possible implementation manner of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of an autonomous vehicle according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a computer system according to an embodiment of the present application.
Fig. 3 is an application schematic diagram of a cloud-side instruction automatic driving vehicle according to an embodiment of the present application.
Fig. 4 is a schematic flowchart of a human-vehicle interaction method for an autonomous vehicle according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an ear-side wearing device according to an embodiment of the present application.
Fig. 6 is a schematic wearing position diagram of an ear-side wearing device according to an embodiment of the present application.
Fig. 7 is a schematic diagram of an electroencephalogram signal when attention is focused.
Fig. 8 is a schematic diagram of an electroencephalogram signal during distraction.
Fig. 9 is a schematic block diagram of a structure of an automatic driving system according to an embodiment of the present application.
Fig. 10 is a schematic diagram of an in-vehicle environment including an autopilot system according to an embodiment of the present application.
Fig. 11 is a schematic structural block diagram of a driving state processing device according to an embodiment of the present application.
Fig. 12 is a schematic structural block diagram of a driving state processing device according to an embodiment of the present application.
Fig. 13 is a schematic structural block diagram of a driving state processing device according to an embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings.
Fig. 1 is a functional block diagram of a vehicle 100 provided in an embodiment of the present application. In one embodiment, the vehicle 100 is configured in a fully or partially autonomous driving mode. For example, while in the autonomous driving mode, the vehicle 100 may control itself, determine the current state of the vehicle and its surroundings, determine a possible behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the likelihood that the other vehicle will perform the possible behavior, and control the vehicle 100 based on the determined information. While the vehicle 100 is in the autonomous driving mode, the vehicle 100 may be placed into operation without human interaction.
The vehicle 100 may include various subsystems such as a travel system 102, a sensor system 104, a control system 106, one or more peripherals 108, as well as a power supply 110, a computer system 162, and a user interface 116. Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, each of the sub-systems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
The travel system 102 may include components that provide powered motion to the vehicle 100. In one embodiment, the propulsion system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121. The engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine of a gasoline engine and an electric motor, or a hybrid engine of an internal combustion engine and an air compression engine. The engine 118 converts the energy source 119 into mechanical energy.
Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 119 may also provide energy to other systems of the vehicle 100.
The transmission 120 may transmit mechanical power from the engine 118 to the wheels 121. The transmission 120 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 120 may also include other devices, such as a clutch. Wherein the drive shaft may comprise one or more shafts that may be coupled to one or more wheels 121.
The sensor system 104 may include a number of sensors that sense information about the environment surrounding the vehicle 100. For example, the sensor system 104 may include a positioning system 122 (which may be a Global Positioning System (GPS) system, a Beidou system, or other positioning system), an Inertial Measurement Unit (IMU) 124, a radar 126, a laser rangefinder 128, and a camera 130. The sensor system 104 may also include sensors that monitor internal systems of the vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors may be used to detect objects and their corresponding characteristics (position, shape, orientation, speed, etc.). Such detection and identification is a critical function for the safe operation of the autonomous vehicle 100.
The positioning system 122 may be used to estimate the geographic location of the vehicle 100. The IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, IMU 124 may be a combination of an accelerometer and a gyroscope.
The radar 126 may utilize radio signals to sense objects within the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing objects, radar 126 may also be used to sense the speed and/or heading of an object.
The laser rangefinder 128 may utilize laser light to sense objects in the environment in which the vehicle 100 is located. In some embodiments, the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The camera 130 may be used to capture multiple images of the surrounding environment of the vehicle 100. The camera 130 may also be used to capture the facial expression of the driver. The camera 130 may be a still camera or a video camera.
The control system 106 is for controlling the operation of the vehicle 100 and its components. The control system 106 may include various elements including a steering system 132, a throttle 134, a braking unit 136, a sensor fusion algorithm 138, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
The steering system 132 is operable to adjust the heading of the vehicle 100. For example, in one embodiment, the steering system 132 may include a steering wheel system.
The throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100.
The brake unit 136 is used to control the deceleration of the vehicle 100. The brake unit 136 may use friction to slow the wheel 121. In other embodiments, the brake unit 136 may convert the kinetic energy of the wheel 121 into an electric current. The brake unit 136 may take other forms to slow the rotational speed of the wheels 121 to control the speed of the vehicle 100.
The computer vision system 140 may be operable to process and analyze images captured by the camera 130 to identify objects and/or features in the environment surrounding the vehicle 100. The objects and/or features may include traffic signals, road boundaries, and obstacles. The computer vision system 140 may use object recognition algorithms, Structure From Motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 140 may be used to map an environment, track objects, estimate the speed of objects, and so forth. When the camera 130 captures an image of the driver's face, the computer vision system 140 may analyze features of the driver's facial expression, for example, whether the driver's gaze is wandering or whether the driver is talking with others. The computer vision system 140 may use a facial recognition algorithm.
The route control system 142 is used to determine a travel route of the vehicle 100. In some embodiments, the route control system 142 may combine data from the sensor fusion algorithm 138, the positioning system 122, and one or more predetermined maps to determine a travel route for the vehicle 100.
The obstacle avoidance system 144 is used to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the vehicle 100.
Of course, in one example, the control system 106 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
Vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripherals 108. The peripheral devices 108 may include a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and/or speakers 152.
In some embodiments, the peripheral devices 108 provide a means for a user of the vehicle 100 to interact with the user interface 116. For example, the onboard computer 148 may provide information to a user of the vehicle 100. The user interface 116 may also operate the in-vehicle computer 148 to receive user input. The in-vehicle computer 148 may be operated via a touch screen. In other cases, the peripheral devices 108 may provide a means for the vehicle 100 to communicate with other devices located within the vehicle. For example, the microphone 150 may receive audio (e.g., voice commands or other audio input) from a user of the vehicle 100. Similarly, the speaker 152 may output audio to a user of the vehicle 100.
The wireless communication system 146 may communicate wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system 146 may use 3G cellular communication such as Code Division Multiple Access (CDMA), the Global System for Mobile communications (GSM), or the General Packet Radio Service (GPRS); 4G cellular communication such as Long Term Evolution (LTE); or 5G cellular communication. The wireless communication system 146 may communicate with a Wireless Local Area Network (WLAN) using WiFi. In some embodiments, the wireless communication system 146 may utilize an infrared link, Bluetooth, or the like to communicate directly with a device. The wireless communication system 146 may also support other wireless protocols, such as various vehicle communication systems; for example, it may include one or more Dedicated Short Range Communications (DSRC) devices, which may carry public and/or private data communications between vehicles and/or roadside stations.
The power supply 110 may provide power to various components of the vehicle 100. In one embodiment, power source 110 may be a rechargeable lithium ion or lead acid battery. One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100. In some embodiments, the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
Some or all of the functions of the vehicle 100 are controlled by the computer system 162. The computer system 162 may include at least one processor 163, the processor 163 executing instructions 165 stored in a non-transitory computer readable medium, such as the data storage device 114. The computer system 162 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
The processor 163 may be any conventional processor, such as a commercially available Central Processing Unit (CPU). Alternatively, the processor may be a dedicated device such as an ASIC or other hardware-based processor. Although fig. 1 functionally illustrates the processor, memory, and other elements of the computer system 162 in the same block, those of ordinary skill in the art will appreciate that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard disk drive or other storage medium located in a housing different from that of the computer system 162. Thus, references to a processor or computer are to be understood as including references to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only computations related to that component's specific function.
In various aspects described herein, the processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
In some embodiments, the data storage device 114 may include instructions 165 (e.g., program logic), which instructions 165 may be executed by the processor 163 to perform various functions of the vehicle 100, including those described above. The data storage 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the propulsion system 102, the sensor system 104, the control system 106, and the peripherals 108.
In addition to instructions 165, data storage device 114 may also store data such as road maps, route information, the location, direction, speed, and other such vehicle data of the vehicle, as well as other information. Such information may be used by the vehicle 100 and the computer system 162 during operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
The user interface 116 is used to provide information to and receive information from a user of the vehicle 100. Optionally, the user interface 116 may include one or more input/output devices within the collection of peripheral devices 108, such as the wireless communication system 146, the in-vehicle computer 148, the microphone 150, and the speaker 152.
The computer system 162 may control the functions of the vehicle 100 based on inputs received from various subsystems (e.g., the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 162 may utilize input from the control system 106 in order to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 162 is operable to provide control over many aspects of the vehicle 100 and its subsystems.
Alternatively, one or more of the components described above may be mounted or associated separately from the vehicle 100. For example, the data storage device 114 may exist partially or completely separate from the vehicle 100. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example, in an actual application, components in the above modules may be added or deleted according to an actual need, and fig. 1 should not be construed as limiting the embodiment of the present application.
An autonomous automobile traveling on a roadway, such as vehicle 100 above, may identify objects within its surrounding environment to determine an adjustment to the current speed. The object may be another vehicle, a traffic control device, or another type of object. In some examples, each identified object may be considered independently, and the object's characteristics, such as its current speed, acceleration, and distance from the vehicle, may be used to determine the speed to which the autonomous vehicle is to adjust.
Alternatively, the autonomous vehicle 100 or a computing device associated with the autonomous vehicle 100 (e.g., the computer system 162, the computer vision system 140, or the data storage 114 of fig. 1) may predict the behavior of the identified objects based on the characteristics of the identified objects and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.). Optionally, the identified objects' behaviors may depend on one another, so the behavior of a single identified object may also be predicted by considering all of the identified objects together. The vehicle 100 is able to adjust its speed based on the predicted behavior of the identified objects. In other words, the autonomous vehicle is able to determine the stable state to which the vehicle will need to adjust (e.g., accelerate, decelerate, or stop) based on the predicted behavior of the objects. In this process, other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road on which it is traveling, the curvature of the road, the proximity of static and dynamic objects, and so forth.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 100 to cause the autonomous vehicle to follow a given trajectory and/or to maintain a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on the road).
Optionally, the autonomous vehicle 100 or a computing device associated with the autonomous vehicle 100 (e.g., the computer system 162, the computer vision system 140, or the data storage 114 of fig. 1) may predict the driver's attention during driving based on the driver's facial expression (e.g., whether the driver is distracted, whether the driver is talking with others, etc.). The vehicle 100 is able to activate or deactivate its driving assistance functions based on the driver's attention. In other words, the autonomous automobile can determine, based on the driver's attention, that the autonomous driving level of the vehicle is one of L0-L5.
L0: the driver is in full control of the vehicle.
L1: automated systems are sometimes able to assist the driver in completing certain driving tasks.
L2: The automated system can complete some driving tasks, but the driver must monitor the driving environment, complete the remaining tasks, and be ready to take over at any time if a problem occurs. At this level, any misperception or misjudgment by the automated system must be corrected by the driver at any time. L2 can be divided into different usage scenarios by speed and environment, such as low-speed traffic jams on ring roads, fast driving on highways, and automatic parking with the driver in the car.
L3: The automated system can complete certain driving tasks and, in some cases, monitor the driving environment, but the driver must be ready to regain driving control when requested by the automated system. At this level, the driver still cannot sleep or rest deeply.
L4: The automated system can complete driving tasks and monitor the driving environment under certain environments and specific conditions. At this level, within the operational range of automated driving, all driving-related tasks no longer involve the driver and passengers, and responsibility for perceiving the external environment lies with the automated driving system.
L5: the automated system can accomplish all driving tasks under all conditions.
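The mapping from driver attention to one of these automation levels can be sketched as follows. This is an illustrative assumption only: the attention score range, the thresholds, and the rule that lower attention enables more assistance are all hypothetical and are not prescribed by the present application.

```python
# Hypothetical sketch: choosing an automation level (L0-L5) from a numeric
# driver-attention score in [0, 1]. All thresholds are illustrative.

def select_automation_level(attention: float) -> int:
    """Map an attention score in [0, 1] to an automation level 0-5.

    Assumption: a distracted driver (low score) is given more assistance,
    while a fully attentive driver keeps manual control (L0).
    """
    if not 0.0 <= attention <= 1.0:
        raise ValueError("attention must be in [0, 1]")
    # Illustrative bands: lower attention -> higher level of assistance.
    thresholds = [(0.9, 0), (0.7, 1), (0.5, 2), (0.3, 3), (0.1, 4)]
    for lower_bound, level in thresholds:
        if attention >= lower_bound:
            return level
    return 5  # attention below 0.1: full automation requested
```

For example, a score of 0.95 maps to L0 (manual driving), while a score of 0.05 maps to L5.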
The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement park vehicle, construction equipment, a tram, a golf cart, a train, a trolley, etc.; the embodiment of the present invention is not particularly limited in this respect.
Fig. 2 is a schematic diagram of an automatic driving system provided in an embodiment of the present application.
The autopilot system shown in fig. 2 includes a computer system 101. Computer system 101 includes a processor 103 coupled to a system bus 105. The processor 103 may be one or more processors, each of which may include one or more processor cores. A display adapter (video adapter) 107 may drive a display 109, which is coupled to the system bus 105. The system bus 105 is coupled to an input/output (I/O) bus 113 via a bus bridge 111. An I/O interface 115 is coupled to the I/O bus. The I/O interface 115 communicates with various I/O devices, such as an input device 117 (e.g., keyboard, mouse, touch screen, etc.), a media tray 121 (e.g., compact disc read-only memory (CD-ROM), multimedia interface, etc.), a transceiver 123 (which can send and/or receive radio communication signals), a camera 155 (which can capture still and motion digital video images), and an external Universal Serial Bus (USB) interface 125. Optionally, the interface connected to the I/O interface 115 may be a USB interface.
The processor 103 may be any conventional processor, including a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, or a combination thereof. Alternatively, the processor may be a dedicated device such as an Application Specific Integrated Circuit (ASIC). Alternatively, the processor 103 may be a neural network processor or a combination of a neural network processor and a conventional processor as described above.
Optionally, in various embodiments described herein, computer system 101 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle. In other aspects, some processes described herein are performed on a processor disposed within an autonomous vehicle, others being performed by a remote processor, including taking the actions required to perform a single maneuver.
Computer 101 may communicate with software deploying server 149 via network interface 129. The network interface 129 is a hardware network interface, such as a network card. The network 127 may be an external network, such as the internet, or an internal network, such as an ethernet or Virtual Private Network (VPN). Optionally, the network 127 may also be a wireless network, such as a WiFi network, a cellular network, and the like.
A hard drive interface is coupled to system bus 105, and the hard drive interface is connected to a hard disk drive. System memory 135 is coupled to system bus 105. Data running in system memory 135 may include the operating system 137 and application programs 143 of computer system 101.
The operating system includes a shell 139 and a kernel 141. The shell 139 is an interface between the user and the kernel of the operating system. The shell is the outermost layer of the operating system and manages the interaction between the user and the operating system: it waits for user input, interprets the user input to the operating system, and processes the output results of the operating system.
Kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel typically runs processes and provides inter-process communication, CPU time slice management, interrupt handling, memory management, input/output management, and the like.
The application programs 143 include programs related to controlling the automatic driving of a vehicle, such as programs for managing the interaction of an autonomous vehicle with obstacles on the road, programs for controlling the route or speed of an autonomous vehicle, and programs for controlling the interaction of an autonomous vehicle with other autonomous vehicles on the road. Application programs 143 also reside on the system of the software deploying server 149. In one embodiment, computer system 101 may download the application program 143 from the software deploying server 149 when its execution is needed.
For example, the application program 143 may also be a program that controls an autonomous vehicle to avoid collisions with other vehicles and pass safely through intersections.
As another example, the application program 143 may also be a program that controls an autonomous vehicle to activate or deactivate an assisted automatic driving function.
A sensor 153 is associated with computer system 101 and is used to detect the environment surrounding computer system 101. For example, the sensor 153 may detect animals, cars, obstacles, crosswalks, and the like; further, the sensor may detect the environment around such objects, for example, other animals present around an animal, weather conditions, the brightness of the surroundings, and so on. Optionally, if computer system 101 is located on an autonomous vehicle, the sensor may be a camera, an infrared sensor, a chemical detector, a microphone, or the like.
Computer system 162 in FIG. 1 may also receive information from, or transfer information to, other computer systems. Alternatively, sensor data collected from the sensor system 104 of the vehicle 100 may be transferred to another computer for processing of this data.
For example, as shown in fig. 3, data from computer system 312 may be transmitted via a network to cloud-side server 320 for further processing. The networks and intermediate nodes may include various configurations and protocols, including the internet, world wide web, intranets, virtual private networks, wide area networks, local area networks, private networks using one or more company's proprietary communication protocols, ethernet, WiFi, and hypertext transfer protocol (HTTP), as well as various combinations of the foregoing. Such communications may be by any device capable of communicating data to and from other computers, such as modems and wireless interfaces. For example, the attention data of the driver is transmitted to the cloud-side server 320 for further processing, and the cloud-side server may recognize and process the attention data by using various neural network models, and feed the driver's attention recognition result back to the computer system 312, so that the computer system 312 may confirm whether to turn on or off the auxiliary automatic driving function.
In one example, server 320 may comprise a server having multiple computers, such as a load balancing server farm, that exchange information with different nodes of a network for the purpose of receiving, processing, and transmitting data from computer system 312. The server may be configured similar to computer system 312, with processor 330, memory 340, instructions 350, and data 360.
The autopilot system may include several assisted automatic driving functions, such as pre-collision safety braking (PCS), adaptive cruise control (ACC), lane keeping assistance (LKA), cross traffic alert (CTA), rear cross traffic alert (RCTA), blind spot warning (BSW), turn-off vehicle alert, and traffic jam assistance (TJA).
Different drivers usually have different driving habits or driving states because they differ in intelligence, reaction speed, physical state, and character. For example, a driver may complete reverse parking using the vehicle's rear field of view, a rearview mirror, or a dashboard camera, switching back and forth among several views during the maneuver; different drivers do this differently, so it is difficult to accurately infer a driver's psychological activity or thinking state from actions, gaze, and the like. For example, some drivers have a high driving skill level and can deal with complicated traffic problems, while others have a low skill level and need to drive slowly and carefully. That is, even on the same road conditions, different drivers may exhibit different driving responses: some react quickly and some react slowly. In short, a driver's intelligence, reaction speed, physical state, and character greatly affect his or her real driving state, which makes that state difficult to obtain. If the driver's real inner activity or real driving state cannot be accurately sensed, an automatic driving service with a good user experience cannot be provided.
Therefore, the present application provides a human-vehicle interaction method for an autonomous vehicle, which processes a driver's brain wave data to identify the driver's driving state and thereby provide an appropriate driving assistance function. The method 400 shown in fig. 4 may be performed by the autonomous vehicle shown in fig. 1, the autonomous driving system shown in fig. 2, or the server 320 shown in fig. 3.
401, a driver's brain wave signal is acquired.
Brain wave signals, which may also be referred to as electroencephalogram (EEG) signals, are external manifestations of brain activity; different brain activities appear as brain waves with different characteristics. By monitoring brain waves, the brain activity state of the driver can be monitored. Brain waves are the overall reflection, on the surface of the cerebral cortex or scalp, of the electrophysiological activity of brain neurons. They contain a large amount of physiological and pathological information and can provide a diagnostic basis for brain diseases in clinical medicine. During ordinary human activity, α, β, γ, and θ brain waves are mainly generated.
θ wave: the frequency is distributed between 4 Hz and 7 Hz and the amplitude between 20 μV and 40 μV. It is a slow wave that mainly appears in the occipital and parieto-occipital regions, symmetrically on the left and right sides. It can generally be detected when a person is drowsy or sleepy, and it is commonly connected with a person's psychological state. This wave occurs when the central nervous system is in an inhibited state, usually during depressed mood, frustration, or drowsiness.
Delta wave: the frequency is distributed between 1 Hz and 4 Hz and the amplitude between 20 μV and 200 μV. It is prominent in the parietal and occipital regions during infancy or when intellectual development is immature, and, like the θ wave, it is a slow wave. Normally, the δ wave appears only in states such as cerebral hypoxia, deep sleep, or the presence of brain lesions.
Alpha wave: the frequency is distributed between 8 Hz and 12 Hz and the amplitude between 25 μV and 75 μV. It appears mainly in the occipital region and remains essentially synchronized bilaterally; it is the basic rhythm of a normal human EEG. This wave appears more clearly in a thinking or resting state, and it disappears, replaced by the β wave, when the individual engages in targeted activity, opens the eyes, or receives other stimuli.
Beta wave: the frequency is distributed between 14 Hz and 30 Hz and the amplitude is about half that of the δ wave. It appears mainly in the frontal and central regions, and its presence clearly indicates excitation of the cerebral cortex; it can appear in the individual's waking state and during the drowsiness stage.
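As a minimal sketch of the four frequency bands described above, the following function classifies a dominant EEG frequency into a band name. How the gaps between the stated ranges (e.g., 7-8 Hz and 12-14 Hz) are assigned is an assumption made here for illustration.

```python
# Minimal sketch: classify a dominant EEG frequency into the bands described
# in the text (delta 1-4 Hz, theta 4-7 Hz, alpha 8-12 Hz, beta 14-30 Hz).
# Boundary handling for the gaps between ranges is an assumption.

def classify_band(freq_hz: float) -> str:
    """Return the EEG band name for a dominant frequency in Hz."""
    if 1 <= freq_hz < 4:
        return "delta"
    if 4 <= freq_hz < 8:   # text gives 4-7 Hz; 7-8 Hz assigned here by assumption
        return "theta"
    if 8 <= freq_hz <= 12:
        return "alpha"
    if 14 <= freq_hz <= 30:
        return "beta"
    return "other"
```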
In one example, brain waves may be acquired separately at positions such as the occipital, parietal, temporal, and frontal regions. For example, a headcap worn on the driver's head contains a plurality of brain wave acquisition modules; the module corresponding to each position measures the brain waves there, so that the driver's α, β, γ, and θ brain waves can be obtained.
In one example, the driver's brain waves may be measured by an ear-worn device. In the embodiments of the present application, the ear side refers to the regions on and near the human ear where a bioelectrical signal can be measured, such as the inside of the ear canal, the auricle, the auricular sulcus, the back of the ear, and the periphery of the ear. The ear-side wearable device may take various forms, for example an earphone or an earplug. Fig. 5 shows an in-ear earplug for measuring brain waves; the earplug comprises an earplug body 501, a flexible electrode carrier 502, and a plurality of surface flexible electrodes 503. The flexible electrode carrier 502 provides a support with sufficient elasticity to ensure that the plurality of flexible electrodes 503 attached to its surface fit closely against the user's ear-side surface, thereby ensuring stable collection of the user's brain wave signals. Section 510 illustratively shows a configuration of the surface flexible electrodes 503, including biosensing flexible electrodes 503A and 503B and a grounded common flexible electrode 503C, distributed at equal 120° angles; 504 is the earplug. In other possible embodiments, there may be only 1 or 2 biosensing flexible electrodes 503 attached to the surface of the flexible electrode carrier 502, while the earplug body 501 is connected to a grounded common flexible electrode. In still other possible embodiments, the grounded common flexible electrode can be realized by an electrode contact on the auricle support. Fig. 6 is a schematic diagram of wearing the earplug-form ear-side device of fig. 5, in which 601 is the user's ear canal, 602 is the in-ear earplug for measuring brain wave signals, 603 is a flexible electrode, and 604 is the user's auricle. As can be seen in fig. 6, when worn, the plurality of flexible electrodes 603 on the surface of the flexible electrode carrier fit closely against the inner surface of the user's ear canal 601 and form a measurement system with the user's head. Although not shown, the ear-side worn device may also include a communication module for receiving or transmitting signals (e.g., brain wave signals).
Optionally, the method further includes artifact removal processing, i.e., removing signals that are not related to the brain wave signal.
Electrical phenomena occur in many places in the human body. The most common is nerve conduction: a neuron receives a stimulus and then transmits bioelectricity to the next neuron. These electrical phenomena accompany human life at all times, and every tiny human expression is closely related to the conduction of nerve currents. Beyond nerve cells, the organs of the human body also generate bioelectrical signals of different kinds and strengths, and such other signals are inevitably mixed into an EEG measurement; since the raw EEG signal cannot be extracted in complete isolation, different bioelectrical signals from the body are essentially always mixed in and have more or less influence. In addition, human facial expressions and limb movements can also strongly affect brain wave signals, for example heartbeat, muscle movement, blinking, deep breathing, and skin perspiration. Temperature differences can likewise change the strength of these noise signals to different degrees; a low ambient temperature, for instance, can cause shivering and trembling in some people, and such large movements interfere with the EEG. As one example of artifact removal, waveforms above 32 Hz (outside the frequency band in which brain wave signals lie) are filtered out.
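One possible way to realize the 32 Hz filtering mentioned above is a simple FFT-based low-pass filter, sketched below. This is an assumed implementation (uniform sampling, hard spectral cutoff), not the method prescribed by the present application.

```python
# Assumed sketch: remove all spectral components above 32 Hz from a
# uniformly sampled signal, via a hard cutoff in the frequency domain.
import numpy as np

def remove_above_32hz(signal: np.ndarray, fs: float) -> np.ndarray:
    """Filter out components above 32 Hz from a signal sampled at fs Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[freqs > 32.0] = 0.0          # zero the out-of-band components
    return np.fft.irfft(spectrum, n=len(signal))
```

For instance, applied to a mixture of a 10 Hz component and a 50 Hz artifact, the filter leaves the 10 Hz component and suppresses the 50 Hz one.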
Optionally, the method further includes digital coding: the brain wave signals are digitally encoded and converted into digital signals.
The artifact removal processing may be performed on the raw bioelectrical signals or on the brain wave signals after feature extraction.
402, determining the brain wave activity state of the driver according to the brain wave signal of the driver.
The brain wave signals may be used to characterize the driver's brain wave activity state. Brain wave activity states may include concentration, distraction, normal driving, emergency response, attention level, and the like. That is, the brain wave signals may reflect whether the driver's attention is in a concentrated or distracted state, and/or they may reflect changes in the driver's brain activity in response to traffic conditions (i.e., the brain wave signals generated during normal driving and during an emergency are different). When the brain wave activity state is expressed as an attention level, it may be a numerical value: the larger the value, the more concentrated the driver's attention; the smaller the value, the more distracted the driver.
Fig. 7 is a schematic diagram of brain wave signals when attention is concentrated, and fig. 8 is a schematic diagram of brain wave signals during distraction. When the driver's attention is more concentrated, the brain wave activity state is the concentrated state or the attention level is high, and the brain wave signal is stable and hardly fluctuates. When the driver's attention is distracted, the brain wave activity state is the distracted state or the attention level is low, and the brain wave signal fluctuates and oscillates. Distraction may manifest, for example, as the driver reacting slowly, paying attention to several objects at once, or failing to concentrate.
In addition, the driver's brain typically devotes different levels of attention to different types of things. The attention a driver gives to traffic conditions usually differs from that given to non-traffic information. Therefore, when the driver switches from observing the traffic conditions to attending to other non-traffic information (for example, switching to a telephone call), the driver's attention fluctuates, which may cause sudden changes, oscillations, etc. in the driver's brain wave signal.
Optionally, the brain wave activity state of the driver is determined according to a similarity between a first brain wave signal and a second brain wave signal, the first brain wave signal is a brain wave signal acquired in a first time period, the second brain wave signal is a brain wave signal acquired in a second time period, and a time interval between the first time period and the second time period is smaller than a first preset threshold.
That is, when the similarity between the first and second brain wave signals from two different periods is high, it usually means that the driver's attention is concentrated or the driver is in a normal driving state; when the similarity is low, it means the driver's attention has changed, for example due to distraction, attention diversion, or an emergency road condition. Moreover, to ensure comparability between the first and second brain wave signals, the time interval between the first period and the second period is smaller than the first preset threshold.
In one case, the time interval between the starting time of the first period and the starting time of the second period is less than a first preset threshold.
In one case, the first period is earlier than the second period, and the time interval between the end time of the first period and the start time of the second period is less than the first preset threshold.
Optionally, the first time period and the second time period may overlap.
Alternatively, the time lengths of the first and second periods may be a unit time length of the brain wave signal processing.
In particular, the first period and the second period cannot be the same period; the two periods are the same if they have the same start time and the same end time. If the first period were the same as the second period, the first and second brain wave signals would be identical, the similarity would be 1 (or the maximum value), and the driver's real brain wave activity state could not be reflected.
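One way to realize such a windowed comparison is sketched below. The Pearson correlation used here is an assumed similarity measure (the application does not fix one at this point), and the checks mirror the constraints above: the two periods must not coincide, and their gap must stay below a preset threshold.

```python
# Assumed sketch: compare the brain wave signal of two nearby time windows
# with a Pearson correlation, enforcing the period constraints from the text.

def window_similarity(signal, start1, start2, length, max_gap):
    """Pearson correlation between signal[start1:start1+length] and
    signal[start2:start2+length]. The window starts must differ (the two
    periods cannot be identical) and by at most max_gap samples."""
    gap = abs(start2 - start1)
    if gap == 0:
        raise ValueError("the two periods must not be identical")
    if gap > max_gap:
        raise ValueError("time interval exceeds the preset threshold")
    w1 = signal[start1:start1 + length]
    w2 = signal[start2:start2 + length]
    mean1, mean2 = sum(w1) / length, sum(w2) / length
    cov = sum((a - mean1) * (b - mean2) for a, b in zip(w1, w2))
    var1 = sum((a - mean1) ** 2 for a in w1)
    var2 = sum((b - mean2) ** 2 for b in w2)
    return cov / ((var1 * var2) ** 0.5)
```

On a periodic signal, windows one full period apart correlate at 1.0, while windows half a period apart correlate at -1.0, illustrating how a similarity drop can flag a change in the signal.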
The following example explains an implementation of determining the driver's brain wave activity state based on the similarity between the first brain wave signal of the first period and the second brain wave signal of the second period. It should be understood that this example is only intended to help those skilled in the art better understand the technical solution of the present application and is not a limitation of it. Many modifications and other embodiments will come to mind to one skilled in the art having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the application is not limited to the specific embodiments disclosed.
A. The first brain wave signal in the first period is intercepted to obtain N signal sampling points u(1), u(2), ..., u(N), where N is an integer greater than 1. The N signal sampling points may be taken at equal time intervals.
B. Based on the N sampling points, m consecutive sampling points are intercepted in turn starting from u(1), u(2), ..., u(N-m+1) to construct N-m+1 m-dimensional vectors X1(1), X1(2), ..., X1(N-m+1), where X1(i) = [u(i), u(i+1), ..., u(i+m-1)], i ≤ N-m+1, i is a positive integer, and m is a positive integer less than N.
C. The distances between X1(i) and X1(1), ..., X1(i-1), X1(i+1), ..., X1(N-m+1) are determined, and the fraction of them that are less than or equal to r is denoted B1(i). That is, B1(i) = (number of X1(j) such that d[X1(i), X1(j)] ≤ r)/(N-m), with i ≠ j and i, j ranging over [1, N-m+1]. The average of B1(i) over all values of i is denoted B1, i.e., B1 = (B1(1) + ... + B1(i) + ... + B1(N-m+1))/(N-m+1).
Here r is a preset value; for example, r can be related to the standard deviation δ of the sampling points and may lie between 0.1δ and 0.3δ.
d[X(i), X(j)] can be defined as d[X(i), X(j)] = max over a of |u_i(a) - u_j(a)|, i ≠ j, where u_i(a) is the a-th element of the vector X(i) and u_j(a) is the a-th element of the vector X(j). That is, the distance between the vectors X(i) and X(j) is determined by the largest of the differences between their corresponding elements. For example, for X(1) = [2, 3, 4, 6] and X(2) = [4, 5, 7, 10], the maximum difference between corresponding elements is |6 - 10| = 4, so d[X(1), X(2)] = 4.
D. Based on the N sampling points, m+1 consecutive sampling points are intercepted in turn starting from u(1), u(2), ..., u(N-m) to construct N-m (m+1)-dimensional vectors X2(1), X2(2), ..., X2(N-m), where X2(i) = [u(i), u(i+1), ..., u(i+m)], i ≤ N-m, and i is a positive integer.
E. The distances between X2(i) and X2(1), ..., X2(i-1), X2(i+1), ..., X2(N-m) are determined, and the fraction of them that are less than or equal to r is denoted B2(i). That is, B2(i) = (number of X2(j) such that d[X2(i), X2(j)] ≤ r)/(N-m-1), with i ≠ j and i, j ranging over [1, N-m]. The average of B2(i) over all values of i is denoted B2, i.e., B2 = (B2(1) + ... + B2(i) + ... + B2(N-m))/(N-m).
F. The sample entropy SampEn of the brain wave signal is determined from B1 and B2: SampEn = -ln(B2/B1). The sample entropy SampEn may serve as the brain wave activity state.
The order of A-F is not fixed; for example, the order between B, C and D, E is not fixed. D and E can be performed before B and C, simultaneously with them, or partially overlapping with them in time.
When m = N-2, X1(1) = [u(1), ..., u(N-2)], X1(2) = [u(2), ..., u(N-1)], and X1(3) = [u(3), ..., u(N)]. X1(1) may be the first brain wave signal characterizing the first period, X1(2) the second brain wave signal characterizing the second period, and X1(3) a third brain wave signal characterizing a third period. B1(1), B1(2), or B1(3) can be used to characterize the similarity between the first and second brain wave signals, between the first and third brain wave signals, or between the second and third brain wave signals.
When m = N-1, X2(1) = [u(1), ..., u(N-1)] and X2(2) = [u(2), ..., u(N)]. X2(1) may be a fourth brain wave signal characterizing a fourth period, and X2(2) a fifth brain wave signal characterizing a fifth period. B2(1) or B2(2) can be used to characterize the similarity between the fourth and fifth brain wave signals.
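Steps A-F above can be sketched directly in code. The following is a plain, unoptimized reading of those steps, using the distance d from step C; the parameter values in the usage note are illustrative.

```python
# Direct sketch of steps A-F: sample entropy SampEn = -ln(B2/B1) with the
# Chebyshev distance d. Pure Python, no dependencies.
import math

def chebyshev(x, y):
    """d[X(i), X(j)]: largest absolute difference of corresponding elements."""
    return max(abs(a - b) for a, b in zip(x, y))

def sample_entropy(u, m, r):
    """SampEn of the sequence u with embedding dimension m and tolerance r."""
    n = len(u)

    def match_fraction(dim):
        # Build the overlapping dim-dimensional vectors (steps B and D).
        vectors = [u[i:i + dim] for i in range(n - dim + 1)]
        k = len(vectors)
        fractions = []
        for i in range(k):
            # Steps C and E: fraction of other vectors within distance r.
            count = sum(1 for j in range(k)
                        if j != i and chebyshev(vectors[i], vectors[j]) <= r)
            fractions.append(count / (k - 1))
        return sum(fractions) / k  # B1 (dim = m) or B2 (dim = m + 1)

    b1 = match_fraction(m)
    b2 = match_fraction(m + 1)
    return -math.log(b2 / b1)      # step F
```

For example, a strictly regular signal such as [0, 1, 0, 1, ...] yields a sample entropy close to zero (illustrative values m = 2, r = 0.2), consistent with the stable, concentrated-attention signals described above.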
Optionally, the brain wave activity state is determined according to the ratio of alpha, beta, gamma, and theta brain waves in the brain wave signals.
Because the states reflected by the alpha, beta, gamma and theta brain waves are different, the brain wave activity state of the driver can be determined according to the occupation ratios of the alpha, beta, gamma and theta brain waves.
For example, when the proportion of α brain waves is high, the driver's attention may be relatively concentrated, or the driver is in a normal driving state. Conversely, when the proportion of α brain waves changes from high to low, the driver may be distracted, or the driver is encountering an emergency road condition.
For another example, the driver's brain waves are collected in a first period to obtain average proportions W1, W2, W3, and W4 for the α, β, γ, and θ waves respectively, and in a second period to obtain average proportions W5, W6, W7, and W8. The similarities between W1 and W5, W2 and W6, W3 and W7, and W4 and W8 are then analyzed to determine whether the driver's brain waves have changed. If no change has occurred, the driver's attention may be relatively concentrated, or the driver is in a normal driving state. If a change has occurred, the driver may be distracted, or the driver is encountering an emergency road condition.
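A minimal sketch of this band-ratio comparison follows; the fixed tolerance used to decide whether a ratio has changed is an assumption for illustration, not a value given by the application.

```python
# Assumed sketch: flag a brain wave change when any band ratio moves by more
# than a tolerance between two periods. The tolerance value is illustrative.

def ratios_changed(first, second, tolerance=0.1):
    """first, second: (alpha, beta, gamma, theta) proportion tuples, e.g.
    (W1, W2, W3, W4) and (W5, W6, W7, W8). Returns True if any band's
    proportion differs by more than tolerance between the two periods."""
    return any(abs(a - b) > tolerance for a, b in zip(first, second))
```

For instance, (0.4, 0.3, 0.1, 0.2) versus (0.38, 0.31, 0.1, 0.21) is treated as unchanged, while a drop of the first (α) proportion from 0.4 to 0.2 is flagged as a change.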
Optionally, the brain wave activity state of the current driver is determined according to the similarity between the current brain wave signal of the driver and the preset brain wave signal.
For example, the driver's current brain wave signal may be periodically compared with a preset brain wave signal. If the comparison results are substantially the same, the driver's attention may be relatively concentrated, or the driver is in a normal driving state; if the difference is large, the driver may be distracted, or the driver is encountering an emergency road condition.
For another example, the proportions of the α, β, γ, and θ waves in the driver's current brain waves may be compared with preset brain wave proportions. If the comparison results are substantially the same, the driver's attention may be relatively concentrated, or the driver is in a normal driving state; if the difference is large, the driver may be distracted, or the driver is encountering an emergency road condition.
The preset brain wave signal can be obtained by collecting the driver's brain wave signal in advance during normal driving. Because some drivers react quickly and some slowly, different drivers' brains usually respond differently to the same emergency. For example, different drivers' attention peaks may differ: some drivers can concentrate more strongly while driving, and others less so. Therefore, with a driver-specific preset brain wave signal, the brain wave activity state can reflect the driver's individuality, which makes it easier to provide the user with a high-quality automatic driving service.
And 403, controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver, wherein the driving state comprises at least one of a manual driving state and an auxiliary autonomous driving state.
The manual driving state may refer to a driving state in which few or no driving assistance functions are activated. In the manual driving state, the driver needs to respond to all or almost all traffic information in time. The manual driving state may be, for example, the driving state of automation level L0 above.
The auxiliary automatic driving state is a driving state in which some or all auxiliary driving functions are activated, providing certain assistance to the user so that the driver does not need to respond to some or all road condition information. The auxiliary automatic driving state may be, for example, one or more of the driving states of automatic driving levels L1-L5 described above.
The auxiliary driving functions may include at least one of pre-collision safety braking (PCS), adaptive cruise control (ACC), lane keeping assistance (LKA), cross traffic alert (CTA), rear cross traffic alert (RCTA), blind spot warning (BSW), vehicle exit warning, and traffic jam assistance (TJA).
The brain wave signal can reflect the thinking and psychological activities of the driver, so that the driving state of the driver can be judged according to the brain wave signal, and a proper automatic driving service is provided for the driver.
In one example, it is possible to analyze the driving state of the driver from the brain wave signal alone and control the driving state of the autonomous vehicle accordingly. That is, the brain wave activity state of the driver is taken as the driver's driving state, and the driving state of the autonomous vehicle is controlled only according to the brain wave activity state.
Optionally, the controlling the driving state of the autonomous vehicle according to the brain wave activity state of the driver includes: controlling the driving state of the autonomous vehicle to be the auxiliary automatic driving state when the brain wave activity state is lower than a second preset threshold.
Controlling the driving state of the autonomous vehicle to be the auxiliary automatic driving state may take several forms. In one mode, when the brain wave activity state remains below the second preset threshold, the current driving state of the autonomous vehicle is maintained as the auxiliary automatic driving state. In another mode, when the brain wave activity state changes from above the second preset threshold to below it, the driving state of the autonomous vehicle is switched from the manual driving state to the auxiliary automatic driving state. In a third mode, the driver's brain wave activity state is detected when the vehicle is started, and if it is below the second preset threshold, the driving state after start-up is determined to be the auxiliary automatic driving state.
Optionally, the controlling the driving state of the autonomous vehicle according to the brain wave activity state of the driver includes: and controlling the driving state of the automatic driving vehicle to be the manual driving state under the condition that the brain wave activity state is higher than the second preset threshold value.
Controlling the driving state of the autonomous vehicle to be the manual driving state may likewise take several forms. In one mode, when the brain wave activity state remains above the second preset threshold, the current driving state of the autonomous vehicle is maintained as the manual driving state. In another mode, when the brain wave activity state changes from below the second preset threshold to above it, the driving state of the autonomous vehicle is switched from the auxiliary automatic driving state to the manual driving state. In a third mode, the driver's brain wave activity state is detected when the vehicle is started, and if it is above the second preset threshold, the driving state after start-up is determined to be the manual driving state.
That is, when the brain wave activity state is higher than a second preset threshold, the driving state of the autonomous vehicle is the manual driving state, and when the brain wave activity state is lower than the second preset threshold, the driving state of the autonomous vehicle is the assisted autonomous driving state.
A higher brain wave activity state may mean that the driver is attentive or in a normal driving state, and a lower brain wave activity state may mean that the driver is distracted, inattentive, or encountering an emergency road condition. Therefore, when the driver's attention is focused or the driver is in a normal driving state, the driving state of the autonomous vehicle is the manual driving state, so that the driver can freely control the vehicle; when the driver is distracted or encounters an emergency road condition, the driving state of the autonomous vehicle is the auxiliary automatic driving state, so that an automatic driving service can be provided to the driver in time and safety hazards can be avoided.
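The second-threshold rule can be sketched in a few lines. The function name and the numeric values are illustrative assumptions; in the patent's scheme the threshold would come from neural network training, as described below.

```python
def select_driving_state(brain_activity, second_threshold):
    """Higher activity means the driver is attentive, so the vehicle
    stays in (or switches to) the manual driving state; lower activity
    triggers the auxiliary automatic driving state."""
    if brain_activity > second_threshold:
        return "manual"
    return "assisted"
```

A usage example: with a trained threshold of 0.5, an activity value of 0.8 keeps the driver in control, while 0.3 hands assistance to the vehicle.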
Optionally, the second preset threshold may be obtained by training a neural network model.
For example, the brain wave signal of the driver and the driving state selected by the driver in the corresponding time period can be input into the neural network model, so as to obtain the trained second preset threshold.
Because some drivers react quickly and others react slowly, different drivers' brains usually respond differently to the same emergency. Therefore, the specific value of the second preset threshold is often different for different drivers.
For example, driver A is relatively more attentive, driver B is relatively less attentive, and driver A responds to complex road conditions in time more easily than driver B. Therefore, the second preset threshold 1 corresponding to driver A should be lower than the second preset threshold 2 corresponding to driver B. That is, driver B more easily satisfies the condition for starting the auxiliary automatic driving service.
For another example, a driver's level of concentration differs between short drives and long drives, so different second preset thresholds can be set for different driving durations to ensure the safety of long-duration driving.
Optionally, the auxiliary automatic driving state includes a first type of auxiliary automatic driving state and a second type of auxiliary automatic driving state. When the brain wave activity state is higher than a third preset threshold, the driving state of the autonomous vehicle is the first type of auxiliary automatic driving state; when the brain wave activity state is lower than the third preset threshold, the driving state of the autonomous vehicle is the second type of auxiliary automatic driving state. The automation level of the second type of auxiliary automatic driving state is higher than that of the first type, and the third preset threshold is lower than the second preset threshold.
That is, when the driver's driving state requires the autonomous vehicle to be controlled in the auxiliary automatic driving state, it is also possible to further determine which driving level the autonomous vehicle should be in (e.g., the L1-L5 levels described above) according to the driver's driving state. For example, the less attentive the driver, the more auxiliary automatic driving services the autonomous vehicle provides.
That the second type of auxiliary automatic driving state has a higher automation level than the first type may mean, for example, that more auxiliary driving functions are activated in the second type than in the first type. For example, when the autonomous vehicle is in the second type of auxiliary automatic driving state, 5 auxiliary driving functions are turned on; when the autonomous vehicle is in the first type of auxiliary automatic driving state, 3 auxiliary driving functions are turned on.
The higher automation level of the second type of auxiliary automatic driving state may also mean, for example, that the auxiliary driving functions activated in the second type include those activated in the first type. For example, when the autonomous vehicle is in the second type of auxiliary automatic driving state, the activated auxiliary driving functions may include pre-collision safety braking, adaptive cruise control, lane keeping assistance, cross traffic alert, rear cross traffic alert, blind spot warning, vehicle exit warning, and traffic jam assistance; when the autonomous vehicle is in the first type of auxiliary automatic driving state, pre-collision safety braking, rear cross traffic alert, blind spot warning, vehicle exit warning, and traffic jam assistance are turned on.
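The tiered arrangement above can be sketched as follows. The specific function sets and the 0.6 / 0.3 threshold values are illustrative assumptions; the patent only requires that the first-type functions be a subset of the second-type functions and that the third preset threshold lie below the second.

```python
FIRST_TYPE_FUNCTIONS = {"pre_collision_braking", "rear_cross_traffic_alert",
                        "blind_spot_warning", "exit_warning",
                        "traffic_jam_assist"}
SECOND_TYPE_FUNCTIONS = FIRST_TYPE_FUNCTIONS | {"adaptive_cruise_control",
                                                "lane_keeping_assist",
                                                "cross_traffic_alert"}

def active_functions(activity, second_threshold=0.6, third_threshold=0.3):
    """The less attentive the driver (lower activity), the more
    auxiliary driving functions are switched on."""
    if activity > second_threshold:
        return set()                  # manual driving state: nothing assisted
    if activity > third_threshold:
        return FIRST_TYPE_FUNCTIONS   # first type: partial assistance
    return SECOND_TYPE_FUNCTIONS      # second type: full assistance
```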
Optionally, the third preset threshold may be obtained by training a neural network model.
In another example, the driving state of the autonomous vehicle is controlled according to both the brain wave activity state and the eye activity state. That is, the driver's driving state can be obtained by jointly considering the brain wave signal and the eye data, and the driving state of the autonomous vehicle can be controlled according to the driver's driving state.
Optionally, before the controlling the driving state of the autonomous vehicle according to the brain wave activity state of the driver, the method further comprises: acquiring eye data of the driver; determining the eye activity state of the driver according to the eye data; the controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver includes: and controlling the driving state of the automatic driving vehicle according to the brain wave activity state and the eye activity state.
The method for acquiring the eye data of the driver can be to shoot a picture of the face or eyes of the driver through a camera and obtain eye image information. The eye data may include eye image information at a plurality of time instants.
The eye activity state mainly includes whether the gaze has shifted, the degree of the shift, the starting and ending directions of the shift, the blinking frequency, the degree of eye fatigue, and the like.
The eye activity state may be a numerical value. When the value is small, the driver's gaze may have shifted noticeably or the eyes may be fatigued, meaning that the driver's driving state is poor and cannot cope with complex traffic conditions; when the value is large, the driver's gaze is probably steady, meaning that the driving state is good and can cope with complex traffic conditions.
One way to determine the driver's eye activity state is to compare eye image information captured at multiple times. For example, first eye image information captured at a first time may be compared with second eye image information captured at a second time; when the similarity between the two is low, the gaze may have shifted.
Another way to determine the eye activity state of the driver may be to input the eye data into a neural network model, and recognize the image through the neural network model, so as to obtain the recognition results of whether the gaze is shifted, the gaze shifting degree, the starting direction and stopping direction of the gaze shifting, the blinking frequency, the eye fatigue degree, and the like.
Compared with the case of considering only the brain wave activity state, the driving state of the driver can be more easily and accurately recognized due to the further consideration of the eye activity state, so that the driver can be provided with appropriate driving assistance service.
In particular, the absence of a shift in gaze does not mean that the driver's attention is focused, nor does the presence of a shift in gaze mean that the driver's attention is not focused. Therefore, the eye activity state may not match the brain wave activity state.
In one possible mode, the brain wave activity state and the eye activity state are regarded as a combination, and the driving state of the autonomous vehicle is controlled according to the driving state corresponding to the combination (the brain wave activity state and the eye activity state).
For example, when the brain wave activity state indicates normal driving and the eye activity state indicates that the gaze has shifted, the driving state of the autonomous vehicle is controlled to be the manual driving state; when the brain wave activity state indicates normal driving and the eye activity state indicates eye fatigue, the driving state is controlled to be the auxiliary automatic driving state; and when the brain wave activity state indicates distraction and the eye activity state indicates that the gaze has not shifted, the driving state is controlled to be the auxiliary automatic driving state.
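The combination rule can be sketched as a lookup table. The state labels, the `("normal", "gaze_steady")` entry, and the safe fallback are illustrative assumptions beyond the three combinations listed above.

```python
# Combination -> driving state, following the examples in the text.
COMBINATION_RULES = {
    ("normal",     "gaze_shifted"):  "manual",
    ("normal",     "eyes_fatigued"): "assisted",
    ("distracted", "gaze_steady"):   "assisted",
    ("normal",     "gaze_steady"):   "manual",    # assumed entry
}

def state_for(brain_state, eye_state):
    # Fall back to the safer assisted state for unlisted combinations.
    return COMBINATION_RULES.get((brain_state, eye_state), "assisted")
```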
Optionally, the controlling the driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state includes: controlling the driving state of the autonomous vehicle to be the assisted autonomous driving state when the sum of the first product and the second product is smaller than the fourth preset threshold, wherein the first product is the product of a first weight value and the brain wave activity state, the second product is the product of a second weight value and the eye activity state, and the first weight value is larger than the second weight value.
That is, in a case where the sum of the first product and the second product is less than the fourth preset threshold, the driving state of the autonomous vehicle is the assisted autonomous driving state.
Optionally, the controlling the driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state includes: controlling the driving state of the autonomous vehicle to be the manual driving state when the sum of the first product and the second product is greater than the fourth preset threshold, wherein the first product is the product of a first weight value and the brain wave activity state, the second product is the product of a second weight value and the eye activity state, and the first weight value is greater than the second weight value.
That is, in a case where the sum of the first product and the second product is greater than a fourth preset threshold, the driving state of the autonomous vehicle is the manual driving state.
The sum of the first product and the second product may reflect the driver's driving state. Because the brain wave activity state directly reflects the driver's thinking and psychological state, while the eye activity state reflects them only indirectly or not at all, setting the weight value of the brain wave activity state greater than that of the eye activity state makes the determination of the driver's driving state rely relatively more on the brain wave activity state.
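The weighted combination can be sketched as follows. The weight values 0.7 / 0.3 and the 0.5 threshold are placeholders; in the patent's scheme they would be obtained by training a neural network.

```python
def combined_driving_state(brain, eye, w_brain=0.7, w_eye=0.3,
                           fourth_threshold=0.5):
    """Weighted sum of the two states; the brain wave weight must
    dominate, per the text."""
    assert w_brain > w_eye
    score = w_brain * brain + w_eye * eye
    return "manual" if score > fourth_threshold else "assisted"
```

Note how the dominance of the brain wave weight plays out: a driver with high brain wave activity but a low eye score (0.9, 0.2) scores 0.69 and stays in manual control, while the reverse combination (0.2, 0.9) scores only 0.41 and triggers assistance.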
Optionally, the fourth preset threshold may be obtained by training a neural network model.
Optionally, the first weight value may be obtained by training a neural network model.
Optionally, the second weight value may be obtained by training a neural network model.
For example, the driver's brain wave signal, the eye data, and the driving state selected by the driver in the corresponding time period may be input into the neural network model to obtain one or more of the trained fourth preset threshold, first weight value, and second weight value.
In particular, when the second weight value is 0, it may be considered that the eye activity state is not considered in determining the driving state of the driver.
Because some drivers react quickly and others react slowly, different drivers' brains usually respond differently to the same emergency. Therefore, the specific value of the fourth preset threshold is often different for different drivers. Given the uniqueness of each driver, the first weight value and the second weight value can also be determined according to the driver's preferences, habits, and the like, in order to improve the user experience.
For example, driver A is relatively more attentive, driver B is relatively less attentive, and driver A responds to complex road conditions in time more easily than driver B. Therefore, the fourth preset threshold 1 corresponding to driver A should be lower than the fourth preset threshold 2 corresponding to driver B. That is, driver B more easily satisfies the condition for starting the auxiliary automatic driving service.
For another example, a driver's level of concentration differs between short drives and long drives, so different fourth preset thresholds can be set for different driving durations to ensure the safety of long-duration driving.
In another example, the driving state of the autonomous vehicle is controlled according to both the brain wave activity state and the biometric activity state. That is, the driver's driving state can be obtained by jointly considering the brain wave signal and the biometric data, and the driving state of the autonomous vehicle can be controlled according to the driver's driving state.
Optionally, before the controlling the driving state of the autonomous vehicle according to the brain wave activity state of the driver, the method further comprises: acquiring biometric data of the driver, wherein the biometric data comprises one or more of pulse data, heartbeat data, and blood pressure data; and determining a biometric activity state of the driver according to the biometric data. The controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver includes: controlling the driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
One way to obtain the biometric data may be by a watch worn by the driver, obtaining biometric data of the driver, such as pulse data and heartbeat data.
Another way to obtain the biometric data may be through a biometric collection device disposed on the driver's seat, which can obtain biometric data of the driver such as pulse data, heartbeat data, and blood pressure data.
The biometric activity status may be used to identify whether the driver is healthy, such as whether the driver has suffered a stroke, breathlessness, heart attack, and the like.
The biometric activity state may be a numerical value. When the value is small, there may be a hidden health risk, meaning that the driver's driving state is poor and cannot cope with complex traffic conditions; when the value is large, the driver is probably healthy, meaning that the driving state is good and can cope with complex traffic conditions.
One way to determine the driver's biometric activity state may be to compare the driver's biometric data to human health thresholds to determine the driver's biometric activity state. That is, when the biometric data of the driver exceeds the health threshold, the driver may have a health hazard.
Another way to determine the driver's biometric activity state may be to compare the biometric data collected during the driver's normal driving with the driver's current biometric data. That is, when the driver's biometric data fluctuates significantly, the driver's driving state is likely to have changed.
Optionally, the determining the biometric activity state of the driver according to the biometric data includes: determining the biological feature activity state of the driver according to the similarity between first biological feature data and second biological feature data, wherein the first biological feature data are obtained in a third time period, the second biological feature data are obtained in a fourth time period, and the time interval between the third time period and the fourth time period is smaller than a fifth preset threshold value.
That is, when the similarity between the first biometric data and the second biometric data from two different periods is high, it usually means that the driver's driving state is stable or the driver is in a normal driving state; when the similarity is low, it means that the driver's driving state may have changed, for example due to a health problem or an emergency road condition. Moreover, to ensure comparability between the first biometric data and the second biometric data, the time interval between the third period and the fourth period is less than a fifth preset threshold.
In one case, a time interval between the start time of the third period and the start time of the fourth period is less than a fifth preset threshold.
In another case, the third period is earlier than the fourth period, and the time interval between the end time of the third period and the start time of the fourth period is less than the fifth preset threshold.
Optionally, the third time period and the fourth time period may overlap.
Alternatively, the duration of the third period and the fourth period may be the unit duration of biometric data collection.
In particular, the third period and the fourth period cannot be the same period. If the third period were the same as the fourth period, the start times and end times of the two periods would be identical, so the first biometric data and the second biometric data would be exactly the same, the similarity would be 1 or the maximum value, and the comparison could not reflect the driver's real biometric activity state.
The method for determining the similarity between the first biological characteristic data and the second biological characteristic data may refer to the method for determining the similarity between the first brain wave signal and the second brain wave signal, and thus, it is not necessary to describe here.
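One possible similarity measure for two biometric sample windows is sketched below. The measure itself (mean relative difference) and the 0.95 threshold are illustrative assumptions; the patent leaves the concrete measure unspecified.

```python
def window_similarity(first, second):
    """Return 1.0 when the two equal-length sample windows (e.g. heart-rate
    readings) are identical; the value decreases as the mean relative
    difference between paired samples grows."""
    diffs = [abs(a - b) / max(abs(a), abs(b), 1e-9)
             for a, b in zip(first, second)]
    return 1.0 - sum(diffs) / len(diffs)

def biometric_state(first, second, threshold=0.95):
    """'stable' suggests a normal driving state; 'changed' suggests a
    possible health problem or an emergency road condition."""
    if window_similarity(first, second) >= threshold:
        return "stable"
    return "changed"
```

For instance, two near-identical heart-rate windows read as stable, while a window containing a sudden spike reads as changed.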
Optionally, a time interval between the first time period and the third time period or the fourth time period is smaller than an eighth preset threshold, and/or a time interval between the second time period and the third time period or the fourth time period is smaller than a ninth preset threshold.
That is, to ensure comparability between the biometric activity state and the brain wave activity state, the time interval between the first period and the third or fourth period is less than the eighth preset threshold, and/or the time interval between the second period and the third or fourth period is less than the ninth preset threshold. In other words, the time interval between any measurement period of the brain wave signal and any measurement period of the biometric data should be sufficiently small.
Optionally, the first time period may overlap with the third time period and/or the fourth time period.
Optionally, the second time period may overlap with the third time period and/or the fourth time period.
Compared with the case of only considering the brain wave activity state, the driving state of the driver is easier to accurately recognize due to the further consideration of the biometric activity state, so that the driver is provided with appropriate driving assistance service.
In particular, the absence of a change in the biometric activity state does not mean that the driver's attention is focused, nor does the presence of a change in the biometric activity state mean that the driver's attention is not focused. Therefore, the biological characteristic activity state may not match the brain wave activity state.
In one possible approach, the brain wave activity state and the biometric activity state are considered as a combination, and the driving state of the autonomous vehicle is controlled according to the driving state corresponding to the combination (brain wave activity state, biometric activity state).
For example, when the brain wave activity state indicates normal driving and the biometric activity state indicates an excessively high heart rate, the driving state of the autonomous vehicle is controlled to be the auxiliary automatic driving state; when the brain wave activity state indicates distraction and the biometric activity state indicates health, the driving state is controlled to be the auxiliary automatic driving state; and when the brain wave activity state indicates normal driving and the biometric activity state indicates health, the driving state is controlled to be the manual driving state.
Optionally, the controlling the driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state includes: controlling the driving state of the autonomous vehicle to be the assisted autonomous driving state, in case the sum of the first product and the third product is smaller than the sixth preset threshold; wherein the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biological characteristic activity state, and the first weight value is greater than the third weight value.
That is, in a case where the sum of the first product and the third product is less than the sixth preset threshold, the driving state of the autonomous vehicle is the assisted autonomous driving state.
Optionally, the controlling the driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state includes: controlling the driving state of the autonomous vehicle to be the manual driving state when the sum of the first product and the third product is greater than the sixth preset threshold; wherein the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biological characteristic activity state, and the first weight value is greater than the third weight value.
That is, in a case where the sum of the first product and the third product is greater than a sixth preset threshold, the driving state of the autonomous vehicle is the manual driving state.
The sum of the first product and the third product may reflect the driver's driving state. Because the brain wave activity state directly reflects the driver's thinking and psychological state, while the biometric activity state reflects them only indirectly or not at all, setting the weight value of the brain wave activity state greater than that of the biometric activity state makes the determination of the driver's driving state rely relatively more on the brain wave activity state.
Optionally, the sixth preset threshold may be obtained by training a neural network model.
Optionally, the third weight value may be obtained by training a neural network model.
For example, the brain wave signal of the driver, the biometric data, and the driving state selected by the driver at the corresponding time period may be input into the neural network model, so as to obtain one or more of the trained sixth preset threshold, the trained first weight value, and the trained third weight value.
In particular, when the third weight value is 0, it may be considered that the biometric activity state is not considered in determining the driving state of the driver.
Because some drivers react quickly and others react slowly, different drivers' brains usually respond differently to the same emergency. Therefore, the specific value of the sixth preset threshold is often different for different drivers. Given the uniqueness of each driver, the first weight value and the third weight value can also be determined according to the driver's preferences, habits, and the like, in order to improve the user experience.
For example, driver A is relatively more attentive, driver B is relatively less attentive, and driver A responds to complex road conditions in time more easily than driver B. Therefore, the sixth preset threshold 1 corresponding to driver A should be lower than the sixth preset threshold 2 corresponding to driver B. That is, driver B more easily satisfies the condition for starting the auxiliary automatic driving service.
For another example, a driver's level of concentration differs between short drives and long drives, so different sixth preset thresholds can be set for different driving durations to ensure the safety of long-duration driving.
In another example, the driving state of the autonomous vehicle is controlled according to the brain wave activity state, the biometric activity state, and the eye activity state together. That is, the driver's driving state can be obtained by jointly considering the brain wave signal, the biometric data, and the eye data, and the driving state of the autonomous vehicle can be controlled according to the driver's driving state.
Optionally, before the controlling the driving state of the autonomous vehicle according to the brain wave activity state of the driver, the method further comprises: acquiring eye data of the driver; determining the eye activity state of the driver according to the eye data; acquiring biometric data of the driver, wherein the biometric data comprises one or more of pulse data, heartbeat data, and blood pressure data; and determining a biometric activity state of the driver according to the biometric data. The controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver includes: controlling the driving state of the autonomous vehicle according to the brain wave activity state, the eye activity state, and the biometric activity state of the driver.
The states of brain wave activity, eye activity and biological characteristics and the methods for obtaining the states of brain wave activity, eye activity and biological characteristics have been described above, and thus, detailed description thereof is omitted.
Because the brain wave activity state, the eye activity state, and the biometric activity state are considered together, the driving state of the driver can be identified more easily and accurately, so that an appropriate driving assistance service can be provided for the driver.
In particular, the absence of a shift in gaze does not mean that the driver's attention is focused, nor does the presence of a shift in gaze mean that it is not. Likewise, the absence of a change in the biometric activity state does not mean that the driver's attention is focused, nor does the presence of such a change mean that it is not. Therefore, the biometric activity state, the eye activity state, and the brain wave activity state may not match one another.
Optionally, when the sum of the first product, the second product, and the third product is greater than a seventh preset threshold, the driving state of the autonomous vehicle is controlled to be the manual driving state; when the sum of the first product, the second product, and the third product is less than the seventh preset threshold, the driving state of the autonomous vehicle is controlled to be the assisted autonomous driving state. Here, the first product is the product of a first weight value and the brain wave activity state, the second product is the product of a second weight value and the eye activity state, and the third product is the product of a third weight value and the biometric activity state, where the first weight value is greater than the second weight value and greater than the third weight value.
The sum of the first product, the second product, and the third product may reflect the driving state of the driver. Because the brain wave activity state can directly reflect the driver's thinking and psychological state, while the biometric activity state and the eye activity state reflect them only indirectly or not at all, the weight value corresponding to the brain wave activity state is set greater than the weight values corresponding to the biometric activity state and the eye activity state, so that the determination of the driver's driving state depends relatively more on the brain wave activity state.
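The weighted decision above can be sketched as follows. The concrete weight and threshold values, and the assumption that each activity state is normalized to [0, 1], are illustrative; only the ordering constraints (the first weight value greater than the second and third) come from the disclosure:

```python
def control_driving_state(brain: float, eye: float, bio: float,
                          w1: float = 0.6, w2: float = 0.2, w3: float = 0.2,
                          seventh_threshold: float = 0.5) -> str:
    """Combine the three activity states into a driving-state decision.

    w1 > w2 and w1 > w3, so the decision depends relatively more on the
    brain wave activity state. Weight and threshold values are assumed.
    """
    assert w1 > w2 and w1 > w3
    # first product + second product + third product
    weighted_sum = w1 * brain + w2 * eye + w3 * bio
    if weighted_sum > seventh_threshold:
        return "manual"           # driver is active enough to drive manually
    return "assisted_autonomous"  # start the assisted automatic driving service
```

For example, an alert driver (all states high) yields a sum above the threshold and stays in the manual driving state, while uniformly low states fall below it and switch the vehicle to the assisted autonomous driving state.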
Optionally, the seventh preset threshold may be obtained by training a neural network model.
In the embodiment of the present application, the brain wave signal may reflect the activity state of the brain and the psychological activity of the driver. Brain waves more readily reflect the driver's underlying needs and are difficult for the driver to control subjectively, so the brain wave signal can reflect the driver's real driving intention or real driving ability, providing more accurate information for determining the driving state of the vehicle and ensuring driving safety. Moreover, the brain wave signal can reflect the driver's preferences and habits, which helps provide personalized service for the driver.
FIG. 9 is a schematic diagram of an autopilot system provided by one embodiment of the present application. It should be understood that the autopilot system 900 shown in fig. 9 is merely an example, and that the autopilot system of embodiments of the present application may include some or all of the modules or elements shown in fig. 9, and may also include other modules or elements not shown in fig. 9.
The autopilot system 900 includes a brain wave acquisition device for acquiring the brain wave signal of the driver.
The autopilot system 900 further comprises driver state processing means for determining the state of brain wave activity of the driver. The autopilot system 900 also includes a brain-computer interface connected between the brain wave collection device and the driver state processing device.
The autopilot system 900 further comprises vehicle state processing means for controlling the driving state of the autonomous vehicle in dependence on the brain wave activity state of the driver.
Optionally, the autopilot system 900 includes driver-operation sensors for detecting operating states of the vehicle throttle, brake, steering wheel, and the like.
Optionally, the autopilot system 900 further includes a driving environment sensor for detecting the environmental state around the vehicle while it is traveling or stopped.
Optionally, the autopilot system 900 further includes a human-machine interface, such as a touch display screen, through which the system interacts with the driver.
Optionally, the automatic driving system 900 further includes a switch operated by the driver to switch the vehicle between the manual driving state and the assisted automatic driving state. The vehicle state processing device may switch the vehicle state in response to the switch.
Optionally, the automatic driving system 900 further includes: the eye data acquisition device is used for acquiring eye data of a driver; the driver state processing device is also used for determining the eye activity state of the driver according to the eye data; the vehicle state processing device is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state.
Alternatively, the ocular data collection device may be an image collection device, such as a camera.
Optionally, the eye data acquisition device may also be used to monitor the driver's expression and limb movements as well as the environment inside the vehicle.
Optionally, the automatic driving system 900 further includes: the biological characteristic data acquisition device is used for acquiring biological characteristic data of the driver, and the biological characteristic data comprises one or more of pulse data, heartbeat data and blood pressure data; the driver state processing device is also used for determining the biological characteristic activity state of the driver according to the biological characteristic data; the vehicle state processing device is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
Optionally, the automatic driving system 900 further includes: the eye data acquisition device is used for acquiring eye data of a driver; the biological characteristic data acquisition device is used for acquiring biological characteristic data of the driver, and the biological characteristic data comprises one or more of pulse data, heartbeat data and blood pressure data; the driver state processing device is also used for determining the eye activity state of the driver according to the eye data; the driver state processing device is also used for determining the biological characteristic activity state of the driver according to the biological characteristic data; the vehicle state processing device is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state, the eye activity state, and the biometric activity state.
Alternatively, the ocular data collection device may be an image collection device, such as a camera.
Optionally, the eye data acquisition device may also be used to monitor the driver's expression and limb movements as well as the environment inside the vehicle.
Fig. 10 is a schematic diagram of an in-vehicle environment, showing an autopilot system and the approximate locations of its modules. The autonomous vehicle is controlled by a driver 1001 or by an assisted automatic driving controller. The driver 1001 wears an in-ear brain wave acquisition device 1002 for extracting the brain wave signal of the driver 1001. The vehicle further includes an image sensor 1003 for monitoring the driver's expression and limb movements as well as the environment inside the vehicle, a touch display screen 1004 for receiving operation information from the driver 1001, a voice device 1005 for receiving the voice of the driver 1001 or playing voice information to the driver 1001, a steering wheel 1006 for controlling the vehicle, and a switch 1007 for switching between the manual driving state and the assisted automatic driving state. Other in-vehicle devices such as the brake and throttle are not individually labeled in the figure.
Fig. 11 is a schematic structural diagram of a driving state processing apparatus according to an embodiment of the present application. It should be understood that the driving state processing apparatus 1100 shown in fig. 11 is merely an example, and the apparatus of embodiments of the present application may also include other modules or units. It should also be understood that the driving state processing apparatus 1100 is capable of performing the steps of the method of fig. 4; to avoid repetition, these are not described in detail here.
As shown in fig. 11, the driving state processing apparatus 1100 may include an obtaining module 1110 and a processing module 1120. The obtaining module 1110 is configured to obtain the brain wave signal of the driver; the processing module 1120 is configured to determine the brain wave activity state of the driver according to the brain wave signal, and is further configured to control the driving state of the autonomous vehicle according to the brain wave activity state of the driver.
Optionally, in a possible implementation manner, the processing module 1120 is specifically configured to determine the brain wave activity state of the driver according to a similarity between a first brain wave signal and a second brain wave signal, where the first brain wave signal is a brain wave signal acquired at a first time interval, the second brain wave signal is a brain wave signal acquired at a second time interval, and a time interval between the first time interval and the second time interval is smaller than a first preset threshold.
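One plausible way to compute such a similarity between two sampled brain wave windows is a cosine similarity over their samples. The mapping from similarity to an activity value (here simply 1 minus the similarity, on the reading that highly similar adjacent windows indicate little change in brain activity) is an assumption for illustration, not stated by the disclosure:

```python
import math

def cosine_similarity(first_window, second_window):
    """Cosine similarity between two equal-length brain wave sample windows."""
    dot = sum(a * b for a, b in zip(first_window, second_window))
    norm1 = math.sqrt(sum(a * a for a in first_window))
    norm2 = math.sqrt(sum(b * b for b in second_window))
    if norm1 == 0.0 or norm2 == 0.0:
        return 0.0
    return dot / (norm1 * norm2)

def brain_wave_activity_state(first_window, second_window):
    """Assumed mapping: highly similar adjacent windows suggest little
    change in brain activity, hence a lower activity value."""
    return 1.0 - cosine_similarity(first_window, second_window)
```

In practice the two windows would be the first and second brain wave signals, acquired in time periods whose separation is below the first preset threshold.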
Optionally, in a possible implementation manner, the processing module 1120 is specifically configured to control the driving state of the autonomous vehicle to be the auxiliary autonomous driving state when the brain wave activity state is lower than the second preset threshold.
Optionally, in a possible implementation manner, the auxiliary automatic driving state includes a first type auxiliary automatic driving state and a second type auxiliary automatic driving state, the driving state of the autonomous vehicle is the first type auxiliary automatic driving state when the brain wave activity state is higher than a third preset threshold, the driving state of the autonomous vehicle is the second type auxiliary automatic driving state when the brain wave activity state is lower than the third preset threshold, the automation level of the second type auxiliary automatic driving state is higher than the automation level of the first type auxiliary automatic driving state, and the third preset threshold is lower than the second preset threshold.
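The two-tier threshold logic restated here can be sketched as below. The concrete threshold values are assumptions; only their ordering (the third preset threshold below the second) comes from the disclosure:

```python
def select_assist_level(brain_activity: float,
                        second_threshold: float = 0.6,
                        third_threshold: float = 0.3) -> str:
    """Map the brain wave activity state to a driving state.

    Below the second threshold the vehicle enters an assisted automatic
    driving state; within that range, activity above the third threshold
    selects the first type (lower automation level) and activity below it
    the second type (higher automation level). Values are illustrative.
    """
    assert third_threshold < second_threshold
    if brain_activity >= second_threshold:
        return "manual"
    if brain_activity > third_threshold:
        return "assisted_type_1"  # lower automation level
    return "assisted_type_2"      # higher automation level
```

Under these assumed thresholds, moderate activity yields the first-type assisted state while very low activity yields the second, more automated, type.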
Optionally, in a possible implementation manner, the obtaining module 1110 is further configured to obtain eye data of the driver; the processing module 1120 is further configured to determine an eye activity state of the driver according to the eye data; the processing module 1120 is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state.
Optionally, in a possible implementation, the processing module 1120 is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when a sum of the first product and the second product is less than the fourth preset threshold, where the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, and the first weight value is greater than the second weight value.
Optionally, in a possible implementation manner, the obtaining module 1110 is further configured to obtain biometric data of the driver, where the biometric data includes one or more of pulse data, heartbeat data, and blood pressure data; the processing module 1120 is further configured to determine a biometric activity state of the driver according to the biometric data; the processing module 1120 is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
Optionally, in a possible implementation manner, the processing module 1120 is specifically configured to determine the biometric activity state of the driver according to a similarity between first biometric data and second biometric data, where the first biometric data is biometric data acquired in a third time period, the second biometric data is biometric data acquired in a fourth time period, and a time interval between the third time period and the fourth time period is smaller than a fifth preset threshold.
Optionally, in a possible implementation manner, the processing module 1120 is specifically configured to control the driving state of the autonomous vehicle to be the auxiliary autonomous driving state when the sum of the first product and the third product is smaller than the sixth preset threshold; wherein the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biological characteristic activity state, and the first weight value is greater than the third weight value.
Optionally, in a possible implementation manner, the obtaining module 1110 is further configured to obtain eye data of the driver; the processing module 1120 is further configured to determine an eye activity state of the driver according to the eye data; the obtaining module 1110 is further configured to obtain biometric data of the driver, where the biometric data includes one or more of pulse data, heartbeat data, and blood pressure data; the processing module 1120 is further configured to determine a biometric activity state of the driver according to the biometric data; the processing module 1120 is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state, the eye activity state, and the biometric activity state.
Optionally, in a possible implementation manner, the processing module 1120 is specifically configured to, in a case that a sum of the first product, the second product, and the third product is less than the seventh preset threshold, set the driving state of the autonomous vehicle to the assisted autonomous driving state; wherein the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, the third product is a product of a third weight value and the biological feature activity state, the first weight value is greater than the second weight value, and the first weight value is greater than the third weight value.
Optionally, in a possible implementation manner, the obtaining module 1110 is specifically configured to receive the brain wave signal collected and transmitted by an in-ear earplug.
It should be understood that the driving state processing device 1100 herein is embodied in the form of functional modules. The term "module" herein may be implemented in software and/or hardware, and is not particularly limited thereto. For example, a "module" may be a software program, a hardware circuit, or a combination of both that implements the functionality described above. The hardware circuitry may include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (e.g., a shared processor, a dedicated processor, or a group of processors) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality.
In one example, a schematic view of the structure of the driving state processing device may be as shown in fig. 12. The driving state processing device 1200 may include a driver state processing device 1210 and a vehicle state processing device 1220. The driving state processing apparatus 1200 may include various modules implemented by software and/or hardware.
For example, the driver state processing device 1210 may be configured to acquire the brain wave signal of the driver and determine the brain wave activity state of the driver according to the brain wave signal; the vehicle state processing device 1220 may be configured to control the driving state of the autonomous vehicle, including at least one of a manual driving state and an assisted autonomous driving state, according to the brain wave activity state of the driver.
Optionally, the driver state processing device 1210 may be further configured to obtain eye data of the driver, and determine an eye activity state of the driver according to the eye data; the vehicle state processing device 1220 may be configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state.
Optionally, the driver state processing device 1210 may be further configured to obtain biometric data of the driver, and determine a biometric activity state of the driver according to the biometric data; the vehicle state processing means 1220 may be configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
Optionally, the driver state processing device 1210 may be further configured to obtain biometric data and eye data of the driver, determine a biometric activity state of the driver according to the biometric data, and determine an eye activity state of the driver according to the eye data; the vehicle state processing device 1220 may be configured to control a driving state of the autonomous vehicle according to the brain wave activity state, the eye activity state, and the biometric activity state.
As an example, the driving state processing apparatus of an autonomous vehicle provided in the embodiments of the present application may be an on-board device of the autonomous vehicle, or may be a chip configured in the on-board device, and may execute the method described in the embodiments of the present application.
Fig. 13 is a schematic block diagram of a device for driving state processing according to an embodiment of the present application. The apparatus 1300 shown in fig. 13 includes a memory 1301, a processor 1302, a communication interface 1303, and a bus 1304. The memory 1301, the processor 1302, and the communication interface 1303 are communicatively connected to each other through a bus 1304.
The memory 1301 may be a read-only memory (ROM), a static memory device, a dynamic memory device, or a random access memory (RAM). The memory 1301 may store a program, and when the program stored in the memory 1301 is executed by the processor 1302, the processor 1302 performs the steps of the human-vehicle interaction method for an autonomous vehicle of the embodiments of the present application, for example, the steps of the embodiment shown in fig. 4.
The processor 1302 may be a general-purpose CPU, a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the human-vehicle interaction method for an autonomous vehicle according to the embodiments of the present application.
The processor 1302 may also be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the method for human-vehicle interaction of an autonomous vehicle according to the embodiment of the present application may be implemented by integrated logic circuits of hardware or instructions in the form of software in the processor 1302.
The processor 1302 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The steps of the method disclosed in connection with the embodiments of the present application may be directly performed by a hardware decoding processor, or performed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EEPROM, or a register. The storage medium is located in the memory 1301, and the processor 1302 reads the information in the memory 1301 and, in combination with its hardware, completes the functions required to be performed by the units included in the driving state processing apparatus of the autonomous vehicle in the embodiments of the present application, or performs the human-vehicle interaction method for an autonomous vehicle in the method embodiments of the present application, for example, the steps/functions of the embodiment shown in fig. 4.
Communication interface 1303 may enable communication between apparatus 1300 and other devices or communication networks using transceiver devices, such as, but not limited to, transceivers.
Bus 1304 may include pathways for communicating information between various components of apparatus 1300 (e.g., memory 1301, processor 1302, communication interface 1303).
It should be understood that the apparatus shown in the embodiment of the present application may be an on-board device in an autonomous vehicle, or may also be a chip configured in the on-board device.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the portion thereof contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (38)

1. A method of human-vehicle interaction for autonomous driving of a vehicle, comprising:
acquiring brain wave signals of a driver;
determining the brain wave activity state of the driver according to the brain wave signal of the driver;
controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver, the driving state including at least one of a manual driving state and an assisted autonomous driving state.
2. The method according to claim 1, wherein the determining the brain wave activity state of the driver from the brain wave signal of the driver comprises:
determining the brain wave activity state of the driver according to the similarity between a first brain wave signal and a second brain wave signal, wherein the first brain wave signal is a brain wave signal acquired in a first time period, the second brain wave signal is a brain wave signal acquired in a second time period, and the time interval between the first time period and the second time period is smaller than a first preset threshold.
3. The method of claim 1 or 2, wherein controlling the driving state of the autonomous vehicle in accordance with the brain wave activity state of the driver comprises:
and controlling the driving state of the automatic driving vehicle to be the auxiliary automatic driving state under the condition that the brain wave activity state is lower than the second preset threshold value.
4. The method according to claim 3, wherein the assisted autonomous driving state comprises a first type assisted autonomous driving state and a second type assisted autonomous driving state, the driving state of the autonomous vehicle being the first type assisted autonomous driving state in case the brain wave activity state is above a third preset threshold, the driving state of the autonomous vehicle being the second type assisted autonomous driving state in case the brain wave activity state is below the third preset threshold, the level of automation of the second type assisted autonomous driving state being higher than the level of automation of the first type assisted autonomous driving state, the third preset threshold being below the second preset threshold.
5. The method according to claim 1 or 2, wherein prior to said controlling the driving state of the autonomous vehicle in accordance with the brain wave activity state of the driver, the method further comprises:
acquiring eye data of the driver;
determining the eye activity state of the driver according to the eye data;
the controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver includes:
and controlling the driving state of the automatic driving vehicle according to the brain wave activity state and the eye activity state.
6. The method of claim 5, wherein controlling the driving state of the autonomous vehicle in accordance with the brain wave activity state and the eye activity state comprises:
controlling the driving state of the autonomous vehicle to the assisted autonomous state in case the sum of the first product and the second product is smaller than the fourth preset threshold,
wherein the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, and the first weight value is greater than the second weight value.
7. The method according to claim 1 or 2, wherein prior to said controlling the driving state of the autonomous vehicle in accordance with the brain wave activity state of the driver, the method further comprises:
acquiring biological characteristic data of the driver, wherein the biological characteristic data comprises one or more of pulse data, heartbeat data and blood pressure data;
determining a biometric activity state of the driver according to the biometric data;
the controlling a driving state of the autonomous vehicle according to the brain wave activity state of the driver includes:
controlling a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
8. The method of claim 7, wherein the determining the driver's biometric activity state from the biometric data comprises:
determining the biological feature activity state of the driver according to the similarity between first biological feature data and second biological feature data, wherein the first biological feature data are obtained in a third time period, the second biological feature data are obtained in a fourth time period, and the time interval between the third time period and the fourth time period is smaller than a fifth preset threshold value.
9. The method of claim 7 or 8, wherein said controlling a driving state of the autonomous vehicle in accordance with the brain wave activity state and the biometric activity state comprises:
controlling the driving state of the autonomous vehicle to be the assisted autonomous driving state, in case the sum of the first product and the third product is smaller than the sixth preset threshold;
wherein the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biological characteristic activity state, and the first weight value is greater than the third weight value.
10. The method according to claim 7 or 8, wherein prior to said controlling a driving state of the autonomous vehicle in accordance with the brain wave activity state and the biometric activity state, the method further comprises:
acquiring eye data of the driver;
determining the eye activity state of the driver according to the eye data;
controlling a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state, comprising:
controlling the driving state of the autonomous vehicle to be the assisted autonomous driving state in a case where a sum of a first product, a second product and a third product is smaller than a seventh preset threshold,
wherein the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, the third product is a product of a third weight value and the biometric activity state, the first weight value is greater than the second weight value, and the first weight value is greater than the third weight value.
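Claims 9 and 10 combine the driver-state signals as a weighted sum compared against a preset threshold, with the brain wave weight dominating. The following is a minimal Python sketch of that decision rule, not the patented implementation: the weight and threshold values are illustrative, and the claims fix only the ordering (the first weight greater than the second and third) while the activity states are assumed here to be normalized scalars in [0, 1].

```python
def select_driving_state(brain_wave, eye=None, biometric=None,
                         w1=0.6, w2=0.2, w3=0.2, threshold=0.5):
    """Weighted-sum decision sketch of claims 9/10 (illustrative values).

    brain_wave, eye, biometric: activity states assumed in [0, 1].
    The claims require the brain wave weight w1 to exceed the eye
    weight w2 and the biometric weight w3.
    """
    assert eye is None or w1 > w2
    assert biometric is None or w1 > w3
    score = w1 * brain_wave
    if eye is not None:
        score += w2 * eye
    if biometric is not None:
        score += w3 * biometric
    # A low combined activity score suggests an inattentive driver,
    # so the vehicle switches to the assisted autonomous driving state.
    return "assisted_autonomous" if score < threshold else "manual"
```

An alert driver (all states near 1) keeps manual control, while low combined activity hands control to the assistance system.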
11. The method according to any one of claims 1 to 10, wherein the acquiring brain wave signals of the driver comprises:
receiving the brain wave signals collected and sent by an in-ear earplug.
12. An autopilot system, comprising:
a brain wave acquisition device configured to acquire brain wave signals of a driver;
a driver state processing device configured to determine a brain wave activity state of the driver according to the brain wave signals of the driver;
and a vehicle state processing device configured to control a driving state of an autonomous vehicle according to the brain wave activity state of the driver, wherein the driving state comprises at least one of a manual driving state and an assisted autonomous driving state.
13. The autopilot system of claim 12, wherein
the driver state processing device is specifically configured to determine the brain wave activity state of the driver according to a similarity between a first brain wave signal and a second brain wave signal, wherein the first brain wave signal is a brain wave signal acquired in a first time period, the second brain wave signal is a brain wave signal acquired in a second time period, and a time interval between the first time period and the second time period is smaller than a first preset threshold.
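Claim 13 (like its method and apparatus counterparts) derives the brain wave activity state from the similarity between two signal windows acquired close together in time. The claims do not name a similarity metric, so the sketch below uses cosine similarity, and the mapping from similarity to an activity score is an illustrative assumption rather than the patented formula:

```python
import math

def window_similarity(first_window, second_window):
    """Cosine similarity between two equal-length signal windows.

    Cosine similarity is one illustrative choice; the claims only
    require 'a similarity' between the first and second signals.
    """
    dot = sum(a * b for a, b in zip(first_window, second_window))
    na = math.sqrt(sum(a * a for a in first_window))
    nb = math.sqrt(sum(b * b for b in second_window))
    return dot / (na * nb) if na and nb else 0.0

def activity_state(first_window, second_window):
    # Illustrative mapping: nearly identical consecutive windows
    # suggest little change in brain activity, hence a low score.
    return 1.0 - abs(window_similarity(first_window, second_window))
```

With this mapping, two identical windows yield an activity state of 0, and dissimilar windows yield a higher score.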
14. The autopilot system of claim 12 or 13,
the vehicle state processing device is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when the brain wave activity state is lower than a second preset threshold.
15. The autopilot system of claim 14, wherein the assisted autonomous driving state comprises a first type assisted autonomous driving state and a second type assisted autonomous driving state; the driving state of the autonomous vehicle is the first type assisted autonomous driving state in a case where the brain wave activity state is higher than a third preset threshold, and is the second type assisted autonomous driving state in a case where the brain wave activity state is lower than the third preset threshold; an automation level of the second type assisted autonomous driving state is higher than an automation level of the first type assisted autonomous driving state; and the third preset threshold is lower than the second preset threshold.
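Claims 14 and 15 describe a two-threshold tiering: assistance engages when brain wave activity falls below the second preset threshold, and the higher-automation second type engages below the (lower) third preset threshold. A minimal sketch, assuming the brain wave activity state is a normalized scalar and the threshold values are illustrative:

```python
def tiered_driving_state(brain_wave_activity,
                         second_threshold=0.6, third_threshold=0.3):
    """Two-threshold tiering sketch of claims 14/15 (illustrative values).

    The claims require the third preset threshold to be lower than
    the second preset threshold.
    """
    assert third_threshold < second_threshold
    if brain_wave_activity >= second_threshold:
        return "manual"
    if brain_wave_activity > third_threshold:
        return "assisted_type_1"   # lower automation level
    return "assisted_type_2"       # higher automation level
```

The less active the driver's brain wave signal appears, the higher the automation level the system selects.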
16. The autopilot system of claim 12 or 13, further comprising:
an eye data acquisition device configured to acquire eye data of the driver;
the driver state processing device is further configured to determine an eye activity state of the driver according to the eye data;
the vehicle state processing device is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state.
17. The autopilot system of claim 16, wherein
the vehicle state processing device is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when a sum of a first product and a second product is smaller than a fourth preset threshold,
wherein the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, and the first weight value is greater than the second weight value.
18. The autopilot system of claim 12 or 13, further comprising:
a biometric data acquisition device configured to acquire biometric data of the driver, wherein the biometric data comprises one or more of pulse data, heartbeat data and blood pressure data;
the driver state processing device is further configured to determine a biometric activity state of the driver according to the biometric data;
the vehicle state processing device is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
19. The autopilot system of claim 18, wherein
the driver state processing device is specifically configured to determine the biometric activity state of the driver according to a similarity between first biometric data and second biometric data, where the first biometric data is biometric data acquired in a third time period, the second biometric data is biometric data acquired in a fourth time period, and a time interval between the third time period and the fourth time period is smaller than a fifth preset threshold.
20. The autopilot system of claim 18 or 19,
the vehicle state processing device is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when a sum of a first product and a third product is smaller than a sixth preset threshold;
wherein the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biometric activity state, and the first weight value is greater than the third weight value.
21. The autopilot system of claim 18 or 19 further comprising:
an eye data acquisition device configured to acquire eye data of the driver;
the driver state processing device is further configured to determine an eye activity state of the driver according to the eye data;
the vehicle state processing device is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when a sum of a first product, a second product and a third product is smaller than a seventh preset threshold,
wherein the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, the third product is a product of a third weight value and the biometric activity state, the first weight value is greater than the second weight value, and the first weight value is greater than the third weight value.
22. The autopilot system of any one of claims 12 to 21, wherein the brain wave acquisition device is an in-ear earplug.
23. A driving state processing apparatus characterized by comprising:
an acquisition module configured to acquire brain wave signals of a driver;
and a processing module configured to determine a brain wave activity state of the driver according to the brain wave signals of the driver and to control a driving state of an autonomous vehicle according to the brain wave activity state of the driver, wherein the driving state comprises at least one of a manual driving state and an assisted autonomous driving state.
24. The driving state processing apparatus according to claim 23,
the processing module is specifically configured to determine a brain wave activity state of the driver according to a similarity between a first brain wave signal and a second brain wave signal, where the first brain wave signal is a brain wave signal acquired in a first time period, the second brain wave signal is a brain wave signal acquired in a second time period, and a time interval between the first time period and the second time period is smaller than a first preset threshold.
25. The driving state processing apparatus according to claim 23 or 24, wherein the processing module is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when the brain wave activity state is lower than a second preset threshold.
26. The driving state processing apparatus according to claim 25, wherein the assisted autonomous driving state includes a first type assisted autonomous driving state and a second type assisted autonomous driving state, the driving state of the autonomous vehicle is the first type assisted autonomous driving state in a case where the brain wave activity state is higher than a third preset threshold, the driving state of the autonomous vehicle is the second type assisted autonomous driving state in a case where the brain wave activity state is lower than the third preset threshold, an automation level of the second type assisted autonomous driving state is higher than an automation level of the first type assisted autonomous driving state, and the third preset threshold is lower than the second preset threshold.
27. The driving state processing apparatus according to claim 23 or 24,
the acquisition module is further configured to acquire eye data of the driver;
the processing module is further configured to determine an eye activity state of the driver according to the eye data;
the processing module is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the eye activity state.
28. The driving state processing apparatus according to claim 27,
the processing module is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when a sum of a first product and a second product is smaller than a fourth preset threshold,
wherein the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, and the first weight value is greater than the second weight value.
29. The driving state processing apparatus according to claim 23 or 24,
the acquisition module is further configured to acquire biometric data of the driver, wherein the biometric data comprises one or more of pulse data, heartbeat data and blood pressure data;
the processing module is further configured to determine a biometric activity state of the driver according to the biometric data;
the processing module is specifically configured to control a driving state of the autonomous vehicle according to the brain wave activity state and the biometric activity state.
30. The driving state processing apparatus according to claim 29,
the processing module is specifically configured to determine a biometric activity state of the driver according to a similarity between first biometric data and second biometric data, where the first biometric data is biometric data acquired in a third time period, the second biometric data is biometric data acquired in a fourth time period, and a time interval between the third time period and the fourth time period is smaller than a fifth preset threshold.
31. The driving state processing apparatus according to claim 29 or 30,
the processing module is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when a sum of a first product and a third product is smaller than a sixth preset threshold;
wherein the first product is a product of a first weight value and the brain wave activity state, the third product is a product of a third weight value and the biometric activity state, and the first weight value is greater than the third weight value.
32. The driving state processing apparatus according to claim 29 or 30,
the acquisition module is further configured to acquire eye data of the driver;
the processing module is further configured to determine an eye activity state of the driver according to the eye data; and
the processing module is specifically configured to control the driving state of the autonomous vehicle to be the assisted autonomous driving state when a sum of a first product, a second product and a third product is smaller than a seventh preset threshold,
wherein the first product is a product of a first weight value and the brain wave activity state, the second product is a product of a second weight value and the eye activity state, the third product is a product of a third weight value and the biometric activity state, the first weight value is greater than the second weight value, and the first weight value is greater than the third weight value.
33. The driving state processing apparatus according to any one of claims 23 to 32, wherein the acquisition module is specifically configured to receive the brain wave signals collected and transmitted by an in-ear earplug.
34. A driving state processing apparatus, comprising a processor and a memory, wherein the memory is configured to store program instructions, and the processor is configured to invoke the program instructions to perform the method of human-vehicle interaction of an autonomous vehicle of any one of claims 1 to 11.
35. An autonomous vehicle comprising the autopilot system of any one of claims 12 to 22.
36. An autonomous vehicle characterized by comprising the driving state processing apparatus of any one of claims 23 to 33.
37. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored therein program instructions which, when executed by a processor, implement the method of human-vehicle interaction of an autonomous vehicle of any of claims 1 to 11.
38. A chip characterized in that it comprises a processor and a data interface through which the processor reads instructions stored on a memory to perform a method of human-vehicle interaction of an autonomous vehicle as claimed in any one of claims 1 to 11.
CN201910844653.0A 2019-09-06 2019-09-06 Human-vehicle interaction method for automatically driving vehicle and automatically driving system Active CN112455461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910844653.0A CN112455461B (en) 2019-09-06 2019-09-06 Human-vehicle interaction method for automatically driving vehicle and automatically driving system


Publications (2)

Publication Number Publication Date
CN112455461A true CN112455461A (en) 2021-03-09
CN112455461B CN112455461B (en) 2022-10-28

Family

ID=74806949


Country Status (1)

Country Link
CN (1) CN112455461B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107336711A (en) * 2017-06-19 2017-11-10 BAIC Motor Co., Ltd. Vehicle and automatic driving system and method thereof
CN107428245A (en) * 2015-03-03 2017-12-01 Renault S.A.S. Apparatus and method for predicting the vigilance level of a motor vehicle operator
US20190083022A1 (en) * 2017-09-15 2019-03-21 Hsin Ming Huang Fatigue driving monitoring device
CN109747656A (en) * 2017-11-02 2019-05-14 Luo Ximing Artificial intelligence vehicle assisted driving method, apparatus, device and storage medium
CN109910900A (en) * 2019-04-01 2019-06-21 Guangdong Polytechnic of Science and Technology Intelligent driving system and method
KR101999211B1 (en) * 2018-01-03 2019-07-11 Industry-Academic Cooperation Foundation, The Catholic University of Korea Driver condition detecting apparatus using brain wave and method thereof


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113347522A (en) * 2021-05-08 2021-09-03 Goertek Inc. Earphone control method, apparatus, device and storage medium
CN113276867A (en) * 2021-06-15 2021-08-20 Wang Xiaoming Brain wave emergency control system and method under automatic driving situation
CN114385099A (en) * 2021-11-26 2022-04-22 China National Aeronautical Radio Electronics Research Institute Multi-unmanned aerial vehicle dynamic monitoring interface display method and device based on active push display
CN114385099B (en) * 2021-11-26 2023-12-12 China National Aeronautical Radio Electronics Research Institute Multi-unmanned aerial vehicle dynamic monitoring interface display method and device based on active push display

Also Published As

Publication number Publication date
CN112455461B (en) 2022-10-28

Similar Documents

Publication Publication Date Title
CN112041910B (en) Information processing apparatus, mobile device, method, and program
JP7288911B2 (en) Information processing device, mobile device, method, and program
JP7155122B2 (en) Vehicle control device and vehicle control method
JP7273031B2 (en) Information processing device, mobile device, information processing system and method, and program
JP7032966B2 (en) Vehicle safety support devices and methods
US10308258B2 (en) System and method for responding to driver state
JP7324716B2 (en) Information processing device, mobile device, method, and program
CN113665528B (en) Autonomous vehicle safety system and method
KR102053794B1 (en) Apparatus and method for delivering driver's movement intention based on bio-signals
WO2021145131A1 (en) Information processing device, information processing system, information processing method, and information processing program
CN112455461B (en) Human-vehicle interaction method for automatically driving vehicle and automatically driving system
JPWO2018190152A1 (en) Information processing apparatus, information processing method, and program
JP7357006B2 (en) Information processing device, mobile device, method, and program
WO2022172724A1 (en) Information processing device, information processing method, and information processing program
CN116204806A (en) Brain state determining method and device
JP7238193B2 (en) Vehicle control device and vehicle control method
CN116872944A (en) Vehicle speed control method, system, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant