CN117465459A - Vehicle control method, device, equipment and storage medium - Google Patents

Vehicle control method, device, equipment and storage medium

Info

Publication number
CN117465459A
CN117465459A (application CN202311674941.9A)
Authority
CN
China
Prior art keywords
vehicle
emotion
control strategy
data
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311674941.9A
Other languages
Chinese (zh)
Inventor
夏冰
刘永宏
付斌
谢健
刘会凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lantu Automobile Technology Co Ltd
Original Assignee
Lantu Automobile Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lantu Automobile Technology Co Ltd filed Critical Lantu Automobile Technology Co Ltd
Priority to CN202311674941.9A
Publication of CN117465459A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/21 Voice
    • B60W2540/223 Posture, e.g. hand, foot, or seat position, turned or inclined
    • B60W2555/00 Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/20 Ambient conditions, e.g. wind or rain

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to the technical field of vehicle control and discloses a vehicle control method, device, equipment and storage medium. The method comprises the following steps: collecting emotion data of a driver in the vehicle and determining an emotion type based on the emotion data; analyzing and mapping the emotion type to obtain an emotion level; determining a driving control strategy based on the emotion level; sensing the surrounding environment of the vehicle to obtain vehicle environment data; and controlling the motion state of the vehicle based on the driving control strategy and the vehicle environment data. By determining an emotion level from the driver's emotion data and using it to select the driving control strategy that governs the motion state of the vehicle, the invention solves the technical problems that existing systems cannot autonomously control the motion state of the vehicle after detecting the driver's emotion and therefore offer a low degree of intelligence and low safety, thereby raising the intelligence level of vehicle control and improving driving safety.

Description

Vehicle control method, device, equipment and storage medium
Technical Field
The present invention relates to the field of vehicle control technologies, and in particular, to a vehicle control method, device, apparatus, and storage medium.
Background
Traffic accidents have many causes, generally classified as human factors, vehicle factors, and road and environment factors, of which human factors rank first, and among human factors the driver is the most important. A driver's negative emotions are a major cause of traffic accidents. In existing methods that detect driver emotion to control the vehicle, however, assistance is mainly applied to the driver's operating behavior after an emotion is detected, so the vehicle cannot autonomously control its motion state once the driver's emotion is detected; the degree of intelligence is low, and traffic accidents still occur easily.
Disclosure of Invention
The main purpose of the present invention is to provide a vehicle control method, device, equipment and storage medium, aiming to solve the technical problems that the prior art cannot autonomously control the motion state of a vehicle after detecting the driver's emotion, and that the degree of intelligence and safety are consequently low.
In order to achieve the above object, the present invention provides a vehicle control method comprising the steps of:
collecting emotion data of a driver in the vehicle, and determining emotion types based on the emotion data;
analyzing and mapping according to the emotion types to obtain emotion grades;
determining a driving control strategy based on the emotion level;
sensing the surrounding environment of the vehicle to obtain vehicle environment data;
and controlling a motion state of the vehicle based on the driving control strategy and the vehicle environment data.
Optionally, the emotion data includes a human body image, the collecting emotion data of the driver in the vehicle and determining the emotion type based on the emotion data includes:
responding to a starting signal of a vehicle, and acquiring a human body image of a driver in the vehicle to obtain the human body image;
extracting features based on the human body image to obtain facial expression features and limb action features;
and analyzing based on the facial expression characteristics and the limb action characteristics to obtain emotion types.
Optionally, the emotion data further includes voice data, the collecting emotion data of the driver in the vehicle, and determining the emotion type based on the emotion data, further includes:
responding to a starting signal of a vehicle, and collecting voice of a driver in the vehicle to obtain voice data;
extracting the characteristics of the voice data to obtain volume characteristics, mood characteristics and keyword characteristics;
and analyzing based on the volume characteristics, the mood characteristics and the keyword characteristics to obtain the emotion type.
Optionally, the driving control strategy includes a braking control strategy, and the determining the driving control strategy based on the emotion level includes:
determining a collision time threshold and an auxiliary braking force based on the emotion level, wherein the collision time threshold and auxiliary braking force are inversely related to the emotion level;
and determining a braking control strategy based on the collision time threshold and the auxiliary braking force.
Optionally, the driving control strategy further includes a steering control strategy, and the determining the driving control strategy based on the emotion level further includes:
determining lane changing distance and lane changing duration based on the emotion level, wherein the lane changing distance and the lane changing duration are inversely related to the emotion level;
and determining a steering control strategy based on the lane change distance and the lane change duration.
Optionally, the driving control strategy further includes a drive control strategy, and the determining the driving control strategy based on the emotion level further includes:
determining a safety distance to be maintained between the current vehicle and a preceding vehicle and a vehicle speed threshold based on the emotion level, wherein the safety distance is inversely related to the emotion level, and the vehicle speed threshold is positively related to the emotion level;
and determining a drive control strategy based on the safety distance and the vehicle speed threshold.
Optionally, the controlling the motion state of the vehicle based on the driving control strategy and the vehicle environment data includes:
determining driving scenario information based on the vehicle environment data;
and controlling the motion state of the vehicle according to the driving control strategy corresponding to the driving scene information.
In addition, in order to achieve the above object, the present invention also proposes a vehicle control device including:
the system comprises an acquisition module, a control module and a control module, wherein the acquisition module is used for acquiring emotion data of a driver in a vehicle and determining emotion types based on the emotion data;
the analysis module is used for analyzing and mapping according to the emotion types to obtain emotion grades;
a determining module for determining a driving control strategy based on the emotion level;
the sensing module is used for sensing the surrounding environment of the vehicle to obtain vehicle environment data;
and the control module is used for controlling the motion state of the vehicle based on the driving control strategy and the vehicle environment data.
In addition, in order to achieve the above object, the present invention also proposes a vehicle control apparatus including: a memory, a processor, and a vehicle control program stored on the memory and executable on the processor, the vehicle control program configured to implement the steps of the vehicle control method as described above.
In addition, in order to achieve the above object, the present invention also proposes a storage medium having stored thereon a vehicle control program which, when executed by a processor, implements the steps of the vehicle control method as described above.
According to the method, emotion data of a driver in the vehicle is collected and an emotion type is determined based on the emotion data; the emotion type is analyzed and mapped to obtain an emotion level; a driving control strategy is determined based on the emotion level; the surrounding environment of the vehicle is sensed to obtain vehicle environment data; and the motion state of the vehicle is controlled based on the driving control strategy and the vehicle environment data. In this way, the emotion level determined from the driver's emotion data is used to select a driving control strategy that controls the motion state of the vehicle, which solves the technical problems that current approaches cannot autonomously control the motion state of the vehicle after detecting the driver's emotion and therefore have a low degree of intelligence and low safety, thereby raising the intelligence level of vehicle control and improving driving safety.
Drawings
Fig. 1 is a schematic structural diagram of a vehicle control apparatus of a hardware running environment according to an embodiment of the present invention;
FIG. 2 is a flow chart of a first embodiment of a vehicle control method according to the present invention;
FIG. 3 is a schematic diagram showing a configuration of a vehicle control apparatus according to an embodiment of the vehicle control method of the present invention;
FIG. 4 is a schematic diagram of a brake control strategy according to an embodiment of the vehicle control method of the present invention;
FIG. 5 is a schematic diagram of a steering control strategy according to an embodiment of the vehicle control method of the present invention;
FIG. 6 is a schematic diagram of a driving control strategy according to an embodiment of the vehicle control method of the present invention;
FIG. 7 is a flow chart of a second embodiment of a vehicle control method according to the present invention;
FIG. 8 is a flow chart of emotion analysis according to an embodiment of the vehicle control method of the present invention;
fig. 9 is a block diagram showing the construction of a first embodiment of the vehicle control apparatus of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic diagram of a vehicle control device in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the vehicle control apparatus may include: a processor 1001, such as a central processing unit (Central Processing Unit, CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005, wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard, and optionally may further include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The memory 1005 may be a high-speed random access memory (Random Access Memory, RAM) or a stable nonvolatile memory (NVM), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 does not constitute a limitation of the vehicle control apparatus, and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 1, an operating system, a network communication module, a user interface module, and a vehicle control program may be included in the memory 1005 as one type of storage medium.
In the vehicle control apparatus shown in fig. 1, the network interface 1004 is mainly used for data communication with a network server, and the user interface 1003 is mainly used for data interaction with a user. The vehicle control apparatus of the present invention invokes the vehicle control program stored in the memory 1005 through the processor 1001 and executes the vehicle control method provided by the embodiments of the present invention.
An embodiment of the present invention provides a vehicle control method, and referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the vehicle control method of the present invention.
In this embodiment, the vehicle control method includes the steps of:
step S10: emotion data of a driver in the vehicle is collected, and an emotion type is determined based on the emotion data.
The execution body of the present embodiment is a vehicle control device, and may be other devices having the same or similar functions, which is not particularly limited in this embodiment, and the present embodiment is described by taking the vehicle control device as an example.
As shown in fig. 3, fig. 3 is a schematic structural diagram of a vehicle control apparatus including a driver emotion collecting device (vision), a driver emotion collecting device (sound), a driver emotion processing device, an intelligent driving control device, a brake control device, a steering control device, and a drive control device.
It can be appreciated that in this embodiment, emotion data of the driver in the vehicle may be collected at least visually and acoustically, and the emotion data may be analyzed to determine the driver's emotion type. The emotion data includes at least human body images and sound data. The emotion types include at least happy, sad, calm, and anxious, and this embodiment is not particularly limited in this respect.
In a specific implementation, the emotion type of the driver may be determined by means of physiological index detection, facial expression analysis, voice emotion analysis, driving behavior analysis, and the like, which is not particularly limited in this embodiment. Physiological index detection includes: heart rate monitoring, i.e., monitoring the driver's heart rate with a heart rate sensor, where a rapid heart rate may indicate stress or excitement; skin conductance measurement, i.e., measuring skin conductance to detect mood changes, which reflects activity of the emotion-related autonomic nervous system; and brain wave detection, i.e., monitoring brain activity with electroencephalography (EEG) to detect the emotional state. Facial expression analysis captures the driver's facial expression with a camera and computer vision techniques and then recognizes the emotion through a facial expression analysis algorithm. Voice emotion analysis collects the driver's voice through an in-vehicle microphone and then analyzes the emotion in the voice using speech emotion recognition. Driving behavior analysis infers the driver's emotional state from driving behavior such as acceleration, braking force, and steering wheel rotation.
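For illustration only, the multi-signal fusion described above could be sketched as a simple rule-based classifier. All thresholds, labels, and the function name below are assumptions of this example, not values fixed by the disclosure:

```python
def classify_emotion(heart_rate_bpm, skin_conductance_us, face_label):
    """Rule-of-thumb fusion of physiological and visual cues.

    All thresholds are illustrative placeholders, not values from the patent.
    """
    if face_label in ("smile", "relaxed") and heart_rate_bpm < 90:
        return "happy"
    if heart_rate_bpm > 110 or skin_conductance_us > 12.0:
        return "anxious"
    if face_label in ("frown", "tearful"):
        return "sad"
    return "calm"
```

A production system would replace these hand-set rules with trained models for each modality, but the overall flow (per-signal cues in, one emotion type out) is the same.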
Step S20: and analyzing and mapping according to the emotion types to obtain emotion grades.
It should be noted that mapping emotion data to emotion levels means converting the collected emotion information into discrete, operable quantized values, classifying emotion based on those values to obtain emotion types, and then assigning each class a corresponding level. For example, happy/relaxed emotions may be mapped to a high level, indicating that the driver's emotion is very positive; a calm/normal, neutral emotional state may be mapped to a medium level, indicating that the driver's emotion is normal; and stress/anxiety may be mapped to a low level, indicating that the emotion is more negative.
It is to be understood that the emotion level may be divided into a plurality of gears, for example, 1, 2, and 3, and the present embodiment is not particularly limited thereto.
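As a minimal sketch of the type-to-level mapping just described, using the three example gears mentioned above (the specific labels and numbers are assumptions of this example):

```python
# Illustrative mapping from emotion type to a discrete level (3 = most positive).
EMOTION_LEVELS = {
    "happy": 3, "relaxed": 3,               # positive  -> high level
    "calm": 2, "normal": 2,                 # neutral   -> medium level
    "anxious": 1, "stressed": 1, "sad": 1,  # negative  -> low level
}

def emotion_level(emotion_type):
    # An unrecognized emotion defaults to the most conservative (lowest) level.
    return EMOTION_LEVELS.get(emotion_type, 1)
```

Defaulting unknown types to the lowest level is a safety-oriented design choice: when the system is unsure, it behaves as if the driver's state were poor.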
Step S30: and determining a driving control strategy based on the emotion level.
It should be noted that the driver's emotional state may affect what behavior they want from the vehicle. For example, an anxious or tense driver may want the vehicle to travel more smoothly, avoiding actions such as sudden braking, sudden acceleration, or sharp turns, so as to reduce tension. An excited or cheerful driver may prefer more spirited vehicle actions, such as rapid acceleration or quick cornering, to enhance the enjoyment of driving. For a driver in a calm state, comfort and smoothness may matter more, and gentle speed changes and gentle steering are preferable. For a fatigued or drowsy driver, the vehicle should provide more auxiliary functions, such as adaptive cruise control and lane keeping assist, to lighten the driver's burden. For a driver in a low mood, the vehicle's motion should minimize anything that may cause discomfort, providing a gentler driving experience.
It can be understood that in this implementation, the driver's physiological and behavioral characteristics can be monitored by sensors and emotion information can be obtained through the human-machine interface, so that the motion control strategy of the vehicle is adjusted to meet the driver's needs in different emotional states; this improves the driver's comfort and satisfaction while ensuring driving safety.
Further, the driving control strategy includes a braking control strategy, and the determining the driving control strategy based on the emotion level includes: determining a collision time threshold and an auxiliary braking force based on the emotion level, wherein the collision time threshold and the auxiliary braking force are inversely related to the emotion level; and determining a braking control strategy based on the collision time threshold and the auxiliary braking force.
It should be noted that, by executing a brake control strategy by the brake control device, the brake control strategy includes a brake alert and an auxiliary brake, and when the vehicle senses an emergency situation, such as an obstacle in front, sudden traffic deceleration, etc., the system may trigger the brake alert. Brake alerts are often used in conjunction with auxiliary brakes to avoid collisions or reduce their severity.
It is understood that time to collision (TTC) is an indicator used to evaluate the potential time until a collision between vehicles or objects.
It is worth noting that the better the driver's state, the higher the corresponding level, the shorter the time-to-collision (TTC) threshold setting, and the less sensitive the alarm; that is, the collision time threshold is inversely related to the emotion level. Likewise, the higher the level, the later the system initiates braking and the smaller the auxiliary braking force setting; that is, the auxiliary braking force is inversely related to the emotion level.
As shown in fig. 4, fig. 4 is a schematic diagram of the braking control strategy. The braking control strategy includes a brake alert and auxiliary braking, and the relationship among emotion levels a, b and c is La > Lb > Lc: the better the driver's state, the higher the corresponding level, the shorter the time-to-collision (TTC) setting, and the less sensitive the alarm; correspondingly, the later the system initiates braking and the smaller the auxiliary braking force setting.
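The inverse relationships above can be sketched as follows; the base values and the per-level step sizes are purely illustrative assumptions, not parameters disclosed by this embodiment:

```python
def braking_strategy(level, base_ttc_s=3.0, base_force_pct=60.0):
    """TTC alert threshold and auxiliary braking force both shrink as the
    emotion level rises (better driver state -> later, gentler intervention).

    The 0.5 s and 10 % per-level steps are placeholders for illustration.
    """
    ttc_threshold_s = base_ttc_s - 0.5 * (level - 1)
    aux_brake_pct = base_force_pct - 10.0 * (level - 1)
    return ttc_threshold_s, aux_brake_pct
```

For example, a level-3 (positive) driver would get a shorter alert threshold and weaker assist than a level-1 (negative) driver, matching the La > Lb > Lc ordering in fig. 4.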
Further, the driving control strategy further includes a steering control strategy, and the determining the driving control strategy based on the emotion level further includes: determining lane changing distance and lane changing duration based on the emotion level, wherein the lane changing distance and the lane changing duration are inversely related to the emotion level; and determining a steering control strategy based on the lane change distance and the lane change duration.
It should be noted that, the steering control device executes a steering control strategy, and the steering control strategy at least includes lane change and lateral avoidance.
It will be appreciated that the better the driver's state and the higher the corresponding level, the looser the lane-change confirmation conditions: the shorter the required front and rear gap to vehicles in the target lane, and the shorter the expected lane-change time. That is, the lane change distance and the lane change duration are inversely related to the emotion level.
It is worth noting that the better the driver state, the higher the corresponding level, and the shorter the time required for lateral avoidance, i.e. the lateral avoidance time is inversely related to the emotion level.
As shown in fig. 5, fig. 5 is a schematic diagram of the steering control strategy. The steering control strategy includes lane changing and lateral avoidance, and the relationship among emotion levels a, b and c is La > Lb > Lc: the better the driver's state, the higher the corresponding level, the looser the lane-change confirmation conditions, the shorter the required gap to vehicles in the target lane, and the shorter the expected lane-change time. Likewise, the better the driver's state and the higher the level, the shorter the time required for lateral avoidance.
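A minimal sketch of the lane-change parameterization, again with illustrative placeholder numbers rather than disclosed values:

```python
def lane_change_parameters(level, base_gap_m=50.0, base_duration_s=6.0):
    """Required front/rear gap and expected lane-change duration, both
    inversely related to the emotion level (higher level -> tighter, quicker
    maneuvers). Base values and per-level steps are assumptions.
    """
    gap_m = base_gap_m - 10.0 * (level - 1)
    duration_s = base_duration_s - 1.0 * (level - 1)
    return gap_m, duration_s
```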
Further, the driving control strategy further includes a drive control strategy, and the determining the driving control strategy based on the emotion level further includes: determining a safety distance to be maintained between the current vehicle and a preceding vehicle and a vehicle speed threshold based on the emotion level, wherein the safety distance is inversely related to the emotion level and the vehicle speed threshold is positively related to the emotion level; and determining a drive control strategy based on the safety distance and the vehicle speed threshold.
The drive control device executes a drive control strategy including a following distance and a cruise speed.
It will be appreciated that the better the driver's state, the higher the corresponding level and the shorter the safety distance the current vehicle needs to maintain from the preceding vehicle, i.e., the safety distance is inversely related to the emotion level. Similarly, the better the driver's state and the higher the level, the greater the vehicle speed threshold within a defined range, i.e., the vehicle speed threshold is positively correlated with the emotion level.
In particular implementations, when the driver's emotion is agitated or anxious, the system may select a more conservative cruise control strategy, maintaining a greater safety distance or limiting the maximum speed.
As shown in fig. 6, fig. 6 is a schematic diagram of the drive control strategy. The drive control strategy includes a following distance and a cruising speed, and the relationship among emotion levels a, b and c is La > Lb > Lc: the better the driver's state, the higher the corresponding level and the shorter the safety distance the current vehicle needs to maintain from the preceding vehicle, that is, the shorter the following distance. Likewise, the better the driver's state and the higher the level, the greater the vehicle speed threshold within a defined range.
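A sketch of the following-distance and speed-cap rules described above; note the opposite directions of the two relations. All concrete numbers, including the 120 km/h legal-style cap, are assumptions of this example:

```python
def drive_strategy(level, base_follow_m=40.0, base_speed_kph=100.0):
    """Following distance is inversely related to the emotion level;
    the vehicle speed threshold is positively related, within a defined range.
    """
    follow_m = base_follow_m - 8.0 * (level - 1)                 # inverse
    speed_cap_kph = min(base_speed_kph + 10.0 * (level - 1), 120.0)  # positive, capped
    return follow_m, speed_cap_kph
```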
Step S40: and sensing the surrounding environment of the vehicle to obtain vehicle environment data.
It should be noted that, the surrounding environment of the vehicle may be sensed by the environment sensing device to obtain the vehicle environment data, and the environment sensing device may include visual sensing, millimeter wave radar, laser radar, and the like, which is not limited in this embodiment.
It will be appreciated that visual perception is the use of cameras to acquire real-time images for object detection, identification and tracking. This may help the vehicle understand vehicles on roads, pedestrians, traffic signs, etc. Millimeter wave radar is used for finer obstacle detection around a vehicle, such as object detection at the time of parking and low-speed running. Lidar creates a high resolution three-dimensional map of the vehicle surroundings by emitting a laser beam and measuring the time of reflection, facilitating accurate obstacle detection and scene modeling.
Step S50: and controlling a motion state of the vehicle based on the driving control strategy and the vehicle environment data.
It should be noted that, according to the environment data and the driving control strategy, the system can control how the vehicle moves on the road, avoids obstacles, obeys traffic rules, and the like.
It is understood that motion control commands are sent based on environmental data and driving control strategies to communicate control signals to the vehicle's execution units, including the engine, brake system, steering system, etc. Motion control requires keeping track of the vehicle state in a real-time changing environment and adjusting control commands to accommodate changing conditions.
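The command-issuing step could be sketched as a small dispatch that combines the braking strategy with a live TTC reading; the `ControlCommand` structure and all field names are hypothetical, and the real execution units (engine, brake system, steering system) would consume such signals over a vehicle bus:

```python
from dataclasses import dataclass

@dataclass
class ControlCommand:
    """Illustrative bundle of signals sent to the execution units."""
    brake_pct: float
    steer_deg: float
    throttle_pct: float

def issue_command(ttc_s, ttc_threshold_s, aux_brake_pct):
    # If the measured time to collision falls below the emotion-derived
    # threshold, cut throttle and apply the auxiliary braking force.
    if ttc_s < ttc_threshold_s:
        return ControlCommand(brake_pct=aux_brake_pct, steer_deg=0.0, throttle_pct=0.0)
    # Otherwise keep cruising; 20 % throttle is an arbitrary placeholder.
    return ControlCommand(brake_pct=0.0, steer_deg=0.0, throttle_pct=20.0)
```

In a real-time loop this function would be re-evaluated every control cycle, since both the TTC and the emotion level can change continuously.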
Further, the controlling the motion state of the vehicle based on the driving control strategy and the vehicle environment data includes: determining driving scenario information based on the vehicle environment data; and controlling the motion state of the vehicle according to the driving control strategy corresponding to the driving scene information.
It should be noted that, according to the road boundary, lane marking and intersection information in the vehicle environment data, an accurate model of the current road is established to obtain the road structure, including the straight road, the turning road, the intersection and the like, which is helpful for path planning.
It is understood that traffic signs on the road, including speed limit signs, stop signs and the like, are recognized based on the vehicle environment data, and the state of traffic signal lights is monitored to understand the traffic conditions at intersections.
The vehicle positioning information in the vehicle environment data is fused with map data to improve the accuracy of the vehicle position and to ensure consistency between the vehicle's position on the map and the surrounding environment.
An obstacle avoidance strategy is formulated according to the positions and motion states of obstacles in the vehicle environment data to ensure that the vehicle passes safely.
In a specific implementation, the density and speed of the surrounding traffic flow is analyzed based on the vehicle environment data to understand the current road traffic conditions. Driving scenario information is determined in consideration of behaviors and intentions of other vehicles to make corresponding driving decisions in advance.
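The traffic-flow analysis above could, for illustration, be reduced to a crude scenario classifier; the labels and cutoffs are assumptions of this example, not values from the disclosure:

```python
def driving_scenario(mean_gap_m, mean_speed_kph):
    """Label the driving scene from traffic density (mean inter-vehicle gap)
    and flow speed. Thresholds are illustrative placeholders.
    """
    if mean_gap_m < 15 and mean_speed_kph < 30:
        return "congested"
    if mean_speed_kph > 80:
        return "highway"
    return "urban"
```

Each label would then select the corresponding driving control strategy (e.g., a longer following distance in "congested" traffic for a low emotion level).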
According to the method, emotion data of a driver in the vehicle is collected and an emotion type is determined based on the emotion data; the emotion type is analyzed and mapped to obtain an emotion level; a driving control strategy is determined based on the emotion level; the surrounding environment of the vehicle is sensed to obtain vehicle environment data; and the motion state of the vehicle is controlled based on the driving control strategy and the vehicle environment data. In this way, the emotion level determined from the driver's emotion data is used to select a driving control strategy that controls the motion state of the vehicle, which solves the technical problems that current approaches cannot autonomously control the motion state of the vehicle after detecting the driver's emotion and therefore have a low degree of intelligence and low safety, thereby raising the intelligence level of vehicle control and improving driving safety.
Referring to fig. 7, fig. 7 is a flowchart illustrating a second embodiment of a vehicle control method according to the present invention.
Based on the above first embodiment, the emotion data includes a human body image, and the step S10 in the vehicle control method of the present embodiment includes:
step S101: and responding to a starting signal of the vehicle, and acquiring a human body image of the driver in the vehicle to obtain the human body image.
When the vehicle is started, a human body image of the driver is collected by the driver emotion collection device (vision).
It is to be understood that the driver emotion collecting device (vision) may be a camera, which is not particularly limited in this embodiment.
Step S102: and extracting features based on the human body image to obtain facial expression features and limb action features.
It should be noted that extracting facial expression features from the human body image requires a face detection algorithm to locate the face region in the image; within that region, key facial areas such as the eyes, mouth and eyebrows are detected and localized so that the facial expression can be identified and facial expression features obtained.
It will be appreciated that extracting limb action features from the human body image uses a human posture estimation algorithm to find the key points of the body in the image. The key points represent the main parts of the body, such as the hands, shoulders and knees, and limb actions are represented by information such as joint angles or the distances between key points, yielding limb action features.
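The joint-angle feature mentioned above can be sketched as follows. The keypoint coordinates are assumed to come from an upstream pose estimator; the example positions are illustrative:

```python
import math

def joint_angle(a, b, c):
    """Angle at keypoint b (in degrees) formed by segments b->a and b->c,
    e.g. an elbow angle from shoulder, elbow and wrist keypoints."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_t = dot / (math.hypot(*v1) * math.hypot(*v2))
    cos_t = max(-1.0, min(1.0, cos_t))   # guard against rounding drift
    return math.degrees(math.acos(cos_t))

# A right angle at the elbow: shoulder straight above, wrist out to the side.
angle = joint_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0))
```

Collecting such angles (and inter-keypoint distances) over a window of frames gives the limb action feature vector the embodiment refers to.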
Step S103: and analyzing based on the facial expression characteristics and the limb action characteristics to obtain emotion types.
It can be appreciated that the facial expression features and limb motion features are fused to obtain more comprehensive emotion information, and emotion types are determined based on the emotion information.
Further, the emotion data further includes voice data, the collecting emotion data of the driver in the vehicle, and determining the emotion type based on the emotion data, further includes: responding to a starting signal of a vehicle, and collecting voice of a driver in the vehicle to obtain voice data; extracting the characteristics of the voice data to obtain volume characteristics, mood characteristics and keyword characteristics; and analyzing based on the volume characteristics, the mood characteristics and the keyword characteristics to obtain the emotion type.
It should be noted that, when the vehicle is started, the driver emotion collecting device (sound) collects the voice data of the driver, and the driver emotion collecting device (sound) may be a microphone in the vehicle, which is not particularly limited in this embodiment.
In a specific implementation, the raw voice data are preprocessed, including noise reduction, silence removal and voice segmentation, to improve subsequent feature extraction. Amplitude information is extracted from the preprocessed audio signal to obtain the loudness; by analyzing the duration of speech units in the signal, speech-rate information is obtained, from which the urgency of the speech is judged. The voice data are also converted to text, and keywords are extracted from the text to obtain keyword features.
It will be appreciated that the extracted volume features, mood features and keyword features are fused to obtain a more comprehensive speech feature representation, and emotion types are determined based on the speech feature representation.
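Two of the acoustic features above, loudness and speech rate, can be sketched directly (frame size and syllable counts here are assumptions; a real pipeline would use a speech toolkit on denoised audio):

```python
import math

def volume_features(samples, frame_size=400):
    """Per-frame RMS energy of a mono PCM signal, a simple loudness proxy."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples) - frame_size + 1, frame_size)]
    return [math.sqrt(sum(s * s for s in f) / len(f)) for f in frames]

def speech_rate(num_syllables, voiced_seconds):
    """Syllables per second over voiced speech, a proxy for urgency of tone."""
    return num_syllables / voiced_seconds

# A constant-amplitude test signal (800 samples at +/-0.5) -> two frames.
rms = volume_features([0.5, -0.5] * 400, frame_size=400)
rate = speech_rate(18, 6.0)   # 18 syllables over 6 s of voiced speech
```

Keyword features would come from a separate speech-to-text stage, so they are not shown here.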
As shown in fig. 8, fig. 8 is a flowchart of emotion analysis. Emotion analysis is performed on the facial expression, limb action, loudness, speech urgency and captured-keyword information obtained by emotion acquisition, yielding different emotions, where class a includes happiness, sadness, calm, anxiety and the like; class b includes positive, negative and neutral; and class c includes excitement, listlessness, fatigue and drowsiness. According to the different emotions, emotion analysis and mapping yield different emotion levels, including level 1, level 2 and level 3, and corresponding driving operations are performed according to the emotion level, such as reminding, lane changing, overtaking, following, cruising, braking and lateral avoidance.
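The emotion-to-level-to-operation mapping of fig. 8 can be sketched as a pair of lookup tables. The specific level assignments (here a higher level denotes a calmer state, consistent with the later claim that the vehicle speed threshold rises with the emotion level) and the action names are illustrative assumptions, not the patented mapping:

```python
# Assumed mapping: level 3 = calm/positive, level 1 = most impaired.
EMOTION_LEVEL = {
    "calm": 3, "happy": 3, "positive": 3,
    "neutral": 2, "tired": 2, "sad": 2,
    "anxious": 1, "drowsy": 1, "negative": 1,
}
LEVEL_ACTION = {
    3: "cruise",            # normal cruising / following / overtaking allowed
    2: "remind",            # prompt the driver, tighten margins
    1: "brake_and_avoid",   # conservative braking and lateral avoidance
}

def driving_action(emotion_type):
    """Return the (level, operation) pair for a detected emotion type."""
    level = EMOTION_LEVEL.get(emotion_type, 2)   # unknown -> middle level
    return level, LEVEL_ACTION[level]
```

For example, a detected "anxious" state maps to the most conservative operation set, while "calm" leaves normal cruising enabled.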
In this embodiment, in response to a start signal of the vehicle, a human body image of the driver in the vehicle is acquired; features are extracted from the human body image to obtain facial expression features and limb action features; and these features are analyzed to obtain the emotion type. In this way, the facial expression features and limb action features extracted from the human body image are analyzed to obtain the emotion type, from which the emotion level is determined and the corresponding control strategy is formulated, improving the accuracy of vehicle control and driving safety.
Referring to fig. 9, fig. 9 is a block diagram showing the construction of a first embodiment of the vehicle control apparatus of the present invention.
As shown in fig. 9, a vehicle control apparatus according to an embodiment of the present invention includes:
the collection module 10 is used for collecting emotion data of the driver in the vehicle and determining emotion types based on the emotion data.
And the analysis module 20 is used for analyzing and mapping according to the emotion types to obtain emotion grades.
A determination module 30 for determining a driving control strategy based on the emotion level.
The sensing module 40 is configured to sense a surrounding environment of the vehicle, and obtain vehicle environment data.
A control module 50 for controlling the state of motion of the vehicle based on the driving control strategy and the vehicle environment data.
According to the method, emotion data of the driver in the vehicle are collected and an emotion type is determined from the emotion data; the emotion type is analyzed and mapped to an emotion level; a driving control strategy is determined based on the emotion level; the surroundings of the vehicle are sensed to obtain vehicle environment data; and the motion state of the vehicle is controlled based on the driving control strategy and the vehicle environment data. In this way, the emotion level determined from the driver's emotion data drives the choice of driving control strategy used to control the vehicle's motion state. This solves the current technical problem that, after the driver's emotion is detected, the vehicle's motion state cannot be controlled autonomously, leaving a low degree of intelligence and low safety; it thereby raises the intelligence of vehicle control and improves driving safety.
In an embodiment, the emotion data includes a human body image, and the acquisition module 10 is further configured to respond to a start signal of a vehicle to acquire the human body image of the driver in the vehicle, so as to obtain the human body image; extracting features based on the human body image to obtain facial expression features and limb action features; and analyzing based on the facial expression characteristics and the limb action characteristics to obtain emotion types.
In an embodiment, the emotion data further includes voice data, and the collecting module 10 is further configured to respond to a start signal of the vehicle and collect voice of the driver in the vehicle to obtain voice data; extracting the characteristics of the voice data to obtain volume characteristics, mood characteristics and keyword characteristics; and analyzing based on the volume characteristics, the mood characteristics and the keyword characteristics to obtain the emotion type.
In an embodiment, the driving control strategy includes a braking control strategy, and the determining module 30 is further configured to determine a collision time threshold and an auxiliary braking force based on the emotion level, wherein the collision time threshold and the auxiliary braking force are negatively related to the emotion level; and a braking control strategy is determined based on the collision time threshold and the auxiliary braking force.
In an embodiment, the driving control strategy further includes a steering control strategy, and the determining module 30 is further configured to determine a lane-changing distance and a lane-changing duration based on the emotion level, where the lane-changing distance and the lane-changing duration are inversely related to the emotion level; and determining a steering control strategy based on the lane change distance and the lane change duration.
In an embodiment, the driving control strategy further includes a driving control strategy, and the determining module 30 is further configured to determine, based on the emotion level, a safety distance to be maintained between the current vehicle and the vehicle ahead and a vehicle speed threshold, where the safety distance is negatively related to the emotion level and the vehicle speed threshold is positively related to the emotion level; and a driving control strategy is determined based on the safety distance and the vehicle speed threshold.
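The braking, steering and driving strategy parameters above can be sketched as linear functions of the emotion level. The directions of the relations follow the embodiments (collision time threshold, auxiliary braking force, lane-change distance/duration and safety distance fall as the level rises; the speed threshold rises); the numeric ranges are placeholder assumptions:

```python
def strategy_parameters(level, min_level=1, max_level=3):
    """Illustrative linear mappings from emotion level to control parameters.

    t runs from 0 (most impaired level) to 1 (calmest level); every
    numeric endpoint below is an assumption for the sketch.
    """
    t = (level - min_level) / (max_level - min_level)
    return {
        "ttc_threshold_s":     3.5 - 1.5 * t,    # earlier warning when impaired
        "aux_brake_force_n":   1200 - 600 * t,   # stronger assist when impaired
        "lane_change_dist_m":  120 - 50 * t,     # longer, gentler lane change
        "lane_change_time_s":  6.0 - 2.0 * t,
        "safety_distance_m":   50 - 20 * t,      # larger headway when impaired
        "speed_threshold_kph": 80 + 40 * t,      # calmer driver may go faster
    }

params = strategy_parameters(2)   # mid-level emotion
```

Only the monotone directions come from the embodiments; a real calibration would derive the endpoints from vehicle dynamics and safety requirements.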
In one embodiment, the control module 50 is further configured to determine driving scenario information based on the vehicle environment data; and controlling the motion state of the vehicle according to the driving control strategy corresponding to the driving scene information.
In addition, in order to achieve the above object, the present invention also proposes a vehicle control apparatus including: a memory, a processor, and a vehicle control program stored on the memory and executable on the processor, the vehicle control program configured to implement the steps of the vehicle control method as described above.
The vehicle control device adopts all the technical solutions of all the embodiments, so that the vehicle control device has at least all the beneficial effects brought by the technical solutions of the embodiments, and is not described in detail herein.
In addition, the embodiment of the invention also provides a storage medium, wherein the storage medium stores a vehicle control program, and the vehicle control program realizes the steps of the vehicle control method when being executed by a processor.
Because the storage medium adopts all the technical schemes of all the embodiments, the storage medium has at least all the beneficial effects brought by the technical schemes of the embodiments, and the description is omitted here.
It should be understood that the foregoing is illustrative only and is not limiting, and that in specific applications, those skilled in the art may set the invention as desired, and the invention is not limited thereto.
It should be noted that the above-described working procedure is merely illustrative, and does not limit the scope of the present invention, and in practical application, a person skilled in the art may select part or all of them according to actual needs to achieve the purpose of the embodiment, which is not limited herein.
In addition, technical details not described in detail in the present embodiment may refer to the vehicle control method provided in any embodiment of the present invention, and are not described herein.
Furthermore, it should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of the steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be executed at different times; their order of execution is not necessarily sequential, and they may be performed in turn or in alternation with other steps or with at least part of the sub-steps or stages of other steps.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general hardware platform, and of course also by hardware, though in many cases the former is the preferred embodiment. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the method according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (10)

1. A vehicle control method, characterized in that the method comprises:
collecting emotion data of a driver in the vehicle, and determining emotion types based on the emotion data;
analyzing and mapping according to the emotion types to obtain emotion grades;
determining a driving control strategy based on the emotion level;
sensing the surrounding environment of the vehicle to obtain vehicle environment data;
and controlling a motion state of the vehicle based on the driving control strategy and the vehicle environment data.
2. The method of claim 1, wherein the mood data comprises a human body image, the capturing mood data of the driver in the vehicle, and determining a mood type based on the mood data, comprising:
responding to a starting signal of a vehicle, and acquiring a human body image of a driver in the vehicle to obtain the human body image;
extracting features based on the human body image to obtain facial expression features and limb action features;
and analyzing based on the facial expression characteristics and the limb action characteristics to obtain emotion types.
3. The method of claim 1, wherein the emotion data further comprises voice data, the capturing emotion data of the in-vehicle driver and determining an emotion type based on the emotion data, further comprising:
responding to a starting signal of a vehicle, and collecting voice of a driver in the vehicle to obtain voice data;
extracting the characteristics of the voice data to obtain volume characteristics, mood characteristics and keyword characteristics;
and analyzing based on the volume characteristics, the mood characteristics and the keyword characteristics to obtain the emotion type.
4. The method of claim 1, wherein the driving control strategy comprises a braking control strategy, the determining a driving control strategy based on the emotion level comprising:
determining a collision time threshold and an auxiliary braking force based on the emotion level, wherein the collision time threshold and auxiliary braking force are inversely related to the emotion level;
a brake control strategy is determined based on the collision time and the auxiliary braking force.
5. The method of claim 4, wherein the driving control strategy further comprises a steering control strategy, the determining a driving control strategy based on the emotion level further comprising:
determining lane changing distance and lane changing duration based on the emotion level, wherein the lane changing distance and the lane changing duration are inversely related to the emotion level;
and determining a steering control strategy based on the lane change distance and the lane change duration.
6. The method of claim 5, wherein the driving control strategy further comprises a driving control strategy, the determining a driving control strategy based on the emotion level further comprising:
determining a safety distance and a vehicle speed threshold value which are needed to be kept between a current vehicle and a front vehicle based on the emotion level, wherein the safety distance is inversely related to the emotion level, and the vehicle speed threshold value is positively related to the emotion level;
a drive control strategy is determined based on the relative distance spacing and a vehicle speed threshold.
7. The method according to any one of claims 1 to 6, characterized in that the controlling the motion state of the vehicle based on the driving control strategy and the vehicle environment data includes:
determining driving scenario information based on the vehicle environment data;
and controlling the motion state of the vehicle according to the driving control strategy corresponding to the driving scene information.
8. A vehicle control apparatus, characterized by comprising:
the system comprises an acquisition module, a control module and a control module, wherein the acquisition module is used for acquiring emotion data of a driver in a vehicle and determining emotion types based on the emotion data;
the analysis module is used for analyzing and mapping according to the emotion types to obtain emotion grades;
a determining module for determining a driving control strategy based on the emotion level;
the sensing module is used for sensing the surrounding environment of the vehicle to obtain vehicle environment data;
and the control module is used for controlling the motion state of the vehicle based on the driving control strategy and the vehicle environment data.
9. A vehicle control apparatus, characterized by comprising: a memory, a processor, and a vehicle control program stored on the memory and executable on the processor, the vehicle control program configured to implement the vehicle control method of any one of claims 1 to 7.
10. A storage medium having stored thereon a vehicle control program which, when executed by a processor, implements the vehicle control method according to any one of claims 1 to 7.
CN202311674941.9A 2023-12-06 Vehicle control method, device, equipment and storage medium Pending CN117465459A (en)

Priority Applications (1)

CN202311674941.9A, priority and filing date 2023-12-06: Vehicle control method, device, equipment and storage medium

Publications (1)

CN117465459A, published 2024-01-30

Family ID: 89627625

Country Status (1): CN


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination