CN116061951A - Vehicle control method and device, vehicle and storage medium - Google Patents

Vehicle control method and device, vehicle and storage medium

Info

Publication number
CN116061951A
Authority
CN
China
Prior art keywords
driver
vehicle
target
candidate
determining
Prior art date
Legal status
Pending
Application number
CN202310171503.4A
Other languages
Chinese (zh)
Inventor
韩亚凝
张建
刘秋铮
王御
洪日
姜洪伟
王珊
谢飞
张苏铁
Current Assignee
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date
Filing date
Publication date
Application filed by FAW Group Corp
Priority to CN202310171503.4A
Publication of CN116061951A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of such driving parameters related to drivers or passengers
    • B60W2040/0818 Inactivity or incapacity of driver
    • B60W2040/0827 Inactivity or incapacity of driver due to sleepiness
    • B60W2040/0881 Seat occupation; Driver or passenger presence
    • B60W2040/089 Driver voice
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00 Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02 Safety devices responsive to conditions relating to the driver
    • B60K28/06 Safety devices responsive to incapacity of driver
    • B60K28/063 Safety devices responsive to incapacity of driver, preventing starting of vehicles
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 Alarm means
    • B60W2050/146 Display means
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/045 Occupant permissions
    • B60W2540/21 Voice
    • B60W2540/22 Psychological state; Stress level or workload
    • B60W2540/221 Physiology, e.g. weight, heartbeat, health or special needs
    • B60W2540/26 Incapacity

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a vehicle control method and device, a vehicle, and a storage medium, relating to the technical field of vehicle control. The method comprises the following steps: identifying the roles and states of candidate persons in the vehicle; determining a target person from the candidate persons according to the roles and states of the candidate persons; determining target control data of a target system of the vehicle according to the state of the target person; and controlling the target system according to the target control data. By determining the target person from the roles and states of the different persons in the vehicle and controlling the vehicle according to the state of the target person, the method takes persons with different roles in the vehicle into account, improves the degree of intelligence, and improves the user experience.

Description

Vehicle control method and device, vehicle and storage medium
Technical Field
The present invention relates to the field of vehicle control technologies, and in particular, to a vehicle control method, a device, a vehicle, and a storage medium.
Background
At present, intelligent driving is becoming more and more popular. When a vehicle is started, the persons in the vehicle can select a suitable driving mode according to their own needs to confirm an intelligent driving scheme, which is convenient and quick.
In the prior art, a driving mode can be selected based on passenger information: the passenger information is acquired, analyzed, and fed back to the driver for confirmation and selection, so that an intelligent driving mode that meets the passengers' requirements can be selected. However, because intelligent driving is controlled only according to passenger information, it is very limited and cannot take other persons in the vehicle into account, so the degree of humanization and intelligence is low.
Disclosure of Invention
The invention provides a vehicle control method and device, a vehicle, and a storage medium, in which a target person can be determined from the roles and states of the different persons in the vehicle and the vehicle is then controlled according to the state of the target person, so that persons with different roles in the vehicle are taken into account, the degree of intelligence is improved, and the user experience is improved.
In a first aspect, the present invention provides a vehicle control method including:
identifying roles and states of candidate persons in the vehicle;
determining a target person from the candidate persons according to the roles and the states of the candidate persons;
determining target control data of a target system of the vehicle according to the state of the target person;
and controlling the target system according to the target control data.
In a second aspect, the present invention provides a vehicle control apparatus comprising:
the identification module is used for identifying the roles and states of candidate persons in the vehicle;
the target person determining module is used for determining target persons from the candidate persons according to the roles and the states of the candidate persons;
the data determining module is used for determining target control data of a target system of the vehicle according to the state of the target person;
and the control module is used for controlling the target system according to the target control data.
In a third aspect, the present invention provides a vehicle comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing a vehicle control method according to any of the embodiments of the present invention when executing the program.
In a fourth aspect, the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a vehicle control method according to any of the embodiments of the present invention.
According to the scheme, the roles and the states of the candidate persons in the vehicle can be identified; determining a target person from the candidate persons according to the roles and the states of the candidate persons; determining target control data of a target system of the vehicle according to the state of the target person; and controlling the target system according to the target control data. The method and the system determine the target personnel through the roles and the states of different personnel in the vehicle, and control the vehicle according to the states of the target personnel, so that the personnel with different roles in the vehicle are considered, the intelligent degree is improved, and the use experience of a user is improved.
Drawings
In order to more clearly illustrate the technical solutions of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings only illustrate some embodiments of the present invention and should not be considered as limiting its scope, and that other related drawings can be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is a schematic flow chart of a vehicle control method provided by the present invention;
FIG. 2 is another flow chart of a vehicle control method provided by the present invention;
FIG. 3 is a schematic view of a vehicle control apparatus according to the present invention;
fig. 4 is a schematic structural view of a vehicle according to the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a schematic flow chart of a vehicle control method provided by the present invention. The method may be performed by a vehicle control device provided by the present invention, which may be implemented in software and/or hardware. In a specific embodiment, the device may be integrated in a vehicle. The following embodiment is described taking the integration of the device in a vehicle as an example. Referring to fig. 1, the method may specifically include the following steps:
step 101, identifying roles and states of candidate persons in the vehicle.
An in-vehicle candidate is a person present in the vehicle. For example, when only a driver is in the vehicle, the candidate is the driver; when a driver and one or more passengers are in the vehicle, the candidates are the driver and the passengers.
Specifically, the roles and states of the candidates can be identified by acquiring information about the candidates in the vehicle. Optionally, this information may be acquired by gravity sensors preset in the vehicle, and the roles and states derived from it. For example, a gravity sensor is installed under each seat: when only the sensor at the driver's seat detects that someone is seated, only a driver is identified and the candidate is the driver; when the sensors of several seats detect occupants, a driver and one or more passengers are identified, and the candidates are the driver and the passengers.
Alternatively, image information about the candidates may be acquired by image sensors preset in the vehicle, and the roles and states derived from it. For example, an image sensor is installed in front of each seat: when only the driver's position is occupied in the acquired images, only a driver is identified and the candidate is the driver; when several seats are occupied in the acquired images, a driver and one or more passengers are identified, and the candidates are the driver and the passengers.
Alternatively, sound information from the candidates may be acquired by sound-capture devices preset in the vehicle, and the roles and states derived from it. For example, a sound-capture device that only picks up sound close to its seat is installed in front of each seat: when the captured sound comes only from the driver's seat, only a driver is identified and the candidate is the driver; when the captured sound contains the voices of several persons, a driver and one or more passengers are identified, and the candidates are the driver and the passengers.
Alternatively, the above several methods may be arbitrarily combined to identify the candidate person in the vehicle, which is not particularly limited herein.
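As a minimal illustration of how the sensing results described above might be mapped to candidate roles, the following sketch assumes a hypothetical Candidate structure and seat labels that are not taken from the patent; it shows only the seat-to-role mapping, not real sensor access.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Candidate:
    seat: str                      # e.g. "driver", "front_passenger", "rear_left"
    role: str                      # "driver" or "passenger"
    state: Dict[str, object] = field(default_factory=dict)

def identify_candidates(occupied_seats: List[str]) -> List[Candidate]:
    """Map occupied seats (detected by gravity, image, or sound sensing) to roles."""
    return [
        Candidate(seat=seat, role="driver" if seat == "driver" else "passenger")
        for seat in occupied_seats
    ]

# Only the driver's seat sensor detects a person -> one candidate, the driver.
print(identify_candidates(["driver"]))
# Several seats are occupied -> the driver plus passengers.
print(identify_candidates(["driver", "front_passenger", "rear_left"]))
```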
And 102, determining a target person from the candidate persons according to the roles and the states of the candidate persons.
Optionally, when only the driver is included in the roles of the candidate persons, determining the target person as the driver;
specifically, when only the driver is included in the roles of the candidate person, the target person is determined to be the driver, and the target control data of the target system of the vehicle is determined according to the driver state.
Optionally, when the roles of the candidate persons include a driver and a passenger, it is judged whether the driver is qualified to drive, is driving while fatigued, or is emotionally abnormal.
Specifically, when the roles of the candidate persons include a driver and a passenger, the driver's age range is judged from the acquired driver information to determine whether the driver is qualified to drive; if the driver is not qualified, the driver's state is judged to be abnormal and the target person is determined to be the driver. For example, face recognition is performed on the driver's facial information, and the recognition result is used to query the driver records stored in the vehicle or on the internet to determine whether the driver holds a driving license; if not, the driver is not qualified to drive, the driver's state is determined to be abnormal, and the target person is determined to be the driver.
Whether the driver is fatigued is judged from the acquired driver information to determine whether the driver is driving while fatigued; if so, the driver's state is judged to be abnormal and the target person is determined to be the driver. For example, if images show that the driver has been driving for 4 consecutive hours without leaving the seat, the driver is determined to be driving while fatigued, the driver's state is determined to be abnormal, and the target person is determined to be the driver.
Whether the driver's emotion is abnormal is judged from the acquired driver information; if it is, the driver's state is judged to be abnormal and the target person is determined to be the driver. Emotional abnormality includes emotions such as anger, sadness, agitation, or anxiety. For example, if the driver in the acquired image is in tears, the driver is judged to be sad, which is an abnormal emotional state, so the driver's state is determined to be abnormal and the target person is determined to be the driver; if the driver's volume in the acquired sound information reaches 100 dB, the driver is judged to be angry, which is an abnormal emotional state, so the driver's state is determined to be abnormal and the target person is determined to be the driver.
Alternatively, when the driver's state is normal, the target person is determined to be a passenger.
Specifically, when the driver simultaneously has driving qualification, is not driving while fatigued, and is emotionally normal, the driver's state is determined to be normal, and the target person is then determined to be a passenger. Optionally, the state of a passenger includes the age segment of the passenger and the physiological activity state of the passenger. The age segments of passengers may be divided into children under 12 years old, passengers between 12 and 60 years old, and elderly people over 60 years old; the physiological activity state may be overly agitated behavior such as hitting the car door or pulling at the driver, or states such as resting with eyes closed, chatting calmly, or using a mobile phone.
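The selection logic of this step can be summarized in a short sketch; the state flags (qualified, fatigued, emotion_abnormal) are illustrative names rather than terminology from the patent, and the Candidate structure is the hypothetical one sketched above.

```python
def determine_target_person(candidates):
    """Step 102 sketch: choose the driver or the passengers as the target."""
    driver = next(c for c in candidates if c.role == "driver")
    passengers = [c for c in candidates if c.role == "passenger"]

    if not passengers:
        return driver                                   # only a driver in the vehicle

    driver_abnormal = (
        not driver.state.get("qualified", True)         # no driving qualification
        or driver.state.get("fatigued", False)          # fatigue driving
        or driver.state.get("emotion_abnormal", False)  # abnormal emotion
    )
    # An abnormal driver takes priority; otherwise the passengers become the target.
    return driver if driver_abnormal else passengers
```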
And step 103, determining target control data of a target system of the vehicle according to the state of the target person.
The target system comprises a driving control system and a man-machine interaction system.
Specifically, when the target person is a driver, determining target control data of a driving control system and a man-machine interaction system of the vehicle according to the state of the driver; or when the target person is a passenger, determining target control data of a driving control system and a man-machine interaction system of the vehicle according to the state of the passenger.
For example, when the target person is the driver and the driver is not capable of driving: if the vehicle is powered on but not yet ignited, the target control data of the driving control system is to disable the ignition function, thereby preventing the vehicle from starting; if the vehicle is being driven, the target control data of the human-machine interaction system is to show a prompt in large red text on the vehicle's central control unit: "Please stop as soon as possible", to play an in-car voice prompt: "Please stop as soon as possible", and to show on the rear-vehicle reminder on the rear window: "Please keep a distance from the vehicle ahead".
For example, when the target person is the driver and the driver is driving while fatigued, the target control data of the human-machine interaction system is to show a prompt on the central control unit: "You are fatigued, please stop and rest", to play an in-car voice prompt: "You are fatigued, please stop and rest", and to automatically play refreshing music if the driver does not stop the vehicle.
For example, when the target person is the driver and the driver is in an emergency state, the target control data of the driving control system is to increase engine power, reduce steering assistance, increase steering wheel resistance, increase brake sensitivity, lower the air suspension to its minimum height, and set the shock absorbers to their hardest setting.
For example, when the target person is the driver and the driver is angry, the target control data of the human-machine interaction system is to show a prompt on the central control unit: "Please do not get angry; it affects your driving", and to automatically play soothing music through the audio system; the target control data of the driving control system is to reduce engine power, increase steering assistance, reduce steering wheel resistance, and reduce brake sensitivity.
For example, when the target person is a passenger and the passengers include elderly people or children, riding comfort should be improved: the target control data of the driving control system is to reduce engine power, increase steering assistance, reduce steering wheel resistance, and reduce brake sensitivity; the target control data of the human-machine interaction system is to show on the rear-vehicle reminder on the rear window: "Elderly people and children are on board; please do not rush us".
For example, when the target person is a passenger and some passengers are asleep, riding comfort should be improved further: the target control data of the driving control system is to reduce engine power to the minimum, increase steering assistance to the maximum, reduce steering wheel resistance to the minimum, and reduce brake sensitivity to the minimum; the target control data of the human-machine interaction system is to turn down the in-car audio volume.
For example, when the target person is a passenger and one of the passengers is in an emergency state, the target control data of the driving control system is to increase engine power to the maximum, reduce steering assistance to the minimum, increase steering wheel resistance to the maximum, increase brake sensitivity to the maximum, lower the air suspension to its minimum height, and set the shock absorbers to their hardest setting.
For example, when the target person is a passenger and one of the passengers is overly agitated, for example pulling at the driver, the target control data of the human-machine interaction system is to show a prompt in large red text on the central control unit: "Please stop as soon as possible", to play an in-car voice prompt: "Passengers, please do not panic; please stop as soon as possible", and to show on the rear-vehicle reminder on the rear window: "Please keep a distance from the vehicle ahead".
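A condensed sketch of how such state-to-control-data mappings could be organized is shown below; the dictionary keys and value scales are assumptions for illustration, not the patent's data format, and only a subset of the examples above is encoded.

```python
def determine_target_control_data(target_role: str, state: dict) -> dict:
    """Step 103 sketch: map the target person's state to control data."""
    drive, hmi = {}, {}
    if target_role == "driver":
        if not state.get("qualified", True):
            drive["ignition_enabled"] = False                    # block starting
            hmi["central_display"] = "Please stop as soon as possible"
        if state.get("fatigued"):
            hmi["voice_prompt"] = "You are fatigued, please stop and rest"
            hmi["music"] = "refreshing"
        if state.get("emotion") == "angry":
            drive.update(engine_power="low", steering_assist="high",
                         brake_sensitivity="low")
            hmi["music"] = "soothing"
    else:  # target person is a passenger
        if state.get("has_elderly_or_children") or state.get("asleep"):
            drive.update(engine_power="low", steering_assist="high",
                         brake_sensitivity="low")
        if state.get("asleep"):
            hmi["audio_volume"] = "low"
    return {"driving_control": drive, "hmi": hmi}
```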
And 104, controlling the target system according to the target control data.
Specifically, after the target control data is determined, the target system is controlled according to the target control data.
For example, when the target person is the driver and the driver is not capable of driving: if the vehicle is powered on but not yet ignited, the driving control system disables the ignition function and prevents the vehicle from starting; if the vehicle is being driven, the human-machine interaction system shows a prompt in large red text on the vehicle's central control unit: "Please stop as soon as possible", plays an in-car voice prompt: "Please stop as soon as possible", and shows on the rear-vehicle reminder on the rear window: "Please keep a distance from the vehicle ahead".
For example, when the target person is the driver and the driver is driving while fatigued, the human-machine interaction system shows a prompt on the central control unit: "You are fatigued, please stop and rest", plays an in-car voice prompt: "You are fatigued, please stop and rest", and automatically plays refreshing music if the driver does not stop the vehicle.
Illustratively, when the target person is the driver and the driver is in an emergency state, the driving control system increases engine power, reduces steering assistance, increases steering wheel resistance, increases brake sensitivity, lowers the air suspension to its minimum height, and sets the shock absorbers to their hardest setting.
Illustratively, when the target person is the driver and the driver is angry, the human-machine interaction system shows a prompt on the central control unit: "Please do not get angry; it affects your driving", and automatically plays soothing music; the driving control system reduces engine power, increases steering assistance, reduces steering wheel resistance, and reduces brake sensitivity.
For example, when the target person is a passenger and the passengers include elderly people or children, riding comfort should be improved: the driving control system reduces engine power, increases steering assistance, reduces steering wheel resistance, and reduces brake sensitivity; the human-machine interaction system shows on the rear-vehicle reminder on the rear window: "Elderly people and children are on board; please do not rush us".
For example, when the target person is a passenger and one of the passengers is asleep, the driving control system reduces engine power to the minimum, increases steering assistance to the maximum, reduces steering wheel resistance to the minimum, and reduces brake sensitivity to the minimum; the human-machine interaction system turns down the in-car audio volume.
Illustratively, when the target person is a passenger and one of the passengers is in an emergency state, the driving control system increases engine power to the maximum, reduces steering assistance to the minimum, increases steering wheel resistance to the maximum, increases brake sensitivity to the maximum, lowers the air suspension to its minimum height, and sets the shock absorbers to their hardest setting.
For example, when the target person is a passenger and one of the passengers is overly agitated, for example pulling at the driver, the human-machine interaction system shows a prompt in large red text on the central control unit: "Please stop as soon as possible", plays an in-car voice prompt: "Passengers, please do not panic; please stop as soon as possible", and shows on the rear-vehicle reminder on the rear window: "Please keep a distance from the vehicle ahead".
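Applying the determined data then amounts to dispatching each entry to the corresponding subsystem. In the sketch below the driving_control and hmi objects stand for the vehicle's subsystem interfaces; their method names are placeholders rather than a real vehicle API.

```python
def apply_control_data(control_data: dict, driving_control, hmi) -> None:
    """Step 104 sketch: forward the target control data to the target systems."""
    for key, value in control_data.get("driving_control", {}).items():
        driving_control.set(key, value)    # e.g. ignition_enabled, engine_power
    for key, value in control_data.get("hmi", {}).items():
        hmi.apply(key, value)              # e.g. central_display, voice_prompt
```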
According to the scheme, the roles and the states of the candidate persons in the vehicle are identified; determining a target person from the candidate persons according to the roles and the states of the candidate persons; determining target control data of a target system of the vehicle according to the state of the target person; and controlling the target system according to the target control data. The method and the system determine the target personnel through the roles and the states of different personnel in the vehicle, and control the vehicle according to the states of the target personnel, so that the personnel with different roles in the vehicle are considered, the intelligent degree is improved, and the use experience of a user is improved.
Fig. 2 is another schematic flow chart of a vehicle control method provided by the present invention, and this embodiment is a further refinement of the foregoing embodiment, and as shown in fig. 2, the method may include the following steps:
in step 201, image information and sound information of a candidate person are collected.
The image information of a candidate person refers to information that can be collected from images containing the candidate person, and the sound information of a candidate person refers to sounds made by persons in the vehicle that can be collected. Specifically, the image information and sound information of the candidate persons in the vehicle are collected by hardware devices pre-installed in the vehicle.
For example, three cameras are arranged at the front of the cabin and three in the middle, together with one microphone. The three front cameras acquire the facial information of the front-row occupants from the front, front-left, and front-right; the three middle cameras acquire the facial information of the rear-row occupants from the front, front-left, and front-right; and the microphone collects the sound of the persons in the vehicle.
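A possible way to describe this capture setup in software is sketched below; the layout constants and the read_camera/read_microphone calls are placeholders, not a real in-vehicle API.

```python
SENSOR_LAYOUT = {
    "front_cameras": ["front", "front_left", "front_right"],   # front-row faces
    "middle_cameras": ["front", "front_left", "front_right"],  # rear-row faces
    "microphone": "cabin",                                      # in-car sound
}

def collect_image_and_sound(sensors):
    """Step 201 sketch: gather one frame per camera plus an audio clip."""
    images = {
        (group, position): sensors.read_camera(group, position)
        for group in ("front_cameras", "middle_cameras")
        for position in SENSOR_LAYOUT[group]
    }
    audio = sensors.read_microphone(SENSOR_LAYOUT["microphone"])
    return images, audio
```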
Step 202, analyzing the image information and the sound information of the candidate persons, thereby identifying the roles and states of the candidate persons. If the candidates include only a driver, step 203 is executed; if the candidates include a driver and passengers, step 204 is executed.
Specifically, after the image information and sound information of the candidate persons are acquired, the acquired images are transmitted to an image processing device, and the roles of the candidate persons in the vehicle are identified. For example, in the images acquired by the front-row cameras, a person with a steering wheel in front of them is a driver, and a person without a steering wheel in front of them is a passenger. The software divides the image processing into three stages.
The acquired images are also transmitted to the image processing device to identify the states of the candidate persons in the vehicle. For example, facial information of a person in the vehicle, such as the facial contour, the state of facial wrinkles, and the state of the facial features, is extracted from the images. The age segment of a person can be determined from the facial contour and wrinkle state; for example, persons in the vehicle can be divided into teenagers under 18 years old, middle-aged people between 18 and 60 years old, and elderly people over 60 years old according to the state of facial wrinkles. It is also possible to determine from the state of the facial features, for example the state of the eyes, whether a person in the vehicle is asleep.
The collected sound information is transmitted to a sound processing device, and the states of the candidate persons, such as volume and speech rate, are identified from it. For example, the volume of a candidate person is determined from the collected sound information, and if the volume exceeds 80 dB, the candidate person is judged to be in an abnormal emotional state.
Optionally, after the roles and states of the candidate persons are acquired: when only driver information can be acquired from the candidate person information, it is confirmed that the roles of the candidate persons include only a driver; when driver information and front-passenger or rear-passenger information can both be acquired, it is confirmed that the roles of the candidate persons include a driver and passengers.
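The analysis described in this step can be sketched as a small feature-to-state mapping; the age boundaries and the 80 dB threshold follow the examples in the text, while the feature names themselves are assumptions.

```python
def analyze_state(face_features: dict, audio_features: dict) -> dict:
    """Step 202 sketch: derive a coarse occupant state from extracted features."""
    state = {}
    # Age segment from facial contour / wrinkle features.
    age = face_features.get("estimated_age", 30)
    state["age_group"] = ("under_18" if age < 18
                          else "18_to_60" if age <= 60
                          else "over_60")
    # Sleep state from the eyes.
    state["asleep"] = face_features.get("eyes_closed", False)
    # Abnormal emotion if the speech volume exceeds 80 dB.
    state["emotion_abnormal"] = audio_features.get("volume_db", 0) > 80
    return state
```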
In step 203, the target person is determined to be the driver.
Specifically, when only the driver is included in the roles of the candidate persons, the target person is determined to be the driver.
Step 204, it is determined whether the driver is qualified to drive, is driving while fatigued, or is emotionally abnormal.
Specifically, when the candidate persons include a driver and passengers, it is determined whether the driver is qualified to drive, is driving while fatigued, or is emotionally abnormal. When the roles of the candidate persons are confirmed to include a driver and passengers, the driver's age range is judged from the collected image and sound information to determine whether the driver is qualified to drive; the state of the driver's facial features is judged from the image information and whether the driver's intonation sounds drowsy is judged from the sound information, to determine whether the driver is driving while fatigued; and the driver's facial expression is judged from the image information and the driver's volume and tone from the sound information, to determine whether the driver's emotion is abnormal.
For example, whether the driver is under 16 years old or over 70 years old is judged from the driver's facial wrinkles and voice; if the driver falls in either age group, the driver is judged not to be qualified to drive. If the driver's image information shows a limb disability or that the driver is drunk, the driver is likewise judged not to be qualified to drive. Whether the driver is driving while fatigued is judged from the driver's blink frequency, and if abnormal blinking occurs several times within one minute, the driver is judged to be driving while fatigued. If the driver's volume is excessive, the face is flushed, and the facial expression is angry, the driver is judged to be in an abnormal emotional state.
Step 205, the state of the driver is determined to be abnormal when it satisfies at least one of lack of driving qualification, fatigue driving, and emotional abnormality. If the driver's state is abnormal, step 206 is performed; if the driver's state is normal, step 207 is performed.
The state of the driver is classified as normal or abnormal. Specifically, when the state of the driver satisfies at least one of lack of driving qualification, fatigue driving, and emotional abnormality, the state of the driver is determined to be abnormal; when the driver simultaneously has driving qualification, is not driving while fatigued, and is emotionally normal, the state of the driver is determined to be normal.
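The three checks and their aggregation can be expressed compactly as below; the age limits and the 80 dB volume follow the examples above, while the blink count and the other feature names are illustrative assumptions.

```python
def is_driver_state_abnormal(driver_features: dict) -> bool:
    """Steps 204-205 sketch: abnormal if at least one of the three checks fails."""
    age = driver_features.get("age", 30)
    unqualified = (
        age < 16 or age > 70
        or driver_features.get("limb_disability", False)
        or driver_features.get("drunk", False)
        or not driver_features.get("has_license", True)
    )
    fatigued = driver_features.get("abnormal_blinks_per_minute", 0) >= 3
    emotion_abnormal = (driver_features.get("volume_db", 0) > 80
                        and driver_features.get("expression") == "angry")
    return unqualified or fatigued or emotion_abnormal
```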
Step 206, determining the target person as the driver.
Specifically, when the state of the driver is abnormal, the target person is determined to be the driver, and the target control data of the target system of the vehicle is determined according to the state of the driver.
Step 207, determining the target person as a passenger.
Specifically, when the state of the driver is normal, the target person is determined to be a passenger, and the target control data of the target system of the vehicle is determined according to the state of the passenger.
Optionally, the status of the passenger includes an age segment in which the passenger is located and a physiological activity status of the passenger.
The age segment of a passenger is obtained from a preset division of passengers into different age ranges; the physiological activity state of a passenger covers the passenger's behavior, such as actions, expressions, and emotional states. For example, the age segments of passengers may be divided into children under 12 years old, passengers between 12 and 60 years old, and elderly people over 60 years old; the physiological activity state may be overly agitated behavior such as hitting the car door or pulling at the driver, or states such as resting with eyes closed, chatting calmly, or using a mobile phone.
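The passenger state used in the next step can likewise be represented by a small classifier; the age boundaries (12 and 60) follow the text, and the activity labels are illustrative.

```python
def classify_passenger_state(passenger_features: dict) -> dict:
    """Sketch of the passenger attributes consumed by step 208."""
    age = passenger_features.get("age", 30)
    age_segment = ("child" if age < 12
                   else "adult" if age <= 60
                   else "elderly")
    if (passenger_features.get("hitting_door")
            or passenger_features.get("pulling_driver")):
        activity = "overly_agitated"
    elif passenger_features.get("eyes_closed"):
        activity = "resting"
    else:
        activity = "calm"     # chatting normally, using a phone, etc.
    return {"age_segment": age_segment, "activity": activity}
```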
Step 208, determining target control data of a target system of the vehicle according to the state of the target person.
Step 209, controlling the target system according to the target control data.
It should be understood that, although the steps in the flowchart of fig. 2 are shown in sequence as indicated by the arrows, the steps are not necessarily performed in sequence as indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least a portion of the steps in fig. 2 may include a plurality of sub-steps or stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of execution of the sub-steps or stages is not necessarily sequential, but may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps or other steps, which is not limited in this regard.
According to the scheme, the roles and the states of the candidate persons in the vehicle can be identified; determining a target person from the candidate persons according to the roles and the states of the candidate persons; determining target control data of a target system of the vehicle according to the state of the target person; and controlling the target system according to the target control data. The method and the system determine the target personnel through the roles and the states of different personnel in the vehicle, and control the vehicle according to the states of the target personnel, so that the personnel with different roles in the vehicle are considered, the intelligent degree is improved, and the use experience of a user is improved.
Fig. 3 is a schematic structural diagram of a vehicle control apparatus according to the present invention, which is adapted to perform the vehicle control method according to the present invention. As shown in fig. 3, the apparatus may specifically include:
an identification module 301 for identifying the roles and states of candidate persons in the vehicle;
the target person determining module 302 is configured to determine a target person from the candidate persons according to the roles and states of the candidate persons;
a data determining module 303, configured to determine target control data of a target system of the vehicle according to a state of the target person;
and the control module 304 is configured to control the target system according to the target control data.
In one embodiment, the identification module 301 is specifically configured to:
collecting image information and sound information of the candidate person;
and analyzing the image information and the sound information of the candidate personnel so as to identify the roles and the states of the candidate personnel.
In one embodiment, the target person determination module 302 is specifically configured to:
when only a driver is included in the roles of the candidate persons, determining that the target person is the driver;
and when the roles of the candidate personnel comprise a driver and a passenger, determining a target personnel from the candidate personnel according to the state of the driver.
In one embodiment, in determining a target person from the candidate persons according to the state of the driver, the target person determining module 302 is specifically configured to:
judging whether the state of the driver is abnormal;
when the state of the driver is abnormal, determining that the target person is the driver;
and when the state of the driver is normal, determining that the target person is the passenger.
In one embodiment, in determining whether the state of the driver is abnormal, the target person determining module 302 is specifically configured to:
judging whether the driver is qualified to drive, is driving while fatigued, or is emotionally abnormal;
and determining the state of the driver to be abnormal when the state of the driver satisfies at least one of lack of driving qualification, fatigue driving and emotional abnormality.
In one embodiment, the status of the passenger includes an age segment in which the passenger is located and a physiological activity status of the passenger.
In one embodiment, the target system includes a driving control system and a human-computer interaction system.
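As a rough sketch of how the four modules of the apparatus could be composed (the class and method names are hypothetical and simply wire together functions like those sketched for the method above):

```python
class VehicleControlApparatus:
    """Composition of the identification (301), target person determination (302),
    data determination (303), and control (304) modules."""

    def __init__(self, identify, determine_target, determine_data, control):
        self.identification_module = identify
        self.target_person_module = determine_target
        self.data_determination_module = determine_data
        self.control_module = control

    def run(self, sensors, vehicle_systems):
        candidates = self.identification_module(sensors)    # roles and states
        target = self.target_person_module(candidates)      # driver or passengers
        data = self.data_determination_module(target)       # target control data
        self.control_module(data, vehicle_systems)          # apply to target system
```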
The device provided by the invention can identify the roles and states of candidate persons in the vehicle; determining a target person from the candidate persons according to the roles and the states of the candidate persons; determining target control data of a target system of the vehicle according to the state of the target person; and controlling the target system according to the target control data. The method and the system determine the target personnel through the roles and the states of different personnel in the vehicle, and control the vehicle according to the states of the target personnel, so that the personnel with different roles in the vehicle are considered, the intelligent degree is improved, and the use experience of a user is improved.
The invention also provides a vehicle, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the vehicle control method provided by any one of the embodiments when executing the program.
The present invention also provides a computer-readable medium having stored thereon a computer program which, when executed by a processor, implements the vehicle control method provided by any of the above embodiments.
Referring now to FIG. 4, a schematic diagram of a computer system 400 suitable for use with a vehicle embodying the present invention is shown. The vehicle illustrated in fig. 4 is merely an example, and should not be construed as limiting the functionality and scope of use of the present invention.
As shown in fig. 4, the computer system 400 includes a Central Processing Unit (CPU) 401, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage section 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data required for the operation of the computer system 400 are also stored. The CPU 401, ROM 402, and RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
The following components are connected to the I/O interface 405: an input section 406 including a keyboard, a mouse, and the like; an output portion 407 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 408 including a hard disk or the like; and a communication section 409 including a network interface card such as a LAN card or a modem. The communication section 409 performs communication processing via a network such as the internet. The drive 410 is also connected to the I/O interface 405 as needed. A removable medium 411 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is installed on the drive 410 as needed, so that a computer program read from it is installed into the storage section 408 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 409 and/or installed from the removable medium 411. The above-described functions defined in the system of the present invention are performed when the computer program is executed by a Central Processing Unit (CPU) 401.
The computer readable medium shown in the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules and/or units described in the present invention may be implemented in software or in hardware. The described modules and/or units may also be provided in a processor, e.g., may be described as: a processor includes an identification module, a target person determination module, a data determination module, and a control module. The names of these modules do not constitute a limitation on the module itself in some cases.
As another aspect, the present invention also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be present alone without being fitted into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to include:
identifying roles and states of candidate persons in the vehicle; determining a target person from the candidate persons according to the roles and the states of the candidate persons; determining target control data of a target system of the vehicle according to the state of the target person; and controlling the target system according to the target control data.
According to the technical scheme, the roles and the states of the candidate persons in the vehicle can be identified; determining a target person from the candidate persons according to the roles and the states of the candidate persons; determining target control data of a target system of the vehicle according to the state of the target person; and controlling the target system according to the target control data. The method and the system determine the target personnel through the roles and the states of different personnel in the vehicle, and control the vehicle according to the states of the target personnel, so that the personnel with different roles in the vehicle are considered, the intelligent degree is improved, and the use experience of a user is improved.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The acquisition, storage, use, and processing of data in the technical solution of the present application comply with the relevant provisions of national laws and regulations. The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations, and alternatives are possible depending on design requirements and other factors. Any modifications, equivalent substitutions, and improvements made within the spirit and principles of the present invention shall be included in the scope of the present invention.

Claims (10)

1. A vehicle control method characterized by comprising:
identifying roles and states of candidate persons in the vehicle;
determining a target person from the candidate persons according to the roles and the states of the candidate persons;
determining target control data of a target system of the vehicle according to the state of the target person;
and controlling the target system according to the target control data.
2. The method of claim 1, wherein the identifying the role and status of the candidate in the vehicle comprises:
collecting image information and sound information of the candidate person;
and analyzing the image information and the sound information of the candidate personnel so as to identify the roles and the states of the candidate personnel.
3. The method of claim 1, wherein said determining a target person from said candidate persons based on their roles and status comprises:
when only a driver is included in the roles of the candidate persons, determining that the target person is the driver;
and when the roles of the candidate personnel comprise a driver and a passenger, determining a target personnel from the candidate personnel according to the state of the driver.
4. A method according to claim 3, wherein said determining a target person from said candidate persons based on the status of said driver comprises:
judging whether the state of the driver is abnormal;
when the state of the driver is abnormal, determining that the target person is the driver;
and when the state of the driver is normal, determining that the target person is the passenger.
5. The method of claim 4, wherein the determining whether the driver's status is abnormal comprises:
judging whether the driver is qualified to drive, is driving while fatigued, or is emotionally abnormal;
and determining the state of the driver to be abnormal when the state of the driver satisfies at least one of lack of driving qualification, fatigue driving and emotional abnormality.
6. The method of claim 3 or 4, wherein the status of the passenger comprises an age segment in which the passenger is located and a physiological activity status of the passenger.
7. The method of claim 1, wherein the target system comprises a drive control system and a human-machine interaction system.
8. A vehicle control apparatus characterized by comprising:
the identification module is used for identifying the roles and states of candidate persons in the vehicle;
the target person determining module is used for determining target persons from the candidate persons according to the roles and the states of the candidate persons;
the data determining module is used for determining target control data of a target system of the vehicle according to the state of the target person;
and the control module is used for controlling the target system according to the target control data.
9. A vehicle comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the vehicle control method according to any one of claims 1 to 7 when executing the program.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the vehicle control method according to any one of claims 1 to 7.
CN202310171503.4A 2023-02-27 2023-02-27 Vehicle control method and device, vehicle and storage medium Pending CN116061951A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310171503.4A CN116061951A (en) 2023-02-27 2023-02-27 Vehicle control method and device, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310171503.4A CN116061951A (en) 2023-02-27 2023-02-27 Vehicle control method and device, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN116061951A (en) 2023-05-05

Family

ID=86173096

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310171503.4A Pending CN116061951A (en) 2023-02-27 2023-02-27 Vehicle control method and device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN116061951A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination