CN113867363B - Vehicle control method and device, vehicle and storage medium - Google Patents


Info

Publication number
CN113867363B
CN113867363B
Authority
CN
China
Prior art keywords
brain
vehicle
data
control
driver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111232970.0A
Other languages
Chinese (zh)
Other versions
CN113867363A (en)
Inventor
黄雨其
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xiaopeng Autopilot Technology Co Ltd
Original Assignee
Guangzhou Xiaopeng Autopilot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xiaopeng Autopilot Technology Co Ltd
Priority to CN202111232970.0A
Publication of CN113867363A
Application granted
Publication of CN113867363B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0225Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The embodiment of the invention provides a vehicle control method and device, a vehicle, and a storage medium, wherein the method comprises the following steps: generating environment perception data based on external environment data acquired in real time; transmitting the environment perception data to a brain-computer connected to the vehicle so as to feed it back to the driver's brain; receiving a control instruction sent by the brain-computer, the control instruction being determined according to feedback information produced by the driver's brain in response to the environment perception data; determining driving control information based on the received control instruction; and controlling the vehicle according to the driving control information. The embodiment of the invention enables brain-computer-based control of the vehicle, in which the vehicle is controlled according to the driver's conscious intent, improving the driving experience.

Description

Vehicle control method and device, vehicle and storage medium
Technical Field
The present invention relates to the field of automatic driving technology, and in particular, to a vehicle control method, a vehicle control apparatus, a vehicle, and a storage medium.
Background
With the development of intelligent vehicles, L3-level automatic driving can now be realized in mass-production vehicles, and L4-level automatic driving is technically achievable. The current automatic-driving decision scheme first trains a neural network model on a large amount of historical data to obtain an automatic-driving decision model. During automatic driving, the environment is recognized from on-board sensors, the recognition results are input into the decision model to obtain decision results, and the power and steering of the vehicle are controlled according to those results to realize automatic driving. However, because different drivers have different driving styles, a single uniform automatic-driving behavior cannot satisfy every driver; moreover, if the driver intervenes in the automatic driving system with a driving intention, the system cancels the execution of automatic driving. Automatic driving therefore cannot follow the driver's intent, which greatly reduces the driving experience.
Disclosure of Invention
In view of the above problems, embodiments of the present invention have been made to provide a vehicle control method, a vehicle control apparatus, a vehicle, and a storage medium that overcome, or at least partially solve, the above problems.
In order to solve the above problems, an embodiment of the present invention discloses a vehicle control method, including:
Generating environment perception data based on external environment data acquired in real time;
Transmitting the context awareness data to a brain-computer connected to the vehicle to feed back the context awareness data to the driver's brain;
Receiving a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the brain of the driver for the environment sensing data;
determining driving control information based on the received control instruction;
And controlling the vehicle according to the driving control information.
Optionally, the vehicle is provided with a sensor, and the method further comprises:
Acquiring real-time detection signals of the sensor;
Reading map data of a high-precision map, wherein the map data corresponds to the position of the vehicle in the high-precision map;
and determining the detection signal and the map data as external environment data.
Optionally, a brain-computer interface is arranged in the brain-computer; the step of transmitting the context awareness data to a brain-computer connected to the vehicle to feed back the context awareness data to the brain of the driver comprises:
and sending the environment sensing data to the brain-computer, wherein the brain-computer is used for carrying out noise reduction processing on the environment sensing data, converting the noise-reduced environment sensing data into brain electrical signals, and feeding back the brain electrical signals to the brain of a driver by adopting the brain-computer interface.
Optionally, the step of determining driving control information based on the received control instruction includes:
Determining a target driving mode matched with the received control instruction from a plurality of preset driving modes;
Determining a vehicle target control parameter corresponding to the target driving mode;
driving control information is generated based on the vehicle target control parameter.
Optionally, the step of determining driving control information based on the received control instruction includes:
determining control amount information based on the received control instruction, wherein the control amount information comprises a speed control amount and/or a steering control amount;
Driving control information is generated based on the control amount information.
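As an illustration only, the control-amount variant above can be sketched as follows; the field names and types are assumptions, since the patent does not specify how a control instruction encodes speed or steering amounts.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlInstruction:
    # Hypothetical fields: the patent only says the instruction yields a
    # speed control amount and/or a steering control amount.
    speed_delta_kph: Optional[float] = None   # desired speed change, if any
    steering_deg: Optional[float] = None      # desired steering angle, if any

@dataclass
class DrivingControlInfo:
    speed_control: Optional[float]
    steering_control: Optional[float]

def determine_driving_control(instr: ControlInstruction) -> DrivingControlInfo:
    """Derive driving control information containing a speed control amount
    and/or a steering control amount, as in the second variant above."""
    return DrivingControlInfo(
        speed_control=instr.speed_delta_kph,
        steering_control=instr.steering_deg,
    )
```

Either field may be absent, matching the claim's "speed control amount and/or steering control amount".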
The embodiment of the invention also discloses a vehicle control method, which comprises the following steps:
receiving environment sensing data sent by a vehicle; the environment sensing data are generated by the vehicle according to external environment data acquired in real time;
Feeding back the context awareness data to the driver's brain;
Determining a control instruction based on feedback information made by the driver's brain for the context awareness data;
Sending the control instruction to the vehicle; the vehicle is used for determining driving control information based on the received control instruction and controlling the vehicle according to the driving control information.
The embodiment of the invention also discloses a vehicle control device, which comprises:
the sensing module is used for generating environment sensing data based on the external environment data acquired in real time;
the first feedback module is used for sending the environment sensing data to a brain machine connected with the vehicle so as to feed back the environment sensing data to the brain of a driver;
The first receiving module is used for receiving a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the brain of the driver for the environment sensing data;
The driving control information determining module is used for determining driving control information based on the received control instruction;
And the control module is used for controlling the vehicle according to the driving control information.
The embodiment of the invention also discloses a vehicle control device, which comprises:
The second receiving module is used for receiving the environment sensing data sent by the vehicle; the environment sensing data are generated by the vehicle according to external environment data acquired in real time;
The second feedback module is used for feeding back the environment sensing data to the brain of the driver;
A control instruction determining module for determining a control instruction based on feedback information made by the driver's brain for the context awareness data;
The sending module is used for sending the control instruction to the vehicle; the vehicle is used for determining driving control information based on the received control instruction and controlling the vehicle according to the driving control information.
The embodiment of the invention also discloses a vehicle, which comprises:
One or more processors; and
One or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the vehicle to perform the method as described above.
Also disclosed are one or more machine-readable storage media having instructions stored thereon which, when executed by one or more processors, cause the processors to perform the method as described above.
The embodiment of the invention has the following advantages:
The embodiment of the invention generates environment perception data based on external environment data acquired in real time, portraying in real time the environment in which the vehicle currently travels; transmits the environment perception data to a brain-computer connected to the vehicle so as to feed it back to the driver's brain, so that the driver's brain can perceive the environment outside the vehicle and issue a control instruction for the current environment; receives the control instruction sent by the brain-computer, the instruction being determined according to feedback information produced by the driver's brain in response to the environment perception data; determines driving control information based on the received control instruction; and controls the vehicle according to the driving control information. Because the control instruction is fed back to the vehicle by the driver's brain, the driver can control the vehicle through conscious intent alone, without controlling it through limb movement, which improves the driver's experience.
Drawings
FIG. 1 is a flowchart of steps of a first embodiment of a vehicle control method of the present invention;
FIG. 2 is a flow chart of steps of a second embodiment of a vehicle control method of the present invention;
FIG. 3 is a flow chart of steps of a third embodiment of a vehicle control method of the present invention;
FIG. 4 is a hardware execution flow chart of an example of a vehicle control method of the present invention;
FIG. 5 is a flow chart of steps of a fourth embodiment of a vehicle control method of the present invention;
Fig. 6 is a block diagram of a vehicle control apparatus according to a first embodiment of the present invention;
Fig. 7 is a block diagram showing the structure of a second embodiment of a vehicle control apparatus according to the present invention.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
In the related art, driving control of a vehicle mainly takes two forms. The first is traditional driving, in which the driver controls the position and state of the vehicle through limb movement, via parameters such as the opening degree or position of a control element; for example, the driver modulates braking force with the right foot to control vehicle speed, or adjusts the position of the transmission gear lever to control the direction of travel. The second applies in specific environments: the vehicle's automatic driving module feeds specific data into a preset neural network model, which outputs a control result, and the vehicle is controlled based on that result. For example, in a parking scenario, after the sensors detect a parking space, the vehicle is controlled to park in it automatically; or, on an expressway with the automatic driving assistance function enabled, automatic lane changes are performed. Thanks to the development of sensor technology, more and more vehicles are configured with the second type of driving control. However, because the model's output depends on its training material, the output is uniform: in the same scenario, every vehicle is controlled based on the same output, yet drivers differ in how acceptable they find that result, and their individual preferences are not reflected. For example, during automatic parking the model tends to park in the center of the space, but a driver may prefer the car positioned to the right for convenient entry and exit. The control exerted by the autopilot thus sometimes fails to take the driver's preferences into account, reducing the experience.
Referring to fig. 1, a flowchart illustrating steps of a first embodiment of a vehicle control method according to the present invention may specifically include the following steps:
step 101, generating environment perception data based on external environment data acquired in real time;
The external environment refers to the environment outside the vehicle, including the surroundings in front of, behind, beside, and beneath the vehicle. Of course, when those skilled in the art require it, for example when vehicle control is performed only on the basis of the environment in front of and behind the vehicle, that narrower scope also belongs to the external environment described in the embodiments of the present invention. The data detected from the external environment is the external environment data.
Environmental perception data which can be perceived by the brain of a driver is generated based on external environmental data acquired in real time.
Step 102, sending the environment sensing data to a brain machine connected with the vehicle so as to feed back the environment sensing data to the brain of a driver;
It should be noted that, after the vehicle is powered on, it may connect to at least one brain-computer according to the driver's selection, and one of them is then chosen to receive the environment perception data; alternatively, the vehicle may connect to a default brain-computer and send the environment perception data to it directly once the data is generated. After the vehicle is connected to the brain-computer, the generated environment perception data is transmitted to the brain-computer in real time; the transmission methods include, but are not limited to, an internet communication protocol, a blockchain protocol, and a cloud data communication protocol.
After receiving the environment perception data, the brain-computer converts it into a form the driver's brain can perceive and feeds it back to the driver's brain. Once the driver's brain receives the environment perception data, it can judge the external environment to which the data corresponds and output physical parameters of brain activity, such as brain currents, corresponding to its conscious response. The brain-computer determines an appropriate control instruction based on these physical parameters of the driver's brain activity and sends the control instruction to the vehicle.
Step 103, receiving a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the brain of the driver for the environment sensing data;
The vehicle can likewise receive the control instruction sent by the brain-computer via an internet communication protocol, a blockchain protocol, a cloud data communication protocol, or the like, wherein the control instruction is determined by the brain-computer according to feedback information, such as brain currents, produced by the driver's brain in response to the environment perception data.
Step 104, determining driving control information based on the received control instruction;
After receiving the control instruction, the vehicle determines corresponding driving control information based on it. Two ways of determining the driving control information are included: one is based on a driving mode matched with the control instruction, with the vehicle controlled automatically according to that mode; the other is to determine specific control amounts of the vehicle from the control instruction and use them as the driving control information.
And 105, controlling the vehicle according to the driving control information.
After the driving control information is determined, the vehicle's corresponding control parameters, such as speed, gear, and direction, are adjusted based on the driving control information.
The embodiment of the invention generates environment perception data based on external environment data acquired in real time, portraying in real time the environment in which the vehicle currently travels; transmits the environment perception data to a brain-computer connected to the vehicle so as to feed it back to the driver's brain, so that the driver's brain can perceive the environment outside the vehicle and issue a control instruction for the current environment; receives the control instruction sent by the brain-computer, the instruction being determined according to feedback information produced by the driver's brain in response to the environment perception data; determines driving control information based on the received control instruction; and controls the vehicle according to the driving control information. Because the control instruction is fed back to the vehicle by the driver's brain, the driver can control the vehicle through conscious intent alone, without controlling it through limb movement, which improves the driver's experience.
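To make the five steps concrete, the following minimal Python sketch simulates one iteration of the control loop with stub components. All class names, fields, and the decoding rule in `StubBrainComputer.receive` are hypothetical assumptions, since the patent defines no concrete interfaces.

```python
class StubSensors:
    def read(self):
        # step 101: external environment data acquired in real time
        return {"obstacle_ahead": True}

class StubMap:
    def local_data(self):
        return {"speed_limit_kph": 60}

class StubBrainComputer:
    def __init__(self):
        self.last_perception = None

    def send(self, perception):
        # step 102: feed the perception data back to the driver's brain
        self.last_perception = perception

    def receive(self):
        # step 103: a hypothetical decoding rule standing in for the
        # driver's conscious feedback, namely slowing down near an obstacle
        if self.last_perception and self.last_perception.get("obstacle_ahead"):
            return {"speed_delta_kph": -10}
        return {"speed_delta_kph": 0}

class StubActuators:
    def __init__(self):
        self.applied = []

    def apply(self, info):
        # step 105: adjust the vehicle according to the driving control info
        self.applied.append(info)

def fuse(raw, map_data):
    # merge sensor readings with map data into environment perception data
    merged = dict(map_data)
    merged.update(raw)
    return merged

def control_step(sensors, hd_map, bci, actuators):
    raw = sensors.read()                          # step 101
    perception = fuse(raw, hd_map.local_data())
    bci.send(perception)                          # step 102
    instruction = bci.receive()                   # step 103
    control_info = {"target_speed_delta": instruction["speed_delta_kph"]}  # step 104
    actuators.apply(control_info)                 # step 105
    return control_info
```

In a real system `control_step` would run continuously; here a single iteration suffices to show the data flow from steps 101 through 105.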
Referring to fig. 2, a flowchart of the steps of a second embodiment of a vehicle control method of the present invention is shown; in this embodiment the vehicle is provided with a sensor.
It should be noted that the sensor may be placed at any position on the vehicle where it produces a detection signal for a corresponding target direction, such as the front of the vehicle, the bottom of a rearview mirror, or the roof, and the embodiment of the present invention is not limited in this respect. The sensor may be a vision sensor, a laser radar, a ranging sensor, or the like; those skilled in the art may select different sensors, and the embodiment of the present invention is not specifically limited in this respect.
The method specifically comprises the following steps:
step 201, acquiring a real-time detection signal of the sensor;
After the vehicle is powered on, acquisition of the sensor's real-time detection signals begins, for example the acquisition of detection signals of surrounding vehicles, pedestrians, and obstacles through the vision sensor.
Step 202, reading map data of a high-precision map, wherein the map data corresponds to the position of the vehicle in the high-precision map;
It should be noted that the absolute accuracy of a high-precision map can reach the sub-meter level, and even within 10 cm where particularly high precision is required; in addition, its lateral relative accuracy is higher than its absolute accuracy. A high-precision map contains not only high-precision coordinates but also accurate road geometry, including the gradient, curvature, heading, elevation, and banking of each lane. It further describes whether the lane line between adjacent lanes is dashed, solid, or a double yellow line, the color of that line, the road's medians, and even the positions of arrows and text painted on the road surface.
While the sensor's real-time detection signals are being acquired, the map data of the high-precision map corresponding to the vehicle's position in the map can be read based on that position. The parameters included in the map data may be selected by those skilled in the art according to actual needs, and the embodiment of the present invention is not specifically limited in this respect.
Step 203, determining the detection signal and the map data as external environment data;
After the detection signal is acquired and the map data is read, the detection signal and the map data are determined to be the external environment data. If either item of data is missing, the vehicle's self-diagnostic system can check whether the vehicle's network or the sensor is faulty; if a fault is found, a warning signal is issued or background maintenance personnel are notified.
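A minimal sketch of this self-check, assuming a simple convention in which a missing item maps to a fault message; the diagnostic interface itself is not specified in the patent.

```python
def check_external_data(detection_signal, map_data):
    """Determine the external environment data; if either part is missing,
    return a fault description instead (hypothetical diagnostic hook)."""
    if detection_signal is None:
        return None, "sensor fault: no detection signal"
    if map_data is None:
        return None, "network fault: map data unavailable"
    # both parts present: combine them as the external environment data
    return {"detection": detection_signal, "map": map_data}, None
```

The caller would raise a warning or notify maintenance personnel whenever the returned fault is not `None`.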
Step 204, generating environment sensing data based on the external environment data acquired in real time;
The real-time external environment data is converted to generate the environment perception data. Specifically, the timestamps of the external environment data can be used to convert it, in chronological order, into environment perception data that the brain can continuously recognize; in one example of the invention, the conversion may be performed as follows:
sub-step S2041, determining a timestamp of the detection signal;
Since the sensor is connected to the vehicle's electronic control unit, the signal it outputs can be given a time value of the detection moment, based on the electronic control unit's clock, as the timestamp of the detection signal. When the detection signal is acquired, the timestamp it carries is determined.
Sub-step S2042, determining a time stamp of the map data;
The map data may be read from a high-precision map stored in a storage medium built into the vehicle, in which case the time of the electronic control unit is used and the timestamp of the reading moment is determined as the timestamp of the map data. The map data may also be obtained by remotely reading the high-precision map from a third-party database or the like; in that case, the time difference between the third-party database and the vehicle's electronic control unit is determined, and the timestamp of the map data is obtained by combining this time difference with the timestamp of the remote reading moment.
Substep S2043 fuses the detection signal and map data of the same time stamp to generate context awareness data.
After the timestamps of the detection signal and the map data are obtained, the detection signal and the map data with the same timestamp are fused to generate the environment perception data. Of course, because of a certain detection error, the timestamps of the detection signal and the map data need not be identical: a detection signal and map data whose timestamps fall within an error range of each other may also be fused.
In the fusion process, the detection signals with the same timestamp can be mapped onto the map data to generate the environment perception data. For example, the sensor senses detection signals carrying information about surrounding objects, such as pedestrians and traffic cones; the sensed surrounding-object data and the localized position of the ego vehicle are then mapped to the correct positions in the map data of the high-precision map with the same timestamp, generating the environment perception data.
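The timestamp-matched fusion described above can be sketched as follows; the field names (`t`, `objects`, `data`) and the tolerance value are illustrative assumptions, not details from the patent.

```python
def fuse_by_timestamp(signals, map_frames, tolerance_s=0.05):
    """Pair each detection signal with the first map frame whose timestamp
    lies within `tolerance_s` (the error range), then attach the detected
    objects to that frame to form environment perception data."""
    fused = []
    for sig in signals:
        for frame in map_frames:
            if abs(sig["t"] - frame["t"]) <= tolerance_s:
                fused.append({
                    "t": sig["t"],
                    "map": frame["data"],
                    "objects": sig["objects"],  # e.g. pedestrians, cones
                })
                break  # one map frame per detection signal
    return fused
```

Signals with no map frame inside the tolerance window are simply dropped here; a production system might instead interpolate between map frames.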
In addition, because the data volume is large and data security must be ensured, the data can be compressed and encrypted during the fusion process, reducing the amount of data transmitted and enhancing its security.
Step 205, transmitting the environment sensing data to a brain-computer connected with the vehicle to feed back the environment sensing data to the brain of a driver;
The vehicle sends the environment perception data to the brain-computer through the communication channel between them, and the brain-computer feeds the environment perception data back to the driver's brain. The driver may be seated in the cockpit of the vehicle, or may control the vehicle remotely from outside it.
In a preferred embodiment of the present invention, a brain-computer interface is provided in the brain-computer; the step of transmitting the context awareness data to a brain-computer connected to the vehicle to feed back the context awareness data to the brain of the driver comprises:
and step S2051, transmitting the environment sensing data to the brain computer, wherein the brain computer is used for carrying out noise reduction processing on the environment sensing data, converting the noise-reduced environment sensing data into an electroencephalogram signal, and feeding back the electroencephalogram signal to the brain of a driver by adopting the brain computer interface.
A brain-computer interface (Brain Computer Interface, BCI) is provided in the brain-computer; a BCI is an interface device that realizes information exchange between the brain and external equipment by creating a direct connection between them. The environment perception data is sent to the brain-computer, which first performs noise-reduction processing on the received data to filter out noise. The noise reduction may apply a preset filtering algorithm to the environment perception data, such as a Kalman filter, a median filter, or another denoising algorithm; those skilled in the art may choose the noise-reduction method as needed, and the embodiment of the invention is not limited in this respect. The brain-computer then converts the noise-reduced environment perception data into a brain signal, that is, a signal the brain can perceive, which may be an electroencephalogram signal or a brain neuron signal; the embodiment of the invention is not limited in this respect either. In one example of the present invention, the environment perception data is an analog signal, and the brain-computer converts the noise-reduced environment perception data into a brain signal as follows:
Converting the environment sensing data after the noise reduction treatment into a digital signal;
and converting the digital signal into an electroencephalogram signal.
Because the data types of the sensor and the high-precision map are analog information, the fused environment perception data is an analog signal, but the brain cannot directly recognize an analog signal. The noise-reduced environment perception data must therefore first be converted into a digital signal, and the converted digital signal then converted into a brain signal so that the brain can recognize and perceive it. In addition, if the data was compressed and encrypted during the fusion process, the noise-reduced environment perception data is decompressed and decrypted before being converted into a digital signal.
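A minimal sketch of the two conversion steps, using a uniform quantizer as a stand-in for the analog-to-digital conversion; the final "brain signal" encoding is a pure placeholder, since the actual electroencephalogram or neuron-signal encoding is not described in the patent.

```python
def analog_to_digital(samples, levels=256, vmin=-1.0, vmax=1.0):
    """Quantize analog samples into integer codes (simple uniform ADC model).
    Samples outside [vmin, vmax] are clamped before quantization."""
    step = (vmax - vmin) / (levels - 1)
    return [round((min(max(s, vmin), vmax) - vmin) / step) for s in samples]

def digital_to_brain_signal(codes):
    """Placeholder encoding of the digital codes as a 'brain signal' frame;
    the real EEG/neuron encoding is not disclosed, so bytes stand in."""
    return bytes(codes)
```

With 256 levels the codes fit in one byte each, which is why the placeholder frame can be a plain byte string.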
Step 206, receiving a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the brain of the driver for the environment sensing data;
The vehicle receives, via the Internet, a control instruction determined according to feedback information made by the driver's brain with respect to the environment sensing data.
Step 207, determining driving control information based on the received control instruction;
the vehicle may determine current driving control information based on the received control instruction.
In a preferred embodiment of the present invention, the step of determining driving control information based on the received control instruction includes:
sub-step S2071, determining a target driving mode matched with the received control instruction from a plurality of preset driving modes;
The vehicle may be provided with a plurality of driving modes, such as a standard mode, an economy mode, a sport mode or a strong-recovery mode, and a person skilled in the art may also set different driving modes according to requirements. Different driving modes correspond to different vehicle control strategies: in the sport mode, for example, the control strategy may be more aggressive, while the strong-recovery mode controls the vehicle on the basis of preserving cruising range. After receiving a control instruction sent by the brain-computer, a target driving mode matched with the control instruction is determined from the plurality of preset driving modes.
Sub-step S2072, determining a vehicle target control parameter corresponding to the target driving mode;
Various control parameters of the vehicle are preset in the target driving mode, and when the target driving mode is determined, the corresponding control parameters are adopted as the target control parameters of the vehicle.
Substep S2073 generates driving control information based on the vehicle target control parameters.
When the vehicle target control parameter is determined, the target control parameter may be set as a control parameter of each control component in the driving control information.
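Sub-steps S2071 to S2073 can be sketched as a simple lookup from instruction to mode to parameters. This is a hedged illustration only: the mode names come from the text above, but the parameter names (`max_accel`, `regen_level`) and their values are invented for the example:

```python
# Preset driving modes and their vehicle control parameters.
# The parameter names and values are illustrative assumptions, not from the patent.
DRIVING_MODES = {
    "standard":        {"max_accel": 2.0, "regen_level": 1},
    "economy":         {"max_accel": 1.2, "regen_level": 2},
    "sport":           {"max_accel": 3.5, "regen_level": 0},
    "strong_recovery": {"max_accel": 1.5, "regen_level": 3},
}

def determine_driving_control_info(control_instruction):
    """S2071: match the instruction to a target mode; S2072: look up its
    target control parameters; S2073: generate the driving control information."""
    mode = control_instruction.get("mode")
    if mode not in DRIVING_MODES:
        raise ValueError(f"no preset driving mode matches {mode!r}")
    params = DRIVING_MODES[mode]
    return {"mode": mode, **params}

info = determine_driving_control_info({"mode": "sport"})
```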
And step 208, performing vehicle control according to the driving control information.
The vehicle is controlled to realize automatic driving according to the determined driving control information, such as the speed, traveling direction and steering angle of the vehicle.
In the embodiment of the invention, environment sensing data is generated from the external environment data sensed by the vehicle and sent to the driver's brain through the brain-computer system; the brain-computer system determines a control instruction from the feedback information of the driver's brain, and the vehicle determines driving control information based on that instruction to control the vehicle. The vehicle can thus be controlled according to the driver's consciousness rather than purely mechanically, so that the automatic driving of the vehicle better matches the driver's intention and the driving experience is improved.
Referring to fig. 3, a flowchart illustrating steps of a third embodiment of a vehicle control method according to the present invention may specifically include the following steps:
Step 301, generating environment sensing data based on external environment data acquired in real time;
The environment sensing data is generated by acquiring external environment data, such as the geography of the vehicle's current position and road markings, through a high-precision map.
Step 302, transmitting the environment sensing data to a brain machine connected with the vehicle so as to feed back the environment sensing data to the brain of a driver;
The vehicle sends the environment sensing data via the Internet to the brain-computer connected with the vehicle, and the environment sensing data is fed back to the driver's brain through the brain-computer. The brain-computer then determines a control instruction according to the information fed back by the driver's brain.
Step 303, receiving a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the brain of the driver for the environment sensing data;
The vehicle receives, via the Internet, a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the driver's brain with respect to the environment sensing data.
Step 304, determining control amount information based on the received control instruction, wherein the control amount information comprises a speed control amount and/or a steering control amount;
The control instruction may include specific control parameters, and the control quantity information is determined based on these parameters. The control quantity information includes a speed control quantity and/or a steering control quantity, that is, steering-wheel-angle information and/or accelerator-depth information.
A step 305 of generating driving control information based on the control amount information;
When the control quantity information is acquired, the parameter value that actually needs to be applied is determined from the target steering-wheel angle and/or accelerator depth in the control quantity information together with the vehicle's current steering-wheel angle and/or accelerator depth, and the driving control information is generated accordingly. For example, if the current steering-wheel angle is 10 degrees to the left and the control quantity information specifies an angle of 90 degrees to the left, the driving control information is determined as a further steering turn of 80 degrees to the left.
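The worked example above reduces to a subtraction between the commanded and the current steering angle. A minimal sketch, assuming a signed-angle convention where positive means left (the convention is not stated in the patent):

```python
def steering_command(current_angle_deg, target_angle_deg):
    """Return the additional turn needed to reach the commanded steering angle.

    Positive angles mean "to the left" (an assumption for illustration).
    E.g. current 10 deg left, target 90 deg left -> turn a further 80 deg left.
    """
    return target_angle_deg - current_angle_deg

delta = steering_command(10.0, 90.0)   # further 80 degrees to the left
```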
And 306, controlling the vehicle according to the driving control information.
Specific components of the vehicle are controlled according to the driving control information; for example, if the driving control information specifies a further steering turn of 80 degrees to the left, the steering wheel is controlled to turn 80 degrees to the left.
The invention will be further described by way of example, in order to facilitate a further understanding of the skilled person:
Referring to fig. 4, a hardware execution flow chart of an example of a vehicle control method of the invention is shown;
In this example, a highway driving system based on visual brain-computer interaction realizes control of the vehicle and mainly comprises four parts. The external device 401 collects vehicle external environment data, for example via visual sensors. The in-vehicle terminal system 402 is in-vehicle software for processing the data of the sensors and the high-precision map. The brain-computer system 404 is an operation platform with a brain-computer interface and two channels, having super-compression and convergence-key functions; specifically, it includes, for example, an EEG decoding module 405, an EEG signal extraction device 407, an amplification device 408 and a denoising device 409. The operation platform is provided with an operation interface and, through software installed in it, can send digital signals to the network connection module as control instructions and can also receive digital signals fed back by the external device via the network connection module. The network connection module 403 is configured to receive a digital signal sent by the brain-computer system and operate the connected external device with the digital signal as an instruction, using an Internet communication protocol, a blockchain protocol or a cloud-data communication protocol; it is further configured to receive, by the same protocols, a digital signal fed back by the external device 401 after operation.
First, surrounding external environment data is extracted by the sensors and the high-precision map included in the external device 401 of the vehicle and transmitted to the in-vehicle terminal system 402. The in-vehicle terminal system 402 fuses the external environment data into environment sensing data, which is transmitted through the network connection module 403 to the brain-computer system 404 of the server. The EEG decoding module 405 in the brain-computer system 404 decodes the environment sensing data, based on the super-compression and convergence-key functions, into brain wave signals or brain neuron signals that the user's brain can recognize. After the brain responds, the human brain instruction 406 is extracted by the EEG signal extraction device 407. The physical quantity of the human brain instruction 406 is then amplified by the amplification device 408 so that the brain-computer system can recognize and process it. The amplified human brain instruction is denoised by the denoising device 409 to reduce interference, and the denoised instruction is fed back to the EEG decoding module 405. Finally, the EEG decoding module 405 decodes the denoised and amplified human brain instruction into a control instruction, which is transmitted to the vehicle through the network connection module 403, and the vehicle determines driving control information according to the control instruction to perform vehicle control.
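The signal path of Fig. 4 can be summarised as a chain of stages. The sketch below is purely illustrative: each placeholder function stands in for one device in the figure (its reference numeral is noted in a comment), and none of the transforms are specified by the patent:

```python
def encode_for_brain(env_data):
    """EEG decoding module (405): encode data into a brain-recognisable stimulus.
    Placeholder transform; the real encoding is unspecified."""
    return [float(x) for x in env_data]

def extract_brain_response(stimulus):
    """EEG signal extraction device (407): pick up the brain's response.
    Modelled here as a pass-through."""
    return stimulus

def amplify(signal, gain=1000.0):
    """Amplification device (408): amplify the weak brain signal."""
    return [x * gain for x in signal]

def denoise(signal):
    """Denoising device (409): remove interference (here, the DC offset)."""
    mean = sum(signal) / len(signal)
    return [x - mean for x in signal]

def decode_to_instruction(signal):
    """EEG decoding module (405) again: decode into a control instruction.
    Placeholder decoding rule."""
    return {"steer": max(signal)}

def brain_computer_pipeline(env_data):
    stimulus = encode_for_brain(env_data)
    response = extract_brain_response(stimulus)
    amplified = amplify(response)
    clean = denoise(amplified)
    return decode_to_instruction(clean)

instruction = brain_computer_pipeline([1.0, 2.0, 3.0])
```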
Referring to fig. 5, a flowchart illustrating steps of a fourth embodiment of a vehicle control method according to the present invention may specifically include the following steps:
Step 501, receiving environment sensing data sent by a vehicle; the environment sensing data are generated by the vehicle according to external environment data acquired in real time;
The method of the embodiment of the invention can be applied to a brain-computer system, which may be a visual brain-computer system. After connecting with the vehicle, the brain-computer system receives the environment sensing data sent by the vehicle; for the connection mode, reference may be made to the preceding embodiments. The environment sensing data is generated by the vehicle from the external environment data acquired in real time, so as to reproduce the actual environment outside the vehicle.
Step 502, feeding back the environment sensing data to the brain of the driver;
After receiving the environment sensing data, the environment sensing data can be processed, converted into environment sensing data which can be identified by the brain of the driver and fed back to the brain of the driver.
Step 503, determining a control instruction based on feedback information made by the brain of the driver for the environmental awareness data;
After receiving the environmental awareness data, the driver's brain will make feedback based on the environmental awareness data, and the brain-computer receives the feedback information and determines the control command based on the feedback information.
Step 504, sending the control instruction to the vehicle; the vehicle is used for determining driving control information based on the received control instruction and controlling the vehicle according to the driving control information.
And sending the control instruction to the vehicle, determining driving control information based on the received control instruction by the vehicle, and controlling the vehicle according to the driving control information so as to realize automatic driving control through a brain-computer.
In the embodiment of the invention, the environment sensing data sent by the vehicle is received and fed back to the driver's brain. Because the actual environment outside the vehicle is fed back to the driver's brain, the driver can directly perceive the external environment with the assistance of the brain-computer and generate feedback information to it; the brain-computer determines a control instruction based on the feedback information and sends it to the vehicle, and the vehicle determines driving control information based on the control instruction to control the vehicle. The brain-computer system can thus control the vehicle without the driver touching it, realizing automatic driving control of the vehicle.
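Steps 501 to 504 on the brain-computer side can be sketched as one service cycle. The interface functions (`receive`, `present`, `read_feedback`, `send`) are hypothetical stand-ins for the vehicle link and the driver's brain, injected as callables so the sketch stays self-contained:

```python
def brain_computer_service(receive, present, read_feedback, send):
    """One cycle of steps 501-504, with assumed interface callables."""
    env_data = receive()                 # 501: receive environment sensing data
    present(env_data)                    # 502: feed it back to the driver's brain
    feedback = read_feedback()           # 503: obtain the brain's feedback...
    instruction = {"cmd": feedback}      # ...and determine a control instruction
    send(instruction)                    # 504: send the instruction to the vehicle
    return instruction

sent = []
instruction = brain_computer_service(
    receive=lambda: {"speed": 60},       # stub vehicle link
    present=lambda data: None,           # stub brain stimulus
    read_feedback=lambda: "accelerate",  # stub brain response
    send=sent.append,
)
```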
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the invention.
Referring to fig. 6, a block diagram of a first embodiment of a vehicle control apparatus according to the present invention is shown, and may specifically include the following modules:
A sensing module 601, configured to generate environmental sensing data based on external environmental data acquired in real time;
a first feedback module 602, configured to send the context awareness data to a brain-computer connected to the vehicle, so as to feed back the context awareness data to the brain of the driver;
A first receiving module 603, configured to receive a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the brain of the driver for the environment sensing data;
a driving control information determination module 604 for determining driving control information based on the received control instruction;
And the control module 605 is used for controlling the vehicle according to the driving control information.
Optionally, the vehicle is provided with a sensor, and the device further comprises:
The acquisition module is used for acquiring real-time detection signals of the sensor;
the reading module is used for reading map data of a high-precision map, and the map data corresponds to the position of the vehicle in the high-precision map;
and the determining module is used for determining the detection signal and the map data as external environment data.
Optionally, the sensing module 601 includes:
a first timestamp determination submodule for determining a timestamp of the detection signal;
a second timestamp determination sub-module for determining a timestamp of the map data;
And the fusion sub-module is used for fusing the detection signals with the same time stamp and the map data to generate environment perception data.
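The fusion sub-module pairs detection signals and map data that carry the same timestamp. A minimal sketch, assuming both sources are keyed by timestamp (the data layout is an assumption; the patent does not specify one):

```python
def fuse_by_timestamp(detections, map_frames):
    """Fuse each sensor detection with the map data sharing its timestamp.
    Timestamps present in only one source are dropped (an assumed policy)."""
    return {
        ts: {"detection": det, "map": map_frames[ts]}
        for ts, det in detections.items()
        if ts in map_frames
    }

fused = fuse_by_timestamp(
    {100: "lidar_frame_a", 101: "lidar_frame_b"},
    {100: "map_tile_x", 102: "map_tile_y"},
)
# only timestamp 100 appears in both sources
```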
Optionally, a brain-computer interface is arranged in the brain-computer; the first feedback module 602 includes:
the feedback processing sub-module is used for sending the environment sensing data to the brain computer, the brain computer is used for carrying out noise reduction processing on the environment sensing data, converting the noise reduction processed environment sensing data into brain electrical signals, and feeding back the brain electrical signals to the brain of a driver by adopting the brain computer interface.
Optionally, the environment sensing data is an analog signal, and the brain-computer converts the environment sensing data after the noise reduction processing into an electroencephalogram signal by the following manner:
Converting the environment sensing data after the noise reduction treatment into a digital signal;
and converting the digital signal into an electroencephalogram signal.
Optionally, the driving control information determining module 604 includes:
The matching sub-module is used for determining a target driving mode matched with the received control instruction from a plurality of preset driving modes;
The vehicle target control parameter determining submodule is used for determining vehicle target control parameters corresponding to the target driving mode;
and the first generation sub-module is used for generating driving control information based on the vehicle target control parameters.
Optionally, the driving control information determining module 604 includes:
a control amount information determination sub-module for determining control amount information based on the received control instruction, the control amount information including a speed control amount and/or a steering control amount;
and the second generation sub-module is used for generating driving control information based on the control quantity information.
Referring to fig. 7, a block diagram of a second embodiment of a vehicle control apparatus according to the present invention is shown, and may specifically include the following modules:
A second receiving module 701, configured to receive environmental awareness data sent by a vehicle; the environment sensing data are generated by the vehicle according to external environment data acquired in real time;
A second feedback module 702 for feeding back the context awareness data to the brain of the driver;
a control instruction determination module 703 for determining a control instruction based on feedback information made by the driver's brain for the context awareness data;
A transmitting module 704, configured to transmit the control instruction to the vehicle; the vehicle is used for determining driving control information based on the received control instruction and controlling the vehicle according to the driving control information.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
The embodiment of the application also provides a vehicle, which comprises: one or more processors; and one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the vehicle to perform the vehicle control method as described above. The processes of the embodiments can achieve the same technical effects, and in order to avoid repetition, the description is omitted here.
The embodiment of the application also provides one or more machine-readable storage media, on which instructions are stored, which when executed by one or more processors, cause the processors to perform the processes of the vehicle control method embodiment described above, and achieve the same technical effects, and in order to avoid repetition, a description is omitted herein.
In this specification, each embodiment is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may be referred to one another.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the invention may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or terminal device that comprises the element.
The foregoing has described in detail a vehicle control method and device, a vehicle and a storage medium provided by the present invention. Specific examples are used herein to illustrate the principles and embodiments of the invention, and the above examples are provided only to assist in understanding the method and its core concepts. Meanwhile, those skilled in the art will, in accordance with the ideas of the present invention, make variations in the specific embodiments and application scope; in view of the above, the content of this description should not be construed as limiting the present invention.

Claims (10)

1. A vehicle control method characterized by comprising:
Generating environment perception data based on external environment data acquired in real time;
Transmitting the context awareness data to a brain-computer connected to the vehicle to feed back the context awareness data to the driver's brain;
Receiving a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the brain of the driver for the environment sensing data; the brain of the driver perceives the environment perception data through brain electrical signals; the control command comprises a speed control amount and/or a steering control amount;
determining driving control information based on the received control instruction;
And controlling the vehicle according to the driving control information.
2. The method of claim 1, wherein the vehicle is provided with a sensor, the method further comprising:
Acquiring a real-time detection signal of the sensor;
Reading map data of a high-precision map, wherein the map data corresponds to the position of the vehicle in the high-precision map;
and determining the detection signal and the map data as external environment data.
3. The method according to claim 1 or 2, wherein a brain-computer interface is provided in the brain-computer; the step of transmitting the context awareness data to a brain-computer connected to the vehicle to feed back the context awareness data to the brain of the driver comprises:
and sending the environment sensing data to the brain-computer, wherein the brain-computer is used for carrying out noise reduction processing on the environment sensing data, converting the noise-reduced environment sensing data into brain electrical signals, and feeding back the brain electrical signals to the brain of a driver by adopting the brain-computer interface.
4. The method of claim 1, wherein the step of determining driving control information based on the received control instruction comprises:
Determining a target driving mode matched with the received control instruction from a plurality of preset driving modes;
Determining a vehicle target control parameter corresponding to the target driving mode;
driving control information is generated based on the vehicle target control parameter.
5. The method of claim 1, wherein the step of determining driving control information based on the received control instruction comprises:
determining control quantity information based on the received control instruction;
Driving control information is generated based on the control amount information.
6. A vehicle control method characterized by comprising:
receiving environment sensing data sent by a vehicle; the environment sensing data are generated by the vehicle according to external environment data acquired in real time;
Feeding back the context awareness data to the driver's brain; the brain of the driver perceives the environment perception data through brain electrical signals;
Determining a control instruction based on feedback information made by the driver's brain for the context awareness data; the control command comprises a speed control amount and/or a steering control amount;
Sending the control instruction to the vehicle; the vehicle is used for determining driving control information based on the received control instruction and controlling the vehicle according to the driving control information.
7. A vehicle control apparatus characterized by comprising:
the sensing module is used for generating environment sensing data based on the external environment data acquired in real time;
The first feedback module is used for sending the environment sensing data to a brain machine connected with the vehicle so as to feed back the environment sensing data to the brain of a driver; the brain of the driver perceives the environment perception data through brain electrical signals; the control command comprises a speed control amount and/or a steering control amount;
The first receiving module is used for receiving a control instruction sent by the brain-computer; the control instruction is determined according to feedback information made by the brain of the driver for the environment sensing data;
The driving control information determining module is used for determining driving control information based on the received control instruction;
And the control module is used for controlling the vehicle according to the driving control information.
8. A vehicle control apparatus characterized by comprising:
The second receiving module is used for receiving the environment sensing data sent by the vehicle; the environment sensing data are generated by the vehicle according to external environment data acquired in real time;
the second feedback module is used for feeding back the environment sensing data to the brain of the driver; the brain of the driver perceives the environment perception data through brain electrical signals;
A control instruction determining module for determining a control instruction based on feedback information made by the driver's brain for the context awareness data; the control command comprises a speed control amount and/or a steering control amount;
The sending module is used for sending the control instruction to the vehicle; the vehicle is used for determining driving control information based on the received control instruction and controlling the vehicle according to the driving control information.
9. A vehicle, characterized by comprising:
One or more processors; and
One or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the vehicle to perform the method of any of claims 1-5 or claim 6.
10. One or more machine-readable storage media having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the method of any of claims 1-5 or claim 6.
CN202111232970.0A 2021-10-22 2021-10-22 Vehicle control method and device, vehicle and storage medium Active CN113867363B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111232970.0A CN113867363B (en) 2021-10-22 2021-10-22 Vehicle control method and device, vehicle and storage medium


Publications (2)

Publication Number Publication Date
CN113867363A CN113867363A (en) 2021-12-31
CN113867363B (en) 2024-06-07

Family

ID=78997172



Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107015632A (en) * 2016-01-28 2017-08-04 南开大学 Control method for vehicle, system based on brain electricity driving
CN107548312A (en) * 2015-03-10 2018-01-05 赫尔实验室有限公司 System and method for training and assessing
CN108446021A (en) * 2018-02-28 2018-08-24 天津大学 Application process of the P300 brain-computer interfaces in smart home based on compressed sensing
CN109782917A (en) * 2019-01-21 2019-05-21 中国联合网络通信集团有限公司 A kind of consciousness industrial control system and method based on brain-computer interface
CN112109718A (en) * 2020-06-17 2020-12-22 上汽通用五菱汽车股份有限公司 Vehicle control method, device and computer readable storage medium
CN112114670A (en) * 2020-09-10 2020-12-22 季华实验室 Man-machine co-driving system based on hybrid brain-computer interface and control method thereof
CN112356841A (en) * 2020-11-26 2021-02-12 中国人民解放军国防科技大学 Vehicle control method and device based on brain-computer interaction

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3087786A1 (en) * 2018-01-09 2019-07-18 Holland Bloorview Kids Rehabilitation Hospital In-ear eeg device and brain-computer interfaces


Also Published As

Publication number Publication date
CN113867363A (en) 2021-12-31

Similar Documents

Publication Publication Date Title
US10189482B2 (en) Apparatus, system and method for personalized settings for driver assistance systems
US10183679B2 (en) Apparatus, system and method for personalized settings for driver assistance systems
CN109421738B (en) Method and apparatus for monitoring autonomous vehicles
US9428188B2 (en) Lane assist functions for vehicles with a trailer
US11150652B2 (en) Method for operating a driver assistance device of a motor vehicle
CN110214107B (en) Autonomous vehicle providing driver education
JP4694440B2 (en) Driver support system
CN111016924B (en) Remote driving control method and device for automatic driving vehicle and remote driving system
US20170277182A1 (en) Control system for selective autonomous vehicle control
US8918227B2 (en) Device and method for determining a vigilance state
US9643493B2 (en) Display control apparatus
CN110178141A (en) Method for manipulating autonomous motor vehicles
DE102017202984A1 (en) DEVICE FOR AUTONOMOUS DRIVING
US20210090439A1 (en) Method for warning a vulnerable road user
US20190291639A1 (en) Support for hearing-impaired drivers
US11403494B2 (en) Obstacle recognition assistance device, obstacle recognition assistance method, and storage medium
US20200047751A1 (en) Cooperative vehicle safety system and method
CN115551757A (en) Passenger screening
CN113867363B (en) Vehicle control method and device, vehicle and storage medium
DE102021116309A1 (en) ASSISTANCE FOR DISABLED DRIVERS
JP2022139524A (en) Drive support device and recording medium
Mrazovac et al. Human-centric role in self-driving vehicles: Can human driving perception change the flavor of safety features?
JP2008105511A (en) Driving support apparatus
CN115720555A (en) Method and system for improving user alertness in an autonomous vehicle
DE102017223570B3 (en) A method for enriching a cloud with acoustic vehicle environment data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant