CN111559382B - Vehicle running control method and device - Google Patents


Info

Publication number
CN111559382B
CN111559382B (granted from application CN202010385575.5A)
Authority
CN
China
Prior art keywords
eyeball
vehicle
detection system
tracking detection
driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010385575.5A
Other languages
Chinese (zh)
Other versions
CN111559382A (en)
Inventor
王大宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010385575.5A priority Critical patent/CN111559382B/en
Publication of CN111559382A publication Critical patent/CN111559382A/en
Application granted granted Critical
Publication of CN111559382B publication Critical patent/CN111559382B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models, related to drivers or passengers
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00: Conjoint control of vehicle sub-units of different type or different function
    • B60W10/18: Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10: Path keeping
    • B60W30/12: Lane keeping
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/19: Sensors therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a vehicle driving control method and device, applied to a first vehicle that comprises an eyeball tracking detection system and a driving assistance system. The method comprises the following steps: the eyeball tracking detection system determines a first scene type corresponding to vehicle driving parameters of the first vehicle, the vehicle driving parameters being sent to the eyeball tracking detection system by the driving assistance system; the eyeball tracking detection system determines the eyeball movement trajectory of the driver of the first vehicle through an eyeball tracking technology; the eyeball tracking detection system determines the driving state of the driver according to the first scene type and the eyeball movement trajectory; and the eyeball tracking detection system determines a control strategy according to the driving state and the vehicle driving parameters and sends the control strategy to the driving assistance system, the control strategy being used by the driving assistance system to actively control the vehicle. By adopting the method and the device, the accuracy and real-time performance of vehicle driving control can be improved.

Description

Vehicle running control method and device
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a method and an apparatus for controlling vehicle driving.
Background
With the development of science and technology, intelligent automobiles mostly employ driving assistance systems, which mainly include lane keeping assistance, automatic parking assistance, brake assistance, reversing assistance, driving assistance, and the like.
At present, driving assistance systems improve assistance from the perspective of the automobile itself, making vehicles increasingly intelligent. However, as in-vehicle driving systems grow more complex and in-vehicle screens grow larger, drivers are also more easily distracted.
Disclosure of Invention
The embodiments of the present application provide a vehicle driving control method and device, with the aim of improving the accuracy and real-time performance of vehicle driving control.
In a first aspect, an embodiment of the present application provides a vehicle driving control method, which is applied to a first vehicle including an eyeball tracking detection system and a driving assistance system, and includes:
the eyeball tracking detection system determines a first scene type corresponding to vehicle driving parameters of the first vehicle, wherein the vehicle driving parameters are sent to the eyeball tracking detection system by the driving assistance system;
the eyeball tracking detection system determines the eyeball movement track of the driver of the first vehicle through an eyeball tracking technology;
the eyeball tracking detection system determines the driving state of the driver according to the first scene type and the eyeball moving track;
the eyeball tracking detection system determines a control strategy according to the driving state and the vehicle driving parameters, and sends the control strategy to the driving assistance system, wherein the control strategy is used for the driving assistance system to actively control the vehicle.
In a second aspect, an embodiment of the present application provides a vehicle driving control device applied to a first vehicle including an eyeball tracking detection system and a driving assistance system, the vehicle driving control device including a determining unit and a sending unit, wherein:
the determining unit is configured to determine, through the eyeball tracking detection system, a first scene type corresponding to vehicle driving parameters of the first vehicle, the vehicle driving parameters being sent to the eyeball tracking detection system by the driving assistance system through the sending unit; to determine, through the eyeball tracking detection system, the eyeball movement trajectory of the driver of the first vehicle by means of an eyeball tracking technology; to determine, through the eyeball tracking detection system, the driving state of the driver according to the first scene type and the eyeball movement trajectory; and to determine, through the eyeball tracking detection system, a control strategy according to the driving state and the vehicle driving parameters;
the sending unit is configured to send the control strategy to the driving assistance system through the eyeball tracking detection system, the control strategy being used by the driving assistance system to actively control the vehicle.
In a third aspect, embodiments of the present application provide a first vehicle, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing steps of any of the methods of the first aspect of the embodiments of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiments of the present application, a first vehicle uses the eyeball tracking detection system to determine a first scene type corresponding to its vehicle driving parameters, to determine the eyeball movement trajectory of the driver through an eyeball tracking technology, to determine the driving state of the driver according to the first scene type and the eyeball movement trajectory, and then to determine a control strategy according to the driving state and the vehicle driving parameters, which it sends to the driving assistance system so that the driving assistance system can actively control the vehicle. In this way, the first vehicle determines the driving state from the driver's eyeball movement trajectory under different scene types, rather than from the eyeball movement trajectory alone, which improves the accuracy of vehicle driving control. Moreover, the control strategy is determined from both the driver's driving state and the current vehicle driving parameters, which further improves that accuracy while also improving the real-time performance of vehicle control.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a first vehicle provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an eyeball tracking detection system provided by an embodiment of the application;
FIG. 3 is a schematic flow chart of a vehicle driving control method according to an embodiment of the present application;
FIG. 4 is a schematic flow chart diagram illustrating another method for controlling vehicle operation according to an embodiment of the present disclosure;
FIG. 5 is a schematic flow chart diagram illustrating a further method for controlling vehicle operation according to an embodiment of the present application;
fig. 6 is a block diagram of distributed functional units of a vehicle travel control device according to an embodiment of the present application;
fig. 7 is a block diagram of an integrated functional unit of a vehicle running control device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
In order to better understand the scheme of the embodiments of the present application, the following first introduces the related terms and concepts that may be involved in the embodiments of the present application.
1) Eyeball tracking, also known as eye tracking, human eye tracking, or gaze point tracking, refers to a technique for determining a user's gaze direction and gaze point based on image acquisition and gaze estimation.
2) The gaze point refers to the point where the line of sight of the human eye falls on the plane in which the screen lies.
For example, fig. 1 shows a schematic structural diagram of a first vehicle 100. The first vehicle 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, an eye tracking detection system 140, a driving assistance system 150, a camera 160, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the first vehicle 100. In other embodiments of the present application, the first vehicle 100 may include more or fewer components than illustrated, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein the different processing units may be separate components or may be integrated in one or more processors. In some embodiments, the first vehicle 100 may also include one or more processors 110. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to complete the control of instruction fetching and instruction execution. In other embodiments, a memory may also be provided in processor 110 for storing instructions and data. Illustratively, the memory in the processor 110 may be a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. This avoids repeated accesses, reduces the latency of the processor 110, and thus improves the efficiency of the first vehicle 100 in processing data or executing instructions.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, a USB interface, and/or the like. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the first vehicle 100, and may also be used to transmit data between the first vehicle 100 and peripheral devices. The USB interface 130 may also be used to connect to a headset to play audio through the headset.
It should be understood that the interfacing relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation to the structure of the first vehicle 100. In other embodiments of the present application, the first vehicle 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the first vehicle 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function.
Internal memory 121 may be used to store one or more computer programs, including instructions. By executing the instructions stored in the internal memory 121, the processor 110 may cause the first vehicle 100 to perform the vehicle running control method provided in some embodiments of the present application, as well as various applications, data processing, and the like. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, and may also store one or more applications (e.g., gallery, contacts, etc.). The data storage area may store data (e.g., photos, contacts, etc.) created during use of the first vehicle 100. Further, the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage components, flash memory components, or Universal Flash Storage (UFS). In some embodiments, the processor 110 may cause the first vehicle 100 to perform the vehicle running control method provided in the embodiments of the present application, as well as other applications and data processing, by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.
As shown in fig. 2, the eyeball tracking detection system 140 includes a microprocessor 210, an eyeball detection module 220, and a data processing module 230. The eyeball detection module 220 implements the eyeball tracking technology through the camera 160 to obtain the driver's eyeball movement trajectory; the data processing module 230 contains an algorithm model and processes the data obtained by the microprocessor 210 or the eyeball detection module 220; and the microprocessor 210 can determine a control strategy and the like from the processing results of the data processing module 230.
The driving assistance system 150 includes a sensor module, which may include a speed sensor, an acceleration sensor, a distance sensor, a proximity light sensor, and the like. Through the camera 160 and the sensor module, the driving assistance system 150 can acquire, during driving, parameters of the first vehicle 100 such as its acceleration, its speed, the distances between the first vehicle 100 and the vehicles ahead and behind, and whether the vehicle deviates from the lane line.
Specifically, the eyeball tracking detection system 140 and the driving assistance system 150 may communicate through a serial bus in the first vehicle 100: the eyeball tracking detection system 140 may receive driving data sent by the driving assistance system 150, and may also send a control strategy to the driving assistance system 150.
The following describes embodiments of the present application in detail.
Referring to fig. 3, fig. 3 is a flowchart illustrating a vehicle driving control method according to an embodiment of the present application, applied to a first vehicle including an eye tracking detection system and a driving assistance system, where as shown in the figure, the vehicle driving control method includes the following operations.
S301, the first vehicle determines, through the eyeball tracking detection system, a first scene type corresponding to vehicle driving parameters of the first vehicle, wherein the vehicle driving parameters are sent to the eyeball tracking detection system by the driving assistance system;
The first scene type may take many forms; for example, it may be an acceleration driving scene, a deceleration driving scene, a constant-speed driving scene, a high-speed driving scene, a low-speed driving scene, or the like, which is not limited herein.
Different scenes may further include a plurality of subdivided types, for example, a scene in which the distance to the vehicles ahead and behind is smaller than a preset distance within a high-speed driving scene, or a scene in which the vehicle is riding on a lane line within a constant-speed driving scene, which are not limited herein.
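As a purely illustrative sketch (not part of the patent), the S301 scene-type determination could be a simple threshold rule over the vehicle driving parameters; the threshold values and scene names below are assumptions:

```python
# Illustrative sketch only: the patent does not specify how scene types are
# derived from the vehicle driving parameters. All thresholds are assumed.

def classify_scene(speed_kmh: float, acceleration_ms2: float) -> str:
    """Map speed and acceleration to one of the scene types named above."""
    ACCEL_BAND = 0.5    # |a| below this is treated as constant speed (assumed)
    HIGH_SPEED = 100.0  # km/h threshold for a high-speed scene (assumed)
    LOW_SPEED = 30.0    # km/h threshold for a low-speed scene (assumed)

    if acceleration_ms2 > ACCEL_BAND:
        return "acceleration"
    if acceleration_ms2 < -ACCEL_BAND:
        return "deceleration"
    if speed_kmh >= HIGH_SPEED:
        return "high_speed"
    if speed_kmh <= LOW_SPEED:
        return "low_speed"
    return "constant_speed"
```

A production system would derive the scene from many more signals (lane position, inter-vehicle distance, etc.), as the subdivided types above suggest.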
S302, the first vehicle determines, through the eyeball tracking detection system, the eyeball movement trajectory of the driver of the first vehicle by means of an eyeball tracking technology;
In a specific implementation, an eyeball detection device in the eyeball tracking detection system acquires a plurality of gaze points of the driver over a period of time, and the eyeball movement trajectory is formed from these gaze points.
S303, the first vehicle determines, through the eyeball tracking detection system, the driving state of the driver according to the first scene type and the eyeball movement trajectory;
wherein the driving state may include a normal driving state and an abnormal driving state, wherein the abnormal driving state may include a fatigue driving state, a distraction driving state, and the like.
For example, in an acceleration driving scene the driver's eyeball movement is expected to remain within the driver's projection area on the front windshield, whereas in a constant-speed driving scene the expected range of eyeball movement is relatively large and may, for example, include the in-vehicle display screen, which is not limited herein.
S304, the first vehicle determines, through the eyeball tracking detection system, a control strategy according to the driving state and the vehicle driving parameters, and sends the control strategy to the driving assistance system, the control strategy being used by the driving assistance system to actively control the vehicle.
For example, when the driving state is an abnormal driving state, the control strategy may be determined according to the vehicle driving parameters: when the vehicle driving parameters indicate that the vehicle is accelerating, the control strategy may be to determine the acceleration of the vehicle from the vehicle driving parameters and decelerate the vehicle accordingly; as another example, when the vehicle driving parameters indicate that the vehicle is riding on a lane line, the control strategy may be to control the vehicle's direction so that the vehicle does not deviate from its lane.
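The four steps S301 to S304 can be sketched end to end as a toy loop; every rule inside it is an illustrative stand-in rather than the patent's actual logic, and the class and method names are assumptions:

```python
# Toy end-to-end sketch of S301-S304. Every rule below is an illustrative
# stand-in; the patent leaves the concrete logic to the implementation.

class EyeballTrackingDetectionSystem:
    def classify_scene(self, params: dict) -> str:                    # S301
        return ("acceleration"
                if params.get("acceleration", 0.0) > 0.5
                else "constant_speed")

    def driving_state(self, scene: str, trajectory: list) -> str:     # S303
        # Assumed rule: in an acceleration scene the gaze should stay ahead.
        gaze_ahead = all(region == "front" for region in trajectory)
        if scene == "acceleration" and not gaze_ahead:
            return "abnormal"
        return "normal"

    def control_strategy(self, state: str, params: dict) -> str:      # S304
        if state == "abnormal" and params.get("acceleration", 0.0) > 0.5:
            return "decelerate"
        return "none"

def run_control_cycle(system, params: dict, trajectory: list) -> str:
    """One detection cycle: scene type, driving state, then control strategy."""
    scene = system.classify_scene(params)
    state = system.driving_state(scene, trajectory)
    return system.control_strategy(state, params)
```

In the patent's architecture the resulting strategy would then be sent over the in-vehicle serial bus to the driving assistance system.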
It can be seen that, in the embodiments of the present application, a first vehicle uses the eyeball tracking detection system to determine a first scene type corresponding to its vehicle driving parameters, to determine the eyeball movement trajectory of the driver through an eyeball tracking technology, to determine the driving state of the driver according to the first scene type and the eyeball movement trajectory, and then to determine a control strategy according to the driving state and the vehicle driving parameters, which it sends to the driving assistance system so that the driving assistance system can actively control the vehicle. In this way, the first vehicle determines the driving state from the driver's eyeball movement trajectory under different scene types, rather than from the eyeball movement trajectory alone, which improves the accuracy of vehicle driving control. Moreover, the control strategy is determined from both the driver's driving state and the current vehicle driving parameters, which further improves that accuracy while also improving the real-time performance of vehicle control.
In one possible example, the eyeball tracking detection system determines the driving state of the driver according to the first scene type and the eyeball movement trajectory as follows:
the eyeball tracking detection system inputs the first scene type and the eyeball movement trajectory into a first model;
the eyeball tracking detection system determines, from the data output by the first model, whether the eyeball movement trajectory conforms to the first scene type;
when the eyeball movement trajectory conforms to the first scene type, the eyeball tracking detection system determines a first driving state of the driver;
when the eyeball movement trajectory does not conform to the first scene type, the eyeball tracking detection system determines a second driving state of the driver.
The data output by the first model may take several forms. For example, it may be a state identifier: when the identifier is 1, the eyeball movement trajectory conforms to the first scene type, and when the identifier is 2, it does not. Alternatively, the output may be a matching degree: when the matching degree is greater than or equal to a preset matching-degree threshold, the eyeball movement trajectory conforms to the first scene type, and when it is less than the threshold, it does not. The data output by the first model is therefore not specifically limited.
Wherein the first driving state represents a normal driving state, and the second driving state represents any one or more abnormal driving states.
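Interpreting the first model's output can be sketched as follows, assuming the matching-degree variant described above with an example threshold of 0.8 (the threshold value is an assumption, not given in the patent):

```python
# Sketch of interpreting the first model's output as a matching degree.
# The 0.8 threshold is an assumed example value, not given in the patent.

def conforms_to_scene(matching_degree: float, threshold: float = 0.8) -> bool:
    """True when the trajectory conforms to the first scene type."""
    return matching_degree >= threshold

def state_from_model_output(matching_degree: float) -> str:
    # First driving state = normal; second driving state = abnormal.
    return "first" if conforms_to_scene(matching_degree) else "second"
```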
Therefore, in the example, the eyeball tracking detection system of the first vehicle determines whether the eyeball movement track is normal according to the first model, so that the driving state of the driver is determined, and the intelligence and the accuracy of vehicle driving control are improved.
In this possible example, the method further comprises:
the eyeball tracking detection system acquires a plurality of pieces of historical data, wherein the historical data comprises a plurality of eyeball movement tracks corresponding to each scene type, and the scene types comprise an acceleration driving scene, a deceleration driving scene, a constant speed driving scene, a high speed driving scene and a low speed driving scene;
and the eyeball tracking detection system trains a model according to the plurality of pieces of historical data through a machine learning algorithm to obtain the first model.
The historical data are data acquired by the eyeball tracking detection system and the driving assistance system during the historical driving of the first vehicle. The plurality of pieces of historical data include historical data of different scene types, each scene type may include a plurality of pieces of historical data, and each piece of historical data includes a scene type and an eyeball movement trajectory. The first vehicle may periodically update the historical data and thereby update the first model, so that the first model better matches the user's driving habits.
The model trained by the machine learning algorithm may be, for example, a neural network model, a deep learning model, or the like, which is not limited herein.
Therefore, in the example, the first vehicle performs machine learning algorithm training on a plurality of pieces of historical data through the eyeball tracking detection system to obtain the first model, so that the intelligence and the accuracy of vehicle driving control are improved, and the vehicle driving control is more in line with the driving habits of the user.
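Since the patent does not fix a particular machine learning algorithm, the following stand-in trains the simplest possible "first model", a per-scene mean of trajectory features, purely to illustrate the train-then-match flow; the two-feature summary of a trajectory is an assumption:

```python
from collections import defaultdict

def train_first_model(history):
    """history: list of (scene_type, (f1, f2)) pairs, each feature pair
    summarising one historical eyeball movement trajectory (e.g. gaze spread
    and mean dwell time; the features are assumed). Returns a per-scene mean
    feature vector as a deliberately simple stand-in for the trained model."""
    sums = defaultdict(lambda: [0.0, 0.0])
    counts = defaultdict(int)
    for scene, (f1, f2) in history:
        sums[scene][0] += f1
        sums[scene][1] += f2
        counts[scene] += 1
    return {s: (sums[s][0] / counts[s], sums[s][1] / counts[s]) for s in sums}

def matching_degree(model, scene, features):
    """Similarity of a new trajectory's features to the scene's learned centroid."""
    cx, cy = model[scene]
    dist = ((features[0] - cx) ** 2 + (features[1] - cy) ** 2) ** 0.5
    return 1.0 / (1.0 + dist)
```

Periodically retraining on fresh history, as the patent describes, would simply mean recomputing these per-scene centroids from the updated data.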
In one possible example, the eyeball tracking detection system determines the driving state of the driver according to the first scene type and the eyeball movement trajectory as follows:
the eyeball tracking detection system determines the eyeball movement range and the maximum gaze duration of a gaze point corresponding to the first scene type;
the eyeball tracking detection system determines, from the eyeball movement trajectory, whether the eyeball movement falls within the eyeball movement range, and whether the gaze duration of each gaze point in the eyeball movement trajectory is less than the maximum gaze duration;
when the eyeball movement is within the eyeball movement range and the gaze duration of every gaze point in the eyeball movement trajectory is less than the maximum gaze duration, the eyeball tracking detection system determines a first driving state of the driver;
when the eyeball movement is not within the eyeball movement range, or any gaze point in the eyeball movement trajectory has a gaze duration longer than the maximum gaze duration, the eyeball tracking detection system determines a second driving state of the driver.
The eyeball tracking detection system can determine the corresponding eyeball movement range and maximum gaze duration according to the first scene type.
The eyeball movement range and the maximum gaze duration differ between first scene types. For example, when the distance to the vehicle ahead is short, the eyeball movement range is small and the maximum gaze duration is long; conversely, when the vehicle is driving at a constant speed, the eyeball movement range is large and the maximum gaze duration is short, which is not limited herein.
When any gaze point in the eyeball movement trajectory has a gaze duration longer than the maximum gaze duration, the driver may be in a fatigued or dazed state.
Therefore, in this example, the eyeball tracking detection system of the first vehicle determines the driving state of the driver according to the eyeball movement range corresponding to the first scene type and the maximum gazing duration of the gazing point, which is beneficial to improving the accuracy of determining the driving state and better meets the actual situation in the driving process.
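The range-and-duration check above can be sketched directly. This is an illustrative reading of the steps, with hypothetical data shapes: the track as a list of (x, y, gaze-duration) fixation points and the movement range as a bounding box:

```python
def determine_driving_state(track, movement_range, max_gaze_s):
    """Classify the driver's state from an eyeball movement track.

    track: list of (x, y, gaze_duration_s) gaze points.
    movement_range: (x_min, y_min, x_max, y_max) region expected for
        the current scene type.
    max_gaze_s: maximum allowed gaze duration for any single point.
    Returns "first" (normal) or "second" (abnormal) driving state.
    """
    x0, y0, x1, y1 = movement_range
    for x, y, dur in track:
        if not (x0 <= x <= x1 and y0 <= y <= y1):
            return "second"   # gaze left the expected region for this scene
        if dur >= max_gaze_s:
            return "second"   # over-long fixation: possible fatigue
    return "first"            # every point in range, every gaze short enough
```

For a short-following-distance scene one would pass a small bounding box and a long `max_gaze_s`; for constant-speed cruising, a large box and a short `max_gaze_s`, mirroring the examples in the text.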
In one possible example, the eye tracking detection system determines a control strategy according to the driving state and the vehicle driving parameters, including:
when the driving state is the second driving state, the eyeball tracking detection system judges whether the vehicle driving parameters are within a preset range;
when the vehicle driving parameters are all within the preset range, the eyeball tracking detection system determines a first control strategy, where the first control strategy is to issue warning information used to give an early warning to the driver;
and when any vehicle driving parameter does not fall within the preset range, the eyeball tracking detection system determines a second control strategy according to the vehicle driving parameters, where the second control strategy is used to control the speed and direction of the first vehicle.
The vehicle driving parameters may include the speed, the acceleration, the distances from the first vehicle to the vehicles ahead and behind, the distance by which the first vehicle deviates from the lane line, and the like, which are not limited herein.
The warning information may take various forms: for example, a warning sound may be played, a warning message may pop up on the display screen when the gaze point of the user is detected to be on the display screen, or the steering wheel may be heated, and the like, which is not limited herein.
Different first scene types may correspond to different preset ranges of the vehicle driving parameters, which is not limited herein.
It can be seen that, in this example, when the driver of the first vehicle is in the second driving state, the vehicle driving parameters are examined. When the parameters are normal, no control is required and the driver is only warned, which reduces the complexity of vehicle control; when any parameter is abnormal, a control strategy is determined from the vehicle driving parameters and the vehicle is controlled, which helps improve the effectiveness of vehicle control.
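The branch between the first (warn-only) and second (actively controlling) strategies can be sketched as follows; parameter names and the dictionary shapes are hypothetical, chosen only for illustration:

```python
def determine_control_strategy(driving_state, params, preset_ranges):
    """Pick a control strategy from the driving state and parameters.

    params: current vehicle driving parameters, e.g. {"speed": 60, "gap": 50}.
    preset_ranges: allowed (low, high) per parameter, keyed the same way.
    Returns None when the driver is in the first (normal) state.
    """
    if driving_state != "second":
        return None                      # normal state: no intervention needed
    # Second driving state: check every parameter against its preset range.
    out_of_range = {
        k: v for k, v in params.items()
        if not (preset_ranges[k][0] <= v <= preset_ranges[k][1])
    }
    if not out_of_range:
        # All parameters normal: first strategy, only warn the driver.
        return {"type": "first", "action": "warn_driver"}
    # Some parameter abnormal: second strategy, control speed and direction.
    return {"type": "second", "abnormal": out_of_range}
```

Note that the preset ranges themselves would vary with the first scene type, as the text states; here they are simply passed in.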
In this possible example, the control strategy is the first control strategy, and after the control strategy is sent to the driving assistance system, the method further comprises:
after the first control strategy has been in effect for a preset time period, the eyeball tracking detection system updates the driving state of the driver;
and when the updated driving state is still the second driving state, the eyeball tracking detection system determines a third control strategy according to the vehicle driving parameters.
Therefore, in this example, when the driver is detected to be still in an abnormal driving state after the warning information has been issued for the preset time period, the first vehicle determines a third control strategy according to the vehicle driving parameters, takes the initiative in vehicle control, avoids accidents, and improves driving safety.
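This escalation step can be illustrated with a small re-check helper. The callbacks, the 5-second grace period, and the strategy dictionary are all hypothetical placeholders for whatever the real system would use:

```python
def escalate_after_warning(recheck_state, compute_third_strategy, params,
                           warned_at, now, grace_s=5.0):
    """After the first (warning) strategy has been active for grace_s
    seconds, re-evaluate the driver; if the driver is still in the
    second driving state, escalate to a third, actively controlling
    strategy derived from the vehicle driving parameters.

    recheck_state: callable returning the freshly determined driving state.
    compute_third_strategy: callable mapping parameters to a strategy.
    warned_at, now: timestamps in seconds.
    """
    if now - warned_at < grace_s:
        return None                       # still within the warning grace period
    if recheck_state() == "second":
        return compute_third_strategy(params)
    return None                           # driver recovered: no escalation
```

In practice the timestamps would come from the vehicle clock and `recheck_state` would re-run the eyeball-track classification described earlier.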
In one possible example, the eye tracking detection system determines a second control strategy based on the vehicle driving parameters, including:
the eyeball tracking detection system determines a target vehicle speed according to the distance between the first vehicle and a front vehicle;
the eyeball tracking detection system calculates acceleration according to the target vehicle speed and the current vehicle speed;
the eyeball tracking detection system determines a control direction according to the position relation between the first vehicle and a lane line, and the acceleration and the control direction are used as the second control strategy.
In this example, the first vehicle determines the target vehicle speed according to the distance to the vehicle ahead and the control direction according to the positional relationship with the lane line, which helps improve the intelligence of vehicle control and the safety of vehicle driving.
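One plausible concrete form of this second control strategy, assuming a time-headway rule for the target speed and a lane-offset dead band for steering (the 2 s headway, 3 s horizon, and 0.2 m threshold are illustrative assumptions, not values from the patent):

```python
def second_control_strategy(gap_m, current_speed_mps, lane_offset_m,
                            time_headway_s=2.0, horizon_s=3.0):
    """Derive the second control strategy from vehicle driving parameters.

    gap_m: distance to the vehicle ahead, metres.
    current_speed_mps: current speed, metres per second.
    lane_offset_m: lateral offset from the lane centre (positive = right).
    """
    # Target speed that restores a safe time headway to the vehicle ahead.
    target_speed = gap_m / time_headway_s
    # Acceleration needed to reach the target speed within the horizon.
    acceleration = (target_speed - current_speed_mps) / horizon_s
    # Control direction from the positional relationship with the lane line.
    if lane_offset_m > 0.2:
        direction = "steer_left"    # drifted right of centre
    elif lane_offset_m < -0.2:
        direction = "steer_right"   # drifted left of centre
    else:
        direction = "keep_straight"
    return {"acceleration": acceleration, "direction": direction}
```

With a 30 m gap and a 2 s headway the target speed is 15 m/s, so a vehicle travelling at 20 m/s would receive a negative (braking) acceleration.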
Referring to fig. 4, fig. 4 is a schematic flowchart of another vehicle driving control method according to an embodiment of the present application, which can be applied to a first vehicle. As shown in the figure, the vehicle driving control method includes the following operations:
S401, a first vehicle determines, through an eyeball tracking detection system, a first scene type corresponding to vehicle driving parameters of the first vehicle, wherein the vehicle driving parameters are sent to the eyeball tracking detection system by a driving assistance system.
S402, the first vehicle determines the eyeball movement track of the driver of the first vehicle through the eyeball tracking detection system using eyeball tracking technology.
S403, the first vehicle inputs the first scene type and the eyeball movement track into a first model through the eyeball tracking detection system.
S404, the first vehicle determines, through the eyeball tracking detection system, whether the eyeball movement track conforms to the first scene type according to the data output by the first model.
S405, when the eyeball movement track does not conform to the first scene type, the first vehicle determines a second driving state of the driver through the eyeball tracking detection system.
S406, the first vehicle judges, through the eyeball tracking detection system, whether the vehicle driving parameters are within a preset range.
S407, when the vehicle driving parameters are all within the preset range, the first vehicle determines a first control strategy through the eyeball tracking detection system, where the first control strategy is to issue warning information used to give an early warning to the driver.
S408, when any vehicle driving parameter does not fall within the preset range, the first vehicle determines, through the eyeball tracking detection system, a second control strategy according to the vehicle driving parameters, where the second control strategy is used to control the speed and direction of the first vehicle, and sends the second control strategy to the driving assistance system.
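The whole S401–S408 flow can be condensed into one decision function. This is a hedged composite sketch: the model format, feature names, and strategy dictionaries are hypothetical, standing in for the first model and parameter checks the steps describe:

```python
def model_conforms(model, scene_type, spread, max_fixation_s):
    """S403/S404: check the trajectory features against the first model
    (here a simple per-scene limit table standing in for a trained model)."""
    limits = model[scene_type]
    return spread <= limits["spread"] and max_fixation_s <= limits["fixation_s"]

def control_step(model, scene_type, spread, max_fixation_s,
                 params, preset_ranges):
    """One pass of the S401-S408 flow: classify the eyeball movement
    track, then pick a control strategy from the driving parameters."""
    if model_conforms(model, scene_type, spread, max_fixation_s):
        return None                       # first driving state: no action (S404)
    # Second driving state (S405): examine the vehicle driving parameters (S406).
    abnormal = {k: v for k, v in params.items()
                if not (preset_ranges[k][0] <= v <= preset_ranges[k][1])}
    if not abnormal:
        return {"strategy": "first", "action": "warn_driver"}   # S407
    return {"strategy": "second", "abnormal": abnormal}         # S408
```

The returned strategy would then be handed to the driving assistance system, which performs the actual vehicle control.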
It can be seen that, in the embodiment of the present application, the first vehicle determines, through the eyeball tracking detection system, a first scene type corresponding to its vehicle driving parameters; determines the eyeball movement track of the driver using eyeball tracking technology; determines the driving state of the driver according to the first scene type and the eyeball movement track; and then determines a control strategy according to the driving state and the vehicle driving parameters and sends it to the driving assistance system, where the control strategy is used by the driving assistance system to actively control the vehicle. In this way, the first vehicle determines the driving state from the eyeball movement track of the user under different scene types, rather than from the eyeball movement track alone, which improves the accuracy of vehicle driving control; moreover, the control strategy is determined from both the driving state of the driver and the current vehicle driving parameters, which improves the real-time performance of vehicle control while further improving its accuracy.
In addition, the first vehicle determines, through the eyeball tracking detection system, whether the eyeball movement track is normal according to the first model, and then determines the driving state of the driver, which improves the intelligence and accuracy of vehicle driving control.
In addition, when the driver of the first vehicle is in the second driving state, the vehicle driving parameters are examined. When the parameters are normal, no control is required and the driver is only warned, which reduces the complexity of vehicle control; when any parameter is abnormal, a control strategy is determined from the vehicle driving parameters and the vehicle is controlled, which improves the effectiveness of vehicle control.
Referring to fig. 5, fig. 5 is a schematic flowchart of another vehicle driving control method according to an embodiment of the present application, which can be applied to a first vehicle. As shown in the figure, the vehicle driving control method includes the following operations:
S501, a first vehicle determines, through an eyeball tracking detection system, a first scene type corresponding to vehicle driving parameters of the first vehicle, wherein the vehicle driving parameters are sent to the eyeball tracking detection system by a driving assistance system.
S502, the first vehicle determines the eyeball movement track of the driver of the first vehicle through the eyeball tracking detection system using eyeball tracking technology.
S503, the first vehicle determines, through the eyeball tracking detection system, the eyeball movement range and the maximum gaze duration of a gaze point corresponding to the first scene type.
S504, the first vehicle determines, through the eyeball tracking detection system and according to the eyeball movement track, whether the eyeball movement is within the eyeball movement range, and whether the gaze duration of each gaze point in the track is less than the maximum gaze duration.
S505, when the eyeball movement is not within the eyeball movement range, or some gaze point in the eyeball movement track has a gaze duration longer than the maximum gaze duration, the first vehicle determines a second driving state of the driver through the eyeball tracking detection system.
S506, when the driving state is the second driving state, the first vehicle judges, through the eyeball tracking detection system, whether the vehicle driving parameters are within a preset range.
S507, when the vehicle driving parameters of the first vehicle are all within the preset range, the eyeball tracking detection system determines a first control strategy, where the first control strategy is to issue warning information used to give an early warning to the driver.
S508, after the first control strategy has been in effect for a preset time period, the first vehicle updates the driving state of the driver through the eyeball tracking detection system.
S509, when the updated driving state is still the second driving state, the first vehicle determines, through the eyeball tracking detection system, a third control strategy according to the vehicle driving parameters, and sends the third control strategy to the driving assistance system, where the third control strategy is used by the driving assistance system to actively control the vehicle.
It can be seen that, in the embodiment of the present application, the first vehicle determines, through the eyeball tracking detection system, a first scene type corresponding to its vehicle driving parameters; determines the eyeball movement track of the driver using eyeball tracking technology; determines the driving state of the driver according to the first scene type and the eyeball movement track; and then determines a control strategy according to the driving state and the vehicle driving parameters and sends it to the driving assistance system, where the control strategy is used by the driving assistance system to actively control the vehicle. In this way, the first vehicle determines the driving state from the eyeball movement track of the user under different scene types, rather than from the eyeball movement track alone, which improves the accuracy of vehicle driving control; moreover, the control strategy is determined from both the driving state of the driver and the current vehicle driving parameters, which improves the real-time performance of vehicle control while further improving its accuracy.
In addition, the eyeball tracking detection system of the first vehicle determines the driving state of the driver according to the eyeball movement range and the maximum gaze duration corresponding to the first scene type, which improves the accuracy of determining the driving state and better reflects the actual situation during driving.
In addition, when the driver of the first vehicle is in the second driving state, the vehicle driving parameters are examined; when they are normal, no control is required and the driver is only warned, which reduces the complexity of vehicle control. When the driver is detected to be still in an abnormal driving state after the warning information has been issued for the preset time period, a third control strategy is determined according to the vehicle driving parameters, so that the first vehicle takes the initiative in vehicle control, avoids accidents, and improves driving safety.
The embodiment of the present application provides a vehicle travel control apparatus, which may be a first vehicle 100. Specifically, the vehicle travel control device is configured to execute the steps of the above vehicle travel control method. The vehicle running control device provided by the embodiment of the application can comprise modules corresponding to the corresponding steps.
The embodiment of the present application may divide the functional modules of the vehicle driving control device according to the above method example, for example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The division of the modules in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 6 is a schematic diagram showing a possible configuration of the vehicle travel control device according to the above embodiment, in a case where each functional module is divided in correspondence with each function. As shown in fig. 6, the vehicle travel control apparatus 600 includes a determination unit 601 and a transmission unit 602, in which:
the determining unit 601 is configured to determine, by the eyeball tracking detection system, a first scene type corresponding to a vehicle driving parameter of the first vehicle, where the vehicle driving parameter is sent to the eyeball tracking detection system by the driving assistance system through the sending unit 602; and means for determining, by the eye tracking detection system, an eye movement trajectory of a driver of the first vehicle according to an eye tracking technique; and determining, by the eye tracking detection system, a driving state of the driver from the first scene type and the eye movement trajectory; and for determining a control strategy by the eye tracking detection system from the driving state and the vehicle driving parameters;
the sending unit 602 is configured to send the control policy to the driving assistance system through the eyeball tracking detection system, where the control policy is used for the driving assistance system to actively control a vehicle.
All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again. Of course, the vehicle driving control device provided in the embodiment of the present application includes, but is not limited to, the above modules, for example: the vehicle travel control apparatus may further include a storage unit 603. The storage unit 603 may be used to store program codes and data of the vehicle travel control apparatus.
In the case of using an integrated unit, a schematic structural diagram of a vehicle driving control device provided in an embodiment of the present application is shown in fig. 7. In fig. 7, the vehicle driving control device 700 includes a processing module 702 and a communication module 701. The processing module 702 is configured to control and manage the actions of the vehicle driving control device, for example, to execute the steps performed by the determination unit 601 and/or other processes of the techniques described herein. The communication module 701 is configured to support interaction between the vehicle driving control device and other devices, or between modules inside the vehicle driving control device, such as communication between the eyeball tracking detection system and the driving assistance system. As shown in fig. 7, the vehicle driving control device may further include a storage module 703, which is used to store the program codes and data of the vehicle driving control device.
The processing module 702 may be a processor or a controller, such as a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an ASIC, an FPGA or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of computing devices, for example, a combination of a DSP and a microprocessor, or a plurality of microprocessors. The communication module 701 may be a transceiver, a radio-frequency circuit, a communication interface, or the like. The storage module 703 may be a memory.
All relevant contents of each scene related to the method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again. Both of the vehicle running control apparatus 600 and the vehicle running control apparatus 700 may execute the vehicle running control method shown in any one of fig. 3 to 5.
The present embodiment also provides a computer storage medium having stored therein computer instructions that, when executed on a first vehicle, cause the first vehicle to perform the above-described associated method steps to implement the method of operation in the above-described embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to execute the above-described related steps to implement the vehicle travel control method in the above-described embodiment.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the vehicle running control method in the above method embodiments.
The first vehicle, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that beneficial effects achieved by the first vehicle, the computer storage medium, the computer program product, or the chip may refer to beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division of the above functional modules is used as an example, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division of modules or units is only a division by logical function, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A vehicle travel control method applied to a first vehicle including an eye tracking detection system and a travel assist system, the method comprising:
the eyeball tracking detection system determines a first scene type corresponding to vehicle driving parameters of the first vehicle, wherein the vehicle driving parameters are sent to the eyeball tracking detection system by the driving auxiliary system;
the eyeball tracking detection system determines the eyeball movement track of the driver of the first vehicle through an eyeball tracking technology;
the eyeball tracking detection system determines the driving state of the driver according to the first scene type and the eyeball moving track; the eyeball tracking detection system determines a control strategy according to the driving state and the vehicle driving parameters, and sends the control strategy to the driving assistance system, wherein the control strategy is used for the driving assistance system to actively control the vehicle;
wherein the eyeball tracking detection system determines the driving state of the driver according to the first scene type and the eyeball movement track, and comprises:
the eyeball tracking detection system inputs the first scene type and the eyeball moving track into a first model; the eyeball tracking detection system determines whether the eyeball movement track conforms to the first scene type according to the data output by the first model; when the eye movement trajectory conforms to the first scene type, the eye tracking detection system determines a first driving state of the driver; when the eyeball movement track does not conform to the first scene type, the eyeball tracking detection system determines a second driving state of the driver;
the first model is obtained by training a model through a machine learning algorithm according to a plurality of pieces of historical data; the historical data comprises a plurality of eyeball movement tracks corresponding to each scene type, and the scene types comprise an acceleration driving scene, a deceleration driving scene, a constant speed driving scene, a high speed driving scene and a low speed driving scene;
alternatively,
wherein the eyeball tracking detection system determines the driving state of the driver according to the first scene type and the eyeball movement track, and comprises:
the eyeball tracking detection system determines an eyeball movement range and the maximum fixation time of a fixation point corresponding to the first scene type; the eyeball tracking detection system determines whether the eyeball movement is within the eyeball movement range according to the eyeball movement track and determines whether the watching duration corresponding to each watching point in the eyeball movement track is smaller than the maximum watching duration; when the eyeball moves in the eyeball movement range and the watching duration of the watching point in the eyeball movement track is smaller than the maximum watching duration, the eyeball tracking detection system determines a first driving state of the driver; and when the eyeball movement is not within the eyeball movement range or the fixation point in the eyeball movement track has the fixation point with the fixation time length longer than the maximum fixation time length, the eyeball tracking detection system determines a second driving state of the driver.
2. The method of claim 1, wherein the eye tracking detection system determines a control strategy from the driving state and the vehicle travel parameters, comprising:
when the driving state is a second driving state, the eyeball tracking detection system judges whether the vehicle driving parameters are within a preset range; when the vehicle driving parameters are all within the preset range, the eyeball tracking detection system determines a first control strategy, the first control strategy is to send out warning information, and the warning information is used for sending out early warning to the driver;
and when the vehicle driving parameters have parameters which do not accord with the preset range, the eyeball tracking detection system determines a second control strategy according to the vehicle driving parameters, wherein the second control strategy is used for controlling the speed and the direction of the first vehicle.
3. The method of claim 2, wherein the control policy is a first control policy, and wherein after sending the control policy to the driving assistance system, the method further comprises: after the first control strategy sends out a preset time period, the eyeball tracking detection system updates the driving state of the driver;
and when the driving state is the second driving state, the eyeball tracking detection system determines a third control strategy according to the vehicle driving parameters.
4. A method according to claim 2 or 3, wherein the eye tracking detection system determines a second control strategy based on the vehicle driving parameters, comprising:
the eyeball tracking detection system determines a target vehicle speed according to the distance between the first vehicle and a front vehicle;
the eyeball tracking detection system calculates acceleration according to the target vehicle speed and the current vehicle speed;
the eyeball tracking detection system determines a control direction according to the position relation between the first vehicle and a lane line, and the acceleration and the control direction are used as the second control strategy.
5. A vehicle travel control method applied to a first vehicle including an eye tracking detection system and a travel assist system, the method comprising:
the eyeball tracking detection system determines a first scene type corresponding to vehicle driving parameters of the first vehicle, wherein the vehicle driving parameters are sent to the eyeball tracking detection system by the driving auxiliary system; the eyeball tracking detection system determines the eyeball movement track of the driver of the first vehicle through an eyeball tracking technology;
the eyeball tracking detection system determines the driving state of the driver according to the first scene type and the eyeball moving track; the eyeball tracking detection system determines a control strategy according to the driving state and the vehicle driving parameters, and sends the control strategy to the driving assistance system, wherein the control strategy is used for the driving assistance system to actively control the vehicle; when the eye movement trajectory conforms to the first scene type, the eye tracking detection system determines a first driving state of the driver; when the eyeball movement track does not conform to the first scene type, the eyeball tracking detection system determines a second driving state of the driver; or, when the eyeball moves within the eyeball movement range and the gazing duration of the gazing point in the eyeball movement track is less than the maximum gazing duration, the eyeball tracking detection system determines a first driving state of the driver; when the eyeball movement is not within the eyeball movement range or a fixation point with fixation time length longer than the maximum fixation time length exists in the fixation point in the eyeball movement track, the eyeball tracking detection system determines a second driving state of the driver;
wherein,
the eyeball tracking detection system determines a control strategy through the driving state and the vehicle driving parameters, and comprises the following steps: when the driving state is a second driving state, the eyeball tracking detection system judges whether the vehicle driving parameters are within a preset range; when the vehicle driving parameters are all within the preset range, the eyeball tracking detection system determines a first control strategy, the first control strategy is to send out warning information, and the warning information is used for sending out early warning to the driver; and when the vehicle driving parameters have parameters which do not accord with the preset range, the eyeball tracking detection system determines a second control strategy according to the vehicle driving parameters, wherein the second control strategy is used for controlling the speed and the direction of the first vehicle.
6. A vehicle travel control apparatus, applied to a first vehicle including an eye tracking detection system and a driving assistance system, the apparatus comprising a determining unit and a sending unit, wherein:
the determining unit is configured to determine, through the eye tracking detection system, a first scene type corresponding to vehicle driving parameters of the first vehicle, where the vehicle driving parameters are sent to the eye tracking detection system by the driving assistance system via the sending unit; to determine, through the eye tracking detection system, an eye movement trajectory of a driver of the first vehicle according to an eye tracking technique; to determine, through the eye tracking detection system, a driving state of the driver from the first scene type and the eye movement trajectory; and to determine, through the eye tracking detection system, a control strategy from the driving state and the vehicle driving parameters;
the sending unit is configured to send the control strategy to the driving assistance system through the eye tracking detection system, where the control strategy is used by the driving assistance system to actively control the vehicle;
wherein,
the determining of the driving state of the driver according to the first scene type and the eye movement trajectory comprises: inputting the first scene type and the eye movement trajectory into a first model; determining, from the output of the first model, whether the eye movement trajectory conforms to the first scene type; determining a first driving state of the driver when the eye movement trajectory conforms to the first scene type; and determining a second driving state of the driver when the eye movement trajectory does not conform to the first scene type; wherein the first model is obtained by training a model with a machine learning algorithm on a plurality of pieces of historical data, the historical data comprising a plurality of eye movement trajectories corresponding to each scene type, and the scene types comprising an acceleration driving scene, a deceleration driving scene, a constant-speed driving scene, a high-speed driving scene, and a low-speed driving scene;
or,
wherein,
the eye tracking detection system determining the driving state of the driver according to the first scene type and the eye movement trajectory comprises the following steps: the eye tracking detection system determines an eye movement range and a maximum gaze duration of a fixation point corresponding to the first scene type; the eye tracking detection system determines, according to the eye movement trajectory, whether the eye movement is within the eye movement range and whether the gaze duration of each fixation point in the eye movement trajectory is less than the maximum gaze duration; when the eye movement is within the eye movement range and the gaze duration of each fixation point in the eye movement trajectory is less than the maximum gaze duration, the eye tracking detection system determines a first driving state of the driver; and when the eye movement is not within the eye movement range, or a fixation point in the eye movement trajectory has a gaze duration longer than the maximum gaze duration, the eye tracking detection system determines a second driving state of the driver.
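As a rough illustration, the range-and-duration check described above might look like this in Python. The per-scene gaze regions, the normalized gaze coordinates, and the duration thresholds are invented for the sketch; the patent does not specify any of these values.

```python
# Hypothetical per-scene calibration: gaze region as normalized
# (x_min, x_max, y_min, y_max) bounds plus a maximum gaze duration in
# seconds. The claims leave the concrete ranges unspecified.
SCENE_PROFILES = {
    "high_speed": ((0.3, 0.7, 0.4, 0.8), 1.5),
    "low_speed":  ((0.1, 0.9, 0.2, 0.9), 3.0),
}

def classify_driving_state(scene, trajectory):
    """trajectory: list of (x, y, gaze_duration_s) fixation points.

    Returns 'first' (normal) when every fixation lies inside the
    scene's eye movement range and lasts less than the scene's maximum
    gaze duration; otherwise 'second' (abnormal), as in the claim."""
    (x0, x1, y0, y1), max_dur = SCENE_PROFILES[scene]
    for x, y, dur in trajectory:
        if not (x0 <= x <= x1 and y0 <= y <= y1):
            return "second"  # eye movement left the expected range
        if dur > max_dur:
            return "second"  # fixation exceeded the maximum gaze duration
    return "first"
```

A single out-of-range fixation, or a single over-long one, is enough to flag the second driving state, which matches the "or" condition in the claim.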
7. A vehicle travel control apparatus, applied to a first vehicle including an eye tracking detection system and a driving assistance system, the apparatus comprising a determining unit and a sending unit, wherein:
the determining unit is configured to determine, through the eye tracking detection system, a first scene type corresponding to vehicle driving parameters of the first vehicle, where the vehicle driving parameters are sent to the eye tracking detection system by the driving assistance system via the sending unit; to determine, through the eye tracking detection system, an eye movement trajectory of a driver of the first vehicle according to an eye tracking technique; to determine, through the eye tracking detection system, a driving state of the driver from the first scene type and the eye movement trajectory; and to determine, through the eye tracking detection system, a control strategy from the driving state and the vehicle driving parameters; wherein the eye tracking detection system determines a first driving state of the driver when the eye movement trajectory conforms to the first scene type, and determines a second driving state of the driver when the eye movement trajectory does not conform to the first scene type; or, the eye tracking detection system determines a first driving state of the driver when the eye movement is within the eye movement range and the gaze duration of each fixation point in the eye movement trajectory is less than the maximum gaze duration, and determines a second driving state of the driver when the eye movement is not within the eye movement range or a fixation point in the eye movement trajectory has a gaze duration longer than the maximum gaze duration;
the sending unit is configured to send the control strategy to the driving assistance system through the eye tracking detection system, where the control strategy is used by the driving assistance system to actively control the vehicle;
wherein,
the eye tracking detection system determining a control strategy from the driving state and the vehicle driving parameters comprises the following steps: when the driving state is the second driving state, the eye tracking detection system judges whether the vehicle driving parameters are within a preset range; when the vehicle driving parameters are all within the preset range, the eye tracking detection system determines a first control strategy, the first control strategy being to issue warning information, and the warning information being used to give an early warning to the driver; and when any of the vehicle driving parameters falls outside the preset range, the eye tracking detection system determines a second control strategy according to the vehicle driving parameters, the second control strategy being used to control the speed and direction of the first vehicle.
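The claims also enumerate the scene types the first scene type is drawn from: acceleration, deceleration, constant-speed, high-speed, and low-speed driving scenes. A minimal sketch of mapping vehicle driving parameters to one of those types might look like this; all numeric thresholds are illustrative assumptions, as the patent defines the categories but not their boundaries.

```python
# Hypothetical thresholds; the patent names the scene types but does
# not define numeric boundaries for them.
ACCEL_THRESHOLD_MS2 = 0.5
HIGH_SPEED_KMH = 90.0
LOW_SPEED_KMH = 30.0

def scene_type(speed_kmh, accel_ms2):
    """Map vehicle driving parameters to one of the scene types the
    claims enumerate, checking acceleration first, then speed bands."""
    if accel_ms2 > ACCEL_THRESHOLD_MS2:
        return "acceleration"
    if accel_ms2 < -ACCEL_THRESHOLD_MS2:
        return "deceleration"
    if speed_kmh >= HIGH_SPEED_KMH:
        return "high_speed"
    if speed_kmh < LOW_SPEED_KMH:
        return "low_speed"
    return "constant_speed"
```

Checking acceleration before speed is one possible design choice here; the claims do not state how overlapping conditions (e.g. accelerating at high speed) are resolved.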
8. A first vehicle, comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the one or more programs comprising instructions for performing the steps of the method of any one of claims 1-5.
9. A computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method of any one of claims 1-5.
CN202010385575.5A 2020-05-09 2020-05-09 Vehicle running control method and device Active CN111559382B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010385575.5A CN111559382B (en) 2020-05-09 2020-05-09 Vehicle running control method and device

Publications (2)

Publication Number Publication Date
CN111559382A CN111559382A (en) 2020-08-21
CN111559382B true CN111559382B (en) 2021-11-02

Family

ID=72074599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010385575.5A Active CN111559382B (en) 2020-05-09 2020-05-09 Vehicle running control method and device

Country Status (1)

Country Link
CN (1) CN111559382B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112396235B (en) * 2020-11-23 2022-05-03 浙江天行健智能科技有限公司 Traffic accident occurrence time prediction modeling method based on eyeball motion tracking

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105654674A (en) * 2014-10-14 2016-06-08 大众汽车有限公司 Monitoring of attention degree of vehicle driver
CN109435958A (en) * 2018-10-18 2019-03-08 巴中门口网络科技有限公司 Using the anti-fatigue-driving method of biological identification technology
CN110648502A (en) * 2019-09-26 2020-01-03 中国第一汽车股份有限公司 Method, system, device and storage medium for preventing fatigue driving
CN110816543A (en) * 2019-10-28 2020-02-21 东南大学 Driver distraction driving detection and early warning system and method under vehicle turning and lane changing scenes

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9202106B2 (en) * 2011-07-11 2015-12-01 Toyota Jidosha Kabushiki Kaisha Eyelid detection device

Similar Documents

Publication Publication Date Title
EP3033999A1 (en) Apparatus and method for determining the state of a driver
US20200339133A1 (en) Driver distraction determination
JP7329755B2 (en) Support method and support system and support device using the same
CN110155072B (en) Carsickness prevention method and carsickness prevention device
JP2017136922A (en) Vehicle control device, on-vehicle device controller, map information generation device, vehicle control method, and on-vehicle device control method
CN110733426B (en) Sight blind area monitoring method, device, equipment and medium
US10462281B2 (en) Technologies for user notification suppression
CN108928294A (en) Driving dangerous based reminding method, device, terminal and computer readable storage medium
US20190283774A1 (en) Travel control apparatus, vehicle, travel control system, travel control method, and storage medium
EP3885220B1 (en) Automatically estimating skill levels and confidence levels of drivers
CN109720352B (en) Vehicle driving assistance control method and apparatus
CN111559382B (en) Vehicle running control method and device
CN112477859B (en) Lane keeping assist method, apparatus, device and readable storage medium
CN115123245A (en) Driving mode switching method and device and vehicle
CN113460050A (en) Vehicle follow-up stop control method and device and computer readable storage medium
JP6978962B2 (en) Mobile control system and control method
JP6922494B2 (en) Driving support method and driving support device
CN113942504B (en) Self-adaptive cruise control method and device
CN114954447A (en) Target vehicle control method and device
JP2018081555A (en) Vehicular display device, vehicular display method, and vehicular display program
KR20220051889A (en) Apparatus and method for controlling driving of vehicle
CN114954473A (en) Control method, device and equipment for vehicle driving mode and vehicle
CN110696712B (en) Automatic adjusting method and device for automobile rearview mirror, computer storage medium and automobile
CN111866056B (en) Information pushing method, device, electronic equipment and storage medium
CN115675499A (en) Lane deviation warning method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant