WO2021249020A1 - Driving state prediction method, apparatus, and terminal device - Google Patents

Driving state prediction method, apparatus, and terminal device

Info

Publication number
WO2021249020A1
WO2021249020A1 · PCT/CN2021/087578 · CN2021087578W
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
time
predicted route
driving state
detection information
Prior art date
Application number
PCT/CN2021/087578
Other languages
English (en)
French (fr)
Inventor
Dong Hui (董卉)
Zhou Wei (周伟)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021249020A1

Classifications

    All classifications fall under B (performing operations; transporting) → B60 (vehicles in general) → B60W (conjoint control of vehicle sub-units of different type or different function; control systems specially adapted for hybrid vehicles; road vehicle drive control systems for purposes not related to the control of a particular sub-unit):

    • B60W50/00 — Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097 — Predicting future conditions
    • B60W30/08 — Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 — Predicting travel path or likelihood of collision
    • B60W30/0953 — Predicting travel path or likelihood of collision, the prediction being responsive to vehicle dynamic parameters
    • B60W30/14 — Adaptive cruise control
    • B60W40/10 — Estimation or calculation of non-directly measurable driving parameters related to vehicle motion
    • B60W50/08 — Interaction between the driver and the control system
    • B60W50/14 — Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W60/00 — Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 — Planning or execution of driving tasks
    • B60W60/0027 — Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • B60W2520/10 — Longitudinal speed (input parameters relating to overall vehicle dynamics)
    • B60W2520/12 — Lateral speed (input parameters relating to overall vehicle dynamics)
    • B60W2520/14 — Yaw (input parameters relating to overall vehicle dynamics)
    • B60W2554/4041 — Position (characteristics of dynamic objects)
    • B60W2554/4042 — Longitudinal speed (characteristics of dynamic objects)
    • B60W2554/4043 — Lateral speed (characteristics of dynamic objects)

Definitions

  • The present invention relates to the technical field of intelligent driving, and in particular to a method, an apparatus, and a terminal device for predicting a driving state.
  • Lane changing is a common vehicle maneuvering behavior, and it is also one of the main causes of vehicle collision accidents. Therefore, identifying the lane change behavior of surrounding vehicles in advance is of great significance to improving the safety of the self-vehicle.
  • In the prior art, the driving state of a vehicle is only divided into straight-line keeping and lane changing, with no subdivision of how the vehicle changes lanes.
  • When the self-vehicle finds that another vehicle's lane change endangers it but cannot determine whether that vehicle is cutting in from the left or the right, it cannot steer to avoid the danger, and a collision with the other vehicle may occur.
  • In view of this, the embodiments of the present application provide a method, an apparatus, and a terminal device for predicting a driving state.
  • In a first aspect, the present application provides a method for predicting a driving state, including: acquiring detection information at the current moment and detection information within the N seconds before the current moment, the information being obtained by detecting at least one vehicle around a first vehicle, the at least one vehicle including a second vehicle, where N is a positive number greater than zero and less than ten; and determining the driving state of the second vehicle at the current moment, the driving state being obtained by inputting the detection information at the current moment and the detection information within the N seconds before the current moment into a classification model, where the driving state is one of keeping straight, left lane change cut-in, left lane change cut-out, right lane change cut-in, or right lane change cut-out.
  • The first vehicle detects other surrounding vehicles through sensors, and then inputs the detection information at the current moment, together with the detection information from a period of time before it, into the classification model, so as to accurately identify the driving state of surrounding vehicles at the current moment.
  • The driving state is divided into five categories, keeping straight, left lane change cut-in, left lane change cut-out, right lane change cut-in, and right lane change cut-out, so as to predict the driving state of a vehicle more precisely.
  • In this way, the first vehicle can formulate a correspondingly accurate driving strategy or prompt according to the predicted driving state of the other vehicle.
  • In one possible implementation, the detection information includes: the speed and yaw angle of the first vehicle, and one or more of the second vehicle's lateral speed, longitudinal speed or speed, global position and/or relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw angle, and heading angle.
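As an illustration of how such per-frame detection information might be packed into a classifier input, the sketch below flattens the current frame plus the N-second history into one feature vector. All field and function names here are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class Detection:
    """One frame of detection information (illustrative fields only)."""
    ego_speed: float    # speed of the first vehicle (m/s)
    ego_yaw: float      # yaw angle of the first vehicle (rad)
    lat_speed: float    # lateral speed of the second vehicle (m/s)
    lon_speed: float    # longitudinal speed of the second vehicle (m/s)
    rel_x: float        # relative longitudinal position (m)
    rel_y: float        # relative lateral position (m)
    left_signal: int    # 1 if the left turn signal is on
    right_signal: int   # 1 if the right turn signal is on

def build_feature_vector(history: List[Detection]) -> List[float]:
    """Flatten the current frame plus its history into one classifier input."""
    vec: List[float] = []
    for frame in history:
        vec.extend(asdict(frame).values())  # dataclass preserves field order
    return vec
```

The classifier then sees a fixed-length vector as long as the history window covers the same number of frames each time.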
  • In this way, the first vehicle is not limited to acquiring lane line information; it can also predict the driving state of other vehicles based on its own information and the other vehicles' information, so that it can still accurately predict the driving state of other vehicles in the absence of lane lines.
  • In one possible implementation, acquiring the detection information within the N seconds before the current moment includes: acquiring detection information within the Q seconds before the current moment, where Q is a positive number greater than zero and less than N.
  • When the historical vehicle information within the specified time period is insufficient, historical information from a shorter time period can be selected to ensure that the classification model still accurately predicts the driving state of other vehicles.
  • In one possible implementation, when the current moment is the time at which the sensor starts detecting, the method further includes: determining the driving state of the second vehicle according to the detection information at the current moment alone.
  • In this case, the current vehicle information and other current information can be selected to ensure that the classification model can still predict the driving state of other vehicles.
  • In one possible implementation, the method further includes: calculating a predicted route of the second vehicle within the M seconds after the current moment, the predicted route being calculated according to the driving state of the second vehicle.
  • M seconds is a time period set in the first vehicle for predicting the routes of other vehicles, and its value is a positive number greater than zero.
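The patent text does not give the route formula. Under the constant-velocity (CV) model mentioned later in connection with FIG. 8, a minimal sketch of an M-second predicted route could look like this (the function name and time step are assumptions):

```python
def predict_route_cv(x, y, vx, vy, horizon_s, dt=0.1):
    """Constant-velocity route prediction: sample (x, y) positions every dt
    seconds for the next horizon_s seconds, assuming velocity stays fixed."""
    steps = int(horizon_s / dt)
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]
```

A curvature-aware driving state (e.g. a cut-in) would instead warp this straight route toward the adjacent lane, but the CV form above is the simplest baseline.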
  • In one possible implementation, the method further includes: calculating the time at which the second vehicle would collide with the first vehicle according to the predicted route of the second vehicle; when this time to collision is not greater than a set threshold, generating alarm information used to remind the user that the first vehicle is in a dangerous state.
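A hedged sketch of the threshold check described above, assuming a simple distance-over-closing-speed time-to-collision (the patent does not specify the exact computation; names and the 3-second default are illustrative):

```python
def time_to_collision(rel_distance, closing_speed):
    """Seconds until collision along the predicted route; inf if not closing."""
    if closing_speed <= 0:
        return float("inf")
    return rel_distance / closing_speed

def should_alarm(rel_distance, closing_speed, threshold_s=3.0):
    """Alarm when the time to collision is not greater than the set threshold."""
    return time_to_collision(rel_distance, closing_speed) <= threshold_s
```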
  • In one possible implementation, the method further includes: sending the predicted route of the second vehicle, or the predicted route together with the time of the collision, to the advanced driver assistance system (ADAS), which adjusts the following mode and following distance of the first vehicle according to this information.
  • The ADAS then automatically starts warning prompts, automatic deceleration, or active braking, so that there is enough time to change the driving state of the self-vehicle and avoid a collision with other vehicles.
  • In one possible implementation, the method further includes: sending the predicted route of the second vehicle, or the predicted route together with the time of the collision, to the ADAS, which generates the warning message or controls the first vehicle to brake according to this information.
  • Alternatively, the ADAS activates the automatic steering mode, analyzes the distances to the other vehicles around the vehicle 100, and controls the self-vehicle to steer into a lane where there is no danger, ensuring there is enough time to change the driving state of the self-vehicle and avoid a collision with other vehicles.
  • In one possible implementation, the method further includes: sending the predicted route of the second vehicle, or the predicted route together with the time of the collision, to the ADAS, which controls the first vehicle to turn according to this information.
  • If a vehicle cuts into the self-vehicle's lane, the ADAS switches the target of the "following mode" and "following distance" to that vehicle; if a vehicle cuts out of the lane, the ADAS switches away from it and searches among the other vehicles in the lane for a new "following mode" and "following distance" target.
  • In one possible implementation, the classification model is a five-class model, such as a support vector machine (SVM) model or a hidden Markov model (HMM).
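As a sketch only: a five-class SVM can be built with an off-the-shelf implementation such as scikit-learn's `SVC`, which handles multi-class decisions via one-vs-rest internally. The training data below is synthetic and purely illustrative; the patent's actual features and training set are not given:

```python
import numpy as np
from sklearn.svm import SVC

STATES = ["straight", "left_cut_in", "left_cut_out",
          "right_cut_in", "right_cut_out"]

# Synthetic stand-in data: 100 labelled 16-dimensional feature vectors.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))
y = rng.integers(0, 5, size=100)

clf = SVC(kernel="rbf", decision_function_shape="ovr").fit(X, y)
pred = STATES[int(clf.predict(X[:1])[0])]  # driving state for one sample
```

In practice the labels would come from annotated lane-change recordings rather than random integers.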
  • In one possible implementation, the method further includes: displaying the first vehicle, the second vehicle, and the positional relationship between them on a display screen; when the time of the collision between the second vehicle and the first vehicle is not greater than the set threshold, displaying the warning information on the display screen.
  • From this information, the driver can intuitively know the surroundings of the own vehicle and subjectively judge whether the vehicle is in danger.
  • the alarm information is displayed to remind the driver to make avoidance operations in time.
  • In one possible implementation, the method further includes: when the time for the second vehicle to collide with the first vehicle is not greater than a set threshold, displaying the second vehicle and the positional relationship between the second vehicle and the first vehicle.
  • When the own vehicle is in a safe state, the display screen may not display the own vehicle, other vehicles, and the positional relationship between them; at that time, it may instead display other content, such as a navigation, music, or video interface.
  • the present application provides an apparatus for predicting a driving state, which includes at least one processor configured to execute instructions stored in a memory, so that the terminal executes each possible embodiment as in the first aspect.
  • the present application provides a terminal device, which is characterized in that it includes at least one sensor, a memory, and a processor for executing each possible implementation of the embodiment in the first aspect.
  • the present application provides an intelligent driving car, which is used to implement each possible implementation of the first aspect.
  • the present application provides a computer-readable storage medium on which a computer program is stored, and when the computer program is executed in a computer, the computer is caused to execute each possible embodiment as in the first aspect.
  • the present application provides a computing device including a memory and a processor, wherein the memory stores executable code, and when the processor executes the executable code, the implementation is as in the first aspect Various possible implementations.
  • FIG. 1 is a schematic structural diagram of a vehicle provided by an embodiment of the application.
  • FIG. 2 is a schematic flowchart of a method for predicting a driving state provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of measuring the speed of a vehicle 200 according to an embodiment of the application.
  • FIG. 4 is a schematic diagram of measuring the speed and speed deflection angle of the vehicle 200 according to an embodiment of the application.
  • FIG. 5 is a schematic diagram of the relationship between the heading angle, the side-slip angle, and the yaw angle of the vehicle provided by an embodiment of the application.
  • FIG. 6 is a schematic diagram of the five driving states of a vehicle provided by an embodiment of the application.
  • FIG. 7 is a structural diagram of the SVM model provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of a model for predicting a collision between the vehicle 100 and the vehicle 200 under the CV model provided by an embodiment of the application.
  • FIG. 9 is a schematic diagram of a model for predicting a collision between the vehicle 100 and the vehicle 200 according to an embodiment of the application.
  • FIG. 10 is a schematic diagram of the interface displayed on the display screen of the vehicle 100 and the warning states according to an embodiment of the application.
  • FIG. 11 is a flowchart of a process for predicting the driving state and predicted route of a vehicle according to an embodiment of the application.
  • FIG. 12 is a schematic structural diagram of a driving state prediction apparatus provided by an embodiment of the application.
  • Fig. 1 is a schematic structural diagram of a vehicle provided by an embodiment of the present invention.
  • the vehicle 100 includes a sensor 101, a processor 102, a memory 103 and a bus 104.
  • the sensor 101, the processor 102, and the memory 103 in the vehicle 100 may establish a communication connection through the bus 104.
  • The sensor 101 can include one or more of a camera, an ultrasonic radar, a lidar, a millimeter-wave radar, a global navigation satellite system (GNSS), an inertial navigation system (INS), and the like, with at least one of each type of device.
  • the various components of the sensor 101 can be installed on the front of the vehicle 100, the door, the rear, the roof, the inside of the vehicle, and the like.
  • The sensor 101 detects the vehicle 100, the road the vehicle 100 is on, and the other vehicles around the vehicle 100, obtaining: the lateral speed, longitudinal speed, yaw angle, and other information of the vehicle 100; whether the road has lane lines, the lane line structure, road edges, and other information; and, for each vehicle around the vehicle 100, its lateral speed, longitudinal speed, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular speed, yaw angle, heading angle, and so on.
  • the processor 102 may be a central processing unit (CPU).
  • The processor 102 is connected to the sensor 101 and processes the data detected by the sensor 101 to determine the driving state of other vehicles around the vehicle 100, such as keeping straight, left lane change cut-in, left lane change cut-out, right lane change cut-in, or right lane change cut-out.
  • The memory 103 may include volatile memory (VM), such as random-access memory (RAM); it may also include non-volatile memory (NVM), such as read-only memory (ROM), flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 103 may also include a combination of the foregoing types of memory.
  • the memory 103 is connected to the sensor 101, and is used to store data obtained by the sensor 101 detecting the vehicle 100, the road where the vehicle 100 is located, and other vehicles around the vehicle 100.
  • the memory 103 is also connected to the processor 102 to store data processed by the processor 102, and store the program instructions corresponding to the processor 102 to implement the foregoing processing procedures, and so on.
  • the vehicle 100 executes the following process during operation.
  • FIG. 2 is a schematic flowchart of a method for predicting a driving state according to an embodiment of the application. As shown in Figure 2, the vehicle 100 performs the following steps:
  • Step S201: Obtain detection information at the current moment.
  • When the vehicle 100 receives an instruction input by the user, or the vehicle 100 is started, or the speed of the vehicle 100 reaches a set threshold, or a specific program is run, the sensor 101 is turned on to detect the vehicle 100, the road it is on, and the other vehicles around it.
  • the sensor 101 includes at least one camera and at least one ultrasonic radar, and each camera and ultrasonic radar is set on the front, door, roof, etc. of the vehicle 100 so as to detect from various positions of the vehicle 100.
  • the vehicle 100 controls the camera and the ultrasonic radar to work
  • the camera monitors the front, sides, and rear areas of the vehicle 100
  • the ultrasonic radar sends and receives ultrasonic waves to the front, both sides of the vehicle and other areas.
  • A camera generally includes an image sensor and a processing unit.
  • The processing unit performs preliminary processing on each acquired image frame, such as identifying whether other vehicles are present and the brake lights and turn signals of each vehicle; it can also identify whether there are lane lines in the image, the color of the lines, whether a line is solid or dashed, the shape of road markings (such as U-turn, left turn, no turning, zebra crossing), road edges, and other information.
  • When the processing unit recognizes that there are other vehicles nearby, it then identifies basic information about them, such as one or more of the vehicle type, the length and width of the vehicle, the lateral and longitudinal distances between that vehicle and the self-vehicle, the lateral and longitudinal speeds between them, the heading angle, the yaw angle, the centroid side-slip angle, and the instantaneous angular velocity.
  • the ultrasonic radar generally includes a sending unit, a receiving unit, and a processing unit.
  • the processing unit can calculate the distance between the target vehicle and the ultrasonic radar, the shape of the target vehicle, and the relative speed of the target vehicle relative to the vehicle 100 according to the time difference between the ultrasonic signal sent by the transmitting unit and the ultrasonic signal received by the receiving unit.
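The time-difference computation can be sketched as follows, assuming sound travels at roughly 343 m/s in air and the pulse covers the distance twice (out and back); function names are illustrative:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C

def echo_distance(round_trip_s):
    """Distance to the target from the time between sending the ultrasonic
    pulse and receiving its echo (the pulse travels out and back)."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def relative_speed(d1, d2, dt):
    """Closing speed of the target from two distance fixes dt seconds apart."""
    return (d1 - d2) / dt
```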
  • After the processor 102 receives the data collected by the sensor 101, it jointly analyzes the features identified from image and radar information at the same moment and the distances and speeds determined from the ultrasonic send/receive time differences, in order to determine: one or more of the speed and yaw rate of the vehicle 100; whether the road the vehicle 100 is on has lane lines, the lane line color, and whether the lines are solid or dashed; and, for each vehicle around the vehicle 100, one or more of its lateral speed, longitudinal speed, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular speed, yaw angle, and heading angle.
  • The processor 102 may determine the lateral speed, longitudinal speed, yaw angle, and other information of the vehicle 100 as follows: it obtains the current driving speed v 1 and yaw angle ψ of the vehicle 100 from the driving system, and then calculates the longitudinal speed v 1x and the lateral speed v 1y of the vehicle 100 from v 1 and ψ.
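A minimal sketch of this decomposition, assuming the standard projection of the speed onto the longitudinal and lateral axes via the yaw angle (v1x = v1·cos ψ, v1y = v1·sin ψ):

```python
import math

def decompose_speed(v1, yaw_rad):
    """Split the self-vehicle's speed into longitudinal and lateral components
    using its yaw angle: returns (v1x, v1y)."""
    return v1 * math.cos(yaw_rad), v1 * math.sin(yaw_rad)
```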
  • Information such as whether there is a lane line on the road where the vehicle 100 is located, the color of the road line, whether the road line is a solid line or a dashed line, and a road edge are generally obtained by the processor 102 or the processing unit in the camera through the recognition of the image, and can also be obtained by Ultrasonic radar calculates the curb information.
  • The processor 102 may determine the lateral speed, longitudinal speed, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular speed, yaw angle, heading angle, and other information of each vehicle around the vehicle 100 as follows:
  • The processor 102 can determine whether there are vehicles around the vehicle 100, and how many, by recognizing the acquired images or radar information. While the vehicle 100 is driving, if the ultrasonic radar detects an obstacle that is relatively stationary, or whose relative distance to the vehicle 100 changes little, it determines that another vehicle is nearby; the images collected by each camera and the obstacles computed by each ultrasonic radar can also be analyzed jointly to detect vehicles moving around the vehicle 100.
  • The processor 102 can recognize the acquired image: if the brake light is recognized as red (taking red as an example), the brake light is not turned on; if it is recognized as bright yellow-white, the brake light is on and the vehicle is braking and slowing down. Similarly, if a turn signal is red, it is not turned on; if it is recognized as bright yellow-white, the turn signal is on and the vehicle is turning left (or right).
  • For determining the lateral speed, longitudinal speed, global position, relative position, instantaneous angular speed, yaw angle, heading angle, and other information of another vehicle (taking one vehicle, the "target vehicle", as an example): if the detection information of the target vehicle is provided by the camera, the processor 102 calculates these quantities from adjacent image frames; if it is provided by the ultrasonic radar, the processor 102 calculates them from the times at which the ultrasonic signal is received in at least two adjacent periods.
  • If both the camera and the ultrasonic radar provide detection information, the processor 102 combines them to calculate the lateral velocity, longitudinal velocity, global position, relative position, instantaneous angular velocity, yaw angle, and other information.
  • The vehicle 100 obtains its current running speed v 1 and yaw angle ψ from the driving system. After acquiring multiple adjacent image frames, it calculates the lateral distance dy and longitudinal distance dx between the vehicle 200 and the vehicle 100 from the pixels of the last frame and the size of the target vehicle in the image, and obtains the lateral velocity v 2y and longitudinal velocity v 2x of the vehicle 200 relative to the vehicle 100 from the adjacent frames; the lateral velocity v y and longitudinal velocity v x of the vehicle 200 relative to the world coordinate system are then obtained by combining these relative velocities with the motion of the vehicle 100.
  • the distance measured by the two ultrasonic radars in the two transceiving cycles is used to determine whether the vehicle 200 is relative to the vehicle 100. Displacement.
  • the processor 102 records the ultrasonic radar A and the ultrasonic radar B to the detection point C1 The distance between X a1 and X b1 ; after t1 time, in the second state, the processor 102 records the distance X a2 and X b2 between the ultrasonic radar A and the ultrasonic radar B to the detection point C2.
  • the processor 102 combines the coordinates A (x A , y A ) of the ultrasonic radar A and the coordinates B (x B , y B ) of the ultrasonic radar B to obtain the distance X a1 and X b1 value, the coordinate C1 (x C1 , y C1 ) of the measurement point C1 is obtained, and then the distance X a2 and X b2 values are combined to obtain the coordinate C2 (x C2 , y C2 ) of the measurement point C2.
  • the processor 102 obtains the coordinates C1 (x C1 , y C1 ) of C1 and the coordinates C2 (x C2 , y C2 ) of C2, and calculates the relative displacement L and the speed clamp of the vehicle 200 relative to the vehicle 100 through the following formula Angle ⁇ .
• The formula is:
• The heading angle γ of a vehicle is the angle between the vehicle's centroid velocity and the horizontal axis in the ground coordinate system;
• The vehicle's centroid side slip angle δ is the angle between the direction of the vehicle's centroid velocity and the direction the front of the vehicle points;
• The vehicle's yaw angle β = heading angle γ − centroid side slip angle δ; the specific relationships among the heading angle γ, side slip angle δ, and yaw angle β are shown in FIG. 5.
• The memory 103 can serve as a storage unit of the sensor 101, storing in real time the data collected by the sensor 101; it can also serve as a storage unit of the processor 102, storing in real time the data the processor 102 produces from the data sent by the sensor 101.
• The processor 102 processes the data obtained by the sensor 101 only when the vehicle 100 performs the early-warning function, obtaining the detection information at the current time.
• The detection information includes one or more of the lateral speed, longitudinal speed, and yaw angle of the vehicle 100, or information such as whether the road on which the vehicle 100 is located has lane lines, the color of the road lines, and whether a road line is solid or dashed.
• Step S203: determine the driving state of the second vehicle at the current time.
• After obtaining the detection information of the vehicle 200 at the current time, the processor 102 selects the historical detection information of the vehicle 200 within the N-second period before the current time, inputs the selected current detection information together with that historical detection information into the classification model, and determines the driving state of the vehicle 200 by processing the input data.
• This application divides the driving state of a vehicle into lane change left cut-in (LCL_CUTIN), lane change left cut-out (LCL_CUTOUT), lane change right cut-in (LCR_CUTIN), lane change right cut-out (LCR_CUTOUT), and lane keep (LK).
• LCL_CUTIN means another vehicle turns left into the lane of the vehicle 100, or turns left with its driving direction intersecting that of the vehicle 100.
• LCL_CUTOUT means another vehicle turns left out of the lane of the vehicle 100, or turns left with its driving direction moving away from that of the vehicle 100.
• LCR_CUTIN means another vehicle turns right to cut into the lane of the vehicle 100, or turns right with its driving direction intersecting that of the vehicle 100.
• LCR_CUTOUT means another vehicle turns right out of the lane of the vehicle 100, or turns right with its driving direction moving away from that of the vehicle 100.
• LK means another vehicle drives ahead of the vehicle 100 in the same lane.
• The classification model may be a support vector machine (SVM) model.
• The driving state is divided into five types: left lane-change cut-in (type 1), left lane-change cut-out (type 2), right lane-change cut-in (type 3), right lane-change cut-out (type 4), and lane keeping (type 5), so there are ten pairwise SVM classification surfaces in total, (type 1, type 2), (type 1, type 3), ..., (type 4, type 5), and the type receiving the most votes across the ten surfaces is taken as the driving state of the target vehicle.
• If the current time is exactly the time at which the vehicle 100 turned on the early-warning function, the processor 102 can only continue detecting from the current time onward for a period and then input the detection information of that period into the classification model to determine the driving state of the vehicle 200; if the period between the current time and the time the vehicle 100 turned on the early-warning function is less than N seconds, the processor 102 selects a period shorter than N seconds (such as Q seconds) and inputs the detection information within the Q-second period before the current time into the classification model to determine the driving state of the vehicle 200; if that period is even shorter than Q seconds, the processor 102 selects a still shorter period, until a suitable period is selected.
• The processor 102 may also input information such as the driving speed and yaw rate obtained from the driving system of the vehicle 100 into the classification model to determine the driving state of the vehicle 100.
• The vehicle 100 detects other surrounding vehicles through the sensor 101, and then inputs the detection information at the current time and the detection information collected over a period before the current time into the classification model, thereby accurately identifying the driving states of the vehicles around the vehicle 100 at the current moment.
• After the processor 102 determines the driving state of each vehicle around the vehicle 100, it predicts, based on the driving states of the vehicle 100 and each surrounding vehicle, their predicted routes over a period in the future, and judges whether the vehicle 100 is likely to collide with other surrounding vehicles.
• The processor 102 predicts the motion trajectory of the vehicle 200 based on a kinematic model and the driving state given by the classification model. Taking the constant velocity (CV) model as an example, when the driving state of the vehicle 200 is lane keeping, the processor 102 predicts the trajectory of the vehicle 200 as:
• x2 is the displacement of the vehicle 200 in the direction parallel to the lane line or curb;
• y2 is the displacement of the vehicle 200 perpendicular to the lane line or curb;
• v2x is the lateral speed of the vehicle 200 relative to the vehicle 100;
• M is the period set in the vehicle 100 for predicting the routes of other vehicles.
• The processor 102 predicts the trajectory of the vehicle 200 as shown in FIG. 8: the position Q at M seconds in the future is calculated from the lateral speed and longitudinal speed of the vehicle 200 and the lane constraints; then, using the CV model, a series of points is generated between the position P of the vehicle 200 at the current time and the position Q at M seconds, and a cubic-equation curve fitted by the least squares method is the predicted trajectory.
• The fitted cubic equation is: f(t) = a0 + a1·t + a2·t² + a3·t³
• f(t) is the distance of each point in the series from the center of the vehicle 100;
• a0 to a3 are the coefficients of the cubic equation of the future trajectory;
• t is the time of each point in the series, with t < M.
• The processor 102 calculates the movement trajectory of the vehicle 100, that is,
• x1 is the displacement of the vehicle 100 in the direction parallel to the lane line or curb;
• y1 is the displacement of the vehicle 100 perpendicular to the lane line or curb;
• v1 is the speed of the vehicle 100;
• M is the period set in the vehicle 100 for predicting the routes of other vehicles, generally 2 s or another period.
• The trajectory of the vehicle 200 relative to the world coordinate system within M seconds is:
• x2 is the displacement of the vehicle 200 in the direction parallel to the lane line or curb;
• y2 is the displacement of the vehicle 200 perpendicular to the lane line or curb;
• v2 is the relative speed of the vehicle 200 with respect to the vehicle 100;
• α is the speed angle of the vehicle 200 relative to the vehicle 100;
• M is the period set in the vehicle 100 for predicting the routes of other vehicles, generally 2 s or another period.
• The processor 102 judges, based on the motion trajectory of the vehicle 100, the motion trajectory of the vehicle 200, and the relative displacement L between them, whether the vehicle 200 will collide with the vehicle 100 within M seconds, that is:
• L is the relative displacement between the vehicle 100 and the vehicle 200;
• the angle symbol in the formula is the angle between the positions of the vehicle 200 relative to the vehicle 100;
• α is the speed angle of the vehicle 200 relative to the vehicle 100;
• ΔL is the no-collision distance between the two vehicles, generally 3.75 m.
• If the collision time T between the vehicle 100 and the vehicle 200 is greater than M seconds, the probability of a collision between them is relatively low; if the collision time T is less than M seconds, the probability of a collision is relatively high.
• When the processor 102 determines that the collision probability is relatively high, it generates a warning message and reminds the driver by voice broadcast, on-screen display, seat vibration, and the like.
• When the vehicle 100 detects other vehicles nearby, its display shows the ego vehicle, the other vehicles, and the positional relationships between them, and may even show the routes the ego vehicle and the other vehicles will travel in the future. If another vehicle would collide with the ego vehicle within the safe distance, the display shows the warning message "Attention! Attention! A vehicle is cutting in on the left, please give way!" as shown in FIG. 10(b), to remind the driver to take evasive action in time.
• If no collision would occur within the safe distance, the display does not show the interface of FIG. 10(a); the interface of FIG. 10(b) is shown only when another vehicle would collide with the ego vehicle within the safe distance. This allows the display to show other interfaces such as navigation, music, and video when there is no danger.
• The above example takes the driving state of the vehicle 200 being a left-turn cut-in as an example; whether a vehicle in an adjacent lane cuts into the ego lane or a vehicle in the same lane cuts out into an adjacent lane, the principle of judging whether a collision occurs is basically the same.
• The processor 102 sends data such as the driving states, trajectories, and collision times of the vehicle 100 and the other vehicles around it to an advanced driving assistance system (ADAS), so that the ADAS can take measures in advance, before other vehicles have fully changed lanes.
• Depending on the functions implemented, the ADAS can be divided into an autonomous emergency braking (AEB) module, an automatic emergency steering (AES) module, an adaptive cruise control (ACC) module, and so on.
• After receiving from the processor 102 that a collision may occur within M seconds, the AEB module can give an early warning or decelerate; that is, if a collision between the vehicle 100 and any vehicle around it is imminent,
• the mode starts automatically and issues a warning, automatically decelerates, or actively brakes, so that there is enough time to change the driving state of the vehicle 100 and avoid a collision with other vehicles.
• The AES module activates the automatic steering mode,
• analyzes the distances to the other vehicles around the vehicle 100, and controls the vehicle 100 to steer into a lane that poses no danger, ensuring there is enough time to change the driving state of the vehicle 100 and avoid a collision with other vehicles.
• After the ACC module receives the data sent by the processor 102, if a vehicle cuts into the ego lane, it changes the target of the "following mode" and "following distance" to the cutting-in vehicle; if a vehicle cuts out of the ego lane, it switches away from that target and subsequently looks for another vehicle in the ego lane as the target of the "following mode" and "following distance".
• FIG. 11 is a flowchart of a process for predicting the driving state and route of a vehicle according to an embodiment of the application. As shown in FIG. 11, in the process of training to obtain model 1:
• After obtaining some data sent by the ego-vehicle system, the processor 102 extracts it to obtain information such as the ego vehicle's speed, position, and heading angle; after obtaining some data reported by the sensor 101, it extracts it to obtain information such as whether lane lines exist, lane line color, lane line type, and curbs;
• After receiving some data about other surrounding vehicles through the sensor 101, the processor 102 extracts it to obtain information such as the speed, position, and heading angle of the other vehicles;
• With the currently obtained ego-vehicle information, lane line information, and target vehicle information, the processor 102 trains a classifier using the SVM model, cluster analysis, Bayesian classification, or the like,
• obtaining a model that can distinguish the driving state of a target vehicle from input target-vehicle information.
• Compared with the process of training to obtain model 1, in step (3) of the process of training to obtain model 2 the processor 102 additionally includes the corresponding ego-vehicle information, lane line information, and target vehicle information
• over a historical period: it concatenates the current ego-vehicle information, lane line information, and target vehicle information with the historical ego-vehicle information, lane line information, and target vehicle information, then performs classifier training to obtain model 2.
• For the specific process, see the process of training to obtain model 1.
• In the route prediction process, the processor 102:
• obtains in real time some data about the target vehicle sent by the sensor 101;
• obtains the ego-vehicle information and lane line information in the manner of step (1) of the process of training to obtain model 1;
• obtains the target vehicle information in the manner of step (2) of the process of training to obtain model 1;
• judges whether historical information about the target vehicle exists and how long it spans;
• if there is none, or its duration is insufficient, inputs the current ego-vehicle information, lane line information, and target vehicle information into model 1 and predicts the driving state and predicted route of the target vehicle;
• if historical information of sufficient duration exists, concatenates the current ego-vehicle information, lane line information, and target vehicle information with the historical ego-vehicle information, lane line information, and target vehicle information, then inputs them into model 2 and predicts the driving state and predicted route of the target vehicle.
• FIG. 12 is a schematic structural diagram of a lane-change prediction apparatus for intelligent driving according to an embodiment of the application.
• By executed function, the apparatus 1200 includes: a transceiver unit 1201, an identification unit 1202, a trajectory prediction unit 1203, and a risk assessment unit 1204.
• The transceiver unit 1201 is configured to receive detection information sent by the sensor 101.
• The detection information includes one or more of the lateral speed, longitudinal speed, and yaw angle of the ego vehicle; or one or more of whether the road on which other vehicles around the ego vehicle are located has lane lines, the color of the road lines, and whether a road line is
• solid or dashed; or one or more of the lateral speed, longitudinal speed, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw
• angle, and heading angle of each vehicle around the ego vehicle.
• The identification unit 1202 is configured to determine the driving state of each vehicle around the ego vehicle.
• The driving state includes LCL_CUTIN, LCL_CUTOUT, LCR_CUTIN, LCR_CUTOUT, and LK.
• The trajectory prediction unit 1203 is configured to calculate the predicted routes of the ego vehicle and each vehicle around it over a period in the future, based on their driving states.
• The risk assessment unit 1204 is configured to calculate the time of collision between the ego vehicle and the vehicles around it based on their predicted routes over a period in the future; if it determines that the collision probability is relatively high, it generates a warning message and reminds the driver by voice broadcast, on-screen display, seat vibration, etc.
• The transceiver unit 1201 is also configured to send data such as the driving states, trajectories, and collision probabilities of the vehicle 100 and the other vehicles around it to the ADAS, so that the modules in the ADAS can take corresponding measures in advance, before other vehicles have fully changed lanes, to improve safety and comfort.
• The present invention provides a computer-readable storage medium on which a computer program is stored;
• when the computer program is executed in a computer, the computer is caused to perform any of the above methods.
• The present invention provides a computing device including a memory and a processor, where the memory stores executable code; when the processor executes the executable code, any of the above methods is implemented.


Abstract

This application provides a driving state prediction method, apparatus, and terminal device, relating to the field of intelligent driving. The method includes: obtaining detection information at the current time and detection information within N seconds before the current time, the detection information being obtained by detecting, via sensors, a first vehicle and at least one vehicle around the first vehicle; and determining the driving state of a second vehicle at the current time, the driving state being obtained by inputting the detection information at the current time and the detection information within N seconds before the current time into a classification model, where the driving state is one of lane keeping, left lane-change cut-in, left lane-change cut-out, right lane-change cut-in, or right lane-change cut-out. By detecting the ego vehicle and other surrounding vehicles and then inputting the detection information at the current time together with the detection information collected over a period before the current time into a classification model, this application accurately identifies the lane-change maneuvers of the vehicles around the ego vehicle at the current time.

Description

A driving state prediction method, apparatus, and terminal device
This application claims priority to Chinese patent application No. 202010521167.8, filed with the China National Intellectual Property Administration on June 10, 2020 and entitled "A driving state prediction method, apparatus, and terminal device", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of intelligent driving technologies, and in particular to a driving state prediction method, apparatus, and terminal device.
Background
Lane changing is a common vehicle maneuver and one of the main causes of vehicle collisions. Identifying the lane-change behavior of surrounding vehicles in advance is therefore of great significance for improving the safety of the ego vehicle.
In the prior art, while a vehicle is driving on a road, it continuously detects the surrounding environment; when it detects surrounding vehicles, it judges their driving states from the information currently detected about them. However, if the driver of another vehicle steers poorly so that the vehicle follows an "S"-shaped path, mistakenly activates a turn signal, or the ego vehicle itself changes lanes, judging only from the information detected at the current moment, as in the prior art, will cause the ego vehicle to misjudge the driving states of other vehicles.
In addition, in the prior art the driving state of a vehicle is only divided into lane keeping and lane changing, without distinguishing how the vehicle changes lanes. When the ego vehicle finds that another vehicle's lane change endangers it, if it cannot determine whether the other vehicle is cutting in from the left or from the right, it cannot steer to avoid the danger, and a collision with the other vehicle may occur.
Summary
To solve the above problems, embodiments of this application provide a driving state prediction method, apparatus, and terminal device.
In a first aspect, this application provides a driving state prediction method, including: obtaining detection information at the current time and detection information within N seconds before the current time, where the detection information is obtained by detecting, via sensors, a first vehicle and at least one vehicle around the first vehicle, the at least one vehicle includes a second vehicle, and N is a positive number greater than zero and less than ten; and determining the driving state of the second vehicle at the current time, where the driving state is obtained by inputting the detection information at the current time and the detection information within N seconds before the current time into a classification model, and the driving state is one of lane keeping, left lane-change cut-in, left lane-change cut-out, right lane-change cut-in, or right lane-change cut-out.
In this implementation, the first vehicle detects other surrounding vehicles through sensors, and then inputs the detection information at the current time and the detection information collected over a period before the current time into the classification model, thereby accurately identifying the driving states of the surrounding vehicles at the current time. Furthermore, based on the classification model, the driving state of a vehicle is divided into five classes — lane keeping, left lane-change cut-in, left lane-change cut-out, right lane-change cut-in, and right lane-change cut-out — so the vehicle's driving state is predicted more accurately, allowing the first vehicle to adopt an accurate corresponding driving strategy or prompt based on the predicted driving state.
In one implementation, the detection information includes: the speed and yaw angle of the first vehicle, and one or more of the lateral speed, longitudinal speed or speed, global position and/or relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw angle, and heading angle of the second vehicle.
In this implementation, the first vehicle is not limited to obtaining lane line information; it can also predict the driving states of other vehicles from ego-vehicle information and other-vehicle information, so that the driving states of other vehicles can still be accurately predicted even in the absence of lane lines.
In one implementation, when the period between the time the sensors start detecting and the current time is less than N seconds, obtaining the detection information within N seconds before the current time includes: obtaining detection information within Q seconds before the current time, where Q is a positive number greater than zero and less than N.
In this implementation, if the historical ego-vehicle information and other information within the specified period are insufficient, a shorter period of historical ego-vehicle information and other information can be selected, ensuring the classification model accurately predicts the driving states of other vehicles.
In one implementation, when the current time is the time at which the sensors start detecting, the method further includes: determining the driving state of the second vehicle from the detection information at the current time.
In this implementation, if there is no historical ego-vehicle information and other information at all, the ego-vehicle information and other information at the current time can be selected, ensuring the classification model can still predict the driving states of other vehicles.
In one implementation, the method further includes: calculating a predicted route of the second vehicle within M seconds after the current time, where the predicted route is calculated from the driving state of the second vehicle, and M seconds is the period set in the first vehicle for predicting the routes of other vehicles, M being a positive number greater than zero.
In one implementation, the method further includes: calculating, from the predicted route of the second vehicle, the time at which the second vehicle would collide with the first vehicle; and generating alarm information when the collision time is not greater than a set threshold, the alarm information being used to remind the user that the first vehicle is in a dangerous state.
In one implementation, the method further includes: sending the predicted route of the second vehicle, or the predicted route and the collision time, to an advanced driving assistance system (ADAS), and the ADAS adjusts the following mode and following distance of the first vehicle based on the predicted route of the second vehicle, or the predicted route and the collision time.
In this implementation, if the collision time between the ego vehicle and any other vehicle is less than the safe time in seconds, the ADAS automatically activates its mode and issues a warning, automatically decelerates, or actively brakes, so that there is enough time to change the driving state of the ego vehicle and avoid a collision with other vehicles.
In one implementation, the method further includes: sending the predicted route of the second vehicle, or the predicted route and the collision time, to the ADAS, and the ADAS generates the warning information or controls the first vehicle to brake based on the predicted route of the second vehicle, or the predicted route and the collision time.
In this implementation, if the collision time between the ego vehicle and any other vehicle is less than the safe time in seconds, the ADAS activates the automatic steering mode, analyzes the distances to the other vehicles around the vehicle 100, and controls the ego vehicle to steer into a lane that poses no danger, ensuring there is enough time to change the driving state of the ego vehicle and avoid a collision with other vehicles.
In one implementation, the method further includes: sending the predicted route of the second vehicle, or the predicted route and the collision time, to the ADAS, and the ADAS controls the first vehicle to steer based on the predicted route of the second vehicle, or the predicted route and the collision time.
In this implementation, if a vehicle cuts into the ego lane, the ADAS changes the target of the "following mode" and "following distance" to the cutting-in vehicle; if a vehicle cuts out of the ego lane, the ADAS switches away from that target and subsequently looks for another vehicle in the ego lane as the target of the "following mode" and "following distance".
In one implementation, the classification model is a five-class model, such as a support vector machine (SVM) model or a hidden Markov model (HMM).
In one implementation, the method further includes: displaying, on a display screen, the first vehicle, the second vehicle, and the positional relationship between the second vehicle and the first vehicle; and displaying the alarm information on the display screen when the time at which the second vehicle would collide with the first vehicle is not greater than the set threshold.
In this implementation, the ego vehicle, other vehicles, and the positional relationships between them are shown on the display screen, so that the driver can intuitively understand the situation around the ego vehicle from this information and subjectively judge whether the ego vehicle is in danger. The alarm information is displayed at the same time to remind the driver to take evasive action in time.
In one implementation, the method further includes: displaying, on the display screen, the second vehicle and the positional relationship between the second vehicle and the first vehicle when the time at which the second vehicle would collide with the first vehicle is not greater than the set threshold.
In this implementation, when the ego vehicle is in a safe state, the display screen need not show the ego vehicle, other vehicles, or the positional relationships between them; it can instead show other interfaces such as navigation, music, or video.
In a second aspect, this application provides a driving state prediction apparatus, including at least one processor, where the processor is configured to execute instructions stored in a memory so that a terminal performs any possible implementation of the first aspect.
In a third aspect, this application provides a terminal device, including at least one sensor, a memory, and a processor configured to perform any possible implementation of the first aspect.
In a fourth aspect, this application provides an intelligent driving vehicle configured to perform any possible implementation of the first aspect.
In a fifth aspect, this application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed in a computer, the computer is caused to perform any possible implementation of the first aspect.
In a sixth aspect, this application provides a computing device including a memory and a processor, where the memory stores executable code, and when the processor executes the executable code, any possible implementation of the first aspect is implemented.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of a vehicle according to an embodiment of this application;
FIG. 2 is a schematic flowchart of a driving state prediction method according to an embodiment of this application;
FIG. 3 is a schematic diagram of measuring the speed of a vehicle 200 according to an embodiment of this application;
FIG. 4 is a schematic diagram of measuring the speed and speed deviation angle of the vehicle 200 according to an embodiment of this application;
FIG. 5 is a schematic diagram of the relationships among the heading angle, side slip angle, and yaw angle of a vehicle according to an embodiment of this application;
FIG. 6 is a schematic diagram of the five driving states of a vehicle according to an embodiment of this application;
FIG. 7 is a structural diagram of an SVM model according to an embodiment of this application;
FIG. 8 is a schematic diagram of a model for predicting a collision between the vehicle 100 and the vehicle 200 under the CV model according to an embodiment of this application;
FIG. 9 is a schematic diagram of a model for predicting a collision between the vehicle 100 and the vehicle 200 according to an embodiment of this application;
FIG. 10 is a schematic diagram of the interface displayed on the display screen of the vehicle 100 and of the warning state according to an embodiment of this application;
FIG. 11 is a flowchart of a process for predicting the driving state and predicted route of a vehicle according to an embodiment of this application;
FIG. 12 is a schematic structural diagram of a driving state prediction apparatus according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings.
FIG. 1 is a schematic structural diagram of a vehicle according to an embodiment of the present invention. As shown in FIG. 1, the vehicle 100 includes a sensor 101, a processor 102, a memory 103, and a bus 104. The sensor 101, the processor 102, and the memory 103 in the vehicle 100 can establish communication connections through the bus 104.
The sensor 101 may be one or more of devices such as a camera, ultrasonic radar, lidar, millimeter-wave radar, a global navigation satellite system (GNSS), and an inertial navigation system (INS), with at least one of each kind of device. The devices in the sensor 101 may be mounted at positions such as the front, doors, rear, roof, and body of the vehicle 100. By detecting the vehicle 100, the road on which the vehicle 100 is located, and the other vehicles around the vehicle 100, the sensor 101 obtains information such as the lateral speed, longitudinal speed, and yaw angle of the vehicle 100; whether the road on which the vehicle 100 is located has lane lines, the lane line structure, and curbs; and the lateral speed, longitudinal speed, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw angle, and heading angle of each vehicle around the vehicle 100.
The processor 102 may be a central processing unit (CPU). The processor 102 is connected to the sensor 101 and is configured to process the data detected by the sensor 101 and determine the driving states of the other vehicles around the vehicle 100, such as lane keeping, left lane-change cut-in, left lane-change cut-out, right lane-change cut-in, or right lane-change cut-out.
The memory 103 may include volatile memory (VM), such as random-access memory (RAM); the memory 103 may also include non-volatile memory (NVM), such as read-only memory (ROM), flash memory, a hard disk drive (HDD), or a solid state drive (SSD); the memory 103 may also include a combination of the above kinds of memory. The memory 103 is connected to the sensor 101 and is configured to store the data obtained by the sensor 101 from detecting the vehicle 100, the road on which the vehicle 100 is located, and the other vehicles around the vehicle 100. In addition, the memory 103 is also connected to the processor 102 and is configured to store the data processed by the processor 102, as well as the program instructions with which the processor 102 implements the above processing.
Based on the architecture of the vehicle 100 shown in FIG. 1, the vehicle 100 performs the following process during operation.
FIG. 2 is a schematic flowchart of a driving state prediction method according to an embodiment of this application. As shown in FIG. 2, the vehicle 100 performs the following steps:
Step S201: obtain detection information at the current time.
When the vehicle 100 is triggered to perform the early-warning function — by receiving a user instruction, by starting up, by its speed reaching a set threshold, by running a specific program, or the like — it turns on the sensor 101 to detect the vehicle 100, the road on which it is located, and the other vehicles around it.
For example, the sensor 101 includes at least one camera and at least one ultrasonic radar, and each camera and ultrasonic radar is mounted at positions such as the front, doors, and roof of the vehicle 100 so as to detect in all directions around the vehicle 100. When the vehicle 100 controls the cameras and ultrasonic radars to operate, the cameras monitor areas such as the front, both sides, and rear of the vehicle 100, and the ultrasonic radars transmit and receive ultrasonic waves toward areas such as the front and both sides of the vehicle.
Optionally, a camera generally includes a camera sensor and a processing unit. After the camera sensor captures images of an area around the vehicle 100, the processing unit performs preliminary processing on each captured frame, for example recognizing whether there are other vehicles in the image, each vehicle's brake lights, and each vehicle's turn signals; it can also recognize whether there are road lines in the image, the color of the road lines, whether a road line is solid or dashed, the shapes of the road markings (such as U-turn, left-turn, no-turn, or zebra-crossing markings), curbs, and so on.
When the processing unit recognizes other vehicles nearby, it further recognizes some basic information about them, such as one or more of the vehicle type, vehicle length and width, lateral distance to the ego vehicle, longitudinal distance to the ego vehicle, lateral speed relative to the ego vehicle, longitudinal speed relative to the ego vehicle, heading angle, yaw angle, centroid side slip angle, and instantaneous angular velocity.
Optionally, an ultrasonic radar generally includes a transmitting unit, a receiving unit, and a processing unit. From the time difference between the transmitting unit sending an ultrasonic signal and the receiving unit receiving it, the processing unit can calculate information such as the distance between the target vehicle and the ultrasonic radar, the shape of the target vehicle, the relative moving speed of the target vehicle with respect to the vehicle 100, and the lateral and longitudinal speeds of the target vehicle relative to the vehicle 100.
After receiving the data collected by the sensor 101, the processor 102 comprehensively analyzes the features recognized from the images/radar information at the same moment together with the distances and speeds determined from the time differences of transmitted and received ultrasonic signals, and determines one or more of the speed and yaw rate of the vehicle 100; one or more of whether the road on which the vehicle 100 is located has lane lines, the color of the road lines, and whether a road line is solid or dashed; and one or more of the lateral speed, longitudinal speed, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw angle, and heading angle of each vehicle around the vehicle 100.
For example, the processor 102 may determine the lateral speed, longitudinal speed, yaw angle, and other information of the vehicle 100 as follows: the processor 102 obtains the current driving speed v1 of the vehicle 100 and the yaw angle θ of the vehicle 100 from the driving system of the vehicle 100, and then calculates the longitudinal speed v1x and the lateral speed v1y of the vehicle 100 from the driving speed v1 and the yaw angle θ.
Information such as whether the road on which the vehicle 100 is located has lane lines, the color of the road lines, whether a road line is solid or dashed, and curbs is generally obtained by the processor 102 or the processing unit in the camera through image recognition; curb information can also be calculated by ultrasonic radar.
The processor 102 may determine the lateral speed, longitudinal speed, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw angle, heading angle, and other information of each vehicle around the vehicle 100 as follows:
(1) Whether there are vehicles around the vehicle: the processor 102 can determine whether there are vehicles around the vehicle 100, and how many, by recognizing the captured images or radar information. While the vehicle 100 is driving, if the ultrasonic radar detects an obstacle that is relatively stationary with respect to the vehicle 100 or whose relative distance changes little, the processor determines that there are other vehicles around the vehicle 100; the images captured by the cameras and the obstacles computed by the ultrasonic radars can also be analyzed together to detect vehicles moving around the vehicle 100.
(2) Whether the brake lights, left turn signal, or right turn signal are on: the processor 102 can recognize the captured images; if it recognizes the brake light color as red (taking red as an example), the brake light is not on; if it recognizes the brake light as highlighted and yellow-white, the brake light is on and the vehicle is braking and decelerating. Similarly, if the turn signal color is red, the turn signal is not on; if the turn signal is recognized as highlighted and yellow-white, the turn signal is on and the vehicle is turning left (or right).
(3) Determining the lateral speed, longitudinal speed, global position, relative position, instantaneous angular velocity, yaw angle, heading angle, and other information of other vehicles (taking one vehicle, called the "target vehicle", as an example): if the detection information of the target vehicle is provided by a camera, the processor 102 calculates the lateral speed, longitudinal speed, global position, relative position, instantaneous angular velocity, yaw angle, and other information from the changes in that vehicle's position across multiple adjacent frames captured by the camera; if the detection information of the target vehicle is provided by an ultrasonic radar, the processor 102 calculates that information from the reception times of the ultrasonic signals in at least two adjacent cycles obtained by the ultrasonic radar; if the detection information of the target vehicle is provided by both a camera and an ultrasonic radar, the processor 102 comprehensively calculates that information from the data provided by both.
For example, in the case where the camera provides images, as shown in FIG. 3, assume the vehicle 100 is driving in a straight line at speed v1. The vehicle 100 obtains the current driving speed v1 and the yaw angle θ of the vehicle 100 from the driving system. After obtaining multiple adjacent frames, it calculates the lateral distance dy and longitudinal distance dx between the vehicle 200 and the vehicle 100 from the pixels of the last frame and the size of the target vehicle in the image, and obtains the lateral speed v2y and longitudinal speed v2x of the vehicle 200 relative to the vehicle 100 from the adjacent frames; the lateral speed vy and longitudinal speed vx of the vehicle 200 relative to the forward direction of the world coordinate system are then obtained from the following formulas:
Figure PCTCN2021087578-appb-000001
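The velocity formula above survives only as an image in the source. As a hedged sketch of the kind of composition it describes — assuming the world-frame velocity of the vehicle 200 is the ego velocity (ego speed v1 rotated by the yaw angle θ) plus the camera-measured relative velocity, with sign conventions that may differ from the patent's actual formula:

```python
import math

def world_velocity(v1, theta, v2x_rel, v2y_rel):
    """Compose the world-frame velocity of a target vehicle.

    v1      : ego speed (m/s), assumed along the ego heading
    theta   : ego yaw angle (rad) relative to the world longitudinal axis
    v2x_rel : target longitudinal speed relative to the ego (m/s)
    v2y_rel : target lateral speed relative to the ego (m/s)

    Returns (vx, vy), the world-frame longitudinal and lateral speeds.
    Sketch only: the patent's exact formula is an image and its sign
    conventions may differ.
    """
    ego_vx = v1 * math.cos(theta)
    ego_vy = v1 * math.sin(theta)
    return ego_vx + v2x_rel, ego_vy + v2y_rel

# Ego driving straight (theta = 0) at 20 m/s; target closing by 2 m/s
# longitudinally and drifting 1.5 m/s laterally.
vx, vy = world_velocity(20.0, 0.0, -2.0, 1.5)
print(vx, vy)  # 18.0 1.5
```

With θ = 0 the composition degenerates to a simple sum, which matches the straight-line ego assumption made for FIG. 3.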
For example, in the case where ultrasonic radars transmit and receive ultrasonic signals, as shown in FIG. 4(a), the distances measured by two ultrasonic radars over two transmit-receive cycles are used to determine whether there is relative displacement between the vehicle 200 and the vehicle 100. Assume that ultrasonic radar A, ultrasonic radar B, and the detection point C of the vehicle 200 lie in the same plane (i.e., at the same height above the ground). In the first state, the processor 102 records the distances Xa1 and Xb1 from ultrasonic radar A and ultrasonic radar B to the detection point C1; after time t1, in the second state, the processor 102 records the distances Xa2 and Xb2 from ultrasonic radar A and ultrasonic radar B to the detection point C2.
As shown in FIG. 4(b), from the set coordinates A(xA, yA) of ultrasonic radar A and B(xB, yB) of ultrasonic radar B, the processor 102 combines the distances Xa1 and Xb1 to obtain the coordinates C1(xC1, yC1) of measurement point C1, and then combines the distances Xa2 and Xb2 to obtain the coordinates C2(xC2, yC2) of measurement point C2. Then, from the coordinates C1(xC1, yC1) and C2(xC2, yC2), the processor 102 calculates the relative displacement L of the vehicle 200 with respect to the vehicle 100 and the speed angle α through the following formula:
Figure PCTCN2021087578-appb-000002
After obtaining the relative displacement L, the processor 102 calculates the relative speed of the vehicle 200 with respect to the vehicle 100 as v2 = L/t1 according to the speed formula; then, from the longitudinal speed v1x and lateral speed v2y of the vehicle 100, it calculates the longitudinal speed vx = v1x ± v2·cosα and the lateral speed vy = v2y ± v2·sinα of the vehicle 200 relative to the world coordinate system; and then, from the yaw angle θ of the vehicle 100 and the speed angle α, it calculates the heading angle γ of the vehicle 200 relative to the world coordinate system.
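The localization step described above — locating C1 and C2 from the two radar ranges, then taking the displacement L and the speed v2 = L/t1 — can be sketched as a standard two-circle intersection under the stated coplanarity assumption. The exact formula is an image in the source, so this is an illustrative reconstruction, not the patent's literal equation:

```python
import math

def locate_point(ax, ay, bx, by, ra, rb, front=True):
    """Intersect the two range circles of radars A and B to locate the
    reflection point C; returns (x, y). Sketch only: assumes the radars
    and C are coplanar, as in FIG. 4, and picks the solution in front
    of the bumper when `front` is True."""
    d = math.hypot(bx - ax, by - ay)
    a = (ra**2 - rb**2 + d**2) / (2 * d)      # distance from A along AB
    h = math.sqrt(max(ra**2 - a**2, 0.0))     # offset perpendicular to AB
    mx = ax + a * (bx - ax) / d
    my = ay + a * (by - ay) / d
    ox, oy = -(by - ay) / d, (bx - ax) / d    # unit normal to AB
    c_plus = (mx + h * ox, my + h * oy)
    c_minus = (mx - h * ox, my - h * oy)
    return c_plus if (c_plus[1] > c_minus[1]) == front else c_minus

def relative_motion(c1, c2, t1):
    """Displacement L between successive detections and speed v2 = L/t1."""
    L = math.hypot(c2[0] - c1[0], c2[1] - c1[1])
    return L, L / t1

# Radars 1 m apart on the bumper; the target detected twice, 0.5 s apart.
p1 = locate_point(0.0, 0.0, 1.0, 0.0, 5.0, math.sqrt(20.0))            # -> (3, 4)
p2 = locate_point(0.0, 0.0, 1.0, 0.0, math.sqrt(13.0), math.sqrt(10.0))  # -> (2, 3)
L, v2 = relative_motion(p1, p2, 0.5)
```

The speed angle α would then follow from `math.atan2` on the displacement components; a production implementation would also reject geometrically impossible range pairs.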
Here, the heading angle γ of a vehicle is the angle between the vehicle's centroid velocity and the horizontal axis in the ground coordinate system; the centroid side slip angle δ is the angle between the direction of the vehicle's centroid velocity and the direction the front of the vehicle points; and the yaw angle β = heading angle γ − centroid side slip angle δ. The specific relationships among the heading angle γ, side slip angle δ, and yaw angle β are shown in FIG. 5.
The memory 103 can serve as a storage unit of the sensor 101, storing in real time the data collected by the sensor 101; it can also serve as a storage unit of the processor 102, storing in real time the data the processor 102 produces from processing the data sent by the sensor 101.
When the memory 103 serves as the storage unit of the sensor 101, the processor 102 processes the data obtained by the sensor 101 only when the vehicle 100 performs the early-warning function, obtaining the detection information at the current time. The detection information includes one or more of the lateral speed, longitudinal speed, and yaw angle of the vehicle 100; or one or more of whether the road on which the vehicle 100 is located has lane lines, the color of the road lines, and whether a road line is solid or dashed; or one or more of the lateral speed, longitudinal speed, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw angle, and heading angle of each vehicle around the vehicle 100.
Step S203: determine the driving state of the second vehicle at the current time.
After obtaining the detection information of the vehicle 200 at the current time, the processor 102 selects the historical detection information of the vehicle 200 within the N-second period before the current time, inputs the selected detection information of the vehicle 200 at the current time together with the historical detection information within the N-second period before the current time into the classification model, and determines the driving state of the vehicle 200 by processing the input data.
In this application, the driving state of a vehicle is divided into lane change left cut-in (LCL_CUTIN), lane change left cut-out (LCL_CUTOUT), lane change right cut-in (LCR_CUTIN), lane change right cut-out (LCR_CUTOUT), and lane keep (LK).
As shown in FIG. 6, LCL_CUTIN means another vehicle turns left into the lane of the vehicle 100, or turns left with its driving direction intersecting that of the vehicle 100; LCL_CUTOUT means another vehicle turns left out of the lane of the vehicle 100, or turns left with its driving direction moving away from that of the vehicle 100; LCR_CUTIN means another vehicle turns right to cut into the lane of the vehicle 100, or turns right with its driving direction intersecting that of the vehicle 100; LCR_CUTOUT means another vehicle turns right out of the lane of the vehicle 100, or turns right with its driving direction moving away from that of the vehicle 100; LK means another vehicle drives ahead of the vehicle 100 in the same lane.
For example, the classification model may be a support vector machine (SVM) model. As shown in FIG. 7, since the driving state is divided into the five types left lane-change cut-in (type 1), left lane-change cut-out (type 2), right lane-change cut-in (type 3), right lane-change cut-out (type 4), and lane keeping (type 5), there are ten SVM classification surfaces in total: (type 1, type 2), (type 1, type 3), ..., (type 4, type 5). The training data X is input into the ten SVM classification surfaces. If X belongs to type 1 on (type 1, type 2), to type 1 on (type 1, type 3), to type 1 on (type 1, type 4), and to type 1 on (type 1, type 5), then the number of votes for X belonging to type 1 is 4; if the vote count of every other type is not greater than 4, then X belongs to type 1. A five-class SVM model is trained in this way. In use, the ego-vehicle information, lane line information, and target vehicle information obtained at the current time, together with those obtained within the past N seconds, are input into the SVM model, and the type receiving the most votes across the 10 SVM classification surfaces is the driving state of the target vehicle at that time.
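The ten-surface voting scheme described above can be sketched in a few lines. The pairwise classifiers below are hypothetical stand-ins (each simply returns a fixed winner), not trained SVM surfaces; only the voting logic is the point:

```python
from itertools import combinations
from collections import Counter

STATES = ["LCL_CUTIN", "LCL_CUTOUT", "LCR_CUTIN", "LCR_CUTOUT", "LK"]

def ovo_predict(x, pairwise_classifiers):
    """One-vs-one voting over the 10 pairwise classification surfaces.

    `pairwise_classifiers` maps each unordered pair of states to a
    function f(x) returning the winning state of that pair. The state
    with the most votes is the predicted driving state, as in FIG. 7.
    """
    votes = Counter()
    for pair in combinations(STATES, 2):
        votes[pairwise_classifiers[pair](x)] += 1
    return votes.most_common(1)[0][0]

# Hypothetical stand-in classifiers: every surface involving LK votes
# for LK, mimicking a feature vector of a lane-keeping vehicle.
clfs = {pair: (lambda p: (lambda x: "LK" if "LK" in p else p[0]))(pair)
        for pair in combinations(STATES, 2)}
print(ovo_predict([0.1, 0.0], clfs))  # LK wins with 4 votes
```

A real system would replace the stand-ins with trained binary SVMs (scikit-learn's `SVC` uses this same one-vs-one scheme internally), but the vote-counting shown here is exactly the ten-surface mechanism of the text.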
In addition, if the current time is exactly the time at which the vehicle 100 turned on the early-warning function, the processor 102 can only continue detecting from the current time onward for a period, and then input the detection information of that period into the classification model to determine the driving state of the vehicle 200. If the period between the current time and the time the vehicle 100 turned on the early-warning function is less than N seconds, the processor 102 selects a period shorter than N seconds (such as Q seconds) and inputs the detection information within the Q-second period before the current time into the classification model to determine the driving state of the vehicle 200; if that period is even shorter than Q seconds, the processor 102 selects a still shorter period, until a suitable period is selected.
As for the driving state of the vehicle 100 itself, the processor 102 may also input information such as the driving speed and yaw rate obtained from the driving system of the vehicle 100 into the classification model to determine the driving state of the vehicle 100.
In this embodiment of the application, the vehicle 100 detects other surrounding vehicles through the sensor 101, and then inputs the detection information at the current time and the detection information collected over a period before the current time into the classification model, thereby accurately identifying the driving states of the vehicles around the vehicle 100 at the current time.
Optionally, after determining the driving state of each vehicle around the vehicle 100, the processor 102 predicts, based on the driving states of the vehicle 100 and each vehicle around it, the predicted routes of the vehicle 100 and each surrounding vehicle over a period in the future, and judges whether the vehicle 100 is likely to collide with the other surrounding vehicles.
In one possible example, the processor 102 predicts the motion trajectory of the vehicle 200 based on a kinematic model and the driving state given by the classification model. Taking the constant velocity (CV) model as an example, when the driving state of the vehicle 200 is lane keeping, the processor 102 predicts the trajectory of the vehicle 200 as:
Figure PCTCN2021087578-appb-000003
where x2 is the displacement of the vehicle 200 in the direction parallel to the lane line or curb, y2 is the displacement of the vehicle 200 perpendicular to the lane line or curb, v2x is the lateral speed of the vehicle 200 relative to the vehicle 100, and M is the period set in the vehicle 100 for predicting the routes of other vehicles, generally 2 s or another period.
When the driving state of the vehicle 200 is left lane-change cut-in, the processor 102 predicts the trajectory of the vehicle 200 as follows: as shown in FIG. 8, the position Q at M seconds in the future is calculated from the lateral speed and longitudinal speed of the vehicle 200 and the lane constraints; then, using the CV model, a series of points is generated between the position P of the vehicle 200 at the current time and the position Q at M seconds; a cubic-equation curve is then fitted by the least squares method, and that curve is the predicted trajectory. The fitted cubic equation is:
f(t) = a0 + a1·t + a2·t² + a3·t³
where f(t) is the distance of each point in the series from the center of the vehicle 100, a0 to a3 are the coefficients of the cubic equation of the future trajectory, t is the time of each point in the series, and t < M.
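The fitting step above can be sketched with an ordinary least-squares polynomial fit. This is a sketch under a simplifying assumption: the intermediate points here are a straight interpolation between P and Q, whereas the text generates them from the CV model with lane constraints:

```python
import numpy as np

def fit_predicted_trajectory(p, q, m, n_points=20):
    """Generate points between the current position P and the position Q
    at M seconds, then fit f(t) = a0 + a1*t + a2*t^2 + a3*t^3 by least
    squares. Sketch only: a real implementation would generate the
    intermediate points from the CV model with lane constraints rather
    than a straight interpolation."""
    t = np.linspace(0.0, m, n_points)
    y = p + (q - p) * t / m            # stand-in for the CV-model points
    coeffs = np.polyfit(t, y, 3)       # returns [a3, a2, a1, a0]
    return np.poly1d(coeffs)

# Lateral offset from P = 0 m to Q = 3.5 m over M = 2 s.
f = fit_predicted_trajectory(0.0, 3.5, 2.0)
```

Because the sample points are exactly linear here, the fitted curve passes through both endpoints; with genuine CV-model points the cubic would bend toward the lane constraint.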
In another possible example, as shown in FIG. 9, when the driving state of the vehicle 100 is lane keeping, the processor 102 calculates the movement trajectory of the vehicle 100, that is,
Figure PCTCN2021087578-appb-000004
where x1 is the displacement of the vehicle 100 in the direction parallel to the lane line or curb, y1 is the displacement of the vehicle 100 perpendicular to the lane line or curb, v1 is the speed of the vehicle 100, and M is the period set in the vehicle 100 for predicting the routes of other vehicles, generally 2 s or another period.
When the driving state of the vehicle 200 is left lane-change cut-in, from the longitudinal speed of the vehicle 200 relative to the world coordinate system vx = v1·cosθ ± v2·cosα (since the driving state of the vehicle 100 is lane keeping, θ = 0, so the speed of the vehicle 200 relative to the world coordinate system is v1 + v2·cosα), the lateral speed vy = v2·sinα, and the speed angle α, the trajectory of the vehicle 200 relative to the world coordinate system within M seconds is calculated by the speed formula as:
Figure PCTCN2021087578-appb-000005
where x2 is the displacement of the vehicle 200 in the direction parallel to the lane line or curb, y2 is the displacement of the vehicle 200 perpendicular to the lane line or curb, v2 is the relative speed of the vehicle 200 with respect to the vehicle 100, α is the speed angle of the vehicle 200 relative to the vehicle 100, and M is the period set in the vehicle 100 for predicting the routes of other vehicles, generally 2 s or another period.
The processor 102 judges, from the motion trajectory of the vehicle 100, the motion trajectory of the vehicle 200, and the relative displacement L between the vehicle 200 and the vehicle 100, whether the vehicle 200 will collide with the vehicle 100 within M seconds, that is:
Figure PCTCN2021087578-appb-000006
where L is the relative displacement between the vehicle 100 and the vehicle 200,
Figure PCTCN2021087578-appb-000007
is the angle between the positions of the vehicle 200 relative to the vehicle 100, α is the speed angle of the vehicle 200 relative to the vehicle 100, and ΔL is the no-collision distance between the two vehicles, generally 3.75 m.
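Since the collision condition survives only as a formula image, here is a hedged numerical sketch of the check it describes: step through the M-second horizon and report the first time at which the two predicted positions come within ΔL = 3.75 m of each other. The trajectories and speeds in the example are invented for illustration:

```python
def collision_time(traj1, traj2, m=2.0, dl=3.75, dt=0.05):
    """First time within M seconds at which the distance between the two
    predicted trajectories drops below the no-collision distance ΔL
    (3.75 m in the text), or None. `traj1`/`traj2` map a time t to an
    (x, y) position. Sketch: the patent expresses this as a closed
    formula; here it is evaluated numerically on a time grid."""
    t = 0.0
    while t <= m:
        (x1, y1), (x2, y2) = traj1(t), traj2(t)
        if ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 < dl:
            return t
        t += dt
    return None

# Vehicle 100 straight at 20 m/s; vehicle 200 ahead, slower, cutting in.
ego = lambda t: (20.0 * t, 0.0)
target = lambda t: (10.0 + 15.0 * t, 4.0 - 2.0 * t)
T = collision_time(ego, target)  # T < M, so the collision risk is high
```

Per the text, T < M flags a high collision probability and triggers the warning path; T is None (no approach within ΔL inside the horizon) corresponds to the low-probability case.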
If the collision time T between the vehicle 100 and the vehicle 200 is greater than M seconds, the probability of a collision between them is relatively low; if the collision time T is less than M seconds, the probability of a collision is relatively high. Optionally, when the processor 102 determines that the collision probability is relatively high, it generates a warning message and reminds the driver by voice broadcast, on-screen display, seat vibration, or the like.
For example, as shown in FIG. 10(a), when the vehicle 100 detects other vehicles nearby, the display screen of the vehicle 100 shows the ego vehicle, the other vehicles, and the positional relationships between them, and may even show the routes the ego vehicle and the other vehicles will travel over a period in the future. If another vehicle would collide with the ego vehicle within the safe distance, the display shows the warning message "Attention! Attention! A vehicle is cutting in on the left, please give way!" as shown in FIG. 10(b), to remind the driver to take evasive action in time.
For example, when the vehicle 100 detects other vehicles nearby, if they would not collide with the ego vehicle within the safe distance, the display does not show the interface of FIG. 10(a); the interface of FIG. 10(b) is shown only when another vehicle would collide with the ego vehicle within the safe distance. In this way, when there is no danger, the display can show other interfaces such as navigation, music, or video.
The above example takes the driving state of the vehicle 200 being a left-turn cut-in as an example. In fact, whether a vehicle in an adjacent lane cuts into the ego lane or a vehicle in the same lane cuts out into an adjacent lane, the principle of judging whether a collision occurs is basically the same.
Optionally, the processor 102 sends data such as the driving states, trajectories, and collision times of the vehicle 100 and the other vehicles around it to an advanced driving assistance system (ADAS), so that the ADAS can take corresponding measures in advance, before other vehicles have fully changed lanes, to improve safety and comfort.
Depending on the functions implemented, the ADAS can be divided into an autonomous emergency braking (AEB) module, an automatic emergency steering (AES) module, an adaptive cruise control (ACC) module, and so on.
For example, after receiving from the processor 102 that a collision may occur within M seconds, the AEB module can give an early warning or decelerate; that is, if the collision time T between the vehicle 100 and any vehicle around it is less than the safe time of M seconds, the mode starts automatically and issues a warning, automatically decelerates, or actively brakes, so that there is enough time to change the driving state of the vehicle 100 and avoid a collision with other vehicles.
For example, after receiving the data sent by the processor 102, if the collision time T between the vehicle 100 and any vehicle around it is less than the safe time of M seconds, the AES module starts the automatic steering mode, analyzes the distances to the other vehicles around the vehicle 100, and controls the vehicle 100 to steer into a lane that poses no danger, to ensure there is enough time to change the driving state of the vehicle 100 and avoid a collision with other vehicles.
For example, after the ACC module receives the data sent by the processor 102, if a vehicle cuts into the ego lane, it changes the target of the "following mode" and "following distance" to the cutting-in vehicle; if a vehicle cuts out of the ego lane, it switches away from that target and subsequently looks for another vehicle in the ego lane as the target of the "following mode" and "following distance".
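The ACC target-switching rule above amounts to a small state machine. The class and identifiers below are illustrative assumptions, not names from the patent:

```python
class AccTargetSelector:
    """Sketch of the ACC rule in the text: on a cut-in, the following
    target becomes the cutting-in vehicle; on a cut-out of the current
    target, that target is dropped and the next vehicle in the ego lane
    (if any) becomes the new following target."""

    def __init__(self, target=None):
        self.target = target  # id of the vehicle currently being followed

    def on_event(self, state, vehicle_id, lane_vehicles):
        if state in ("LCL_CUTIN", "LCR_CUTIN"):
            self.target = vehicle_id
        elif state in ("LCL_CUTOUT", "LCR_CUTOUT") and vehicle_id == self.target:
            remaining = [v for v in lane_vehicles if v != vehicle_id]
            self.target = remaining[0] if remaining else None
        return self.target

acc = AccTargetSelector(target="car_A")
acc.on_event("LCR_CUTIN", "car_B", ["car_A", "car_B"])   # car_B cuts in
acc.on_event("LCL_CUTOUT", "car_B", ["car_B", "car_C"])  # car_B leaves; follow car_C
```

Because the state labels are the five classes output by the classifier, the switch happens as soon as a cut-in or cut-out is predicted, before the lane change completes, which is the comfort benefit the text claims.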
图11为本申请实施例提供的一种预测车辆的行驶状态和预测路线的过程流程图。如图11所示,在训练得到模型1的过程中:
(1) After obtaining data sent by the ego-vehicle system, the processor 102 extracts the ego vehicle's speed, position, heading angle, and other information; after obtaining data reported by the sensor 101, it extracts information such as whether lane lines exist, the lane line color, the lane line type, and the road edge;
(2) After receiving data about the surrounding vehicles through the sensor 101, the processor 102 extracts the speed, position, heading angle, and other information of those vehicles;
(3) The processor 102 takes the ego-vehicle information, lane line information, and target-vehicle information obtained at the current time and trains a classifier using an SVM model, cluster analysis, Bayesian classification, or the like, obtaining a model that, given target-vehicle information as input, can identify the driving state of that target vehicle.
In the process of training model 2: compared with training model 1, in step (3) the processor 102 additionally uses the corresponding ego-vehicle information, lane line information, and target-vehicle information from a past period. It concatenates the current ego-vehicle information, lane line information, and target-vehicle information with the historical versions of the same information and then trains a classifier to obtain model 2; for details, see the process of training model 1 above.
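As an illustrative sketch of how a model-2 training sample might be assembled (the field names and data layout are assumptions made here, not part of the disclosure):

```python
def make_training_sample(current, history, history_len, label):
    """Concatenate the current ego/lane/target features with the same
    features from the past history_len frames, paired with the labelled
    driving state, to form one model-2 training sample.

    current     -- dict with 'ego', 'lane', 'target' feature lists
    history     -- list of past dicts of the same shape, oldest first
    history_len -- number of historical frames model 2 requires
    label       -- the labelled driving state, e.g. 'LCL_CUTIN'
    """
    feats = current['ego'] + current['lane'] + current['target']
    for past in history[-history_len:]:
        # append each required historical frame after the current one
        feats += past['ego'] + past['lane'] + past['target']
    return feats, label
```

Model 1 would be trained on the current-frame features alone; model 2 on these longer concatenated vectors.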
In the route-prediction process, the processor 102:
(1) obtains, in real time, data about the target vehicle sent by the sensor 101;
(2) obtains the ego-vehicle information and lane line information in the same way as step (1) of training model 1;
(3) obtains the target-vehicle information in the same way as step (2) of training model 1;
(4) determines whether historical information about the target vehicle exists and, if so, how long it spans;
(5) if there is no historical information, or its span is shorter than what model 2 requires, inputs the current ego-vehicle information, lane line information, and target-vehicle information into model 1 to predict the target vehicle's driving state and predicted route;
(6) if historical information exists and its span meets what model 2 requires, concatenates the current ego-vehicle information, lane line information, and target-vehicle information with the historical versions and inputs the result into model 2 to predict the target vehicle's driving state and predicted route.
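Steps (4) to (6) above can be sketched as follows (model1 and model2 are assumed here to be callables taking a flat feature list; the names are illustrative):

```python
def predict_driving_state(model1, model2, current, history, required_len):
    """Choose model 1 when the target vehicle has no history (or too
    little), otherwise concatenate current and historical features and
    use model 2, as in steps (4)-(6).

    current      -- dict with 'ego', 'lane', 'target' feature lists
    history      -- list of past dicts of the same shape, oldest first
    required_len -- number of historical frames model 2 requires
    """
    frame = current['ego'] + current['lane'] + current['target']
    if len(history) < required_len:
        return model1(frame)                      # step (5): fall back to model 1
    feats = list(frame)
    for past in history[-required_len:]:
        feats += past['ego'] + past['lane'] + past['target']
    return model2(feats)                          # step (6): use model 2
```

The fallback matters early in a track's life: a newly detected vehicle has no history yet, so model 1 gives an immediate, if less informed, prediction.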
Fig. 12 is a schematic structural diagram of a lane-change prediction apparatus for intelligent driving according to an embodiment of this application. As shown in Fig. 12, by function, the apparatus 1200 includes a transceiver unit 1201, a recognition unit 1202, a trajectory prediction unit 1203, and a risk assessment unit 1204.
The transceiver unit 1201 is configured to receive detection information sent by the sensor 101. The detection information includes one or more of the ego vehicle's lateral velocity, longitudinal velocity, and yaw angle; or one or more of whether the road on which each surrounding vehicle is located has lane lines, the color of the lane lines, and whether the lane lines are solid or dashed; or one or more of each surrounding vehicle's lateral velocity, longitudinal velocity, global position, relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw angle, and heading angle.
The recognition unit 1202 is configured to determine the driving state of each vehicle around the ego vehicle. The driving states include LCL_CUTIN, LCL_CUTOUT, LCR_CUTIN, LCR_CUTOUT, and LK.
The trajectory prediction unit 1203 is configured to calculate, from the driving states of the ego vehicle and of each surrounding vehicle, the predicted routes of the ego vehicle and of each surrounding vehicle over a future period.
The risk assessment unit 1204 is configured to calculate, from the predicted routes of the ego vehicle and of each surrounding vehicle over the future period, the time at which the ego vehicle would collide with a surrounding vehicle. When the collision probability is determined to be high, it generates a warning message and alerts the driver by voice announcement, on-screen display, seat vibration, or similar means.
In addition, the transceiver unit 1201 is further configured to send data such as the driving states, trajectories, and collision probabilities of vehicle 100 and of each vehicle around it to the ADAS, so that each module in the ADAS can take appropriate measures before the other vehicles have fully changed lanes, improving safety and comfort.
For how each unit in the prediction apparatus 1200 works, refer to Figs. 2-11 and the technical solutions described in the corresponding embodiments above; the details are not repeated here.
The present invention provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed in a computer, the computer is caused to perform any of the methods above.
The present invention provides a computing device including a memory and a processor, where executable code is stored in the memory; when the processor executes the executable code, any of the methods above is implemented.
It should be understood that terms such as "first" and "second" herein merely distinguish similar concepts for ease of description and have no other limiting effect.
A person of ordinary skill in the art will further appreciate that the units and algorithm steps of the examples described in the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of function. Whether these functions are performed in hardware or software depends on the particular application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present invention.
A person of ordinary skill in the art will understand that all or some of the steps of the methods in the embodiments above can be completed by a program instructing a processor. The program may be stored in a computer-readable storage medium, where the storage medium is a non-transitory medium such as random-access memory, read-only memory, flash memory, a hard disk, a solid-state drive, magnetic tape (MT), a floppy disk (FD), an optical disc (OD), or any combination thereof. The above are merely preferred specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (17)

  1. A driving state prediction method, characterized by comprising:
    obtaining detection information at a current time and detection information within N seconds before the current time, the detection information being obtained by detecting, with a sensor, a first vehicle and at least one vehicle around the first vehicle, the at least one vehicle comprising a second vehicle, the N seconds being a positive number greater than zero and less than ten;
    determining a driving state of the second vehicle at the current time, the driving state being obtained by inputting the detection information at the current time and the detection information within the N seconds before the current time into a classification model, the driving state comprising one of lane keeping, left lane-change cut-in, left lane-change cut-out, right lane-change cut-in, or right lane-change cut-out.
  2. The method according to claim 1, wherein the detection information comprises:
    the speed and yaw angle of the first vehicle, and
    one or more of the lateral velocity, longitudinal velocity or speed, global position and/or relative position, brake light, left turn signal, right turn signal, instantaneous angular velocity, yaw angle, and heading angle of the second vehicle.
  3. The method according to any one of claims 1-2, wherein, when the period from the time the sensor starts detecting to the current time is less than the N seconds,
    the detection information within the N seconds before the current time comprises:
    detection information obtained within Q seconds before the current time, Q being a positive number greater than zero and less than N.
  4. The method according to any one of claims 1-2, wherein, when the current time is the time the sensor starts detecting,
    the method further comprises:
    determining the driving state of the second vehicle according to the detection information at the current time.
  5. The method according to any one of claims 1-4, further comprising:
    calculating a predicted route of the second vehicle within M seconds after the current time, the predicted route being calculated according to the driving state of the second vehicle, the M seconds being a time period, set in the first vehicle, for predicting the predicted routes of other vehicles, its value being a positive number greater than zero.
  6. The method according to claim 5, further comprising:
    calculating, according to the predicted route of the second vehicle, a time at which the second vehicle collides with the first vehicle;
    when the time of the collision is not greater than a set threshold, generating warning information, the warning information being used to alert a user that the first vehicle is in a dangerous state.
  7. The method according to any one of claims 1-6, further comprising:
    sending the predicted route of the second vehicle, or the predicted route and the time of the collision, to an advanced driver assistance system (ADAS), the ADAS adjusting the following mode and following distance of the first vehicle according to the predicted route of the second vehicle, or the predicted route and the time of the collision.
  8. The method according to any one of claims 1-6, further comprising:
    sending the predicted route of the second vehicle, or the predicted route and the time of the collision, to the ADAS, the ADAS generating the warning information or controlling the first vehicle to brake according to the predicted route of the second vehicle, or the predicted route and the time of the collision.
  9. The method according to any one of claims 1-6, further comprising:
    sending the predicted route of the second vehicle, or the predicted route and the time of the collision, to the ADAS, the ADAS controlling the first vehicle to steer according to the predicted route of the second vehicle, or the predicted route and the time of the collision.
  10. The method according to any one of claims 1-9, wherein the classification model is a five-class model, the five-class model comprising a support vector machine (SVM) model.
  11. The method according to any one of claims 1-10, further comprising:
    displaying, on a display screen, the first vehicle, the second vehicle, and the positional relationship between the second vehicle and the first vehicle;
    when the time at which the second vehicle collides with the first vehicle is not greater than a set threshold, displaying the warning information on the display screen.
  12. The method according to any one of claims 1-10, further comprising:
    when the time at which the second vehicle collides with the first vehicle is not greater than a set threshold, displaying, on a display screen, the second vehicle and the positional relationship between the second vehicle and the first vehicle.
  13. A driving state prediction apparatus, comprising at least one processor, the processor being configured to execute instructions stored in a memory so that a terminal performs the method according to any one of claims 1-12.
  14. A terminal device, characterized by comprising at least one sensor, a memory, and a processor configured to perform the method according to any one of claims 1-12.
  15. An intelligent driving vehicle, configured to perform the method according to any one of claims 1-12.
  16. A computer-readable storage medium on which a computer program is stored, wherein, when the computer program is executed in a computer, the computer is caused to perform the method according to any one of claims 1-12.
  17. A computing device, comprising a memory and a processor, wherein executable code is stored in the memory, and when the processor executes the executable code, the method according to any one of claims 1-12 is implemented.
PCT/CN2021/087578 2020-06-10 2021-04-15 Driving state prediction method and apparatus, and terminal device WO2021249020A1 (zh)

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
CN202010521167.8A (CN113771867B) | 2020-06-10 | 2020-06-10 | Driving state prediction method and apparatus, and terminal device
CN202010521167.8 | 2020-06-10

Publications (1)

Publication Number: WO2021249020A1 (zh)


Also Published As

Publication number | Publication date
CN113771867A | 2021-12-10
CN113771867B | 2023-03-03


Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21821035; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 21821035; Country of ref document: EP; Kind code of ref document: A1)