WO2021253245A1 - Method and apparatus for identifying a vehicle lane change trend (识别车辆变道趋势的方法和装置) - Google Patents

Method and apparatus for identifying a vehicle lane change trend

Info

Publication number
WO2021253245A1
WO2021253245A1 (PCT/CN2020/096415)
Authority
WO
WIPO (PCT)
Prior art keywords
lane
vehicle
distance relationship
values
distance
Prior art date
Application number
PCT/CN2020/096415
Other languages
English (en)
French (fr)
Inventor
陈昊
鲁浩
高趁
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to CN202080005121.9A priority Critical patent/CN112753038B/zh
Priority to EP20940942.4A priority patent/EP4160346A4/en
Priority to PCT/CN2020/096415 priority patent/WO2021253245A1/zh
Publication of WO2021253245A1 publication Critical patent/WO2021253245A1/zh
Priority to US18/065,510 priority patent/US20230110730A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/803Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808Evaluating distance, position or velocity data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles

Definitions

  • This application relates to the field of automatic driving technology, and in particular to a method and device for identifying a vehicle's lane change trend.
  • in the related art, lidar technology is usually used to obtain laser point cloud data of the surrounding scene, detect the contour vertices of the target vehicle, and obtain the lateral distance between the contour vertices and the own vehicle. The lane change tendency of the target vehicle is then judged based on the change of this lateral distance over time.
  • alternatively, machine vision technology is used to obtain an image of the surrounding scene, and the lane lines and the target vehicle are detected in the scene image to obtain the distance between the target vehicle and the lane lines.
  • the lane change tendency of the target vehicle is then judged based on the change of this distance over time.
  • when lidar technology is used to identify the lane change trend of the target vehicle, exhaust gas and dust around the target vehicle can make detection of the contour vertices of the target vehicle inaccurate, resulting in an inaccurate judgment of the lane change trend of the target vehicle.
  • when machine vision technology is used to identify the lane change trend of the target vehicle, there may be no lane lines in an intersection area, so the lane change trend of the target vehicle cannot be recognized.
  • the embodiments of the present application provide a method and device for identifying a lane change trend of a vehicle, so as to solve the problem in the related art that lidar technology or machine vision technology alone cannot accurately identify the lane change trend of a vehicle.
  • the technical scheme is as follows:
  • in a first aspect, a method for recognizing a lane change tendency of a vehicle is provided, and the method includes:
  • Acquire laser point cloud data of a detected target vehicle, where the target vehicle is a vehicle traveling in the scene around the own vehicle; based on the laser point cloud data, obtain a first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle; obtain a scene image containing the target vehicle; based on the scene image, obtain a second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle; calculate a first credibility of the multiple first distance relationship values that have been obtained and a second credibility of the multiple second distance relationship values that have been obtained; based on the first credibility and the second credibility, calculate multiple fused distance relationship values from the multiple first distance relationship values and the multiple second distance relationship values; and based on the multiple fused distance relationship values, determine whether the target vehicle has a lane change tendency.
  • the own vehicle can collect laser point cloud data of surrounding scenes through lidar, and obtain the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle based on the laser point cloud data. It is also possible to collect scene images of surrounding scenes through machine vision equipment, such as a camera, and obtain the second distance relationship value between the center line of the lane where the vehicle is located and the target vehicle according to the scene images.
  • the distance relationship value may be a value used to reflect the distance relationship between the target vehicle and the center line of the lane where the host vehicle is located, such as the ratio of that distance to the lane width.
  • the credibility of the multiple first distance relationship values and the credibility of the multiple second distance relationship values can be calculated, and according to the calculated credibilities, the multiple first distance relationship values and the multiple second distance relationship values are fused to obtain multiple fused distance relationship values. Finally, the lane change trend of the target vehicle is judged according to how the fused distance relationship values of multiple detection periods change over time.
  • in a possible implementation, the obtaining the first distance relationship value between the centerline of the lane where the own vehicle is located and the target vehicle based on the laser point cloud data includes:
  • acquiring, based on a high-precision map, a centerline point set of the lane where the own vehicle is located, where the centerline point set includes the coordinates, in the world coordinate system, of multiple sampling points on the centerline of that lane; and obtaining, according to the laser point cloud data and the centerline point set, a first distance relationship value between the centerline of the lane where the own vehicle is located and the target vehicle.
  • in practice, the centerline point set of the lane where the own vehicle is located can first be acquired based on a high-precision map.
  • the centerline point set includes the coordinates, in the world coordinate system, of multiple sampling points on the centerline of the lane where the own vehicle is located. Then, according to the laser point cloud data of the target vehicle and the centerline point set, the first distance relationship value between the centerline of the lane where the own vehicle is located and the target vehicle can be obtained.
  • the obtaining the first distance relationship value between the centerline of the lane where the own vehicle is located and the target vehicle according to the laser point cloud data and the centerline point set includes:
  • based on the laser point cloud data, obtaining a first coordinate of the target vehicle in the own vehicle coordinate system; converting the first coordinate into a second coordinate of the target vehicle in the world coordinate system; taking the minimum distance between the second coordinate and the coordinates, in the world coordinate system, of the sampling points included in the centerline point set as the first distance between the centerline of the lane where the own vehicle is located and the target vehicle; and acquiring the width of the lane where the own vehicle is located, and calculating a first ratio of the first distance to the lane width as the first distance relationship value between the centerline of the lane where the own vehicle is located and the target vehicle.
  • in practice, target vehicle recognition can be performed on the obtained laser point cloud data to obtain the coordinates of the target vehicle in the lidar coordinate system, which are then converted into the first coordinate in the own vehicle coordinate system. The first coordinate is then converted into the second coordinate of the target vehicle in the world coordinate system, and the minimum distance between the second coordinate and the coordinates of the sampling points in the centerline point set of the lane where the own vehicle is located, taken from the high-precision map, is used as the first distance between the centerline of that lane and the target vehicle.
  • the coordinates of each sampling point in the centerline point set of the lane where the own vehicle is located in the high-precision map are coordinates in the world coordinate system.
  • the width of the lane where the own vehicle is located can be obtained from the high-precision map, and the first ratio of the first distance to that lane width is calculated as the first distance relationship value between the centerline of the lane where the own vehicle is located and the target vehicle.
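The computation of the first distance relationship value described above can be sketched as follows; the function and variable names are illustrative, and the sampled centerline and lane width are toy values:

```python
import math

def first_distance_relationship(vehicle_xy, centerline_points, lane_width):
    """Minimum distance from the target vehicle's world-frame position
    to the lane centerline sampling points, divided by the lane width."""
    vx, vy = vehicle_xy
    first_distance = min(math.hypot(vx - px, vy - py)
                         for px, py in centerline_points)
    return first_distance / lane_width

# Centerline sampled every metre along y = 0; lane width 3.5 m;
# target vehicle 1.75 m to the side of the centerline.
points = [(float(x), 0.0) for x in range(0, 101)]
ratio = first_distance_relationship((10.3, 1.75), points, 3.5)
```

A ratio near 0.5 means the target vehicle is about half a lane width from the centerline, i.e. near the lane boundary.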
  • the obtaining the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle based on the scene image includes:
  • in practice, lane line recognition and target vehicle recognition can be performed on the scene image.
  • a bounding box corresponding to the target vehicle can be generated, and from the bounding box, the coordinates, in the image coordinate system, of the two vertices that are close to the own vehicle and in contact with the ground can be obtained.
  • in a possible implementation, the calculating the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained includes:
  • calculating, based on an ideal lane-changing model and an ideal along-lane motion model, the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained.
  • the ideal lane-changing model is used to characterize how the distance relationship value between the center line of the lane where the own vehicle is located and another vehicle in the surrounding scene changes over time when that vehicle is changing lanes.
  • the ideal along-lane motion model is used to characterize how the distance relationship value between another vehicle and the center line of the lane where the own vehicle is located changes over time when that vehicle is moving along the lane. According to how well the time series of the multiple first distance relationship values fits the ideal lane-changing model and the ideal along-lane motion model, the credibility of the first distance relationship values can be obtained; the credibility of the second distance relationship values can be obtained in the same way.
  • in a possible implementation, the calculating, based on the ideal lane-changing model and the ideal along-lane motion model, the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained includes:
  • based on the multiple first distance relationship values that have been obtained, calculating the value of each unknown parameter in the ideal lane-changing model to obtain a first available lane-changing model, and calculating the value of each unknown parameter in the ideal along-lane motion model to obtain a first available along-lane motion model; calculating a first degree of fit of the multiple first distance relationship values to the first available lane-changing model and a second degree of fit to the first available along-lane motion model; obtaining, based on the first degree of fit and the second degree of fit, the first credibility of the multiple first distance relationship values that have been obtained; based on the multiple second distance relationship values that have been obtained, calculating the value of each unknown parameter in the ideal lane-changing model to obtain a second available lane-changing model, and calculating the value of each unknown parameter in the ideal along-lane motion model to obtain a second available along-lane motion model; calculating a third degree of fit of the multiple second distance relationship values to the second available lane-changing model and a fourth degree of fit to the second available along-lane motion model; and obtaining, based on the third degree of fit and the fourth degree of fit, the second credibility of the multiple second distance relationship values that have been obtained.
  • in practice, the KL divergence between the multiple first distance relationship values that have been obtained and the ideal lane-changing model can be calculated while the values of the unknown parameters in the ideal lane-changing model are adjusted.
  • when the KL divergence is minimized, the first available lane-changing model is obtained.
  • similarly, when the KL divergence to the ideal along-lane motion model is minimized, the first available along-lane motion model is obtained.
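A minimal sketch of this parameter-fitting step, assuming a one-parameter logistic curve as the ideal lane-changing model and a simple grid search in place of a full optimizer (the patent does not disclose the model's exact form or the optimization method):

```python
import math

def kl_divergence(p, q):
    # Discrete KL divergence between two normalized sequences.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def normalize(xs):
    s = sum(xs)
    return [x / s for x in xs]

def fit_lane_change_model(values, times):
    """Grid-search the slope k of a toy logistic lane-changing model
    d(t) = 1 / (1 + exp(-k * (t - t_mid))), choosing the k that
    minimizes the KL divergence to the observed distance values."""
    t_mid = times[len(times) // 2]
    p = normalize(values)
    best_k, best_kl = None, float("inf")
    for k in [0.1 * i for i in range(1, 51)]:
        model = [1.0 / (1.0 + math.exp(-k * (t - t_mid))) for t in times]
        kl = kl_divergence(p, normalize(model))
        if kl < best_kl:
            best_k, best_kl = k, kl
    return best_k, best_kl

# Observations generated from the same logistic with slope 0.8,
# so the fit should recover k = 0.8 with near-zero divergence.
times = list(range(10))
observed = [1.0 / (1.0 + math.exp(-0.8 * (t - 5))) for t in times]
k, kl = fit_lane_change_model(observed, times)
```

Fitting the along-lane motion model would follow the same pattern with a different model shape (e.g. a near-constant distance relationship value).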
  • the first degree of fit of the multiple obtained first distance relationship values to the first available lane-changing model and the second degree of fit to the first available along-lane motion model can be calculated respectively.
  • when the first degree of fit and the second degree of fit are larger, the multiple first distance relationship values that have been obtained have higher credibility.
  • similarly, the third degree of fit of the multiple obtained second distance relationship values to the second available lane-changing model and the fourth degree of fit to the second available along-lane motion model can be obtained.
  • when the third degree of fit and the fourth degree of fit are larger, the multiple second distance relationship values that have been obtained have higher credibility.
  • in a possible implementation, the obtaining the first credibility of the plurality of obtained first distance relationship values based on the first degree of fit and the second degree of fit includes: determining the reciprocal of the smaller value of the first degree of fit and the second degree of fit as the first credibility of the obtained multiple first distance relationship values.
  • the obtaining the second credibility of the multiple obtained second distance relationship values based on the third degree of fit and the fourth degree of fit includes:
  • the reciprocal of the smaller value of the third degree of fit and the fourth degree of fit is determined as the second credibility of the obtained multiple second distance relationship values.
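Both credibility computations follow the same reciprocal-of-the-smaller-fit rule, which can be sketched directly (argument names are illustrative):

```python
def credibility(fit_to_lane_change, fit_to_along_lane):
    """Reciprocal of the smaller of the two degrees of fit, as stated
    for both the first (lidar) and second (image) credibility."""
    return 1.0 / min(fit_to_lane_change, fit_to_along_lane)

first_credibility = credibility(0.25, 0.50)   # reciprocal of 0.25
second_credibility = credibility(2.0, 4.0)    # reciprocal of 2.0
```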
  • the obtaining the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle based on the laser point cloud data includes:
  • according to a preset detection period, obtaining the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle based on the laser point cloud data;
  • the obtaining the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle based on the scene image includes: according to the preset detection period, obtaining the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle based on the scene image.
  • the calculating a plurality of fused distance relationship values of the plurality of first distance relationship values and the plurality of second distance relationship values based on the first credibility and the second credibility includes:
  • based on the first credibility and the second credibility, a first weight corresponding to the multiple first distance relationship values and a second weight corresponding to the multiple second distance relationship values are calculated; from the multiple first distance relationship values and the multiple second distance relationship values, a target first distance relationship value and a target second distance relationship value obtained in the same detection period are selected; and the product of the target first distance relationship value and the first weight is added to the product of the target second distance relationship value and the second weight to obtain the fused distance relationship value for the detection period corresponding to the target first distance relationship value and the target second distance relationship value.
  • the first distance relationship value and the second distance relationship value may be acquired according to the same detection period.
  • in practice, a fused distance relationship value can be calculated separately from the first distance relationship value and the second distance relationship value obtained in each of the multiple detection periods. That is, for each of the multiple detection periods, a corresponding fused distance relationship value can be calculated.
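The per-period fusion can be sketched as below; deriving the weights by normalizing the two credibilities is an assumption, since the text only states that the weights are calculated from the credibilities:

```python
def fuse(d1_by_period, d2_by_period, cred1, cred2):
    """Fuse, per detection period, the lidar-based value d1 and the
    image-based value d2 with weights proportional to their
    credibilities (normalization is an illustrative choice)."""
    w1 = cred1 / (cred1 + cred2)
    w2 = cred2 / (cred1 + cred2)
    return [w1 * d1 + w2 * d2
            for d1, d2 in zip(d1_by_period, d2_by_period)]

# Three detection periods; the lidar sequence is trusted 3x more.
fused = fuse([0.10, 0.20, 0.40], [0.12, 0.18, 0.44], cred1=3.0, cred2=1.0)
```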
  • the judging whether the target vehicle has a tendency to change lanes based on the multiple fused distance relationship values includes:
  • in practice, if it is determined that the target vehicle has a lane-changing tendency, the own vehicle can be controlled in advance, for example to decelerate or perform other processing with respect to the target vehicle.
  • in a second aspect, a device for recognizing a lane change tendency of a vehicle is provided, and the device includes:
  • the acquisition module is used to acquire the laser point cloud data of the detected target vehicle, where the target vehicle is a vehicle traveling in the scene around the own vehicle; obtain, based on the laser point cloud data, a first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle; obtain a scene image containing the target vehicle; and obtain, based on the scene image, a second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle;
  • a calculation module for calculating the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained;
  • a fusion module configured to calculate a plurality of fusion distance relationship values of the plurality of first distance relationship values and the plurality of second distance relationship values based on the first credibility and the second credibility;
  • the determination module is configured to determine whether the target vehicle has a lane change tendency based on the multiple fusion distance relationship values.
  • the acquisition module is used to:
  • acquiring a centerline point set of the lane where the own vehicle is located, where the centerline point set includes the coordinates of multiple sampling points on the centerline of the lane where the own vehicle is located in the world coordinate system;
  • obtaining, according to the laser point cloud data and the centerline point set, a first distance relationship value between the centerline of the lane where the own vehicle is located and the target vehicle.
  • the acquisition module is used to:
  • the minimum distance between the coordinates of each sampling point included in the centerline point set in the world coordinate system and the second coordinate is taken as the first distance between the centerline of the lane where the own vehicle is located and the target vehicle;
  • the acquisition module is used to:
  • the calculation module is used for:
  • the ideal lane-changing model is used to characterize the time-varying relationship of the distance relationship between other vehicles in the surrounding scene where the own vehicle is located and the center line of the lane where the own vehicle is located.
  • the ideal along-lane motion model is used to characterize the time-varying relationship of the distance relationship value between the other vehicle and the center line of the lane where the own vehicle is located when the other vehicle is moving along the lane.
  • the calculation module is used for:
  • the second credibility of the obtained multiple second distance relationship values is obtained.
  • the calculation module is used for:
  • the obtaining the second credibility of the multiple obtained second distance relationship values based on the third degree of fit and the fourth degree of fit includes:
  • the reciprocal of the smaller value of the third degree of fit and the fourth degree of fit is determined as the second credibility of the obtained multiple second distance relationship values.
  • the acquisition module is used to:
  • according to a preset detection period, obtaining the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle based on the laser point cloud data;
  • the fusion module is used for:
  • the judgment module is used to:
  • if the fifth degree of fit is greater than a preset degree of fit threshold, it is determined that the target vehicle has a lane change tendency.
  • in a third aspect, a vehicle lane change trend recognition device is provided, including a processor and a memory, where at least one instruction is stored in the memory, and the instruction is loaded and executed by the processor to implement the operations performed by the method for recognizing a lane change tendency of a vehicle described in the first aspect.
  • in a fourth aspect, a computer-readable storage medium is provided, where at least one instruction is stored in the storage medium, and the instruction is loaded and executed by a processor to implement the operations performed by the method for recognizing a lane change tendency of a vehicle described in the first aspect.
  • in a fifth aspect, a computer program product containing instructions is provided; when it runs on a vehicle lane change tendency recognition device, the device executes the method for recognizing a lane change tendency of a vehicle described in the first aspect.
  • in the embodiments of the present application, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle is obtained according to the laser point cloud data, and the second distance relationship value between that center line and the target vehicle is obtained according to the scene image.
  • the first distance relationship value and the second distance relationship value are then fused to obtain the fused distance relationship values, based on which the lane change trend of the target vehicle is judged.
  • this scheme thus combines the laser point cloud data obtained by lidar technology and the scene image obtained by machine vision technology to judge the lane change trend of the target vehicle comprehensively, which can effectively avoid the inaccurate recognition of vehicle lane change trends that occurs when lidar technology or machine vision technology is used alone.
  • FIG. 1 is a schematic structural diagram of a vehicle lane change trend recognition device provided by an embodiment of the present application;
  • FIG. 2 is a flowchart of a method for identifying a vehicle lane change trend provided by an embodiment of the present application;
  • FIG. 3 is a schematic diagram of an image coordinate system of a scene image provided by an embodiment of the present application;
  • FIG. 4 is a schematic diagram of an image coordinate system of a scene image provided by an embodiment of the present application;
  • FIG. 5 is a schematic structural diagram of a device for identifying a lane change tendency of a vehicle provided by an embodiment of the present application.
  • the embodiment of the present application provides a method for recognizing the tendency of a vehicle to change lanes, and the method can be applied to an autonomous vehicle.
  • the method can be implemented by the vehicle lane change trend recognition device in an autonomous vehicle.
  • Sensing systems, positioning systems, etc. can be deployed in autonomous vehicles.
  • the sensing system may include a lidar, a camera, etc.
  • the positioning system may be a Global Positioning System (GPS), a BeiDou system, etc.
  • Fig. 1 is a schematic diagram of a vehicle lane change trend recognition device 100 provided by an embodiment of the present application.
  • the vehicle lane change tendency recognition device may include a processor 101 and a memory 102.
  • the processor 101 may be a central processing unit (CPU).
  • the processor 101 may refer to one processor, or may include multiple processors.
  • the memory 102 may include volatile memory, such as random access memory (RAM); the memory may also include non-volatile memory, such as read-only memory (ROM), flash memory, etc.
  • the memory may also include a combination of the above-mentioned types of memory.
  • the memory 102 may refer to one memory, or may include multiple memories.
  • the memory 102 stores computer-readable instructions, and the computer-readable instructions can be executed by the processor 101 to implement the method for identifying a lane change tendency of a vehicle provided in an embodiment of the present application.
  • FIG. 2 is a flowchart of a method for identifying a vehicle lane change trend provided by an embodiment of the present application.
  • the process of the method may include the following steps:
  • Step 201 Acquire laser point cloud data of the detected target vehicle.
  • the target vehicle is a vehicle traveling in a scene around the host vehicle.
  • the vehicle can be equipped with lidar, which scans the surrounding scene at a fixed frequency to obtain laser point cloud data.
  • the laser point cloud data of the detected target vehicle may not be acquired in each frame, but the laser point cloud data of the detected target vehicle may be acquired according to the detection cycle.
  • the lidar can collect one or more frames of laser point cloud data of the detected target vehicle. In each detection cycle, the laser point cloud data of the last frame of the detected target vehicle acquired in the current detection cycle can be obtained.
  • Step 202 Obtain a first distance relationship value between the center line of the lane where the vehicle is located and the target vehicle based on the laser point cloud data.
  • in practice, target vehicle recognition can be performed, and the coordinates of the laser points corresponding to the contour vertices of the target vehicle in the lidar coordinate system can be acquired. Then, according to the pre-calibrated transformation matrix from the lidar coordinate system to the own vehicle coordinate system, the coordinates of the contour vertices of the target vehicle are converted from the lidar coordinate system to the own vehicle coordinate system, to obtain the first coordinate of the target vehicle in the own vehicle coordinate system.
  • the acquired coordinates of the target vehicle in the lidar coordinate system may be the coordinates of the apex of the target vehicle's contour line in the lidar coordinate system.
  • the first coordinate in the self-vehicle coordinate system after conversion may include The coordinates of the vertices of the target vehicle's contour line in the own vehicle coordinate system.
  • the number of contour vertices corresponding to the shape of the target vehicle may also be different.
  • the first coordinate of the target vehicle in the own vehicle coordinate system can be converted to the world coordinate system to obtain the second coordinate of the target vehicle in the world coordinate system. That is, the coordinates of the contour vertices of the target vehicle in the own vehicle coordinate system are converted to the world coordinate system, and the coordinates of the contour vertices of the target vehicle in the world coordinate system are obtained.
  • the method for transforming the coordinates of each contour vertex of the target vehicle from the own-vehicle coordinate system to the world coordinate system can be as follows: align the timestamps of the lidar and the positioning system, obtain through the positioning system the coordinates (x_0, y_0) and heading angle θ_0 of the own vehicle in the world coordinate system, and convert the first coordinates according to the following formula (1):
  • x_i′ = x_i·cosθ_0 − y_i·sinθ_0 + x_0, y_i′ = x_i·sinθ_0 + y_i·cosθ_0 + y_0  (1)
  • where x_i, y_i are the abscissa and ordinate of the i-th contour vertex of the target vehicle in the own-vehicle coordinate system, and x_i′, y_i′ are the abscissa and ordinate of the i-th contour vertex of the target vehicle in the world coordinate system.
  • the rotation matrix from the own-vehicle coordinate system to the world coordinate system can be expressed as: R = [cosθ_0, −sinθ_0; sinθ_0, cosθ_0]
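The own-vehicle-to-world transform described above is a planar rotation by the heading angle followed by a translation to the own vehicle's world position. A minimal sketch (function and variable names are illustrative, not taken from the patent):

```python
import math

def ego_to_world(vertices, x0, y0, theta0):
    """Rotate each (x, y) contour vertex from the own-vehicle frame by the
    heading angle theta0, then translate by the vehicle's world position."""
    cos_t, sin_t = math.cos(theta0), math.sin(theta0)
    return [(cos_t * x - sin_t * y + x0, sin_t * x + cos_t * y + y0)
            for x, y in vertices]
```

For example, with the own vehicle at (10, 5) heading +90°, a vertex one metre ahead in the own-vehicle frame maps to approximately (10, 6) in world coordinates.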
  • the centerline point set of the lane where the vehicle is located can be obtained in the high-precision map.
  • the acquired centerline point set may be a centerline point set satisfying a preset distance condition from the own vehicle, for example, a centerline point set of the lane where the own vehicle is located within 100 meters of the own vehicle.
  • the distances between the second coordinates of the target vehicle in the world coordinate system and the coordinates of the sampling points in the centerline point set of the own vehicle's lane obtained from the high-precision map can be calculated, and the minimum distance selected.
  • the minimum distance is taken as the distance between the target vehicle and the center line of the lane where the own vehicle is located.
  • the distance between the target vehicle and the center line of the lane where the vehicle is located can be denoted as d 1 .
  • the ratio of the distance d_1 between the target vehicle and the center line of the lane where the own vehicle is located to the width D_1 of that lane can be calculated as the first distance relationship value γ_l between the target vehicle and the center line of the lane where the own vehicle is located, that is, γ_l = d_1 / D_1.
  • the width D 1 of the lane where the own vehicle is located can be obtained through a high-precision map.
  • the distance relationship value indicates the distance relationship between another vehicle and the center line of the lane where the own vehicle is located; besides the ratio of that distance to the width of the lane where the own vehicle is located, the reciprocal of the ratio can also be used.
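The computation of the first distance relationship value reduces to: take the minimum distance d_1 from any contour vertex to any sampled centerline point, then divide by the lane width D_1. A sketch under those definitions (names are illustrative):

```python
import math

def first_distance_relation(vertices_world, centerline_pts, lane_width):
    """gamma_l = d1 / D1: minimum distance from any contour vertex to any
    sampled centerline point, divided by the lane width from the HD map."""
    d1 = min(math.dist(v, c) for v in vertices_world for c in centerline_pts)
    return d1 / lane_width
```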
  • Step 203 Acquire a scene image containing the target vehicle.
  • the vehicle can be equipped with machine vision equipment, such as a camera.
  • the machine vision equipment takes pictures of surrounding scenes at a fixed frequency to obtain scene images.
  • when acquiring scene images containing the target vehicle, it is not necessary to acquire every such frame; instead, the scene images containing the target vehicle may be acquired according to the detection cycle.
  • the machine vision device can obtain one or more frames of scene images including the target vehicle.
  • in each detection cycle, the last frame containing the target vehicle acquired in the current detection cycle can be obtained.
  • Step 204 Obtain a second distance relationship value between the center line of the lane where the vehicle is located and the target vehicle based on the scene image.
  • lane line recognition and target vehicle recognition can be performed.
  • during target vehicle recognition, a bounding box corresponding to the target vehicle can be generated, and the coordinates, in the image coordinate system, of the two vertices of the bounding box that are close to the own vehicle and in contact with the ground can be obtained.
  • the image coordinate system can take the upper left corner of the image as the origin.
  • the coordinates of vertex 1 of the bounding box, close to the own vehicle and in contact with the ground, can be denoted as (x_v1, y_v1), and the coordinates of vertex 2, also close to the own vehicle and in contact with the ground, can be denoted as (x_v2, y_v2).
  • in the image coordinate system, the point A_1 on the left lane line of the lane where the own vehicle is located having the same ordinate as vertex 1, and the point B_1 on the right lane line having the same ordinate as vertex 1, are acquired; the coordinates of A_1 are (x_l1, y_v1) and the coordinates of B_1 are (x_r1, y_v1).
  • similarly, the point A_2 on the left lane line having the same ordinate as vertex 2, and the point B_2 on the right lane line having the same ordinate as vertex 2, are acquired; the coordinates of A_2 are (x_l2, y_v2) and the coordinates of B_2 are (x_r2, y_v2).
  • the positional relationship of vertex 1, A_1, B_1, vertex 2, A_2 and B_2 may be as shown in FIG. 4; the image coordinate system shown in FIG. 4 is the same as that shown in FIG. 3.
  • for vertex 1 and vertex 2, the distance relationship value between each vertex and the center line of the lane where the own vehicle is located can be calculated, and the smaller of the two calculated values is used as the second distance relationship value between the target vehicle and the center line of the lane where the own vehicle is located.
  • the distance relationship value between vertex 1 and the center line of the lane where the own vehicle is located can be calculated as follows: first, the point C_1 on the center line corresponding to vertex 1 is acquired, with coordinates (x_c1, y_v1), where x_c1 can be calculated as in formula (2): x_c1 = (x_l1 + x_r1) / 2.
  • then, the distance relationship value of vertex 1 can be calculated as in formula (3): γ_v1 = |x_v1 − x_c1| / (x_r1 − x_l1).
  • the distance relationship value between vertex 2 and the center line of the lane where the own vehicle is located can be calculated in the same way: the point C_2 on the center line corresponding to vertex 2 has coordinates (x_c2, y_v2), where x_c2 is calculated as in formula (4): x_c2 = (x_l2 + x_r2) / 2.
  • then, the distance relationship value of vertex 2 can be calculated as in formula (5): γ_v2 = |x_v2 − x_c2| / (x_r2 − x_l2). The smaller of γ_v1 and γ_v2 is taken as the second distance relationship value.
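In image coordinates, the two per-vertex relation values and their minimum can be computed directly. The sketch below assumes the centerline abscissa is the midpoint of the two lane-line abscissas and the relation value is the lateral offset over the lane pixel width; these forms are assumptions where the published formulas were not extracted:

```python
def vertex_relation(x_v, x_l, x_r):
    """Relation value of one ground-contact vertex: offset from the lane
    centre (midpoint of left/right lane-line abscissas) over lane width."""
    x_c = (x_l + x_r) / 2.0
    return abs(x_v - x_c) / (x_r - x_l)

def second_distance_relation(v1, v2):
    """v1, v2 are (x_v, x_l, x_r) triples for vertices 1 and 2; the smaller
    per-vertex relation value is kept as the second distance relation."""
    return min(vertex_relation(*v1), vertex_relation(*v2))
```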
  • Step 205 Calculate the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained.
  • the first credibility of the first distance relationship values and the second credibility of the second distance relationship values can be calculated once multiple first distance relationship values and multiple second distance relationship values have been obtained.
  • for the case where the first and second distance relationship values are acquired with the same detection period, the credibility of the first and second distance relationship values can be calculated for the first time after M detection periods, and can be recalculated in each subsequent detection period.
  • M is a preset positive integer that can be set according to actual needs, for example to 10 or 15.
  • the ideal lane-changing model and the ideal along-lane motion model can be used to calculate the first credibility and the second credibility.
  • the ideal lane-changing model characterizes how the distance relationship value between another vehicle in the surrounding scene of the own vehicle and the center line of the lane where the own vehicle is located changes over time when the other vehicle is changing lanes.
  • the ideal along-lane motion model characterizes how that distance relationship value changes over time when the other vehicle is moving along the lane. The method for calculating, based on the ideal lane-changing model and the ideal along-lane motion model, the first credibility of the obtained multiple first distance relationship values and the second credibility of the obtained multiple second distance relationship values is explained below:
  • based on the obtained multiple first distance relationship values, the ideal lane-changing model and the ideal along-lane motion model, the first usable lane-changing model and the first usable along-lane motion model are obtained.
  • based on the obtained multiple second distance relationship values, the ideal lane-changing model and the ideal along-lane motion model, the second usable lane-changing model and the second usable along-lane motion model are obtained.
  • the obtained multiple first distance relationship values and multiple second distance relationship values may be the first and second distance relationship values obtained in a first preset number of consecutive detection periods including the current detection period.
  • the first preset number can be set according to actual needs. For example, it can be the same as the value of M, set to 10, 15, and so on.
  • the method for acquiring the second usable lane-changing model is the same as that for the first usable lane-changing model, and the method for acquiring the second usable along-lane motion model is the same as that for the first usable along-lane motion model.
  • the method for obtaining the first usable lane-changing model and the first usable along-lane motion model is described below as an example.
  • the ideal lane-changing model can be expressed as the relationship (6), in which q_LC(x_i) is the predicted distance relationship value and α_1, α_2 and α_3 are unknown parameters to be determined.
  • the ideal along-lane motion model can be expressed as the relationship (7): q_LK(x_i) = α_4, where α_4 is the unknown parameter to be determined and q_LK(x_i) is the predicted distance relationship value.
  • the unknown parameters α_1, α_2 and α_3 to be determined in the ideal lane-changing model can be calculated from the obtained first distance relationship values by minimizing the K-L divergence between the obtained values and the model predictions, as in the following formula (8): Σ_{i=1}^{N} p(x_i)·ln( p(x_i) / q_LC(x_i) )  (8)
  • where N is the number of first distance relationship values that have been obtained, x_i is the ranking value corresponding to the i-th obtained first distance relationship value, and p(x_i) is the i-th obtained first distance relationship value among the obtained multiple first distance relationship values.
  • the unknown parameter α_4 to be determined in the ideal along-lane motion model is determined analogously, as in formula (10).
  • the obtained multiple first distance relationship values and the corresponding ranking values are substituted into formula (10), and the value of the unknown parameter α_4 is adjusted; when the result of formula (10) is minimal, the value a_4 is taken.
  • the first usable along-lane motion model can then be obtained, as shown in formula (11): q_LK(x_i) = a_4.
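Because the along-lane motion model is a constant, fitting it is especially simple. Under a squared-error criterion (an assumption here; the patent minimizes formula (10)), the optimal α_4 is simply the mean of the observed relation values:

```python
def fit_along_lane(p):
    """Best constant model q_LK(x) = alpha_4 in the least-squares sense:
    the mean of the observed distance relationship values."""
    return sum(p) / len(p)
```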
  • similarly, based on the obtained multiple second distance relationship values, the ideal lane-changing model and the ideal along-lane motion model, the second usable lane-changing model and the second usable along-lane motion model are obtained.
  • the second available lane-changing model can be the following formula (12):
  • the second usable along-lane motion model can be the following formula (13):
  • for the obtained multiple first distance relationship values, the first degree of fit with respect to the first usable lane-changing model and the second degree of fit with respect to the first usable along-lane motion model can be calculated; for the obtained multiple second distance relationship values, the third degree of fit with respect to the second usable lane-changing model and the fourth degree of fit with respect to the second usable along-lane motion model can be calculated.
  • the reciprocal of the smaller of the first degree of fit and the second degree of fit is calculated as the first credibility T_1 of the obtained first distance relationship values.
  • the reciprocal of the smaller of the third degree of fit and the fourth degree of fit is calculated as the second credibility T_2 of the obtained second distance relationship values.
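If each "degree of fit" is read as a residual-style error (smaller meaning a better fit), the credibility of a sensor channel can be sketched as the reciprocal of its better (smaller) error value — an interpretation of the step above, not the patent's exact formula:

```python
def credibility(fit_lane_change, fit_along_lane, eps=1e-9):
    """Reciprocal of the smaller of the two fit values; eps guards against
    division by zero for a perfect fit."""
    return 1.0 / (min(fit_lane_change, fit_along_lane) + eps)
```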
  • Step 206 Calculate multiple fused distance relationship values of multiple first distance relationship values and multiple second distance relationship values based on the first credibility and the second credibility.
  • the target first distance relationship value and the target second distance relationship value obtained in the same detection period can be obtained.
  • the corresponding fusion distance value can be obtained for the detection periods to which the multiple first distance relationship values and the multiple second distance relationship values belong.
  • the first weights W 1 corresponding to the multiple obtained first distance relationship values can be calculated according to the following formula (18).
  • the second weight W 2 corresponding to the obtained multiple second distance relationship values can be calculated according to the following formula (19).
  • the target first distance relationship value and the target second distance relationship value obtained in the same detection period are sequentially acquired.
  • the product of the target first distance relationship value γ_li obtained in the i-th detection period and the first weight W_1 is added to the product of the target second distance relationship value γ_vi obtained in the i-th detection period and the second weight W_2, to obtain the fusion distance relationship value γ_fi corresponding to the i-th detection period, where i is the detection period to which the currently acquired target first distance relationship value and target second distance relationship value belong.
  • the calculation formula can be as in the following formula (20): γ_fi = W_1·γ_li + W_2·γ_vi  (20)
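Assuming the weights of formulas (18) and (19) normalize the two credibilities, i.e. W_1 = T_1/(T_1+T_2) and W_2 = T_2/(T_1+T_2) (an assumed form, since the formula bodies were not extracted), the per-period fusion of formula (20) is a weighted sum:

```python
def fuse(gamma_l, gamma_v, t1, t2):
    """Credibility-weighted fusion of paired per-period relation values:
    gamma_f = W1 * gamma_l + W2 * gamma_v, with W1, W2 derived from T1, T2."""
    w1 = t1 / (t1 + t2)
    w2 = t2 / (t1 + t2)
    return [w1 * l + w2 * v for l, v in zip(gamma_l, gamma_v)]
```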
  • Step 207 Determine whether the target vehicle has a lane change trend based on the multiple fusion distance relationship values.
  • the fusion distance relationship value corresponding to each of the multiple detection periods and the corresponding ranking value are substituted into the above formula (8) to determine the unknown parameters, yielding the third usable lane-changing model.
  • the ranking value corresponding to a fusion distance relationship value can be the rank, among the multiple detection periods, of the detection period in which that fusion distance relationship value was obtained.
  • the ranking value corresponding to each fusion distance relationship value is substituted for x_i in the above formula (8), and the fusion distance relationship value is substituted for p(x_i) in the above formula (8).
  • the fifth degree of fit of the fusion distance relationship values with respect to the third usable lane-changing model can then be calculated as in formula (22), where the ranking value corresponding to each fusion distance relationship value is substituted for x_i in formula (22) and the fusion distance relationship value for p(x_i).
  • N is the number of fusion distance values used to determine whether the target vehicle has a lane change tendency.
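The decision in step 207 can be sketched as: fit the third usable lane-changing model to the fused sequence, score the fit, and compare against the threshold. In the sketch below the model is passed in as a callable over the ranking value, and the "fifth degree of fit" is taken as 1/(1 + SSE) — an assumed metric, since formula (22) was not extracted:

```python
def has_lane_change_trend(fused, model, threshold):
    """model(i) predicts the fused relation value at ranking value i (1-based);
    the fit score is 1 / (1 + sum of squared errors)."""
    sse = sum((p - model(i + 1)) ** 2 for i, p in enumerate(fused))
    fit = 1.0 / (1.0 + sse)
    return fit > threshold
```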
  • the laser point cloud data obtained by lidar technology and the scene images obtained by machine vision technology are combined to comprehensively judge the lane change trend of the target vehicle, which can effectively avoid the inaccurate vehicle lane change trend recognition caused by using lidar technology or machine vision technology alone.
  • an embodiment of the present application also provides a device for identifying a vehicle lane change trend, which can be applied to a vehicle lane change trend recognition device.
  • the device for recognizing the tendency of a vehicle to change lanes includes:
  • the obtaining module 510 is used to: obtain laser point cloud data in which a target vehicle is detected, where the target vehicle is a vehicle traveling in the scene around the own vehicle; obtain, based on the laser point cloud data, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle; obtain a scene image containing the target vehicle; and obtain, based on the scene image, the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
  • specifically, the acquisition functions in the above steps 201-204 and other implicit steps can be realized.
  • the calculation module 520 is configured to calculate the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained. Specifically, the calculation function in step 205 and other implicit steps can be realized.
  • the fusion module 530 is configured to calculate a plurality of fusion distance relationship values of the plurality of first distance relationship values and the plurality of second distance relationship values based on the first credibility and the second credibility . Specifically, the fusion function in step 206 and other implicit steps can be realized.
  • the determination module 540 is configured to determine whether the target vehicle has a lane change tendency based on the multiple fusion distance relationship values. Specifically, the judgment function in step 207 and other implicit steps can be realized.
  • the obtaining module 510 is configured to:
  • acquiring a centerline point set of the lane where the own vehicle is located, where the centerline point set includes the coordinates, in the world coordinate system, of multiple sampling points on the center line of the lane where the own vehicle is located;
  • a first distance relationship value between the centerline of the lane where the own vehicle is located and the target vehicle is obtained.
  • the obtaining module 510 is configured to:
  • the minimum distance between the coordinates, in the world coordinate system, of the sampling points included in the centerline point set and the second coordinates is taken as the first distance between the center line of the lane where the own vehicle is located and the target vehicle;
  • the obtaining module 510 is configured to:
  • the calculation module 520 is configured to:
  • the ideal lane-changing model is used to characterize the time-varying relationship of the distance relationship between other vehicles in the surrounding scene where the own vehicle is located and the center line of the lane where the own vehicle is located.
  • the lane motion model is used to characterize the time-varying relationship of the distance relationship value between the other vehicle and the center line of the lane where the own vehicle is located when the other vehicle is moving along the lane.
  • the calculation module 520 is configured to:
  • the second credibility of the obtained multiple second distance relationship values is obtained.
  • the calculation module 520 is configured to:
  • the obtaining the second credibility of the multiple obtained second distance relationship values based on the third degree of fit and the fourth degree of fit includes:
  • the reciprocal of the smaller value of the third degree of fit and the fourth degree of fit is determined as the second credibility of the obtained multiple second distance relationship values.
  • the obtaining module 510 is configured to:
  • according to the detection period, obtain the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle based on the laser point cloud data;
  • the fusion module 530 is used to:
  • the determination module 540 is configured to:
  • if the fifth degree of fit is greater than a preset degree-of-fit threshold, determine that the target vehicle has a lane change tendency.
  • when the device provided above recognizes a vehicle lane change trend, the division into the above functional modules is merely used as an example for illustration.
  • in practical applications, the above functions can be allocated to different functional modules as needed; that is, the internal structure of the vehicle lane change trend recognition device can be divided into different functional modules to complete all or part of the functions described above.
  • the device for recognizing a lane change trend of a vehicle provided by the foregoing embodiment belongs to the same concept as the embodiment of a method for recognizing a lane change trend of a vehicle. For the specific implementation process, please refer to the method embodiment, which will not be repeated here.
  • the computer program product includes one or more computer instructions, and when the computer program instructions are loaded and executed on a device, the processes or functions described in the embodiments of the present application are generated in whole or in part.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from a website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by the device or a data storage device such as a server or data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a digital video disc (Digital Video Disk, DVD)), or a semiconductor medium (such as a solid-state drive).

Abstract

A method and apparatus for identifying a vehicle lane change trend, belonging to the technical field of autonomous driving and applicable to intelligent vehicles, new energy vehicles and connected vehicles. The method for identifying a vehicle lane change trend includes: acquiring laser point cloud data in which a target vehicle is detected (201), and obtaining, based on the laser point cloud data, a first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle (202). A scene image containing the target vehicle is acquired (203), and a second distance relationship value between the above center line and the target vehicle is obtained based on the scene image (204). A first credibility of the multiple first distance relationship values that have been obtained and a second credibility of the multiple second distance relationship values that have been obtained are calculated (205). Based on the first credibility and the second credibility, multiple fusion distance relationship values of the multiple first distance relationship values and the multiple second distance relationship values are calculated (206). Based on the multiple fusion distance relationship values, it is determined whether the target vehicle has a lane change trend (207). With this method and apparatus, vehicle lane change trend recognition can be made more accurate.

Description

Method and Apparatus for Identifying a Vehicle Lane Change Trend

Technical Field
This application relates to the technical field of autonomous driving, and in particular to a method and apparatus for identifying a vehicle lane change trend.
Background
At present, traffic accidents caused by lane changes are very common on urban roads. Therefore, for autonomous driving, it is very important to be able to accurately identify the lane change trends of vehicles around the own vehicle.
In some autonomous vehicles, when judging the lane change trend of a surrounding target vehicle, lidar technology is usually used to acquire laser point cloud data of the surrounding scene, the contour vertices of the target vehicle are detected, and the lateral distance between the contour vertices and the own vehicle is obtained. The lane change trend of the target vehicle is then judged from the change of this lateral distance over time.
In addition, in some other autonomous vehicles, when judging the lane change trend of a surrounding target vehicle, machine vision technology is used to acquire images of the surrounding scene, the lane lines of the lane where the own vehicle is located and the target vehicle are detected in the scene image, and the distance between the target vehicle and the lane line of the own vehicle's lane is obtained. The lane change trend of the target vehicle is then judged from the change of this distance over time.
In the process of implementing this application, the inventors found that the related art has at least the following problems:
When lidar technology is used to identify the lane change trend of a target vehicle, exhaust gas, raised dust and the like from the target vehicle can make the detection of the target vehicle's contour vertices inaccurate, leading to an inaccurate judgment of the target vehicle's lane change trend. When machine vision technology is used to identify the lane change trend of a target vehicle, there may be no lane lines in intersection areas, making it impossible to identify the lane change trend of the target vehicle.
Summary
The embodiments of this application provide a method and apparatus for identifying a vehicle lane change trend, to solve the problem in the related art that the lane change trend of a vehicle cannot be accurately identified by using lidar technology or machine vision technology alone. The technical solutions are as follows:
In a first aspect, a method for identifying a vehicle lane change trend is provided, the method including:
acquiring laser point cloud data in which a target vehicle is detected, where the target vehicle is a vehicle traveling in the scene around the own vehicle; obtaining, based on the laser point cloud data, a first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle; acquiring a scene image containing the target vehicle; obtaining, based on the scene image, a second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle; calculating a first credibility of the multiple first distance relationship values that have been obtained and a second credibility of the multiple second distance relationship values that have been obtained; calculating, based on the first credibility and the second credibility, multiple fusion distance relationship values of the multiple first distance relationship values and the multiple second distance relationship values; and determining, based on the multiple fusion distance relationship values, whether the target vehicle has a lane change trend.
In the solution shown in the embodiments of this application, the own vehicle can collect laser point cloud data of the surrounding scene through a lidar and obtain, from the laser point cloud data, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle. It can also collect scene images of the surrounding scene through a machine vision device such as a camera and obtain, from the scene images, the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle. Here, a distance relationship value can be a value reflecting the distance relationship between the target vehicle and the center line of the lane where the own vehicle is located, such as the ratio of that distance to the lane width.
Then, for the obtained multiple first distance relationship values and multiple second distance relationship values, the credibility of the first values and the credibility of the second values can be calculated, and the first and second distance relationship values can be fused according to the calculated credibilities to obtain multiple fusion distance relationship values. Finally, the lane change trend of the target vehicle is judged from how the fusion distance relationship values of multiple detection periods change over time.
Through the above solution, the laser point cloud data obtained by lidar technology and the scene images obtained by machine vision technology are combined to comprehensively judge the lane change trend of the target vehicle, which can effectively avoid the problem of inaccurate vehicle lane change trend recognition caused by using lidar technology or machine vision technology alone.
In a possible implementation manner, the obtaining, based on the laser point cloud data, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle includes:
acquiring, based on a high-precision map, a centerline point set of the lane where the own vehicle is located, where the centerline point set includes the coordinates, in the world coordinate system, of multiple sampling points on the center line of the lane where the own vehicle is located; and obtaining, according to the laser point cloud data and the centerline point set, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In the solution shown in the embodiments of this application, since the laser point cloud data does not include lane line information, to obtain the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle, the centerline point set of the lane where the own vehicle is located can first be acquired from a high-precision map, the point set including the coordinates, in the world coordinate system, of multiple sampling points on the center line of that lane. Then, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle can be obtained from the laser point cloud data of the target vehicle and the centerline point set of the lane where the own vehicle is located.
In a possible implementation manner, the obtaining, according to the laser point cloud data and the centerline point set, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle includes:
acquiring, based on the laser point cloud data, first coordinates of the target vehicle in the own-vehicle coordinate system; converting the first coordinates into second coordinates of the target vehicle in the world coordinate system; taking the minimum distance between the coordinates, in the world coordinate system, of the sampling points included in the centerline point set and the second coordinates as the first distance between the center line of the lane where the own vehicle is located and the target vehicle; and acquiring the width of the lane where the own vehicle is located, and calculating a first ratio of the first distance to the width of the lane where the own vehicle is located as the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In the solution shown in the embodiments of this application, target vehicle recognition can be performed on the obtained laser point cloud data to acquire the coordinates of the target vehicle in the lidar coordinate system; these coordinates are then converted into the first coordinates in the own-vehicle coordinate system, and the first coordinates are converted into the second coordinates of the target vehicle in the world coordinate system. The minimum distance between the second coordinates and the coordinates of the sampling points in the centerline point set of the own vehicle's lane in the high-precision map is taken as the first distance between the center line of the lane where the own vehicle is located and the target vehicle; here, the coordinates of the sampling points in the high-precision map are coordinates in the world coordinate system. Finally, the width of the lane where the own vehicle is located can be obtained from the high-precision map, and the first ratio of the first distance to this width is calculated as the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In a possible implementation manner, the obtaining, based on the scene image, the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle includes:
calculating, in the image coordinate system of the scene image, the perpendicular distance between the target vehicle and the center line of the lane where the own vehicle is located; calculating the width of the lane where the own vehicle is located in the image coordinate system of the scene image; and calculating a second ratio of the perpendicular distance to the width of the lane where the own vehicle is located in the image coordinate system of the scene image, as the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In the solution shown in the embodiments of this application, lane line recognition and target vehicle recognition can be performed on the scene image. During target vehicle recognition, a bounding box corresponding to the target vehicle can be generated, and the coordinates, in the image coordinate system, of the two vertices of the bounding box that are close to the own vehicle and in contact with the ground can be acquired. The perpendicular distances from these two vertices to the center line of the own vehicle's lane, and the widths of the lane in the image coordinate system at the two vertices, are calculated respectively; the distance relationship values of the two vertices with respect to the center line of the own vehicle's lane are then calculated, and the smaller of the two is taken as the second distance relationship value between the target vehicle and the center line of the lane where the own vehicle is located.
In a possible implementation manner, the calculating a first credibility of the multiple first distance relationship values that have been obtained and a second credibility of the multiple second distance relationship values that have been obtained includes:
calculating, based on an ideal lane-changing model and an ideal along-lane motion model, the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained.
In the solution shown in the embodiments of this application, the ideal lane-changing model characterizes how the distance relationship value between another vehicle in the surrounding scene of the own vehicle and the center line of the lane where the own vehicle is located changes over time when that vehicle is changing lanes. The ideal along-lane motion model characterizes how the distance relationship value changes over time when the other vehicle is moving along the lane. From how well the change over time of the multiple first distance relationship values fits the ideal lane-changing model and the ideal along-lane motion model, the credibility of the first distance relationship values can be obtained; the credibility of the second distance relationship values can be obtained in the same way.
In a possible implementation manner, the calculating, based on an ideal lane-changing model and an ideal along-lane motion model, the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained includes:
calculating, based on the obtained multiple first distance relationship values, the values of the unknown parameters in the ideal lane-changing model to obtain a first usable lane-changing model; calculating, based on the obtained multiple first distance relationship values, the values of the unknown parameters in the ideal along-lane motion model to obtain a first usable along-lane motion model; calculating a first degree of fit of the obtained multiple first distance relationship values with respect to the first usable lane-changing model, and a second degree of fit with respect to the first usable along-lane motion model; obtaining, based on the first degree of fit and the second degree of fit, the first credibility of the obtained multiple first distance relationship values; calculating, based on the obtained multiple second distance relationship values, the values of the unknown parameters in the ideal lane-changing model to obtain a second usable lane-changing model; calculating, based on the obtained multiple second distance relationship values, the values of the unknown parameters in the ideal along-lane motion model to obtain a second usable along-lane motion model; calculating a third degree of fit of the obtained multiple second distance relationship values with respect to the second usable lane-changing model, and a fourth degree of fit with respect to the second usable along-lane motion model; and obtaining, based on the third degree of fit and the fourth degree of fit, the second credibility of the obtained multiple second distance relationship values.
In the solution shown in the embodiments of this application, the K-L divergence between the obtained multiple first distance relationship values and the ideal lane-changing model can be calculated, and the values of the unknown parameters in the ideal lane-changing model adjusted; when the K-L divergence is minimal, the first usable lane-changing model is obtained. The K-L divergence between the obtained multiple first distance relationship values and the ideal along-lane motion model is calculated, and the values of the unknown parameters in the ideal along-lane motion model adjusted; when the K-L divergence is minimal, the first usable along-lane motion model is obtained. Then, the first degree of fit of the obtained first distance relationship values with respect to the first usable lane-changing model, and the second degree of fit with respect to the first usable along-lane motion model, can be calculated. When at least one of the first and second degrees of fit is large, the obtained multiple first distance relationship values have high credibility.
Similarly, the third degree of fit of the obtained multiple second distance relationship values with respect to the second usable lane-changing model, and the fourth degree of fit with respect to the second usable along-lane motion model, can be obtained. When at least one of the third and fourth degrees of fit is large, the obtained multiple second distance relationship values have high credibility.
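The K-L divergence used above to tune the unknown parameters compares the observed sequence of relation values with the model's predictions. A sketch treating both sequences as discrete distributions after normalization (the exact discretization is not specified in the extracted text):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) for two positive sequences, each normalised to sum to 1;
    eps avoids log(0) for vanishing entries."""
    sp, sq = sum(p), sum(q)
    return sum((pi / sp) * math.log((pi / sp + eps) / (qi / sq + eps))
               for pi, qi in zip(p, q))
```

The divergence is zero when the model reproduces the observed sequence exactly and grows as the two diverge, which is why the parameters minimizing it give the "usable" model.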
In a possible implementation manner, the obtaining, based on the first degree of fit and the second degree of fit, the first credibility of the obtained multiple first distance relationship values includes:
determining the reciprocal of the smaller of the first degree of fit and the second degree of fit as the first credibility of the obtained multiple first distance relationship values;
and the obtaining, based on the third degree of fit and the fourth degree of fit, the second credibility of the obtained multiple second distance relationship values includes:
determining the reciprocal of the smaller of the third degree of fit and the fourth degree of fit as the second credibility of the obtained multiple second distance relationship values.
In a possible implementation manner, the obtaining, based on the laser point cloud data, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle includes:
obtaining, according to a detection period and based on the laser point cloud data, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle;
the obtaining, based on the scene image, the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle includes:
obtaining, according to the detection period and based on the scene image, the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle;
and the calculating, based on the first credibility and the second credibility, multiple fusion distance relationship values of the multiple first distance relationship values and the multiple second distance relationship values includes:
calculating, based on the first credibility and the second credibility, a first weight corresponding to the multiple first distance relationship values and a second weight corresponding to the multiple second distance relationship values; acquiring, from the multiple first distance relationship values and the multiple second distance relationship values, a target first distance relationship value and a target second distance relationship value obtained in the same detection period; and adding the product of the target first distance relationship value and the first weight to the product of the target second distance relationship value and the second weight, to obtain the fusion distance relationship value corresponding to the detection period to which the target first distance relationship value and the target second distance relationship value belong.
In the solution shown in the embodiments of this application, the first distance relationship value and the second distance relationship value can be acquired according to the same detection period. When calculating the fusion distance relationship values, the first and second distance relationship values obtained in each of the multiple detection periods can be computed separately; that is, for each of the multiple detection periods, a corresponding fusion distance relationship value can be calculated.
In a possible implementation manner, the determining, based on the multiple fusion distance relationship values, whether the target vehicle has a lane change trend includes:
calculating, based on the fusion distance relationship value corresponding to each of the multiple detection periods, the values of the unknown parameters in the ideal lane-changing model to obtain a third usable lane-changing model; calculating a fifth degree of fit of the fusion distance relationship values corresponding to the multiple detection periods with respect to the third usable lane-changing model; and if the fifth degree of fit is greater than a preset degree-of-fit threshold, determining that the target vehicle has a lane change trend.
In the solution shown in the embodiments of this application, if the degree of fit of the fusion distance relationship values corresponding to the multiple detection periods with respect to the third usable lane-changing model is greater than the preset degree-of-fit threshold, it can be considered that the target vehicle has a lane change trend, so that the own vehicle can be controlled in advance, for example to decelerate with respect to the target vehicle.
In a second aspect, an apparatus for identifying a vehicle lane change trend is provided, the apparatus including:
an obtaining module, configured to obtain laser point cloud data in which a target vehicle is detected, where the target vehicle is a vehicle traveling in the scene around the own vehicle; obtain, based on the laser point cloud data, a first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle; obtain a scene image containing the target vehicle; and obtain, based on the scene image, a second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle;
a calculation module, configured to calculate a first credibility of the multiple first distance relationship values that have been obtained and a second credibility of the multiple second distance relationship values that have been obtained;
a fusion module, configured to calculate, based on the first credibility and the second credibility, multiple fusion distance relationship values of the multiple first distance relationship values and the multiple second distance relationship values; and
a determination module, configured to determine, based on the multiple fusion distance relationship values, whether the target vehicle has a lane change trend.
In a possible implementation manner, the obtaining module is configured to:
acquire, based on a high-precision map, a centerline point set of the lane where the own vehicle is located, where the centerline point set includes the coordinates, in the world coordinate system, of multiple sampling points on the center line of the lane where the own vehicle is located; and
obtain, according to the laser point cloud data and the centerline point set, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In a possible implementation manner, the obtaining module is configured to:
acquire, based on the laser point cloud data, first coordinates of the target vehicle in the own-vehicle coordinate system;
convert the first coordinates into second coordinates of the target vehicle in the world coordinate system;
take the minimum distance between the coordinates, in the world coordinate system, of the sampling points included in the centerline point set and the second coordinates as the first distance between the center line of the lane where the own vehicle is located and the target vehicle; and
acquire the width of the lane where the own vehicle is located, and calculate a first ratio of the first distance to the width of the lane where the own vehicle is located as the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In a possible implementation manner, the obtaining module is configured to:
calculate, in the image coordinate system of the scene image, the perpendicular distance between the target vehicle and the center line of the lane where the own vehicle is located;
calculate the width of the lane where the own vehicle is located in the image coordinate system of the scene image; and
calculate a second ratio of the perpendicular distance to the width of the lane where the own vehicle is located in the image coordinate system of the scene image, as the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In a possible implementation manner, the calculation module is configured to:
calculate, based on an ideal lane-changing model and an ideal along-lane motion model, the first credibility of the multiple first distance relationship values that have been obtained and the second credibility of the multiple second distance relationship values that have been obtained, where the ideal lane-changing model is used to characterize how the distance relationship value between another vehicle in the surrounding scene of the own vehicle and the center line of the lane where the own vehicle is located changes over time when the other vehicle is changing lanes, and the ideal along-lane motion model is used to characterize how the distance relationship value between the other vehicle and the center line of the lane where the own vehicle is located changes over time when the other vehicle is moving along the lane.
In a possible implementation manner, the calculation module is configured to:
calculate, based on the obtained multiple first distance relationship values, the values of the unknown parameters in the ideal lane-changing model to obtain a first usable lane-changing model;
calculate, based on the obtained multiple first distance relationship values, the values of the unknown parameters in the ideal along-lane motion model to obtain a first usable along-lane motion model;
calculate a first degree of fit of the obtained multiple first distance relationship values with respect to the first usable lane-changing model, and a second degree of fit with respect to the first usable along-lane motion model;
obtain, based on the first degree of fit and the second degree of fit, the first credibility of the obtained multiple first distance relationship values;
calculate, based on the obtained multiple second distance relationship values, the values of the unknown parameters in the ideal lane-changing model to obtain a second usable lane-changing model;
calculate, based on the obtained multiple second distance relationship values, the values of the unknown parameters in the ideal along-lane motion model to obtain a second usable along-lane motion model;
calculate a third degree of fit of the obtained multiple second distance relationship values with respect to the second usable lane-changing model, and a fourth degree of fit with respect to the second usable along-lane motion model; and
obtain, based on the third degree of fit and the fourth degree of fit, the second credibility of the obtained multiple second distance relationship values.
In a possible implementation manner, the calculation module is configured to:
determine the reciprocal of the smaller of the first degree of fit and the second degree of fit as the first credibility of the obtained multiple first distance relationship values;
and the obtaining, based on the third degree of fit and the fourth degree of fit, the second credibility of the obtained multiple second distance relationship values includes:
determining the reciprocal of the smaller of the third degree of fit and the fourth degree of fit as the second credibility of the obtained multiple second distance relationship values.
In a possible implementation manner, the obtaining module is configured to:
obtain, according to a detection period and based on the laser point cloud data, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle; and
obtain, according to the detection period and based on the scene image, the second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle;
the fusion module is configured to:
calculate, based on the first credibility and the second credibility, a first weight corresponding to the multiple first distance relationship values and a second weight corresponding to the multiple second distance relationship values;
acquire, from the multiple first distance relationship values and the multiple second distance relationship values, a target first distance relationship value and a target second distance relationship value obtained in the same detection period; and
add the product of the target first distance relationship value and the first weight to the product of the target second distance relationship value and the second weight, to obtain the fusion distance relationship value corresponding to the detection period to which the target first distance relationship value and the target second distance relationship value belong.
In a possible implementation manner, the determination module is configured to:
calculate, based on the fusion distance relationship value corresponding to each of the multiple detection periods, the values of the unknown parameters in the ideal lane-changing model to obtain a third usable lane-changing model;
calculate a fifth degree of fit of the fusion distance relationship values corresponding to the multiple detection periods with respect to the third usable lane-changing model; and
if the fifth degree of fit is greater than a preset degree-of-fit threshold, determine that the target vehicle has a lane change trend.
In a third aspect, a vehicle lane change trend recognition device is provided, including a processor and a memory, where the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the operations performed by the method for identifying a vehicle lane change trend according to the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, storing at least one instruction, where the instruction is loaded and executed by a processor to implement the operations performed by the method for identifying a vehicle lane change trend according to the first aspect.
In a fifth aspect, a computer program product containing instructions is provided; when the computer program product runs on a vehicle lane change trend recognition device, the device is caused to execute the method for identifying a vehicle lane change trend according to the first aspect.
The beneficial effects brought by the technical solutions provided in the embodiments of this application are as follows:
In the solution shown in the embodiments of this application, the first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle is obtained from laser point cloud data, the second distance relationship value between that center line and the target vehicle is obtained from a scene image, and the first and second distance relationship values are fused to obtain fusion distance relationship values. Finally, the lane change trend of the target vehicle is judged from the fusion distance relationship values. It can be seen that this solution combines laser point cloud data obtained by lidar technology with scene images obtained by machine vision technology to comprehensively judge the lane change trend of the target vehicle, which can effectively avoid the problem of inaccurate vehicle lane change trend recognition caused by using lidar technology or machine vision technology alone.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of a vehicle lane change trend recognition device provided by an embodiment of this application;
FIG. 2 is a flowchart of a method for identifying a vehicle lane change trend provided by an embodiment of this application;
FIG. 3 is a schematic diagram of an image coordinate system of a scene image provided by an embodiment of this application;
FIG. 4 is a schematic diagram of an image coordinate system of a scene image provided by an embodiment of this application;
FIG. 5 is a schematic structural diagram of an apparatus for identifying a vehicle lane change trend provided by an embodiment of this application.
Detailed Description
The embodiments of this application provide a method for identifying a vehicle lane change trend, which can be applied to autonomous vehicles. The method can be implemented by a vehicle lane change trend recognition device in the autonomous vehicle. A perception system, a positioning system and the like can be deployed in the autonomous vehicle, where the perception system may include a lidar, a camera and the like, and the positioning system may be the Global Positioning System (GPS), the BeiDou system or the like.
FIG. 1 is a schematic diagram of a vehicle lane change trend recognition device 100 provided by an embodiment of this application. In FIG. 1, the vehicle lane change trend recognition device may include a processor 101 and a memory 102. The processor 101 may be a central processing unit (CPU). The processor 101 may refer to one processor, or may include multiple processors. The memory 102 may include a volatile memory, such as a random access memory (RAM); the memory may also include a non-volatile memory, such as a read-only memory (ROM) or a flash memory; the memory may also include a combination of the above kinds of memory. The memory 102 may refer to one memory, or may include multiple memories. The memory 102 stores computer-readable instructions, and the computer-readable instructions can be executed by the processor 101 to implement the method for identifying a vehicle lane change trend provided by the embodiments of this application.
FIG. 2 is a flowchart of a method for identifying a vehicle lane change trend provided by an embodiment of this application; the flow of the method may include the following steps:
Step 201: Acquire laser point cloud data in which a target vehicle is detected.
The target vehicle is a vehicle traveling in the scene around the own vehicle.
In implementation, the own vehicle may be equipped with a lidar, which scans the surrounding scene at a fixed frequency to obtain laser point cloud data. When acquiring laser point cloud data in which the target vehicle is detected, it is not necessary to acquire every such frame; instead, the data may be acquired according to a detection period. Within one detection period, the lidar may collect one or more frames of laser point cloud data in which the target vehicle is detected; in each detection period, the last such frame acquired in the current detection period can be obtained.
Step 202: Obtain, based on the laser point cloud data, a first distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In implementation, the target vehicle can be identified from the acquired laser point cloud data, and the coordinates, in the lidar coordinate system, of the laser points corresponding to the contour vertices of the target vehicle can be acquired. Then, according to the pre-calibrated transformation matrix from the lidar coordinate system to the own-vehicle coordinate system, the coordinates of the contour vertices of the target vehicle in the lidar coordinate system are converted into the own-vehicle coordinate system to obtain the first coordinates of the target vehicle in the own-vehicle coordinate system. The acquired coordinates of the target vehicle in the lidar coordinate system may be the coordinates of the contour vertices of the target vehicle in the lidar coordinate system; correspondingly, the converted first coordinates in the own-vehicle coordinate system may include the coordinates of the contour vertices of the target vehicle in the own-vehicle coordinate system. Depending on the shape of the target vehicle, the number of corresponding contour vertices may also differ.
Then, the first coordinates of the target vehicle in the own-vehicle coordinate system can be converted into the world coordinate system to obtain the second coordinates of the target vehicle in the world coordinate system; that is, the coordinates of the contour vertices of the target vehicle in the own-vehicle coordinate system are converted into the world coordinate system to obtain the coordinates of the contour vertices of the target vehicle in the world coordinate system. For each contour vertex of the target vehicle, the method of converting its coordinates from the own-vehicle coordinate system to the world coordinate system can be as follows:
Align the timestamps of the lidar and the positioning system, and obtain through the positioning system the coordinates (x_0, y_0) and heading angle θ_0 of the own vehicle in the world coordinate system. Then, convert the first coordinates of the target vehicle in the own-vehicle coordinate system according to the following formula (1):
x_i′ = x_i·cosθ_0 − y_i·sinθ_0 + x_0, y_i′ = x_i·sinθ_0 + y_i·cosθ_0 + y_0  (1)
where x_i, y_i are the abscissa and ordinate of the i-th contour vertex of the target vehicle in the own-vehicle coordinate system, and x_i′, y_i′ are the abscissa and ordinate of the i-th contour vertex of the target vehicle in the world coordinate system. The rotation matrix from the own-vehicle coordinate system to the world coordinate system can be expressed as:
R = [cosθ_0, −sinθ_0; sinθ_0, cosθ_0]
After the second coordinates of the target vehicle in the world coordinate system are obtained, the centerline point set of the lane where the own vehicle is located can be acquired from a high-precision map. The acquired centerline point set may be a centerline point set satisfying a preset distance condition with respect to the own vehicle, for example, the centerline point set of the lane where the own vehicle is located within 100 meters of the own vehicle.
Then, the distances between the second coordinates of the target vehicle in the world coordinate system and the coordinates of the sampling points in the acquired centerline point set can be calculated, and the minimum distance among them selected as the distance between the target vehicle and the center line of the lane where the own vehicle is located. That is, the distances between the coordinates of each contour vertex of the target vehicle in the world coordinate system and the coordinates of each sampling point in the centerline point set are calculated, and the minimum distance is taken as the distance between the target vehicle and the center line of the lane where the own vehicle is located. Here, this distance can be denoted as d_1.
Then, the ratio of the distance d_1 between the target vehicle and the center line of the lane where the own vehicle is located to the width D_1 of the lane where the own vehicle is located can be calculated as the first distance relationship value γ_l between the target vehicle and the center line of the lane where the own vehicle is located, that is:
γ_l = d_1 / D_1
Here, the width D_1 of the lane where the own vehicle is located can be obtained from the high-precision map.
It should also be noted that the distance relationship value is used to represent the distance relationship between another vehicle and the center line of the lane where the own vehicle is located; besides the ratio of that distance to the width of the lane where the own vehicle is located, the reciprocal of the ratio can also be used.
Step 203: Acquire a scene image containing the target vehicle.
In implementation, the own vehicle may be equipped with a machine vision device, such as a camera, which photographs the surrounding scene at a fixed frequency to obtain scene images. When acquiring scene images containing the target vehicle, it is not necessary to acquire every such frame; instead, the scene images containing the target vehicle may be acquired according to the detection period. Within one detection period, the machine vision device may acquire one or more frames of scene images containing the target vehicle; in each detection period, the last such frame acquired in the current detection period can be obtained.
Step 204: Obtain, based on the scene image, a second distance relationship value between the center line of the lane where the own vehicle is located and the target vehicle.
In implementation, lane line recognition and target vehicle recognition can be performed on the acquired scene image containing the target vehicle. During target vehicle recognition, a bounding box corresponding to the target vehicle can be generated, and the coordinates, in the image coordinate system, of the two vertices of the bounding box that are close to the own vehicle and in contact with the ground can be acquired. As shown in FIG. 3, the image coordinate system can take the upper-left corner of the image as the origin. The coordinates of vertex 1 of the bounding box, close to the own vehicle and in contact with the ground, can be denoted as (x_v1, y_v1), and the coordinates of vertex 2, also close to the own vehicle and in contact with the ground, as (x_v2, y_v2).
In the image coordinate system shown in FIG. 3, the point A_1 on the left lane line of the lane where the own vehicle is located with the same ordinate as vertex 1, and the point B_1 on the right lane line with the same ordinate as vertex 1, are acquired, where the coordinates of A_1 are (x_l1, y_v1) and the coordinates of B_1 are (x_r1, y_v1). Similarly, the point A_2 on the left lane line with the same ordinate as vertex 2 and the point B_2 on the right lane line with the same ordinate as vertex 2 are acquired, where the coordinates of A_2 are (x_l2, y_v2) and the coordinates of B_2 are (x_r2, y_v2). The positional relationship of vertex 1, A_1, B_1, vertex 2, A_2 and B_2 may be as shown in FIG. 4; the image coordinate system in FIG. 4 is the same as that in FIG. 3. For vertex 1 and vertex 2, the distance relationship value between each vertex and the center line of the lane where the own vehicle is located can be calculated, and the smaller of the two values is taken as the second distance relationship value between the target vehicle and the center line of the lane where the own vehicle is located.
The distance relationship value between vertex 1 and the ego-lane centerline can be computed as follows:
First, obtain the point C_1 on the ego-lane centerline corresponding to vertex 1, with coordinates (x_c1, y_v1), where x_c1 can be computed by formula (2):

$$x_{c1} = \frac{x_{l1} + x_{r1}}{2} \qquad (2)$$

Then the distance relationship value between vertex 1 and the ego-lane centerline can be computed by formula (3):

$$\gamma_{v1} = \frac{|x_{v1} - x_{c1}|}{x_{r1} - x_{l1}} \qquad (3)$$

The distance relationship value between vertex 2 and the ego-lane centerline can be computed as follows:
First, obtain the point C_2 on the ego-lane centerline corresponding to vertex 2, with coordinates (x_c2, y_v2), where x_c2 can be computed by formula (4):

$$x_{c2} = \frac{x_{l2} + x_{r2}}{2} \qquad (4)$$

Then the distance relationship value between vertex 2 and the ego-lane centerline can be computed by formula (5):

$$\gamma_{v2} = \frac{|x_{v2} - x_{c2}|}{x_{r2} - x_{l2}} \qquad (5)$$

Finally, the smaller of γ_v1 and γ_v2 is taken as the second distance relationship value between the target vehicle and the centerline of the ego lane.
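The per-vertex image-space computation above (formulas (2)-(5)) can be sketched as follows; the vertex and lane-line inputs are assumed to come from the bounding-box and lane-line detectors, and the argument layout is illustrative:

```python
def second_distance_relationship(ground_vertices, lane_line_xs):
    """For each ground-contact bounding-box vertex (x_v, y_v),
    lane_line_xs supplies (x_l, x_r): the left/right lane-line
    x-coordinates at the same image row y_v.  The centerline point is
    x_c = (x_l + x_r) / 2 (formulas (2)/(4)); the per-vertex ratio is
    |x_v - x_c| / (x_r - x_l) (formulas (3)/(5)).  The smaller ratio is
    returned as the second distance relationship value."""
    ratios = []
    for (xv, _), (xl, xr) in zip(ground_vertices, lane_line_xs):
        xc = (xl + xr) / 2
        ratios.append(abs(xv - xc) / (xr - xl))
    return min(ratios)
```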
Step 205: compute a first credibility of the multiple first distance relationship values already obtained and a second credibility of the multiple second distance relationship values already obtained.
In implementation, the first credibility of the first distance relationship values and the second credibility of the second distance relationship values may be computed once multiple first distance relationship values and multiple second distance relationship values have been obtained. Where the first and second distance relationship values are acquired on the same detection period, the credibility computation can begin after M detection periods, and the credibilities can be re-computed in each subsequent detection period. Here, M is a preset positive integer that can be set according to actual needs, for example 10 or 15.
When computing the first credibility of the multiple first distance relationship values already obtained and the second credibility of the multiple second distance relationship values already obtained, an ideal lane-change model and an ideal lane-keeping (along-lane motion) model can be used. The ideal lane-change model characterizes how the distance relationship value between another vehicle in the scene around the ego vehicle and the ego-lane centerline varies over time when that vehicle changes lanes; the ideal lane-keeping model characterizes how the distance relationship value between that vehicle and the ego-lane centerline varies over time when the vehicle moves along its lane. The computation of the two credibilities based on these models is described below:
First, from the multiple first distance relationship values already obtained, the ideal lane-change model and the ideal lane-keeping model, a first usable lane-change model and a first usable lane-keeping model are obtained. From the multiple second distance relationship values already obtained, the ideal lane-change model and the ideal lane-keeping model, a second usable lane-change model and a second usable lane-keeping model are obtained.
It should be noted that the multiple first and second distance relationship values already obtained may be those obtained in a first preset number of consecutive detection periods including the current one; this first preset number can be set according to actual needs, for example equal to the value of M above, such as 10 or 15.
The second usable lane-change model is obtained in the same way as the first usable lane-change model, and the second usable lane-keeping model in the same way as the first usable lane-keeping model. The following description takes the first usable lane-change model and the first usable lane-keeping model as an example.
The ideal lane-change model can be expressed as relation (6):

q_LC(x_i) = f(x_i; α_1, α_2, α_3)   (6)

(the original expression of relation (6) was published as an image; its exact functional form is not reproduced here, but it expresses q_LC(x_i) as a function of x_i with three unknown parameters)

where q_LC(x_i) is the predicted distance relationship value, and α_1, α_2 and α_3 are unknown parameters to be determined.
The ideal lane-keeping model can be expressed as relation (7):

q_LK(x_i) = α_4   (7)

where α_4 is an unknown parameter to be determined and q_LK(x_i) is the predicted distance relationship value.
The unknown parameters α_1, α_2 and α_3 of the ideal lane-change model can be determined from the multiple first distance relationship values already obtained by minimizing formula (8):

$$\sum_{i=1}^{N}\bigl(q_{LC}(x_i) - p(x_i)\bigr)^2 \qquad (8)$$

where N is the number of first distance relationship values already obtained; x_i is the rank value of the i-th obtained first distance relationship value (for example, x_1 is the rank value of the first obtained first distance relationship value, x_1 = 1, and so on); and p(x_i) is the i-th obtained first distance relationship value. The obtained first distance relationship values and their corresponding rank values are substituted into formula (8), and the unknown parameters α_1, α_2 and α_3 are adjusted; when the summation in formula (8) is minimal, the values of α_1, α_2 and α_3 at that point — say α_1 = a_1, α_2 = a_2, α_3 = a_3 — are substituted into formula (6), giving the first usable lane-change model, formula (9):

q_LC1(x_i) = f(x_i; a_1, a_2, a_3)   (9)

The unknown parameter α_4 of the ideal lane-keeping model is determined from the multiple first distance relationship values by minimizing formula (10):

$$\sum_{i=1}^{N}\bigl(q_{LK}(x_i) - p(x_i)\bigr)^2 = \sum_{i=1}^{N}\bigl(\alpha_4 - p(x_i)\bigr)^2 \qquad (10)$$

The obtained first distance relationship values and their corresponding rank values are substituted into formula (10) and the unknown parameter α_4 is adjusted; when the summation in formula (10) is minimal, the value of α_4 at that point — say α_4 = a_4 — is substituted into formula (7), giving the first usable lane-keeping model, formula (11):

q_LK1(x_i) = a_4   (11)
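For the lane-keeping model, the least-squares problem in formula (10) has a closed form: the sum of squared residuals Σ(α_4 − p(x_i))² is minimized by the arithmetic mean of the observed values, which becomes the fitted constant a_4 of formula (11). A short sketch (function name illustrative):

```python
def fit_lane_keeping(p_values):
    """Minimise formula (10): the sum over i of (alpha_4 - p(x_i))^2.
    Setting the derivative with respect to alpha_4 to zero gives
    alpha_4 = mean(p), i.e. the fitted constant a_4 of formula (11)."""
    return sum(p_values) / len(p_values)
```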
Similarly, from the multiple second distance relationship values already obtained, the ideal lane-change model and the ideal lane-keeping model, the second usable lane-change model and the second usable lane-keeping model are obtained. The second usable lane-change model can be written as formula (12):

q_LC2(x_i) = f(x_i; b_1, b_2, b_3)   (12)

and the second usable lane-keeping model as formula (13):

q_LK2(x_i) = b_4   (13)
Then, having obtained the first usable lane-change model, the first usable lane-keeping model, the second usable lane-change model and the second usable lane-keeping model, the first fit degree of the multiple first distance relationship values with respect to the first usable lane-change model and their second fit degree with respect to the first usable lane-keeping model can be computed, as well as the third fit degree of the multiple second distance relationship values with respect to the second usable lane-change model and their fourth fit degree with respect to the second usable lane-keeping model.
The first fit degree of the multiple first distance relationship values with respect to the first usable lane-change model can be computed using formula (14):

$$D_{LC1} = \sum_{i=1}^{N}\bigl(q_{LC1}(x_i) - p(x_i)\bigr)^2 \qquad (14)$$

The second fit degree of the multiple first distance relationship values with respect to the first usable lane-keeping model can be computed using formula (15):

$$D_{LK1} = \sum_{i=1}^{N}\bigl(q_{LK1}(x_i) - p(x_i)\bigr)^2 \qquad (15)$$

The third fit degree of the multiple second distance relationship values with respect to the second usable lane-change model can be computed using formula (16):

$$D_{LC2} = \sum_{i=1}^{N}\bigl(q_{LC2}(x_i) - p(x_i)\bigr)^2 \qquad (16)$$

The fourth fit degree of the multiple second distance relationship values with respect to the second usable lane-keeping model can be computed using formula (17):

$$D_{LK2} = \sum_{i=1}^{N}\bigl(q_{LK2}(x_i) - p(x_i)\bigr)^2 \qquad (17)$$

(in formulas (16) and (17), p(x_i) denotes the i-th obtained second distance relationship value)
Finally, the reciprocal of the smaller of the first and second fit degrees is computed as the first credibility T_1 of the first distance relationship values already obtained, and the reciprocal of the smaller of the third and fourth fit degrees is computed as the second credibility T_2 of the second distance relationship values already obtained.
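Reading the fit degrees (14)-(17) as residual sums of squares (an assumption, since the published formula images are not reproduced), the credibility computation can be sketched as:

```python
def fit_degree(model_values, observed):
    """Residual sum of squares between a usable model's predictions and
    the observed distance relationship values -- one plausible reading
    of formulas (14)-(17)."""
    return sum((q - p) ** 2 for q, p in zip(model_values, observed))

def credibility(fit_lane_change, fit_lane_keeping):
    """Credibility = reciprocal of the smaller fit degree: the better
    either model explains the observations, the more that sensor
    channel is trusted."""
    return 1.0 / min(fit_lane_change, fit_lane_keeping)
```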
Step 206: based on the first credibility and the second credibility, compute multiple fused distance relationship values of the multiple first distance relationship values and the multiple second distance relationship values.
In implementation, for the multiple first and second distance relationship values already obtained, a target first distance relationship value and a target second distance relationship value obtained in the same detection period can be taken. The product of the target first distance relationship value and the first weight is added to the product of the target second distance relationship value and the second weight, yielding the fused distance relationship value for the detection period to which the target first and target second distance relationship values belong. In this way, a fused distance relationship value can be obtained for each detection period to which the obtained first and second distance relationship values belong.
The first weight W_1 corresponding to the multiple first distance relationship values already obtained can be computed by formula (18):

$$W_1 = \frac{T_1}{T_1 + T_2} \qquad (18)$$

The second weight W_2 corresponding to the multiple second distance relationship values already obtained can be computed by formula (19):

$$W_2 = \frac{T_2}{T_1 + T_2} \qquad (19)$$

Then, from the multiple first and second distance relationship values already obtained, the target first and target second distance relationship values of the same detection period are taken in turn. The product of the target first distance relationship value γ_li obtained in the i-th detection period and the first weight W_1 is added to the product of the target second distance relationship value γ_vi obtained in that detection period and the second weight W_2, yielding the fused distance relationship value γ_fi for the i-th detection period, where i is the rank of the detection period to which the current target values belong among the detection periods of the obtained first and second distance relationship values. The computation is given by formula (20):

γ_fi = W_1·γ_li + W_2·γ_vi   (20)
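Formulas (18)-(20) combine the per-period lidar and vision values with credibility-proportional weights; a minimal sketch (identifiers are illustrative):

```python
def fuse(gamma_l_series, gamma_v_series, t1, t2):
    """W1 = T1/(T1+T2), W2 = T2/(T1+T2) (formulas (18)-(19));
    per detection period i:
    gamma_f_i = W1 * gamma_l_i + W2 * gamma_v_i (formula (20))."""
    w1 = t1 / (t1 + t2)
    w2 = t2 / (t1 + t2)
    return [w1 * gl + w2 * gv
            for gl, gv in zip(gamma_l_series, gamma_v_series)]
```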
Step 207: based on the multiple fused distance relationship values, judge whether the target vehicle has a lane-change trend.
In implementation, the fused distance relationship value corresponding to each of the multiple detection periods and its corresponding rank value are substituted into formula (8). The rank value of a fused distance relationship value can be taken as the rank, among the multiple detection periods, of the detection period in which that fused value was computed; the rank values are substituted for x_i in formula (8) and the fused distance relationship values for p(x_i). The unknown parameters α_1, α_2 and α_3 are then adjusted; when the summation in formula (8) is minimal, the values of α_1, α_2 and α_3 at that point — say α_1 = b_1, α_2 = b_2, α_3 = b_3 — are substituted into formula (6), giving the third usable lane-change model, formula (21):

q_LC3(x_i) = f(x_i; b_1, b_2, b_3)   (21)

Next, the fit degree of the fused distance relationship values of the multiple detection periods with respect to the third usable lane-change model can be computed by formula (22), with the rank values substituted for x_i in formula (22) and the fused distance relationship values for p(x_i):

$$D_{LC3} = \sum_{i=1}^{N}\bigl(q_{LC3}(x_i) - p(x_i)\bigr)^2 \qquad (22)$$

where N is the number of fused distance relationship values used to judge whether the target vehicle has a lane-change trend.
Finally, it is determined whether the computed fit degree D_LC3 is greater than a preset fit degree; if the computed fit degree D_LC3 is greater than the preset fit degree, it can be determined that the target vehicle has a lane-change trend.
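The final decision can be sketched as below. Since the images of formulas (21)-(22) are not reproduced in the text, the fit degree here is scored as 1/(1+RSS), a hypothetical goodness measure chosen so that a score above the preset threshold means the fused values follow the fitted lane-change model; the fitted model is passed in as a callable:

```python
def has_lane_change_trend(fused_values, fitted_model, threshold):
    """Score the fused distance relationship values against the third
    usable lane-change model and flag a lane-change trend when the
    score exceeds the preset fit-degree threshold.  The 1/(1+RSS)
    scoring is an illustrative assumption, not the patent's exact
    formula (22)."""
    rss = sum((fitted_model(i + 1) - p) ** 2
              for i, p in enumerate(fused_values))
    return 1.0 / (1.0 + rss) > threshold
```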
In this embodiment of the present application, the laser point cloud data obtained by lidar technology and the scene images obtained by machine-vision technology are combined to jointly judge the lane-change trend of the target vehicle, which effectively avoids the inaccurate recognition of vehicle lane-change trends that can result from using lidar technology or machine-vision technology alone.
Based on the same technical concept, an embodiment of the present application further provides an apparatus for recognizing a vehicle lane-change trend, which can be applied in a vehicle lane-change-trend recognition device. As shown in FIG. 5, the apparatus for recognizing a vehicle lane-change trend includes:
an acquisition module 510, configured to acquire laser point cloud data in which a target vehicle is detected, where the target vehicle is a vehicle travelling in the scene around the ego vehicle; obtain, based on the laser point cloud data, a first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle; acquire a scene image containing the target vehicle; and obtain, based on the scene image, a second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle. It can specifically implement the acquisition functions of steps 201-204 above, as well as other implicit steps.
a computation module 520, configured to compute a first credibility of the multiple first distance relationship values already obtained and a second credibility of the multiple second distance relationship values already obtained. It can specifically implement the computation function of step 205 above, as well as other implicit steps.
a fusion module 530, configured to compute, based on the first credibility and the second credibility, multiple fused distance relationship values of the multiple first distance relationship values and the multiple second distance relationship values. It can specifically implement the fusion function of step 206 above, as well as other implicit steps.
a judging module 540, configured to judge, based on the multiple fused distance relationship values, whether the target vehicle has a lane-change trend. It can specifically implement the judging function of step 207 above, as well as other implicit steps.
In a possible implementation, the acquisition module 510 is configured to:
obtain, based on a high-definition map, a centerline point set of the lane in which the ego vehicle is located, where the centerline point set includes coordinates, in the world coordinate system, of multiple sampled points on the centerline of the lane in which the ego vehicle is located; and
obtain, according to the laser point cloud data and the centerline point set, the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
In a possible implementation, the acquisition module 510 is configured to:
obtain, based on the laser point cloud data, first coordinates of the target vehicle in the ego-vehicle coordinate system of the ego vehicle;
convert the first coordinates into second coordinates of the target vehicle in the world coordinate system;
take the minimum distance between the coordinates, in the world coordinate system, of each sampled point included in the centerline point set and the second coordinates as a first distance between the centerline of the lane in which the ego vehicle is located and the target vehicle; and
obtain the width of the lane in which the ego vehicle is located, and compute a first ratio between the first distance and the width of the lane in which the ego vehicle is located as the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
In a possible implementation, the acquisition module 510 is configured to:
compute, in the image coordinate system of the scene image, the vertical distance between the target vehicle and the centerline of the lane in which the ego vehicle is located;
compute the width, in the image coordinate system of the scene image, of the lane in which the ego vehicle is located; and
compute a second ratio between the vertical distance and the width, in the image coordinate system of the scene image, of the lane in which the ego vehicle is located as the second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
In a possible implementation, the computation module 520 is configured to:
compute, based on an ideal lane-change model and an ideal lane-keeping model, the first credibility of the multiple first distance relationship values already obtained and the second credibility of the multiple second distance relationship values already obtained, where the ideal lane-change model is used to characterize how the distance relationship value between another vehicle in the scene around the ego vehicle and the centerline of the lane in which the ego vehicle is located varies over time when that vehicle changes lanes, and the ideal lane-keeping model is used to characterize how the distance relationship value between that vehicle and the centerline of the lane in which the ego vehicle is located varies over time when that vehicle moves along its lane.
In a possible implementation, the computation module 520 is configured to:
compute, based on the multiple first distance relationship values already obtained, the values of the unknown parameters in the ideal lane-change model to obtain a first usable lane-change model;
compute, based on the multiple first distance relationship values already obtained, the values of the unknown parameters in the ideal lane-keeping model to obtain a first usable lane-keeping model;
compute a first fit degree of the multiple first distance relationship values already obtained with respect to the first usable lane-change model, and a second fit degree with respect to the first usable lane-keeping model;
obtain, based on the first fit degree and the second fit degree, the first credibility of the multiple first distance relationship values already obtained;
compute, based on the multiple second distance relationship values already obtained, the values of the unknown parameters in the ideal lane-change model to obtain a second usable lane-change model;
compute, based on the multiple second distance relationship values already obtained, the values of the unknown parameters in the ideal lane-keeping model to obtain a second usable lane-keeping model;
compute a third fit degree of the multiple second distance relationship values already obtained with respect to the second usable lane-change model, and a fourth fit degree with respect to the second usable lane-keeping model; and
obtain, based on the third fit degree and the fourth fit degree, the second credibility of the multiple second distance relationship values already obtained.
In a possible implementation, the computation module 520 is configured to:
determine the reciprocal of the smaller of the first fit degree and the second fit degree as the first credibility of the multiple first distance relationship values already obtained; and
determine the reciprocal of the smaller of the third fit degree and the fourth fit degree as the second credibility of the multiple second distance relationship values already obtained.
In a possible implementation, the acquisition module 510 is configured to:
obtain, per detection period and based on the laser point cloud data, the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle; and
obtain, per the detection period and based on the scene image, the second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle;
and the fusion module 530 is configured to:
compute, based on the first credibility and the second credibility, a first weight corresponding to the multiple first distance relationship values and a second weight corresponding to the multiple second distance relationship values;
obtain, from the multiple first distance relationship values and the multiple second distance relationship values, a target first distance relationship value and a target second distance relationship value obtained in the same detection period; and
add the product of the target first distance relationship value and the first weight to the product of the target second distance relationship value and the second weight, to obtain the fused distance relationship value corresponding to the detection period to which the target first distance relationship value and the target second distance relationship value belong.
In a possible implementation, the judging module 540 is configured to:
compute, based on the fused distance relationship value corresponding to each of the multiple detection periods, the values of the unknown parameters in the ideal lane-change model to obtain a third usable lane-change model;
compute a fifth fit degree of the fused distance relationship values corresponding to the multiple detection periods with respect to the third usable lane-change model; and
determine that the target vehicle has a lane-change trend if the fifth fit degree is greater than a preset fit-degree threshold.
It should be noted that, when the apparatus for recognizing a vehicle lane-change trend provided in the above embodiments recognizes a vehicle lane-change trend, the division into the functional modules above is merely an example; in practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the vehicle lane-change-trend recognition device may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus embodiments for recognizing a vehicle lane-change trend provided above belong to the same concept as the method embodiments; for their specific implementation, refer to the method embodiments, which will not be repeated here.
The above embodiments may be implemented wholly or partly by software, hardware, firmware or any combination thereof. When implemented by software, they may be implemented wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions; when the computer program instructions are loaded and executed on a device, the procedures or functions according to the embodiments of the present application are produced wholly or partly. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wired (e.g. coaxial cable, optical fiber, digital subscriber line) or wireless (e.g. infrared, radio, microwave) means. The computer-readable storage medium may be any available medium accessible to the device, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g. floppy disk, hard disk, magnetic tape), an optical medium (e.g. Digital Video Disk (DVD)), or a semiconductor medium (e.g. solid-state drive).
A person of ordinary skill in the art will understand that all or part of the steps of the above embodiments may be completed by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing is only one embodiment of the present application and is not intended to limit the present application; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (20)

  1. A method for recognizing a vehicle lane-change trend, wherein the method comprises:
    acquiring laser point cloud data in which a target vehicle is detected, wherein the target vehicle is a vehicle travelling in a scene around an ego vehicle;
    obtaining, based on the laser point cloud data, a first distance relationship value between a centerline of a lane in which the ego vehicle is located and the target vehicle;
    acquiring a scene image containing the target vehicle;
    obtaining, based on the scene image, a second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle;
    computing a first credibility of a plurality of first distance relationship values already obtained and a second credibility of a plurality of second distance relationship values already obtained;
    computing, based on the first credibility and the second credibility, a plurality of fused distance relationship values of the plurality of first distance relationship values and the plurality of second distance relationship values; and
    judging, based on the plurality of fused distance relationship values, whether the target vehicle has a lane-change trend.
  2. The method according to claim 1, wherein the obtaining, based on the laser point cloud data, a first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle comprises:
    obtaining, based on a high-definition map, a centerline point set of the lane in which the ego vehicle is located, wherein the centerline point set comprises coordinates, in a world coordinate system, of a plurality of sampled points on the centerline of the lane in which the ego vehicle is located; and
    obtaining, according to the laser point cloud data and the centerline point set, the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
  3. The method according to claim 2, wherein the obtaining, according to the laser point cloud data and the centerline point set, the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle comprises:
    obtaining, based on the laser point cloud data, first coordinates of the target vehicle in an ego-vehicle coordinate system of the ego vehicle;
    converting the first coordinates into second coordinates of the target vehicle in the world coordinate system;
    taking the minimum distance between the coordinates, in the world coordinate system, of each sampled point comprised in the centerline point set and the second coordinates as a first distance between the centerline of the lane in which the ego vehicle is located and the target vehicle; and
    obtaining a width of the lane in which the ego vehicle is located, and computing a first ratio between the first distance and the width of the lane in which the ego vehicle is located as the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
  4. The method according to any one of claims 1-3, wherein the obtaining, based on the scene image, a second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle comprises:
    computing, in an image coordinate system of the scene image, a vertical distance between the target vehicle and the centerline of the lane in which the ego vehicle is located;
    computing a width, in the image coordinate system of the scene image, of the lane in which the ego vehicle is located; and
    computing a second ratio between the width of the lane in which the ego vehicle is located in the image coordinate system and the vertical distance as the second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
  5. The method according to any one of claims 1-4, wherein the computing a first credibility of a plurality of first distance relationship values already obtained and a second credibility of a plurality of second distance relationship values already obtained comprises:
    computing, based on an ideal lane-change model and an ideal lane-keeping model, the first credibility of the plurality of first distance relationship values already obtained and the second credibility of the plurality of second distance relationship values already obtained, wherein the ideal lane-change model is used to characterize how the distance relationship value between another vehicle in the scene around the ego vehicle and the centerline of the lane in which the ego vehicle is located varies over time when that vehicle changes lanes, and the ideal lane-keeping model is used to characterize how the distance relationship value between that vehicle and the centerline of the lane in which the ego vehicle is located varies over time when that vehicle moves along its lane.
  6. The method according to claim 5, wherein the computing, based on an ideal lane-change model and an ideal lane-keeping model, the first credibility of the plurality of first distance relationship values already obtained and the second credibility of the plurality of second distance relationship values already obtained comprises:
    computing, based on the plurality of first distance relationship values already obtained, values of the unknown parameters in the ideal lane-change model to obtain a first usable lane-change model;
    computing, based on the plurality of first distance relationship values already obtained, values of the unknown parameters in the ideal lane-keeping model to obtain a first usable lane-keeping model;
    computing a first fit degree of the plurality of first distance relationship values already obtained with respect to the first usable lane-change model, and a second fit degree with respect to the first usable lane-keeping model;
    obtaining, based on the first fit degree and the second fit degree, the first credibility of the plurality of first distance relationship values already obtained;
    computing, based on the plurality of second distance relationship values already obtained, values of the unknown parameters in the ideal lane-change model to obtain a second usable lane-change model;
    computing, based on the plurality of second distance relationship values already obtained, values of the unknown parameters in the ideal lane-keeping model to obtain a second usable lane-keeping model;
    computing a third fit degree of the plurality of second distance relationship values already obtained with respect to the second usable lane-change model, and a fourth fit degree with respect to the second usable lane-keeping model; and
    obtaining, based on the third fit degree and the fourth fit degree, the second credibility of the plurality of second distance relationship values already obtained.
  7. The method according to claim 6, wherein the obtaining, based on the first fit degree and the second fit degree, the first credibility of the plurality of first distance relationship values already obtained comprises:
    obtaining the reciprocal of the smaller of the first fit degree and the second fit degree as the first credibility of the plurality of first distance relationship values already obtained;
    and the obtaining, based on the third fit degree and the fourth fit degree, the second credibility of the plurality of second distance relationship values already obtained comprises:
    obtaining the reciprocal of the smaller of the third fit degree and the fourth fit degree as the second credibility of the plurality of second distance relationship values already obtained.
  8. The method according to any one of claims 1-7, wherein the obtaining, based on the laser point cloud data, the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle comprises:
    obtaining, per detection period and based on the laser point cloud data, the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle;
    the obtaining, based on the scene image, the second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle comprises:
    obtaining, per the detection period and based on the scene image, the second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle;
    and the computing, based on the first credibility and the second credibility, a plurality of fused distance relationship values of the plurality of first distance relationship values and the plurality of second distance relationship values comprises:
    computing, based on the first credibility and the second credibility, a first weight corresponding to the plurality of first distance relationship values and a second weight corresponding to the plurality of second distance relationship values;
    obtaining, from the plurality of first distance relationship values and the plurality of second distance relationship values, a target first distance relationship value and a target second distance relationship value obtained in the same detection period; and
    adding the product of the target first distance relationship value and the first weight to the product of the target second distance relationship value and the second weight, to obtain the fused distance relationship value corresponding to the detection period to which the target first distance relationship value and the target second distance relationship value belong.
  9. The method according to any one of claims 1-8, wherein the judging, based on the plurality of fused distance relationship values, whether the target vehicle has a lane-change trend comprises:
    computing, based on the plurality of fused distance relationship values, values of the unknown parameters in the ideal lane-change model to obtain a third usable lane-change model;
    computing a fifth fit degree of the plurality of fused distance relationship values with respect to the third usable lane-change model; and
    determining that the target vehicle has a lane-change trend if the fifth fit degree is greater than a preset fit-degree threshold.
  10. An apparatus for recognizing a vehicle lane-change trend, wherein the apparatus comprises:
    an acquisition module, configured to acquire laser point cloud data in which a target vehicle is detected, wherein the target vehicle is a vehicle travelling in a scene around an ego vehicle; obtain, based on the laser point cloud data, a first distance relationship value between a centerline of a lane in which the ego vehicle is located and the target vehicle; acquire a scene image containing the target vehicle; and obtain, based on the scene image, a second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle;
    a computation module, configured to compute a first credibility of a plurality of first distance relationship values already obtained and a second credibility of a plurality of second distance relationship values already obtained;
    a fusion module, configured to compute, based on the first credibility and the second credibility, a plurality of fused distance relationship values of the plurality of first distance relationship values and the plurality of second distance relationship values; and
    a judging module, configured to judge, based on the plurality of fused distance relationship values, whether the target vehicle has a lane-change trend.
  11. The apparatus according to claim 10, wherein the acquisition module is configured to:
    obtain, based on a high-definition map, a centerline point set of the lane in which the ego vehicle is located, wherein the centerline point set comprises coordinates, in a world coordinate system, of a plurality of sampled points on the centerline of the lane in which the ego vehicle is located; and
    obtain, according to the laser point cloud data and the centerline point set, the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
  12. The apparatus according to claim 11, wherein the acquisition module is configured to:
    obtain, based on the laser point cloud data, first coordinates of the target vehicle in an ego-vehicle coordinate system of the ego vehicle;
    convert the first coordinates into second coordinates of the target vehicle in the world coordinate system;
    take the minimum distance between the coordinates, in the world coordinate system, of each sampled point comprised in the centerline point set and the second coordinates as a first distance between the centerline of the lane in which the ego vehicle is located and the target vehicle; and
    obtain a width of the lane in which the ego vehicle is located, and compute a first ratio between the first distance and the width of the lane in which the ego vehicle is located as the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
  13. The apparatus according to any one of claims 10-12, wherein the acquisition module is configured to:
    compute, in an image coordinate system of the scene image, a vertical distance between the target vehicle and the centerline of the lane in which the ego vehicle is located;
    compute a width, in the image coordinate system of the scene image, of the lane in which the ego vehicle is located; and
    compute a second ratio between the vertical distance and the width, in the image coordinate system of the scene image, of the lane in which the ego vehicle is located as the second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle.
  14. The apparatus according to any one of claims 10-13, wherein the computation module is configured to:
    compute, based on an ideal lane-change model and an ideal lane-keeping model, the first credibility of the plurality of first distance relationship values already obtained and the second credibility of the plurality of second distance relationship values already obtained, wherein the ideal lane-change model is used to characterize how the distance relationship value between another vehicle in the scene around the ego vehicle and the centerline of the lane in which the ego vehicle is located varies over time when that vehicle changes lanes, and the ideal lane-keeping model is used to characterize how the distance relationship value between that vehicle and the centerline of the lane in which the ego vehicle is located varies over time when that vehicle moves along its lane.
  15. The apparatus according to claim 14, wherein the computation module is configured to:
    compute, based on the plurality of first distance relationship values already obtained, values of the unknown parameters in the ideal lane-change model to obtain a first usable lane-change model;
    compute, based on the plurality of first distance relationship values already obtained, values of the unknown parameters in the ideal lane-keeping model to obtain a first usable lane-keeping model;
    compute a first fit degree of the plurality of first distance relationship values already obtained with respect to the first usable lane-change model, and a second fit degree with respect to the first usable lane-keeping model;
    obtain, based on the first fit degree and the second fit degree, the first credibility of the plurality of first distance relationship values already obtained;
    compute, based on the plurality of second distance relationship values already obtained, values of the unknown parameters in the ideal lane-change model to obtain a second usable lane-change model;
    compute, based on the plurality of second distance relationship values already obtained, values of the unknown parameters in the ideal lane-keeping model to obtain a second usable lane-keeping model;
    compute a third fit degree of the plurality of second distance relationship values already obtained with respect to the second usable lane-change model, and a fourth fit degree with respect to the second usable lane-keeping model; and
    obtain, based on the third fit degree and the fourth fit degree, the second credibility of the plurality of second distance relationship values already obtained.
  16. The apparatus according to claim 15, wherein the computation module is configured to:
    determine the reciprocal of the smaller of the first fit degree and the second fit degree as the first credibility of the plurality of first distance relationship values already obtained; and
    determine the reciprocal of the smaller of the third fit degree and the fourth fit degree as the second credibility of the plurality of second distance relationship values already obtained.
  17. The apparatus according to any one of claims 10-16, wherein the acquisition module is configured to:
    obtain, per detection period and based on the laser point cloud data, the first distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle; and
    obtain, per the detection period and based on the scene image, the second distance relationship value between the centerline of the lane in which the ego vehicle is located and the target vehicle;
    and the fusion module is configured to:
    compute, based on the first credibility and the second credibility, a first weight corresponding to the plurality of first distance relationship values and a second weight corresponding to the plurality of second distance relationship values;
    obtain, from the plurality of first distance relationship values and the plurality of second distance relationship values, a target first distance relationship value and a target second distance relationship value obtained in the same detection period; and
    add the product of the target first distance relationship value and the first weight to the product of the target second distance relationship value and the second weight, to obtain the fused distance relationship value corresponding to the detection period to which the target first distance relationship value and the target second distance relationship value belong.
  18. The apparatus according to any one of claims 10-17, wherein the judging module is configured to:
    compute, based on the fused distance relationship value corresponding to each of the plurality of detection periods, values of the unknown parameters in the ideal lane-change model to obtain a third usable lane-change model;
    compute a fifth fit degree of the fused distance relationship values corresponding to the plurality of detection periods with respect to the third usable lane-change model; and
    determine that the target vehicle has a lane-change trend if the fifth fit degree is greater than a preset fit-degree threshold.
  19. A vehicle lane-change-trend recognition device, wherein the vehicle lane-change-trend recognition device comprises a processor and a memory, the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the operations performed by the method for recognizing a vehicle lane-change trend according to any one of claims 1 to 9.
  20. A computer-readable storage medium, wherein the storage medium stores at least one instruction, and the instruction is loaded and executed by a processor to implement the operations performed by the method for recognizing a vehicle lane-change trend according to any one of claims 1 to 9.
PCT/CN2020/096415 2020-06-16 2020-06-16 Method and apparatus for recognizing vehicle lane change trend WO2021253245A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202080005121.9A CN112753038B (zh) 2020-06-16 2020-06-16 Method and apparatus for recognizing vehicle lane change trend
EP20940942.4A EP4160346A4 (en) 2020-06-16 2020-06-16 METHOD AND DEVICE FOR IDENTIFYING VEHICLE LANE CHANGING TENDENCY
PCT/CN2020/096415 WO2021253245A1 (zh) 2020-06-16 2020-06-16 Method and apparatus for recognizing vehicle lane change trend
US18/065,510 US20230110730A1 (en) 2020-06-16 2022-12-13 Method and apparatus for recognizing vehicle lane change trend

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/096415 WO2021253245A1 (zh) 2020-06-16 2020-06-16 Method and apparatus for recognizing vehicle lane change trend

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/065,510 Continuation US20230110730A1 (en) 2020-06-16 2022-12-13 Method and apparatus for recognizing vehicle lane change trend

Publications (1)

Publication Number Publication Date
WO2021253245A1 true WO2021253245A1 (zh) 2021-12-23

Family ID: 75651289

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/096415 WO2021253245A1 (zh) 2020-06-16 2020-06-16 Method and apparatus for recognizing vehicle lane change trend

Country Status (4)

Country Link
US (1) US20230110730A1 (zh)
EP (1) EP4160346A4 (zh)
CN (1) CN112753038B (zh)
WO (1) WO2021253245A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115546319A (zh) * 2022-11-24 2022-12-30 北京理工大学深圳汽车研究院(电动车辆国家工程实验室深圳研究院) 车道保持方法、装置、计算机设备及存储介质

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11273836B2 (en) * 2017-12-18 2022-03-15 Plusai, Inc. Method and system for human-like driving lane planning in autonomous driving vehicles
CN113428141B (zh) * 2021-07-15 2022-12-09 东风汽车集团股份有限公司 一种前车紧急切入及时响应的智能检测方法及系统
CN116392369B (zh) * 2023-06-08 2023-09-08 中国电建集团昆明勘测设计研究院有限公司 一种基于盲道的识别感应方法、装置、设备及存储介质
CN116559899B (zh) * 2023-07-12 2023-10-03 蘑菇车联信息科技有限公司 自动驾驶车辆的融合定位方法、装置和电子设备

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020091479A1 (en) * 2001-01-09 2002-07-11 Nissan Motor Co., Ltd. Braking control system with object detection system interaction
WO2007051835A1 (de) * 2005-11-04 2007-05-10 Continental Teves Ag & Co. Ohg Verfahren zum unterstützen eines fahrers beim fahren mit einem fahrzeug
CN103196418A (zh) * 2013-03-06 2013-07-10 山东理工大学 一种弯道车距测量方法
FR3029154B1 (fr) * 2014-12-02 2016-12-30 Valeo Schalter & Sensoren Gmbh Systeme embarque sur un vehicule automobile pour une fonctionnalite de changement de voie completement automatisee, et procede de controle associe
CN106598053A (zh) * 2016-12-19 2017-04-26 驭势科技(北京)有限公司 一种自动驾驶车辆横向运动控制对象选择方法
CN106627585A (zh) * 2016-12-27 2017-05-10 长安大学 一种基于图像处理的车辆变道辅助装置及其工作方法
CN106647776A (zh) * 2017-02-24 2017-05-10 驭势科技(北京)有限公司 车辆变道趋势的判断方法、判断装置和计算机存储介质
CN107038896A (zh) * 2017-06-21 2017-08-11 成都锐奕信息技术有限公司 便于采集分析车辆信息的方法
CN107121980A (zh) * 2017-03-17 2017-09-01 北京理工大学 一种基于虚拟约束的自动驾驶车辆路径规划方法
CN108961839A (zh) * 2018-09-05 2018-12-07 奇瑞汽车股份有限公司 行车变道方法及装置
CN108961799A (zh) * 2018-07-24 2018-12-07 佛山市高明曦逻科技有限公司 驾驶信息道路辅助系统
CN110097785A (zh) * 2019-05-30 2019-08-06 长安大学 一种前车切入或紧急换道识别预警装置及预警方法
CN110949395A (zh) * 2019-11-15 2020-04-03 江苏大学 一种基于多传感器融合的弯道acc目标车辆识别方法
CN111222417A (zh) * 2019-12-24 2020-06-02 武汉中海庭数据技术有限公司 一种用于提高基于车载图像车道线提取精度的方法及装置

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100492437C (zh) * 2007-06-01 2009-05-27 清华大学 一种目标车换道工况下的快速识别方法
US8355539B2 (en) * 2007-09-07 2013-01-15 Sri International Radar guided vision system for vehicle validation and vehicle motion characterization



Also Published As

Publication number Publication date
EP4160346A4 (en) 2023-08-02
US20230110730A1 (en) 2023-04-13
EP4160346A1 (en) 2023-04-05
CN112753038A (zh) 2021-05-04
CN112753038B (zh) 2022-04-12


Legal Events

- 121 — Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20940942; Country of ref document: EP; Kind code of ref document: A1)
- NENP — Non-entry into the national phase (Ref country code: DE)
- ENP — Entry into the national phase (Ref document number: 2020940942; Country of ref document: EP; Effective date: 20221226)